ETL Informatica/Hadoop Developer
Appxpertise Inc
Charlotte, NC
JOB DETAILS
SALARY
$113,894–$140,400 Per Year
SKILLS
Agile Programming Methodologies, Apache HBase, Apache Hadoop, Apache Hive, Apache Pig, Apache Sqoop, Apache ZooKeeper, Business Intelligence, Business Support, CA Workload Automation AE (AutoSys Edition), Computer Science, Data Warehousing, Database Extract Transform and Load (ETL), HDFS (Hadoop Distributed File System), Informatica, JSON, MapReduce, NoSQL, Oracle Database, Oracle PL-SQL, Process Development, SQL (Structured Query Language), XML (EXtensible Markup Language)
POSTED
8 days ago
Role: ETL Informatica/Hadoop Developer
Location: Charlotte, NC / Plano, TX
Duration: Full-Time
Interview: Online/Video
Job Description
Must Have Technical/Functional Skills
Primary Skill: ETL Informatica Developer
Secondary: Oracle, SQL
Experience: Minimum 10 years
Roles & Responsibilities
- Bachelor’s or master’s degree in Computer Science or a related field.
- ETL Process Development: Design, develop, and maintain ETL processes using Informatica PowerCenter or other relevant Informatica tools.
- Deep understanding of HDFS, YARN, MapReduce, Hive, Pig, HBase, Flume, Sqoop, ZooKeeper, and Oozie.
- Experience with Spark, Kafka, and NoSQL databases.
- Experience with Agile methodologies.
- Experience with code versioning tools such as Bitbucket.
- SQL Proficiency: Utilize SQL/PLSQL to extract, transform, and load data.
- Exposure to advanced transformations, such as parsing JSON/XML messages.
- Experience with job scheduling tools such as AutoSys.
- Data Integration: Integrate data from various sources, ensuring data consistency and quality.
- Data Warehouse Design: Design and maintain data warehouses to support business intelligence activities.
- Performance Optimization: Optimize SQL scripts and queries for speed and efficiency.
- Troubleshooting: Identify and resolve issues in ETL processes.
- Documentation: Create and maintain technical documentation for ETL processes.
- Testing: Perform unit, integration, and system testing on ETL processes.
- Collaboration: Collaborate with cross-functional teams to ensure successful implementation of ETL processes.
- Data Quality: Ensure data quality by implementing data cleansing and transformation processes.
- Data Modeling: Develop and maintain relational and dimensional data models.
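As a rough illustration of the extract–transform–load work described above, the following is a minimal Python sketch of one ETL step: parsing JSON messages, cleansing the fields, and loading them into a relational table. All table and field names here are hypothetical and not tied to any specific system; a real pipeline would use Informatica PowerCenter mappings against Oracle rather than this simplified code.

```python
import json
import sqlite3

# Hypothetical source records: JSON messages as they might arrive from an upstream feed.
raw_messages = [
    '{"customer_id": 101, "name": " Alice ", "balance": "2500.75"}',
    '{"customer_id": 102, "name": "Bob", "balance": "130.00"}',
]

def extract(messages):
    """Extract: parse each JSON message into a Python dict."""
    return [json.loads(m) for m in messages]

def transform(records):
    """Transform: trim whitespace and cast balances to numeric types (data cleansing)."""
    return [
        (r["customer_id"], r["name"].strip(), float(r["balance"]))
        for r in records
    ]

def load(rows, conn):
    """Load: insert the cleansed rows into a target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customers "
        "(id INTEGER PRIMARY KEY, name TEXT, balance REAL)"
    )
    conn.executemany("INSERT INTO customers VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(raw_messages)), conn)
```

The same extract/transform/load separation applies whether the pipeline is hand-written SQL/PL-SQL or a graphical Informatica mapping.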
About the Company
A