Senior Data Engineer

Tech Providers Inc.

Los Angeles, CA

JOB DETAILS
JOB TYPE
Per diem
SKILLS
Application Programming Interface (API), Big Data, Business Intelligence, Business Intelligence Software, Cloud Computing, Computer Security, Continuous Deployment/Delivery, Continuous Integration, Data Lake, Data Management, Data Modeling, Data Processing, Data Storage, Data Warehousing, Database Design, Database Extract Transform and Load (ETL), Database Technology, Ecosystems, Enterprise Architecture, Entity Relationship Diagram (ERD), GitHub, Information Technology & Information Systems, Information/Data Security (InfoSec), Microsoft Windows Azure, OLAP (OnLine Analytical Processing), Performance Tuning/Optimization, Power BI, Programming Languages, Python Programming/Scripting Language, Release Management/Engineering, Resource Management, SQL (Structured Query Language), Sales Pipeline, Scala Programming Language, Scripting (Scripting Languages), Source Code/Configuration Management (SCM), System Integration (SI), Tableau, Transaction Processing/Management
LOCATION
Los Angeles, CA
POSTED
23 days ago
Job Title: Senior Data Engineer
Job Location: Los Angeles, CA (Onsite)
Duration: 12+ month contract with possibility of extension

Job Responsibilities:
  • Cloud Platforms: Deep understanding of the Azure ecosystem, including Azure Data Factory, Data Lake Storage, Blob Storage, Power Apps, and Azure Functions, along with in-depth understanding and implementation of API management solutions such as Apigee.
  • Big Data Technologies: Proficiency in Databricks, Spark, PySpark, Scala, and SQL.
  • Data Engineering Fundamentals: Expertise in ETL/ELT processes, data pipelines, data modeling, schema design, and data warehousing.
  • Programming Languages: Strong Python and SQL skills; knowledge of other languages such as Scala or R is beneficial.
  • Data Warehousing and Business Intelligence: Strong grasp of ERD concepts, designs, and patterns; understanding of OLAP/OLTP systems, performance tuning, database server concepts, and BI tools (Power BI, Tableau).
  • Data Governance: Strong understanding of RBAC/ABAC, data lineage, data leak prevention, data security, and compliance; deep understanding and implementation knowledge of audit and monitoring in the cloud.
  • Infrastructure Deployment: GitHub version control, CI/CD pipelines, release management, Terraform and YAML templates, and script-based deployments.
 
Job Description:
  • Seven (7) years of experience applying Enterprise Architecture principles, with at least five (5) years in a lead capacity.
  • Five (5) years of hands-on experience with Azure Data Factory, Azure Databricks, API implementation and management solutions, and managing Azure resources.
  • Five (5) years of experience in the following: developing data models and pipelines using Python; working with Lakehouse platforms; building GitHub CI/CD pipelines and infrastructure automation with Terraform scripting; and working with data warehousing systems, OLAP/OLTP systems, and BI tool integration.
 
Education:
  • This classification requires the possession of a bachelor’s degree in an IT-related or Engineering field.

About the Company

Tech Providers Inc.