Agile Programming Methodologies, Amazon Web Services (AWS), Cloud Computing, Continuous Deployment/Delivery, Continuous Integration, Cross-Functional, Data Management, Data Processing, Data Warehousing, GCP (Google Cloud Platform), Identify Issues, Microsoft Azure, Performance Tuning/Optimization, Problem Solving Skills, Python Programming/Scripting Language, SQL (Structured Query Language), Scalable System Development, Writing Skills
LOCATION
Indianapolis, IN
POSTED
9 days ago
Location: Onsite (Preferred)
Employment Type: Direct Hire
Overview:
We are seeking a highly skilled Data Engineer/Developer to join a growing team focused on building scalable, efficient data solutions. This role is ideal for someone who thrives in a modern data stack environment and takes a pragmatic, problem-solving approach to development. The ideal candidate is not just a coder but an efficiency-focused engineer: someone who builds reusable components, leverages existing code, and avoids unnecessary reinvention.
Key Responsibilities:
Design, develop, and maintain data pipelines and workflows using modern tools and frameworks
Write and optimize complex code using Python and SQL
Build and manage data transformations using dbt and/or SQLMesh
Develop solutions within Databricks to support scalable data processing
Implement and support CI/CD pipelines for data workflows
Create reusable, modular code to improve development efficiency and scalability
Collaborate with cross-functional teams to understand data requirements and deliver solutions
Troubleshoot and optimize performance across data systems
Required Qualifications:
3-7+ years of experience in data engineering or data development
Strong hands-on experience with:
Databricks
Python
SQL
Experience with dbt and/or SQLMesh
Familiarity with CI/CD practices in a data environment
Proven ability to write clean, scalable, and efficient code
Strong problem-solving skills and the ability to work independently
Preferred Qualifications:
Experience with cloud platforms (AWS, Azure, or GCP)
Familiarity with data warehousing and lakehouse architectures
Experience building modular or reusable code frameworks