Data Engineer Jobs in the United States
Georgia Tek Systems
Chicago, IL
Key Responsibilities — Design & Development: Develop and optimize scalable, reliable, and secure data pipelines and platforms using Azure Data Services. Collaboration: Work closely with cross-functional teams, including Data Scientists, Analysts, and DevOps, to understand data requirements and deliver solutions.
Diverse Linx
Toronto, ON
Strong experience in coding, debugging, and automation using Python, with a focus on delivering business-relevant solutions. A quality- and performance-driven mindset focused on enabling business teams with the right data at the right time.
Diverse Linx
Austin, TX
Working with Python for data manipulation, analysis, and automation: This could include using Python for data extraction, transformation, loading (ETL) processes, and automating routine tasks. Developing and maintaining Tableau dashboards and reports: This would involve translating business requirements into technical specifications and creating visually compelling and interactive dashboards.
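The Python ETL work this posting describes can be sketched in a few lines. This is a generic illustration of an extract/transform/load flow, not code from the posting; the record fields (`id`, `amount`) and the in-memory "warehouse" are invented for the example.

```python
# Minimal ETL sketch: extract raw records, transform them, load into a store.
# Field names here are hypothetical examples.

def extract():
    # In practice this would read from a database, API, or file.
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "7.25"}]

def transform(rows):
    # Cast types and drop incomplete records.
    return [{"id": r["id"], "amount": float(r["amount"])}
            for r in rows if r.get("amount") is not None]

def load(rows, store):
    # Key by id so re-runs overwrite rather than duplicate.
    for r in rows:
        store[r["id"]] = r
    return store

warehouse = {}
load(transform(extract()), warehouse)
```

The same three-step shape underlies most routine-task automation the listing mentions; only the sources and sinks change.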
Apple
Seattle, WA
As part of the analytics team, you will sit at the intersection of data engineering, statistical insight, and product impact - building the pipelines and tools that help Apple teams run better experiments and ship better features and next-generation GenAI products. From architecting robust pipelines and data models to delivering dashboards and data-driven features, your work will directly shape how Apple teams measure, iterate, and innovate.
Artech LLC
$42 - $45
Plano, TX
This role involves leading the development and management of data engineering solutions, focusing on cloud migration and data pipeline optimization. Experience with IBM InfoSphere Information Server (IIS) products such as DataStage and QualityStage.
Artech LLC
$45 - $60
Pittsburgh, PA
The ideal candidate will have extensive experience in data engineering, particularly with Python and Hadoop, and will be responsible for designing and maintaining robust data pipelines and infrastructure. This role involves leading technical projects, ensuring data quality and scalability, and collaborating with cross-functional teams.
JPMorgan Chase Bank, N.A.
Columbus, OH
As a Senior Lead Data Engineer at JPMorgan Chase within Enterprise Technology, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. Partner with engineering and central platform teams to enable and scale core data lake capabilities on Databricks, covering ingestion, processing, governance, cataloging, quality, lineage, access, and consumption.
Career Land Center
Richmond, Virginia
Role Summary: Seeking a Database Administrator / Data Engineer with experience migrating on-prem SQL Server databases to AWS and Snowflake. The role will support database migration, Python-based data processing, and development of AWS data pipelines for applications using Amazon RDS and DynamoDB.
Techstra Solutions
Pittsburgh, PA
Techstra Solutions is a certified woman-owned (WBENC) management consulting firm specializing in strategy, technology, and implementation services for large organizations undergoing digital and talent transformation. The ideal candidate is a strong engineer who enjoys solving complex data problems, understands domain-driven design principles, and thrives in a collaborative, fast-moving environment.
Americor
Irvine, CA
Answer data servicing needs every day with designs, builds, and maintenance that support ongoing data collection, cleansing, storage, processing, and retrieval. Any unsolicited resumes sent to Americor Funding Inc. by an Agency—including those sent to any Company mailing address, fax, email, employee, or the resume database—will be considered the property of the Company.
Siritech Solutions Corp
Whippany, NJ
Build and maintain high-throughput data pipelines, infrastructure, and storage solutions specifically to feed, train, and deploy AI/ML models, implementing RAG (Retrieval-Augmented Generation) systems, data cleaning, and model evaluation to ensure efficient, scalable, and reliable LLM applications. Experienced and skilled in designing, building, and maintaining high-quality data pipelines, preprocessing workflows, and vector databases required for training, fine-tuning, and deploying Large Language Models (LLMs).
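A RAG system of the kind this posting describes pairs a retriever over a vector store with an LLM prompt. The retrieval step can be sketched in plain Python; the documents and embedding vectors below are toy values invented for illustration (real systems use a learned embedding model and a vector database).

```python
import math

# Toy vector store: document -> embedding. Vectors are made up for the sketch.
store = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.8, 0.2],
    "api rate limits": [0.0, 0.2, 0.9],
}

def cosine(a, b):
    # Cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def retrieve(query_vec, k=1):
    # Rank documents by similarity to the query embedding, return top-k.
    ranked = sorted(store, key=lambda d: cosine(store[d], query_vec), reverse=True)
    return ranked[:k]

# A query embedded near "api rate limits" retrieves that document, which a
# RAG pipeline would then insert into the LLM prompt as grounding context.
context = retrieve([0.05, 0.1, 0.95])
```

The generation half simply concatenates the retrieved text with the user question before calling the model; the engineering work the posting describes lives mostly in the ingestion, cleaning, and vector-indexing pipelines that keep this store fresh.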
Medpace, Inc.
Cincinnati, Ohio
Responsibilities: Utilize skills in development areas including data warehousing, business intelligence, and databases (Snowflake, ANSI SQL, SQL Server, T-SQL); support programming/software development using Extract, Transform, and Load (ETL) and Extract, Load, and Transform (ELT) tools (dbt, Azure Data Factory, SSIS); design, develop, enhance, and support business intelligence systems primarily using Microsoft Power BI; collect, analyze, and document user requirements; participate in the software validation process through development, review, and/or execution of test plans/cases/scripts; create software applications by following the software development lifecycle process, which includes requirements gathering, design, development, testing, release, and maintenance; and communicate with team members regarding projects, development, tools, and procedures. Qualifications: Bachelor's degree in Computer Science, Data Science, or a related field; 5+ years of experience in Data Engineering; knowledge of developing dimensional data models and awareness of the advantages and limitations of Star Schema and Snowflake Schema designs; solid ETL development and reporting knowledge based on an intricate understanding of business processes and measures; knowledge of the Snowflake cloud data warehouse, Fivetran data integration, and dbt transformations is preferred; knowledge of Python is preferred; knowledge of REST APIs; basic knowledge of SQL Server databases is required; and knowledge of C# and Azure development is a bonus.
Medpace, Inc.
Cincinnati, Ohio
Responsibilities: Utilize skills in development areas including data warehousing, business intelligence, and databases (Snowflake, ANSI SQL, SQL Server, T-SQL); support programming/software development using Extract, Transform, and Load (ETL) and Extract, Load, and Transform (ELT) tools (dbt, Azure Data Factory, SSIS); design, develop, enhance, and support business intelligence systems primarily using Microsoft Power BI; collect, analyze, and document user requirements; participate in the software validation process through development, review, and/or execution of test plans/cases/scripts; create software applications by following the software development lifecycle process, which includes requirements gathering, design, development, testing, release, and maintenance; and communicate with team members regarding projects, development, tools, and procedures. Qualifications: Bachelor's degree in Computer Science, Data Science, or a related field; internship experience in Data or Software Engineering; knowledge of developing dimensional data models and awareness of the advantages and limitations of Star Schema and Snowflake Schema designs; solid ETL development and reporting knowledge based on an intricate understanding of business processes and measures; knowledge of the Snowflake cloud data warehouse, Fivetran data integration, and dbt transformations is preferred; knowledge of Python is preferred; knowledge of REST APIs; basic knowledge of SQL Server databases is required; and knowledge of C# and Azure development is a bonus.
Medpace, Inc.
Cincinnati, Ohio
Responsibilities: Utilize skills in development areas including data warehousing, business intelligence, and databases (Snowflake, ANSI SQL, SQL Server, T-SQL); support programming/software development using Extract, Transform, and Load (ETL) and Extract, Load, and Transform (ELT) tools (dbt, Azure Data Factory, SSIS); design, develop, enhance, and support business intelligence systems primarily using Microsoft Power BI; collect, analyze, and document user requirements; participate in the software validation process through development, review, and/or execution of test plans/cases/scripts; create software applications by following the software development lifecycle process, which includes requirements gathering, design, development, testing, release, and maintenance; and communicate with team members regarding projects, development, tools, and procedures. Qualifications: Bachelor's degree in Computer Science, Data Science, or a related field; 3+ years of experience in Data Engineering; knowledge of developing dimensional data models and awareness of the advantages and limitations of Star Schema and Snowflake Schema designs; solid ETL development and reporting knowledge based on an intricate understanding of business processes and measures; knowledge of the Snowflake cloud data warehouse, Fivetran data integration, and dbt transformations is preferred; knowledge of Python is preferred; knowledge of REST APIs; basic knowledge of SQL Server databases is required; and knowledge of C# and Azure development is a bonus.
URBN
Philadelphia, Pennsylvania
Role Responsibilities: Collaborate with cross-functional teams to integrate AI solutions into digital products and workflows, partnering with engineers to translate prototypes into scalable features and services. In this unique opportunity, you will use your full-stack engineering skills to implement middleware experiences and cutting-edge algorithms into our rapidly evolving products across our digital ecosystem.
Southwest Research Institute
San Antonio, TX
Who We Are: The Kinesioception Section develops automated perception and decision-support solutions for commercial and government clients in multiple domains, including human performance, data science, artificial intelligence, computer vision, machine learning, decision intelligence, and cognitive performance. Requirements: Requires a Bachelor's, Master's, or PhD in Computer Science, Biomedical Engineering, Electrical Engineering, or a directly related degree field; 3 years of experience with a PhD, 4 years with a Master's, or 5 years with a Bachelor's.
Publix
Lakeland, FL
Bachelor’s degree in Management Information Systems, Information Technology, Computer Science, Computer Engineering, or a similar technical discipline; ability to quickly develop an understanding of new information, processes, and technologies. Azure Databricks (workbooks, jobs, cluster management, ADLS integration, Delta tables, SQL Warehouses, Unity Catalog, CLI, etc.).
Proofpoint
$136200 - $214005
Seattle, WA
+ Familiarity with using the Linux command line and tools for manipulating and extracting content from text files
+ Good knowledge of regular expressions
+ Familiarity with how mail delivery works, including SMTP
+ General curiosity about the headers and structure of email messages
+ Experience in a data science or similar role (a plus)
+ Willingness to interact with customers through our web-based ticketing system to help resolve their issues
+ Ability to work independently but also to collaborate with worldwide, remote teams
+ Positive, friendly attitude and enjoyment of problem solving
+ BSc or equivalent in an IT-related subject, or equivalent technical experience
+ Experience with signature-based detections such as Clam, Yara, or similar an advantage
+ Familiarity with a scripting language such as Python or Perl an advantage (a big plus)
+ U.S. citizenship required
Why Proofpoint?
+ Research into new trends and creation of proactive detection to stop new threats before they start
+ Contribute to the development of new tools and automation to aid in front-line analysis, and to identify the latest threats
+ Work with the team to come up with new and novel ways to detect threats
+ Take on more complex customer false-negative or false-positive cases escalated by other analysts in the team that require more in-depth investigation and analysis
+ Work on internal escalation tickets created by field teams for customers experiencing more complex or systemic recurring issues that have not been solved through usual means, collaborating with other engineering teams where necessary to find the best solutions
+ On-call work, which means responding to high-priority alerts sent by our threat monitoring system, and periodic monitoring of essential systems.
APN Consulting Inc
Oldwick, NJ
The ideal candidate will engage with cross-functional teams to gather data requirements, propose enhancements to existing data pipelines and structures, and ensure the reliability and efficiency of data processes. We strongly encourage applications from candidates of all genders, races, ethnicities, abilities, and experiences to join our team and help us build a culture of belonging.
Applied Medical
$100000 - $130000
Rancho Santa Margarita, CA
The Microsoft Fabric Data Engineer contributes to enterprise data quality and system performance by building scalable data workflows, optimizing data storage, and supporting advanced analytics across business teams. Our unique business model, combined with our dedication to delivering the highest quality products, enables team members to contribute in a larger capacity than is possible in typical positions.
OZ Digital LLC
Boca Raton, FL
This individual will collaborate with cross-functional stakeholders including data engineers, BI developers, analysts, and business leaders to deliver robust data platforms that enable advanced analytics, reporting, and AI-driven decision-making. Architecture & Design: Architect and design scalable, reliable data platforms and complex ETL/ELT and streaming workflows for the Databricks Lakehouse Platform (Delta Lake, Spark).
Apple
Seattle, WA
APX is part of the broader Apple Services Engineering division that powers App Store, Apple TV+, Apple Music, Apple Podcasts, Apple Books, Fitness+, the iTunes Store and more. We drive significant increases in efficiency and productivity through a flawless ecosystem of frameworks and products that unlock observability, knowledge and enable data quality-driven orchestration at scale.
PDS Inc, LLC
Phoenix, AZ
The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives and will ensure that an optimal data delivery architecture is consistent throughout ongoing projects. Advanced working SQL knowledge and experience with relational databases, including query authoring, as well as working familiarity with a variety of databases.
SmartLight Analytics
Tulsa, OK
With this end in mind, SmartLight works for self-funded employers to reduce the wasteful spend in their healthcare plan through our proprietary data analysis. Job Summary: The Data Engineer is responsible for building and maintaining robust data processing solutions within our healthcare analytics platform.
PDS Inc, LLC
Phoenix, AZ
In addition, the Senior Cloud Data Engineer serves as a subject matter expert in Azure cloud data and AI/ML technologies, actively shaping our data and AI/ML platforms, services, and engineering processes, and plays a pivotal role in ensuring the success of our data and AI/ML initiatives. Provide expertise, guidance, and mentoring to other platform engineers across teams and projects, as needed, in order to promote the development of internal skills, talent, and experience in new technologies.
Soal Technologies Inc
$50 - $60
Location not specified
Summary: Build and run a data migration and streaming architecture moving on-prem DB2 to AWS PostgreSQL (raw landing) and then to AWS MongoDB (transaction cache), applying transformation rules so downstream apps can consume “friendly” data. • Model MongoDB collections, indexes, sharding/partitioning, and TTL/retention for cache use cases.
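The landing-then-transform flow this posting describes (raw relational rows in, “friendly” cache documents out) amounts to a rule-driven mapping. The sketch below illustrates that shape only: the column names (`ACCT_NO`, `TXN_AMT_CENTS`, `TXN_TS`) are invented, and a real pipeline would read from PostgreSQL and write to MongoDB rather than in-memory dicts.

```python
# Sketch: apply transformation rules to raw landed rows to produce
# consumer-friendly documents for a transaction cache.
# Source column names are hypothetical examples, not from the posting.

RULES = {
    "ACCT_NO": ("account_id", str),                      # normalize to string id
    "TXN_AMT_CENTS": ("amount", lambda c: int(c) / 100), # cents -> dollars
    "TXN_TS": ("timestamp", str),
}

def to_friendly(raw_row):
    # Rename columns and convert values according to the rule table.
    doc = {}
    for src, (dst, convert) in RULES.items():
        if src in raw_row:
            doc[dst] = convert(raw_row[src])
    return doc

raw = {"ACCT_NO": 1042, "TXN_AMT_CENTS": "1999", "TXN_TS": "2024-01-15T09:30:00"}
friendly = to_friendly(raw)
```

Keeping the rules in a declarative table like this makes it straightforward to review transformations with downstream app owners before they are wired into the streaming path.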
Charles Schwab
Southlake, TX
Join us as we lead the most significant technical transformation in Client Data Technology —architecting AI-native systems using Retrieval-Augmented Generation (RAG) and Model Context Protocol (MCP) to modernize legacy applications through Domain-Driven Design. With executive-level support, substantial budget, and technical freedom, you’ll pioneer RAG architecture and multi-agent orchestration in a highly regulated, mission-critical environment.
GeoYeti
Herndon, Virginia
Bcore accelerates decisive advantage for warfighters and intelligence professionals by fusing human insight, rapid-fire engineering, precision-measured outcomes, and relentless grit into mission-ready solutions. Whether it’s architecting critical IT solutions, producing actionable intelligence, or developing cutting edge technology, we succeed because of the expertise, collaboration, and agility of our teams.
Open Systems Technologies
Philadelphia, PA
Skill set includes: expertise in mainframe systems, mainframe databases, and data structures.
Conflux Systems, Inc.
Nashville, TN
One-plus years of hands-on experience with the GCP platform and experience with many of the following components: Cloud Run, Cloud Functions, Pub/Sub, Bigtable, Firestore, Cloud SQL, Cloud Spanner, JSON, Avro, Parquet, Python, Terraform, BigQuery, Dataflow, Data Fusion, Cloud Composer, Dataproc, CI/CD, Cloud Logging, GitHub. · Strong verbal, written, and interpersonal skills, including a desire to work within a highly matrixed, team-oriented environment.
Latica
Palo Alto, CA
Latica is a secure data network and medical intelligence platform that gives the healthcare ecosystem the ability to quickly and safely access de-identified real-world health and patient data to accelerate diagnostic and therapeutic solutions and improve patient outcomes. Our solution enables the long-hoped-for vision of rapid, safe access to healthcare data and accelerated R&D of diagnostic and therapeutic solutions, pharmaceutical studies, and medical devices.
Resilience
$80000 - $121250
West Chester, OH
This position may also include the following conditions: Sitting and working on computers, meeting with stakeholders for design requirements, working with vendors and regulatory authorities, occasionally working on the plant floor and interacting with equipment. We’re building a sustainable network of high-tech, end-to-end manufacturing solutions to better withstand disruptive events, serve scientific discovery, and reach those in need.
Lorven Technologies
Not available, WA
Demonstrated experience developing and applying various machine learning algorithms (e.g., regression, classification, clustering, tree-based methods, ensemble methods). • Proven experience in developing and implementing solutions using Large Language Models (LLMs) and core Machine Learning techniques.
Amazon Data Services, Inc.
Covington, GA
The Data Center Construction Manager will be responsible for construction project management and oversight of construction-related activities as they relate to new builds or general capital projects, which includes ownership of the project scope, quality, schedule, and budget. Experience defining data center system-level architecture, documenting performance and equipment requirements, creating and communicating conceptual designs, and creating and maintaining project documentation.
Amazon Data Services, Inc.
Culpeper, VA
The Data Center Construction Manager will be responsible for construction project management and oversight of construction-related activities as they relate to new builds or general capital projects, which includes ownership of the project scope, quality, schedule, and budget. You will be on the construction site daily interacting with the construction trades as Amazon’s owner’s representative, and will be directly responsible for driving cost, schedule, and quality while managing construction vendors and contractors building data centers.
Dako
Dearborn, MI
You will be responsible for architecting the underlying systems that power our data ecosystem - integrating LLMs, RAG (Retrieval-Augmented Generation) solutions, and autonomous agents into enterprise-scale platforms on Google Cloud (GCP). This role is designed for an engineer who views data as a product and possesses a proven track record of building production-grade AI-powered applications and agentic workflows.
Blue Origin LLC
Seattle, WA
and/or transports placardable amounts of hazardous materials by ground in any vehicle on a public road while in commerce, may be subject to additional Federal Motor Carrier Safety Regulations including: Driver Qualification Files, Medical Certification (obtained before onboarding), Road Test, Hours of Service, Drug and Alcohol Testing (CDL drivers only), vehicle inspection requirements, CDL requirements (if applicable) and hazardous materials transportation/shipping training. To conform to U.S. Government commercial space technology export regulations, including the International Traffic in Arms Regulations (ITAR), 8 U.S.C. § 1324b(a)(3), applicants for employment at Blue Origin must be a U.S. citizen or national, lawfully admitted for permanent residence into the U.S. (i.e. current green card holder), or lawfully admitted as a refugee or granted asylum under 8 U.S.C. § 1157-1158.
JPMorgan Chase
$142500 - $185000
Wilmington, DE
As a Lead Data Engineer - Mainframe, DBA/IMS/DB2 at JPMorgan Chase within the Consumer & Community Banking - Card Platform Services Team, you are an integral part of an agile team that works to enhance, build, and deliver database requirements to meet product business deliverables by ensuring the data is always available, accessible, reliable, and recoverable. Chase is a leading financial services firm, helping nearly half of America's households and small businesses achieve their financial goals through a broad range of financial products.
JPMorgan Chase
$142500 - $185000
Jersey City, NJ
Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. As a Lead Data Engineer at JPMorgan Chase within Commercial & Investment Bank - Digital Platform Services, you are an integral part of the team that innovates new product offerings and leads the end-to-end product life cycle.
Charlie Health Engineering, Product & Design
$175000 - $255000
New York, NY
The Revenue Engine team at Charlie Health services all parts of our business related to revenue collection, financial reporting, and the patient financial experience by sourcing, curating, and activating internally and externally sourced datasets, and building user-facing products. As a data engineer on the Revenue Engine team, you will be responsible for building ELT pipelines, developing custom DAGs, transforming and warehousing data with DBT, and building integrations between our systems.
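The custom DAG development mentioned above boils down to running pipeline tasks in dependency order. That core idea can be sketched with the standard library's topological sorter; the task names below are invented, and a production team would typically express this in an orchestrator such as Airflow rather than plain Python.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on.
# extract -> transform -> dbt_build -> refresh_report
dag = {
    "transform": {"extract"},
    "dbt_build": {"transform"},
    "refresh_report": {"dbt_build"},
}

# static_order() yields tasks so every dependency runs before its dependents.
order = list(TopologicalSorter(dag).static_order())
```

An orchestrator adds scheduling, retries, and state on top of exactly this ordering; the DAG definition itself is the part a data engineer on such a team writes and maintains.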
Innovative Rocket Technologies Inc.
Hauppauge, NY
We’re recruiting a Data Engineer at iRocket to build pipelines, analytics, and tools that support propulsion test, launch operations, manufacturing, and vehicle performance. Proficient in Python, SQL, cloud data platforms (AWS, GCP, Azure), streaming/real-time analytics, and dashboarding (e.g., Tableau, PowerBI).
Open Systems Technologies
$100 - $130
New York, NY
A financial firm is looking for a Data Engineer - Databricks to join their team. Pay: $100-130/hr.
Conflux Systems, Inc.
$60
Atlanta, GA
This role focuses on assisting with data pipeline development, migration of legacy systems, and maintaining scalable, secure, and efficient data solutions using modern technologies, particularly Microsoft Fabric and Azure-based platforms. Technical Stack: • Microsoft Fabric (Dataflows, Pipelines, Notebooks, OneLake) • Azure Databricks • SQL Server / SQL Managed Instances • Power BI • SSIS (migration and maintenance) • LangGraph and RAG DB (for advanced data workflows). Qualifications: • Bachelor's degree in Computer Science, Information Systems, or a related field.
Akraya Inc.
$45 - $48
Sunnyvale, CA
Most recently, we were recognized as Stevie Employer of the Year 2025, SIA Best Staffing Firm to Work For 2025, Inc. 5000 Best Workplaces in the US (2025 & 2024), and Glassdoor's Best Places to Work (2023 & 2022)! We are seeking a Senior Data Engineer & Platform Architect to design, develop, and oversee a contemporary data platform aimed at enhancing engineering and manufacturing analytics.
PDS Inc, LLC
Seattle, WA
As a Senior DE you will create solutions that integrate with multiple heterogeneous data sources, aggregate and retrieve data quickly and safely, and curate data that can be used in reporting, analysis, machine learning models, and ad-hoc data requests. You should have excellent business and communication skills to be able to work with business owners, product teams, and tech leaders to gather infrastructure requirements, design data infrastructure, and build data pipelines and datasets to meet business needs.
University of North Texas System
Denton, Texas
Bachelor's degree in computer science, information technology, engineering, or a related field and six (6) years of progressively responsible experience in enterprise network engineering, operations, or support; or any equivalent combination of education, training, and experience. These include: shared resources of data centers, computing hardware, software applications, network and voice communications, collaboration and productivity capabilities, central web services, risk management, and security and compliance services.
Beechwood Computing Limited
$90 - $100
Blythewood, SC
10+ years of experience with advanced troubleshooting, routing, and switching. 10+ years of experience with monitoring and tracing tools.
Deloitte
$130800 - $241000
Seattle, WA
Preferred Qualifications:
* Advanced degrees such as a Master's or PhD are preferred
* Certifications in AI/ML technologies and cloud platforms, such as AWS Certified Machine Learning - Specialty, Google Cloud Professional Machine Learning Engineer, Azure AI Engineer, Azure Data Scientist, or Azure Solutions Architect
* 5+ years of experience in Data Science, Statistics, and Machine Learning
* 5+ years of experience in Generative AI/LLMs, preferably experienced in delivering and productionizing
* 5+ years of experience in machine learning model development, natural language processing, and data analysis; experienced in supervised and unsupervised learning, feature engineering, model training, and deployment
* 5+ years of experience implementing cloud-based AI/ML workloads on AWS or Microsoft Azure
* 6+ years of consulting experience leading delivery teams, including onshore and offshore team members
* 6+ years of experience gathering non-functional requirements and defining application architecture frameworks, including validation and testing deliverables
* 5+ years of experience working in an AI environment
* 5+ years of experience translating requirements into client-ready design documents
* 5+ years of experience in software application architecture analysis, design, and delivery
* 5+ years of experience executing full system development life cycle implementations
* Ability to travel 0-25%, on average, based on the work you do and the clients and industries/sectors you serve.
Kforce Inc.
$29.75 - $40.25
Doral, FL
Comfort owning complex technical areas in a long lived, evolving codebase, making pragmatic trade-offs, and delivering reliable solutions in production environments. Proven ability to work effectively in hybrid SwiftUI/UIKit codebases, supporting legacy implementations while driving incremental modernization.
DMS Vision Inc
Dallas, TX
Expert-level configuration and administration of FortiGate Next-Generation Firewalls (NGFW), FortiSwitch, and FortiAP, including advanced SD-WAN implementation, High Availability (HA) clustering, and proficiency in FortiManager for centralized orchestration and FortiAnalyzer for threat logging. The role requires strong expertise in Cisco Nexus switching and routing, enterprise firewall management (Fortinet FortiGate and Palo Alto), and hybrid connectivity solutions including Azure ExpressRoute.