We are seeking a Senior Data Engineer to design, build, and maintain scalable data infrastructure that powers analytics, reporting, and strategic decision-making. This role focuses on developing robust data pipelines, API integrations, and governance frameworks to ensure high-quality, reliable, and accessible data. The ideal candidate combines hands-on engineering expertise with a strategic mindset to optimize the organization’s data ecosystem and drive business value.
Key Responsibilities
Design, implement, and maintain secure, scalable data pipelines in cloud environments (e.g., Azure), ensuring data integrity, availability, and performance.
Develop API-based integrations and workflows to handle growing data volumes and complexity across multiple systems.
Collaborate with analysts, report developers, and stakeholders to improve data models and accessibility in BI tools (Power BI).
Build and optimize ETL/ELT processes to transform and load data from diverse sources.
Monitor, troubleshoot, and resolve pipeline issues, maintaining high availability of critical data assets.
Partner with business and technology teams to define and evolve long-term data platform architecture.
Implement and maintain data quality frameworks, including lineage, metadata management, and anomaly detection.
Identify, integrate, and validate data sources to ensure compatibility, cleanliness, and compliance.
Utilize DevOps tools (e.g., Git, CI/CD) to deploy and manage data workflows.
Collaborate in Agile teams and cross-functional initiatives, providing technical guidance and support.
Mentor team members and contribute to other data engineering projects as assigned.
Minimum Qualifications
Proven ability to transform raw data into structured datasets that drive business decisions.
Strong problem-solving, troubleshooting, and prioritization skills.
Effective written and verbal communication skills with strong documentation practices.
Process-oriented mindset with a focus on data accuracy, security, and governance.
Education & Experience
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
5+ years of experience in data architecture, pipeline development, or related roles.
5+ years of hands-on experience with Python and SQL.
Solid experience in dimensional modeling, data warehousing, and database design.
Microsoft DP-203 certification preferred.
Preferred Experience
5+ years managing enterprise data migrations and transformations.
3+ years building and deploying pipelines in Azure Data Services (Data Factory, Data Lake, Databricks, Event Hubs).
3+ years of experience in Agile/DevOps environments, including CI/CD practices.
Familiarity with modern data quality and governance tools (e.g., Azure Purview).
About the Company
Peyton Resource Group
Established in 2001, Peyton Resource Group is a solution-based staffing company that matches businesses with top talent for short-term, long-term, or permanent needs.
People are a business’s most valuable asset. Peyton Resource Group is dedicated to helping companies find the best talent, matching professionals with jobs where they will thrive.
With locations in Dallas/Fort Worth, San Antonio and Austin, we are available to serve your staffing needs throughout Texas and across the country.