Senior Enterprise Applications Engineer
Cox Automotive
Stone Mountain, GA
JOB DETAILS
SALARY
$92,300–$153,900
SKILLS
Accounting, Accounting Consolidation, Agile Programming Methodologies, Amazon Web Services (AWS), Analysis Skills, Analytical Development, Apache, Application Programming Interface (API), Artificial Intelligence (AI), Backlog Prioritization, Best Practices, Business Intelligence, Business Intelligence Software, Cloud Computing, Communication Skills, Continuous Deployment/Delivery, Continuous Improvement, Continuous Integration, Cross-Functional, Data Analysis, Data Management, Data Modeling, Data Quality, Data Science, Data Storage, Data Warehousing, Data Warehousing Applications, Database Extract Transform and Load (ETL), DevOps, Documentation, Ecosystems, Emerging Technology, Enterprise Applications, Establish Priorities, Finance, Finance Software, Financial Analysis, Financial Planning, Financial Planning and Analysis (FP&A), Financial Reporting, Forecasting, Functional Programming Languages, Git, GitHub, Informatica, Mentoring, Metadata, Microsoft SQL Server, Microsoft Windows Azure, MySQL, Needs Assessment, Organizational Development/Management, Performance Metrics, PostgreSQL, Power BI, Predictive Modeling, Presentation/Verbal Skills, Problem Solving Skills, Process Improvement, Project/Program Management, Regression Testing, Relational Databases (RDBMS), Reporting Dashboards, Requirements Management, SQL (Structured Query Language), Sales/Support Engineering (SE), Scalable System Development, Snowflake Schema, Software Development Lifecycle (SDLC), Software Engineering, Sprint Planning, Streaming Technology, Tableau, Technical Presentation, Test Plan/Schedule, Validation Testing, Waterfall Model of Software Development, Workflow Analysis
LOCATION
Stone Mountain, GA
POSTED
Today
This position interacts with a variety of teams, including Finance, Accounting, FP&A, and cross-functional technology groups, to understand business needs and deliver data-driven solutions. The insights and infrastructure provided by this individual will be used to inform prioritization decisions, support standard and ad-hoc reporting, and enable the development of new analytical capabilities. The Senior Engineer is expected to perform these duties with minimal daily oversight while mentoring junior team members, contributing to Agile planning, and staying current on emerging technologies.
What You'll Do:
Primary Responsibilities:
Data Engineering, Quality & Pipeline Development
- Design, build, and maintain scalable data pipelines and ETL/ELT workflows using Snowflake, AWS, dbt, and/or Informatica, integrating enterprise financial applications with data warehouses, relational databases (PostgreSQL, MySQL, SQL Server), and BI platforms.
- Ensure data integrity by implementing automated quality checks, regression testing, validation frameworks, and anomaly monitoring, working with internal and external data providers to customize data feeds and mappings.
Project Delivery & Solution Design
- Independently plan, manage, and deliver small to medium-sized projects end-to-end (requirements gathering, solution design, development, testing, deployment, documentation, and post-implementation support), while also contributing technical components to larger cross-functional programs.
Analytics, Reporting & Continuous Improvement
- Support dashboard and reporting development using Power BI, Tableau, or similar tools to deliver KPI insights, and lead enhancements that improve financial processes, reduce manual work, and increase accuracy.
- Leverage AI-assisted tools (e.g., Claude, GitHub Copilot, Snowflake Cortex, M365 Copilot) to accelerate development, data validation, and analysis workflows.
Agile Planning & Emerging Technologies
- Partner with finance and analytics teams to understand day-to-day challenges and design viable data solutions; recommend improvements to processes, technology, and interfaces that reduce technical debt.
- Stay current on new data technologies, AI, ML, Data Science, CPM platform innovations, and best practices; share insights and contribute to design standards across the organization.
Who Are You:
Minimum Requirements:
- Bachelor's degree in a related discipline and 4+ years of experience in data engineering or architecture. The right candidate could also have a different combination, such as a master's degree and 2 years' experience; a Ph.D. and up to 1 year of experience; or 16 years' experience in a related field.
- 4+ years of hands-on experience in data engineering, business intelligence, and/or enterprise analytics across multiple functional areas (reporting, dashboards, data pipelines, and data modeling).
- Advanced SQL proficiency and experience with data integration tools (e.g., MS SQL Developer, dbt, Informatica).
- Strong working experience with data modeling, data access, schemas, and data storage techniques within Snowflake.
- Working experience in design, development, and implementation of scalable data pipelines in cloud environments (AWS, Snowflake).
- Working experience with ETL/ELT patterns, data warehousing concepts, and data orchestration tools.
- Working experience with relational databases such as Microsoft SQL Server, MySQL, and PostgreSQL.
- Working experience with business intelligence tools and platforms (Power BI, Tableau, or similar).
- Working experience with data quality tools.
- Working experience with application lifecycle methodologies (e.g., waterfall, agile, iterative).
- Demonstrated project management experience with complex system implementations.
- Experience working with Git.
- Experience working with AI-related tools such as GitHub Copilot, Snowflake Cortex, or M365 Copilot.
- Experience implementing new tools within existing environments.
- Excellent analytical, problem-solving, and communication skills with the ability to present technical concepts to non-technical stakeholders.
Preferred Qualifications:
- Experience with AI/ML frameworks (e.g., TensorFlow, PyTorch, Scikit-learn) for data transformation and predictive modeling.
- Familiarity with data orchestration tools such as Apache Airflow, dbt, or Dagster.
- Hands-on experience with cloud-native data platforms (e.g., Snowflake, AWS Redshift, Azure Synapse).
- Knowledge of data governance and metadata management best practices.
- Experience integrating external data sources and APIs into enterprise data ecosystems.
- Strong understanding of CI/CD pipelines and DevOps practices for data engineering.
- Ability to work in Agile environments and contribute to sprint planning and backlog grooming.
- Exposure to real-time data streaming technologies (e.g., Kafka, Kinesis) is a plus.
Compensation:
Compensation includes a base salary in the range of $92,300.00 - $153,900.00. The base salary may vary within the anticipated base pay range based on factors such as the ultimate location of the position and the selected candidate's knowledge, skills, and abilities. Position may be eligible for additional compensation that may include an incentive program.
Benefits:
The Company offers eligible employees the flexibility to take as much vacation with pay as they deem consistent with their duties, the company's needs, and its obligations; seven paid holidays throughout the calendar year; and up to 160 hours of paid wellness time annually for their own wellness or that of family members. Employees are also eligible for additional paid time off in the form of bereavement leave, time off to vote, jury duty leave, volunteer time off, military leave, and parental leave.
About the Company