Software Engineer Calibration

Epitec, Inc

Dearborn, MI

JOB DETAILS
SALARY
$36.75–$42 Per Hour
SKILLS
Acceptance Testing, Application Programming Interface (API), Artificial Intelligence (AI), Atlassian JIRA, Authentication, Automotive Engineering, Automotive Manufacturing, Benchmarking, Bill of Materials (BOM), Bug Tracking/Defect Management, Calibration, Cloud Computing, Continuous Deployment/Delivery, Continuous Integration, Data Management, Documentation, Embedded Systems, Environmental Monitoring, Error Handling, Expert Systems, Financial Reporting, Functional Testing, GCP (Google Cloud Platform), ISTQB Foundation, Integration Testing, Leadership, Load Testing, Mail Services, Metrics, Microservices, Multiplatform/Cross-Platform, Performance Analysis, Performance Testing, Problem Solving Skills, Project/Program Management, Quality Assurance, Quality Assurance Methodology, Quality Control, Quality Engineering, Quality Management, Quality Metrics, REST (Representational State Transfer), Regression Testing, Reporting Dashboards, Risk, Scalability Testing, Service Level Agreement (SLA), Smoke Testing, Software Development Lifecycle (SDLC), Software Engineering, Software Testing, Statement of Work (SOW), System Validation, Team Player, Telemetry, Test Automation, Test Case, Test Data, Test Design, Test Driven Development (TDD), Test Plan/Schedule, Test Scenario, Test Suite, Test Tools, Testing, Unit Test, Use Cases, Writing Skills
LOCATION
Dearborn, MI
POSTED
Today

Job Title: Software Engineer Calibration

Location: Dearborn, MI

Job Type: Automotive/Engineering

Expected hours per week: 40 hours per week

Schedule: Monday–Friday, 9:00 AM to 5:00 PM; hybrid, 4 days in office and 1 day remote

Pay Range: $36.75–$42.00 per hour

Job Description:
· Own the formal Deliverable Acceptance process for TOP's external vendor engagement, reviewing each submitted Deliverable against the Acceptance Criteria defined in the vendor SOW and issuing written acceptance or a specific, actionable defect list
· Verify vendor container image deliverables against the Software Bill of Materials (SBOM), confirming that all declared components are present and no unapproved components are included
· Validate that vendor-delivered AI engine deployments comply with the client's technical architecture requirements, including confirming that fine-tuned model weights are stored in Vertex AI Model Registry and not embedded in container images
· Design, build, and maintain automated test suites for TOP platform backend services, APIs, and data pipelines
· Develop AI engine output evaluation frameworks that test inference quality against defined accuracy benchmarks for each dealer service use case
· Own the UAT (User Acceptance Testing) process for dealer-facing interfaces, coordinating with Service stakeholders to recruit pilot users, design test scenarios, and capture structured feedback
· Manage the Jira project bug triage workflow: creating Bug records for confirmed Tier 2 issues, assigning priority in alignment with SLA tiers, tracking vendor acknowledgment and resolution timelines, and reporting SLA compliance metrics
· Perform regression testing on each new container image version before the client authorizes production deployment
· Define test environments within the client's GCP project space in collaboration with the GCP Cloud Engineer, ensuring test environments accurately reflect production configurations
· Produce quality metrics reports for program leadership covering defect rates, SLA compliance, and test coverage across all TOP platform components
Skills Required:
API, Cloud Infrastructure, Google Cloud Platform, User Acceptance Testing, Application Testing, Software Testing, Test Cases, Jira, Ad Hoc Reporting, Integration Testing
1. API – 3–5 years designing, executing, and validating API test cases using tools such as Postman or REST-assured. This includes verifying request/response contracts, authentication flows, error handling, and integration behavior across microservices within the Telemetry & Observability Platform.
2. Cloud Infrastructure – 3–5 years of working knowledge of cloud-native infrastructure concepts including containerization, networking, IAM, and storage. Experience validating deployments and testing service behavior in cloud environments is expected.
3. Google Cloud Platform – 2–5 years of hands-on experience working within GCP, including familiarity with services such as Pub/Sub, BigQuery, GCS, or Cloud Run as they relate to testing data pipelines and telemetry workloads.
4. User Acceptance Testing – 2–5 years coordinating and executing UAT cycles with internal stakeholders and platform consumers, ensuring delivered features meet business and functional requirements before production release.
5. Application Testing – 3–5 years developing and executing test plans covering functional, regression, and smoke testing across platform applications and services throughout the SDLC.
6. Software Testing – 3–5 years of broad software testing experience including unit, integration, system, and end-to-end testing, with the ability to contribute to or maintain automated test suites.
7. Test Cases – 2–5 years authoring clear, traceable, and reusable test cases tied to acceptance criteria, with documentation maintained in Jira or a comparable test management tool.
8. Jira & Ad Hoc Reporting – 2–5 years creating ad hoc Jira queries, dashboards, and filters to surface test coverage, defect trends, and sprint quality metrics for engineering and leadership audiences.
9. Integration Testing – 3–5 years planning and executing integration tests that validate end-to-end data and event flows across platform services, including ingestion, processing, and telemetry delivery pipelines.
Skills Preferred:
Artificial Intelligence & Expert Systems, Dynatrace, Quality Assurance Concepts and Standards, Quality Assurance/Control, Testing - Performance
1. Artificial Intelligence & Expert Systems – 1–3 years of familiarity with AI-assisted testing approaches, including the use of LLM-based tools for test generation, anomaly detection, or intelligent defect triage within developer workflows.
2. Dynatrace – 1–3 years using Dynatrace for observability-driven testing, including leveraging traces, dashboards, and alerting to validate system health and performance during test cycles.
3. Quality Assurance Concepts and Standards – 2–5 years applying QA methodologies such as shift-left testing, risk-based testing, and test-driven development, with experience maintaining consistent quality standards across teams.
4. Quality Assurance/Control – 2–5 years owning quality gates within a CI/CD pipeline, including defining acceptance thresholds, managing test environments, and working with engineering teams to resolve defects efficiently.
5. Testing – Performance – 2–4 years designing and running load, stress, or scalability tests against platform services, with the ability to interpret results and collaborate with engineers to resolve bottlenecks.
Experience Required:
4 or more years of professional software quality assurance experience, with demonstrated experience in cloud-native service testing
· Experience performing formal vendor Deliverable acceptance reviews against documented acceptance criteria in a contractual delivery context
· Proficiency in test automation frameworks for API and service testing, such as pytest, Postman, or equivalent
· Experience testing containerized applications in GCP or equivalent cloud environments, including container deployment validation and service integration testing
· Familiarity with Jira for bug management and workflow configuration; experience managing structured escalation workflows in Jira is strongly preferred
· Experience designing and executing UAT programs with real end users, including test scenario design, facilitating sessions, and synthesizing structured feedback
· Ability to write clear, specific defect reports that enable developers to reproduce and resolve issues without further clarification
· Strong understanding of software acceptance criteria and the ability to evaluate whether a delivered artifact meets them objectively and defensibly
Experience Preferred:
· Experience evaluating LLM or AI model outputs for production quality, including prompt regression testing and AI output consistency validation
· Familiarity with Dynatrace or equivalent APM (Application Performance Monitoring) tooling for test environment monitoring
· Experience in automotive or manufacturing quality contexts where structured acceptance processes are the norm
· ISTQB certification or equivalent formal QA training
· Experience with performance and load testing for cloud-native APIs serving high-volume, real-time workloads
Education Required:
 
Education Preferred:
 
Additional Safety Training/Licensing/Personal Protection Requirements:
 
Additional Information:
***HYBRID / 4 days per week in the office***
Benefits: 80 hours of paid time off; medical, dental, and vision insurance contributions; and our 401(k) retirement savings plan

#indoem #LI-JZ1


About the Company

Epitec, Inc

Epitec is a leading staffing and recruiting services company with a mission to make staffing personal. We go beyond traditional hiring by truly understanding our candidates and matching them with the perfect opportunities. We offer competitive compensation, career growth, and support throughout the entire process. Working with top Fortune 500 companies, we are recognized for our excellence with numerous awards, including Best & Brightest and diversity recognitions. At Epitec, we're redefining the future of employment. 

COMPANY SIZE
2,500 to 4,999 employees
INDUSTRY
Staffing/Employment Agencies
EMPLOYEE BENEFITS
Professional Development, 401K, Employee Referral Program, Life Insurance
FOUNDED
1978
WEBSITE
https://epitec.com/