GCP Data Architect
Miracle Software
Dearborn, MI
JOB DETAILS
JOB TYPE
Full-time, Employee
SKILLS
Automation, Cloud Computing, Continuous Deployment/Delivery, Continuous Integration, Cost Control, Data Analysis, Data Management, Data Modeling, Data Modeling Tools, Data Sets, DataArchitect Data Modeling Tool, Database Design, Documentation, Documentation Models, Enterprise Architecture, GCP (Google Cloud Platform), Git, Java, Operational Audit, Performance Tuning/Optimization, Python Programming/Scripting Language, Query Analysis, Requirements Management, SAP, SQL (Structured Query Language), Scala Programming Language, Sybase PowerDesigner, Test Automation, Use Cases, erwin Data Modeler
LOCATION
Dearborn, MI
POSTED
15 days ago
Responsibilities:
- Design end-to-end enterprise data architectures for large-scale analytics and operational use cases.
- Translate business requirements into conceptual, logical, and physical data models.
Qualifications:
- Hands-on experience developing batch and/or streaming data pipelines on GCP using Dataflow.
- Hands-on experience building transformation layers with Dataform and/or dbt (modeling, testing, documentation, deployment patterns).
- Deep BigQuery experience, including schema design, partitioning/clustering strategies, and cost/performance optimization.
- Expert SQL skills for writing complex transformations and analytics queries across large datasets.
- Programming experience in at least one language (e.g., Python, Java, Scala) to support automation, pipeline logic, and data utilities.
- Strong proficiency with a data modeling tool such as SAP PowerDesigner and/or erwin Data Modeler for enterprise-grade modeling and documentation.
- Familiarity with CI/CD and Git-based workflows for data/analytics engineering (branching, reviews, automated tests, deployments).
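To illustrate the BigQuery partitioning, clustering, and cost-control skills listed above, here is a minimal sketch of the kind of DDL the role involves. All dataset, table, and column names (`analytics.page_events`, `event_ts`, etc.) are hypothetical examples, not taken from the posting:

```python
# Hypothetical BigQuery DDL showing time partitioning plus clustering.
# Partitioning prunes scanned bytes; clustering co-locates rows for
# cheaper filters and joins; the OPTIONS enforce cost controls.
CREATE_EVENTS_TABLE = """
CREATE TABLE analytics.page_events (
  event_ts   TIMESTAMP NOT NULL,
  user_id    STRING,
  event_name STRING,
  payload    JSON
)
PARTITION BY DATE(event_ts)       -- scan only the dates a query touches
CLUSTER BY user_id, event_name    -- sort within partitions for pruning
OPTIONS (
  partition_expiration_days = 90, -- auto-drop stale partitions
  require_partition_filter = TRUE -- reject queries that scan everything
);
"""
```

In practice a query filtered on `DATE(event_ts)` and `user_id` would then scan only the matching partitions and clustered blocks, which is the cost/performance optimization the posting asks for.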
Required Skills:
- GCP (Google Cloud Platform)
- BigQuery
- Data Governance
- Data Modeling
- SAP PowerDesigner and/or erwin Data Modeler