Job Title: Data Engineer
Duration: 12 months
Position Location: Lansing, MI
On-site: Hybrid position - onsite 2 days per week

Description:
• Lead the design and development of scalable, high-performance solutions using AWS services.
• Experience with Databricks, Elasticsearch, Kibana, and S3.
• Experience with Extract, Transform, Load (ETL) processes and data pipelines.
• Write clean, maintainable, and efficient code in Python/Scala.
• Experience with AWS cloud-based application development.
• Experience with Electronic Health Records (EHR) HL7 solutions.
• Implement and manage an Elasticsearch engine for efficient data retrieval and analysis.
• Experience with data warehousing, data visualization tools, and data integrity.
• Execute the full software development life cycle (SDLC), including gathering requirements and writing functional/technical specifications for complex projects.
• Excellent knowledge of designing both logical and physical database models.
• Develop database objects, including stored procedures and functions.
• Extensive knowledge of source control tools such as Git.
• Develop software design documents and work with stakeholders for review and approval.
• Exposure to flowcharts, screen layouts, and documentation to ensure logical flow of system requirements.
• Experience working on large agile projects.
• Experience or knowledge of creating CI/CD pipelines using Azure DevOps.

Top Skills & Years of Experience:
• 12+ years developing complex database systems.
• 8+ years with Databricks.
• 8+ years using Elasticsearch and Kibana.
• 8+ years using Python/Scala.
• 8+ years with Oracle.
• 5+ years' experience with Extract, Transform, Load (ETL) processes and developing data pipelines.
• 5+ years' experience with AWS.
• 5+ years' experience with data warehousing, data visualization tools, and data integrity.
• 5+ years using CMM/CMMI Level 3 methods and practices.
• 5+ years implementing agile development processes, including test-driven development.