Responsible for collaborating with cross-functional teams to integrate services and data across the platform, ensuring seamless communication and efficient data flow.

Duties:
• Leverage cloud technologies and services to build scalable and reliable solutions that meet business needs.
• Design, develop, and optimize streaming data pipelines using Kafka for real-time data processing.
• Build event-driven architectures using Apache Kafka to enable real-time data streaming and processing, improving responsiveness and agility.
• Implement and improve continuous integration and continuous deployment (CI/CD) pipelines to streamline software delivery.
• Set up robust alerting and monitoring systems to ensure system health and proactively address issues; implement data quality checks and validation processes.

May work remotely from anywhere in the US.

Requirements: Must have a bachelor's degree in engineering or a related field of study, and five years of post-graduate, progressive experience as a Sr. Software Engineer, Engineering Manager, Senior Engineer, or a related role. Experience must include the following:
• Building big data infrastructure for executing streaming and batch workloads.
• Building and operating big data infrastructure to support enterprise needs for processing and deriving insights from data across the enterprise.
• Building DevOps capabilities for CI/CD and cloud infrastructure automation.
• Collaborating with partner teams to onboard and support them in building solutions on the infrastructure.