Jacksonville, FL · 28 days ago
Responsibilities:
- Design and build scalable Data Lakes, Data Warehouses, and Data Lakehouses
- Design and implement robust ETL/ELT processes at scale using Python and pipeline orchestration tools such as Airflow
- Develop ingestion workflows from diverse third-party APIs and data sources
- Manage and optimize file formats such as Parquet, Avro, and ORC for high-performance data retrieval
- Work with AI development tools to support machine learning initiatives and advanced analytics
- Act as a technical consultant: gather requirements, understand business goals, and translate them into technical roadmaps
- Work with Terraform and other tools to build AWS and on-prem infrastructure

Preferred qualifications:
- Familiarity with the fintech industry and financial data domains
- Documentation skills for data pipelines, architecture designs, and best practices
- OpenSearch or Elasticsearch
- AWS SageMaker Studio and Jupyter for data analysis
- Terraform
- Scala