AI Data Engineer
Talent Software Services
Redmond, WA
JOB DETAILS
SALARY
$45–$50 Per Hour
JOB TYPE
Full-time, Employee
SKILLS
Artificial Intelligence (AI), Python, SQL, Apache Airflow, Apache Spark, Kubernetes, Scala, Data Modeling, Data Pipelines, ETL, Data Warehousing, Data Quality, Data Governance, Cloud Platforms (AWS, GCP, Azure)
LOCATION
Redmond, WA
POSTED
14 days ago
AI Data Engineer 1
Job Summary: Talent Software Services is in search of an AI Data Engineer for a contract position in Redmond, WA. The opportunity will be four months with a strong chance for a long-term extension.
Position Summary: This team is responsible for designing, developing, and maintaining data platforms, including efficient and reliable data pipelines. You will work closely with stakeholders across the company to gather business requirements, build data models, and ensure data quality and accessibility. Your expertise in Python, SQL, Airflow, and Spark will be crucial in optimizing our data infrastructure and enabling data-driven decision-making.
Primary Responsibilities/Accountabilities:
- Data Platform: Design, build, and maintain scalable data platforms and pipelines using Python, SQL, Airflow, and Spark.
- Business Requirements Gathering: Collaborate with stakeholders to understand and translate business requirements into technical specifications.
- Data Modeling: Develop and implement data models that support analytics and reporting needs.
- Data Quality and Governance: Ensure data accuracy, consistency, and reliability by implementing robust data validation and quality checks.
- Stakeholder Collaboration: Work with cross-functional teams, including data analysts, data scientists, and business leaders, to deliver high-quality data solutions.
- Performance Optimization: Continuously monitor and optimize data pipelines for performance, scalability, and cost-efficiency.
- Monitoring and Observability: Build and implement monitoring and observability metrics to ensure data quality and detect anomalies in data pipelines.
- Documentation and Communication: Maintain clear and comprehensive documentation of data processes and communicate technical concepts effectively to non-technical stakeholders.
Qualifications:
- Experience: 2 years of experience in data engineering and infrastructure.
- Technical Skills: Proficiency in data warehouse management, Python, SQL, Airflow, and Spark.
- Data Pipeline Expertise: Strong experience in building and maintaining robust data pipelines and ETL processes.
- Analytical Skills: Ability to gather business requirements and debug ingestion issues and other problems across the data warehouse.
- Communication: Excellent verbal and written communication skills, with the ability to convey technical information to non-technical audiences.
- Collaboration: Proven ability to work effectively in a collaborative, cross-functional environment.
- Education: A Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field.
- Technical Background: Data engineering experience that includes Python, SQL, Kubernetes, Airflow, and Scala.
Preferred:
- Experience with cloud platforms such as AWS, GCP, or Azure.
- Familiarity with data warehousing technologies (e.g., Delta Lake, Azure Fabric, Snowflake, Redshift, BigQuery).
- Knowledge of data governance and data security best practices.
If this job is a match for your background, we would be honored to receive your application!
Providing consulting opportunities to TALENTed people since 1987, we offer a host of opportunities, including contract, contract to hire, and permanent placement. Let's talk!