Data Engineer 4
Artech LLC
Salem, OR
Job Title: Data Engineer 4
Location: Portland, OR
Salary Range: $65/hr to $70/hr on W2
Introduction
Join our dynamic team where you will contribute to designing and implementing data products and features alongside product owners, data analysts, and business partners using Agile/Scrum methodologies. This is a senior-level role requiring experience in large-scale software and big data development.
Required Skills & Qualifications
- Applicants must be able to work directly on W2.
- Bachelor’s degree in Computer Science or a related technical discipline.
- 10 years of experience in large-scale software development, including 5 years of big data experience.
- Proficiency in programming languages such as Python or Scala.
- Experience working with Hadoop, Spark, Hive, and related processing frameworks.
- Strong SQL skills and understanding of data warehousing concepts.
- Experience with workflow orchestration tools like Apache Airflow.
- Proficiency with source control tools such as GitHub or Bitbucket.
- Effective communication skills, both verbal and written.
- Experience in Agile/Scrum application development.
Preferred Skills & Qualifications
- Experience with Java.
- Experience in a public cloud environment, particularly AWS and Databricks.
- Familiarity with cloud warehouse tools like Snowflake.
- Experience with NoSQL data stores such as HBase or DynamoDB.
- Experience building RESTful APIs for data consumption.
- Experience with infrastructure-as-code tools like Terraform or CloudFormation and CI/CD automation tools such as Jenkins or CircleCI.
- Knowledge of continuous delivery, continuous integration, and automated testing practices.
Day-to-Day Responsibilities
- Design and implement data products and features in collaboration with product owners and data analysts.
- Contribute to the architecture, frameworks, and patterns for processing and storing large data volumes.
- Implement distributed data processing pipelines using big data ecosystem tools and languages.
- Build utilities, user-defined functions, libraries, and frameworks to enhance data flow patterns.
- Work with engineering leads and other teams to ensure quality solutions are implemented and best practices are followed.
- Build and incorporate automated unit tests and participate in integration testing efforts.
- Utilize and advance continuous integration and deployment frameworks.
- Troubleshoot data issues and perform root cause analysis.
- Work across teams to resolve operational and performance issues.
Company Benefits & Culture
- Collaborative team environment focused on problem-solving and delivering results.
- Opportunities for continuous learning and knowledge sharing.
- Commitment to diversity and inclusion in the workplace.
For immediate consideration, please click APPLY to begin the screening process.
About the Company