Big Data Engineer
Artech LLC
Chicago, IL
JOB DETAILS
SKILLS
Agile Programming Methodologies, Analysis Skills, Apache HBase, Apache Hadoop, Apache Hive, Apache Spark, Apache Sqoop, Application Programming Interface (API), Big Data, Cloud Computing, Cloudera, Community Support, Cross-Functional, Customer Experience, Data Analysis, Data Formats, Data Management, Data Quality, Distributed Computing, Elasticsearch, Git, Hortonworks, JSON, Jenkins, Linux Operating System, MapReduce, Messaging Middleware, MySQL, NoSQL, PHP Scripting Language (PHP Hypertext Preprocessor), Performance Tuning/Optimization, Problem Solving Skills, Programming Languages, Python Programming/Scripting Language, REST (Representational State Transfer), Relational Databases (RDBMS), SOLR, SQL (Structured Query Language), Scala Programming Language, Software Development Lifecycle (SDLC), Source Code Control System (SCCS), Source Code/Configuration Management (SCM), Subversion, Team Player, Unix Shell Programming, XML (eXtensible Markup Language)
Introduction
We invite you to join our team as a Big Data Developer/Engineer. We are a tight-knit, supportive community passionate about delivering the best experience for our customers while remaining sensitive to their unique needs.
Required Skills & Qualifications
- Strong SQL skills – one or more of MySQL, Hive, Impala, Spark SQL
- Data ingestion experience from message queues, file shares, REST APIs, relational databases, etc., and experience with data formats such as JSON, CSV, and XML
- Experience working with Spark Structured Streaming
- Experience working with Hadoop/Big Data and Distributed Systems
- Working experience with Spark, Sqoop, Kafka, MapReduce, NoSQL databases such as HBase, Solr, CDP or HDP (Cloudera or Hortonworks), Elasticsearch, Kibana, etc.
- Hands-on programming experience in at least one of Scala, Python, PHP, or shell scripting
- Performance tuning experience with Spark/MapReduce or SQL jobs
- Experience and proficiency with the Linux operating system is a must
- Experience with the end-to-end design and build of near-real-time and batch data pipelines
- Experience working in an Agile development process and a deep understanding of the phases of the Software Development Life Cycle
- Experience using source code and version control systems such as SVN, Git, or Bitbucket
- Experience working with Jenkins and JAR management
- A self-starter who works with minimal supervision and can collaborate in a team with diverse skill sets
- Ability to comprehend customer requests and provide the correct solution
- Strong analytical mind to help tackle complicated problems
- Drive to investigate and resolve issues, including digging into potential problems before they surface
- Ability to adapt and continually learn new technologies
Preferred Skills & Qualifications
- Experience in additional programming languages or technologies
- Experience with cloud platforms
- Advanced data analytics skills
Day-to-Day Responsibilities
- Design, develop, and maintain data pipelines and architecture
- Collaborate with cross-functional teams to meet project goals
- Monitor and optimize system performance and data integrity
Company Benefits & Culture
- Inclusive and diverse work environment
- Opportunities for professional growth and development
- Comprehensive benefits package
For immediate consideration, please click APPLY to begin the screening process.
About the Company