Data Engineer - TS/SCI + Full Scope Poly

Tallon Recruiting and Staffing McLean, VA Full-Time

We are recruiting for multiple Data Engineer positions supporting artificial intelligence (AI) and advanced analytics across a variety of data engineering projects. Qualified Data Engineers will work in a DevOps environment and should have experience with CI/CD and familiarity with manipulating unstructured data in a data analytics environment. This work supports a multi-year contract. All positions are full time and offer excellent benefits.

Location Note: All work is performed on-site.

Role and responsibilities include: 

  • Support the data analyst and data scientist teams by building and maintaining continuous pipelines that filter large pools of information into data sets available for analysis.

  • Work with open-source tools

  • Support cloud computing, machine learning, and data visualization areas

  • Other duties as assigned

Requirements include:

  • U.S. citizenship

  • Current TS/SCI security clearance and Full Scope Polygraph

  • 7+ years of related data engineering experience, including one or more data-related technologies such as Apache Hadoop or similar

  • 3+ years of overall experience supporting data analytics, preferably in a DoD or Intelligence Community environment

  • Bachelor's degree in a related field (Master's degree in a related field may be substituted for 3 years of related experience)

There are multiple openings and technical requirements vary by position and may include one or more of the following:

  • Experience deploying scripts with AWS

  • Experience with ETL code and tools

  • Programming experience using Python, Java, Perl, C, C++

  • Experience with ELK Stack (Elasticsearch, Logstash, Kibana)

  • Experience maintaining and optimizing Elastic clusters

  • Experience working and developing capabilities in Linux and Windows

  • Experience automating project builds using shell scripts, Jenkins, and Makefiles

  • Experience with Linux bash shell scripting

  • Experience with SQL database systems including PostgreSQL and MySQL/MariaDB

  • Experience with large-scale, massively parallel distributed data processing tools including Spark, Hadoop, Presto, and NiFi

  • Experience with HTML, JavaScript, Regular Expressions, JSON

  • Experience with API usage and integration

  • Experience building and setting up a Python web server with Flask

  • Experience working with varied file types, including text, image, video, audio, and binary

  • Experience working with geospatial tools and data

Preferred Skills/Experience Areas include:

  • Experience using automated data frameworks

  • Experience with automated deployment tools including CloudFormation, Jenkins, and Docker

  • Experience building microservices with FastAPI

  • Experience with PostgreSQL and MySQL/MariaDB


Recommended Skills

  • API
  • Amazon Web Services
  • Apache Hadoop
  • Apache NiFi
  • Apache Spark
  • Artificial Intelligence

Job ID: 979
