


DevSecOps/Data Engineer

Base-2 Solutions, LLC · Reston, VA · Full-Time

Required Security Clearance: Top Secret

City: Reston

State/Territory: Virginia

Travel: None

Potential for Teleworking: No

Schedule: Full Time

DoD 8570 IAT Requirement: None

DoD 8570 IAM Requirement: None

DoD 8570 IASAE Requirement: None

DoD CSSP Requirement: None

Do you like to design and develop big data solutions? If so, we have a DevSecOps/Data Engineer position to assist with the design, development, and implementation of alternative data ingestion pipelines to augment the National Media Exploitation Center (NMEC) data services. The DOMEX Data Discovery Platform (D3P) program is a next-generation machine learning pipeline platform providing cutting-edge data enrichment, triage, and analytics capabilities to Defense and Intelligence Community members. This engineer will collaborate as part of a cross-functional Agile team to create and enhance data ingestion pipelines and address big data challenges. You will work closely with the chief architect, systems engineers, software engineers, and data scientists on the following key tasks:

Fun stuff you will do on the job

  • Provide Extraction, Transformation, and Load (ETL) experience coupled with enterprise search capabilities to solve Big Data challenges
  • Design and implement high-volume data ingestion and streaming pipelines using Open Source frameworks like Apache Spark, Flink, Nifi, and Kafka on AWS Cloud
  • Leverage strategic and analytical skills to understand and solve customer and business centric questions
  • Create prototypes and proofs of concept for iterative development
  • Learn new technologies and apply the knowledge in production systems
  • Monitor and troubleshoot performance issues on the enterprise data pipelines and the data lake
  • Partner with various teams to define and execute data acquisition, transformation, and processing, and to make data actionable for operational and analytics initiatives
This is you
  • BS in Computer Science, Systems Engineering, or a related technical field, or equivalent experience, with at least 8 years in systems engineering or administration (6 years with an MS/MIS degree; 4 years for mid-level)
  • Must have an active Top Secret security clearance and be able to obtain a TS/SCI with polygraph
  • 2 years of experience with big data tools: Hadoop, Spark, Kafka, NiFi, Pulsar
  • 2 years of experience with object-oriented or functional scripting languages: Python (preferred) and/or Java
  • 2 years of experience managing data across relational SQL and NoSQL data stores such as MySQL, Postgres, Cassandra, HDFS, Redis, and Elasticsearch
  • 2 years of experience working in a Linux environment
  • Experience working with and designing REST APIs
  • Experience developing data ingest workflows with stream-processing systems: Spark Streaming, Kafka Streams, and/or Flink
  • Experience transforming data in various formats, including JSON, XML, CSV, and zipped files
  • Experience developing flexible ontologies to fit data from multiple sources and implementing the ontology in the form of database mappings / schemas
  • Good interpersonal and communication skills necessary to work effectively with customers and other team members.
You will wow us if you have these skills
  • Data engineering experience in Intelligence Community or other government agencies
  • Experience with Microservices architecture components, including Docker and Kubernetes.  
  • Experience with AWS cloud services: EC2, S3, EMR, RDS, Redshift, Athena and/or Glue
  • Experience with Jira, Confluence and extensive experience with Agile methodologies.
  • Knowledge of security best practices
  • Experience developing flexible data ingest and enrichment pipelines, to easily accommodate new and existing data sources
  • Experience with software configuration management tools such as Git/GitLab, Salt, Confluence, etc.
  • Experience with continuous integration and deployment (CI/CD) pipelines and their enabling tools such as Jenkins, Nexus, etc.
  • Detail-oriented and self-motivated, with the ability to learn and deploy new technology quickly

Recommended Skills

  • Administration
  • Agile Methodology
  • Amazon Elastic Compute Cloud
  • Amazon Redshift
  • Amazon Relational Database Service
  • Amazon S3

Job ID: 5RbvlM
