Location
Herndon, VA | United States
Job description
REQUIRED QUALIFICATIONS:
- Demonstrated experience with cloud environments.
- Demonstrated experience building and optimizing large-scale data pipelines, architectures, and data sets.
- Demonstrated experience applying Sponsor-provided data models to the ETL process.
- Demonstrated experience manipulating, processing, transforming, and loading data into target data models.
- Demonstrated experience with ETL tools such as Databricks, Pentaho, or Informatica.
- Demonstrated experience performing the ETL process on a variety of data types (structured and unstructured).
- Demonstrated experience working with and developing capabilities on Linux and Windows.
- Demonstrated experience with object-oriented programming languages such as Python, Java, C++, or Scala.
- Demonstrated experience developing, testing, and maintaining Python programs as packages or notebooks.
- Demonstrated advanced working experience with Structured Query Language (SQL) database systems, particularly PostgreSQL and RDS.
- Demonstrated experience with big data tools such as Hadoop, Spark, or Kafka.
- Demonstrated experience with data pipeline and workflow management tools such as Airflow.
- Demonstrated experience using Elasticsearch.
- Demonstrated knowledge of message queuing, stream processing, and highly scalable data stores.
CLEARANCE:
- Full Scope Polygraph minimum.