GCP Data Engineer | Remote
Location
West Virginia | United States
Job description
Hi there,
Please find the requirement below and let me know if you are interested!
Title: Senior Software Engineer
Location: Remote, US
Duration: Long Term
Required Qualification:
- Experience writing Python and PySpark scripts.
- Knowledge of Apache Beam, Apache Spark, Pub/Sub or Kafka, and IAM.
- Basic understanding of cloud networking and infrastructure.
- Hands-on experience building streaming and batch pipelines.
- Experience with Cloud Functions or Cloud Run.
- Working knowledge of CI/CD tools like Jenkins or Cloud Build, and a code management tool like Git.
- Excellent knowledge of SQL, along with its variations for popular cloud databases like BigQuery, Cloud SQL, and Spanner.
- Experience with relational databases like MySQL, Oracle, and PostgreSQL.
- Development experience building ETL pipelines using cloud tools like Dataflow and Lambda.
- Experience in tuning SQL queries to maximize performance.
- Working knowledge of implementing data quality checks.
- Experience with Airflow/Composer or Tidal to orchestrate data pipelines.
- Excellent critical reasoning, problem-solving, and teamwork skills.
- Solid written and verbal communication skills, with the ability to articulate complex solutions to technical and non-technical personnel.
- Experience working with clients in the healthcare space.
'Must-Have' Experience:
- Experience in the healthcare domain.
- Hands-on with Google Cloud services for data engineering: Dataflow, Dataproc, BigQuery, Composer, Pub/Sub.
- Managed end-to-end data pipelines, from extraction at the source, to landing on the Google Cloud platform, to transformation.
- Implemented data management best practices: data quality and capture of metadata.
- Experience with DevOps and GKE patterns.
Kishore Kallem
Accounts Manager
Email: | Web:
100 Overlook Center, Suite 200
Princeton, NJ 08540.