
JobNob

Your Career. Our Passion.

GCP Data Architect


Aequor Information Technologies Private Limited


Location

Chennai | India


Job description

Experience: 10+ years

Location: Kolkata, Chennai, Pune, Mumbai, Bangalore, Hyderabad, Bhubaneswar.

Notice Period: Immediate to 30 days

JD: GCP Data Architect with BigQuery, Dataflow, Airflow, Java or Python

Minimum 10-15 years of IT experience, with at least 3 years of development on GCP projects and 5 projects implemented
Possess in-depth knowledge and hands-on development experience in building Distributed Data Solutions (including ingestion, processing, consumption) (Must Have)
You have experience with developing winning themes and then writing technical responses to bids (RFPs & RFIs)
Strong Development Experience in at least one Distributed Big Data (bulk) processing engine, preferably using Spark on Dataproc or Dataflow (Must Have)
Strong understanding and experience with Cloud Storage infrastructure and operationalizing GCP-based storage services & solutions, preferably GCP Buckets or related (Must Have)
Strong experience on one or more MPP Data Warehouse Platforms, preferably BigQuery, Cloud SQL, Cloud Spanner, Datastore, Firestore or similar (Must Have)
Strong Development Experience on at least one event-driven streaming platform, preferably Pub/Sub, Kafka or related (Must Have)
Strong Development Experience with Networking on GCP (Must Have)
Strong Data Orchestration experience using tools such as Cloud Functions, Dataflow, Cloud Composer, Apache Airflow or related (Must Have)
Strong Development Experience building data pipelines using Kubernetes (Must Have)
Strong Development Experience in IAM, KMS, Container Registry
Assess use cases for various teams within the client company and evaluate pros and cons and justify recommended tooling and component solution options using GCP services, 3rd party and open source solutions (Must Have)
Strong technical communication skills and ability to engage a variety of business and technical audiences explaining features, metrics of Big Data technologies based on experience with previous solutions (Must Have)
Strong Data Cataloging experience preferably using Data Catalog (Must Have)
Strong Understanding and experience in Logging and Cloud Monitoring solutions (Must Have)
Strong Understanding of at least one or more Cluster Managers (YARN, Hive, Pig, etc.) (Must Have)
Strong knowledge and understanding of CI/CD processes and tools (Must Have)
Interface with client project sponsors to gather, assess and interpret client needs and requirements
Advise on database performance, alter the ETL process, provide SQL transformations, discuss API integration, and derive business and technical KPIs
Develop a data model around stated use cases to capture client's KPIs and data transformations
Assess, document and translate goals, objectives, problem statements, etc. to our offshore team and onshore management

