
Data Architect - Google Cloud Platform


ReYcruit Consultancy


Location

Hyderabad | India


Job description

- Minimum of 5 years of experience designing and building production data pipelines, from data ingestion to consumption, within a hybrid big data architecture using GCP, Hadoop, Hive, HDFS, HBase, Spark, etc.
- Minimum of 3 years of experience architecting and implementing next-generation data and analytics platforms on GCP.
- Expertise in at least one of the following programming languages: Scala, Java, or Python.
- Experience with data lake and data warehouse ETL build and design, and with data migration from legacy systems including Hadoop, Exadata, Oracle, Teradata, or Netezza.
- Active Google Cloud Data Engineer Certification or active Google Professional Cloud Architect Certification.
- Demonstrable knowledge of and experience with Google Cloud BigQuery is mandatory.
- Experience with Google managed services such as Cloud Storage, BigQuery, Bigtable, Dataflow, Dataproc, Cloud Composer, Cloud Pub/Sub, and Data Fusion.
- Good understanding of batch and streaming GCP architecture.
- Expertise in building modern, cloud-native data pipelines and operations with an ELT philosophy (see the illustrative sketch after this listing).
- Experience with Agile methodologies and DevOps/CI-CD principles.
- Strong knowledge of data technologies and data modeling.
- As a lead architect, work with implementation teams from concept to operations, providing deep technical subject matter expertise for successfully deploying large-scale data solutions in the enterprise using modern data/analytics technologies, both on-premises and in the cloud.
- Lead cloud solutioning and scoping to generate estimates and approaches for customer proposals and SOWs.
- Create detailed target-state technical architecture and design blueprints that incorporate modern data technologies and cloud data services and demonstrate the modernization value proposition.
- Conduct full technical discovery, identifying pain points, business and technical requirements, and "as is" and "to be" scenarios.
- Own data migration and data pipeline migration to GCP, establishing best practices, guidelines, and the right set of architectures.

Must have:
- GCP (Data Architect)
- GCP services: BigQuery, Cloud Storage, Bigtable, Dataflow, Dataproc, Cloud Composer, Cloud Pub/Sub, and Data Fusion
- Expertise in one of the following programming languages: Scala, Java, or Python
- Hadoop, Hive, HDFS, HBase, Spark

(ref: hirist.tech)
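For illustration only, a minimal sketch of the ELT pattern referenced above, using the google-cloud-bigquery Python client: raw files are loaded from Cloud Storage into a staging table, then transformed inside BigQuery with SQL. The project, bucket, dataset, and table names are hypothetical placeholders, not part of this posting.

from google.cloud import bigquery

# Hypothetical project ID; in practice this comes from the environment/credentials.
client = bigquery.Client(project="example-project")

# Extract/Load: pull raw CSV files from Cloud Storage straight into a staging table.
load_job = client.load_table_from_uri(
    "gs://example-bucket/raw/orders/*.csv",        # hypothetical bucket and path
    "example-project.staging.orders_raw",          # hypothetical staging table
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    ),
)
load_job.result()  # block until the load completes

# Transform: the "T" in ELT runs inside BigQuery as SQL, not in the pipeline code.
transform_sql = """
CREATE OR REPLACE TABLE `example-project.analytics.daily_order_totals` AS
SELECT order_date, SUM(amount) AS total_amount
FROM `example-project.staging.orders_raw`
GROUP BY order_date
"""
client.query(transform_sql).result()

In production, steps like these would typically be scheduled and orchestrated with Cloud Composer rather than run inline.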


