
JobNob

Your Career. Our Passion.

Data Architect - Cloud


Global Data Management Inc


Location

Phoenix, AZ | United States


Job description

Position: Google Cloud Platform Data Architect

Location: Phoenix, AZ (Day 1 Onsite)

Duration: 12+ months

Onsite Locations: Phoenix, AZ first, then NYC or Sunrise, FL

Candidates must have 12+ years of experience.

Mandatory Skills:

  1. Extensive experience working with Google Cloud Platform data-related services such as Cloud Storage, Dataflow, Dataproc, BigQuery, and Bigtable
  2. Very strong experience with Cloud Composer and Apache Airflow; ability to set up, monitor, and debug a complex environment running a large number of concurrent tasks
  3. Good exposure to RDBMS / SQL fundamentals
  4. Exposure to Spark, Hive, Google Cloud Platform Data Fusion, Astronomer, Pub/Sub messaging, Vertex AI, and the Python programming language
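The RDBMS / SQL fundamentals called for above can be illustrated with a short, self-contained sketch. This example uses Python's built-in sqlite3 module purely as a stand-in engine (the posting names no specific RDBMS); the table and data are hypothetical:

```python
import sqlite3

# In-memory database standing in for any RDBMS engine.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A toy "jobs" table to exercise DDL, DML, and an aggregate query.
cur.execute(
    "CREATE TABLE jobs (id INTEGER PRIMARY KEY, city TEXT, openings INTEGER)"
)
cur.executemany(
    "INSERT INTO jobs (city, openings) VALUES (?, ?)",
    [("Phoenix", 3), ("NYC", 2), ("Sunrise", 1), ("Phoenix", 4)],
)
conn.commit()

# GROUP BY with an aggregate: total openings per city, busiest first.
cur.execute(
    "SELECT city, SUM(openings) AS total FROM jobs "
    "GROUP BY city ORDER BY total DESC"
)
totals = cur.fetchall()
conn.close()
```

The same DDL, parameterized inserts, and GROUP BY / ORDER BY query carry over almost unchanged to BigQuery's standard SQL dialect.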

Minimum Qualifications:

Bachelor's degree in Engineering or Computer Science (or equivalent), OR Master of Computer Applications (or equivalent).

Solid experience with, and understanding of, the considerations for large-scale architecting, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms on Google Cloud Platform is a must.

Create detailed target-state technical, security, data, and operational architecture and design blueprints that incorporate modern data technologies and cloud data services and demonstrate the modernization value proposition.

Minimum of 12 years of designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more Google Cloud Platform data and analytics services in combination with third-party tools (Spark, Hive, Cloud Dataproc, Cloud Dataflow, Apache Beam/Composer, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Storage, Cloud Functions, and GitHub), performing detailed assessments of current-state data platforms and creating an appropriate transition path to Google Cloud Platform.

Experience with data lake and data warehouse ETL build and design.

Experience with Google Cloud services such as streaming + batch processing, Cloud Storage, Cloud Dataflow, Dataproc, DFunc, BigQuery, and Bigtable. Proven ability in one or more of the following programming or scripting languages: Python, JavaScript, Java.



Job tags

Contract work

