Data Engineer (Cloud)


Fusion Global Solutions


Location

Sunnyvale, CA | United States


Job description

Data Engineer; Google Cloud Platform experience is a must.

Sunnyvale, CA. Non-local candidates will also be considered, provided they are ready to relocate from day one.

12+ years of experience.

Release Comments: Looking for a Senior Data Engineer with Spark, Scala, and Google Cloud Platform experience.

Must-Have Skills
Spark - 8+ years of experience
Scala - 8+ years of experience
Google Cloud Platform - 5+ years of experience
Hive - 8+ years of experience
SQL - 8+ years of experience
ETL Process / Data Pipelines - 8+ years of experience

Mandatory if Applicable
Domain Experience (if any): Retail preferred, ideally from past client engagements
Must-Have Certifications: No (if yes, provide dates and details of the account/project)
Location: Sunnyvale, CA
Onsite Requirement: Yes
Number of Days Onsite: 2 to 3 days to start with
Onsite Office Address: 860 W California Ave, Sunnyvale, CA 94086

Responsibilities:

As a Senior Data Engineer, you will:
Design and develop big data applications using the latest open source technologies.
Work in an offshore delivery model with managed outcomes (desired).
Develop logical and physical data models for big data platforms.
Automate workflows using Apache Airflow.
Create data pipelines using Apache Hive, Apache Spark, Scala, and Apache Kafka (a Spark/Scala sketch follows this list).
Provide ongoing maintenance and enhancements to existing systems and participate in rotational on-call support.
Learn our business domain and technology infrastructure quickly and share your knowledge freely and actively with others in the team.
Mentor junior engineers on the team.
Lead daily standups and design reviews.
Groom and prioritize the backlog using JIRA.
Act as the point of contact for your assigned business domain.
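
For illustration only, here is a minimal sketch of the kind of Spark/Scala batch pipeline over Hive tables that these responsibilities describe. The database, table, and column names are hypothetical placeholders, not details from this posting.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DailyStoreSales {
  def main(args: Array[String]): Unit = {
    // Spark session with Hive support so spark.table()/saveAsTable() use the metastore.
    val spark = SparkSession.builder()
      .appName("daily-store-sales")
      .enableHiveSupport()
      .getOrCreate()

    // Read raw order events from a Hive table (placeholder name).
    val orders = spark.table("retail_raw.orders")

    // Aggregate completed orders into daily revenue per store.
    val dailySales = orders
      .filter(col("order_status") === "COMPLETED")
      .groupBy(col("store_id"), to_date(col("order_ts")).as("order_date"))
      .agg(
        sum(col("order_total")).as("daily_revenue"),
        countDistinct(col("order_id")).as("order_count")
      )

    // Write the result back as a partitioned Hive table (placeholder name).
    dailySales.write
      .mode("overwrite")
      .partitionBy("order_date")
      .saveAsTable("retail_curated.daily_store_sales")

    spark.stop()
  }
}
```

In practice, a job like this would typically be scheduled as a task in an Apache Airflow DAG, as called out above.
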
Requirements:
8+ years of hands-on experience developing data warehouse solutions and data products.
4+ years of hands-on experience developing distributed data processing platforms with Hadoop, Hive, Scala, and Airflow or another workflow orchestration solution.
4+ years of experience with Google Cloud Platform, including GCS, Dataproc, and BigQuery (a brief Dataproc-style sketch follows this list).
2+ years of hands-on experience modeling (Erwin) and designing schemas for data lakes or RDBMS platforms.
Experience with programming languages: Python, Java, Scala, etc.
Experience with scripting languages: Perl, Shell, etc.
Experience working with, processing, and managing large data sets (multi-TB/PB scale).
Exposure to test-driven development and automated testing frameworks.
Background in Scrum/Agile development methodologies.
Capable of delivering on multiple competing priorities with little supervision.
Excellent verbal and written communication skills.
Bachelor's degree in Computer Science or equivalent experience.
The most successful candidates will also have experience in the following:
Gitflow
Atlassian products (Bitbucket, JIRA, Confluence, etc.)
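
As context for the Google Cloud Platform requirement above, the following is a minimal sketch of a Spark/Scala job of the kind that might run on Dataproc, reading a BigQuery table and landing it on GCS. It assumes the spark-bigquery connector is on the classpath (it ships with Dataproc); the project, dataset, table, and bucket names are placeholders.

```scala
import org.apache.spark.sql.SparkSession

object BigQueryToGcs {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("bigquery-to-gcs")
      .getOrCreate()

    // Read a BigQuery table through the spark-bigquery connector
    // (placeholder table name).
    val orders = spark.read
      .format("bigquery")
      .option("table", "my-project.retail.orders")
      .load()

    // Land a filtered copy on GCS as Parquet (placeholder bucket/path).
    orders
      .filter("order_total > 0")
      .write
      .mode("overwrite")
      .parquet("gs://my-bucket/curated/orders/")

    spark.stop()
  }
}
```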


Job tags

Contract work, Local area, Relocation, Offshore

