Location
Dublin, CA | United States
Job description
Please note: As of July 22, 2021, our team requires that all candidate submissions include a LinkedIn profile. Please do not submit candidates who do not have a LinkedIn profile.
NA has a large retail client that is looking to hire 4 Data Engineers in Dublin, CA. This role is onsite 2 days a week.
Summary:
As a key member of the Data Engineering team, this Data Engineer will work with diverse data technologies such as Azure Cloud, Snowflake, StreamSets, dbt, Starburst, DataOps, Data Observability, and others to build insightful, scalable, and robust data pipelines that feed our various analytics platforms.
iLabor Jobs Details
Client Industry
Full Job Description
See Primary Skills
Visa Restrictions?
None
Locals Only/ Out of Area/ Remote?
Hybrid in Dublin, CA - open to candidates willing to relocate
Interview Type (Phone, Video, Face to Face)
Video
Anticipated Start (in weeks)
2 weeks
3-5 Must Haves
- 10+ years of experience in data engineering
- 3+ years of experience in data architecture
- Snowflake
- dbt
- Azure
- Strong communication skills
Required Skills:
- DBA experience is a must-have; the client wants someone with an architect background
- 10 years of in-depth data engineering experience, including execution of data pipelines, DataOps, data observability, scripting, and SQL queries
- 5 years of proven data architecture experience - must have demonstrable experience with data architecture, accountability for data standards, and designing data models for data warehousing and modern analytics use cases (e.g., from operational data store to semantic models)
- 5 years of hands-on experience with data warehouse design, development, and data modeling best practices for modern data architectures
- At least 3 years of experience with modern data architectures that support advanced analytics, including Snowflake, Azure, etc.; experience with Snowflake and other cloud data warehousing/data lake platforms preferred
- Experience with modern data modeling and data preparation tools
- Experience adding data lineage and technical glossary entries from data pipelines to data catalog tools
- Hands-on DevOps/DataOps experience
- Expert in engineering data pipelines with various data technologies - ETL/ELT and big data technologies (Hive, Spark) - on large-scale data sets, demonstrated through years of experience
- Knowledge of or working experience with reporting tools such as MicroStrategy and Power BI
- Highly skilled in data orchestration, with experience in tools like Ctrl-M and Apache Airflow
- Highly proficient in data analysis: analyzing SQL, Python scripts, and ETL/ELT transformation scripts
- Highly proficient in at least one of these programming languages: Java, Python
- Must have good soft skills
- Experience with StreamSets and dbt preferred
Rank: A3
Requested Date: 2024-02-01