
JobNob


Snowflake Developer - DW / Azure / SQL / Python


System Soft Technologies


Location

Delhi | India


Job description

Job Summary: Design and implement large-scale data solutions, applying strong knowledge of the SDLC process.

Top Skills: SQL / Python / Azure / Snowflake / ETL

Qualifications:

- 5+ years of experience designing and implementing large-scale data solutions, with strong knowledge of the SDLC process.
- 3+ years of experience working with the Snowflake Cloud Data Warehouse.
- Strong expertise in the healthcare insurance domain, specifically health plan data (claims, membership).
- Snowflake proficiency: hands-on development experience with features such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, Zero-Copy Cloning, the query optimizer, metadata management, data sharing, COPY INTO, and stored procedures.
- Cloud expertise: strong understanding of Snowflake-on-Azure architecture and the design, implementation, and operationalization of large-scale data and analytics solutions.
- Data pipeline development: create automated pipelines using batch and streaming mechanisms to ingest and process structured and unstructured data from source systems into analytical platforms.
- SQL expertise: write complex queries for testing and validation activities; proficient in both testing and development, including Type 1 and Type 2 slowly changing dimensions, fact tables, and MERGE statements.
- Python expertise: develop and maintain robust data pipelines to ingest data from various sources (SFTP, SharePoint, etc.) into Azure Data Lake Storage (ADLS) Gen2; implement efficient file handling, including unzipping, decryption, and parsing, while ensuring data integrity and security; implement comprehensive logging and error handling for data pipelines to enable issue identification, troubleshooting, and optimization.
- Data warehousing: extensive experience with OLTP, OLAP, dimensions, facts, and data modeling.
- ETL/ELT expertise: professional experience with ETL/ELT processes.
- Data transformation: translate mapping specifications into data transformation design and development strategies, incorporating best practices for optimal execution.
- Quality assurance: test ETL workflows, ensure data meets standards, and understand RDBMS systems.
- Data pipeline automation: understand modern cloud-based data pipeline automation methods and document implementations clearly.
- Collaboration and communication: perform code reviews, provide production support, and communicate effectively with cross-functional teams.
- Technical documentation: build and maintain detailed documentation for all data processing workflows, adhering to best practices.
- Test plan development: develop test plans and scripts for all data processing workflows.
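To illustrate the Type 1/Type 2 slowly-changing-dimension handling the role calls for, here is a minimal sketch of Type 2 merge logic in plain Python. The record layout (natural key, `attr`, validity window, current flag) is an assumption for illustration; in practice this would be a Snowflake MERGE statement against a dimension table.

```python
from datetime import date

def scd2_merge(dim_rows, source_rows, load_date):
    """Type 2 SCD merge sketch (hypothetical row layout, not the employer's code):
    expire changed current rows and insert new versions; insert unseen keys."""
    current = {r["key"]: r for r in dim_rows if r["is_current"]}
    out = list(dim_rows)
    for src in source_rows:
        existing = current.get(src["key"])
        if existing is None:
            # New key: insert as the current version.
            out.append({"key": src["key"], "attr": src["attr"],
                        "valid_from": load_date, "valid_to": None, "is_current": True})
        elif existing["attr"] != src["attr"]:
            # Changed attribute: close the old version, open a new one (Type 2).
            existing["valid_to"] = load_date
            existing["is_current"] = False
            out.append({"key": src["key"], "attr": src["attr"],
                        "valid_from": load_date, "valid_to": None, "is_current": True})
        # Unchanged rows are left as-is (a Type 1 variant would overwrite in place).
    return out
```

The same expire-and-insert pattern maps directly onto a single Snowflake `MERGE ... WHEN MATCHED / WHEN NOT MATCHED` statement.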
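The file-handling requirement (unzip, parse, log, and keep going on bad inputs) can be sketched with the Python standard library alone. This is an illustrative fragment, not the employer's pipeline: the function name is hypothetical, and the decryption step and the upload to ADLS Gen2 are omitted.

```python
import csv
import io
import logging
import zipfile

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingest")

def extract_csv_rows(zip_bytes):
    """Unzip an in-memory archive and parse each CSV member, logging and
    skipping members that fail instead of aborting the whole batch."""
    rows, errors = [], []
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        for name in zf.namelist():
            try:
                with zf.open(name) as fh:
                    text = io.TextIOWrapper(fh, encoding="utf-8")
                    rows.extend(csv.DictReader(text))
                log.info("parsed %s", name)
            except Exception:
                # Full traceback goes to the log for troubleshooting; the
                # member name is collected so the batch can be reconciled.
                log.exception("failed to parse %s", name)
                errors.append(name)
    return rows, errors
```

Collecting per-file errors rather than raising keeps one corrupt member from blocking the rest of a nightly load, which is the error-handling posture the posting describes.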

Preferred Skills:

- Snowflake certification (e.g., SnowPro)
- SSIS, SSRS, and Tableau knowledge
- Agile/Scrum methodology experience (JIRA preferred)
- Version control tool experience (preferably Bitbucket)

