Sysmind LLC
Location
San Francisco, CA | United States
Job description
Position
- Snowflake Developer
Job Title
Snowflake Developer
Location Details
San Francisco, CA
Subject
Snowflake Developer job shared with you
Description
Dear {JobSeeker : firstname},
My name is Kunal Dhall and I am a Staffing Specialist at SYSMIND. I am reaching out to you regarding a job opportunity with one of our clients.
Job Title - Snowflake Developer
Location - San Francisco, CA
Required Skills - Azure, SQL, ETL
Job Description :
Technical Skills:
- Knowledge of SQL and cloud-based technologies
- Data warehousing concepts, data modeling, metadata management
- Data lakes, multi-dimensional models, data dictionaries
- Migration to the AWS or Azure Snowflake platform
- Performance tuning and setting up resource monitors
- Snowflake modeling: roles, databases, schemas
- SQL performance measuring, query tuning, and database tuning
- ETL tools with cloud-driven skills
- Integration with third-party tools
- Ability to build analytical solutions and models
- Coding in languages such as Python and Java
- Root cause analysis of models with solutions
- Hadoop, Spark, and other warehousing tools
- Managing sets of XML, JSON, and CSV files from disparate sources
- SQL-based databases such as Oracle, SQL Server, Teradata, etc.
- Snowflake warehousing, architecture, processing, and administration
- Data ingestion into Snowflake
- Enterprise-level technical exposure to Snowflake applications

Soft Skills:
- Project management
- Problem-solving
- Innovation and best coding practices
- Interpersonal, presentation, and communication skills
- Critical and out-of-the-box thinking
- Analytical, quantitative, problem-solving, and organizational skills
- Testing and test case preparation abilities

Responsibilities:
- Create, test, and implement enterprise-level applications with Snowflake
- Design and implement features for identity and access management
- Create authorization frameworks for better access control
- Implement novel query optimization and major security competencies with encryption
- Solve performance and scalability issues in the system
- Transaction management with distributed data processing algorithms
- Possess ownership right from start to finish
- Build, monitor, and optimize ETL and ELT processes with data models
- Migrate solutions from on-premises setups to cloud-based platforms
- Understand and implement the latest delivery approaches based on data architecture
- Project documentation and tracking based on understanding user requirements
- Perform data integration with third-party tools, including the architecting, designing, coding, and testing phases
- Manage documentation of data models, architecture, and maintenance processes
- Continually review and audit data models for enhancement
- Maintain an ideal data pipeline based on ETL tools
- Coordinate with BI experts and analysts on customized data models and integration
- Code updates, new code development, and reverse engineering
- Performance tuning, user acceptance training, and application support
- Maintain confidentiality of data
- Risk assessment, management, and mitigation plans
- Regular engagement with teams for status reporting and routine activities
- Migration activities from one database to another, or from on-premises to cloud
Full Name:
Current Location:
Annual Salary (for full-time) or Hourly Rate (for contract):
Work Authorization:
Earliest Available date to start:
Date and times available to interview:
Two Professional References (preferably supervisory references):
Best Regards,
Kunal Dhall