
JobNob

Your Career. Our Passion.

Samta - Data Architect - Snowflake DB


SAMTA INFOTECH PRIVATE LIMITED


Location

Nagpur | India


Job description

Position : Data Architect (Snowflake)
Experience : 3 - 7 years
Location : Noida/Nagpur

Job Responsibilities :

Designing and implementing data architecture :
- Work with business stakeholders to understand data requirements and design the data architecture accordingly.
- Determine data storage and processing requirements, design data models, and create data pipelines to ingest, transform, and load data into Snowflake.

Data pipeline development :
- Develop and maintain data pipelines that extract, transform, and load data from various sources into Snowflake.
- Use tools such as SQL, Python, and ETL/ELT platforms to build scalable and efficient data pipelines.

Performance optimization and tuning :
- Optimize the performance of data pipelines and queries in Snowflake by implementing best practices for data loading, data transformation, and query optimization.
- Monitor the performance of the data platform and take proactive measures to improve efficiency.

Data security and compliance :
- Ensure that data within Snowflake is secure and compliant with data privacy regulations.
- Implement access controls, encryption, and other security measures to protect sensitive data.

Collaborating with stakeholders :
- Work closely with business stakeholders, data analysts, and data scientists to understand their data requirements and provide technical expertise.
- Collaborate with cross-functional teams to ensure that data solutions meet the needs of the organization.

Purpose of the Position :
As a Snowflake Data Engineer and Developer, this position requires candidates who are enthusiastic about specialized skills in Snowflake technology and its features. As a member of the team, you will help our clients progress on their Snowflake journey by building data models.

Work and Technical Experience :
- 5+ years of experience designing and developing efficient, scalable data models in Snowflake to support the organization's data analytics and reporting needs.
- Experience on platform transformation projects migrating data from multiple sources to Snowflake.
- Working experience migrating from on-premises and other cloud databases to Snowflake.
- Design and implement database objects such as tables, views, schemas, and stored procedures in Snowflake to organize and manage data effectively.
- Understand data pipelines and ETL processes to extract, transform, and load data from various sources into Snowflake using tools such as Snowflake's native features, Informatica, Talend, or other ETL tools.
- Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling principles.
- Optimize data models, queries, and data pipelines for performance and efficiency, including partitioning strategies, indexing, query optimization, and data compression techniques.
- Proficiency in SQL and experience with Snowflake's SQL dialect for querying and manipulating data.
- Monitor and maintain Snowflake database performance, availability, and reliability, including proactive monitoring, troubleshooting, and capacity planning.
- Collaborate with data engineers, data analysts, and business stakeholders to understand data requirements, define data models, and deliver data solutions that meet business needs.
- Document data models, database designs, ETL processes, and data governance policies to ensure clear understanding and maintainability of data solutions.
- Experience with performance tuning and optimization techniques for Snowflake databases and data pipelines.
- Provide technical support and training to users.
- Excellent problem-solving skills and the ability to work in a collaborative team environment.

Must Have : Snowflake, ETL Development, PL/SQL
Good To Have : Programming languages such as Python (ref:hirist.tech)
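For candidates new to the role, the extract-transform-load pattern described above can be sketched in a few lines of Python. This is only an illustrative outline: the source rows, the `stg_orders` table name, and the cleaning rules are hypothetical, and an actual pipeline would load via the snowflake-connector-python client (whose pyformat `%(name)s` placeholders the SQL builder below targets).

```python
# Minimal extract-transform-load sketch for a Snowflake staging load.
# The source rows, table name, and cleaning rules are illustrative
# assumptions, not part of the job specification.

def transform(rows):
    """Normalize raw source rows before loading into a staging table:
    cast IDs to int, trim/upper-case names, round amounts to 2 places."""
    cleaned = []
    for row in rows:
        cleaned.append({
            "order_id": int(row["order_id"]),
            "customer": row["customer"].strip().upper(),
            "amount": round(float(row["amount"]), 2),
        })
    return cleaned

def build_insert_sql(table, rows):
    """Build a parameterized INSERT suitable for cursor.executemany();
    %(name)s is the pyformat placeholder style the Snowflake Python
    connector accepts by default."""
    cols = ", ".join(rows[0].keys())
    params = ", ".join(f"%({c})s" for c in rows[0])
    return f"INSERT INTO {table} ({cols}) VALUES ({params})"

if __name__ == "__main__":
    raw = [{"order_id": "101", "customer": "  acme ", "amount": "19.999"}]
    staged = transform(raw)
    print(staged)
    print(build_insert_sql("stg_orders", staged))
    # Loading would then look like (credentials omitted):
    #   with snowflake.connector.connect(...) as conn:
    #       conn.cursor().executemany(
    #           build_insert_sql("stg_orders", staged), staged)
```

Keeping the transform a pure function of its input rows makes each pipeline stage independently testable before any data touches the warehouse.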

