
JobNob


Azure Data Engineer


Tredence Analytics Solutions Private Limited


Location

Gurgaon | India


Job description

Role: Senior Databricks Engineer / Databricks Engineer 

Experience: 5-8 years

Location: Bangalore, Chennai, Delhi, Pune, Kolkata


About Tredence:

Tredence is a global data science solutions provider founded in 2013 by Shub Bhowmick, Sumit Mehra, and Shashank Dubey, focused on solving the last-mile problem in AI. Headquartered in San Jose, California, the company embraces a vertical-first approach and an outcome-driven mindset to help clients win and accelerate value realization from their analytics investments. The aim is to bridge the gap between insight delivery and value realization by giving customers a differentiated approach to data and analytics through tailor-made solutions. Tredence is 2,200-plus employees strong, with offices in San Jose, Foster City, Chicago, London, Toronto, and Bangalore, and counts the largest companies in retail, CPG, hi-tech, telecom, healthcare, travel, and industrials as clients.

Having completed 10 years, Tredence is on the cusp of an ambitious and exciting phase of expansion and growth.
• Tredence recently closed a USD 175 million Series B funding, which will help us build on growth momentum, strengthen vertical capabilities, and reach a broader customer base.
• Apart from our geographic footprint in the US, Canada, and the UK, we plan to open offices in Kolkata and a few tier-2 cities in India. In 2024, we also plan to hire more than 1,000 employees across markets.
• Tredence is a Great Place to Work (GPTW) certified company that values its employees and creates a positive work culture by providing opportunities for professional development and promoting work-life balance.
• At Tredence, nothing is impossible; we believe in pushing ourselves to limitless possibilities and staying true to our tagline, Beyond Possible.

We appreciate your interest in Tredence Inc. and wish you the very best.

Primary Roles and Responsibilities: 

● Develop modern data warehouse solutions using Databricks and the AWS/Azure stack

● Provide forward-thinking solutions in the data engineering and analytics space

● Collaborate with DW/BI leads to understand new ETL pipeline development requirements

● Triage issues to identify gaps in existing pipelines and fix them

● Work with the business to understand reporting-layer needs and develop data models to fulfil them

● Help junior team members resolve issues and technical challenges

● Drive technical discussions with client architects and team members

● Orchestrate data pipelines via the Airflow scheduler

Skills and Qualifications: 

● Bachelor's and/or master's degree in computer science or equivalent experience. 

● Must have 6+ years of total IT experience, including 3+ years in data warehouse/ETL projects.

● Deep understanding of Star and Snowflake dimensional modelling.

● Strong knowledge of Data Management principles 

● Good understanding of Databricks Data & AI platform and Databricks Delta Lake Architecture 

● Should have hands-on experience in SQL, Python and Spark (PySpark) 

● Must have experience with the AWS/Azure stack

● ETL experience with both batch and streaming (e.g. Kinesis) is desirable.

● Experience in building ETL / data warehouse transformation processes 

● Experience with Apache Kafka for use with streaming data / event-based data 

● Experience with other open-source big data products such as Hadoop (incl. Hive, Pig, Impala)

● Experience with open-source non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)

● Experience working with structured and unstructured data including imaging & geospatial data. 

● Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git.

● Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting

● Databricks Certified Data Engineer Associate/Professional Certification (Desirable). 

● Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects

● Should have experience working in Agile methodology 

● Strong verbal and written communication skills. 

● Strong analytical and problem-solving skills with a high attention to detail. 

Mandatory Skills: Python/PySpark/Spark with Azure/AWS Databricks
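To make the star-schema modelling skill above concrete, here is a minimal, hypothetical sketch: a sales fact table joined to two dimension tables for a typical reporting aggregate. It uses SQLite purely for illustration; all table and column names are invented for this example, not taken from the role description (on the job itself this pattern would typically be expressed in Spark SQL on Databricks).

```python
import sqlite3

# Hypothetical star schema: one fact table with foreign keys into two
# dimension tables (illustrative names only).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE fact_sales  (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    amount      REAL
);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
INSERT INTO dim_date    VALUES (20240101, 2024, 1), (20240201, 2024, 2);
INSERT INTO fact_sales  VALUES (1, 20240101, 100.0), (2, 20240101, 50.0), (1, 20240201, 75.0);
""")

# A typical reporting-layer query: aggregate the fact table by dimension attributes.
rows = cur.execute("""
    SELECT d.year, d.month, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.year, d.month
    ORDER BY d.year, d.month
""").fetchall()
print(rows)  # [(2024, 1, 150.0), (2024, 2, 75.0)]
```

The snowflake variant would further normalize the dimensions (e.g. splitting `category` into its own table); the fact table stays the same.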

