Tredence Inc.
Location
Bangalore | India
Job description
Recognized as a Leader in Customer Analytics by Forrester Research, Tredence is a global analytics services and solutions company. We have been one of the fastest-growing private companies in the country for three straight years according to the Inc. 5000, and we continue to set ourselves apart from our competitors by attracting the best talent in the data analytics and data science space. Our capabilities range from data visualization and data management to advanced analytics, big data, and machine learning. Our uniqueness lies in building scalable big data solutions on on-prem/GCP/Azure cloud in a cost-effective and easily scalable manner for our clients. We also bring strong IP and pre-built analytics solutions in data mining, BI, and big data.
Role: Senior Databricks Engineer / Databricks Technical Lead / Data Architect
Experience: 3-8 years
Location: Bangalore, Chennai, Delhi, Pune, Kolkata.
Primary Roles and Responsibilities:
● Develop modern data governance solutions using Databricks and the AWS/Azure stack
● Provide forward-thinking solutions in the data engineering and analytics space
● Collaborate with data governance leads to understand new ETL pipeline development requirements
● Triage issues to find gaps in existing pipelines and fix them
● Work with the business to understand reporting-layer needs and develop data models to fulfill them
● Help junior team members resolve issues and technical challenges
● Drive technical discussions with client architects and team members
● Orchestrate data pipelines via the Airflow scheduler
Skills and Qualifications:
● Bachelor's and/or master's degree in computer science or equivalent experience.
● Must have 6+ years of total IT experience, including 3+ years in data warehouse/ETL projects.
● Deep understanding of Star and Snowflake dimensional modelling.
● Strong knowledge of Data Management principles
● Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
● Must have hands-on experience in SQL, Python, and Spark (PySpark)
● Must have experience with the AWS/Azure stack
● Desirable to have ETL experience with batch and streaming (e.g., Kinesis).
● Experience in building ETL / data warehouse transformation processes
● Experience with Apache Kafka for use with streaming data / event-based data
● Experience with other open-source big data products such as Hadoop (incl. Hive, Pig, Impala)
● Experience with open-source non-relational/NoSQL data stores (incl. MongoDB, Cassandra, Neo4j)
● Experience working with structured and unstructured data, including imaging and geospatial data
● Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git
● Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
● Databricks Certified Data Engineer Associate/Professional certification (desirable)
● Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects
● Should have experience working in Agile methodology
● Strong verbal and written communication skills.
● Strong analytical and problem-solving skills with a high attention to detail.
Mandatory Skills: Python/PySpark/Spark with Databricks on Azure/AWS, with data governance solutioning.