
Hadoop Developer


Nisum


Location

Hyderabad | India


Job description

- At least 5 years of experience building ETL/ELT, data warehousing, and big data solutions.
- At least 5 years of experience building data models and pipelines to process large datasets.
- At least 3 years of experience with Python, Spark, Hive, Hadoop, Kinesis, and Kafka.
- Proven expertise in relational and dimensional data modeling.
- Understanding of PII standards, processes, and security protocols.
- Experience building a data warehouse using cloud technologies such as AWS or GCP services, and a cloud data warehouse, preferably Google BigQuery.
- Able to confidently express the benefits and constraints of technology solutions to technology partners, stakeholders, team members, and senior levels of management.
- Familiar with coding best practices; able to develop and manage code in a modular, scalable way.
- Experience implementing and supporting operational data stores, data warehouses, data marts, and data integration applications.
- In-depth knowledge of big data solutions and the Hadoop ecosystem.
- Ability to effectively share technical information and communicate technical issues and solutions to all levels of the business.
- Able to juggle multiple projects: can identify primary and secondary objectives, prioritize time, and communicate timelines to team members.
- Passionate about designing and developing elegant ETL/ELT pipelines and frameworks.
- Ability and desire to take product/project ownership.
- Ability to think creatively, strategically, and technically.
- Ability to work a flexible schedule based on department and company needs.
- Cloud Architect (AWS, GCP, or Azure) certification is a plus.


