
Hadoop Developer


Nisum


Location

Hyderabad | India


Job description

Nisum is a leading global digital commerce firm headquartered in California, with services spanning digital strategy and transformation, insights and analytics, blockchain, business agility, and custom software development. Founded in 2000 with the customer-centric motto “Building Success Together®,” Nisum has grown to over 1,800 professionals across the United States, Chile, Colombia, India, Pakistan, and Canada. A preferred advisor to leading Fortune 500 brands, Nisum enables clients to achieve direct business growth by building the advanced technology they need to reach end customers in today’s world, with immersive and seamless experiences across digital and physical channels.

Job Brief

A Hadoop Developer helps build large-scale data storage and processing software and infrastructure. Knowledge of existing tools is essential, as is the ability to write software against the Hadoop API.

What You'll Do

- Write software that interacts with HDFS and MapReduce.
- Assess requirements and evaluate existing solutions.
- Build, operate, monitor, and troubleshoot Hadoop infrastructure.
- Develop tools, libraries, and processes that let other engineers access data and write MapReduce programs.
- Develop documentation and playbooks for operating Hadoop infrastructure.
- Evaluate and use hosted solutions on AWS, Google Cloud, or Azure.
- Write scalable and maintainable ETLs (see the sketch after these lists).
- Understand Hadoop's security mechanisms and implement Hadoop security.
- Write software to ingest data into Hadoop.

What You Know

- At least 5 years of experience building ETL/ELT, data warehousing, and big data solutions.
- At least 5 years of experience building data models and pipelines to process large datasets.
- At least 3 years of experience with Python, Spark, Hive, Hadoop, Kinesis, and Kafka.
- Proven expertise in relational and dimensional data modeling.
- Understanding of PII standards, processes, and security protocols.
- Experience building a data warehouse using cloud technologies such as AWS or GCP services, and a cloud data warehouse, preferably Google BigQuery.
- Able to confidently explain the benefits and constraints of technology solutions to technology partners, stakeholders, team members, and senior management.
- Familiar with coding best practices; develops and manages code in a modular, scalable way.
- Experience implementing and supporting operational data stores, data warehouses, data marts, and data integration applications.
- In-depth knowledge of big data solutions and the Hadoop ecosystem.
- Ability to share technical information effectively and communicate technical issues and solutions to all levels of the business.
- Able to juggle multiple projects: can identify primary and secondary objectives, prioritize time, and communicate timelines to team members.
- Passionate about designing and developing elegant ETL/ELT pipelines and frameworks.
- Ability and desire to take product/project ownership.
- Ability to think creatively, strategically, and technically.
- Ability to work a flexible schedule based on department and company needs.
- A Cloud Architect certification (AWS, GCP, or Azure) is a plus.
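To give a flavor of the ETL work described above, here is a minimal PySpark sketch that reads raw CSV events from HDFS, cleans them, and writes date-partitioned Parquet for downstream Spark or Hive queries. It is illustrative only: the paths and column names (hdfs:///data/raw/events, user_id, event_ts) are hypothetical, not part of any actual Nisum codebase.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-etl").getOrCreate()

# Extract: raw CSV files landed on HDFS by an upstream ingestion job
# (hypothetical path).
raw = spark.read.option("header", True).csv("hdfs:///data/raw/events/")

# Transform: drop rows missing key fields and derive a partition date
# from the event timestamp.
clean = (
    raw.dropna(subset=["user_id", "event_ts"])
       .withColumn("event_date", F.to_date("event_ts"))
)

# Load: write Parquet partitioned by date so downstream Spark/Hive
# queries can prune partitions.
(clean.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("hdfs:///data/warehouse/events/"))

spark.stop()

Partitioning the output by event_date keeps backfills and incremental reprocessing cheap, which is one way the "scalable and maintainable ETL" requirement tends to show up in practice.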

