
JobNob


Hadoop Administrator


Location

India


Job description

Hadoop Administrator - Bangalore/Pune/Chennai/Gurgaon - 6-12 years

Are you curious, motivated, and forward-thinking? At FIS you’ll have the opportunity to work on some of the most challenging and relevant issues in financial services and technology. Our talented people empower us, and we believe in being part of a team that is open, collaborative, entrepreneurial, passionate and above all fun. 

What You Will Be Doing:

Service Management: Administer, configure, and maintain Hadoop ecosystem services within both Cloudera Data Platform (CDP) and Cloudera Distribution of Hadoop (CDH), including HDFS, YARN, Hive, Spark, Ranger, Sentry, Kerberos security, KMS, KTS, ZooKeeper, and more.

Job Scheduling and Workflow Management: Utilize Oozie, Zena, and Airflow to schedule and orchestrate complex data jobs and workflows, meeting business requirements efficiently. This includes day-end SLA jobs that ensure timely data processing.
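As a rough illustration of the day-end SLA checks this involves, the sketch below flags jobs that finished past a cutoff. The job names and the 23:30 cutoff are invented for the example; a real setup would take these from the scheduler's configuration.

```python
from datetime import datetime, time

# Hypothetical day-end SLA cutoff; a real script would read this from config.
DAY_END_SLA = time(23, 30)

def sla_breaches(job_end_times):
    """Return names of jobs whose finish time ran past the day-end SLA.

    job_end_times: mapping of job name -> datetime when the job finished.
    """
    return sorted(
        name for name, finished in job_end_times.items()
        if finished.time() > DAY_END_SLA
    )

# Fabricated example run times
jobs = {
    "settlement_load": datetime(2024, 5, 1, 22, 50),
    "risk_aggregation": datetime(2024, 5, 1, 23, 45),  # past the cutoff
}
print(sla_breaches(jobs))  # ['risk_aggregation']
```

In practice a check like this would feed an alerting channel rather than print, but the comparison logic is the same.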

Security and Access Control: Manage security measures such as Kerberos authentication, safeguarding Hadoop clusters and data in both CDP and CDH environments. Implement and maintain role-based access control (RBAC) policies in Ranger and Sentry.

Job Monitoring and Tuning: Set up robust monitoring systems, including advanced tools like Unravel, to track cluster health, job execution, and resource utilization in both CDP and CDH environments. Continuously fine-tune configurations for optimized performance.
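A minimal sketch of the monitoring glue this kind of work involves: parsing `hdfs dfsadmin -report` output to pull headline capacity figures. The sample report text below is fabricated; a real script would invoke the CLI and feed in its actual output.

```python
import re

def parse_dfs_report(report_text):
    """Extract headline capacity figures from `hdfs dfsadmin -report` text."""
    stats = {}
    for key, label in [("configured", "Configured Capacity"),
                       ("present", "Present Capacity"),
                       ("used", "DFS Used")]:
        m = re.search(rf"^{label}:\s+(\d+)", report_text, re.MULTILINE)
        if m:
            stats[key] = int(m.group(1))
    return stats

# Fabricated excerpt in the shape dfsadmin prints
sample = """\
Configured Capacity: 1000000000 (1 GB)
Present Capacity: 900000000 (900 MB)
DFS Remaining: 700000000 (700 MB)
DFS Used: 200000000 (200 MB)
"""
stats = parse_dfs_report(sample)
print(stats["used"] / stats["configured"])  # 0.2
```

Dedicated tools like Unravel or Cloudera Manager cover most of this out of the box; small parsers like this typically fill gaps in custom reporting.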

Issue Resolution and Troubleshooting: Identify and troubleshoot issues related to job failures, data processing, and cluster performance in both CDP and CDH environments. Develop and implement solutions to mitigate recurring problems.

Cross-Functional Collaboration: Act as a bridge between technical teams, including application support, development, and scheduling. Facilitate effective communication to understand and address their requirements and issues.

Collaboration: Collaborate with different teams to install patches, Hadoop updates, and version upgrades across CDP and CDH environments. Contribute to the development of data engineering solutions when required.

Automation and Scripting: Create and maintain automation scripts for administrative tasks and provisioning, applying Infrastructure as Code (IaC) principles for efficient cluster management in both CDP and CDH.

Documentation and Reporting: Maintain detailed documentation of configurations, procedures, and best practices for both CDP and CDH environments. Generate reports on cluster health, job scheduling, and performance metrics.

What You Will Bring:

What We Offer You:

A challenging and impactful role at the forefront of data-driven innovation. Opportunities for continuous learning and professional development. A collaborative and supportive team environment. Competitive compensation and benefits package.

Privacy Statement

FIS is committed to protecting the privacy and security of all personal information that we process in order to provide services to our clients. For specific information on how FIS protects personal information online, please see the Online Privacy Notice.

Sourcing Model

Recruitment at FIS works primarily on a direct sourcing model; a relatively small portion of our hiring is through recruitment agencies. FIS does not accept resumes from recruitment agencies which are not on the preferred supplier list and is not responsible for any related fees for resumes submitted to job postings, our employees, or any other part of our company.

#pridepass

