Mandatory Skills (JD):
• 4+ years of hands-on experience with Hadoop and system administration, with sound knowledge of
Unix-based operating system internals.
• Working experience with Cloudera CDP, Cloudera CDH, and Hortonworks HDP...
...experience:
• Minimum 4 years of experience in Big Data and distributed computing.
• Proven experience building pipelines on Big Data.
• Technologies/Stack: Hadoop, Spark, Hive, Kafka, Airflow.
• Deep understanding of the...
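The "building pipelines" requirement above usually means batch jobs with a filter/map/aggregate shape. As a minimal sketch in plain Python (so it runs anywhere), with hypothetical event records and field names; in an actual Spark job the same shape would be a `filter`/`groupBy`/`agg` over a DataFrame:

```python
from collections import defaultdict

# Hypothetical event records; in production these would come from Kafka or HDFS.
events = [
    {"user": "u1", "action": "click", "ms": 120},
    {"user": "u2", "action": "view",  "ms": 80},
    {"user": "u1", "action": "click", "ms": 95},
]

def total_latency_by_user(records, action="click"):
    """Filter records by action, then sum latency per user (the aggregate stage)."""
    totals = defaultdict(int)
    for rec in records:
        if rec["action"] == action:      # filter stage
            totals[rec["user"]] += rec["ms"]  # aggregate stage
    return dict(totals)

print(total_latency_by_user(events))  # {'u1': 215}
```

An orchestrator such as Airflow would schedule this kind of task and wire its output into downstream Hive tables; the function itself stays a pure transformation, which keeps it testable outside the cluster.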
We have openings for a GCP Data Architect with one of our MNC clients, in Bangalore and Gurgaon.
Payroll: Quess. Experience: 8 to 1... ...: B. Tech / B.E.
Must Skills:. Job Description: GCP B...
...Role: Application Developer
Role Description: Design, build, and configure applications to meet business process and application requirements.
Must Have Skills: Hadoop
Good to Have Skills: Banking Strategy
Job Re...
...Responsibilities:
We are seeking skilled and motivated Data Engineers and Data Analysts to join our client's team. The selected candidates... ..., data handling, data quality assessment, BI, ap...
Role: Sr. Big Data
Location: Lower Parel (Mumbai)
Exp: 12 yrs
Job Summary:
The candidate will be part of the Business Intelligence Unit and will be involved in developing applications leveraging Big Data tools and the Hadoop platfor...
Data Specialist at Nilasu Consulting Services Pvt Ltd
Company Overview:
Nilasu Consulting Services Pvt Ltd is a leading human resources firm in India specializing in talent acquisition, workforce management, and HR consulting.
Role and Respon...
Required Qualifications:
3-5 years of experience with Hadoop (required)
5 years of Unix/Linux admin activities related to the Hadoop platform (required)
3-5 years of Spark and Hive (required)
Healthcare Industry experience
Internal client-facing exp...
...engineers to access data and write MapReduce programs.
Develop documentation and playbooks to operate Hadoop infrastructure.
Evaluate and use hosted solutions on AWS / Google Cloud / Azure.
Write scalable and mainta...
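The MapReduce programs mentioned above split work into a mapper that emits key-value pairs and a reducer that folds each key's values. A minimal word-count sketch in plain Python, Hadoop Streaming style; the framework's shuffle/sort phase is simulated here with a plain `sorted()`:

```python
import itertools
import operator

def mapper(lines):
    """Map phase: emit a (word, 1) pair for every word in every input line."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reducer(pairs):
    """Reduce phase: sum counts per key. Pairs must arrive sorted by key,
    which Hadoop guarantees after its shuffle/sort step."""
    for word, group in itertools.groupby(pairs, key=operator.itemgetter(0)):
        yield (word, sum(count for _, count in group))

if __name__ == "__main__":
    docs = ["big data big pipelines", "data engineers build pipelines"]
    shuffled = sorted(mapper(docs))   # stands in for the cluster's shuffle/sort
    print(dict(reducer(shuffled)))    # {'big': 2, 'build': 1, 'data': 2, ...}
```

On a real cluster the mapper and reducer would read from stdin and write to stdout (Hadoop Streaming) or use the Java API; keeping them as pure generators makes the logic unit-testable without a cluster.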