Mandatory Skills-
JD
• 4+ years of hands-on experience with Hadoop and system administration, with sound knowledge of
Unix-based operating system internals.
• Working experience on Cloudera CDP and CDH and Hortonworks HDP...
Job Title : Python Developer
Location : Ahmedabad
Experience : 4 to 12 Years
Skills : Python, Unix Shell Scripting, OOP Concepts
Job Description :
TCS has always been in the spotlight for being adept at the next big technologies. What w...
...experience:
- Minimum 4 years of experience in Big Data and distributed computing.
- Proven experience building pipelines on Big Data.
- Technologies/Stack : Hadoop, Spark, Hive, Kafka, Airflow.
- Deep understanding of the...
We have openings for a GCP Data Architect for one of our MNC clients in Bangalore and Gurgaon locations.
Payroll : Quess. Experience : 8 to 1... ... : B. Tech / B.E.
Must Skills : . Job Description :- GCP B...
...What You'll Do
This position is expected to:
Expertise in designing and developing applications using Big Data and Cloud technologies
Hands-on experience with Spark and Hadoop ecosystem components
Hands...
...Responsibilities:
We are seeking skilled and motivated Data Engineers and Data Analysts to join our client's team. The selected candidates... ...data handling, data quality assessment, BI, ap...
Role: Sr Big Data
Location Lower Parel (Mumbai)
Exp: 12 yrs
Job Summary:
The candidate will be part of the Business Intelligence Unit and will be involved in developing applications leveraging Big Data tools and the Hadoop platfor...
Required Qualifications:
3-5 years of experience with Hadoop (required)
5 years of Unix/Linux admin activities related to the Hadoop platform (required)
3-5 years of Spark and Hive (required)
Healthcare Industry experience
Internal client-facing exp...
Minimum 7 years of experience, with at least 4 to 5 years in cloud computing. 1) Primary Skill: Hands-on experience in building EKS clusters (expert level), Spark on EKS, AWS (EMR, S3, RDS), Python programming skills, and shell scripting. Work experience ...