...consuming web services that leverage Java;
● Web service hosting and consuming, using Guidewire Studio toolkits;
● XML coding;
● ANT, Maven, and code repositories such as ClearCase, SVN, TFS
If you're interested, please share your upda...
Hyderabad
Teamware Solutions a division of Quantum Leap Consulting Pvt...
Job Title: Senior Data Engineer
Must-Have Skills: Hive, Spark, Kafka, Streaming, SQL, Hadoop platform
Location: Pune/Mumbai/Gurgaon/Bangalore... ...best practices adhered to
Develop set process for Da...
Job Description: SQL, Python, and Java; Data Modelling; Hadoop Platform; Cloud Platform (GCP preferable). Developing and implementing an overall organisational data strategy that is in line with business...
At least 5 years of experience building ETL/ELT, data warehousing, and big data solutions.
At least 5 years of experience in building data models and pipelines to process large datasets.
At least 3 years of experience with Python, Spark, Hive, Hadoop...
...and has expanded its operations in the Americas, Asia and Africa. Exusia has recently also been recognized by publications such as the CIO Review, Industry Era, Insight Success and the CIO Bulletin for the company’s...
Senior Python Developer | Exp: 6 to 8 Yrs | Location: Hyderabad. JD: - Hands-on experience in Python. - The client is technology agnostic, and candidates will be tested mainly on their thinking approach and problem-solving abilities. - Senior positions in their ...
...scalability and reliability of software and infrastructure that support thousands of Client customer instances across thousands of server farms distributed globally. Working in an agile team, you wil...
Required Qualifications:
3-5 years of experience with Hadoop (required)
5 years of Unix/Linux administration activities related to the Hadoop platform (required)
3-5 years of Spark and Hive (required)
Healthcare Industry experience
Internal client-facing exp...
...engineers to access data and write MapReduce programs.
Develop documentation and playbooks to operate Hadoop infrastructure.
Evaluate and use hosted solutions on AWS / Google Cloud / Azure.
Write scalable and mainta...