Location
Bangalore | India
Job description
It's fun to work in a company where people truly BELIEVE in what they are doing!
We're committed to bringing passion and customer focus to the business.
Job Responsibilities
- Be an integral part of large-scale client business development and delivery engagements by understanding the business requirements.
- Hands-on with Dataflow/Apache Beam and real-time data streaming.
- Engineer ingestion and processing pipelines on GCP using Python libraries, Java, BigQuery, and Composer.
- Automate repeatable tasks into a framework that can be reused in other parts of the project.
- Handle data quality, governance, and reconciliation during the development phases.
- Communicate with internal and external customers; show a desire to develop communication and client-facing skills.
- Understand and contribute to all Agile ceremonies to ensure delivery efficiency.
Qualification & Experience
- A bachelor's degree in Computer Science or a related field.
- Minimum 5 years of experience in software development.
- Minimum 3 years of technology experience in Data Engineering projects.
- Minimum 3 years of experience in GCP.
- Minimum 3 years of experience in Python programming.
- Minimum 3 years of experience in SQL/PLSQL scripting.
- Minimum 3 years of experience in Data Warehouse / ETL.
- Ability to build streaming/batching solutions.
- Exposure to project management and version control tools like Jira, Confluence, and Git.
- Ability to define, create, test, and execute operations procedures.
Must Have Skills
- Strong understanding of real-time streaming concepts.
- Strong problem solving and analytical skills.
- Good communication skills.
- Understanding of message queues like Kafka, RabbitMQ, and Pub/Sub.
- Understanding of fast data caching systems like Redis/Memorystore.
- GCP experience – 3+ years.
- Hands-on experience with Dataflow/Apache Beam, including custom templates.
- Understanding of Composer
- Good experience with BigQuery and Pub/Sub.
- Good hands-on experience with Python
- Hands-on experience with modular Java code development involving design patterns such as Factory and Reflection.
Good To Have Skills
- GCP Professional Data Engineer certification is an added advantage.
- Understanding of Terraform scripts.
- Understanding of DevOps pipelines.
- Identity and Access Management, authentication protocols.
- Google Drive APIs, OneDrive APIs.
Location - Hyderabad (Client location)
If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!