
JobNob

Your Career. Our Passion.

Data Engineer - Data Engineering


Bristlecone


Location

Pune, India


Job description

Requirements
- BA/BS degree in Computer Science, Mathematics, or a related technical field, or equivalent practical experience
- Experience with Cloud SQL and Cloud Bigtable
- Experience with Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub, and Genomics
- Experience with Google Transfer Appliance, Cloud Storage Transfer Service, and BigQuery Data Transfer Service
- Experience with data processing software such as Hadoop, Kafka, Spark, Pig, and Hive, and with data processing algorithms such as MapReduce and Flume
- Experience working with technical customers
- Experience writing software in one or more languages such as Java or Python
- 6 to 10 years of relevant consulting, industry, or technology experience
- Strong problem-solving and troubleshooting skills
- Strong communicator

Job Responsibilities
- Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments
- Experience in technical consulting
- Experience architecting and developing software or internet-scale, production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory); AWS/Azure experience is good to have
- Experience working with big data, information retrieval, data mining, or machine learning, as well as experience building multi-tier, high-availability applications with modern web technologies such as NoSQL, Kafka, NPL, MongoDB, SparkML, and TensorFlow
- Working knowledge of ITIL and/or agile methodologies
- Google Data Engineer certified
- Good knowledge of Agile/Scrum


