
JobNob

Your Career. Our Passion.

Senior Data Engineer


Eaton


Location

Pune | India


Job description

What You'll Do

Eaton Corporation's Center for Intelligent Power has an opening for a Senior Data Engineer. As a Senior Data Engineer, you will be responsible for designing, developing, and maintaining our data infrastructure and systems. You will collaborate with cross-functional teams to understand data requirements, implement data pipelines, and ensure the availability, reliability, and scalability of our data solutions. The ideal candidate can program in several languages and understands the end-to-end software development cycle, including CI/CD and software release.

  • Design, develop, and maintain scalable data pipelines and data integration processes to extract, transform, and load (ETL) data from various sources into our data warehouse or data lake.
  • Collaborate with stakeholders to understand data requirements and translate them into efficient and scalable data engineering solutions.
  • Optimize data models, database schemas, and data processing algorithms to ensure efficient and high-performance data storage and retrieval.
  • Implement and maintain data quality and data governance processes, including data cleansing, validation, and metadata management.
  • Work closely with data scientists, analysts, and business intelligence teams to support their data needs and enable data-driven decision-making.
  • Develop and implement data security and privacy measures to ensure compliance with regulations and industry best practices.
  • Monitor and troubleshoot data pipelines, identifying and resolving performance or data quality issues in a timely manner.
  • Stay up to date with emerging technologies and trends in the data engineering field, evaluating and recommending new tools and frameworks to enhance data processing and analytics capabilities.
  • Collaborate with infrastructure and operations teams to ensure the availability, reliability, and scalability of data systems and infrastructure.
  • Mentor and provide technical guidance to junior data engineers, promoting best practices and knowledge sharing.

Qualifications

  • Bachelor's degree in computer science or software engineering
  • 5+ years of progressive experience in delivering technology solutions in a production environment
  • 5+ years of experience in the software industry as a developer, with a proven track record of shipping high quality products
  • 3+ years working with customers (internal and external) on developing requirements and working as a solutions architect to deliver

Required

  • Bachelor's degree in Computer Science, Software Engineering, or Information Technology

  • Experience with cloud development platforms (Azure and AWS) and their associated data storage options
  • Cloud-based analytics (AWI, REST API, microservices)
  • Knowledge of IoT technologies, including cloud processing, like Azure IoT Hub.
  • Experience with CI/CD (continuous integration/delivery) tools, e.g., Jenkins, Git, Travis CI
  • Virtual build environments (containers, VMs, and microservices) and container orchestration: Docker Swarm, Kubernetes/Red Hat OpenShift
  • Relational and non-relational database systems: SQL, PostgreSQL, NoSQL, MongoDB, CosmosDB, DocumentDB
  • Data warehousing and ETL: ability to write complex queries that are accessible, secure, and optimized, with output to different consumers and systems
  • ETL on big data technologies: Hive, Impala
  • Programming knowledge: Java and/or Python and associated IDEs (Eclipse, IntelliJ, PyCharm, etc.)
  • Data pipelining, scripting, reporting
  • In-depth knowledge of SOA (Service Oriented Architecture)
  • Experience with Azure tools: Blob Storage, SQL, Data Lake, Hive, Hadoop, Data Factory, Databricks, Azure Functions
  • Software development life-cycle processes and tools
  • Agile development methodologies and concepts, including hands-on experience with Jira, Bitbucket, and Confluence
  • Knowledge of streaming technologies like Apache Kafka, AWS Kinesis, Azure EventHubs
  • Knowledge of data analysis tools, like Apache Presto, Hive, Azure Data Lake Analytics, AWS Athena, Zeppelin
  • Ability to specify and write code that is accessible, secure, and performs in an optimized manner with an ability to output to different types of consumers and systems
  • Proven experience working as a Data Engineer, with 6+ years of experience in data engineering, data warehousing, or related fields.
  • Strong proficiency in SQL and experience with relational databases like MySQL, PostgreSQL, or similar.
  • Hands-on experience with big data technologies such as Apache Hadoop, Spark, Airflow or similar frameworks.
  • Expertise in data modeling, data integration, and ETL processes.
  • Proficiency in programming languages like Python, Java, or Scala, with experience in building data pipelines and automation scripts.
  • Familiarity with cloud-based data platforms and services such as AWS, Azure, or Google Cloud Platform.
  • Experience with data visualization tools like Tableau, Power BI, or similar.
  • Knowledge of data security and privacy principles, including GDPR and data protection regulations.
  • Strong problem-solving and analytical skills, with the ability to troubleshoot and resolve complex data engineering challenges.
  • Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders.
  • Experience with agile development methodologies and version control systems is preferred.

Desired

  • Experience in Design Thinking or human-centered methods to identify and creatively solve customer needs, through a holistic understanding of customer's problem area
  • Knowledgeable in leveraging multiple data transit protocols and technologies (MQTT, REST API, JDBC, etc.)
  • Knowledge of Hadoop and MapReduce/Spark or related frameworks
  • Knowledge of Scala

Skills

  • Excellent verbal and written communication skills including the ability to effectively communicate technical concepts as a part of virtual, global teams
  • Good interpersonal, negotiation and conflict resolution skills
  • Ability to understand academic research and apply new data science techniques
  • Experience being part of larger teams with established big data platform practices, as well as smaller teams where you had a broader scope of impact.
  • Experience and awareness of working with global teams, with strong communication skills to interact with them.
  • Innate curiosity
  • Self-directed and hungry to learn – a person who, with time in hand, will independently find interesting ways to push the envelope, learning new skills and growing themselves and the team.
  • Team player – we work in small, fast-moving teams.


