
Data Engineer


Helius Technologies


Location

Hyderabad | India


Job description

Work Location: Hyderabad
Client: Singapore-based
Job Type: 1-year renewable contract

Required Qualifications:
§ 5+ years of strong data warehousing experience using RDBMS and non-RDBMS databases.
§ 5 years of recent hands-on experience (actively coding) as a data engineer (back-end software engineers considered).
§ 2+ years of experience with ETL (AWS Glue), Amazon S3, Amazon RDS, Amazon Kinesis, AWS Lambda, Apache Airflow, and AWS Step Functions.
§ Strong knowledge of scripting languages such as Python and UNIX shell, as well as Spark, is required.
§ Understanding of RDBMS, data ingestion, data flows, data integration, etc.
§ Experience working in an agile, dynamic, and customer-facing environment is required.
§ Understanding of distributed systems and cloud technologies (AWS) is preferred.
§ Understanding of data streaming and scalable data processing is preferred.
§ Experience with large-scale datasets, data lakes, and data warehouse technologies such as AWS Redshift, Google BigQuery, or Snowflake; Snowflake is preferred.
§ Technical expertise with data models, data mining, and segmentation techniques.
§ Experience with the full SDLC and Lean or Agile development methodologies.
§ Knowledge of CI/CD and Git deployments.

Responsibilities:
§ Work with stakeholders to understand needs for data structure, availability, scalability, and accessibility.
§ Develop tools to improve data flows between internal/external systems and the data lake/warehouse.
§ Build robust and reproducible data ingestion pipelines to collect, clean, harmonize, merge, and consolidate data sources.
§ Understand existing data applications and infrastructure architecture.
§ Build and support new data feeds for various data management layers and data lakes.
§ Evaluate business needs and requirements.
§ Support migration of existing data transformation jobs from Oracle and MS-SQL to Snowflake.
§ Lead the migration of existing data transformation jobs from Oracle, Hive, Impala, etc. to Spark and Python on AWS Glue.
§ Develop and maintain datasets.
§ Improve data quality and efficiency.
§ Lead business requirements gathering and deliver accordingly.
§ Collaborate with data scientists, architects, and the team on several data analytics projects.


