Data Engineer at an International Airline
Location
Kochi | India
Job description
Job Purpose – Leverage SQL and Python to define and schedule pipelines that incrementally process new data from a variety of data sources. Work on the Enterprise Integration Platform as well as various operations analytics applications and dashboards in the Data Lake.
Key Accountabilities -- Application Development
- Leverage Databricks to perform core data pipeline development responsibilities
- Program Enterprise Integration Platform (EIP) connectivity data flows
- Use SQL and Python to write production data pipelines that extract, transform, and load data into tables and views in the Lakehouse
- Simplify data ingestion and incremental change propagation using Databricks-native features and syntax, including Delta Live Tables
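As a rough illustration of the incremental change propagation described above (a minimal sketch only: on Databricks this would typically be Delta Live Tables or Delta Lake's `MERGE INTO`, while here SQLite's `INSERT ... ON CONFLICT` upsert stands in; the table and column names are hypothetical):

```python
import sqlite3

# In-memory database standing in for a Lakehouse table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE flights (flight_id TEXT PRIMARY KEY, status TEXT)")

# Initial load.
conn.executemany("INSERT INTO flights VALUES (?, ?)",
                 [("AI101", "scheduled"), ("AI202", "scheduled")])

# Incremental batch: one changed row, one new row.
new_batch = [("AI101", "departed"), ("AI303", "scheduled")]

# Upsert applies only the changes, a stand-in for Delta Lake's MERGE INTO.
conn.executemany(
    "INSERT INTO flights VALUES (?, ?) "
    "ON CONFLICT(flight_id) DO UPDATE SET status = excluded.status",
    new_batch,
)

rows = dict(conn.execute("SELECT flight_id, status FROM flights ORDER BY flight_id"))
print(rows)  # {'AI101': 'departed', 'AI202': 'scheduled', 'AI303': 'scheduled'}
```

The point of the pattern is that each run touches only the new or changed rows rather than rebuilding the whole table.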
Skills Required For The Role
- Proficiency in SQL query syntax, including writing queries using SELECT, WHERE, GROUP BY, ORDER BY, LIMIT, and JOIN
- Proficiency in SQL DML statements, including DELETE, INSERT, UPDATE, and MERGE
- Proficiency in SQL DDL statements to create, alter, and drop databases and tables
- Experience with or knowledge of data engineering practices on cloud platforms, including cloud features such as virtual machines, object storage, identity management, and metastores
- Familiarity with Python variables, functions, libraries, and control flow
- Ability to plan and deliver
- Ability to write SQL queries optimized for space and time
- Ability to positively resolve issues.
- Willingness to learn machine learning and Tableau for analytics