PTL
Location
Pune | India
Job description
Responsibilities:
- Build data pipelines and data processing using Apache Airflow, Data Lake, Spark, and SQL Database.
- Be involved in the design and build of data service APIs.

Mandatory skill set: Apache Airflow, PySpark, Python, SQL Server, T-SQL

Experience: Developing data pipelines using Airflow and working with big data processing using PySpark.

Skills: pyspark, sql, airflow