System Soft Technologies
Location
Delhi | India
Job description
Job Summary: A large global retailer is looking for a Data Engineer to join the Finance Data Factory team in the Finance Technology organization (IBG, Data Lake). The current project runs on Azure Databricks, but the team is migrating to GCP.
Required Skills and Experience:
- 3-6 years of experience building large-scale data pipelines using big data technologies (Spark, Kafka, Cassandra, Hadoop, Hive, Presto, Airflow)
- 3-6 years of experience in systems design, algorithms, and distributed systems
- Experience with cloud infrastructure such as OpenStack, Azure, GCP, or AWS
- PySpark, Parquet, and Python framework development
- Azure DevOps and its integration with Databricks and Data Factory
- On-premises big data technology
- Optimization in Google BigQuery (GBQ)
- ETL and data warehousing concepts
- Strong experience connecting to and ingesting from multiple source types into Google Cloud Storage or another cloud platform
- Experience with large-scale distributed systems, including scalability and fault tolerance

Preferred Skills:
- A continuous drive to explore, improve, enhance, automate, and optimize systems and tools
- Strong computer science fundamentals in data structures and algorithms
- Exposure to information retrieval, statistics, and machine learning
- Good understanding of metadata-driven development