Aeries Technology Group
Location
Hyderabad | India
Job description
About Aeries :Aeries Technology is a Nasdaq listed global professional services and consulting partner, headquartered in Mumbai, India, with centers in the USA, Mexico, Singapore, and Dubai. We provide mid-size technology companies with the right mix of deep vertical specialty, functional expertise, and the right systems & solutions to scale, optimize and transform their business operations with unique customized engagement models. Aeries is Great Place to Work certified by GPTW India, reflecting our commitment to fostering a positive and inclusive workplace culture for our employees. Read about us at : - Build data pipeline between source systems (RDS/SQL Server) to Snowflake on AWS- ETL data into AWS S3- ELT Data into Snowflake - Realtime data integration through CDC and AWS managed Kafka streams.- Embedded Tableau reporting- Data transformations using SQL, Python, PySpark, and DBT- Through understanding of data engineering principles including data lakes, and data warehouse- Data orchestration using Apache Airflow- Data Analysis skills- Best practices in data management, data quality and data stewardship were applied to ensure data integrity and reliability. - Optimize data pipelines for productivity and efficiency by considering factors such as scalability, reliability, and cost effectiveness. - Troubleshoot and resolve data infrastructure and pipelines in a timely manner. 
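The CDC-style incremental loading mentioned above can be sketched in plain Python. This is an illustrative sketch only, not the actual pipeline: the `orders` table, the watermark column, and the use of sqlite3 as a stand-in for the source RDS are all assumptions; in the real stack the extracted rows would be staged to S3 and loaded into Snowflake.

```python
import sqlite3

# Illustrative sketch of watermark-based (CDC-style) incremental extraction.
# sqlite3 stands in for the source database; table and column names are made up.

def extract_new_rows(conn, last_seen_id):
    """Pull only rows past the last high-water mark, so each run moves a delta."""
    cur = conn.execute(
        "SELECT id, payload FROM orders WHERE id > ? ORDER BY id",
        (last_seen_id,),
    )
    return cur.fetchall()

def run_demo():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, payload TEXT)")
    conn.executemany(
        "INSERT INTO orders (id, payload) VALUES (?, ?)",
        [(1, "a"), (2, "b"), (3, "c")],
    )
    watermark = 0
    first_batch = extract_new_rows(conn, watermark)  # first run: all rows
    watermark = first_batch[-1][0]                   # advance the high-water mark
    conn.execute("INSERT INTO orders (id, payload) VALUES (4, 'd')")
    delta = extract_new_rows(conn, watermark)        # next run: only the new row
    return first_batch, delta
```

The same high-water-mark idea underlies log-based CDC tools; here it is reduced to an ID comparison purely for illustration.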
Here's What You'll Bring :
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- Proven experience (6-10 years) as a Data Engineer
- Proficiency in SQL for data extraction, transformation, and analysis across various database systems
- Strong programming skills in Python for data manipulation, scripting, and building data engineering solutions
- Experience in Tableau for data visualization, dashboard creation, and presenting insights
- Hands-on experience with PySpark for large-scale data processing and analysis
- Familiarity with Apache Airflow for workflow management and automation
- Practical knowledge of AWS services such as S3, EC2, and EMR for building data solutions
- Understanding and use of Kafka for real-time data streaming and processing
- Specific expertise in Kafka Streams for advanced data stream processing
- Excellent problem-solving skills and the ability to work in a fast-paced environment
- Strong communication and collaboration skills, with a proactive attitude toward learning and implementing new technologies
- Curiosity about business processes, with experience in optimizing and automating them (ref:hirist.tech)
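As an illustration of the SQL transformation work the role calls for, here is a minimal sketch of a typical ELT aggregation step. The `sales` table, its columns, and the use of sqlite3 as a lightweight stand-in for Snowflake or SQL Server are assumptions for the example only.

```python
import sqlite3

# Illustrative only: a small GROUP BY aggregation of the kind a Data Engineer
# would run as an ELT transformation; table and column names are hypothetical.

def daily_revenue(conn):
    """Aggregate order amounts per day, returning (order_date, revenue) rows."""
    cur = conn.execute(
        """
        SELECT order_date, SUM(amount) AS revenue
        FROM sales
        GROUP BY order_date
        ORDER BY order_date
        """
    )
    return cur.fetchall()

def run_demo():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (order_date TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO sales VALUES (?, ?)",
        [("2024-01-01", 10.0), ("2024-01-01", 5.0), ("2024-01-02", 7.5)],
    )
    return daily_revenue(conn)
```

In a Snowflake or DBT setting the same aggregation would live in a model or view rather than inline Python; the SQL itself carries over largely unchanged.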