
JobNob


Big Data Engineer - SQL/Python


GSPANN Technologies Inc.


Location

Bangalore | India


Job description

Role and Responsibilities:
- Actively participate in all phases of the software development lifecycle, including requirements gathering, functional and technical design, development, testing, roll-out, and support.
- Solve complex business problems by utilizing a disciplined development methodology.
- Produce scalable, flexible, efficient, and supportable solutions using appropriate technologies.
- Analyze source and target system data, and map the transformations that meet the requirements.
- Interact with the client and onsite coordinators during different phases of a project.
- Design and implement product features in collaboration with business and technology stakeholders.
- Anticipate, identify, and solve issues concerning data management to improve data quality.
- Clean, prepare, and optimize data at scale for ingestion and consumption.
- Support the implementation of new data management projects and restructure the current data architecture.
- Implement automated workflows and routines using workflow scheduling tools.
- Understand and use continuous integration, test-driven development, and production deployment frameworks.
- Participate in design, code, test plans, and dataset implementation performed by other data engineers to maintain data engineering standards.
- Analyze and profile data to design scalable solutions.
- Troubleshoot straightforward data issues and perform root cause analysis to resolve product issues proactively.
- Build data lake solutions leveraging one or more of the following: AWS, EMR, S3, Hive, and PySpark.

Skills and Experience:
- Must have a bachelor's degree in computer science.
- 3+ years of experience in developing data and analytics solutions.
- Expertise in relational SQL, scripting languages such as Python, and working with agile teams.
- Hands-on experience with source control tools such as GitHub and related development processes.
- Good understanding of workflow scheduling tools such as Airflow.
- In-depth knowledge of the AWS cloud (S3, EMR, Databricks).
- Should be passionate about data solutions.
- Strong problem-solving and analytical mindset.
- Thorough with the design, development, and testing of data pipelines.

(ref:hirist.tech)

