JobNob

Data Reliability Engineer I


Location

Secunderabad, India


Job description

Hiring for upGrad

Responsibilities:

1. **Monitoring and Maintenance**:
- Monitor and maintain data platforms, databases, and data pipelines to ensure optimal performance and reliability, proactively identifying and addressing emerging issues.

2. **Issue Resolution**:
- Troubleshoot and resolve data-related issues, including performance bottlenecks, data inconsistencies, and data quality problems, ensuring minimal disruption to production systems.

3. **Monitoring Systems**:
- Implement and maintain monitoring, alerting, and logging systems to detect and address data-related issues in a timely manner, minimizing downtime (a minimal freshness-check sketch follows this list).

4. **Collaboration with Engineering Teams**:
- Liaise with engineering teams on root cause analyses (RCAs) and permanent resolutions for issues impacting production sites, fostering effective communication and collaboration between teams.

5. **Data Pipeline Maintenance**:
- Collaborate with the data engineering team to maintain data pipelines and workflows, ensuring smooth data ingestion, processing, and delivery.

6. **Capacity Planning and Optimization**:
- Assist in capacity planning and optimization of data storage and processing resources, ensuring efficient utilization and scalability of data infrastructure.

7. **Guidance and Support**:
- Provide guidance and support to data engineering and data science teams on data infrastructure and best practices, leveraging expertise to address technical challenges and optimize system performance.

8. **Incident Response and Improvement**:
- Participate in incident response, root cause analysis, and post-incident reviews to continuously improve system reliability and performance, implementing corrective actions and preventive measures as needed.

9. **Continuous Learning**:
- Stay updated with the latest industry trends, technologies, and best practices related to data infrastructure and operations, incorporating new knowledge and insights into existing practices.
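
As background for the monitoring and issue-resolution duties above, the following is a minimal sketch of the kind of automated freshness check a Data Reliability Engineer might run against a warehouse table. The table name (`fact_orders`), timestamp column (`loaded_at`), two-hour SLA, and print-based alert are illustrative assumptions, not details from this posting; `sqlite3` stands in for a real warehouse driver.

```python
# Minimal freshness-check sketch (illustrative names and SLA, not from the posting).
import sqlite3  # stand-in for a production warehouse driver
from datetime import datetime, timedelta, timezone

FRESHNESS_SLA = timedelta(hours=2)  # assumed SLA: data no older than two hours


def latest_load_time(conn):
    """Return the most recent load timestamp from the (hypothetical) fact_orders table."""
    row = conn.execute("SELECT MAX(loaded_at) FROM fact_orders").fetchone()
    return datetime.fromisoformat(row[0]) if row[0] else None


def check_freshness(conn) -> bool:
    """Return True if fact_orders met its SLA; emit an alert otherwise."""
    latest = latest_load_time(conn)
    if latest is None or datetime.now(timezone.utc) - latest > FRESHNESS_SLA:
        # A production check would page via Slack/PagerDuty rather than print.
        print(f"ALERT: fact_orders is stale (latest load: {latest})")
        return False
    return True


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE fact_orders (id INTEGER, loaded_at TEXT)")
    conn.execute(
        "INSERT INTO fact_orders VALUES (1, ?)",
        (datetime.now(timezone.utc).isoformat(),),
    )
    print("fresh" if check_freshness(conn) else "stale")
```

In practice a check like this would be scheduled by the orchestrator and wired into the alerting stack, so a missed SLA pages the on-call engineer instead of printing to a console.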

Requirements:

1. **Educational Background**:
- Bachelor's degree in Computer Science, Engineering, or a related field, providing a strong foundation in data concepts and principles.

2. **Technical Skills**:
- Strong understanding of data concepts, relational databases, and SQL, along with familiarity with data infrastructure technologies such as data warehouses, data lakes, and ETL/ELT processes.

3. **Programming Proficiency**:
- Proficiency in scripting or programming languages such as Python, Perl, or Java, essential for automating tasks and developing data solutions.

4. **ETL Tools Experience**:
- Experience with ETL tools such as NiFi, Airflow, and Debezium, critical for managing data pipelines and workflows efficiently (see the DAG sketch after this list).

5. **Cloud and Containerization Knowledge**:
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and containerization (e.g., Docker, Kubernetes) is advantageous, as is familiarity with data warehousing solutions like Redshift.

6. **Monitoring Tools Familiarity**:
- Knowledge of monitoring and observability tools for data systems, such as Prometheus, Grafana, and the ELK stack, is beneficial for ensuring system health and performance.

7. **Version Control and CI/CD**:
- Experience with version control systems (e.g., Git) and CI/CD tools (e.g., Jenkins) is preferred, facilitating collaboration and automation in development and deployment processes.

8. **Problem-Solving and Communication Skills**:
- Excellent problem-solving and communication skills are essential for collaborating effectively with technical and non-technical stakeholders and addressing complex data challenges.
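
To illustrate the orchestration experience referenced in requirement 4, here is a minimal Airflow DAG sketch. The DAG id, task names, and schedule are hypothetical, not taken from the posting; the `schedule` argument assumes Airflow 2.4+ (older releases use `schedule_interval`).

```python
# Hypothetical two-step pipeline DAG; all names are illustrative.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    """Placeholder extract step; a real task would pull from a source system."""
    print("extracting orders...")


def validate_orders():
    """Placeholder check step; a real task would run data-quality validations."""
    print("validating orders...")


with DAG(
    dag_id="orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    validate = PythonOperator(task_id="validate_orders", python_callable=validate_orders)
    extract >> validate  # validation gates on a successful extract
```

Making validation a downstream task of extraction means data-quality checks act as a gate: downstream consumers only see data that has passed them.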

