
DevOps - Data Engineering


Marc Ellis


Location

UAE | United Kingdom


Job description

Job Title: DevOps Engineer - Data Engineering
Job Location: Dubai – UAE
Job Duration: 12 months, extendable

We're on the lookout for a skilled DevOps Engineer with a strong background in data engineering to join
our team. In this role, you'll play a pivotal part in enhancing our data engineering processes by building
and maintaining CI/CD pipelines, automating data workflows, and optimizing our data infrastructure.

Responsibilities:

• Create and manage CI/CD pipelines for data engineering projects, handling data ingestion,
transformation, and delivery processes.

• Automate data workflows, ETL processes, and data pipeline deployments to boost efficiency and
reduce manual work.

• Use infrastructure as code (IaC) tools such as Terraform or Ansible to manage data processing and
storage resources.

• Establish and maintain monitoring and alerting systems to proactively spot and address issues in data
pipelines and infrastructure.

• Implement and enforce security best practices and data governance within data engineering
workflows.

• Collaborate with data engineers, data scientists, and business analysts to understand their needs and
deliver DevOps support for data-related projects.

• Keep comprehensive documentation of data workflows, configurations, and procedures.

• Continuously fine-tune data pipelines and infrastructure for improved performance, scalability, and
cost-efficiency.

• Take charge of the release process, including version control, change management, and detailed
release notes.

• Swiftly address and resolve data engineering-related incidents to minimize disruptions to business
operations.

Qualifications:

• Bachelor’s degree in Computer Science, Information Technology, or a related field.

• Demonstrated experience as a DevOps Engineer with a focus on data engineering projects.

• Strong background in developing and maintaining CI/CD pipelines.

• Proficient in automating data workflows, ETL processes, and data pipeline deployments.

• Knowledge of infrastructure as code (IaC) tools such as Terraform, Ansible, or similar.

• Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) and data warehousing concepts.

• Proficiency in scripting and programming languages commonly used in data engineering (e.g., Python,
SQL, Shell).

• Experience with version control systems (e.g., Git) and collaborative development.

• Strong problem-solving abilities with great attention to detail.

• Excellent communication and collaboration skills.

• Experience with cloud platforms (e.g., AWS, Azure, GCP) and containerization technologies (e.g.,
Docker, Kubernetes) is advantageous.

• Knowledge of data governance, compliance, and data security best practices is a plus.


Job tags

Full time

