
JobNob

Your career. Our passion.

Data Engineer


Luxoft


Location

Brazil


Job description

One of the world's largest multichannel video programming distributors is searching for a Data Engineer. The project focuses on developing a video streaming platform. You'll develop and maintain production-ready components for various digital enterprise systems, including Metadata and Content Management systems. Our team spans multiple time zones (mainly the USA and Poland) and covers the entire product life cycle, including architecture, design, coding, DevOps, testing, and all other software development activities required to ensure uninterrupted content streaming service to millions of customers worldwide. We use the Scaled Agile Framework (SAFe) as our process framework and follow software development best practices such as CI/CD and TDD.

Responsibilities

- Design, create, and execute data processing, enrichment, and summarization workflows using SQL and NoSQL
- Develop new technical KPIs from existing data by creating SQL-based workflows running on AWS and/or Snowflake/Redshift (a rough Airflow/SQL sketch follows this list)
- Ensure that various KPI summarization workflows execute according to designated daily/weekly/monthly SLAs
- Maintain the real-time Kinesis-based data pipeline used for alerting and monitoring (Kinesis, OpenSearch/Kibana, Grafana, Prometheus, Glue, Redshift, RDS, etc.)
- Investigate data discrepancies, identify root causes, and apply corrective changes/fixes following predetermined SLAs
- Remediate security vulnerabilities (patching, key rotations, security ticket management, etc.)
- Manage data retention/lifecycle (maintain the cost structure)
- Automate manual tasks into recurring reports/tasks/workflows
- Apply AI/ML/GenAI capabilities to automate manual tasks
- Apply AWS development practices using OpenSearch, Glue, Kinesis, Redshift, CloudWatch, Athena, Grafana, and Prometheus
- Work with external data sets (New Relic, ServiceNow, JIRA)
- Maintain existing data workflows to ensure the correctness of hourly/daily/weekly/monthly/quarterly reports
- Maintain up-to-date documentation
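For context, the SQL-based KPI workflow item above might look roughly like the minimal Airflow DAG below. This is an illustrative sketch only; the DAG name, schedule, table names (playback_events, daily_kpi_summary), and the stubbed warehouse call are assumptions for illustration, not details from this posting.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def summarize_daily_kpis(**context):
    # SQL-based daily KPI summarization. In a real deployment this would run
    # through an Airflow connection to Redshift or Snowflake (e.g. via a
    # provider SQL operator); here the execution is stubbed out.
    sql = """
        INSERT INTO daily_kpi_summary (event_date, stream_starts, error_rate)
        SELECT event_date,
               COUNT(*)                                          AS stream_starts,
               AVG(CASE WHEN status = 'error' THEN 1 ELSE 0 END) AS error_rate
        FROM playback_events
        WHERE event_date = DATE '{ds}'
        GROUP BY event_date
    """.format(ds=context["ds"])
    print("Would execute against the warehouse:\n" + sql)


with DAG(
    dag_id="daily_kpi_summary",        # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # one run per day to meet a daily SLA
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
):
    PythonOperator(
        task_id="summarize_daily_kpis",
        python_callable=summarize_daily_kpis,
    )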

Skills

Must have

- Develop workflows on AWS-managed Airflow (MWAA) that use RDS, Snowflake, and Redshift as back-end data sources
- Develop workflows using AWS Kinesis, OpenSearch, Glue, Grafana, and Redshift (a minimal Kinesis producer sketch follows this list)
- Maintain data stored in AWS and Snowflake/Redshift following predetermined SLAs and cost forecasts
- Familiarity with JIRA, Confluence, Bitbucket, Airflow/MWAA, Snowflake, and Tableau
- Familiarity with the AWS cloud stack supporting data services (OpenSearch, Athena, Grafana, Kibana, Kinesis, Redshift)
- Development skill set: Python, Java, JavaScript, ES|QL
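As a rough illustration of the Kinesis workflow item above, a minimal event producer might look like the sketch below. The stream name, event fields, and downstream consumers are hypothetical assumptions for illustration, not taken from the job description.

import json
from datetime import datetime, timezone

import boto3


def publish_playback_event(stream_name, session_id, error_count):
    # Send one monitoring event to the real-time pipeline; downstream consumers
    # (e.g. a Glue/OpenSearch indexing job feeding Grafana dashboards) are out of
    # scope for this sketch.
    kinesis = boto3.client("kinesis")
    event = {
        "session_id": session_id,
        "error_count": error_count,
        "emitted_at": datetime.now(timezone.utc).isoformat(),
    }
    kinesis.put_record(
        StreamName=stream_name,
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=session_id,  # keeps one session's events ordered within a shard
    )


if __name__ == "__main__":
    publish_playback_event("playback-monitoring", "session-123", error_count=0)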

Nice to have

- Good written and verbal communication skills
- Good problem-solving and analytical abilities
- Good planning, organizational, and follow-up skills
- Familiarity with Git, JIRA, Bitbucket, and Jenkins
- Familiarity with Agile software development practices
- Operations: experience with troubleshooting and data analysis
- Development: solid experience with languages such as SQL, Python, Java, and ES|QL
- Development: deep understanding of software development principles
- Attention to detail and completing work with a high degree of accuracy


