Network18 Media & Investments Limited
Location
Noida | India
Job description
Key Responsibilities
- Lead the design and implementation of the data & analytics architecture ensuring compliance, quality, and sustainable platform growth
- Build scalable end-to-end data pipelines to integrate and model datasets from different sources that meet functional and non-functional requirements
- Manage the technical scope and architecture of the project before, during, and after delivery
- Work with product, business and functional stakeholders to understand data requirements and downstream analytics needs
- Ratify technology solutions, produce concise design documents, and contribute to work estimates
- Translate business requirements & E2E designs into technical implementations based on system capabilities
- Define and promote re-usable, extensible, scalable, and maintainable solutions, weighing cost versus benefit trade-offs
- Communicate at all levels clearly and credibly about the importance of solution design
- Foster a data-driven culture throughout the team
- Drive innovation through a good understanding of data, business drivers, and business needs
Required Experience, Skills & Qualifications
- Around 8-10 years of relevant experience working with high-performance data products or data systems as a Data Architect/Engineer
- Advanced proficiency in designing and developing data products, and in orchestration tools/services
- Proficient with Software Engineering best practices, such as unit testing and integration testing, and software development tools
- Extensive experience in at least one cloud platform with Big Data services (EMR, Databricks, etc.)
- Relevant experience with databases (columnar, NoSQL, and MPP databases: Redshift, DynamoDB, Aurora, Postgres, Google BigQuery, and/or Snowflake), along with best practices in partitioning and clustering tables for efficient performance
- Awareness of security compliance requirements and secure design practices
- Exceptional interpersonal, analytical, and communication skills including the ability to explain and discuss DevOps concepts with colleagues and teams
- Fully adhere to and evangelize a complete CI/CD pipeline
- Understanding of API development and the use of JSON/XML as data formats
- Knowledge of and hands-on experience with data engineering tools for any cloud provider, e.g. Apache Kafka, Apache Flink, Amazon S3, AWS Glue, Amazon Athena, Presto, Apache Airflow, Apache Spark, Amazon EMR, Amazon OpenSearch, Kibana, SQL, JDBC, ODBC, the Parquet file format, Redshift, Trino, Airbyte