
Senior Manager


EXL


Location

Gurgaon | India


Job description

Title: Lead Data/Solution Architect

Experience: 7+ years

Location: Gurgaon/Bengaluru (Hybrid)

Job Summary

EXL is seeking a talented and experienced Data/Solution Architect to join our team. The Data/Solution Architect will play a crucial role in defining the principles, standards, best practices & guidelines for designing, implementing, and maintaining data and technology solutions that align with our business goals and objectives. This role requires a deep understanding of data architecture and technology stacks, and the ability to translate business requirements into scalable and efficient solutions.

Responsibilities

  • Data Architecture Design: Define, develop, and maintain data architecture models, including data schemas, data flow diagrams, and data integration strategies, to ensure data quality, consistency, availability, and reliability, while leveraging leading big data & analytics architecture practices and advanced data technologies to meet complex, enterprise-scale business and IT needs.
  • Solution Design: Collaborate with cross-functional teams to define and design end-to-end data solutions, involving various technologies, that meet business requirements while considering scalability, security, and performance. Ensure that data and technology solutions meet security and compliance requirements, including GDPR, HIPAA, or other industry-specific regulations.
  • Data Governance: Establish and enforce data management best practices, including data quality standards.
  • Technical Leadership: Provide technical leadership and guidance to development teams, ensuring that solutions are aligned with architectural standards and best practices.
  • Vendor Assessment: Evaluate and recommend third-party tools, technologies, and services that can enhance our data and technology landscape.
  • Documentation: Create and maintain detailed architectural documentation, including system diagrams, process flows, and technical specifications.
  • Performance Optimization: Continuously assess and optimize data and technology solutions to improve performance, scalability, and cost-efficiency.
  • Training and Mentorship: Provide training and mentorship to technical teams to promote best practices in data and solution architecture.
  • Problem Solving: Identify and resolve technical issues, roadblocks, and challenges that may arise during project implementation.
  • Define the principles, standards, best practices & guidelines for Architecture & Data Engineering frameworks & processes from scratch
  • Act as subject matter expert for internal & external stakeholders.
  • Identify and research relevant technologies, perform PoCs, and recommend product architecture.
  • Develop and recommend productivity aids to accelerate deliveries.
  • Work with multiple clients to identify data management opportunities, whether gaps or new scope, and work on the solution design and technical roadmap for them; provide advice and guidance to assigned groups on implementing data & analytics solutions.
  • Stay abreast of new tools, technologies, and trends in the market; create PoVs on where they fit within the Data Management scope and how they support client architectures.
  • Support business development through RFP responses, building accelerators, data products, etc.

Preferred Qualifications

  • 8+ years of experience in Data Modeling, ETL, Data Warehousing, Data Engineering, Master Data Management, and Data Quality, with 2+ years working as a Data/Solution Architect.
  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • Proven experience as a Data/Solution Architect or a similar role in a relevant industry.
  • Strong hands-on experience with data modeling, data ingestion, data integration, and data management concepts.
  • Strong hands-on experience with data warehousing, data lakes, and big data technologies.
  • Has worked on at least one on-prem-to-cloud migration project from the design stage through delivery.
  • Proficiency in SQL, Python & PySpark
  • Strong knowledge of data engineering tools, distributed processing frameworks, and stream processing systems (e.g., Hadoop, Hive, Spark, Kafka, Airflow, Storm).
  • Strong hands-on experience with storage technologies and data modeling techniques: SQL databases (e.g., MySQL, PostgreSQL, SQL Server), NoSQL databases (e.g., MongoDB, Cassandra, Elasticsearch), in-memory databases (e.g., Redis), HDFS, data lakes, and Delta Lake/Iceberg table formats.
  • Proficiency in ETL Tools like Fivetran, Talend, HVR, Informatica, etc.
  • Proficiency in at least one cloud platform (AWS, Azure, GCP) & relevant data services.
  • Good experience with at least one modern data platform, such as Databricks or Snowflake.
  • Good experience in fetching and exposing data via APIs.
  • Good knowledge of Data Architecture maturity evaluation concepts & metrics.
  • Good experience in creating workflows and scheduling pipelines using orchestration tools like Airflow.
  • Good exposure to packaging Python applications.
  • Good understanding of DataOps, CI/CD pipelines, monitoring tools, Docker, Kubernetes, etc.; hands-on experience is good to have.
  • Proficiency in version control systems (e.g., Git) and an understanding of CI/CD tools like Jenkins.
  • Highly motivated self-starter with a proven track record of rapidly acquiring and mastering new tools and technologies to drive innovative solutions and enhance productivity.
  • Excellent problem-solving and analytical skills.
  • Strong communication and collaboration skills.

Good To Have

  • Relevant certifications (e.g., TOGAF, AWS Certified Solutions Architect) are a plus.
  • Experience in streaming data pipelines.
  • Experience in Data Governance policies & data security.
  • Familiarity with containerization and orchestration technologies (e.g., Docker, Kubernetes).

