Senior Analyst Programmer
Location
Bangalore | India
Job description
About the role
The role is responsible for maintaining, governing and developing common frameworks and tools for the FIL Data Lake, with a specific focus on the data ecosystem for reporting and analytics. The role engages with all engineering and data teams as well as the technical data architects.
The function is responsible for understanding business, data and technology services with a view to creating a strategic approach to data management and data lake adoption. It integrates data siloed across disparate systems and manages the unified data for centralized access and governance.
What is expected from you
- You have excellent software design, programming, engineering and problem-solving skills.
- Strong experience with data ingestion, transformation and distribution using AWS and Snowflake
- Exposure to SnowSQL, Snowpipe, role-based access controls, and ETL/ELT tools such as NiFi, Matillion and dbt (a minimal ingestion sketch follows this list)
- Hands-on working knowledge of EC2, Lambda, ECS/EKS, DynamoDB and VPCs
- Familiarity with building data pipelines that leverage the full power and best practices of Snowflake, and with integrating the technologies that commonly surround it (CI/CD, orchestration, data quality, monitoring)
- Experience designing, implementing and overseeing the integration of data systems and ETL processes
- Experience designing data ingestion and orchestration pipelines using NiFi, AWS, Kafka and Spark
- Ability to establish strategies for data extraction, ingestion, transformation, automation and consumption
- Experience with data lake concepts covering structured, semi-structured and unstructured data
- Experience creating CI/CD processes for Snowflake (a deployment sketch follows this list)
- Experience with strategies for data testing, data quality, code quality and code coverage (an illustrative quality gate sketch follows this list)
- Ability, willingness and openness to experiment with, evaluate and adopt new technologies
- Passion for technology, problem solving and teamwork
- Go-getter with the ability to navigate across roles, functions and business units to collaborate and drive agreements and changes from drawing board to live systems
- Lifelong learner who can bring contemporary practices, technologies and ways of working to the organization
- Effective collaborator, adept at using the appropriate modes of communication and collaboration tools
- Experience delivering on data-related non-functional requirements such as:
- Hands-on experience dealing with large volumes of historical data across markets/geographies.
- Manipulating, processing, and extracting value from large, disconnected datasets.
- Building watertight data quality gates on investment management data
- Generic handling of standard business scenarios such as missing data, holidays and out-of-tolerance errors
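
As a minimal, purely illustrative sketch of the Snowflake ingestion work referenced above (not a prescribed implementation), the following Python snippet uses the snowflake-connector-python package to create a landing table, an external stage over S3 and an auto-ingest Snowpipe. Every identifier (account, role, database, bucket, storage integration) is a hypothetical placeholder.

```python
# Minimal sketch: provisioning a Snowpipe auto-ingest pipeline over S3.
# All identifiers (account, role, database, bucket, integration) are
# hypothetical placeholders -- adjust to your own environment.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",        # placeholder Snowflake account locator
    user="deploy_user",          # placeholder service user
    password="...",              # use key-pair auth / a secrets manager in practice
    role="DATA_ENGINEER",        # role-based access control: least-privilege role
    warehouse="INGEST_WH",
    database="RAW",
    schema="MARKET_DATA",
)

ddl_statements = [
    # Landing table for semi-structured source files.
    """CREATE TABLE IF NOT EXISTS RAW.MARKET_DATA.PRICES_RAW (
           record VARIANT
       )""",
    # External stage over the S3 drop zone (storage integration assumed to exist).
    """CREATE STAGE IF NOT EXISTS RAW.MARKET_DATA.PRICES_STAGE
           URL = 's3://example-bucket/prices/'
           STORAGE_INTEGRATION = S3_INT
           FILE_FORMAT = (TYPE = 'JSON')""",
    # Snowpipe with auto-ingest: S3 event notifications trigger the COPY.
    """CREATE PIPE IF NOT EXISTS RAW.MARKET_DATA.PRICES_PIPE
           AUTO_INGEST = TRUE
           AS COPY INTO RAW.MARKET_DATA.PRICES_RAW
              FROM @RAW.MARKET_DATA.PRICES_STAGE""",
]

cur = conn.cursor()
try:
    for ddl in ddl_statements:
        cur.execute(ddl)
finally:
    cur.close()
    conn.close()
```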
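
Likewise, as a hedged sketch of a CI/CD deployment step for Snowflake (teams often use off-the-shelf tools such as schemachange or dbt for this), the snippet below applies versioned SQL migration scripts exactly once and records them in a history table. The folder layout, naming convention and history table are assumptions for illustration.

```python
# Sketch of a CI/CD deployment step for Snowflake: apply versioned SQL
# migrations (e.g. V001__create_raw.sql, V002__add_pipe.sql) exactly once,
# recording applied versions in a history table. Paths, naming convention
# and the history table are illustrative assumptions.
import pathlib
import snowflake.connector

MIGRATIONS_DIR = pathlib.Path("migrations")          # assumed repository layout
HISTORY_TABLE = "ADMIN.DEPLOY.MIGRATION_HISTORY"     # assumed tracking table

def apply_migrations(conn) -> None:
    cur = conn.cursor()
    cur.execute(f"""CREATE TABLE IF NOT EXISTS {HISTORY_TABLE} (
                        version STRING PRIMARY KEY,
                        applied_at TIMESTAMP_LTZ DEFAULT CURRENT_TIMESTAMP()
                    )""")
    cur.execute(f"SELECT version FROM {HISTORY_TABLE}")
    applied = {row[0] for row in cur.fetchall()}

    # Apply pending scripts in lexical (i.e. version) order.
    for script in sorted(MIGRATIONS_DIR.glob("V*__*.sql")):
        version = script.name.split("__", 1)[0]
        if version in applied:
            continue
        # Naive statement split; adequate for simple, semicolon-terminated scripts.
        for statement in script.read_text().split(";"):
            if statement.strip():
                cur.execute(statement)
        cur.execute(f"INSERT INTO {HISTORY_TABLE} (version) VALUES (%s)", (version,))
    cur.close()

if __name__ == "__main__":
    # Connection details would come from CI secrets in practice.
    conn = snowflake.connector.connect(
        account="my_account", user="ci_user", password="...",
        role="DEPLOY_ROLE", warehouse="DEPLOY_WH",
    )
    try:
        apply_migrations(conn)
    finally:
        conn.close()
```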
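
Finally, as an illustrative sketch of the data quality gate idea mentioned above, the snippet below validates a hypothetical daily price extract for nulls, gaps against a holiday-aware business-day calendar and out-of-tolerance day-on-day moves. Column names, thresholds and the holiday list are assumptions rather than a prescribed standard.

```python
# Illustrative data quality gate for a daily price feed (assumed schema:
# security_id, price_date, close_price). Thresholds and holidays are
# placeholders -- real gates would be driven by configuration per market.
from dataclasses import dataclass, field
import pandas as pd

@dataclass
class QualityReport:
    issues: list = field(default_factory=list)

    @property
    def passed(self) -> bool:
        return not self.issues

def quality_gate(prices: pd.DataFrame,
                 expected_dates: pd.DatetimeIndex,
                 max_daily_move: float = 0.25) -> QualityReport:
    report = QualityReport()

    # 1. Completeness: no nulls in key or measure columns.
    nulls = prices[["security_id", "price_date", "close_price"]].isna().sum()
    for col, n in nulls.items():
        if n:
            report.issues.append(f"{n} null values in column '{col}'")

    # 2. Coverage: every expected business day (holidays already excluded
    #    from expected_dates) is present for every security.
    for sec, grp in prices.groupby("security_id"):
        missing = expected_dates.difference(pd.DatetimeIndex(grp["price_date"]))
        if len(missing):
            report.issues.append(f"{sec}: missing {len(missing)} expected dates")

    # 3. Tolerance: flag day-on-day moves beyond the configured threshold.
    prices = prices.sort_values(["security_id", "price_date"])
    moves = prices.groupby("security_id")["close_price"].pct_change().abs()
    breaches = int((moves > max_daily_move).sum())
    if breaches:
        report.issues.append(
            f"{breaches} out-of-tolerance price moves (> {max_daily_move:.0%})")

    return report

# Example usage with a toy frame and a holiday-aware business-day calendar.
if __name__ == "__main__":
    calendar = pd.bdate_range("2024-01-02", "2024-01-05",
                              freq="C", holidays=["2024-01-03"])
    frame = pd.DataFrame({
        "security_id": ["ABC"] * 3,
        "price_date": pd.to_datetime(["2024-01-02", "2024-01-04", "2024-01-05"]),
        "close_price": [100.0, 101.0, 140.0],   # ~39% final move breaches the 25% tolerance
    })
    result = quality_gate(frame, calendar)
    print("PASS" if result.passed else result.issues)
```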
What is there for me in this role
- Excellent opportunity to work on emerging data technologies and learn with a highly energised technical and engineering community, creating an environment of collaborative learning, sharing and positive challenge.
- Work in a highly meritocratic set-up where excellence is enabled, supported and rewarded.
- Opportunity to work on mission-critical, enterprise-grade financial services systems that support millions of customers.
- Exposure to our technical strategy, which has taken on the challenge of relentless simplification, cloud onboarding, modern technologies and new ways of working and problem solving.