

RQ07115 Business Intelligence Specialist - ETL Developer


Randstad


Location

North York, ON | Canada


Job description

This is a contract position for a Business Intelligence Specialist - ETL Developer for our public sector client.
Work location: Downtown Toronto
Duration: Twelve Months
7.25 hours/day
This is a hybrid role that requires the candidate to be onsite three days per week (a mandatory requirement that is non-negotiable outside of exceptional circumstances).

Advantages
Good compensation while working on projects that will make a difference to the people of Ontario.

Responsibilities
Design, develop, and implement an ingestion framework from an Oracle data source to Azure Data Lake, covering both the initial load and incremental ETL. Tools used are listed below; a brief PySpark sketch of the raw-to-curated step follows the list:
- Azure Data Factory (good knowledge required) to maintain the pipeline from Oracle to Azure Data Lake
- Azure Databricks/PySpark (good Python/PySpark knowledge required) to build transformations of raw data into the curated zone of the data lake
- Azure Databricks/PySpark/SQL (good SQL knowledge required) to develop and/or troubleshoot transformations of curated data into the datamart model
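
For context, the following is a minimal sketch of what the raw-to-curated transformation step might look like in an Azure Databricks notebook using PySpark. The storage account, container paths, and column names are illustrative assumptions, not details from the posting.

# Minimal sketch of a raw-to-curated transformation on Azure Databricks (PySpark).
# Storage paths and column names are assumptions for illustration only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # already provided in Databricks notebooks

RAW_PATH = "abfss://raw@examplelake.dfs.core.windows.net/oracle/customers/"
CURATED_PATH = "abfss://curated@examplelake.dfs.core.windows.net/customers/"

# Read the raw extract landed by the Azure Data Factory copy pipeline.
raw_df = spark.read.format("parquet").load(RAW_PATH)

# Light standardization before promoting data to the curated zone:
# trim string keys, normalize timestamps, drop duplicates, tag the load time.
curated_df = (
    raw_df
    .withColumn("customer_id", F.trim(F.col("customer_id")))
    .withColumn("updated_at", F.to_timestamp("updated_at"))
    .dropDuplicates(["customer_id", "updated_at"])
    .withColumn("_loaded_at", F.current_timestamp())
)

# Write to the curated zone as Delta so downstream datamart ETL can consume it.
curated_df.write.format("delta").mode("overwrite").save(CURATED_PATH)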

- Review the requirements, database tables, and database relationships
- Identify gaps and inefficiencies in the current production reporting environment and provide recommendations to address them in the new platform

- Design the ingestion framework and change data capture (CDC); tools used are Oracle GoldenGate and Azure Data Factory
- Prepare design artifacts
- Work with the IT partner on configuration of GoldenGate; responsible for providing direction and "how to" guidance
- Maintain a dynamic pipeline for ETL ingestion to add new tables and data elements
- Data design: physical model mapping from the data source to the reporting destination
- Understand the requirements and recommend changes to the physical model to support the ETL design
- Reverse-engineer and document existing SQL logic to support the design effort
- Assist with data modelling and updates of source-to-target mapping documentation
- Develop scripts for the physical model and update the database and/or data lake structure
- Access Oracle Database, SQL Server, and Azure environments, using SSIS, SQL Developer, Azure Data Studio, Azure Data Factory, Databricks, and other tools to develop the solution
- Proactively communicate with business and IT experts on any changes required to the conceptual, logical, and physical models; communicate and review timelines, dependencies, and risks
- Develop the ETL strategy and solution for different sets of data modules
- Understand the tables and relationships in the data model
- Create low-level design documents and test cases for ETL development
- Create the workflow and pipeline designs
- Develop and test data pipelines with incremental and full loads (a sketch of the incremental pattern follows this list)
- Develop high-quality ETL mappings, scripts, and notebooks
- Develop and maintain the pipeline from the Oracle data source to Azure Data Lake and Databricks SQL Warehouse
- Develop ETL to update datamarts built in Databricks SQL Warehouse
- Perform unit testing
- Ensure performance monitoring and improvement
- Perform performance reviews and data consistency checks
- Troubleshoot performance and ETL issues; log activity for each pipeline and transformation
- Review and optimize overall ETL performance
- Perform end-to-end integrated testing for full and incremental loads
- Plan for go-live and production deployment
- Create production deployment steps
- Configure parameters and scripts for go-live; test and review the instructions
- Create release documents and help build and deploy code across servers
- Provide go-live support and review after go-live
- Review existing ETL processes and tools, and provide recommendations for improving performance and reducing ETL timelines
- Review infrastructure and remediate issues for overall process improvement
- Transfer knowledge to Ministry staff and develop documentation on the work completed
- Document the work and share the end-to-end ETL design, troubleshooting steps, configuration, and script reviews
- Transfer documents and scripts to the Ministry and review the documents with Ministry staff
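
As a rough illustration of the incremental-load pattern referenced above, here is a hedged PySpark/Delta sketch that applies change records (for example, rows replicated by GoldenGate and landed in the lake by Data Factory) to a datamart table with a merge. The table names, key column, and change-operation column are hypothetical.

# Sketch of applying an incremental (CDC-style) load to a Delta table on Databricks.
# Table names, the key column, and the 'op' change flag are illustrative assumptions.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()  # already provided in Databricks notebooks

# Change records landed in the curated zone for this load window.
changes_df = spark.read.table("curated.customer_changes")

# Keep only the latest change per business key so the merge is deterministic.
latest_changes = (
    changes_df
    .withColumn("rn", F.row_number().over(
        Window.partitionBy("customer_id").orderBy(F.col("updated_at").desc())))
    .filter("rn = 1")
    .drop("rn")
)

target = DeltaTable.forName(spark, "datamart.dim_customer")

# Upsert: delete rows flagged 'D' by CDC, update existing rows, insert new ones.
(
    target.alias("t")
    .merge(latest_changes.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedDelete(condition="s.op = 'D'")
    .whenMatchedUpdateAll(condition="s.op <> 'D'")
    .whenNotMatchedInsertAll(condition="s.op <> 'D'")
    .execute()
)

A full-load variant would simply overwrite or truncate-and-reload the target table instead of merging; the incremental path above is what keeps the datamart in sync between full loads.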

Qualifications
- 7+ years using ETL tools such as Microsoft SSIS, along with stored procedures and T-SQL
- 2+ years with Azure Data Lake and Databricks, including building Azure Data Factory and Azure Databricks pipelines
- 2+ years of Python and PySpark
- Oracle GoldenGate
- SQL Server
- Oracle
- Ability to present technical requirements to the business

Nice to have:
- Knowledge of and experience with building data ingestion, history tracking, and change data capture using Oracle GoldenGate

Summary
If interested and qualified for this role, please apply today for immediate consideration!

Randstad Canada is committed to fostering a workforce reflective of all peoples of Canada. As a result, we are committed to developing and implementing strategies to increase the equity, diversity and inclusion within the workplace by examining our internal policies, practices, and systems throughout the entire lifecycle of our workforce, including its recruitment, retention and advancement for all employees. In addition to our deep commitment to respecting human rights, we are dedicated to positive actions to effect change to ensure everyone has full participation in the workforce free from any barriers, systemic or otherwise, especially equity-seeking groups who are usually underrepresented in Canada's workforce, including those who identify as women or non-binary/gender non-conforming; Indigenous or Aboriginal Peoples; persons with disabilities (visible or invisible); and members of visible minorities, racialized groups and the LGBTQ2+ community.

Randstad Canada is committed to creating and maintaining an inclusive and accessible workplace for all its candidates and employees by supporting their accessibility and accommodation needs throughout the employment lifecycle. We ask that all job applicants please identify any accommodation requirements by sending an email to [email protected] to ensure their ability to fully participate in the interview process.


Job tags

Contract work | Immediate start | Downtown | 3 days per week

