Location
São Paulo, SP | Brazil
Job description
Challenge We are looking for a candidate to work in our RDO department on developing and maintaining the architecture and infrastructure needed to extract, transform and load data (ETL) from various sources. The ideal candidate will have a strong background in data engineering , a deep understanding of data pipelines and proficiency in implementing scalable and efficient solutions . If you'd like to get into this field, are good with data modeling and want to take on challenges in the analytics area, this job is for you! Responsibilities Data Architecture - Design, implement, and maintain scalable and robust data architecture to support the organization's data needs. Collaborate with analysts, and other stakeholders to understand data requirements and translate them into technical specifications. ETL Development - Develop, optimize, and maintain ETL processes to ensure the efficient and accurate flow of data from source systems to data warehouses. Implement data cleansing, transformation, and enrichment techniques to ensure data quality and consistency. Data Warehousing - Manage and optimize data warehouse structures for fast querying and reporting. Work with large datasets and implement strategies for partitioning, indexing, and optimizing query performance. Data Integration: Integrate data from various sources, both internal and external, ensuring seamless data flow and consistency. Implement real-time data integration solutions to support business needs. Data Modeling: Design and implement data models that align with business requirements and support analytical processes. Ensure data models are scalable, flexible, and meet performance requirements. Monitoring and Maintenance: Implement monitoring systems to proactively identify and address data pipeline issues. Perform regular maintenance, updates, and optimizations to enhance data pipeline performance. Reporting - Implement reports in Power BI to generate insights to different stakeholders. 
Background
Proven experience as a Data Engineer or in a similar role
Strong programming skills in languages such as Python or Java
Proficiency in SQL and experience with database systems (e.g., PostgreSQL, MySQL, MongoDB)
Familiarity with cloud platforms such as AWS, Azure, or Google Cloud
Familiarity with ETL platforms such as Azure Synapse and Alteryx
Strong knowledge of reporting tools such as Power BI and Tableau
Excellent problem-solving and analytical skills
Strong communication and collaboration skills
Education
Bachelor's degree in computer science, data science, or a related field
Fluent English; Spanish is a plus
Remote opportunity based in São Paulo