AgileEngine
Location
São Paulo, SP | Brazil
Job description
AgileEngine is a top-ranking provider of software solutions to Fortune 500, Global 500, and Future 50 companies. Listed on Inc. 5000 among the fastest-growing US companies, we are always open to talented software, UX, and data experts in the Americas, Europe, and Asia.
If you like a challenging environment where you’re working with the best and are encouraged to learn and experiment daily, there’s no better place — guaranteed! :)
What you will do
Build sophisticated data pipelines using dbt, Airflow, and Snowflake, with special emphasis on performance optimization and data integrity using Great Expectations;
Lead architectural design sessions for the modern data stack, focusing on solutions that integrate seamlessly with our technology stack: Snowflake, Airflow, dbt, Great Expectations, and AWS data services;
Work with our data science and product management teams to design, rapidly prototype, and productize new data product ideas and capabilities;
Conquer complex problems by finding new, simple, and efficient ways to solve them, with a focus on the reliability, scalability, quality, and cost of our platforms;
Build processes supporting data transformation, data structures, metadata, and workload management;
Collaborate with the team to perform root cause analysis and audit internal and external data and processes to help answer specific business questions.
Must haves
Master’s degree (or a B.S. degree with relevant industry experience) in math, statistics, computer science, or an equivalent technical field;
5+ years of professional experience in dimensional data warehousing, data modeling, and big data;
5+ years in a pivotal Software/Data Engineering role, with deep exposure to modern data stacks, particularly Snowflake, Airflow, dbt, and AWS data services;
Expertise in applying Python and SQL to execute complex data operations, customize ETL/ELT processes, and perform advanced data transformations across the platform;
Expertise in establishing data quality assurance frameworks particularly using Great Expectations;
Experience working directly with data analytics teams to bridge business requirements with data engineering;
Experience with AWS infrastructure;
Excellent troubleshooting and problem-solving skills;
Ability to operate and prioritize in an agile, entrepreneurial start-up environment;
Excellent communication and teamwork, and a passion for learning;
Curiosity and passion for data, visualization, and solving problems;
Willingness to question the validity and accuracy of data and assumptions.
Nice to haves
Experience with Redshift, Snowflake, or other MPP databases is a plus;
Knowledge of ETL/ELT tools such as Informatica, Matillion, IBM DataStage, or SaaS ETL tools is a plus;
Experience with Tableau or other reporting tools is a plus;
Experience with CI/CD using tools like CircleCI or Harness is a plus.
Professional growth
Accelerate your professional journey with mentorship, TechTalks, and personalized growth roadmaps.
Competitive compensation
We match your ever-growing skills, talent, and contributions with competitive USD-based compensation and budgets for education, fitness, and team activities.
A selection of exciting projects
Join projects with modern solutions development and top-tier clients that include Fortune 500 enterprises and leading product brands.
Flextime
Tailor your schedule for an optimal work-life balance, with the option of working from home or going to the office, whatever makes you the happiest and most productive.