AgileEngine
Location
Tallahassee, FL | United States
Job description
AgileEngine is a top-ranking provider of software solutions to Fortune 500, Global 500, and Future 50 companies. Listed on Inc. 5000 among the fastest-growing US companies, we are always open to talented software, UX, and data experts in the Americas, Europe, and Asia.
If you like a challenging environment where you’re working with the best and are encouraged to learn and experiment daily, there’s no better place — guaranteed! :)
What you will do
Lift and shift ETL pipelines from legacy to new environments;
Monitor data pipelines, identify bottlenecks, optimize data processing and storage for performance and cost-effectiveness;
Analyze data sources and build cloud data warehouse and data lake solutions;
Collaborate effectively with cross-functional teams including data scientists, analysts, software engineers, and business stakeholders.
Must haves
3+ years of professional experience in a Data Engineering role;
Proficiency in programming languages commonly used in data engineering, such as Python and SQL (and optionally Scala), for working with data processing frameworks like Spark and libraries like Pandas;
Proficiency in designing, deploying, and managing data pipelines using Apache Airflow for workflow orchestration and scheduling;
Ability to design, develop, and optimize ETL processes that move and transform data from various sources into the data warehouse, ensuring data quality, reliability, and efficiency;
Knowledge of big data technologies and frameworks such as Apache Spark for processing large volumes of data efficiently;
Extensive hands-on experience with various AWS services relevant to data engineering, including but not limited to Amazon MWAA, Amazon S3, Amazon RDS, Amazon EMR, AWS Lambda, AWS Glue, Amazon Redshift, AWS Data Pipeline, Amazon DynamoDB;
Deep understanding and practical experience in building and optimizing cloud data warehousing solutions;
Ability to monitor data pipelines, identify bottlenecks, and optimize data processing and storage for performance and cost-effectiveness;
Excellent communication skills to collaborate effectively with cross-functional teams including data scientists, analysts, software engineers, and business stakeholders;
Bachelor’s degree in computer science/engineering or other technical field, or equivalent experience.
Nice to haves
Familiarity with the fintech industry, including an understanding of financial data, regulatory requirements, and business processes specific to the domain;
Documentation skills to document data pipelines, architecture designs, and best practices for knowledge sharing and future reference;
Experience with GCP services relevant to data engineering;
Snowflake;
OpenSearch, Elasticsearch;
Jupyter for data analysis;
Bitbucket, Bamboo;
Terraform.
Professional growth
Accelerate your professional journey with mentorship, TechTalks, and personalized growth roadmaps.
Competitive compensation
We match your ever-growing skills, talent, and contributions with competitive USD-based compensation and budgets for education, fitness, and team activities.
A selection of exciting projects
Join projects with modern solutions development and top-tier clients that include Fortune 500 enterprises and leading product brands.
Flextime
Tailor your schedule for an optimal work-life balance, with the option of working from home or going to the office, whatever makes you the happiest and most productive.