Location
Bangalore | India
Job description
DESIRED QUALIFICATIONS:
- Knowledge of Python 3, with the ability to build REST APIs using FastAPI or the Flask REST framework
- Experience building data pipelines and processing statistical data using Python data science libraries
- Strong proficiency in SQL
- Experience working with Jupyter Notebook
- Experience working with GraphQL using Graphene
- Experience designing and developing ETL pipelines to move data from source systems (CRM, weblogs, etc.) into a data lake (HDFS, S3, etc.)
- Experience designing scalable microservices and deploying them using Docker
- Experience with Docker, Docker Compose, and Kubernetes
- Knowledge of user authentication and authorization using OAuth2
- Experience working in an Agile model
- Experience working with cloud environments such as AWS, GCP, or Azure
- Working knowledge of Git
- Deep proficiency with Python 3 and the basic machine learning libraries (e.g., scikit-learn, NumPy, pandas)
- Ability to perform exploratory data analysis to gain insights from data
- At least 3 years of Python REST API development (mandatory)
- Ability to write unit and integration test cases and maintain 80 percent code coverage
- Working knowledge of the overall machine learning workflow
- Experience building machine learning workflows: training models, tuning hyperparameters, and deploying to production for customers
- Ability to perform research and POCs, and to analyse ML algorithms that could be used to solve a given problem
- Experience in marketing mix modelling and sales forecasting using linear models
- Must be a good team player, self-motivated to achieve positive results
- Takes ownership of responsibilities
- Demonstrates a high degree of reliability, integrity, and trustworthiness
- Demonstrates strong communication and presentation skills
- Ability to manage time and meet or exceed all deadlines
RESPONSIBILITIES:
- Build and maintain complex ETL and data pipelines in Python
- Apply strong marketing domain acumen to understand marketing strategies and convert them into Python code
- Perform data manipulation, wrangling, cleansing, and analysis, and be responsible for the complex, large-scale datasets used for statistical modelling and data mining