Walmart
Location
Chennai | India
Job description
What you'll do:
As a data scientist, you'll understand the business problem and develop ETL-based solutions.
Interact with Walmart engineering teams across geographies to leverage their expertise and contribute to the technology stack.
Engage with Product Management and Business to drive the agenda, set your priorities, and deliver awesome products.
Demonstrate excellent proficiency in data science and data engineering technologies (ETL, YARN, Spark, Kafka, Hive, GCP, Python/R/Scala, Linux/Unix/shell environments, query optimisation, Dataproc).
Contribute to software development by providing engineering patterns that deliver the optimal product, including communicating design patterns.
Partner with product owners and business SMEs to analyse the business need and provide a supportable, sustainable engineered solution. Ensure that the overall technical solution is aligned with the business needs.
Drive the creation and modification of product portfolio components; identify and engage all technical resources necessary to contribute to the solution; and ensure the solution is consistent with Walmart architecture, design, and development standards.
Experiment: this is a startup-like environment, so everything can change as we experiment with doing more custom partnership work.
Develop applications using industry best practices. Make adjustments to adopt new methodologies that provide the business with increased flexibility and agility.
Stay current with the latest development tools, technology ideas, patterns and methodologies; share knowledge by clearly articulating results and ideas to key stakeholders.
What you'll bring:
Experience in Spark and distributed computing frameworks.
Professional hands-on experience in Scala/Python.
Professional hands-on experience in SQL and query optimisation.
Experience in programming design patterns.
Experience in system design.
Experience with CI/CD tools such as Jenkins.
Experience with orchestration tools such as Airflow or Automic.
Hands-on experience in data processing and data manipulation, including data warehousing concepts, SCD types, etc.
Exposure to at least one cloud platform (GCP preferred).
Exposure to streaming data use cases via Kafka or Structured Streaming.
Exposure to multi-hop architecture would be an added advantage.