Software Engineering Specialist
Location
Pune | India
Job description
Job ID: 183949
Required Travel: Minimal
Managerial - No
Location: India - Pune (Amdocs Site)
Who are we?
Amdocs helps those who build the future to make it amazing. With our market-leading portfolio of software products and services, we unlock our customers’ innovative potential, empowering them to provide next-generation communication and media experiences for both the individual end user and enterprise customers. Our 30,000 employees around the globe are here to accelerate service providers’ migration to the cloud, enable them to differentiate in the 5G era, and digitalize and automate their operations. Listed on the NASDAQ Global Select Market, Amdocs had revenue of $4.3 billion in fiscal 2021. For more information, visit Amdocs at
In one sentence
Responsible for the design, development, modification, debugging and/or maintenance of software systems. Works on specific modules, applications or technologies, and deals with sophisticated assignments during the software development process.
What will your job look like?
- Perform development and support activities in the data warehousing domain
- Understand the high-level design and application interface design, and build the low-level design. Perform application analysis and propose technical solutions for application enhancements or production issues
- Perform development and deployment: code, unit test, and deploy
- Create the necessary documentation for all project deliverable phases
- Handle production issues (Tier 2 support, weekend on-call rotation) and ensure SLAs are met
All you need is...
- Technical Skills:
- Mandatory
- A clear understanding of Snowflake architecture
- 3+ years of hands-on Snowflake experience: SnowSQL, the COPY command, stored procedures, and advanced features such as Snowpipe and semi-structured data loading
- Experience loading Avro and Parquet files into Snowflake
- Knowledge of SQL, Unix, and advanced Unix shell scripting
- Hands-on experience with file transfer mechanisms (NDM, SFTP, Datarouter, etc.)
- Knowledge of schedulers such as TWS
- Good to have
- 3+ years of hands-on Python programming experience
- Hands-on experience or working knowledge of Spark/Databricks
- Working knowledge of Kafka and NiFi
- Experience working in a cloud (AWS/Azure/GCP) data warehousing environment
- Willingness to learn data warehousing technologies and to work outside your comfort zone in other ETL technologies (DataStage, Oracle, mainframe, etc.); hands-on experience is a plus
- Behavioral skills:
- Eagerness and hunger to learn
- Good problem-solving and decision-making skills
- Good communication skills within the team, across sites, and with the customer
- Willingness to stretch working hours when necessary to support business needs
- Ability to work independently and drive issues to closure
- Consulting with relevant parties when necessary and raising risks in a timely manner
- Ability to handle multiple complex work assignments effectively while consistently delivering high-quality work
Why you will love this job:
- You will work on large, challenging, and complex programs.
- You will work with the best and brightest minds, collaborating across accounts and regions to provide a single voice.
- You will have the opportunity to work with the industry's most advanced technologies, including cloud.