JobNob

Your Career. Our Passion.

AWS Data Architect


Blazeclan Technologies


Location

Delhi | India


Job description

Position: AWS Data Architect
Experience: 7+ years
Location: Pune, Mumbai, Bangalore

We are looking for an AWS Data Architect on a 6-month contract position who will work from the customer's Mumbai office.

Professional Skill Requirements

- Proven ability to build, manage, and foster a team-oriented environment
- Proven ability to work creatively and analytically in a problem-solving environment
- Desire to work in an information systems environment
- Excellent communication (written and oral) and interpersonal skills
- Excellent leadership and management skills

Requirements

- Minimum of 5 years of consulting or client service delivery experience on Amazon Web Services (AWS)
- Minimum of 7 years of experience in big data, database, and data warehouse architecture and delivery
- Minimum of 5 years of professional experience in 2 of the following areas: solution/technical architecture in the cloud; big data/analytics/information analysis/database management in the cloud
- Experience with private and public cloud architectures, their pros and cons, and migration considerations
- Extensive hands-on experience implementing data migration and data processing using AWS services: VPC/SG, EC2, S3, Auto Scaling, CloudFormation, Lake Formation, DMS, Kinesis, CDC processing, Redshift, Snowflake, RDS, Aurora, DynamoDB, CloudTrail, CloudWatch, Docker, Lambda, Spark, Glue, API Gateway, etc.
- Familiarity with the technology stack available in the industry for data management, data ingestion, capture, processing, and curation: Kafka, StreamSets, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, etc.
- Experience developing and deploying ETL solutions on AWS; strong in Spark and PySpark
- Familiarity with the technology stack available in the industry for metadata management: data governance, data quality, MDM, lineage, data catalog, etc.
- Bachelor's degree, equivalent work experience (12 years), or Associate's degree with six years of work experience

Certifications

- Certified AWS Solutions Architect Associate

Good to have

- Certified AWS Solutions Architect Professional
- Certified AWS Big Data Specialty
- Certified AWS AI/ML Specialty
- DevOps on an AWS platform
- Multi-cloud experience (Azure, AWS, Google) is a plus

Interested candidates can drop their CVs at [email protected]

