DivTek Global Solutions Inc.
Location
Reno, NV | United States
Job description
Job Title: Senior Data Engineer (AWS)
Job Location: Remote or Reno, NV
Salary: Competitive salary and benefits
Job Type: Full-Time
About the Company: This is a full-time opening with our direct client.
Note: Client is unable to sponsor applicants for work visas
REQUIREMENTS
- 10+ years of professional experience
- Strong experience with AWS Connect
- 5+ years of experience in data management, data engineering, or data architecture.
- Enterprise Data Warehouse development preferred
- Experience working with healthcare data
- Experience with AWS infrastructure including AWS Connect and AWS Lambda.
- 3+ years of SQL programming experience and associated SQL tools (SSIS, SSMS, SSRS, etc.).
- Experience with Visual Studio is preferred.
- At least 3 years of experience in developing data ingestion, data processing and analytical pipelines for big data, relational databases, NoSQL and data warehouse solutions.
- Minimum of 3 years of RDBMS experience.
- Extensive hands-on experience implementing data migration and data processing using Amazon Web Services, including knowledge of Amazon Connect and AWS Lambda.
- Knowledge of medical terminology, especially ICD-10 codes, CPT codes, DRG codes, and an understanding of adjudicated claims data.
RESPONSIBILITIES
- Manage and optimize the movement and validation of data from an Epic EMR system to SQL databases in AWS or Azure and either from or to the Salesforce platform.
- Accountable for data engineering lifecycle including research, proof of concepts, architecture, design, development, test, deployment, and maintenance.
This role will focus on AWS optimization, with some work occurring in other cloud-based environments.
- Oversee the development of novel data pipelines that integrate and normalize large datasets from a variety of sources (e.g., electronic health records, claims, wearable devices, publicly available data, etc.) to enable learning health, machine learning model development, and deployment.
- Design, direct and implement ETL processes, including data capture, data quality, testing and validation methods.
- Build instrumentation into the development process so that data pipelines can be monitored; use these measurements to detect internal problems before they result in user-visible outages or data quality issues.
- Build processes and diagnostic tools to troubleshoot, maintain and optimize engineering environments and respond to production issues.
- Provide subject matter expertise and hands-on delivery of data capture, curation, and consumption pipelines for AWS.
- Participate in deep architectural discussions to build confidence and ensure customer success when building new solutions and migrating existing data applications to the Azure platform.
- Develop documentation, such as data dictionaries, guides, or data flow diagrams, that assists staff in identifying, locating, and using the organization's data.
How To Apply: If interested, please apply through Dice.com or send an email to pramod AT dtgsi.com with job ID 'DIV24-AWSDEC'.
- Dice Id: 10264163
- Position Id: DIV24-AWSDEC