Senior Data Engineering Lead – ETL
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities:
- Understand the Caredata architecture/domain and start contributing to new and existing business requests
- Apply experience in data integration, data warehousing, and cloud technologies
- Develop efficient, high-performing ETL solutions
- Design, develop, test and performance tune complex ETLs
- Develop and maintain scalable data pipelines in response to customer requirements
- Perform the data analysis required to troubleshoot data-related issues and assist in their resolution
- Design, develop, implement, and run data solutions that improve data efficiency, reliability, and quality
- Participate in design review sessions and peer ETL reviews
- Assess and profile source and target data (data structure, quality, completeness, schema, nulls, etc.) and requested business use cases
- Summarize testing and validation results, and communicate and make recommendations/decisions on the best course of action for remediation
- Devise solutions resourcefully from existing or available resources, based on knowledge of the organization and the level of execution effort
- Participate in agile work environment, attend daily scrums, and complete sprint deliverables on time
- Support practices, policies, and operating procedures and ensure alignment with departmental objectives and strategy
- Ensure code meets the desired quality checks using Sonar
- Ensure all cloud infrastructure remains intact and resolve any issues encountered
- Schedule the deployed pipelines using Airflow, following the proper dependency hierarchy of jobs
- Promote code/applications from non-prod to higher environments (stage/prod) for go-live
- Build and maintain pipelines and automation through GitOps (GitHub Actions, Jenkins, etc.)
- Identify solutions to non-standard, complex requests and problems, and create solutions using available technologies
- Build solid relationships with IT operational leaders to ensure connectivity to the business
- Support a work environment in which people can perform to the best of their abilities; hold self accountable for technical abilities, productive results, and leadership attributes
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
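The responsibility above about scheduling pipelines with a proper dependency hierarchy comes down to running jobs in topological order. As a minimal sketch of that idea using only the Python standard library (job names are hypothetical; in practice the ordering would be expressed as an Airflow DAG):

```python
from graphlib import TopologicalSorter

# Hypothetical ETL job graph: each job maps to the set of jobs it depends on.
job_deps = {
    "load_dim_tables": {"extract_source"},
    "load_fact_tables": {"load_dim_tables"},
    "publish_reports": {"load_fact_tables"},
}

# A scheduler must run jobs in an order that respects every dependency;
# static_order() yields exactly such an ordering.
run_order = list(TopologicalSorter(job_deps).static_order())
print(run_order)
```

Airflow applies the same principle: declaring `task_a >> task_b` guarantees `task_a` completes before `task_b` starts.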
Required Qualifications:
- Bachelor's or master's degree in Computer Science, Information Technology, or equivalent
- 10+ years of experience in designing and developing ETL solutions
- 10+ years of working knowledge in a Data Warehouse/BI environment
- Solid DBMS experience with Snowflake, SQL, and Hive
- Good experience with job schedulers such as Airflow
- Windows batch/PowerShell scripting and/or UNIX shell scripting, plus Python experience
- Good experience working in a cloud environment, preferably Azure
- Experience with continuous integration/continuous delivery (CI/CD) pipelines using Jenkins and GitHub Actions
- Experience with contemporary SDLC methodologies such as Agile, Scrum
- Expertise in big data frameworks such as Spark and a good understanding of Hadoop concepts
- Hands-on experience with Rally
- Solid ETL skills using big data technologies (Databricks, Spark, Scala, Python), Kafka, and Azure cloud
- Solid SQL skills, including complex SQL constructs and DDL generation
- Proven organizational, analytical, writing, problem-solving, and interpersonal skills
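As a purely illustrative sketch of the CI/CD qualification above, a minimal GitHub Actions workflow that builds and tests an ETL codebase might look like the following (file names and commands are assumptions, not part of this posting):

```yaml
# Hypothetical minimal workflow: install dependencies and run tests on every push.
name: etl-ci
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -r requirements.txt
      - run: pytest
```

A Jenkins pipeline would express the same checkout/install/test stages in a `Jenkinsfile`.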
Preferred Qualifications:
- Good understanding of US healthcare
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone – of every race, gender, sexuality, age, location and income – deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes – an enterprise priority reflected in our mission.
Additional information about the position
Requisition number 2272264
Business segment Optum
Travel required No
Additional job locations
Gurgaon, Haryana, IN
Hyderabad, Telangana, IN
Overtime status Exempt
Telecommuting position No