Data Engineering Consultant
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities:
- As a Data Engineering Consultant, you will support the Information Security organization's Data Enablement team: defining, designing, constructing, processing and supporting data pipelines, data products and data assets for a wide variety of client use cases within the Security organization
- In this role, you will work with a variety of programming languages on a tech stack that leverages Snowflake, various cloud service provider (CSP) data storage offerings, Airflow orchestration and scheduling, and internal CI/CD and DevOps tooling. You will support our ongoing data strategy and solutions with a focus on the Medallion data architecture
- You will work with other engineers, analysts and clients to understand business problems, design proposed solutions and support the products you help to build
- A successful candidate will have experience working in a data products environment; deep skills with Snowflake and Airflow; skill in building and supporting complex data pipelines and ETL/ELT with Python, Spark, PySpark, Scala, Java and SQL; and demonstrated experience supporting industry-standard data governance and data quality processes. Demonstrated ability to conduct data modeling and to construct and support data models is also essential, as is the ability to performance-tune and optimize data pipelines
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
Required Qualifications:
- Undergraduate degree in Computer Science, Data Science, Data Analytics, Mathematics, Information Systems or a related field with an emphasis on coding, analytics or applied mathematics
- 4+ years of direct, hands-on data engineering experience with Snowflake and Airflow
- 5+ years of hands-on experience coding complex data pipelines with Python, PySpark, Scala, SQL and related languages
- 3+ years of experience integrating pipelines and tech stack components with database products such as Postgres, MySQL, Microsoft SQL Server, Oracle, MongoDB and Cassandra
- Demonstrated experience working with and operating on at least one of the following cloud platforms: AWS, GCP or Azure
- Demonstrated experience operating in an environment with modern data governance processes and protocols
- Demonstrated experience building data quality monitoring solutions
- Demonstrated experience with modern CI/CD and DevOps principles, including automated deployment
- Demonstrated problem-solving skills
- Solid communication skills, both verbal and written
Preferred Qualifications:
- Advanced degree in computer science, math, analytics, data science or another similarly technical field
- Experience with Security data and information security
- Experience with Health care data
- Experience implementing data privacy controls for a variety of data, from HIPAA-protected health information to PII and PCI data
- Experience with streaming technologies such as Kafka
- Experience building and ingesting data from APIs
- Experience with Azure Data Factory
- Exposure to reporting technologies such as Tableau, Power BI and MicroStrategy
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone – of every race, gender, sexuality, age, location and income – deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes – an enterprise priority reflected in our mission.
Additional Job Detail Information
Requisition Number 2305930
Business Segment Optum
Employee Status Regular
Travel No
Country IN
Overtime Status Exempt
Schedule Full-time
Shift Day Job
Telecommuter Position No