
AI or ML Engineer

Noida, India

Caring. Connecting. Growing together.

With these values to guide us, our people are committed to making a meaningful difference in the lives of those we are honored to serve.

Requisition number: 2348560
Job category: Technology
Primary location: Noida, Uttar Pradesh
Date posted: 04/07/2026
Overtime status: Exempt
Travel: No

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.


The AI/ML Engineer (SG27) will be responsible for building and scaling next-generation Big Data platforms and cloud-native data applications that enable advanced analytics, AI/ML, and intelligent automation. This role requires solid hands-on expertise in PySpark, Scala Spark, and Azure Cloud, with a focus on batch and streaming architectures.


The role involves close collaboration with data science, AI/ML, architecture, and business teams to define technical solutions, integrate machine learning and LLM-based capabilities, and ensure high performance, reliability, and scalability of enterprise data systems. The engineer will also contribute to architectural decisions, automation strategy, and continuous improvement of the Big Data ecosystem.


Primary Responsibilities:

  • Design, code, test, document, and maintain high-quality, scalable Big Data applications using PySpark and Scala Spark on Azure Cloud platforms
  • Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and translate them into robust data engineering solutions
  • Assist in defining and evolving technical and data architecture, ensuring alignment with enterprise standards and long-term scalability
  • Build and maintain batch and real-time streaming data pipelines, leveraging technologies such as Kafka and Spark Structured Streaming
  • Partner with AI/ML teams to integrate machine learning models into data pipelines for predictive analytics, automation, and intelligent insights
  • Support deployment, versioning, and monitoring of ML models using platforms such as MLflow, Azure ML, and Databricks ML
  • Work with Large Language Models (LLMs) to develop intelligent data-driven solutions, including prompt engineering, secure API integrations, and contextual data enrichment
  • Embed LLM capabilities into data and development pipelines following governance, security, and context management best practices
  • Research, evaluate, and implement new tools, frameworks, and automation patterns to enhance sustainability and performance of the Big Data platform
  • Identify gaps, risks, and optimization opportunities in existing data solutions and recommend improvements
  • Perform performance tuning, optimization, and application migration across cloud and Big Data environments
  • Build and maintain CI/CD automation pipelines using Jenkins, GitHub Actions, Docker, Kubernetes, Maven, and related tools
  • Analyze complex data architectures, design frameworks, and integrate with multiple databases and data warehouses, including Snowflake
  • Develop prototypes, proof-of-concepts (POCs), and participate in design and code reviews to ensure engineering excellence
  • Collaborate with management, architecture, QA, and cross-functional development teams to deliver end-to-end solutions
  • Write comprehensive technical documentation and provide support for production deployments and incident resolution
  • Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications:

  • Bachelor's or Master's degree in Computer Science, Information Technology, or equivalent
  • 7+ years of professional experience in data engineering and Big Data development
  • Hands-on experience with Snowflake and solid expertise in SQL and PL/SQL
  • Hands-on experience with Docker, Kubernetes, and modern DevOps practices
  • Hands-on exposure to LLMs (e.g., OpenAI GPT, Azure OpenAI), including prompt design, fine-tuning concepts, and secure workflow integration
  • Solid hands-on experience with Python and Scala
  • Experience building large-scale batch and streaming data processing systems
  • Experience with Shell scripting for automation and operational support
  • Experience integrating AI/ML models using frameworks such as scikit-learn, TensorFlow, or PyTorch
  • Experience building and managing CI/CD pipelines using Jenkins, GitHub Actions, and Git-based workflows
  • Experience working in Agile development environments
  • Proven experience working in cloud environments, preferably Microsoft Azure
  • Expertise in Apache Spark and solid understanding of Hadoop ecosystem concepts
  • Understanding of the ML model lifecycle, including training, evaluation, deployment, and monitoring
  • Proven ability to collaborate effectively with cross-functional teams and stakeholders
  • Proven ability to analyze complex problems and deliver solution-focused outcomes


Preferred Qualifications:

  • Good understanding of the US healthcare domain
  • Familiarity with Microsoft Copilot Studio for building and deploying conversational AI solutions


At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.

Benefits

Our mission of helping people live healthier lives extends to our team members. Learn more about our range of benefits designed to help you live well.

Life

Resources and support to focus on what matters most to you, in every facet of your life.

Emotional

Education, tools and resources to help you reduce and manage stress, build resilience and more.

Physical

Health plans and other coverage to support wellness for you and your loved ones.

Financial

Benefits for today and to help you plan for the future, including your retirement.


We’re honored to be recognized for our exceptional work culture

  • AGWF recognition award
  • 2025 Campus Forward Award from RippleMatch
  • LinkedIn Top Companies 2025
  • Forbes Best Large Employers in the United States 2024
  • America's Greatest Workplaces 2024