Develop data pipelines for analytics and reporting, optimize workflows, assist in data ingestion, monitor job performance, and collaborate on solutions.
Requisition Number: 2355738
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
Primary Responsibilities:
- Develop, enhance, and maintain data pipelines that enable analytics and enterprise reporting solutions
- Assist in building and optimizing data processing workflows using Python, Spark, and Scala, following established development standards and best practices
- Support batch and streaming data ingestion processes, including exposure to Kafka-based data pipelines
- Apply data warehousing and data modeling fundamentals while working with structured and semi-structured data
- Participate in activities such as job monitoring, incident resolution, and ensuring day-to-day platform stability
- Assist with addressing security vulnerabilities, implementing fixes, and following data security guidelines across environments
- Support tool and software upgrades, testing changes and validating data pipelines post-deployment
- Create, modify, and maintain Azure Data Factory (ADF) pipelines under the guidance of senior engineers
- Support Databricks workloads, including job execution, performance troubleshooting, and basic optimizations
- Manage and support cloud data storage and data movement using Azure Blob Storage and AZ Copy
- Collaborate with cross-functional teams, document solutions, and participate in knowledge transfer activities
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
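The pipeline duties above generally follow an extract-transform-load (ETL) shape: ingest raw records, normalize them, and load them into a reporting table. As a minimal, purely illustrative sketch (standard-library Python only; all names and data are hypothetical, and production workloads in this role would instead use Spark, Azure Data Factory, or Databricks):

```python
import csv
import io
import sqlite3

# Extract: semi-structured CSV input (in practice this might arrive via
# Blob Storage or a Kafka topic). One record is missing its cost field.
raw = io.StringIO("member_id,visit_date,cost\n101,2024-01-05,250.00\n102,2024-01-06,\n")

def transform(rows):
    """Normalize records: drop rows missing a cost and cast field types."""
    for row in rows:
        if not row["cost"]:
            continue  # incomplete record; a real pipeline might quarantine it instead
        yield (int(row["member_id"]), row["visit_date"], float(row["cost"]))

# Load: write the cleaned rows into a reporting table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE visits (member_id INTEGER, visit_date TEXT, cost REAL)")
conn.executemany("INSERT INTO visits VALUES (?, ?, ?)", transform(csv.DictReader(raw)))

total = conn.execute("SELECT SUM(cost) FROM visits").fetchone()[0]
print(total)  # 250.0 — only the complete record was loaded
```

The same extract/transform/load separation carries over directly to Spark jobs and ADF pipelines; only the engine and the storage layer change.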
Required Qualifications:
- Bachelor's degree or equivalent experience
- Foundational experience with UNIX/Linux, DataStage, and Teradata
- Working knowledge of Python for data processing and automation tasks
- Exposure to Apache Spark and basic big data processing concepts
- Exposure to or interest in working with healthcare data systems
- Familiarity with Airflow or similar scheduling/orchestration tools
- Basic understanding of Databricks and Snowflake platforms
- Understanding of ETL/ELT processes, data warehousing concepts, and SQL fundamentals
- Solid communication skills, analytical thinking, and an eagerness to learn
Preferred Qualifications:
- Experience supporting production systems or participating in operational activities
- Exposure to Kafka or streaming data concepts
- Familiarity with GitHub and GitHub Copilot
- Basic knowledge of Azure cloud services, especially Azure Data Factory and Blob Storage
- Proven ability to work effectively in a team-oriented, enterprise environment

