The role involves designing and maintaining ETL pipelines, ensuring data quality and compliance, leading teams, and collaborating with various business partners.
Description and Requirements
Position Summary:
MetLife established a Global Capability Center (MGCC) in India to scale and mature its Data & Analytics, technology, and operations capabilities in a cost-effective manner and make MetLife future-ready. The center is integral to Global Technology and Operations, with a focus on protecting and building MetLife IP, promoting reusability, and driving experimentation and innovation. The Data & Analytics team in India mirrors the Global D&A team, with the objective of driving business value through trusted data, scaled capabilities, and actionable insights. The operating model consists of business-aligned data officers (US, Japan, LatAm, and Corporate functions) enabled by enterprise COEs: data engineering, data governance, and data science.
Role Proposition
The Data Engineer plays a critical role in the data and analytics life cycle and contributes significantly to production-grade data and analytics solutions. The role requires demonstrated Big Data, engineering, and cloud expertise. Besides contributing in an individual capacity to solve wide-ranging business problems, this role also leads and develops Data & Analytics talent.
Job Responsibilities:
• Design, build, and maintain robust ETL/ELT pipelines in the cloud (Azure) or on-premises to collect, ingest, and store large volumes of structured and unstructured data for batch and real-time processing
• Monitor, optimize, and troubleshoot data pipelines to ensure reliability, scalability, and performance
• Ensure data processing, quality, security, and compliance guidelines, policies and standards are followed
• Collaborate with multiple partners from Business, Technology, Operations and D&A capabilities (Data Governance, Data Quality, Data Modeling, Data Architecture, Data science, DevOps, BI & insights)
• Independently lead design, solutioning & estimations
• Provide people leadership: coach, develop and engage talent
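To ground the pipeline responsibilities above, here is a minimal, self-contained Python sketch of an extract-transform-load step with a basic data quality gate. It is an illustration only, not MetLife's actual stack; the record fields (`policy_id`, `premium`) are hypothetical, and a production pipeline would route malformed records to a dead-letter store rather than silently skip them.

```python
import json

def extract(raw_lines):
    """Parse newline-delimited JSON records, skipping malformed lines."""
    records = []
    for line in raw_lines:
        try:
            records.append(json.loads(line))
        except json.JSONDecodeError:
            pass  # illustrative only: a real pipeline would quarantine these
    return records

def transform(records):
    """Quality gate: keep only records with the required fields, with types enforced."""
    return [
        {"policy_id": r["policy_id"], "premium": float(r["premium"])}
        for r in records
        if "policy_id" in r and "premium" in r
    ]

def load(records, sink):
    """Append validated records to the target store and report the count."""
    sink.extend(records)
    return len(records)

raw = ['{"policy_id": "P1", "premium": "120.5"}', 'not json', '{"foo": 1}']
sink = []
loaded = load(transform(extract(raw)), sink)
```

Of the three raw lines, only the first survives both the parse step and the quality gate, so one record reaches the sink.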
Education, Technical Skills & Other Critical Requirements
Education: Bachelor's degree in computer science, information technology, or an equivalent educational qualification
Experience (in years): 10-12+ years of relevant experience
Technical Skills:
• SQL, Python/Scala
• NoSQL and distributed databases (HBase, Cosmos DB)
• ETL pipeline design and development; solutioning and estimation
• Big Data frameworks: Apache Spark, Hadoop, Hive
• Cloud platforms: Azure Data Factory, Event Hubs, Azure Functions, Synapse, Databricks
• Data warehouses, data marts, data lakes
• Medallion architecture
• Performance tuning, optimization, and data quality validation
• Real-time and batch data processing; streaming pipelines with Spark
• Communication, analytical, structured problem-solving, mentorship, and people leadership skills
• Storytelling skills; partner and stakeholder engagement experience
• People leadership: talent development & engagement experience
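Since the skills list names the medallion architecture, the following plain-Python sketch illustrates the bronze → silver → gold layering idea under stated assumptions: the records and fields (`id`, `amount`) are hypothetical, and in practice each layer would live in Delta tables on Databricks or similar, not in Python lists.

```python
# Bronze: raw, as-ingested records (may contain duplicates and nulls).
bronze = [
    {"id": 1, "amount": "100"},
    {"id": 1, "amount": "100"},   # duplicate
    {"id": 2, "amount": None},    # failed quality check
    {"id": 3, "amount": "250"},
]

# Silver: cleaned and de-duplicated, with types enforced.
seen = set()
silver = []
for r in bronze:
    if r["amount"] is None or r["id"] in seen:
        continue
    seen.add(r["id"])
    silver.append({"id": r["id"], "amount": float(r["amount"])})

# Gold: business-level aggregate, ready for reporting and BI.
gold = {
    "total_amount": sum(r["amount"] for r in silver),
    "record_count": len(silver),
}
```

The design point is that each layer adds guarantees: bronze preserves raw history, silver enforces quality and uniqueness, and gold serves curated, consumption-ready views.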
Other preferred skills:
• DevOps practices: Git, Azure DevOps, CI/CD pipelines
• Unix shell scripting, MongoDB, NiFi
• Exposure to Gen AI technology and tools
• Banking, Financial Services, and Insurance (BFSI) domain knowledge
About MetLife
Recognized on Fortune magazine's list of the "World's Most Admired Companies" and Fortune World's 25 Best Workplaces™, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies; providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East.
Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by our core values - Win Together, Do the Right Thing, Deliver Impact Over Activity, and Think Ahead - we're inspired to transform the next century in financial services. At MetLife, it's #AllTogetherPossible . Join us!
#BI-Hybrid