The role involves managing the Cloudera Data Platform (CDP), automating operations through scripting, troubleshooting Hadoop clusters, and ensuring system efficiency and security.
Description and Requirements
Position Summary
A Big Data (Hadoop) Administrator responsible for supporting the installation, configuration, and maintenance of Cloudera Data Platform (CDP) and Cloudera Flow Management (CFM) streaming clusters on RedHat Linux.
Strong expertise in DevOps practices, automation, and scripting (e.g., Ansible, Azure DevOps, Shell, Python) to streamline operations and improve efficiency is highly valued.
Job Responsibilities
- Assist in the installation, configuration, and maintenance of Cloudera Data Platform (CDP) and Cloudera Flow Management (CFM) streaming clusters on RedHat Linux.
- Perform routine monitoring, troubleshooting, and issue resolution to ensure the stability and performance of Hadoop clusters.
- Develop and maintain scripts (e.g., Python, Bash, Ansible) to automate operational tasks and improve system efficiency (a brief example sketch follows this list).
- Collaborate with cross-functional teams, including application development, infrastructure, and operations, to support business requirements and implement new features.
- Implement and follow best practices for cluster security, including user access management and integration with tools like Apache Ranger and Kerberos.
- Support backup, recovery, and disaster recovery processes to ensure data availability and business continuity.
- Conduct performance tuning and optimization of Hadoop clusters to enhance system efficiency and reduce latency.
- Analyze logs and use tools like Splunk to debug and resolve production issues.
- Document operational processes, maintenance procedures, and troubleshooting steps to ensure knowledge sharing and consistency.
- Stay updated on emerging technologies and contribute to the adoption of new tools and practices to improve cluster management.
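For illustration of the scripting responsibility above, a routine automation task might look like the minimal Python sketch below, which calls `hdfs dfsadmin -report` and flags high cluster usage. The 80% threshold, the alerting behavior, and the presence of the `hdfs` CLI on the PATH are assumptions made for the example, not specifics of the role.

```python
#!/usr/bin/env python3
"""Minimal sketch of an operational health-check script (illustrative only)."""
import re
import subprocess
import sys

USAGE_THRESHOLD_PCT = 80.0  # assumed alerting threshold for this example


def hdfs_dfs_used_percent() -> float:
    """Run `hdfs dfsadmin -report` and return the cluster-wide 'DFS Used%' value."""
    report = subprocess.run(
        ["hdfs", "dfsadmin", "-report"],
        capture_output=True, text=True, check=True,
    ).stdout
    match = re.search(r"DFS Used%:\s*([\d.]+)%", report)
    if not match:
        raise RuntimeError("Could not find 'DFS Used%' in dfsadmin report output")
    return float(match.group(1))


def main() -> int:
    used = hdfs_dfs_used_percent()
    if used >= USAGE_THRESHOLD_PCT:
        print(f"WARNING: DFS usage at {used:.1f}% (threshold {USAGE_THRESHOLD_PCT}%)")
        return 1  # non-zero exit so a scheduler or monitor can raise an alert
    print(f"OK: DFS usage at {used:.1f}%")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

A check like this would typically be scheduled (e.g., via cron or an Ansible-managed timer) and its exit code fed into the team's monitoring tooling.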
Education, Technical Skills & Other Critical Requirements
Education
Bachelor's degree in Computer Science, Information Systems, or another related field, with 7+ years of IT and infrastructure engineering work experience.
Experience (in Years)
7+ years of total IT experience and 4+ years of relevant experience in Big Data database administration.
Technical Skills
- Big Data Platform Management: Knowledge of managing and optimizing the Cloudera Data Platform, including components such as Apache Hadoop (YARN and HDFS), Apache HBase, Apache Solr, Apache Hive, Apache Kafka, Apache NiFi, Apache Ranger, and Apache Spark, as well as JanusGraph and IBM BigSQL.
- Automation and Scripting: Expertise in automation tools and scripting languages such as Ansible, Python, and Bash to streamline operational tasks and improve efficiency.
- DevOps Practices: Proficiency in DevOps tools and methodologies, including CI/CD pipelines, version control systems (e.g., Git), and infrastructure-as-code practices.
- Monitoring and Troubleshooting: Experience with monitoring and observability tools such as Splunk, Elastic Stack, or Prometheus to identify and resolve system issues (a brief log-triage sketch follows this list).
- Linux Administration: Solid knowledge of Linux operating systems, including system administration, troubleshooting, and performance tuning.
- Backup and Recovery: Familiarity with implementing and managing backup and recovery processes to ensure data availability and business continuity.
- Security and Access Management: Understanding of security best practices, including user access management and integration with tools like Kerberos.
- Agile Methodologies: Knowledge of Agile practices and frameworks, such as SAFe, with experience working in Agile environments.
- ITSM Tools: Familiarity with ITSM processes and tools like ServiceNow for incident and change management.
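As a second minimal sketch, the monitoring and troubleshooting skill above often starts with quick log triage before deeper analysis in Splunk. The Python example below tallies WARN/ERROR/FATAL lines in a NameNode log; the log path is an assumption chosen for the example rather than a platform standard.

```python
#!/usr/bin/env python3
"""Minimal sketch of log triage for a Hadoop service (illustrative only)."""
from collections import Counter
from pathlib import Path

# Assumed log location for this example; real paths vary by cluster layout.
LOG_FILE = Path("/var/log/hadoop-hdfs/hadoop-hdfs-namenode.log")


def count_log_levels(path: Path) -> Counter:
    """Tally log levels by scanning each line for WARN/ERROR/FATAL markers."""
    levels = Counter()
    with path.open(errors="replace") as handle:
        for line in handle:
            for level in ("WARN", "ERROR", "FATAL"):
                if f" {level} " in line:
                    levels[level] += 1
                    break
    return levels


if __name__ == "__main__":
    counts = count_log_levels(LOG_FILE)
    for level in ("FATAL", "ERROR", "WARN"):
        print(f"{level}: {counts.get(level, 0)}")
```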
Other Critical Requirements
- Excellent analytical and problem-solving skills.
- Ability to work in a 24x7 rotational shift to support Hadoop platforms and ensure high availability.
- Excellent written and oral communication skills, including the ability to clearly articulate technical and functional issues, conclusions, and recommendations to stakeholders.
- Prior experience working with both stateside and offshore stakeholders.
- Experience in creating and delivering business presentations.
- Demonstrated ability to work independently and in a team environment.
- Demonstrated willingness to learn and adopt new technologies and tools to improve operational efficiency.
About MetLife
Recognized on Fortune magazine's list of the "World's Most Admired Companies" and Fortune World's 25 Best Workplaces™, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies, providing insurance, annuities, employee benefits, and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East.
Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by our core values - Win Together, Do the Right Thing, Deliver Impact Over Activity, and Think Ahead - we're inspired to transform the next century in financial services. At MetLife, it's #AllTogetherPossible. Join us!
#BI-Hybrid
Top Skills
Ansible
Apache Hadoop
Apache HBase
Apache Hive
Apache Kafka
Apache NiFi
Apache Ranger
Apache Solr
Spark
Azure DevOps
Cloudera Data Platform
Cloudera Flow Management
IBM BigSQL
JanusGraph
Python
RedHat Linux
Shell
Splunk