
JPMorganChase

Data Engineer III

Posted 2 Hours Ago
Hybrid
Bengaluru, Bengaluru Urban, Karnataka
Senior level
As a Data Engineer III at JPMorgan Chase, you will design and implement data pipelines, optimize data systems, and ensure the reliability and security of data solutions for various banking functions.
Job Description
At JPMorgan Chase, we understand that customers seek exceptional value and a seamless experience from a trusted financial institution. That's why we launched Chase UK to transform digital banking with intuitive and enjoyable customer journeys. With a strong foundation of trust established by millions of customers in the US, we have been rapidly expanding our presence in the UK and soon across Europe. We have been building the bank of the future from the ground up, offering you the chance to join us and make a significant impact.
As a Data Engineer III at JPMorgan Chase within the International Consumer Bank, you will be part of a team that plays a crucial role in this initiative, dedicated to delivering an outstanding banking experience to our customers. You will work in a collaborative environment as part of a diverse, inclusive, and geographically distributed team. We are seeking solution-oriented individuals with a curious mindset, a passion for collaboration, a keen interest in new technology, and a focus on addressing the needs of customers in the financial sector. Our teams are organized around specific banking functions and products, providing opportunities to build data pipelines and reporting capabilities for functional areas such as finance and business management, treasury operations, financial crime prevention, regulatory reporting, and analytics. We collaborate with product teams such as card payments, electronic payments, lending, customer onboarding, core banking, and insurance to understand their product data models and deliver tailored data solutions that meet business needs.
Job responsibilities
  • Deliver end-to-end data pipeline solutions on cloud infrastructure leveraging the latest technologies and best industry practices.
  • Use domain modeling techniques to build best-in-class business products.
  • Structure software for easy understanding, testing, and evolution.
  • Build solutions that avoid single points of failure using scalable architectural patterns.
  • Develop secure code to protect our customers and ourselves from malicious actors.
  • Promptly investigate and fix issues, ensuring they do not resurface.
  • Ensure releases happen with zero downtime for end-users.
  • Optimize data writing and reading for our needs.
  • Monitor performance, identify and solve problems effectively, and ensure systems are reliable and easy to operate.
  • Continuously update technologies and patterns.
  • Support products through their entire lifecycle, including production and incident management.

Required qualifications, capabilities and skills
  • Formal training or certification in Public Cloud engineering concepts and 5+ years of applied experience.
  • Excellent programming skills, ideally in Python or another modern programming language.
  • Understanding of Agile methodologies, Application Resiliency, and Security.
  • Experience with Public Cloud services in Production (AWS or other).
  • Hands-on experience with big data technologies (e.g., Redshift, EMR).
  • Comprehensive understanding of modern data platforms, including data governance and observability.
  • Self-starter capable of delivering production-ready solutions with minimal supervision.
  • Solid theoretical fundamentals in a wide range of topics, which could include database internals, distributed systems, and design patterns.

Preferred qualifications, capabilities and skills
  • Strong experience with EMR.
  • Cloud Certifications including AWS Networking Specialty, AWS Developer Associate, AWS Solutions Architect Associate.

Top Skills

AWS
EMR
Python
Redshift

Similar Jobs at JPMorganChase

3 Days Ago
Hybrid
Bengaluru, Bengaluru Urban, Karnataka, IND
Senior level
Financial Services
The Software Engineer III will design and develop cloud-native applications on AWS, focusing on event-driven architecture, API development, database management, and data engineering with Databricks. Responsibilities include implementing best practices in DevOps and collaborating with teams on application lifecycle management.
Top Skills: AWS, CI/CD, Databricks, Django, Event-Driven Architecture, FastAPI, Postgres, Python, Spark
3 Days Ago
Hybrid
Bengaluru, Bengaluru Urban, Karnataka, IND
Mid level
Financial Services
As a Data Engineer II, you will design and maintain data solutions, optimize data workflows, implement ETL processes, and ensure data integrity in a collaborative team environment.
Top Skills: Amazon S3, Apache Airflow, Spark, AWS CodePipeline, AWS EMR, AWS Glue, AWS Lambda, AWS Redshift, CloudWatch, Datadog, Git, Jenkins, Python, SQL
2 Hours Ago
Hybrid
Bengaluru, Bengaluru Urban, Karnataka, IND
Mid level
Financial Services
The Control Manager - Analyst will oversee Reg W transactions, performing limit analysis, ensuring timely reporting, and managing stakeholder queries. They will support special projects and implement controls for balance reporting.
Top Skills: Microsoft Access, Excel, PowerPoint

