
Capco

Sr AWS Data Engineer

Posted 2 Hours Ago
Remote or Hybrid
Hiring Remotely in India
Senior level
Design, build, and operate scalable ETL/ELT pipelines using PySpark and AWS data services. Orchestrate workflows with Apache Airflow, implement AWS Glue jobs and Data Catalog, manage Lake Formation permissions, publish datasets for BI, and deliver QuickSight visualizations while ensuring data quality and performance.

Job Title: Sr Data Engineer (AWS)

 About Us

Capco, a Wipro company, is a global technology and management consulting firm. Capco was awarded Consultancy of the Year at the British Bank Awards and has been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence across 32 cities around the globe, we support 100+ clients in the banking, financial services, and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO? 

You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry: projects that will transform the financial services industry.

MAKE AN IMPACT

We bring innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK

Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT

With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION

We believe that diversity of people and perspective gives us a competitive advantage.

Job Description:

Role: Data Engineer with AWS

Location: Bangalore / Chennai / Gurgaon

Skills & Experience

We are looking for a Data Engineer with strong experience in building and operationalizing data pipelines, ETL workflows, and analytics platforms using PySpark, Apache Airflow, and AWS data services.

    Key Responsibilities
    • Build scalable ETL/ELT pipelines using PySpark on distributed processing frameworks

    • Orchestrate workflows using Apache Airflow (DAG design, scheduling, monitoring)

    • Develop data ingestion and transformation jobs using AWS Glue

    • Manage secure, compliant data access using AWS Lake Formation

    • Maintain and optimize AWS Glue Data Catalog for metadata, schema, and table management

    • Work with analytics teams to publish datasets for BI and dashboards

    • Build and support visualizations using Amazon QuickSight

    • Ensure data quality, performance, and reliability across all pipelines
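The last responsibility above, ensuring data quality before datasets are published, often takes the shape of a row-level quality gate between ingestion and publishing. A minimal sketch of that pattern in pure Python (no Spark dependency; the field names `account_id`, `amount`, and `txn_date` are hypothetical, not from the posting):

```python
# Illustrative row-level quality gate of the kind that sits between
# ingestion and publishing in an ETL pipeline. Field names are hypothetical.
from datetime import datetime

REQUIRED_FIELDS = ("account_id", "amount", "txn_date")

def is_valid(record: dict) -> bool:
    """Reject rows with missing keys, null values, or unparseable fields."""
    if any(record.get(f) is None for f in REQUIRED_FIELDS):
        return False
    try:
        datetime.strptime(record["txn_date"], "%Y-%m-%d")
        float(record["amount"])
    except (ValueError, TypeError):
        return False
    return True

def split_by_quality(records):
    """Route rows to a 'good' set and a quarantine set for later triage."""
    good, quarantined = [], []
    for rec in records:
        (good if is_valid(rec) else quarantined).append(rec)
    return good, quarantined
```

In a real PySpark job the same checks would run as DataFrame filters, with quarantined rows written to a separate S3 prefix for triage.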

    Required Skills
    • Strong hands-on experience with PySpark for large-scale data processing

    • Deep knowledge of Airflow DAGs, operators, sensors, and CI/CD integration

    • Expertise in AWS Glue (ETL jobs, crawlers, Glue Studio, Glue Job Bookmarks)

    • Experience with Lake Formation permissions, governance, and data lakes

    • Familiarity with Glue Data Catalog for metadata management

    • Ability to build dashboards in Amazon QuickSight

    • Understanding of data modeling, partitioning, and performance optimization
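Glue Job Bookmarks, listed above, let a job remember what it has already processed so each run handles only newly arrived data. A pure-Python sketch of that incremental pattern under simplifying assumptions (the in-memory `bookmark` dict stands in for the state Glue persists internally; the `event_ts` key is made up):

```python
# Sketch of bookmark-style incremental processing, in the spirit of AWS
# Glue Job Bookmarks: remember a high-water mark and process only newer
# rows. The in-memory dict stands in for Glue's persisted state.

def run_incremental(rows, bookmark, key="event_ts"):
    """Return rows newer than the stored high-water mark, then advance it."""
    last = bookmark.get("last_ts", "")
    new_rows = [r for r in rows if r[key] > last]
    if new_rows:
        bookmark["last_ts"] = max(r[key] for r in new_rows)
    return new_rows
```

Re-running the job over the same input then yields nothing new, which is exactly the idempotence bookmarks are meant to provide.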

    Nice to Have
    • Experience with S3, Athena, Redshift, or EMR

    • Knowledge of Python-based automation and testing

    • Exposure to cloud-native DevOps (IaC, Terraform/CloudFormation)
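Partitioning, which appears in both the required skills and the S3/Athena items above, usually means laying data out under Hive-style `key=value` prefixes so Athena, Glue, and Spark can prune partitions instead of scanning the whole table. A small sketch of computing such a prefix (the bucket and table names are invented for illustration):

```python
# Build a Hive-style S3 prefix (year=/month=/day=) of the kind that
# Athena, Glue, and Spark use for partition pruning. Bucket/table
# names are hypothetical.
from datetime import date

def partition_prefix(bucket: str, table: str, d: date) -> str:
    """Return the S3 prefix for one daily partition of a table."""
    return (f"s3://{bucket}/{table}/"
            f"year={d.year}/month={d.month:02d}/day={d.day:02d}/")
```

Writing each day's data under its own prefix means a query filtered on those columns reads only the matching prefixes.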


Top Skills

PySpark, Apache Airflow, AWS Glue, AWS Lake Formation, AWS Glue Data Catalog, Amazon QuickSight


