
Zensar Technologies

Data Engineer

Posted Yesterday
Remote
Hiring Remotely in India
Senior level
Design and build ETL pipelines using Apache Spark/PySpark, implement Apache Iceberg table operations, and validate the analytical schema. Requires strong data engineering experience and close collaboration with stakeholders.

**Key responsibilities:**

- Design and build the Apache Spark/PySpark ETL pipeline (Bronze → Silver → Gold medallion architecture)
- Implement Apache Iceberg table operations (MERGE, UPSERT, SCD Type 2 logic, incremental loads)
- Design and validate the analytical star schema (fact/dimension tables, conformed dimensions)
- Define and execute three-tier data quality rules, dead-letter handling, and validation logic
- Build business logic connectors, transformation helpers, and custom derivations
- Collaborate with stakeholders to clarify KPIs, query patterns, and analytical use cases
- Write comprehensive unit, integration, and end-to-end tests
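To illustrate the SCD Type 2 responsibility above: the core of a Type 2 load is closing out the current version of a changed dimension row and appending a new version. A minimal, framework-free sketch of that logic (plain Python; the field names `key`, `attrs`, `valid_from`, `valid_to`, `is_current` are illustrative, not from the posting — in production this would typically be a single Iceberg `MERGE INTO` executed via Spark SQL):

```python
from datetime import date

OPEN_END = date(9999, 12, 31)  # sentinel "valid_to" marking the current version


def scd2_merge(dim_rows, incoming, effective):
    """Apply SCD Type 2 semantics: close changed rows in place, append new versions.

    dim_rows: list of dicts with keys key, attrs, valid_from, valid_to, is_current
    incoming: list of dicts with keys key, attrs
    effective: date at which incoming changes take effect
    """
    current = {r["key"]: r for r in dim_rows if r["is_current"]}
    out = list(dim_rows)
    for rec in incoming:
        old = current.get(rec["key"])
        if old is not None and old["attrs"] == rec["attrs"]:
            continue  # no attribute change: keep the existing current row
        if old is not None:
            old["valid_to"] = effective  # close the superseded version
            old["is_current"] = False
        out.append({
            "key": rec["key"],
            "attrs": rec["attrs"],
            "valid_from": effective,
            "valid_to": OPEN_END,
            "is_current": True,
        })
    return out
```

In Iceberg/Spark terms, the "close the superseded version" branch maps to a `WHEN MATCHED ... THEN UPDATE` clause and the append maps to an insert of the new versions; the sketch only shows the row-level semantics, not the engine-specific statement.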
**Required skills:**

- 5+ years data engineering, backend development, or full-stack analytics platform work
- Expert-level PySpark, SQL, and cloud data warehousing (Snowflake/Databricks/similar)
- Apache Iceberg or strong willingness to learn advanced table formats
- Dimensional modelling fundamentals (Kimball star schema, slowly changing dimensions)
- Python for data transformation and microservices
- API design, error handling, and testing discipline
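The "three-tier data quality rules" and "dead-letter handling" named in the responsibilities can be pictured as a severity-tagged rule runner: each rule carries a tier, and the worst failed tier decides where a record goes. A plain-Python sketch (the tier names, rule predicates, and field names are hypothetical examples, not from the posting):

```python
from enum import Enum


class Tier(Enum):
    REJECT = "reject"          # hard failure: route to the dead-letter queue
    QUARANTINE = "quarantine"  # suspicious: hold for review
    WARN = "warn"              # soft issue: load the record, but flag it


# Each rule: (tier, predicate returning True when the record PASSES)
RULES = [
    (Tier.REJECT, lambda r: r.get("order_id") is not None),
    (Tier.QUARANTINE, lambda r: r.get("amount", 0) >= 0),
    (Tier.WARN, lambda r: r.get("currency") in {"INR", "USD"}),
]


def route(records):
    """Split records into clean, warned, quarantined, and dead-letter buckets."""
    clean, warned, quarantined, dead = [], [], [], []
    for rec in records:
        failed = [tier for tier, ok in RULES if not ok(rec)]
        if Tier.REJECT in failed:
            dead.append(rec)          # worst tier wins
        elif Tier.QUARANTINE in failed:
            quarantined.append(rec)
        elif Tier.WARN in failed:
            warned.append(rec)
        else:
            clean.append(rec)
    return clean, warned, quarantined, dead
```

In a Spark pipeline the same idea is usually expressed as derived flag columns plus a filtered write per bucket, with the dead-letter bucket landing in its own table for replay.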


**About Us**

At Zensar, we’re “experience-led everything”. We are committed to conceptualizing, designing, engineering, marketing, and managing digital solutions and experiences for over 130 leading enterprises. We are a company driven by a bold purpose: Together, we shape experiences for better futures. Whether for our clients, our people, or the world around us, this belief powers everything we do. At the heart of our culture is ONE with Client, a set of four core values that reflect who we are and how we work: One Zensar, Nurturing, Empowering, and Client Focus.
Part of the $4.8 billion RPG Group, we’re a community of 10,000+ innovators across 30+ global locations, including Milpitas, Seattle, Princeton, Cape Town, London, Zurich, Singapore, and Mexico City. Explore Life at Zensar and join us to Grow. Own. Achieve. Learn. to be the best version of yourself.
We believe the best work happens when individuality is celebrated, growth is encouraged, and well-being is prioritized. We are an equal employment opportunity (EEO) and affirmative action employer, committed to creating an inclusive workplace. All qualified applicants will be considered without regard to race, creed, color, ancestry, religion, sex, national origin, citizenship, age, sexual orientation, gender identity, disability, marital status, family medical leave status, or protected veteran status.

**Zensar Technologies Kolkata, West Bengal, IND Office**

12th floor, Srijan Corporate Park, Street Number 25, GP Block, Sector V, Bidhannagar, Kolkata, West Bengal, India, 700091

