
Equifax Inc.

Data Engineer

Sorry, this job was removed at 08:12 a.m. (IST) on Monday, Feb 10, 2025
In-Office
Trivandrum, Thiruvananthapuram, Kerala

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

What You Will Do:

  • Design and Build Data Pipelines: Develop and maintain robust data pipelines using Apache Beam in Java and Apache Spark to process large datasets efficiently across Google Cloud Platform (GCP).

  • Data Integration: Collaborate with data scientists, analysts, and other stakeholders to design data integration solutions for various business needs, ensuring data quality and reliability.

  • Cloud Infrastructure Management: Utilize GCP services like DataProc, BigQuery, and Cloud Storage to manage cloud-based data processing and storage solutions.

  • Performance Tuning: Identify and optimize performance bottlenecks in data processing workflows, ensuring high efficiency and low latency in pipeline execution.

  • Monitoring and Maintenance: Establish monitoring, logging, and alerting mechanisms for data pipelines to ensure operational excellence.

  • Documentation: Create technical documentation outlining data flows, architecture design, and maintenance procedures for future reference.

  • Stay Current with Technologies: Keep abreast of the latest developments in data engineering tools and technologies, particularly within the GCP ecosystem, to continuously improve processes and solutions.
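The pipeline responsibilities above follow a common shape: read raw records, parse and validate them, then aggregate per key. As a minimal illustration only (plain stdlib Python rather than the Apache Beam Java SDK the role calls for, with a hypothetical two-field schema), the stages might look like:

```python
# Illustrative sketch of the pipeline stages described above
# (read -> parse -> filter -> aggregate). A production version would express
# these as Apache Beam PTransforms executed on GCP (e.g., DataProc/Dataflow).

from collections import defaultdict

def parse(line: str) -> dict:
    """Parse a raw CSV-style record into a structured row (hypothetical schema)."""
    user_id, amount = line.split(",")
    return {"user_id": user_id, "amount": float(amount)}

def run_pipeline(raw_lines):
    # Data quality gate: drop malformed records. Real pipelines typically
    # route these to a dead-letter sink instead of silently discarding them.
    rows = []
    for line in raw_lines:
        try:
            rows.append(parse(line))
        except ValueError:
            continue
    # Aggregate: per-key sum, analogous to Beam's GroupByKey + Combine.
    totals = defaultdict(float)
    for row in rows:
        totals[row["user_id"]] += row["amount"]
    return dict(totals)

print(run_pipeline(["u1,10.0", "u2,5.5", "bad-record", "u1,2.5"]))
# -> {'u1': 12.5, 'u2': 5.5}
```

In Beam, each stage would become a named transform in a `Pipeline`, which is what makes the workflow portable across runners and observable for the monitoring and alerting duties listed above.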

What Experience You Need:

  • Experience with GCP: 1+ years of experience working with cloud platform services, particularly in data processing and analytics.

  • Apache Beam in Java: Demonstrated expertise in building ETL processes using Apache Beam in Java.

  • Spark and DataProc: Hands-on experience with Apache Spark, including optimizing jobs in DataProc for performance and efficiency.

  • SQL Proficiency: Strong skills in Hive SQL and familiarity with data warehousing concepts, along with experience in writing complex queries for data manipulation and analysis.

  • Data Loading and Transformation: Experience in managing and transforming large data sets, and knowledge of data storage solutions and frameworks.

  • CI/CD Practices: Familiarity with version control tools (like Git) and CI/CD methodologies for code deployment.

  • Bachelor’s degree in Computer Science, Engineering, or a related technical field.
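The "complex queries" the SQL bullet refers to often involve window functions, e.g. a per-partition running total such as `SUM(amount) OVER (PARTITION BY user_id ORDER BY ts)` in Hive SQL. Purely as an illustration of that logic (stdlib Python, hypothetical `(user_id, ts, amount)` rows, not actual Hive):

```python
# Illustrative: the logic of a Hive window function,
# SUM(amount) OVER (PARTITION BY user_id ORDER BY ts), in plain Python.
from itertools import groupby
from operator import itemgetter

def running_totals(rows):
    """rows: iterable of (user_id, ts, amount) tuples.
    Returns (user_id, ts, running_total) per row, partitioned by user_id
    and ordered by ts within each partition."""
    out = []
    ordered = sorted(rows, key=itemgetter(0, 1))  # partition key, then order key
    for user_id, group in groupby(ordered, key=itemgetter(0)):
        total = 0.0
        for _, ts, amount in group:
            total += amount  # accumulate within the partition
            out.append((user_id, ts, total))
    return out

print(running_totals([("u1", 1, 10.0), ("u2", 1, 3.0), ("u1", 2, 5.0)]))
# -> [('u1', 1, 10.0), ('u1', 2, 15.0), ('u2', 1, 3.0)]
```

In a warehouse, the engine performs the partition-and-order step distributively; the sketch just makes the semantics concrete.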

What Could Set You Apart:

  • Strong Programming Skills: Proficiency in additional programming languages (Python, Scala, etc.) and frameworks that support data engineering tasks.

  • Big Data Technologies: Experience with other big data frameworks or tools (e.g., Hadoop, Kafka, Airflow) that complement data engineering efforts.

  • Cloud Certifications: Relevant GCP certifications (e.g., Google Cloud Professional Data Engineer) that demonstrate your commitment and expertise in the field.

  • Architectural Knowledge: Understanding of data architecture principles, including data lakes, data warehousing, and the concepts of batch and stream processing.

  • Active Participation in the Community: Contributions to open-source projects, speaking engagements at conferences, or involvement in data engineering forums can enhance your profile.

  • Business Acumen: Ability to translate technical requirements into actionable business solutions and insights.
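On the batch-versus-stream distinction mentioned under architectural knowledge: batch processing computes one result over a bounded dataset, while stream processing consumes unbounded input incrementally. A toy stdlib sketch (the fixed micro-batch size is an arbitrary illustration, not how Beam or Spark window streams in practice):

```python
# Illustrative contrast: batch vs. stream processing.

def batch_sum(records):
    # Batch: the dataset is bounded, so one pass yields one final answer.
    return sum(records)

def stream_sums(records, window=3):
    """Stream: emit a partial sum per micro-batch of `window` records,
    without ever needing the whole (potentially unbounded) input at once."""
    buf = []
    for r in records:
        buf.append(r)
        if len(buf) == window:
            yield sum(buf)
            buf = []
    if buf:  # flush the final partial window
        yield sum(buf)

print(batch_sum([1, 2, 3, 4, 5]))          # -> 15
print(list(stream_sums([1, 2, 3, 4, 5])))  # -> [6, 9]
```

Real streaming systems add event-time windowing, watermarks, and late-data handling on top of this basic incremental shape.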

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks.

Are you ready to power your possible?  Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Primary Location:

IND-Trivandrum-Equifax Analytics-PEC

Function:

Tech Dev and Client Services

Schedule:

Full time
