Data Engineer

Posted 2 Days Ago
Trivandrum, Thiruvananthapuram, Kerala
Junior
Fintech • Consulting
The Role
As a Data Engineer, you will develop and maintain data pipelines using Apache Beam in Java and Apache Spark on Google Cloud Platform. You will ensure data quality and reliability through well-designed data integration, optimize the performance of data workflows, and manage cloud services such as DataProc, BigQuery, and Cloud Storage, while also creating technical documentation and keeping current with industry technologies.

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

What You Will Do:

  • Design and Build Data Pipelines: Develop and maintain robust data pipelines using Apache Beam in Java and Apache Spark to process large datasets efficiently across Google Cloud Platform (GCP).

  • Data Integration: Collaborate with data scientists, analysts, and other stakeholders to design data integration solutions for various business needs, ensuring data quality and reliability.

  • Cloud Infrastructure Management: Utilize GCP services like DataProc, BigQuery, and Cloud Storage to manage cloud-based data processing and storage solutions.

  • Performance Tuning: Identify and optimize performance bottlenecks in data processing workflows, ensuring high efficiency and low latency in pipeline execution.

  • Monitoring and Maintenance: Establish monitoring, logging, and alerting mechanisms for data pipelines to ensure operational excellence.

  • Documentation: Create technical documentation outlining data flows, architecture design, and maintenance procedures for future reference.

  • Stay Current with Technologies: Keep abreast of the latest developments in data engineering tools and technologies, particularly within the GCP ecosystem, to continuously improve processes and solutions.

What Experience You Need:

  • Experience with GCP: 1+ years working with Google Cloud Platform services, particularly for data processing and analytics.

  • Apache Beam in Java: Demonstrated expertise in building ETL processes using Apache Beam in Java.

  • Spark and DataProc: Hands-on experience with Apache Spark, including optimizing jobs in DataProc for performance and efficiency.

  • SQL Proficiency: Strong skills in Hive SQL and familiarity with data warehousing concepts, along with experience in writing complex queries for data manipulation and analysis.

  • Data Loading and Transformation: Experience in managing and transforming large data sets, and knowledge of data storage solutions and frameworks.

  • CI/CD Practices: Familiarity with version control tools (like Git) and CI/CD methodologies for code deployment.

  • Bachelor’s degree in Computer Science, Engineering, or a related technical field.

What Could Set You Apart:

  • Strong Programming Skills: Proficiency in additional programming languages (Python, Scala, etc.) and frameworks that support data engineering tasks.

  • Big Data Technologies: Experience with other big data frameworks or tools (e.g., Hadoop, Kafka, Airflow) that complement data engineering efforts.

  • Cloud Certifications: Relevant GCP certifications (e.g., Google Cloud Professional Data Engineer) that demonstrate your commitment and expertise in the field.

  • Architectural Knowledge: Understanding of data architecture principles, including data lakes, data warehousing, and the concepts of batch and stream processing.

  • Active Participation in the Community: Contributions to open-source projects, speaking engagements at conferences, or involvement in data engineering forums can enhance your profile.

  • Business Acumen: Ability to translate technical requirements into actionable business solutions and insights.

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks.

Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Primary Location:

IND-Trivandrum-Equifax Analytics-PEC

Function:

Tech Dev and Client Services

Schedule:

Full time

Top Skills

Java
Python
Scala
SQL
The Company
HQ: Atlanta, GA
16,742 Employees
On-site Workplace

What We Do

At Equifax (NYSE: EFX), we believe knowledge drives progress. As a global data, analytics, and technology company, we play an essential role in the global economy by helping financial institutions, companies, employers, and government agencies make critical decisions with greater confidence. Our unique blend of differentiated data, analytics, and cloud technology drives insights to power decisions to move people forward.

Headquartered in Atlanta and supported by nearly 15,000 employees worldwide, Equifax operates or has investments in 24 countries in North America, Central and South America, Europe, and the Asia Pacific region.

For more information, visit Equifax.com.
