
TransUnion

Developer, Data Development

Hybrid
Hyderabad, Telangana


TransUnion's Job Applicant Privacy Notice

What We'll Bring:

We are seeking a talented and experienced Senior Data Engineer/Big Data Developer to join our growing team. In this role, you will be responsible for designing, developing, and maintaining our Big Data infrastructure and pipelines. You will work with large datasets to build scalable solutions for data processing, analytics, and machine learning initiatives. You will collaborate with data scientists, analysts, and other engineers to deliver high-quality, reliable, and performant data solutions that drive business decisions.

What You'll Bring:

Responsibilities:

  • Design, develop, and maintain robust and scalable Big Data pipelines using technologies like Hadoop, Spark, and Scala (a minimal Spark/Scala sketch follows this list).
  • Develop and maintain data ingestion, processing, and storage solutions on cloud platforms such as AWS and GCP.
  • Write high-quality, well-documented, and testable code in Java, Scala, Python, and SQL.
  • Optimize data processing performance and ensure data quality.
  • Collaborate with data scientists and analysts to understand their requirements and build data solutions that meet their needs.
  • Work with large datasets and perform data analysis to identify trends and patterns.
  • Implement data governance and security best practices.
  • Contribute to the evolution of our Big Data architecture and technology stack.
  • Troubleshoot and resolve issues in the data pipelines and infrastructure.
  • Stay up-to-date with the latest Big Data technologies and trends.
  • Participate in code reviews and contribute to the team's knowledge sharing.
  • Work in a fast-paced, agile environment.
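
A minimal sketch of the kind of batch pipeline described in the first bullet above, written in Scala on Apache Spark. The storage paths, column names (event_id, event_ts, event_type), and the daily aggregation are illustrative placeholders rather than details of TransUnion's actual pipelines; the sketch only shows the typical read, cleanse, aggregate, and write shape of such a job.

  import org.apache.spark.sql.{SparkSession, functions => F}

  object DailyEventsPipeline {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder()
        .appName("daily-events-pipeline")
        .getOrCreate()

      // Hypothetical raw JSON events landed on cloud object storage (S3/GCS).
      val raw = spark.read.json("s3a://example-bucket/raw/events/dt=2025-06-01/")

      // Basic cleansing plus a simple data-quality gate: drop records missing a key.
      val cleaned = raw
        .filter(F.col("event_id").isNotNull)
        .withColumn("event_ts", F.to_timestamp(F.col("event_ts")))

      // Aggregate to a daily metric that analysts and data scientists can query downstream.
      val daily = cleaned
        .groupBy(F.to_date(F.col("event_ts")).as("event_date"), F.col("event_type"))
        .agg(F.count(F.lit(1)).as("event_count"))

      // Write partitioned Parquet so the curated layer is cheap to scan.
      daily.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3a://example-bucket/curated/daily_event_counts/")

      spark.stop()
    }
  }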

Impact You'll Make:

Qualifications:

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 4+ years of experience in software development with a focus on Big Data technologies.
  • Strong proficiency in Java, Scala, Python, and SQL.
  • Experience with Hadoop ecosystem components such as HDFS, MapReduce, YARN, and Hive.
  • Extensive experience with Apache Spark for large-scale data processing.
  • Experience building and deploying data pipelines on cloud platforms (AWS and/or GCP).
  • Solid understanding of data modeling, data warehousing, and ETL concepts.
  • Experience with data analytics and visualization tools is a plus.
  • Strong understanding of distributed systems principles and architecture.
  • Proficiency in Unix/Linux environments.
  • Excellent problem-solving and analytical skills.
  • Strong communication and collaboration skills.
  • Experience with data governance and security best practices.
  • Experience with Agile development methodologies.

Bonus Points:

  • Experience with data streaming technologies like Kafka or Kinesis (see the streaming sketch after this list).
  • Experience with NoSQL databases like Cassandra or MongoDB.
  • Experience with machine learning frameworks like TensorFlow or PyTorch.
  • AWS or GCP certifications.
  • Contributions to open-source projects.
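
For the streaming bonus point above, a hedged sketch of a Kafka ingest job using Spark Structured Streaming in Scala is shown below. The broker address, topic name, and the storage and checkpoint paths are placeholders, and the job assumes the spark-sql-kafka connector is available on the classpath; it illustrates the pattern rather than prescribing an implementation.

  import org.apache.spark.sql.SparkSession

  object EventStreamIngest {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder()
        .appName("event-stream-ingest")
        .getOrCreate()

      // Subscribe to a placeholder Kafka topic (requires the spark-sql-kafka connector).
      val stream = spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker-1:9092")
        .option("subscribe", "events")
        .option("startingOffsets", "latest")
        .load()

      // Kafka delivers key and value as binary; cast the value to a string for downstream parsing.
      val values = stream.selectExpr("CAST(value AS STRING) AS json", "timestamp")

      // Land micro-batches as Parquet with checkpointing so the job can recover after a failure.
      val query = values.writeStream
        .format("parquet")
        .option("path", "s3a://example-bucket/raw/events_stream/")
        .option("checkpointLocation", "s3a://example-bucket/checkpoints/events_stream/")
        .start()

      query.awaitTermination()
    }
  }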

This is a hybrid position and involves regular performance of job responsibilities virtually as well as in-person at an assigned TU office location for a minimum of two days a week.

TransUnion Job Title

Developer, Data Analysis and Consulting

