
NewRocket

Data Engineer

Posted 2 Days Ago
Remote
Hiring Remotely in India
Junior

Why Us

NewRocket is the AI-first Elite ServiceNow Partner that activates real value on the Now Platform. As a trusted advisor to enterprise leaders, we combine industry expertise, human-centered design, and enterprise-grade AI to help organizations navigate change and scale with confidence. With two decades of experience guiding clients to realize the full potential of the ServiceNow AI Platform, we are one of the largest pure-play ServiceNow partners, uniquely focused on enabling enterprises to adopt AI they trust: AI that delivers lasting business value.

We #GoBeyondWorkflows to create new kinds of experiences for our customers.

Come join our Crew!

The Role

We are looking for a Data Engineer to join our AI Product Team. You will be responsible for working with large-scale datasets, developing robust data pipelines, and ensuring that data is cleaned, transformed, and made available for AI, Machine Learning and Data Science applications.

We are #GoingBeyond. Come join our crew!

What You Will Be Doing

  • Build and maintain scalable and reliable data pipelines to support various data science and machine learning initiatives.
  • Ingest, clean, and process large volumes of structured and unstructured data from multiple sources.
  • Implement data validation, cleansing, and transformation logic to ensure data quality and consistency (a minimal sketch follows this list).
  • Work with cloud platforms including AWS (SageMaker, Lambda, S3), Azure, and Google Cloud (Vertex AI).
  • Collaborate with the team to understand data needs and deliver high-quality datasets.
  • Work with big data technologies such as Apache Spark and Snowflake for large-scale data processing and analytics.
  • Design and optimize ETL pipelines for data quality management, transformation, and validation.
  • Utilize SQL, MySQL, PostgreSQL, and MongoDB for database management and query optimization.
  • May perform additional duties as assigned.
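
To make the kind of work above concrete, here is a minimal, illustrative PySpark sketch of one ingest-validate-transform-deliver step. The paths, column names, and quality rules are hypothetical placeholders, not a description of NewRocket's actual pipelines.

    # Illustrative only: a small ingest -> validate -> transform -> deliver step in PySpark.
    # Paths, column names, and quality rules are hypothetical placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_pipeline_sketch").getOrCreate()

    # Ingest: read raw files exported from a hypothetical source system.
    raw = spark.read.csv("s3://example-bucket/raw/orders/", header=True, inferSchema=True)

    # Cleanse: drop exact duplicates and rows missing required keys.
    cleaned = raw.dropDuplicates().na.drop(subset=["order_id", "customer_id"])

    # Validate: keep rows that pass simple quality rules; route the rest to quarantine.
    valid = cleaned.filter((F.col("amount") >= 0) & F.col("order_date").isNotNull())
    rejected = cleaned.subtract(valid)

    # Transform: normalize types and derive a partition column for downstream consumers.
    curated = (
        valid.withColumn("order_date", F.to_date("order_date"))
             .withColumn("order_month", F.date_format("order_date", "yyyy-MM"))
    )

    # Deliver: write curated and quarantined data for data science / ML consumers.
    curated.write.mode("overwrite").partitionBy("order_month").parquet("s3://example-bucket/curated/orders/")
    rejected.write.mode("overwrite").parquet("s3://example-bucket/quarantine/orders/")

In practice, a step like this would run on a schedule inside an orchestrator and feed the datasets consumed by the AI and data science teams.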

What You Bring Along

  • 2-3 years of experience as a Data Engineer.
  • Strong SQL skills and understanding of relational databases.
  • Familiarity with data processing frameworks like Apache Spark, Pandas, or PySpark.
  • Knowledge of data cleaning techniques and dealing with missing or inconsistent data.
  • Proficiency in Python or another scripting language used in data workflows.
  • Basic understanding of data modelling concepts and data warehousing.
  • Strong problem-solving skills and attention to detail.

Nice to Have

  • Experience with cloud platforms like AWS, GCP, or Azure.
  • Experience with unstructured data.
  • Exposure to tools like Airflow, DBT, or Kafka (see the sketch after this list).
  • Understanding of version control systems (e.g., Git).
  • Familiarity with ML workflows and data preparation for ML models.
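
As a brief illustration of the orchestration exposure mentioned above, here is a minimal Airflow DAG sketch, assuming Airflow 2.4+ and purely hypothetical task logic, DAG id, and schedule.

    # Illustrative only: a tiny Airflow 2.4+ DAG with two dependent placeholder tasks.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_orders():
        # Placeholder: pull raw data from a source system.
        print("extracting raw orders")

    def validate_orders():
        # Placeholder: run data-quality checks before loading.
        print("validating extracted orders")

    with DAG(
        dag_id="orders_quality_sketch",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # Airflow 2.4+; earlier 2.x versions use schedule_interval
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
        validate = PythonOperator(task_id="validate_orders", python_callable=validate_orders)

        extract >> validate  # validation runs only after extraction succeeds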

Education: 

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, AI/ML, or related field.

We Take Care of Our People 

NewRocket is committed to a diverse and inclusive workplace. We value and celebrate diversity, believing that every employee matters and should be respected and heard. We are proud to be an equal opportunity workplace and affirmative action employer, committed to providing employment opportunity regardless of sex, race, creed, color, gender, religion, marital status, domestic partner status, age, national origin or ancestry, physical or mental disability, medical condition, sexual orientation, pregnancy, citizenship, military, or veteran status. For individuals with disabilities who would like to request an accommodation, please contact [email protected]


Top Skills

Spark
AWS
Azure
GCP
MongoDB
MySQL
Pandas
Postgres
PySpark
Python
Snowflake
SQL
