
Decision Foundry

Senior Data / Platform Engineer (Embedded - Data & Analytics Engineering)

In-Office or Remote
Hiring Remotely in Bengaluru, Bengaluru Urban, Karnataka
Senior level
The Senior Data/Platform Engineer will enhance a data platform, build pipelines, and improve scalability and reliability of data workflows using Python and other technologies.

Welcome to Decision Foundry - Data Analytics Division!

We are proud to introduce ourselves as a certified "Great Place to Work," where we prioritize creating an exceptional work environment. As a global company, we embrace a diverse culture, fostering inclusivity across all levels.

Originating from a well-established, 19-year-old web analytics company, we remain dedicated to our employee-centric approach. By valuing our team members, we aim to enhance engagement and drive collective success.

We are passionate about harnessing the power of data analytics to transform decision-making processes. Our mission is to empower data-driven decisions that contribute to a better world. In our workplace, you will enjoy the freedom to experiment and explore innovative ideas, leading to outstanding client service and value creation.

We win as an organization through our core tenets. They include:

  • One Team. One Theme.
  • We sign it. We deliver it.
  • Be Accountable and Expect Accountability.
  • Raise Your Hand or Be Willing to Extend It.

About the Role

We’re looking for a Senior Data / Platform Engineer to embed directly into our Data & Analytics Engineering team and help accelerate delivery across a highly customized, API-driven data platform. This role is focused on augmenting and hardening the existing platform, building and expanding pipelines, and developing reusable infrastructure and library components to support scalable ingestion and transformation workflows.

This is a hands-on engineering role best suited for someone who thrives in software-engineering-style data work: building modular Python libraries, deploying pipeline infrastructure, and improving reliability, observability, and test coverage across a production data ecosystem.

Location: Remote – EST Hours
Type: Contract
Team: Data Platform / Analytics Engineering

Key Responsibilities

What You’ll Work On

You will integrate into our team to accelerate well-scoped execution work, including:

  • Data pipeline and ingestion expansion across multiple sources and delivery patterns
  • Platform hardening and refactoring initiatives to improve scalability and maintainability
  • Observability, testing, and reliability improvements across orchestration and batch workloads
  • Deployment and modularization of pipeline components to support repeatable onboarding of net-new data capabilities
  • Supporting dbt model and mart development and maintaining analytics transformations in Snowflake (strong dbt experience is a big plus)


Core Responsibilities

  • Build and maintain serverless, containerized batch pipelines orchestrated via Prefect (similar to Airflow); see the illustrative sketch after this list
  • Expand ingestion and connectivity patterns across:
    • APIs
    • S3-based sources
    • SFTP infrastructure
    • Email scraping
    • Web scraping
  • Develop and enhance internal Python libraries used to standardize ingestion, transformation, and pipeline deployment patterns
  • Implement and improve data observability practices including monitoring, alerting, and failure diagnostics
  • Contribute to infrastructure-as-code using Terraform to support repeatable deployments and environment consistency
  • Support and improve the data warehouse ecosystem:
    • Snowflake as the primary data warehouse
    • dbt on Snowflake for modeling and analytics transformations
  • Collaborate closely with internal engineers through PR reviews, sprint workflows, and team standards
  • Operate within existing repos, processes, and CI/CD workflows to increase throughput while maintaining quality
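
For candidates less familiar with Prefect, here is a minimal, purely illustrative sketch of the kind of batch ingestion flow described above. The endpoint, table name, and load step are hypothetical assumptions, not details of our actual platform.

# Illustrative sketch only: a minimal Prefect batch ingestion flow.
# The endpoint, table name, and loading logic below are hypothetical.
import requests
from prefect import flow, task


@task(retries=3, retry_delay_seconds=60)
def extract_from_api(endpoint: str) -> list[dict]:
    """Pull records from an upstream API, retrying transient failures."""
    response = requests.get(endpoint, timeout=30)
    response.raise_for_status()
    return response.json()


@task(retries=2, retry_delay_seconds=30)
def load_to_warehouse(records: list[dict], table: str) -> int:
    """Placeholder load step; a real pipeline might stage to S3 and COPY into Snowflake."""
    print(f"Would load {len(records)} records into {table}")
    return len(records)


@flow(name="example-ingestion-flow", log_prints=True)
def ingest(endpoint: str = "https://api.example.com/v1/records",
           table: str = "RAW.EXAMPLE_RECORDS") -> None:
    """Orchestrate extract and load as retryable tasks."""
    records = extract_from_api(endpoint)
    load_to_warehouse(records, table)


if __name__ == "__main__":
    ingest()

In this role, flows of this shape are packaged through our internal Python libraries and deployed as containerized, serverless batch workloads rather than written ad hoc.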


Technical Environment

  • Python (expert level required)
  • Prefect (workflow orchestration)
  • AWS (cloud-native compute, containerized/serverless batch workloads)
  • Terraform (IaC)
  • Snowflake (data warehouse)
  • dbt (transformations and marts)
  • Highly integrated and customized platform with heavy API-based data flows


What Success Looks Like

  • Net-new ingestion capabilities are delivered faster without sacrificing reliability
  • Pipelines are more modular, reusable, and deployable through standardized patterns
  • Failures are easier to detect and debug through improved observability and testing
  • The platform becomes easier to maintain as codebases are refactored and hardened
  • Internal senior engineers retain architectural ownership while execution throughput increases

Requirements

Required Qualifications

  • 6+ years of experience in Data Engineering, Platform Engineering, or Software Engineering with strong data systems exposure
  • Expert-level Python skills with a track record of building production-grade libraries and services
  • Strong experience building and operating batch pipeline infrastructure in cloud environments (AWS preferred)
  • Experience with workflow orchestration tools such as Prefect, Airflow, Dagster, etc.
  • Strong understanding of data pipeline design: modularity, idempotency, retries, deployment patterns, and maintainability (see the idempotency sketch after this list)
  • Experience implementing data observability, monitoring, logging, alerting, and testing frameworks
  • Hands-on experience with Terraform or similar infrastructure-as-code tooling
  • Comfortable working in an embedded model: collaborating inside existing repos, PR workflows, and delivery processes
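
As a concrete, purely illustrative example of the idempotency and retry concerns above, the sketch below derives a deterministic batch key so that a retried or re-run load replaces its own rows instead of duplicating them. The table name, key scheme, and DB-API-style connection are assumptions for illustration only.

# Illustrative sketch only: an idempotent batch load keyed on a deterministic batch id.
# The table name, paramstyle, and connection object are hypothetical (DB-API style).
import hashlib
import json
from datetime import date


def batch_key(source: str, run_date: date) -> str:
    """Same source + run date always yields the same key, so reruns target the same batch."""
    return hashlib.sha256(f"{source}:{run_date.isoformat()}".encode()).hexdigest()[:16]


def load_batch(conn, records: list[dict], source: str, run_date: date) -> None:
    """Delete-then-insert within one transaction, so retries never leave duplicate rows."""
    key = batch_key(source, run_date)
    with conn.cursor() as cur:
        # Clear any rows left by a previous, possibly partial, attempt at this batch.
        cur.execute("DELETE FROM raw_events WHERE batch_key = %s", (key,))
        cur.executemany(
            "INSERT INTO raw_events (batch_key, source, payload) VALUES (%s, %s, %s)",
            [(key, source, json.dumps(r)) for r in records],
        )
    conn.commit()

Delete-then-insert keyed on the batch (or an equivalent MERGE in Snowflake) keeps retries safe without the orchestrator needing to know whether a prior attempt partially succeeded.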


Preferred / Nice-to-Have

  • Strong experience with dbt (models, marts, testing, documentation)
  • Experience with Snowflake performance optimization and warehouse best practices
  • Experience with web scraping and/or email scraping pipelines
  • Familiarity with containerized workloads and serverless compute patterns
  • Strong instincts for platform refactoring, system hardening, and reliability engineering


Working Model / Team Approach

Our internal team retains ownership of architecture, modeling standards, and technical direction. This role operates as an embedded senior engineer within our workflows to accelerate delivery, increase throughput, and protect senior internal capacity—without compromising quality.



Equal Opportunity Statement

We are committed to building a diverse and inclusive team. We welcome applications from candidates of all backgrounds and are an equal opportunity employer. We provide equal employment opportunities to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability, gender identity, sexual orientation, marital status, or veteran status.

Top Skills

AWS
dbt
Prefect
Python
Snowflake
Terraform
