
DeleteMe

Data & Analytics Engineer

Posted 2 Days Ago
In-Office or Remote
Hiring Remotely in Bengaluru, Bengaluru Urban, Karnataka
Senior level

About DeleteMe: 

DeleteMe is the leader in proactive privacy protection. We help individuals, families, businesses, and security teams reduce their human attack surface by continuously monitoring and removing exposed personal data (PII) from the open web. This is the very data threat actors use to launch social engineering, phishing, Gen-AI deepfake, and doxxing campaigns, as well as physical threats and identity fraud.

Operating as a fast-growing, global SaaS company, DeleteMe serves both consumers and enterprises. DeleteMe has completed over 100 million opt-out removals, helping customers reduce risks associated with identity theft, spam, doxxing, and other cybersecurity threats. We deliver detailed privacy reports, continuous monitoring, and expert support to ensure ongoing protection.

DeleteMe acts as a scalable, managed defense layer for your most vulnerable attack vector: your people. That’s why 30% of the Fortune 100, top tech firms, major banks, federal agencies, and U.S. states rely on DeleteMe to protect their workforce.

DeleteMe is led by a passionate and experienced team and driven by a powerful mission to empower consumers with privacy.

Job Summary:

This position is a key partner across the organization, sitting within the Data Warehouse team to bridge the gap between raw data engineering and business strategy. The Data & Analytics Engineer is responsible for designing, building, and optimizing scalable data models in Snowflake using dbt, ensuring data integrity and high performance. This role balances technical warehouse architecture with the ability to translate complex business requirements into actionable data products.

Job Responsibilities

  • Data Modeling & Development: Architect and maintain robust, modular data models in Snowflake using dbt, following industry-standard modeling methodologies (e.g., Kimball).
  • Warehouse Optimization: Write and tune advanced SQL to ensure optimal query performance, cost-efficiency, and resource management within the Snowflake environment.
  • Data Observability & Quality: Implement and manage automated testing, monitoring, and alerting frameworks to ensure data accuracy, freshness, and lineage.
  • Stakeholder Collaboration: Partner with business units to define KPIs, capture requirements, and translate business logic into technical data specifications.
  • End-to-End Delivery: Own the full data lifecycle, from ingestion through production-grade data marts to BI visualizations and dashboards.
  • Engineering Excellence: Apply software engineering best practices to data development, including version control (Git), CI/CD, and detailed technical documentation.
  • Process Improvement: Continuously refactor legacy code and data structures to improve the maintainability and scalability of the analytics stack.
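To give a concrete flavor of the dbt modeling work described above, here is a minimal staging-model sketch; the model name, source, and columns are all hypothetical and not taken from DeleteMe's actual project:

```sql
-- models/staging/stg_subscriptions.sql (hypothetical model and source names)
with source as (

    -- dbt's source() macro resolves to the configured raw table
    select * from {{ source('raw', 'subscriptions') }}

),

renamed as (

    select
        id                        as subscription_id,
        customer_id,
        plan_name,
        cast(started_at as date)  as start_date,
        status
    from source

)

select * from renamed
```

In a typical dbt project, a `schema.yml` file alongside this model would declare `not_null` and `unique` tests on `subscription_id`, covering the automated-testing responsibility listed above.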

Job Requirements

  • Mastery of complex SQL, including window functions, CTEs, and performance tuning for large-scale datasets.
  • Proven experience building production-grade dbt projects, including macros, seeds, and testing suites.

  • Strong understanding of Snowflake-specific features such as clustering, virtual warehouses, and zero-copy cloning.

  • Deep knowledge of dimensional modeling, fact/dimension design, and data warehousing principles.

  • Availability during US Eastern Time (ET) working hours.

  • Ability to understand organizational drivers and communicate technical details effectively to non-technical stakeholders.

  • Strong problem-solving skills with the ability to identify root causes in data discrepancies or performance bottlenecks.
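As a quick, runnable illustration of the CTE and window-function skills listed above (not part of the posting), the following sketch uses Python's built-in sqlite3 with made-up data; SQLite 3.25+ supports both constructs:

```python
import sqlite3

# Hypothetical sample data for demonstration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    create table orders (customer_id int, order_date text, amount real);
    insert into orders values
        (1, '2024-01-05', 100.0),
        (1, '2024-02-10', 150.0),
        (2, '2024-01-20',  80.0);
""")

query = """
with monthly as (
    -- CTE: aggregate orders to one row per customer per month
    select customer_id,
           strftime('%Y-%m', order_date) as month,
           sum(amount) as total
    from orders
    group by customer_id, month
)
select customer_id,
       month,
       total,
       -- window function: running total per customer across months
       sum(total) over (
           partition by customer_id order by month
       ) as running_total
from monthly
order by customer_id, month
"""
rows = conn.execute(query).fetchall()
for row in rows:
    print(row)
# (1, '2024-01', 100.0, 100.0)
# (1, '2024-02', 150.0, 250.0)
# (2, '2024-01', 80.0, 80.0)
```

The same CTE-then-window pattern is idiomatic in Snowflake SQL and inside dbt models.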

Qualifications

  • Bachelor’s degree in Computer Science, Data Science, Statistics, Business, or a related field.

  • 5+ years of experience in Analytics Engineering, Data Engineering, or a highly technical BI role.

  • Proficiency in Snowflake, dbt (with strong SQL), and data architecture.

  • Proven track record of delivering end-to-end data solutions in a cloud warehouse environment.

  • Strong data storytelling and presentation skills.

  • Experience supporting business functions such as Finance, Operations, Sales, and Marketing, preferably in SaaS.

Nice to Have

  • Experience with Python for data scripting or automation.

  • Familiarity with data observability tools (e.g., Monte Carlo, Elementary).

  • Experience in a high-growth startup environment.

  • Cybersecurity experience.

What We Offer

  • Comprehensive health benefits – Group Medical Coverage (GMC), personal accident insurance, and group term life insurance.
  • Flexible work schedule.
  • Provident Fund (PF).
  • Gratuity.
  • Paid time off – 18 days of earned leave annually to rest and recharge.
  • Sick leave – 10 days per year to support employee health and well-being.
  • Company-paid holidays – 10 national and festival holidays annually.
  • Parental leave benefits – 26 weeks of maternity leave, 2 weeks of paternity leave, and adoption leave as per company policy.
  • Childcare expense reimbursement – supporting working parents.
  • Learning and development support – complimentary access to Udemy for continuous learning and professional growth.
  • Annual performance bonus.
  • Employee Stock Options (ESOPs).
  • Quarterly team lunches and dinners.
  • Birthday time off – celebrate your special day with paid leave.

Top Skills

CI/CD
dbt
Git
Snowflake
SQL

