BETSOL is a cloud-first digital transformation and data management company offering products and IT services to enterprises in over 40 countries. The BETSOL team holds several engineering patents, has earned multiple industry awards, and maintains a net promoter score twice the industry average.
BETSOL’s open source backup and recovery product line, Zmanda (Zmanda.com), delivers up to 50% savings in total cost of ownership (TCO) and best-in-class performance.
BETSOL Global IT Services (BETSOL.com) builds and supports end-to-end enterprise solutions, reducing time-to-market for its customers.
BETSOL offices are set against the vibrant backdrops of Broomfield, Colorado and Bangalore, India.
We take pride in being an employee-centric organization, offering comprehensive health insurance, competitive salaries, 401K, volunteer programs, and scholarship opportunities. Office amenities include a fitness center, cafe, and recreational facilities.
Learn more at betsol.com
Job Description
The Senior Data Analytics Engineer will lead the design, development, and evolution of our enterprise analytics platform. This role blends hands-on technical ownership with team leadership, offering the opportunity to shape both our data architecture and our analytics engineering practice.
You will own critical components of our Data Vault 2.0 warehouse on Snowflake, mentor a growing team of Analytics Engineers, and serve as the connective layer between raw operational data and business-critical insights.
Our data environment goes well beyond a typical dbt project. We operate a mature, multi-tenant analytics warehouse integrating 13+ source systems (including Five9, inContact, Serenova, Salesforce, Shiftboard, and NetSuite) across 15+ client engagements. You’ll work across the full analytics stack—from staging and raw vault to business vault, prime stage, and client-specific information marts—delivering clean, tested, and well-documented models that support finance, operations, and compliance reporting.
Qualifications
Required Qualifications
- Bachelor’s degree in a quantitative or technical field, or equivalent professional experience
- 5+ years of production SQL experience, including:
  - Window functions
  - CTEs
  - Performance tuning
  - Scalable modeling patterns
- Strong Snowflake expertise, including:
  - Warehouse sizing and query profiling
  - Role‑Based Access Control (RBAC)
  - Cost monitoring
  - Incremental and MERGE strategies
- Hands‑on experience with dbt, including:
  - Modular model design
  - ref and source usage
  - Generic and custom tests
  - YAML schema configuration
  - CI workflows on pull requests
- Solid understanding of Data Vault 2.0 or similar methodologies (e.g., Kimball, Anchor Modeling)
- Proficiency with Git and GitHub workflows (branching, PRs, code reviews)
- Proven ability to lead or mentor junior and mid‑level engineers
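To make the SQL expectations above concrete, here is a minimal, hypothetical illustration of a CTE combined with a window function. The role targets Snowflake, but the same constructs are shown here with Python's bundled sqlite3 so the sketch is self-contained; the table and data are invented for illustration.

```python
import sqlite3

# Illustrative only: a tiny in-memory table standing in for warehouse data.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE calls (agent TEXT, call_date TEXT, handle_secs INTEGER);
INSERT INTO calls VALUES
  ('ana', '2024-01-01', 300),
  ('ana', '2024-01-02', 240),
  ('bob', '2024-01-01', 600);
""")

# The CTE computes a per-agent ranking with a window function;
# the outer query keeps only each agent's longest call.
rows = conn.execute("""
WITH scored AS (
    SELECT agent,
           call_date,
           handle_secs,
           ROW_NUMBER() OVER (
               PARTITION BY agent ORDER BY handle_secs DESC
           ) AS rn
    FROM calls
)
SELECT agent, call_date, handle_secs
FROM scored
WHERE rn = 1
ORDER BY agent
""").fetchall()
print(rows)  # one row per agent: the longest call
```

The same shape in Snowflake would typically also consider `QUALIFY`, which filters on window results without the intermediate CTE.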
Preferred Qualifications
- Experience building Streamlit or other Python‑based data applications
- Familiarity with contact center platforms (Five9, inContact, Serenova, or similar)
- Experience using dbt Cloud for orchestration and environment management
- Exposure to data integration tools such as Fivetran or Airbyte
- Comfort working with semi‑structured data (VARIANT columns, LATERAL FLATTEN)
What You’ll Do
Data Architecture & Development
- Design and maintain scalable data models using dbt and Data Vault 2.0 methodologies
- Build hubs, links, and satellites (SCD Type 2), along with business vault–derived objects
- Develop performant information mart views for internal and external stakeholders
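The satellite work above follows a standard Data Vault pattern: a new satellite row is written only when the hash of the tracked attributes changes. This is a minimal, hypothetical sketch (the `hashdiff` and `load` helpers and the in-memory `satellite` list are invented for illustration; a real implementation would run in dbt/Snowflake).

```python
import hashlib
from datetime import date

# Hash the descriptive attributes in a canonical order so identical
# payloads always produce identical digests.
def hashdiff(attrs: dict) -> str:
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.md5(canonical.encode()).hexdigest()

satellite = []  # rows: (hub_key, load_date, hashdiff, attrs)

# Insert a new version only when the hashdiff differs from the
# latest row for that hub key (SCD Type 2 change detection).
def load(hub_key, attrs, load_date):
    current = [r for r in satellite if r[0] == hub_key]
    latest = max(current, key=lambda r: r[1], default=None)
    hd = hashdiff(attrs)
    if latest is None or latest[2] != hd:
        satellite.append((hub_key, load_date, hd, attrs))

load("C1", {"name": "Acme", "tier": "gold"}, date(2024, 1, 1))
load("C1", {"name": "Acme", "tier": "gold"}, date(2024, 1, 2))    # unchanged: skipped
load("C1", {"name": "Acme", "tier": "silver"}, date(2024, 1, 3))  # new version
print(len(satellite))  # 2
```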
Leadership & Mentorship
- Lead and mentor Analytics Engineers through code reviews and pair programming
- Establish and enforce standards for SQL style, testing, and documentation
- Raise the overall quality and consistency of analytics engineering deliverables
Performance & Optimization
- Optimize Snowflake workloads through incremental strategies, dynamic tables, and warehouse sizing
- Apply cost‑conscious query design and performance tuning best practices
Integration & Enablement
- Onboard new data sources and clients with repeatable ingestion and staging patterns
- Flatten and model complex semi‑structured (JSON) data effectively
Collaboration & Delivery
- Partner with Finance, Operations, Compliance, and Client Services to translate business questions into reliable data models
- Participate in a 4‑week agile cadence with CI on pull requests and automated dbt Cloud deployments