The Staff Data Engineer will design scalable data architectures, lead cross-functional collaborations, mentor engineers, and modernize data systems, while ensuring data solution quality and compliance.
Clearwater's mission is to be the world's most trusted and comprehensive technology platform that simplifies the entire investment lifecycle. We empower our clients to run efficient investment accounting operations, provide an auditable SaaS platform for integrated investment accounting, analytics, and reporting, foster a diverse and collaborative culture of innovation and excellence, and contribute to our local communities to make a meaningful impact on society.

As a Staff Data Engineer at Clearwater, you will play a crucial role in accomplishing our mission. You will be a leading member of the Prism team, which is responsible for managing our external BI reporting platform and the data aggregation pipeline backing it. This team's product serves as a key data hub for multiple crucial products that rely on Prism Data.

In this role, you will:
- Lead the design and execution of scalable data architecture strategy, ensuring alignment with business objectives, operational maturity, and long-term maintainability of systems (e.g., warehouses, lakes, pipelines).
- Collaborate cross-functionally with data scientists, analysts, and business stakeholders to translate requirements into reliable, high-quality data solutions that drive decision-making.
- Champion end-to-end ownership of critical data initiatives, driving multi-team, multi-sprint projects from conception to delivery while balancing technical risk and timelines.
- Design and optimize robust data pipelines, ensuring efficient ingestion, transformation, storage, and retrieval while adhering to security, privacy, and compliance standards.
- Mentor engineers of all levels, fostering a culture of knowledge-sharing and operational excellence across teams; act as a trusted technical advisor in ambiguous or complex scenarios.
- Identify and evangelize innovative patterns (e.g., automation, monitoring, testing) to improve data quality, system reliability, and developer velocity organization-wide.
- Spearhead major modernization efforts, including redesigns of legacy systems and adoption of cutting-edge tools to meet evolving analytical and operational needs.
- Embed operational rigor into data products through logging, observability, and documentation, empowering less-experienced teams to debug and extend systems independently.
- Continuously build your skills through regular code reviews, training, mentoring, and access to free courses on Udemy for Business.
- Snowflake as our enterprise data warehouse, with Airflow for workflow orchestration.
- DBT, Prophecy, and Python for developing ELT processes.
- Amazon Web Services as our public cloud provider, with configuration controlled by Terraform and Helm.
- OpenSearch, Dynatrace, and Snowflake-native tooling for logging and monitoring.
- Git repositories hosted on Gitlab for code management.
- Atlassian (Jira, Confluence), Office365 (including Microsoft Teams), and Zoom for communication.
- Quality hardware to support development and communication on Windows or Mac platforms.
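The stack above pairs Airflow orchestration with DBT transformations running against Snowflake. As a rough illustration only (not Clearwater's actual pipeline), the sketch below shows a minimal Airflow DAG that runs and then tests a hypothetical dbt project; the DAG id, project paths, and schedule are all assumptions.

```python
# Illustrative sketch only: a minimal Airflow DAG that orchestrates a dbt project
# against a Snowflake target. All names and paths (daily_prism_elt, /opt/dbt_prism)
# are hypothetical, not taken from the actual Prism pipeline.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_prism_elt",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # assumed daily refresh cadence
    catchup=False,
) as dag:
    # Build the dbt models in the Snowflake warehouse defined in profiles.yml
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt_prism --profiles-dir /opt/dbt_prism",
    )

    # Validate the refreshed models before downstream consumers read them
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt_prism --profiles-dir /opt/dbt_prism",
    )

    dbt_run >> dbt_test
```

The run-then-test ordering is just one common pattern for catching data-quality issues before downstream products that depend on Prism data pick up a bad refresh.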
- 9+ years of enterprise data engineering experience (data warehousing, ETL development, data modelling, scalable Enterprise Data Warehouse (EDW) solutions, etc.).
- 5+ years of experience leveraging Snowflake and its various capabilities.
- Examples of leveraging dimensional modeling/star schema design concepts in enterprise implementations.
- Experience with both DBT and Python development.
- Snowflake performance tuning expertise.
- Exceptional leadership and mentorship skills.
- Enthusiasm for data engineering work in a software-as-a-service company.
- Driven by client satisfaction.
- Strong communication and teamwork skills.
- Ability to manage your own time and deliver expected results on schedule.
- Commitment to continuous learning and improvement.
- Exceptional problem-solving and analytical skills.
- Experience running data workloads on a public cloud provider.
Top Skills
Airflow
Amazon Web Services
Atlassian
Dbt
Dynatrace
Git
Gitlab
Helm
Office365
OpenSearch
Prophecy
Python
Snowflake
Terraform
Zoom