At Satori Analytics, we aim to change the world one algorithm at a time by bringing clarity to global brands through Data & AI. From cloud-based ecosystems for fintech to predictive models for airlines, our cutting-edge solutions cover the entire data lifecycle—from ingestion to AI applications.
As a fast-growing scale-up, our team of 100+ tech specialists—including Data Engineers, Data Scientists, and more—delivers innovative analytics solutions across industries like FMCG, retail, manufacturing, and FSI. Join us as we lead the data revolution in South-Eastern Europe and beyond!
What Your Day Might Look Like:
- Independently design and maintain scalable ETL pipelines within a collaborative project team, delivering clean, analytics- and AI-ready data.
- Work with SQL, Python, and PySpark, and tools like MS Fabric, Azure Data Factory, Databricks, and Snowflake to develop, optimize, and automate data processes.
- Design robust ETL processes, scalable data models, and optimized design patterns for analytics and BI workloads.
- Ensure data quality and data governance through automated checks, monitoring, source-to-target mapping, data lineage, and continuous improvement.
- Collaborate with cross-functional teams to understand data requirements and business semantics, and deliver high-quality solutions.
- Troubleshoot, optimize, and support production pipelines to keep data flowing smoothly.
- Use Git and Agile practices to work effectively in collaborative, iterative projects.
Requirements
Your Superpowers 🚀
- BSc or MSc in Computer Science, Engineering, or similar.
- Strong SQL, Python and/or PySpark skills.
- Solid professional experience developing ETL pipelines (e.g. Azure Data Factory, Databricks, etc.) and modern data warehouses (e.g. MS Fabric or Databricks Delta Lakehouse, Snowflake, etc.).
- Solid professional experience working with relational and NoSQL databases and systems (MS SQL Server, PostgreSQL, MongoDB, etc.).
- Strong understanding of data modelling and design patterns (star-schema, data vault, SCD).
- Basic knowledge of cloud platforms (Azure, AWS, or GCP).
- Basic knowledge of visualization tools (Power BI, Tableau, Looker, etc.).
- Understanding of Agile practices and version control systems (GitHub, Azure DevOps).
- Strong problem-solving skills, eagerness to learn, a collaborative spirit, and customer-facing skills.
- Fluent in English, both written and spoken.
Bonus Points For:
- 3+ years’ experience in hands-on data engineering.
- Understanding of AI concepts and architectures.
- Experience with enterprise platforms like Salesforce, SAP or Entersoft.
- Familiarity with modern ETL and orchestration tools such as Airflow, dbt, Matillion, Fivetran, etc.
- Advanced knowledge of Power BI, Tableau, Qlik, etc.
- Exposure to Java or Scala, and OO/functional programming concepts.
Benefits
Perks on Perks:
- Competitive salary and hybrid work model – come hang out in our Athens office or work remotely from anywhere in the European Economic Area (EU, Switzerland, etc.) or the UK (up to 6 weeks per year).
- Training budget to level up your skills with the top tech partners in the market (Microsoft, AWS, Salesforce, Databricks, etc.) – whether it’s certifications or courses, we’ve got you covered.
- Private insurance, top-tier tech gear, and the chance to work with a stellar crew.
Ready to create some data magic with us? Hit that apply button and let’s get started.