Snowflake - Senior Data Engineer

Join our innovative, fast-growing data team at the forefront of cloud data architecture. We build scalable, secure data platforms on Snowflake and the modern data stack. If you're passionate about creating high-performance data infrastructure and solving complex data challenges in a cloud-native environment, this is the opportunity for you.

As a Senior Data Engineer specializing in Snowflake and the modern data stack, you will architect and implement enterprise-grade, cloud-native data warehousing solutions. This is a hands-on engineering role with significant architectural influence: you'll work extensively with dbt, Fivetran, and other modern data tools to create efficient, maintainable, and scalable pipelines using an ELT-first approach.

What we ask

Technical Expertise:
Snowflake Mastery: Deep expertise in Snowflake including architecture, access controls, security, and performance tuning
dbt Proficiency: Advanced proficiency in dbt including macros, packages, testing frameworks, and best practices
Data Ingestion: Strong experience with data ingestion tools like Fivetran, Stitch, Singer, or similar ELT platforms
SQL & Data Modeling: Proficiency in SQL and data modeling techniques (dimensional modeling, data vault, star schema)
Cloud Platforms: Experience with cloud platforms (preferably AWS, GCP, or Azure) and their native data services
Orchestration: Knowledge of data orchestration tools like Airflow, Prefect, or Dagster
Programming: Python skills for data processing, scripting, and automation (a short sketch follows this list)
DevOps: Understanding of CI/CD practices, Git-based workflows, and version control systems
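
To give a sense of the day-to-day Python-and-SQL work, here is a minimal sketch of a table freshness check run against Snowflake. It assumes the snowflake-connector-python package; the warehouse, database, and table names are placeholders, not details of this role.

```python
# Minimal sketch: a table freshness check run from Python against Snowflake.
# Requires snowflake-connector-python; credentials come from the environment,
# and the warehouse/database/table names below are placeholders.
import os

import snowflake.connector


def rows_loaded_last_day(table: str) -> int:
    """Count rows loaded in the last 24 hours as a basic freshness signal."""
    with snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",  # placeholder warehouse
        database="ANALYTICS",      # placeholder database
        schema="MARTS",
    ) as conn, conn.cursor() as cur:
        # `table` is a trusted identifier here; bind data values, not names.
        cur.execute(
            f"SELECT COUNT(*) FROM {table} "
            "WHERE loaded_at >= DATEADD(day, -1, CURRENT_TIMESTAMP())"
        )
        return cur.fetchone()[0]


if __name__ == "__main__":
    print(rows_loaded_last_day("FCT_ORDERS"))  # placeholder table
```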

Data Management:
Understanding of data governance frameworks and data quality practices
Experience with data validation and testing methodologies
Familiarity with data visualization tools (Tableau, Looker, Power BI)

Preferred Qualifications & Certifications:
Bachelor's degree in Computer Science, Engineering, or related field
4–7 years of hands-on experience in data engineering with strong focus on cloud data warehousing
Snowflake SnowPro Core and Advanced/Architect certifications
dbt Analytics Engineering or dbt Practitioner certification

Advanced Technologies:
Experience with reverse ETL tools like Census, Hightouch, or Polytomic
Knowledge of streaming data platforms (Kafka, Kinesis, Pub/Sub)
Experience with data catalog tools (Alation, Collibra, DataHub)
Understanding of DataOps, analytics engineering, and data mesh architecture principles
Experience with containerization (Docker, Kubernetes) and modern deployment practices

Technology Stack:
Core Technologies:
Data Warehouses: Snowflake (primary), BigQuery
ELT & Transformation: dbt, Dataform
Data Ingestion: Fivetran, Stitch, Singer-based connectors
Orchestration: Airflow, Prefect, Dagster
Quality & Testing: Great Expectations, dbt tests
Observability: Monte Carlo, Datafold
Data Catalog: Alation, Collibra, DataHub
Visualization: Looker, Tableau, Power BI

Cloud & DevOps:
Cloud Platforms: AWS, GCP, Azure
Version Control: Git
Programming: Python, SQL
Containerization: Docker, Kubernetes

What you do

Data Architecture & Engineering:
Design and implement enterprise-scale, robust data warehouse solutions using Snowflake
Architect scalable ELT pipelines using Fivetran, Stitch, Singer, and custom connectors (a custom-connector sketch follows this list)
Build and operate automated data ingestion processes from source systems into Snowflake
Develop and maintain modular, test-driven data transformation pipelines using dbt (data build tool)
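
A minimal sketch of what a custom connector can look like, assuming the requests and snowflake-connector-python packages; the API endpoint, table, and connection names are all hypothetical. Extraction and loading stay deliberately thin, leaving transformation to dbt in keeping with the ELT-first approach above.

```python
# Minimal custom-connector sketch for an ELT pipeline: pull records from an
# HTTP API and land them raw in Snowflake, leaving transformation to dbt.
# The endpoint, table, and connection settings are hypothetical.
import json
import os

import requests
import snowflake.connector

API_URL = "https://api.example.com/v1/orders"  # hypothetical source API


def extract() -> list[dict]:
    """Fetch one page of records; real connectors add paging and retries."""
    resp = requests.get(API_URL, timeout=30)
    resp.raise_for_status()
    return resp.json()["data"]


def load(records: list[dict]) -> None:
    """Land each record as a raw VARIANT row for dbt to model downstream."""
    with snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH", database="RAW", schema="ORDERS",
    ) as conn, conn.cursor() as cur:
        cur.execute(
            "CREATE TABLE IF NOT EXISTS RAW_ORDERS ("
            "payload VARIANT, "
            "loaded_at TIMESTAMP_LTZ DEFAULT CURRENT_TIMESTAMP())"
        )
        for record in records:
            # PARSE_JSON must appear in a SELECT, not a VALUES clause.
            cur.execute(
                "INSERT INTO RAW_ORDERS (payload) SELECT PARSE_JSON(%s)",
                (json.dumps(record),),
            )


if __name__ == "__main__":
    load(extract())
```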

Data Modeling & Optimization:
Implement data modeling best practices including dimensional modeling, data vault, and star schema designs
Optimize Snowflake warehouse performance, including query tuning, cost management, and resource sizing (a cost-management sketch follows this list)
Apply advanced Snowflake features for architecture, security, and performance tuning
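
Cost management often comes down to small, scriptable guardrails. Here is a minimal sketch, assuming snowflake-connector-python and hypothetical warehouse names, that tightens AUTO_SUSPEND so idle warehouses stop consuming credits quickly:

```python
# Minimal cost-management sketch: lower AUTO_SUSPEND so idle warehouses
# pause quickly instead of burning credits. Requires a role with ALTER
# WAREHOUSE privilege; the warehouse names are hypothetical.
import os

import snowflake.connector

WAREHOUSES = ["ANALYTICS_WH", "LOAD_WH"]  # hypothetical names


def tighten_auto_suspend(seconds: int = 60) -> None:
    with snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role="SYSADMIN",  # assumes this role may alter warehouses
    ) as conn, conn.cursor() as cur:
        for wh in WAREHOUSES:
            # AUTO_SUSPEND is measured in seconds of idle time.
            cur.execute(f"ALTER WAREHOUSE {wh} SET AUTO_SUSPEND = {seconds}")


if __name__ == "__main__":
    tighten_auto_suspend()
```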

Quality & Governance:
Develop and maintain comprehensive data quality tests and monitoring using dbt tests and Great Expectations
Implement data governance, lineage, and security policies within Snowflake
Create and maintain comprehensive data documentation and lineage tracking
Build data validation and testing frameworks for proactive data monitoring (a hand-rolled example follows this list)
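
dbt tests and Great Expectations provide checks like these out of the box; the hand-rolled sketch below simply shows the underlying idea. The table, column, and connection names are hypothetical.

```python
# Minimal validation sketch: the not-null and uniqueness checks that dbt
# tests or Great Expectations give you out of the box, hand-rolled to show
# the idea. Table, column, and connection names are hypothetical.
import os

import snowflake.connector

CHECKS = {
    "null_order_ids":
        "SELECT COUNT(*) FROM FCT_ORDERS WHERE order_id IS NULL",
    "duplicate_order_ids":
        "SELECT COUNT(*) FROM ("
        " SELECT order_id FROM FCT_ORDERS"
        " GROUP BY order_id HAVING COUNT(*) > 1"
        ") AS dups",
}


def run_checks() -> None:
    """Run each check and fail loudly if any offending rows are found."""
    with snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH", database="ANALYTICS", schema="MARTS",
    ) as conn, conn.cursor() as cur:
        failures = []
        for name, sql in CHECKS.items():
            cur.execute(sql)
            bad_rows = cur.fetchone()[0]
            if bad_rows:
                failures.append(f"{name}: {bad_rows} offending rows")
        if failures:
            raise RuntimeError("Data quality checks failed: " + "; ".join(failures))


if __name__ == "__main__":
    run_checks()
```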

DevOps & Automation:
Build and maintain CI/CD pipelines for dbt projects and data warehouse deployments
Develop custom Python scripts for data processing, manipulation, and automation tasks
Implement data observability and monitoring solutions using tools like Monte Carlo and Datafold
Work with orchestration tools for pipeline scheduling and management (a DAG sketch follows this list)
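
A minimal orchestration sketch: an Airflow DAG that runs `dbt run` and then `dbt test` once a day via BashOperator. The dag_id, project path, and target are hypothetical, and real deployments may use dedicated dbt operators instead.

```python
# Minimal orchestration sketch: an Airflow DAG that builds dbt models and
# then runs their tests once a day. The dag_id, project path, and target
# are hypothetical; `schedule` requires Airflow 2.4+ (use schedule_interval
# on older versions).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/project && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/project && dbt test --target prod",
    )
    # Only test once the models have built successfully.
    dbt_run >> dbt_test
```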

Collaboration & Analytics:
Collaborate with analytics teams and data scientists on shared datasets and tooling
Enable self-service analytics for business users
Work closely with stakeholders to understand data requirements and deliver solutions

What we offer

You’ll join an international network of data professionals within our organisation. We support continuous development through our dedicated Academy. If you're looking to push the boundaries of innovation and creativity in a culture that values freedom and responsibility, we encourage you to apply.

At Valtech, we’re here to engineer experiences that work and reach every single person. To do this, we are proactive about creating workplaces that work for every person at Valtech. Our goal is to create an equitable workplace which gives people from all backgrounds the support they need to thrive, grow and meet their goals (whatever they may be). You can find out more about what we’re doing to create a Valtech for everyone here.

Please do not worry if you do not meet all of the criteria or if you have some gaps in your CV. We'd love to hear from you and see if you're the next member of the Valtech team!

Contact us

Let's reinvent the future