WHAT YOU'LL DO:

  • Design, develop, and maintain data pipelines for ETL/ELT/streaming workflows.
  • Collaborate with backend and platform engineers to integrate data solutions into cloud-native applications.
  • Optimize data storage, retrieval, and processing for performance and cost efficiency.
  • Operate cloud data infrastructure, primarily on Google Cloud Platform (BigQuery, Cloud Storage, Pub/Sub).
  • Work with analytics and product teams to define data models for reporting and business intelligence.
  • Implement data security, privacy, and governance best practices.
  • Monitor, troubleshoot, and enhance data pipeline reliability and performance.
  • Maintain clear documentation for data pipelines, transformations, and data sources.
  • Stay up to date with best practices and emerging technologies in data engineering.

WHAT WE'RE LOOKING FOR:

  • Strong proficiency in written and verbal English.
  • Experience working with remote teams across North America and Latin America and collaborating smoothly across time zones.
  • 3+ years of experience in data engineering, with a focus on building scalable pipelines and cloud-native architectures.
  • Strong SQL skills for data modeling, transformation, and optimization.
  • Proficiency in Python for data processing and automation.
  • Experience with cloud data platforms, particularly Google Cloud Platform (GCP).
  • Hands-on experience with Google BigQuery, Cloud Storage, and Pub/Sub.
  • Familiarity with ETL/ELT tools such as dbt, Apache Beam, or Google Dataflow.
  • Exposure to data pipeline orchestration tools like Dagster, Apache Airflow, or Google Cloud Workflows.
  • Knowledge of data privacy, security, and compliance practices.
  • Strong analytical and problem-solving skills.
  • Excellent communication and teamwork abilities.

NICE TO HAVES:

  • Experience with real-time data streaming solutions (e.g., Kafka, Pub/Sub, or Kinesis).
  • Familiarity with machine learning workflows and MLOps best practices.
  • Experience with BI and data visualization tools (e.g., Looker, Tableau, or Looker Studio, formerly Google Data Studio).
  • Knowledge of Terraform for Infrastructure as Code (IaC) in data environments.
  • Familiarity with data integrations involving Contentful, Algolia, Segment, and Talon.One.