Facundo Iannello

Analytics Engineer

Professional Summary

Analytics Engineer with 7+ years designing and delivering end-to-end data solutions, from architecture through implementation to BI and self-service analytics. I partner with business stakeholders to translate requirements into scalable, well-documented data platforms that drive decisions.

What I'm best at:

  • Stakeholder-driven analytics: Partner with business teams to define KPIs and enable self-service reporting, raising quality through engineering best practices.
  • Dashboards & decision layers: Deliver trusted reporting in Tableau, Power BI, Looker, or Looker Studio—adapting to each organization's existing tooling.
  • Data quality & reliability: Implement validation, freshness SLAs, anomaly detection, and monitoring to ensure consistent, trustworthy data.
  • Data ingestion & integration: Connect SaaS apps, databases, and other source systems via managed connectors (Fivetran, Datastream) or custom pipelines (Airflow).
  • Modeling & transformation (dbt): Build modular SQL transformations (stg → int → marts) with tests, dbt docs, and CI validation.
  • Orchestration & CI/CD: Schedule and monitor pipelines; enforce quality gates via GitHub Actions CI for dbt and data tests.
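The data-quality work above includes freshness SLAs; the pattern can be sketched in a few lines of Python. This is an illustrative sketch only: the `orders` table, `loaded_at` column, and 6-hour SLA are hypothetical, not from a specific engagement.

```python
import sqlite3
from datetime import datetime, timedelta, timezone

# Illustrative freshness-SLA check: fail if the newest row in a table
# is older than the allowed staleness window. Table name, column name,
# and the 6-hour SLA are hypothetical example values.
FRESHNESS_SLA = timedelta(hours=6)

def check_freshness(conn, table, ts_column, sla=FRESHNESS_SLA, now=None):
    """Return (is_fresh, lag) for the given table's newest timestamp."""
    now = now or datetime.now(timezone.utc)
    (max_ts,) = conn.execute(
        f"SELECT MAX({ts_column}) FROM {table}"
    ).fetchone()
    if max_ts is None:                      # empty table: treat as stale
        return False, None
    lag = now - datetime.fromisoformat(max_ts)
    return lag <= sla, lag

# Tiny in-memory demo: a 1-hour-old load passes a 6-hour SLA
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, loaded_at TEXT)")
recent = (datetime.now(timezone.utc) - timedelta(hours=1)).isoformat()
conn.execute("INSERT INTO orders VALUES (1, ?)", (recent,))
fresh, lag = check_freshness(conn, "orders", "loaded_at")
print(fresh)  # True
```

In production this kind of check typically runs as a scheduled task (e.g. an Airflow sensor or a dbt source freshness test) rather than standalone code.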

Technical Skills

Languages: Python, SQL (multiple dialects), Jinja

Transformation: dbt (modeling, testing, docs)

Orchestration: Airflow, Composer, GitHub Actions

Cloud: GCP (expert), AWS (experienced), Azure (familiar)

Warehouses: BigQuery, Redshift, Synapse

BI & Visualization: Looker, Tableau, Power BI, Looker Studio

Infrastructure: Terraform, Docker, CI/CD pipelines

Recent Professional Experience

UJET

Embedded Analytics Engineer

September 2024 - Present • San Francisco Bay Area, United States - Remote

  • Operate the analytics layer for a B2B SaaS product serving thousands of users and hundreds of thousands of queries/day, ensuring reliability SLAs.
  • Design and maintain embedded analytics products at scale in Looker; manage LookML models, content governance, and release cycles.
  • Improve query performance and cost efficiency in MySQL (Cloud SQL) and BigQuery through refactoring and tuning.
  • Build dbt models with incremental strategies and GitHub-based CI for validation.
  • Define semantic standards, KPI definitions, and documentation across analytics layers.
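The incremental strategy mentioned above follows a standard high-water-mark pattern: on each run, load only source rows newer than the newest row already in the destination (what a dbt incremental model expresses with `is_incremental()` and a `MAX(...)` filter). A minimal sqlite illustration, with made-up table and column names:

```python
import sqlite3

# Illustrative incremental load: copy only source rows newer than the
# destination's high-water mark. Table/column names are hypothetical.
def incremental_load(conn):
    (high_water,) = conn.execute(
        "SELECT COALESCE(MAX(event_ts), 0) FROM events_mart"
    ).fetchone()
    conn.execute(
        "INSERT INTO events_mart SELECT id, event_ts FROM raw_events "
        "WHERE event_ts > ?",
        (high_water,),
    )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (id INTEGER, event_ts INTEGER)")
conn.execute("CREATE TABLE events_mart (id INTEGER, event_ts INTEGER)")
conn.executemany("INSERT INTO raw_events VALUES (?, ?)",
                 [(1, 100), (2, 200)])
incremental_load(conn)                     # first run loads both rows
conn.execute("INSERT INTO raw_events VALUES (3, 300)")
incremental_load(conn)                     # second run loads only id=3
rows = conn.execute("SELECT COUNT(*) FROM events_mart").fetchone()[0]
print(rows)  # 3
```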

Tech Stack: Looker, BigQuery, MySQL, dbt, GitHub CI, LookML

Cloud-native CCaaS platform powered by AI, unifying omnichannel support with automation and intelligence.

Andela

Analytics Engineer

April 2022 - April 2024 • New York, United States - Remote

  • Designed scalable analytics architectures spanning ingestion, transformation, semantic layer, and BI—adapting patterns across GCP and client environments.
  • Delivered scalable data lakes and ingestion pipelines in GCP with reliable, documented datasets.
  • Implemented Infrastructure as Code (Terraform) for secure, repeatable environment provisioning and management.
  • Automated analytics workflows (QA, deployments, modeling standards) to accelerate delivery and quality.
  • Designed Looker Explores and dashboards for Product, Marketing, Network, and Revenue teams with consistent KPIs.
  • Established data quality controls, monitoring, and documentation via dbt docs, fostering a self-service data discovery culture across teams.
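One common building block of the monitoring above is a volume-anomaly check: flag a day's row count if it deviates from recent history by more than k standard deviations. A stdlib-only sketch (the counts and 3-sigma threshold are example values, not client data):

```python
import statistics

# Illustrative volume-anomaly check for pipeline monitoring: flag a
# day's row count if it sits more than k standard deviations from the
# mean of recent history. Threshold and data are made-up examples.
def is_anomalous(history, today, k=3.0):
    mean = statistics.fmean(history)
    sd = statistics.stdev(history)
    if sd == 0:
        return today != mean
    return abs(today - mean) > k * sd

history = [980, 1010, 995, 1005, 990, 1000, 1015]   # daily row counts
print(is_anomalous(history, 1002))  # False: typical volume
print(is_anomalous(history, 40))    # True: near-empty load is flagged
```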

Tech Stack: GCP, dbt, Terraform, Airflow, Python, Looker

Global talent marketplace with 150K+ developers across 135+ countries, serving 600+ clients; unicorn status ($1.5B) after a $200M Series E.

Trafilea

Senior BI Data Analyst

June 2020 - April 2022 • Montevideo, Uruguay - Remote

  • Embedded in cross-functional squads with Data Science, Marketing (Acquisition & Retention), and Executive teams—collaborating on exploratory analyses, translating insights into operational improvements, and building Tableau dashboards to track business performance.
  • Built decision-support layers and single-source-of-truth datasets for spend, traffic, conversion, and revenue.
  • Partnered with Data Science on forecasting models (demand/spend time series, retention/repurchase propensity) and statistical analyses that informed inventory planning and media investment.
  • Supported media investment optimization with weekly performance reviews and scenario analyses.
  • Implemented ELT/ETL pipelines with QA and documentation; standardized A/B test analysis with KPI frameworks.
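Standardizing A/B test analysis usually means fixing one readout method for every test; a common choice is the two-proportion z-test. A stdlib-only sketch with made-up conversion counts (illustrative, not the exact framework used):

```python
from math import erf, sqrt

# Illustrative two-proportion z-test of the kind used to standardize
# A/B test readouts. The conversion counts below are example data.
def two_prop_z(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Variant B lifts conversion from 5.0% to 6.5% on 2,400 users per arm
z, p = two_prop_z(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(round(z, 2), round(p, 4))
```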

Tech Stack: AWS (Redshift, RDS, Athena, S3), Tableau, Python, Airflow

Direct-to-consumer e-commerce group (Shapermint, Truekind) scaled from 0 to $200M+ in 5 years; Shapermint reached 10M+ customers and sold 22M+ units.

Zigla

Data Analyst

June 2019 - June 2020 • Buenos Aires, Argentina

  • Served multiple clients simultaneously, working directly with business stakeholders on data cleaning, transformation, and dashboard delivery—including clients running GCP, Azure Synapse, and Snowflake environments.
  • Led end-to-end digital transformation projects: discovery → roadmap → delivery → change management.
  • Delivered complete reporting environments: data ingestion, modeling, and BI dashboards with clear KPIs.
  • Automated recurring ETL/ELT processes, replacing manual workflows and reducing reporting times.
  • Trained client teams through workshops and playbooks, enabling sustainable self-service analytics.

Tech Stack: Python, SQL, Azure Synapse, Power BI, Tableau, Salesforce

Consultancy supporting non-profit organizations (UNICEF, United Nations, Barkey Foundation, Disney Foundation) in building intelligent systems.

Education

Master's in Data Mining & Knowledge Discovery

Universidad Austral, Argentina • 2018 - 2019

Bachelor's in Economics

University of Buenos Aires • 2013 - 2017