[v3] Expiration of incremental source tables
Incident Report for Y42
Resolved
This incident has been resolved.
Posted May 15, 2024 - 09:30 CEST
Identified
Our team detected an issue with our expiration logic for incremental imports. Over the past 5 days this logic deleted the underlying table for a small set of incremental imports. Subsequent imports recreated the table, but it does not contain the full historical data.

We recommend running a full import for all incremental models to ensure complete data. You can do this by executing `y42 build --full-refresh`.

For some API connectors, e.g. Facebook, a full import only retrieves data for a limited historical time window. You can restore the data preceding this window using time travel. Follow https://cloud.google.com/bigquery/docs/access-historical-data#query_data_at_a_point_in_time and https://docs.snowflake.com/en/user-guide/data-time-travel for BigQuery and Snowflake warehouses, respectively. If you need this historical data, we recommend taking action immediately, as time travel in the underlying warehouses is only available for a short retention period. Reach out to your customer success representative if you have any questions.
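For illustration, a point-in-time read could look like the sketch below. The table names and timestamps are placeholders, and the retention window actually available depends on your warehouse configuration.

```sql
-- BigQuery: read the table as it existed before the deletion
-- (project, dataset, table, and timestamp are placeholder values)
SELECT *
FROM `my_project.my_dataset.my_incremental_table`
  FOR SYSTEM_TIME AS OF TIMESTAMP '2024-05-08 00:00:00+00';

-- Snowflake: recreate the historical state as a new table via time travel
-- (table names and timestamp are placeholder values)
CREATE TABLE my_incremental_table_restored
  CLONE my_incremental_table
  AT (TIMESTAMP => '2024-05-08 00:00:00 +0000'::TIMESTAMP_TZ);
```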
Posted May 13, 2024 - 18:55 CEST
This incident affected: Integrations.