What metadata footprint will 5 million daily BigQuery records create in the Postgres database backing Metabase?

@flamber Can you give me an idea: if I use a BigQuery database with on the order of 5 million records ingested daily, what footprint does Metabase create in the underlying Postgres database used as its application database? I am wondering whether Metabase will be able to handle data analysis on that volume of data.

I am worried that the Postgres instance backing Metabase will need to be scaled up in storage and compute, and I would like to know whether this is the right approach and a use case Metabase is known to handle.

Hi @sagun.garg
Please don’t direct new questions to me - this is a community forum, where anyone should be able to help.
It will likely not make any difference. The scan process only looks through a few thousand distinct values per field, and only a few hundred values are stored in the application database.
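If you want to verify the footprint yourself, you can check the size of the table where Metabase stores scanned field values. This is a sketch assuming a standard Metabase application database schema on Postgres, where scanned values live in the `metabase_fieldvalues` table:

```sql
-- Approximate on-disk size of the scanned field values
-- (includes indexes and TOAST data)
SELECT pg_size_pretty(pg_total_relation_size('metabase_fieldvalues'));

-- Number of fields that have cached values
SELECT count(*) FROM metabase_fieldvalues;
```

Even for large source databases, this table typically stays small, because Metabase caps how many distinct values it caches per field.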