Metabase Big Query Costs

Hi, I'm currently using BigQuery as my data warehouse and I'm looking to get costs down.

I'm currently:

  • Partitioning tables where possible
  • Avoiding SELECT * in queries where possible
  • Just turned on caching for queries that take over 3 seconds (not sure what threshold to go for)
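To put rough numbers on why the first two items matter: BigQuery's on-demand pricing bills by bytes scanned, so partition pruning and selecting fewer columns directly cut cost. A minimal sketch of the arithmetic (the $6.25/TiB rate and 10 MiB per-query minimum are assumptions from on-demand pricing at time of writing — check current Google Cloud pricing for your region):

```python
def estimated_query_cost_usd(bytes_scanned: int,
                             price_per_tib: float = 6.25) -> float:
    """Rough BigQuery on-demand cost estimate from bytes scanned."""
    TIB = 1024 ** 4
    MIN_BILLED = 10 * 1024 ** 2  # assumed 10 MiB minimum billed per query
    billed = max(bytes_scanned, MIN_BILLED)
    return billed / TIB * price_per_tib

# Full scan of a 500 GiB table vs. hitting a single ~5 GiB partition:
full_scan = estimated_query_cost_usd(500 * 1024 ** 3)   # ~$3.05 per run
partition_scan = estimated_query_cost_usd(5 * 1024 ** 3)  # ~$0.03 per run
```

A dashboard that refreshes the full-scan query dozens of times a day adds up quickly, which is also why caching repeated queries in Metabase helps.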

Are there common cost-saving measures I'm missing? Should I look to reduce the number of database scans Metabase does?

Can provide more specific details if it's helpful.

Thanks,

Jeremy

Hi @j95

  • Limit the datasets available in Metabase
  • Disable "Include User ID and query hash in queries"
  • Only do daily sync
  • Disable scans
https://www.metabase.com/docs/latest/databases/connections/bigquery

Thanks so much for the quick reply, flamber!