Metabase BigQuery Costs

Hi, I'm currently using BigQuery as my data warehouse and I'm looking to bring costs down.

I'm currently:

  • Partitioning tables where possible
  • Avoiding `SELECT *` in queries where possible
  • Caching queries that take over 3 seconds (I just turned this on and I'm not sure how many seconds to go for)
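For anyone finding this thread later, the first two bullets boil down to the pattern below. This is just an illustrative sketch; the dataset/table name (`mydataset.events`) and columns are made up:

```sql
-- Hypothetical table partitioned on a date derived from a timestamp column.
CREATE TABLE mydataset.events (
  event_ts TIMESTAMP,
  user_id  STRING,
  payload  STRING
)
PARTITION BY DATE(event_ts);

-- Select only the columns you need (no SELECT *) and filter on the
-- partitioning column, so BigQuery prunes old partitions instead of
-- scanning the whole table.
SELECT user_id, payload
FROM mydataset.events
WHERE DATE(event_ts) >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY);
```

On-demand billing is per byte scanned, so both column pruning and partition pruning directly reduce the bill.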

Are there common cost-saving measures I'm missing? Should I look to reduce the number of database scans Metabase does?

I can provide more specific details if that's helpful.



Hi @j95
Limit the datasets available in Metabase. Disable "Include User ID and query hash in queries". Run the sync only daily. Disable scans.

Thanks so much for the quick reply, flamber!

If you don't mind me asking @j95 – is your usage actually generating costs?

We've been using Metabase with BigQuery for a few months with our team of 8, and so far our usage has stayed under Google's free-tier threshold. I'm trying to anticipate (and budget for) the day we exceed it.
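One way to budget ahead of time is BigQuery's dry-run mode, which reports how many bytes a query would scan without actually running it. A rough sketch of turning that number into dollars; the $5/TiB rate is an assumption for illustration only, so check Google's current on-demand pricing for your region:

```python
def estimate_cost_usd(bytes_processed: int, price_per_tib: float = 5.0) -> float:
    """Convert bytes scanned into an approximate on-demand charge.

    price_per_tib is an illustrative placeholder, not the current rate.
    """
    return bytes_processed / 1024**4 * price_per_tib


# With the google-cloud-bigquery client, a dry run reports the bytes a
# query *would* scan (requires credentials, so shown as a comment):
#
#   from google.cloud import bigquery
#   client = bigquery.Client()
#   job = client.query(sql, job_config=bigquery.QueryJobConfig(dry_run=True))
#   print(estimate_cost_usd(job.total_bytes_processed))

print(estimate_cost_usd(2 * 1024**4))  # 2 TiB scanned -> 10.0
```

Running your heaviest dashboard queries through a dry run before scheduling them gives a decent upper bound on the monthly bill.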

Hi @jamesgreenblue. The only reason it was costing us anything was that I had set up our BigQuery instance with much higher storage and performance than necessary. I've since downgraded it and costs are now very low; we're charged only a few dollars a day at the moment.

In general I think it's fairly easy to keep costs low in a small team/with a small amount of data querying.
