Hi, I'm currently using BigQuery as my data warehouse and I'm looking to get costs down.
I'm currently:
- Partitioning tables where possible
- Avoiding `SELECT *` in queries where possible (rough sketch of both points below)
- Have just turned on caching for queries that take over 3 seconds (not sure what the right threshold is)
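
For context, here's a minimal sketch of what I mean by the first two points. Table and column names like `analytics.events` are made up, and `require_partition_filter` is just one way to enforce pruning, not necessarily what everyone should use:

```sql
-- Hypothetical table, partitioned by day on the timestamp column
CREATE TABLE analytics.events (
  event_id   STRING,
  user_id    STRING,
  event_type STRING,
  created_at TIMESTAMP
)
PARTITION BY DATE(created_at)
OPTIONS (require_partition_filter = TRUE);  -- reject queries that would scan every partition

-- Select only the columns needed and always filter on the partition column,
-- so BigQuery prunes partitions instead of scanning the whole table
SELECT user_id, event_type
FROM analytics.events
WHERE DATE(created_at) >= DATE '2024-01-01';
```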
Are there common cost-saving measures I'm missing? Should I look to reduce the number of database scans Metabase does?
Can provide more specific details if it's helpful.
Thanks,
Jeremy