I’m building a data panel, and for some of the charts the query is quite heavy. By heavy I mean a query joining a table of 13 million rows with another of 4 million. On my hardware, this takes one or two minutes to run.
The thing is, this data does not change often: the database is updated monthly.
Is there any strategy to cache the results of these queries? It makes no sense to run them every time someone accesses the charts.
Can Metabase do something about this, or should I focus on the DBMS? Mine is MySQL, but I’m open to other possibilities.
There are two ways to improve query speed. One, as @william mentioned, is Metabase caching, which should be turned on anyway if the long queries run often and the data rarely changes.
The other is to look at how your DB is structured and indexed, or whether a different DB environment would better suit the type of data you are storing.
An example: people will often spin up a MongoDB server, plug Metabase into it, and find the queries run slowly because they didn’t properly index the data.
@alantygel you may want to look into Dremio; it’s open source. Not only can you build your joins in a user-friendly way, but the queries are then accelerated via Apache Arrow and Reflections, which can bring them down to sub-second. Hope that helps!