We're going to monitor some metrics and are considering Metabase for the job. The DB will grow by about 15 million records per month, which seems like a lot. The backend DB should be able to handle that volume, but will Metabase?
I've noticed I can modify some cache settings, which might help. Any suggestions?
EDIT: adding some more details.
The DB hosts metrics for an app running on various hosts: we analyze time (ms), size (bytes), and the libraries called within the app's various layers. For now it's a single table with 20 columns, holding timestamps/dates (down to the ms) plus numeric and alphanumeric values.
Since these are basically time series, past data won't change. We have about 30 questions split among 3 dashboards; we look at average/max/standard deviation, grouped both by date and by library name. We do use filters, though: date, host, and library called (an alphanumeric value).
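To make the workload concrete, here's a minimal sketch of the kind of grouped aggregation the dashboards run, using an in-memory SQLite table as a stand-in for the real metrics table. All names here (app_metrics, ts, host, library, duration_ms, size_bytes) are placeholders, not the actual schema.

```python
import sqlite3
import statistics

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE app_metrics (
        ts TEXT,            -- timestamp, down to the ms
        host TEXT,
        library TEXT,
        duration_ms REAL,
        size_bytes INTEGER
    )
""")
rows = [
    ("2024-01-01 10:00:00.123", "host-a", "libfoo", 12.0, 512),
    ("2024-01-01 10:00:01.456", "host-a", "libfoo", 18.0, 640),
    ("2024-01-02 09:30:00.001", "host-b", "libbar", 40.0, 2048),
]
conn.executemany("INSERT INTO app_metrics VALUES (?, ?, ?, ?, ?)", rows)

# SQLite has no STDDEV aggregate, so pull the grouped values out
# and compute the standard deviation in Python.
grouped = {}
for day, lib, ms in conn.execute(
    "SELECT date(ts), library, duration_ms FROM app_metrics"
):
    grouped.setdefault((day, lib), []).append(ms)

for (day, lib), values in sorted(grouped.items()):
    print(day, lib,
          f"avg={statistics.mean(values):.1f}",
          f"max={max(values):.1f}",
          f"stdev={statistics.pstdev(values):.1f}")
```

Since past rows never change, a pre-aggregated daily summary table with this shape (one row per date/host/library) would shrink 15M rows/month down to whatever the dashboards actually group by.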
I think caching might help a lot in this situation. Any suggestions?
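For what it's worth, Metabase's query caching can also be set via environment variables rather than the admin UI. The variable names below match what I've seen in the Metabase docs, but they vary across versions, so double-check against the docs for your release before relying on them:

```shell
# Enable result caching for saved questions (config sketch; verify
# names against your Metabase version's environment-variable docs).
export MB_ENABLE_QUERY_CACHING=true
# Only cache questions whose average runtime exceeds this (ms).
export MB_QUERY_CACHING_MIN_TTL=60
# Keep cached results for roughly (avg runtime x ratio) seconds.
export MB_QUERY_CACHING_TTL_RATIO=10
# Don't cache result sets larger than this (KB).
export MB_QUERY_CACHING_MAX_KB=1000
```

Since the data is append-only time series, a long TTL is safe here: a cached dashboard can only ever be missing the newest rows, never showing wrong historical ones.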
We could also limit the amount of data Metabase actually analyzes by rewriting the table structures, but we'd still be talking about millions of records. Alternatively, we could work on a monthly database. The problem with that is that every month we'd have to redo the dashboards, because changing the DB/table of a question also loses its query and visualization. Is there any way to achieve this? I see there's an internal DB, which I suppose holds all that info. Would it be possible to operate on it directly to change a question's DB/table source while keeping its query and visualization settings?
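From poking around, the internal (application) DB does hold this: each question lives in a table called report_card, with the query itself stored as JSON in the dataset_query column. A repoint would mean rewriting the "database" and "query"/"source-table" fields in that JSON (and the matching database_id/table_id columns). The JSON shape below matches what I've seen, but treat it as an assumption: verify against your own instance and back up the application DB before writing anything. The sketch only rewrites the JSON, not the DB:

```python
import json

def repoint_question(dataset_query: str, new_db_id: int, new_table_id: int) -> str:
    """Rewrite a question's dataset_query JSON to point at a new
    database/table, leaving aggregations, breakouts, and filters
    (and hence the visualization) untouched."""
    q = json.loads(dataset_query)
    q["database"] = new_db_id
    if "query" in q:  # structured (GUI-built) questions
        q["query"]["source-table"] = new_table_id
    # Native SQL questions store the query text under "native" instead;
    # those would need the SQL itself edited, not just IDs.
    return json.dumps(q)

# Illustrative dataset_query for a structured question; all IDs
# (database 2, table 7, field 42, ...) are made up for the example.
original = json.dumps({
    "database": 2,
    "type": "query",
    "query": {
        "source-table": 7,
        "aggregation": [["avg", ["field", 42, None]]],
        "breakout": [["field", 13, {"temporal-unit": "day"}]],
    },
})
updated = repoint_question(original, new_db_id=3, new_table_id=9)
print(updated)
```

One caveat: field IDs inside the aggregation/breakout clauses are also per-table in Metabase's metadata, so a clean repoint likely needs those remapped too. Identical monthly table schemas would at least make that mapping mechanical.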