Metabase efforts on heavy databases

At work, we have a solution that stores its engine's data in a Postgres database. The solution manages flows defined by a JSON description and transforms them into input for the engines of several applications. Each solution depends on multiple (possibly branching) workflows, each workflow has multiple processes per user session, and each process has multiple current states.

So far so good! As you can imagine, the process_state table could be classified as either "Medium Data" or "Big Data". The question is: what does Metabase do to lighten the load on heavy-data tables?

Metabase builds queries and runs them in your database, so performance depends mostly on your schema design and how beefy your servers are.

Therefore, I sincerely recommend that the Metabase team provide some DBA-consulting documentation to guide Metabase users. Table indexing would be the most valuable topic.
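To illustrate the indexing suggestion, here is a minimal Postgres sketch. The table and column names (`process_state`, `process_id`, `created_at`, `status`) are hypothetical stand-ins for whatever your Metabase questions actually filter and sort on:

```sql
-- Hypothetical example: index the columns your Metabase
-- questions filter and sort on most often.

-- B-tree index for lookups by process, newest states first:
CREATE INDEX idx_process_state_process_created
    ON process_state (process_id, created_at DESC);

-- Partial index, useful if dashboards mostly query active states:
CREATE INDEX idx_process_state_active
    ON process_state (process_id)
    WHERE status = 'active';
```

After adding an index, run `EXPLAIN ANALYZE` on the query Metabase generates to confirm the planner actually uses it; an unused index only adds write overhead.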

@brunolnetto That would not be specific to Metabase at all and would be a giant, complicated topic, which would depend a lot on your specific database type, structure, amount of data and more.
There's a quick summary in https://www.metabase.com/learn/administration/making-dashboards-faster
You should hire a DBA to help you.
Also worth reading https://www.metabase.com/learn/administration/metabase-at-scale

Very well! Thank you for the advice. You have earned my respect once more.