Huge latencies when loading dashboards with AWS hosting

Hi,

We have a problem where dashboards suddenly do not load at all (see screenshot 1), or take a very long time to load (see screenshot 2).

We don't know exactly what is happening. We host Metabase 0.41.2 (the current latest version) on AWS (set up following the official documentation).

We constantly run 3 instances using AWS Elastic Beanstalk (which can scale up to 10 based on the target response time: when it gets too high, a new instance is added to help).

Here is a few-hours window from our AWS Beanstalk monitoring (see screenshot 3). It shows that the environment very often switches to "degraded", which is very bad. The target response time of the instances is the metric used to assess the health of the environment: we see that it regularly has huge peaks. The number of requests is a few hundred during work days: we don't think that's a crazy load. Max Network In shows a huge peak during the morning, more than 286 MB: we don't really know why, but it suggests a bottleneck related to the amount of data received by the instances (due to the Metabase sync? See our 2nd hypothesis below). Max Network Out is normal. Finally, the CPU of the instances is always low, so compute power is not a problem (vertical scaling is OK).

We have 2 hypotheses, but nothing we tried has solved the problem so far:

  1. The number of simultaneous connections to the Metabase application database (an AWS RDS Postgres database in our case).
    We therefore increased the number of connections to this database and to our "data" database, as well as the number of Jetty threads. Here are the environment variables we increased from their default values: MB_APPLICATION_DB_MAX_CONNECTION_POOL_SIZE (100), MB_JDBC_DATA_WAREHOUSE_MAX_CONNECTION_POOL_SIZE (100), MB_JETTY_MAXTHREADS (150).
    We still get the "degraded" states though.

  2. Metabase regularly syncs the databases to pick up the newest schemas, tables, columns and their data types. We think this sync may be related to the ~286 MB peaks we regularly see in the Max Network In graph of our Beanstalk.
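To make hypothesis 1 concrete, here is a quick back-of-the-envelope sketch (assuming each Metabase instance keeps its own connection pool of up to MB_JDBC_DATA_WAREHOUSE_MAX_CONNECTION_POOL_SIZE connections per configured database):

```python
# Worst-case connection count Metabase can open to one database,
# versus what that database accepts. Values below are from our setup;
# the database-side limit depends on the RDS/Heroku plan.

def max_metabase_connections(instances: int, pool_size: int) -> int:
    """Each instance keeps its own JDBC pool per configured database,
    so the worst case is instances * pool_size."""
    return instances * pool_size

instances = 3      # steady-state EB instances (can scale to 10)
pool_size = 100    # MB_JDBC_DATA_WAREHOUSE_MAX_CONNECTION_POOL_SIZE

print(max_metabase_connections(instances, pool_size))  # 300 in steady state
print(max_metabase_connections(10, pool_size))         # 1000 when scaled out
```

At full scale-out, 10 instances x 100 connections would exceed a 500-connection database limit, which is worth keeping in mind when raising this variable.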

Additional information:

  • We use c5a.large and r5a.large instances.
  • Our Postgres RDS (the Metabase application database) is a db.r5.large. That's a huge configuration for such a small database, but in order to increase the number of connections we had to increase the RAM, and therefore the instance class. It didn't help much and we still have the latency problem.
  • Our "data" database is for now on Heroku (legacy: we killed the dynos but kept only the database): heroku-postgresql:standard-4 (30 GB RAM, 768 GB storage, 500-connection limit).
  • Everything worked pretty well for a few months, but for a good 4 to 6 weeks we have experienced more and more latency issues (we do have more users and more big dashboards lately).

We think solving this problem requires more knowledge of how Metabase works and what it needs to reach an equilibrium between good performance, cost efficiency, and scaling with the number of connections/users. Do you have an explanation, a similar experience, recommendations, or tests to share or suggest?

Thanks a lot for your help,
Best regards,

Louis

Hi @Louis
Which URL are you using for the EB health-check and how often?
https://www.metabase.com/docs/latest/operations-guide/running-metabase-on-elastic-beanstalk.html#21-enabling-enhanced-health-checks

Without seeing the logs, it's hard for me to tell what is going on, but something isn't right. Unless you have a lot of long-running queries and high concurrent user activity, your environment variables seem too high. Have you read this?
https://www.metabase.com/learn/administration/metabase-at-scale.html

As for the 286 MB ingestion, yes, that is very likely the analysis, but most likely scan or fingerprinting rather than sync.
https://www.metabase.com/docs/latest/administration-guide/01-managing-databases.html#choose-when-metabase-syncs-and-scans
Again, the logs would tell you what is happening around the time of the higher ingestion.

From my personal experience, EB is difficult to keep stable. I would recommend something more bare-bones like ECS. It's easier to maintain, and to understand when resources are maxing out, compared to all the auto-magic that EB provides.

Hi @flamber,

Thanks for the quick answer.

  • Yes, we do use the EB health-check. The trigger is every 5 minutes. It is set up as recommended in the official documentation (see screenshot).

  • We actually used the logs to try to better understand what could be going wrong. Here is a typical log line (no problem here, but you can check our current values for DB connections and Jetty threads):
    [78f95288-a545-4f92-a97e-283faa6e2e75] 2021-11-26T12:58:31+01:00 DEBUG metabase.server.middleware.log POST /api/dataset 202 [ASYNC: completed] 623.5 ms (15 DB calls) App DB connections: 0/100 Jetty threads: 2/100 (93 idle, 0 queued) (389 total active threads) Queries in flight: 90 (0 queued); postgres DB 3 connections: 90/100 (0 threads blocked)

  • As for scanning / fingerprinting, I disabled it and still got latency issues, which suggests it's not the main cause of the problem.

  • At the beginning we actually tried AWS Fargate, which is ECS, but the performance was very bad. That made us switch to EB.

What do you suggest then?

Thanks in advance

@Louis It's hard for me to tell from that single log line whether there was a lot of activity at the same time, which could account for the 90 active connections to the database.

Debugging something like this would require many more logs and an understanding of how the instances are currently being used, along with pg_stat_activity info from the database to understand the connection status.

The analysis process should only take one or two connections, so that should not be causing "latency".
I think "latency" is the wrong term to use. It seems like you might have "hanging" connections or the connection pool gets "exhausted".
But it's difficult for me to know without more logs and database connection stats.

I would just try to use a single ECS with the application database on RDS - it should be enough, but again, I have no idea of your load.

Yes, it's just a copy of the logs that made us question the environment variables governing the number of allowed connections.

I understand this is a difficult problem to debug without seeing some logs. What would be useful for you to check and assess the situation here?

We actually already tried ECS, but the performance was very bad.

@Louis I'm guessing that something must have been misconfigured, since ECS is what most people use. That's what Metabase Cloud uses.

You say that you had performance problems, but which type - where was the performance problem?

And you are not answering the questions about your load: without knowing that, no one will be able to guide you in any way.
There's a huge difference between 10 concurrent users accessing dashboards with 100 cards/questions, where each question can run for several minutes, and 100 concurrent users accessing a dashboard with 10 cards which are cached and usually load within a couple of seconds.

Logs are everything - without those, then I have no idea.

@flamber:
Yes, our configuration for ECS was perhaps off. But we had such dreadful performance from the beginning (without setting exotic parameter values) that we switched to EB.
EB worked just fine for us for a few months, so there is definitely something to fix now.

I detailed the consequences of our performance problem in Metabase a bit in my first message: we have dashboards not loading at all (none of the questions embedded in the dashboard load). Clients and co-workers have experienced it with various dashboards containing different questions. This phenomenon appeared particularly in November.

About the load:
We have up to approximately 30 people who want to load a dashboard during the morning rush hours. Some dashboards contain 10 questions, and the biggest ones contain 40-50 questions, I'd say.

Here are some useful logs for you to check. I stressed the system up to the point where the questions in my dashboard no longer return results:

> [360edd99-000f-40c5-9dc2-dd42f0526a1f] 2021-11-30T17:52:24+01:00 DEBUG metabase.server.middleware.log POST /api/card/286/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 0/61 Jetty threads: 4/150 (20 idle, 0 queued) (196 total active threads) Queries in flight: 36 (0 queued); postgres DB 3 connections: 0/12 (0 threads blocked)
> [360edd99-000f-40c5-9dc2-dd42f0526a1f] 2021-11-30T17:52:24+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 0/61 Jetty threads: 2/150 (22 idle, 0 queued) (196 total active threads) Queries in flight: 36 (0 queued); postgres DB 3 connections: 13/13 (6 threads blocked)
> [360edd99-000f-40c5-9dc2-dd42f0526a1f] 2021-11-30T17:52:24+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 0/61 Jetty threads: 3/150 (22 idle, 0 queued) (196 total active threads) Queries in flight: 35 (0 queued); postgres DB 3 connections: 6/12 (0 threads blocked)
> [19a0140b-5765-4b34-b36f-4c870d87bf6b] 2021-11-30T17:52:24+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 0/34 Jetty threads: 2/150 (20 idle, 0 queued) (198 total active threads) Queries in flight: 38 (0 queued); postgres DB 3 connections: 13/13 (5 threads blocked)
> [19a0140b-5765-4b34-b36f-4c870d87bf6b] 2021-11-30T17:52:24+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 0/34 Jetty threads: 2/150 (20 idle, 0 queued) (198 total active threads) Queries in flight: 37 (0 queued); postgres DB 3 connections: 3/13 (0 threads blocked)
> [19a0140b-5765-4b34-b36f-4c870d87bf6b] 2021-11-30T17:52:24+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 0/34 Jetty threads: 2/150 (20 idle, 0 queued) (198 total active threads) Queries in flight: 36 (0 queued); postgres DB 3 connections: 2/13 (0 threads blocked)
> [360edd99-000f-40c5-9dc2-dd42f0526a1f] 2021-11-30T17:52:24+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 0/61 Jetty threads: 2/150 (22 idle, 0 queued) (196 total active threads) Queries in flight: 33 (0 queued); postgres DB 3 connections: 4/12 (0 threads blocked)
> [360edd99-000f-40c5-9dc2-dd42f0526a1f] 2021-11-30T17:52:24+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 0/61 Jetty threads: 4/150 (21 idle, 0 queued) (196 total active threads) Queries in flight: 33 (0 queued); postgres DB 3 connections: 2/12 (0 threads blocked)

It looks like "postgres DB 3 connections" is the problem here.
Database "postgres DB 3" is indeed my "data" database (not the Metabase application database). According to the official documentation, MB_JDBC_DATA_WAREHOUSE_MAX_CONNECTION_POOL_SIZE is the environment variable to adjust. We went from 100 to 300 and the dashboard still didn't load; same when going down to 50. We don't understand the real impact of this variable: the number of active connections always equals the pool size for the questions of the dashboard that doesn't load (obviously, since that seems to be the bottleneck). But the pool size itself does not stay constant: the maximum I've seen is 50; sometimes it's 13, then 33, etc. In short, there is no clear correlation with MB_JDBC_DATA_WAREHOUSE_MAX_CONNECTION_POOL_SIZE.

Can you provide more details about this parameter, and do you think that's indeed the bottleneck we need to fix?

PS: our Postgres database can accept up to 500 simultaneous connections.

@Louis I've written the documentation for this, so I should be able to answer :wink:

MB_JDBC_DATA_WAREHOUSE_MAX_CONNECTION_POOL_SIZE is the maximum number of connections that each instance will open to a database.
Meaning if you have configured a db in Admin > Databases and are running two instances with the default value of 15, then each instance will open up to 15 connections for each configured db.

But ASYNC: canceled means that the connection between Metabase and the browser was closed, and given that it happens at 1 minute, I'm guessing it's your reverse proxy that closes the connection.

Please read this too: https://www.metabase.com/docs/latest/troubleshooting-guide/server-logs.html

But you are not providing enough information. The log snippet doesn't look like there's much activity.
What is the state of your database, when you match it with the logs?
SELECT * FROM pg_stat_activity
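If it helps to correlate that query with the logs, here is a minimal sketch of how to summarize the result (a hypothetical helper: it assumes the rows from pg_stat_activity are fetched as dicts with `state` and `wait_event_type` keys, e.g. via psycopg2):

```python
from collections import Counter

def summarize_activity(rows):
    """Count pg_stat_activity rows per (state, wait_event_type) so that
    'idle in transaction' sessions or lock waits stand out at a glance."""
    return Counter((r.get("state"), r.get("wait_event_type")) for r in rows)

# Sample rows standing in for a real fetch, e.g.:
#   cur.execute("SELECT state, wait_event_type FROM pg_stat_activity")
rows = [
    {"state": "active", "wait_event_type": None},
    {"state": "active", "wait_event_type": "Lock"},
    {"state": "idle in transaction", "wait_event_type": "Client"},
    {"state": "idle in transaction", "wait_event_type": "Client"},
]
for (state, wait), n in summarize_activity(rows).most_common():
    print(f"{n:3d}  state={state}  wait={wait}")
```

A pile of connections stuck in "idle in transaction" or waiting on locks at the same moment the dashboards hang would point at the data warehouse rather than at Metabase itself.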

@flamber
Thanks for the clear explanation. Most of the time we have Metabase running on 3 instances, so MB_JDBC_DATA_WAREHOUSE_MAX_CONNECTION_POOL_SIZE = 100 means up to 300 total open connections to the database.

NB: When I run the tests to get the logs, I use a huge dashboard containing 106 questions. Many are the same questions but with different values from different dashboard filters. I just ran some tests while I was sure I was the only one using Metabase: the dashboard could not load (only 5-10 questions out of 106 displayed results).
I should mention that I have regularly seen this dashboard load in 2-3 seconds (during my previous investigation), so it's definitely possible to load it quickly.

I launched the dashboard 2 times. Both tests failed and the dashboard didn't load. Below are some more logs:

In particular, we can see (for example): (32 DB calls) App DB connections: 2/100 Jetty threads: 3/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 62 (0 queued); postgres DB 3 connections: 39/39 (59 threads blocked)

  • Not many Jetty threads in use, but a lot idle.
  • Lots of queries in flight: no queue.
  • As many active connections as the connection pool size.
  • Lots of threads blocked.
  • Almost all the logs show [ASYNC: canceled] with "1.0 mins" or "59.3 s" (for some reason). I don't stop the dashboard or close the window at 1 min; I think (I may be wrong) it's the Metabase timeout. Every time, the dashboard doesn't load and the questions are cancelled.

[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:16+01:00 DEBUG metabase.server.middleware.log GET /api/user/current 200 3.2 ms (3 DB calls) App DB connections: 0/100 Jetty threads: 4/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 97 (0 queued)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:16+01:00 DEBUG metabase.server.middleware.log GET /api/session/properties 200 3.9 ms (2 DB calls) App DB connections: 0/100 Jetty threads: 4/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 97 (0 queued)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:16+01:00 DEBUG metabase.server.middleware.log GET /api/database 200 4.8 ms (3 DB calls) App DB connections: 0/100 Jetty threads: 3/150 (26 idle, 0 queued) (343 total active threads) Queries in flight: 97 (0 queued)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 0/100 Jetty threads: 2/150 (26 idle, 0 queued) (343 total active threads) Queries in flight: 96 (0 queued); postgres DB 3 connections: 1/1 (0 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 0/100 Jetty threads: 2/150 (26 idle, 0 queued) (343 total active threads) Queries in flight: 95 (0 queued); postgres DB 3 connections: 3/3 (3 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 0/100 Jetty threads: 2/150 (26 idle, 0 queued) (343 total active threads) Queries in flight: 93 (0 queued); postgres DB 3 connections: 0/1 (0 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 0/100 Jetty threads: 2/150 (26 idle, 0 queued) (343 total active threads) Queries in flight: 93 (0 queued); postgres DB 3 connections: 1/1 (2 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (33 DB calls) App DB connections: 0/100 Jetty threads: 3/150 (26 idle, 0 queued) (343 total active threads) Queries in flight: 92 (0 queued); postgres DB 3 connections: 6/6 (2 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 0/100 Jetty threads: 2/150 (26 idle, 0 queued) (343 total active threads) Queries in flight: 91 (0 queued); postgres DB 3 connections: 3/3 (4 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 0/100 Jetty threads: 4/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 90 (0 queued); postgres DB 3 connections: 1/1 (1 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 0/100 Jetty threads: 4/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 89 (0 queued); postgres DB 3 connections: 3/3 (2 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 0/100 Jetty threads: 2/150 (26 idle, 0 queued) (343 total active threads) Queries in flight: 87 (0 queued); postgres DB 3 connections: 3/3 (1 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 0/100 Jetty threads: 2/150 (26 idle, 0 queued) (343 total active threads) Queries in flight: 87 (0 queued); postgres DB 3 connections: 9/9 (3 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 0/100 Jetty threads: 2/150 (26 idle, 0 queued) (343 total active threads) Queries in flight: 86 (0 queued); postgres DB 3 connections: 7/7 (3 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2372/query 202 [ASYNC: canceled] 59.1 s (31 DB calls) App DB connections: 0/100 Jetty threads: 2/150 (26 idle, 0 queued) (343 total active threads) Queries in flight: 84 (0 queued); postgres DB 3 connections: 32/32 (27 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 1/100 Jetty threads: 2/150 (26 idle, 0 queued) (343 total active threads) Queries in flight: 84 (0 queued); postgres DB 3 connections: 8/8 (3 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 59.1 s (30 DB calls) App DB connections: 1/100 Jetty threads: 4/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 83 (0 queued); postgres DB 3 connections: 35/35 (45 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 1/100 Jetty threads: 2/150 (26 idle, 0 queued) (343 total active threads) Queries in flight: 81 (0 queued); postgres DB 3 connections: 7/7 (2 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 1/100 Jetty threads: 3/150 (26 idle, 0 queued) (343 total active threads) Queries in flight: 81 (0 queued); postgres DB 3 connections: 13/13 (1 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 59.1 s (32 DB calls) App DB connections: 1/100 Jetty threads: 3/150 (26 idle, 0 queued) (343 total active threads) Queries in flight: 80 (0 queued); postgres DB 3 connections: 35/35 (42 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 59.1 s (32 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (26 idle, 0 queued) (343 total active threads) Queries in flight: 79 (0 queued); postgres DB 3 connections: 35/35 (43 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 59.1 s (31 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (26 idle, 0 queued) (343 total active threads) Queries in flight: 78 (0 queued); postgres DB 3 connections: 35/35 (49 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 59.1 s (31 DB calls) App DB connections: 0/100 Jetty threads: 3/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 76 (0 queued); postgres DB 3 connections: 35/35 (46 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 59.1 s (31 DB calls) App DB connections: 0/100 Jetty threads: 3/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 76 (0 queued); postgres DB 3 connections: 35/35 (48 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2372/query 202 [ASYNC: canceled] 59.2 s (31 DB calls) App DB connections: 0/100 Jetty threads: 6/150 (22 idle, 0 queued) (343 total active threads) Queries in flight: 73 (0 queued); postgres DB 3 connections: 34/34 (41 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 59.2 s (32 DB calls) App DB connections: 0/100 Jetty threads: 6/150 (22 idle, 0 queued) (343 total active threads) Queries in flight: 73 (0 queued); postgres DB 3 connections: 35/35 (47 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 59.2 s (31 DB calls) App DB connections: 0/100 Jetty threads: 3/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 73 (0 queued); postgres DB 3 connections: 35/35 (41 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 59.3 s (31 DB calls) App DB connections: 1/100 Jetty threads: 3/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 71 (0 queued); postgres DB 3 connections: 39/39 (55 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 59.3 s (32 DB calls) App DB connections: 1/100 Jetty threads: 3/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 72 (0 queued); postgres DB 3 connections: 39/39 (53 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/286/query 202 [ASYNC: canceled] 59.2 s (31 DB calls) App DB connections: 2/100 Jetty threads: 3/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 65 (0 queued); postgres DB 3 connections: 39/39 (58 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 59.2 s (32 DB calls) App DB connections: 2/100 Jetty threads: 3/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 63 (0 queued); postgres DB 3 connections: 38/38 (50 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 59.2 s (32 DB calls) App DB connections: 2/100 Jetty threads: 3/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 63 (0 queued); postgres DB 3 connections: 39/39 (52 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 59.3 s (31 DB calls) App DB connections: 2/100 Jetty threads: 3/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 63 (0 queued); postgres DB 3 connections: 39/39 (57 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/286/query 202 [ASYNC: canceled] 59.2 s (30 DB calls) App DB connections: 2/100 Jetty threads: 3/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 63 (0 queued); postgres DB 3 connections: 39/39 (56 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 59.3 s (32 DB calls) App DB connections: 2/100 Jetty threads: 3/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 62 (0 queued); postgres DB 3 connections: 39/39 (59 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 59.3 s (31 DB calls) App DB connections: 2/100 Jetty threads: 3/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 63 (0 queued); postgres DB 3 connections: 39/39 (51 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 59.2 s (31 DB calls) App DB connections: 2/100 Jetty threads: 3/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 62 (0 queued); postgres DB 3 connections: 39/39 (50 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 59.3 s (31 DB calls) App DB connections: 2/100 Jetty threads: 3/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 62 (0 queued); postgres DB 3 connections: 39/39 (54 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 59.3 s (31 DB calls) App DB connections: 0/100 Jetty threads: 3/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 61 (0 queued); postgres DB 3 connections: 38/38 (49 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2372/query 202 [ASYNC: canceled] 59.2 s (31 DB calls) App DB connections: 0/100 Jetty threads: 3/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 60 (0 queued); postgres DB 3 connections: 38/38 (47 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2372/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 0/100 Jetty threads: 3/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 57 (0 queued); postgres DB 3 connections: 23/23 (12 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 0/100 Jetty threads: 3/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 57 (0 queued); postgres DB 3 connections: 16/16 (5 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2372/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 0/100 Jetty threads: 3/150 (25 idle, 0 queued) (343 total active threads) Queries in flight: 55 (0 queued); postgres DB 3 connections: 18/18 (7 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2372/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 0/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 55 (0 queued); postgres DB 3 connections: 23/23 (13 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 0/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 55 (0 queued); postgres DB 3 connections: 14/14 (4 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2372/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 5/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 33 (0 queued); postgres DB 3 connections: 32/32 (31 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 5/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 33 (0 queued); postgres DB 3 connections: 15/15 (4 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 5/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 33 (0 queued); postgres DB 3 connections: 18/18 (6 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 5/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 33 (0 queued); postgres DB 3 connections: 21/21 (10 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 5/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 33 (0 queued); postgres DB 3 connections: 16/16 (4 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2372/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 5/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 33 (0 queued); postgres DB 3 connections: 30/30 (20 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 1.0 mins (30 DB calls) App DB connections: 5/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 33 (0 queued); postgres DB 3 connections: 13/13 (3 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 5/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 33 (0 queued); postgres DB 3 connections: 12/12 (1 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 5/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 33 (0 queued); postgres DB 3 connections: 19/19 (9 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 6/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 25 (0 queued); postgres DB 3 connections: 32/32 (25 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 6/100 Jetty threads: 4/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 25 (0 queued); postgres DB 3 connections: 13/13 (2 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 6/100 Jetty threads: 4/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 25 (0 queued); postgres DB 3 connections: 23/23 (10 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 6/100 Jetty threads: 9/150 (20 idle, 0 queued) (344 total active threads) Queries in flight: 25 (0 queued); postgres DB 3 connections: 32/32 (29 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2372/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 6/100 Jetty threads: 4/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 25 (0 queued); postgres DB 3 connections: 32/32 (30 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 6/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 24 (0 queued); postgres DB 3 connections: 32/32 (24 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 6/100 Jetty threads: 5/150 (24 idle, 0 queued) (344 total active threads) Queries in flight: 24 (0 queued); postgres DB 3 connections: 34/34 (36 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 24 (0 queued); postgres DB 3 connections: 34/34 (37 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/286/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 24 (0 queued); postgres DB 3 connections: 33/33 (32 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/286/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 6/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 24 (0 queued); postgres DB 3 connections: 31/31 (20 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 24 (0 queued); postgres DB 3 connections: 32/32 (21 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 6/100 Jetty threads: 4/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 24 (0 queued); postgres DB 3 connections: 21/21 (9 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 2/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 24 (0 queued); postgres DB 3 connections: 19/19 (7 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 1.0 mins (30 DB calls) App DB connections: 2/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 24 (0 queued); postgres DB 3 connections: 32/32 (28 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 2/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 24 (0 queued); postgres DB 3 connections: 35/35 (44 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2372/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 2/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 24 (0 queued); postgres DB 3 connections: 19/19 (8 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 24 (0 queued); postgres DB 3 connections: 23/23 (11 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2372/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 24 (0 queued); postgres DB 3 connections: 29/29 (18 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 24 (0 queued); postgres DB 3 connections: 13/13 (4 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2372/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 24 (0 queued); postgres DB 3 connections: 32/32 (20 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 24 (0 queued); postgres DB 3 connections: 20/20 (9 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 24 (0 queued); postgres DB 3 connections: 34/34 (33 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:41+01:00 INFO metabase.query-processor.middleware.cache Query took 59.2 s to run; minimum for cache eligibility is 4.0 s
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/card/2372/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 5 (0 queued); postgres DB 3 connections: 32/32 (23 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/card/2372/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 3 (0 queued); postgres DB 3 connections: 34/34 (38 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 3 (0 queued); postgres DB 3 connections: 34/34 (35 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/card/2372/query 202 [ASYNC: canceled] 1.0 mins (30 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 3 (0 queued); postgres DB 3 connections: 30/30 (19 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 3 (0 queued); postgres DB 3 connections: 34/34 (34 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 3 (0 queued); postgres DB 3 connections: 27/27 (16 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 1.0 mins (30 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 3 (0 queued); postgres DB 3 connections: 33/33 (31 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 3 (0 queued); postgres DB 3 connections: 26/26 (16 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/card/2372/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 3 (0 queued); postgres DB 3 connections: 33/33 (33 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/card/2372/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 3 (0 queued); postgres DB 3 connections: 27/27 (18 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 3 (0 queued); postgres DB 3 connections: 29/29 (19 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 3 (0 queued); postgres DB 3 connections: 34/34 (40 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 3 (0 queued); postgres DB 3 connections: 24/24 (13 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/card/2372/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 3 (0 queued); postgres DB 3 connections: 29/29 (17 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 3 (0 queued); postgres DB 3 connections: 24/24 (15 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 3 (0 queued); postgres DB 3 connections: 29/29 (16 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 3 (0 queued); postgres DB 3 connections: 24/24 (14 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 3 (0 queued); postgres DB 3 connections: 16/16 (6 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 3/100 Jetty threads: 5/150 (24 idle, 0 queued) (344 total active threads) Queries in flight: 2 (0 queued); postgres DB 3 connections: 26/26 (15 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/card/2374/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 2 (0 queued); postgres DB 3 connections: 24/24 (16 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 2/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 2 (0 queued); postgres DB 3 connections: 27/27 (17 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/card/167/query 202 [ASYNC: canceled] 1.0 mins (32 DB calls) App DB connections: 3/100 Jetty threads: 3/150 (25 idle, 0 queued) (344 total active threads) Queries in flight: 2 (0 queued); postgres DB 3 connections: 32/32 (22 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/card/2372/query 202 [ASYNC: canceled] 1.0 mins (31 DB calls) App DB connections: 4/100 Jetty threads: 3/150 (24 idle, 0 queued) (343 total active threads) Queries in flight: 1 (0 queued); postgres DB 3 connections: 32/32 (26 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:42+01:00 DEBUG metabase.server.middleware.log POST /api/dataset 202 [ASYNC: completed] 39.1 s (4 DB calls) App DB connections: 0/100 Jetty threads: 4/150 (24 idle, 0 queued) (343 total active threads) Queries in flight: 1 (0 queued); postgres DB 3 connections: 50/50 (46 threads blocked)
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:52+01:00 INFO metabase.query-processor.middleware.cache Query took 1.2 mins to run; minimum for cache eligibility is 4.0 s
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:53+01:00 INFO metabase.query-processor.middleware.cache Query took 1.2 mins to run; minimum for cache eligibility is 4.0 s
[32f4440e-4ca2-447f-ad7b-2c62062ea6ec] 2021-12-01T00:07:54+01:00 ERROR metabase.query-processor.middleware.catch-exceptions Error processing query: null

I've also looked at the following on the Postgres side:

  • select count(*) as used from pg_stat_activity; --> the maximum I got was around 56 while the dashboard failed to load.
  • select setting::int as max_conn from pg_settings where name = $$max_connections$$; --> always returns 500.

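Beyond the raw count, it might also help to break the connections down by state and wait event. This is just a suggested diagnostic (not something from the Metabase docs), to tell idle pooled connections apart from backends that are actually blocked:

```sql
-- Suggested diagnostic: group current backends by state and wait event,
-- so a pile-up of blocked or idle-in-transaction backends stands out.
select state, wait_event_type, wait_event, count(*)
from pg_stat_activity
where datname = current_database()
group by state, wait_event_type, wait_event
order by count(*) desc;
```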
Here is a sample of the pg_stat_activity output from one of the 2 tests (truncated because the full output was too big):
> [
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:16:00.398266+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:19:00.057202+01:00","pid":9689,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:19:00.057178+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":50026},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:16:00.353239+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:19:04.260293+01:00","pid":9688,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:19:04.260269+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":50024},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:16:00.346127+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:18:57.98919+01:00","pid":9687,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:18:57.989168+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":50022},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:16:00.252965+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:18:55.513206+01:00","pid":9686,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:18:55.513179+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":50020},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:16:00.249261+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:17:00.849196+01:00","pid":9685,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:17:00.849167+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":50018},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:16:00.209268+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:19:03.465211+01:00","pid":9684,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:19:03.46518+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":50016},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:16:00.097166+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:19:04.695363+01:00","pid":9683,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:19:04.69534+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":50014},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:16:00.094789+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:19:04.832684+01:00","pid":9682,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:19:04.832664+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":50012},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:16:00.057263+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:19:05.113175+01:00","pid":9681,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:19:05.113152+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":50010},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.943519+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:16:59.605194+01:00","pid":9679,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:16:59.605157+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":50006},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.94336+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:18:44.889776+01:00","pid":9680,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:18:44.889754+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":50008},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.937246+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:19:01.645203+01:00","pid":9678,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:19:01.645178+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":50004},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.825933+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:19:03.577208+01:00","pid":9677,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:19:03.577185+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":50002},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.822866+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:18:54.293207+01:00","pid":9675,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:18:54.293184+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":49998},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.821274+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:19:04.209196+01:00","pid":9676,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:19:04.209172+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":50000},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.71374+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:19:05.548768+01:00","pid":9674,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:19:05.548741+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":49996},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.713559+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:16:59.713683+01:00","pid":9673,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:16:59.713633+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":49994},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.711293+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:16:59.594157+01:00","pid":9672,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:16:59.594117+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":49992},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.589251+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:19:04.82611+01:00","pid":9671,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:19:04.826084+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":49990},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.557271+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:16:59.722202+01:00","pid":9670,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:16:59.722175+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":49988},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.557268+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:16:59.475787+01:00","pid":9669,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:16:59.47576+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":49986},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.498134+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:19:04.629893+01:00","pid":9668,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:19:04.629871+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":49984},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.461225+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:16:59.666881+01:00","pid":9667,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:16:59.666834+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":49982},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.458177+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:19:04.950046+01:00","pid":9666,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:19:04.950024+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":49980},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.4253+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:16:59.673719+01:00","pid":9665,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:16:59.67368+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":49978},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.36632+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:16:59.647266+01:00","pid":9663,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:16:59.64707+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":49974},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.364758+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:16:59.60919+01:00","pid":9662,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:16:59.609159+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":49972},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.361254+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:19:05.437779+01:00","pid":9664,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:19:05.437755+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":49976},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.304169+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:16:59.729187+01:00","pid":9660,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:16:59.729153+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":49968},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.302412+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:19:05.449967+01:00","pid":9661,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:19:05.449945+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":49970},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.301261+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:19:04.738721+01:00","pid":9659,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:19:04.738697+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":49966},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.241953+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:17:01.660314+01:00","pid":9658,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:17:01.660275+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":49964},
> {"client_addr":"34.244.195.34","xact_start":null,"backend_xid":null,"wait_event":"ClientRead","usesysid":16384,"backend_type":"client backend","backend_start":"2021-12-01T00:15:59.23445+01:00","query":"SET SESSION CHARACTERISTICS AS TRANSACTION ISOLATION LEVEL READ COMMITTED","state_change":"2021-12-01T00:19:02.881202+01:00","pid":9657,"datname":"d33b4cjhp5ms6","client_hostname":null,"application_name":"Metabase v0.41.2 [32f4440e-4ca2-447f-ad7b-2c62062ea6ec]","wait_event_type":"Client","state":"idle","datid":16402,"query_start":"2021-12-01T00:19:02.881171+01:00","backend_xmin":null,"usename":"ubmh8t9qchmi1h","client_port":49962},

PS: I remember that changing the cache parameters helped with overall performance. Perhaps the values we use are problematic?

  • MINIMUM QUERY DURATION = 5
  • TTL = 50
  • MAX CACHE ENTRY SIZE = 200000

Today I changed the parameters to:

  • MINIMUM QUERY DURATION = 4
  • TTL = 2
  • MAX CACHE ENTRY SIZE = 100000
    This was to see if it would make an impact. Performance is not worse, but it is definitely not better.
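As a rough sketch of how the three settings interact — this is a simplified model based on the documented semantics (the minimum-duration gate, TTL as a multiplier on average query duration, max entry size in kilobytes), not Metabase's actual implementation:

```python
# Simplified model of Metabase's caching settings (illustration only):
#  - "Minimum Query Duration": only questions whose average execution time is at
#    least this many seconds are cached at all.
#  - "TTL": a multiplier on the average duration, giving the cache lifetime.
#  - "Max Cache Entry Size": results larger than this many kilobytes are skipped.

def should_cache(avg_duration_s: float, result_kb: float,
                 min_query_duration_s: float = 4,
                 max_entry_kb: float = 100_000) -> bool:
    """Is this query result eligible for caching under the given settings?"""
    return avg_duration_s >= min_query_duration_s and result_kb <= max_entry_kb

def cache_ttl_s(avg_duration_s: float, ttl_multiplier: float = 2) -> float:
    """Lifetime of a cached result: TTL multiplied by the average duration."""
    return avg_duration_s * ttl_multiplier

# With the values above, a card that usually renders in 2-3 s is never cached
# (below the 4 s minimum), and a 5 s query is only cached for 10 s.
print(should_cache(2.5, 500))   # False: too fast to qualify
print(should_cache(5, 500))     # True
print(cache_ttl_s(5))           # 10
```

This makes it visible why a TTL of 2 combined with a minimum duration of 4 s gives the cache very little to do on dashboards whose questions normally finish in 2-3 seconds.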

Thanks in advance for your help!

@Louis Please read this - even if you think you've read it, then read it again very closely:
Metabase at scale
Adjust the error message of "Your question took too long" · Issue #12423 · metabase/metabase · GitHub
Making dashboards faster

If you only have 30 concurrent users, then I doubt you need more than a single instance. And you probably only need slightly higher pools on that single instance.


The problem already existed before your log snippet. Block access for anyone besides you and restart Metabase, so you can test without a lot of other people doing things at the same time, which will make it more difficult for you to debug.

Not a lot of Jetty threads but lots in idle.

That's because you're using HTTPS over HTTP/2.

Lots of queries in flight: no queue.

Mostly ignore that, as I've also written in the troubleshooting guide to the log.

As many active connections as the connection pool size.

As I wrote in the troubleshooting guide to the log, the pool number is currently just what has been reserved in memory, not the defined max.

Lots of threads blocked.

That's the queue.

Almost all the logs are: [ASYNC: canceled] with "1 mins" or "59.3 s" (for some reason). I don't stop the dashboard / close the window at 1 min. I think (I may be wrong) it's a Metabase timeout. Every time, the dashboard doesn't load and the questions are cancelled.

There's no timeout in Metabase for 60 seconds - that is caused by something in your network.


Your pg stats don't seem to correlate with the Metabase logs you've provided.

You can try making a cronjob that runs every minute or so and kills idle connections:

DO $$ BEGIN
  PERFORM pg_terminate_backend("pid")
  FROM "pg_stat_activity"
  WHERE "pid" <> pg_backend_pid()
    AND "application_name" LIKE 'Metabase v%'
    AND "state" = 'idle';
END $$;
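One way to schedule the cleanup — a sketch, not a definitive setup: it assumes `psql` is installed on the instance, the SQL above is saved to a hypothetical `/opt/metabase/kill_idle.sql`, and the application-database connection string is defined as `DATABASE_URL` in the crontab itself (cron jobs don't inherit your shell environment):

```shell
# Hypothetical crontab: terminate idle Metabase connections every minute.
# DATABASE_URL and the file path are placeholders; adjust to your environment.
DATABASE_URL=postgres://user:password@host:5432/dbname
* * * * * psql "$DATABASE_URL" -f /opt/metabase/kill_idle.sql >/dev/null 2>&1
```

Keeping the SQL in a file also sidesteps crontab's special handling of `%` characters, which would otherwise mangle the `LIKE 'Metabase v%'` pattern.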

@flamber
I've read "Metabase at scale" again:

  • We use a much larger instance class than the 't3.small' suggested in the official documentation. Thus, no problem of vertical scaling in our case. Same for horizontal scaling, as we constantly have 3 instances. I'll adjust those after we fix the non-loading dashboard problem.
    One thing is not clear though: how do you differentiate users from the number of connections they create when they load a dashboard? 1 user could potentially load 3 dashboards with dozens of questions, and would therefore need lots of connections and lots of power.
    To be more precise about the numbers, I have 30 co-workers who could potentially load a dashboard every morning at 9AM. We also have 80 clients who have access to their respective dashboard(s), and who could potentially want to load the dashboard(s) at the same time (even if it's very rare).

  • Reading the doc, I don't see any particular wrong settings in our AWS set-up that could lead to performance issues. We do use a separate application database, a load balancer, etc. On the Metabase side, all our queries are focused on indexed tables with limited data, which are systematically filtered in the MB questions. Of course, there is room for improvement, but the dashboards that don't load are regularly displayed in 2-3 seconds. And the same day, for a reason unknown for now, they cannot load at all.

Block access for anyone besides you and restart Metabase, so you can test without a lot of other people doing things at the same time, which will make it more difficult for you to debug.

I did that by doing some more tests late in the evening: I made sure I was the only one connected to Metabase. I can provide the logs as well, but they are similar to what I've sent you in my previous message: the dashboard could not load.

Lots of threads blocked.
---> That's the queue.

Why are there so many queries in the queue? Do you think the problem can come from our Heroku Postgres database where the data is located? I've asked the support and reported some peaks in the read/write operations (max 2000 total): 11/30/21T14:30-15:10, 12/01/21T12:26. That's a lead, but we have non-loading dashboards every day, while the critical PIOPs peaks don't happen every day. Apart from that, the database is healthy (diagnose shows Green everywhere).

(Forwarded by the Heroku support.)

There's no timeout in Metabase for 60 seconds - that is caused by something in your network.

I don't find the workaround of increasing the proxy timeout to stop the automatic cancel useful. We should be capable of returning results way before that, so why increase the timeout?

Your pg stats doesn't seem to correlate with the Metabase logs you've provided.

To summarize, this is what we see: a totally acceptable number of connections to the database (<< 500).

You didn't comment on the cache parameters: do they look normal to you?

Thanks for your help on this,
Best regards,

Louis

@Louis I would very much recommend that you enable cache if you tend to have spikes like that, since that would solve those problems. If your queries normally only take 2-3 seconds to run, then your cache won't be effective, since you've set it to 4 seconds. And the TTL to 2, so it would be very short-lived too. And then you're caching up to 100MB per entry, which seems excessive, but you probably don't have anything this big, so it's not really a problem.

But please try to do exactly this:

  1. Just 1 instance, whatever size you prefer
  2. Fix your load-balancer, so it doesn't time out after 60 seconds
  3. Setup a cronjob to kill idle connections

You are essentially seeing this issue: https://github.com/metabase/metabase/issues/8679

@flamber
Thanks for all your recommendations.

  • I've set up the cache to 2s, a TTL of 50 and the maximum storage (204800 MB).
  • 1 instance for now, on a c5a.large or r5a.large instance class.
  • I've raised the timeout to 300s. I tested with a pg_sleep and it worked.
  • I've implemented the cronjob to kill idle connections: thanks for the code.
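For reference, the proxy-timeout check mentioned above can be reproduced at any time with a query that deliberately runs longer than the old 60 s limit (pg_sleep is a standard PostgreSQL function; 120 is an arbitrary duration):

```sql
-- Sleeps for 120 seconds; if the load balancer no longer cuts connections
-- at 60 s, this returns normally instead of the request being cancelled.
SELECT pg_sleep(120);
```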

I'm concerned about the NetworkIn (in bytes), from the EB monitoring tab. See the following screenshot:


I think there is still a big amount of data going into the instance. What do you think?

Also, as expected with 1 instance, the CPU is used more than before:

The peaks in CPU use correspond to the NetworkIn peaks.

The sum of requests doesn't correlate though:

Overall, the instance really often switches to warning:

However, the Heroku database shows no sign of weakness (according to Heroku support, the database is relatively healthy).


The problem may come from the PIOPs I mentioned in my previous message.

What would be the next step to figure out why the balance is still off?

@Louis The amount of data going into the instance could be someone doing a couple of large exports of some large tables, since Metabase would download up to 1 million rows.
You'll need to correlate with the logs - either in Metabase or the database query logs.
But given that you have CPU spikes at that time, then my guess would be exports.

I cannot tell you why the instance switches to warning; you probably wouldn't have this problem on ECS.
Like I said, we stopped using EB because of various problems.

I don't think your Heroku database is the problem, though, they are likely using PgBouncer or something like that, which we think is one of the reasons for the exhausted/hanging connections.

@flamber
I understand your point about switching to ECS. But if we think about it, there is no reason it should not work on EB. All we see for now are slowdowns with no apparent cause.

I've done a new test: I've deployed the newest version of Metabase. Here are the logs:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:25+01:00 INFO metabase.core
> Metabase v0.41.3 (b0ac6f2 release-x.41.x)
>
> Copyright © 2021 Metabase, Inc.
>
> Metabase Enterprise Edition extensions are NOT PRESENT.
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:25+01:00 WARN metabase.core WARNING: You have enabled namespace tracing, which could log sensitive information like db passwords.
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:25+01:00 INFO metabase.core Starting Metabase in STANDALONE mode
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:25+01:00 INFO metabase.server Launching Embedded Jetty Webserver with config:
> {:port 3000, :host "0.0.0.0", :max-threads 150}
>
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:25+01:00 INFO metabase.core Starting Metabase version v0.41.3 (b0ac6f2 release-x.41.x) ...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:25+01:00 INFO metabase.core System info:
> {"file.encoding" "UTF-8",
> "java.runtime.name" "OpenJDK Runtime Environment",
> "java.runtime.version" "11.0.13+8",
> "java.vendor" "Eclipse Adoptium",
> "java.vendor.url" "https://adoptium.net/",
> "java.version" "11.0.13",
> "java.vm.name" "OpenJDK 64-Bit Server VM",
> "java.vm.version" "11.0.13+8",
> "os.name" "Linux",
> "os.version" "4.14.252-195.483.amzn2.x86_64",
> "user.language" "en",
> "user.timezone" "GMT"}
>
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:25+01:00 INFO metabase.plugins Loading plugins in /plugins...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:25+01:00 INFO metabase.util.files Extract file /modules/sparksql.metabase-driver.jar -> /plugins/sparksql.metabase-driver.jar
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.util.files Extract file /modules/bigquery.metabase-driver.jar -> /plugins/bigquery.metabase-driver.jar
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.util.files Extract file /modules/sqlserver.metabase-driver.jar -> /plugins/sqlserver.metabase-driver.jar
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.util.files Extract file /modules/presto-jdbc.metabase-driver.jar -> /plugins/presto-jdbc.metabase-driver.jar
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.util.files Extract file /modules/sqlite.metabase-driver.jar -> /plugins/sqlite.metabase-driver.jar
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.util.files Extract file /modules/vertica.metabase-driver.jar -> /plugins/vertica.metabase-driver.jar
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.util.files Extract file /modules/redshift.metabase-driver.jar -> /plugins/redshift.metabase-driver.jar
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.util.files Extract file /modules/presto.metabase-driver.jar -> /plugins/presto.metabase-driver.jar
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.util.files Extract file /modules/druid.metabase-driver.jar -> /plugins/druid.metabase-driver.jar
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.util.files Extract file /modules/mongo.metabase-driver.jar -> /plugins/mongo.metabase-driver.jar
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.util.files Extract file /modules/google.metabase-driver.jar -> /plugins/google.metabase-driver.jar
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.util.files Extract file /modules/oracle.metabase-driver.jar -> /plugins/oracle.metabase-driver.jar
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.util.files Extract file /modules/snowflake.metabase-driver.jar -> /plugins/snowflake.metabase-driver.jar
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.util.files Extract file /modules/googleanalytics.metabase-driver.jar -> /plugins/googleanalytics.metabase-driver.jar
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.util.files Extract file /modules/bigquery-cloud-sdk.metabase-driver.jar -> /plugins/bigquery-cloud-sdk.metabase-driver.jar
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.util.files Extract file /modules/presto-common.metabase-driver.jar -> /plugins/presto-common.metabase-driver.jar
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 DEBUG metabase.plugins.lazy-loaded-driver Registering lazy loading driver :hive-like...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.driver.impl Registered abstract driver :hive-like (parents: [:sql-jdbc]) :truck:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 DEBUG metabase.plugins.lazy-loaded-driver Registering lazy loading driver :sparksql...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.driver.impl Registered driver :sparksql (parents: [:hive-like]) :truck:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.plugins.dependencies Plugin 'Metabase BigQuery Driver' depends on plugin 'Metabase Google Drivers Shared Dependencies'
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.plugins.dependencies Plugins with unsatisfied deps: ["Metabase BigQuery Driver"]
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.plugins.dependencies Metabase BigQuery Driver dependency {:plugin Metabase Google Drivers Shared Dependencies} satisfied? false
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 DEBUG metabase.plugins.lazy-loaded-driver Registering lazy loading driver :sqlserver...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.driver.impl Registered driver :sqlserver (parents: [:sql-jdbc]) :truck:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.plugins.dependencies Plugin 'Metabase Presto JDBC Driver' depends on plugin 'Presto Common Driver'
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.plugins.dependencies Metabase Presto JDBC Driver dependency {:plugin Presto Common Driver} satisfied? false
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.plugins.dependencies Plugins with unsatisfied deps: ["Metabase Presto JDBC Driver" "Metabase BigQuery Driver"]
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 DEBUG metabase.plugins.lazy-loaded-driver Registering lazy loading driver :sqlite...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.driver.impl Registered driver :sqlite (parents: [:sql-jdbc]) :truck:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.plugins.dependencies Metabase cannot initialize plugin Metabase Vertica Driver due to required dependencies. Metabase requires the Vertica JDBC driver in order to connect to Vertica databases, but we can't ship it as part of Metabase due to licensing restrictions. See Redirecting… for more details.
>
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.plugins.dependencies Metabase Vertica Driver dependency {:class com.vertica.jdbc.Driver} satisfied? false
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.plugins.dependencies Plugins with unsatisfied deps: ["Metabase Vertica Driver" "Metabase Presto JDBC Driver" "Metabase BigQuery Driver"]
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 DEBUG metabase.plugins.lazy-loaded-driver Registering lazy loading driver :redshift...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.driver.impl Registered driver :redshift (parents: [:postgres]) :truck:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.plugins.dependencies Plugin 'Metabase Presto Driver' depends on plugin 'Presto Common Driver'
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.plugins.dependencies Metabase Presto Driver dependency {:plugin Presto Common Driver} satisfied? false
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.plugins.dependencies Plugins with unsatisfied deps: ["Metabase Vertica Driver" "Metabase Presto JDBC Driver" "Metabase BigQuery Driver" "Metabase Presto Driver"]
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 DEBUG metabase.plugins.lazy-loaded-driver Registering lazy loading driver :druid...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.driver.impl Registered driver :druid :truck:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 DEBUG metabase.plugins.lazy-loaded-driver Registering lazy loading driver :mongo...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.driver.impl Registered driver :mongo :truck:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 DEBUG metabase.plugins.lazy-loaded-driver Registering lazy loading driver :google...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.driver.impl Registered abstract driver :google :truck:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.plugins.dependencies Metabase BigQuery Driver dependency {:plugin Metabase Google Drivers Shared Dependencies} satisfied? true
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 DEBUG metabase.plugins.initialize Dependencies satisfied; these plugins will now be loaded: ["Metabase BigQuery Driver"]
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 WARN metabase.plugins.lazy-loaded-driver Warning: plugin manifest for :bigquery does not include connection properties
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 DEBUG metabase.plugins.lazy-loaded-driver Registering lazy loading driver :bigquery...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.driver.impl Registered driver :bigquery (parents: [:sql :google]) :truck:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.plugins.dependencies Metabase cannot initialize plugin Metabase Oracle Driver due to required dependencies. Metabase requires the Oracle JDBC driver in order to connect to Oracle databases, but we can't ship it as part of Metabase due to licensing restrictions. See Redirecting… for more details.
>
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.plugins.dependencies Metabase Oracle Driver dependency {:class oracle.jdbc.OracleDriver} satisfied? false
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 INFO metabase.plugins.dependencies Plugins with unsatisfied deps: ["Metabase Vertica Driver" "Metabase Presto JDBC Driver" "Metabase Oracle Driver" "Metabase Presto Driver"]
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:26+01:00 DEBUG metabase.plugins.lazy-loaded-driver Registering lazy loading driver :snowflake...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:27+01:00 INFO metabase.driver.impl Registered driver :snowflake (parents: [:sql-jdbc]) :truck:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:27+01:00 INFO metabase.plugins.dependencies Metabase Google Analytics Driver dependency {:plugin Metabase Google Drivers Shared Dependencies} satisfied? true
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:27+01:00 INFO metabase.plugins.dependencies Plugin 'Metabase Google Analytics Driver' depends on plugin 'Metabase Google Drivers Shared Dependencies'
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:27+01:00 INFO metabase.driver.impl Registered driver :googleanalytics (parents: [:google]) :truck:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:27+01:00 DEBUG metabase.plugins.lazy-loaded-driver Registering lazy loading driver :googleanalytics...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:27+01:00 DEBUG metabase.plugins.lazy-loaded-driver Registering lazy loading driver :bigquery-cloud-sdk...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:27+01:00 INFO metabase.driver.impl Registered driver :bigquery-cloud-sdk (parents: [:sql]) :truck:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:27+01:00 DEBUG metabase.plugins.lazy-loaded-driver Registering lazy loading driver :presto-common...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:27+01:00 INFO metabase.driver.impl Registered abstract driver :presto-common :truck:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:27+01:00 INFO metabase.plugins.dependencies Metabase Presto JDBC Driver dependency {:plugin Presto Common Driver} satisfied? true
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:27+01:00 DEBUG metabase.plugins.initialize Dependencies satisfied; these plugins will now be loaded: ["Metabase Presto Driver" "Metabase Presto JDBC Driver"]
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:27+01:00 INFO metabase.plugins.dependencies Metabase Presto Driver dependency {:plugin Presto Common Driver} satisfied? true
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:27+01:00 DEBUG metabase.plugins.lazy-loaded-driver Registering lazy loading driver :presto...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:27+01:00 INFO metabase.driver.impl Registered driver :presto (parents: [:presto-common]) :truck:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:27+01:00 INFO metabase.driver.impl Registered driver :presto-jdbc (parents: [:presto-common]) :truck:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:27+01:00 DEBUG metabase.plugins.lazy-loaded-driver Registering lazy loading driver :presto-jdbc...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:27+01:00 INFO metabase.core Setting up and migrating Metabase DB. Please sit tight, this may take a minute...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:27+01:00 INFO metabase.db.setup Verifying postgres Database Connection ...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:27+01:00 INFO metabase.db.setup Successfully verified PostgreSQL 11.9 application database connection. :white_check_mark:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:27+01:00 INFO metabase.db.setup Running Database Migrations...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:27+01:00 INFO metabase.db.setup Setting up Liquibase...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:27+01:00 INFO metabase.db.setup Liquibase is ready.
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:27+01:00 INFO metabase.db.liquibase Checking if Database has unrun migrations...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:29+01:00 INFO metabase.db.liquibase Database has unrun migrations. Waiting for migration lock to be cleared...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:29+01:00 INFO metabase.db.liquibase Migration lock is cleared. Running migrations...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:35+01:00 INFO metabase.db.setup Database Migrations Current ... :white_check_mark:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:35+01:00 INFO metabase.db.data-migrations Running all necessary data migrations, this may take a minute.
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:35+01:00 INFO metabase.db.data-migrations Finished running data migrations.
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:35+01:00 INFO metabase.events Starting events listener: :metabase.events.dependencies/Dependencies :ear:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:35+01:00 INFO metabase.events Starting events listener: :metabase.events.activity-feed/ActivityFeed :ear:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:35+01:00 INFO metabase.events Starting events listener: :metabase.events.metabot-lifecycle/MetaBotLifecycle :ear:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:35+01:00 INFO metabase.events Starting events listener: :metabase.events.last-login/LastLogin :ear:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:35+01:00 INFO metabase.events Starting events listener: :metabase.events.notifications/Notifications :ear:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:35+01:00 INFO metabase.events Starting events listener: :metabase.events.sync-database/Sync :ear:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:35+01:00 INFO metabase.events Starting events listener: :metabase.events.driver-notifications/DriverNotifications :ear:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:35+01:00 INFO metabase.events Starting events listener: :metabase.events.revision/Revisions :ear:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:35+01:00 INFO metabase.events Starting events listener: :metabase.events.view-log/ViewLog :ear:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:35+01:00 INFO metabase.task Initializing task SyncDatabases :calendar:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:35+01:00 INFO metabase.task Initializing task CheckForNewVersions :calendar:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:35+01:00 INFO metabase.task Initializing task SendAnonymousUsageStats :calendar:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:35+01:00 INFO metabase.task Initializing task SendAbandomentEmails :calendar:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:35+01:00 INFO metabase.task Initializing task SendPulses :calendar:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:35+01:00 INFO metabase.task Initializing task SendFollowUpEmails :calendar:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:35+01:00 INFO metabase.task Initializing task TaskHistoryCleanup :calendar:
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:35+01:00 INFO metabase.core Metabase Initialization COMPLETE
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:36+01:00 INFO metabase.util.i18n.impl Reading available locales from locales.clj...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:38+01:00 INFO metabase.driver.impl Initializing driver :sql-jdbc...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:38+01:00 INFO metabase.driver.impl Initializing driver :sql...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:38+01:00 INFO metabase.driver.impl Initializing driver :postgres...
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:54+01:00 DEBUG metabase.server.middleware.log GET /api/session/properties 200 46.5 ms (2 DB calls) App DB connections: 0/4 Jetty threads: 4/150 (3 idle, 0 queued) (43 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:54+01:00 DEBUG metabase.server.middleware.log GET /api/user/current 200 310.9 ms (3 DB calls) App DB connections: 0/4 Jetty threads: 3/150 (3 idle, 0 queued) (43 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:55+01:00 DEBUG metabase.server.middleware.log GET /api/database 200 32.9 ms (4 DB calls) App DB connections: 1/4 Jetty threads: 4/150 (3 idle, 0 queued) (43 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:55+01:00 DEBUG metabase.server.middleware.log GET /api/collection/root/items 200 148.8 ms (4 DB calls) App DB connections: 0/4 Jetty threads: 3/150 (3 idle, 0 queued) (43 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:56+01:00 DEBUG metabase.server.middleware.log GET /api/collection/root 200 13.5 ms (2 DB calls) App DB connections: 0/4 Jetty threads: 3/150 (3 idle, 0 queued) (43 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:32:56+01:00 DEBUG metabase.server.middleware.log GET /api/collection/root/items 200 17.4 ms (4 DB calls) App DB connections: 0/4 Jetty threads: 3/150 (4 idle, 0 queued) (43 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:33:10+01:00 DEBUG metabase.server.middleware.log GET /api/setting 200 2.4 ms (0 DB calls) App DB connections: 0/4 Jetty threads: 3/150 (4 idle, 0 queued) (43 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:33:10+01:00 DEBUG metabase.server.middleware.log GET /api/session/properties 200 29.2 ms (2 DB calls) App DB connections: 0/4 Jetty threads: 4/150 (3 idle, 0 queued) (43 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:33:10+01:00 DEBUG metabase.server.middleware.log GET /api/setup/admin_checklist 200 37.5 ms (11 DB calls) App DB connections: 0/4 Jetty threads: 3/150 (4 idle, 0 queued) (44 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:33:16+01:00 DEBUG metabase.server.middleware.log GET /api/setup/admin_checklist 200 15.5 ms (11 DB calls) App DB connections: 0/4 Jetty threads: 3/150 (4 idle, 0 queued) (44 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:33:19+01:00 DEBUG metabase.server.middleware.log GET /api/activity/recent_views 200 25.1 ms (1 DB calls) App DB connections: 0/4 Jetty threads: 3/150 (4 idle, 0 queued) (44 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:33:29+01:00 DEBUG metabase.server.middleware.log GET /api/search 200 7.9 s (3 DB calls) App DB connections: 3/7 Jetty threads: 6/150 (1 idle, 0 queued) (45 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:33:31+01:00 DEBUG metabase.server.middleware.log GET /api/search 200 8.6 s (3 DB calls) App DB connections: 2/7 Jetty threads: 5/150 (2 idle, 0 queued) (45 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:33:32+01:00 DEBUG metabase.server.middleware.log GET /api/search 200 8.8 s (3 DB calls) App DB connections: 1/7 Jetty threads: 4/150 (3 idle, 0 queued) (45 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:33:32+01:00 DEBUG metabase.server.middleware.log GET /api/search 200 9.4 s (3 DB calls) App DB connections: 0/7 Jetty threads: 3/150 (4 idle, 0 queued) (45 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:33:34+01:00 DEBUG metabase.server.middleware.log GET /api/database/3/schemas 200 34.6 ms (4 DB calls) App DB connections: 0/7 Jetty threads: 3/150 (5 idle, 0 queued) (45 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:34:06+01:00 DEBUG metabase.server.middleware.log GET /api/user/current 200 10.2 ms (3 DB calls) App DB connections: 0/7 Jetty threads: 4/150 (4 idle, 0 queued) (45 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:34:06+01:00 DEBUG metabase.server.middleware.log GET /api/session/properties 200 7.9 ms (2 DB calls) App DB connections: 1/7 Jetty threads: 4/150 (4 idle, 0 queued) (45 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:34:06+01:00 DEBUG metabase.server.middleware.log GET /api/database 200 7.3 ms (3 DB calls) App DB connections: 0/7 Jetty threads: 3/150 (5 idle, 0 queued) (45 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:34:06+01:00 DEBUG metabase.server.middleware.log GET /api/collection/root/items 200 8.2 ms (4 DB calls) App DB connections: 0/7 Jetty threads: 3/150 (5 idle, 0 queued) (45 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:34:06+01:00 DEBUG metabase.server.middleware.log GET /api/collection/root 200 4.2 ms (2 DB calls) App DB connections: 0/7 Jetty threads: 3/150 (5 idle, 0 queued) (45 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:34:06+01:00 DEBUG metabase.server.middleware.log GET /api/collection/root/items 200 16.4 ms (4 DB calls) App DB connections: 0/7 Jetty threads: 3/150 (5 idle, 0 queued) (45 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:34:08+01:00 DEBUG metabase.server.middleware.log GET /api/setting 200 922.3 µs (0 DB calls) App DB connections: 0/7 Jetty threads: 4/150 (4 idle, 0 queued) (45 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:34:08+01:00 DEBUG metabase.server.middleware.log GET /api/session/properties 200 10.4 ms (2 DB calls) App DB connections: 0/7 Jetty threads: 4/150 (4 idle, 0 queued) (45 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:34:08+01:00 DEBUG metabase.server.middleware.log GET /api/setup/admin_checklist 200 15.3 ms (11 DB calls) App DB connections: 0/7 Jetty threads: 3/150 (5 idle, 0 queued) (45 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:34:10+01:00 DEBUG metabase.server.middleware.log GET /api/util/bug_report_details 200 5.7 ms (1 DB calls) App DB connections: 0/7 Jetty threads: 3/150 (5 idle, 0 queued) (45 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:34:11+01:00 DEBUG metabase.server.middleware.log GET /api/task/info 200 9.2 ms (0 DB calls) App DB connections: 0/7 Jetty threads: 3/150 (5 idle, 0 queued) (45 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:34:13+01:00 DEBUG metabase.server.middleware.log GET /api/task/info 200 1.4 ms (0 DB calls) App DB connections: 0/7 Jetty threads: 3/150 (5 idle, 0 queued) (45 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:34:23+01:00 DEBUG metabase.server.middleware.log GET /api/task/info 200 1.5 ms (0 DB calls) App DB connections: 1/7 Jetty threads: 3/150 (5 idle, 0 queued) (45 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:34:30+01:00 DEBUG metabase.server.middleware.log GET /api/task 200 18.7 ms (2 DB calls) App DB connections: 0/7 Jetty threads: 3/150 (4 idle, 0 queued) (44 total active threads) Queries in flight: 0 (0 queued)
> [a4dd8ba8-5d91-4dc9-8978-48576373f8db] 2021-12-02T20:34:32+01:00 DEBUG metabase.server.middleware.log GET /api/util/bug_report_details 200 2.5 ms (1 DB calls) App DB connections: 0/7 Jetty threads: 3/150 (4 idle, 0 queued) (44 total active threads) Queries in flight: 0 (0 queued)
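For digging through logs like these, a small script that pulls the request duration out of each `metabase.server.middleware.log` line makes the slow endpoints stand out immediately (a sketch; the regex assumes the exact format shown above, with durations in µs, ms, or s):

```python
import re

# Matches the method, path, status, and duration in lines like:
#   ... GET /api/search 200 9.4 s (3 DB calls) ...
#   ... GET /api/setting 200 2.4 ms (0 DB calls) ...
LINE = re.compile(r"(GET|POST|PUT|DELETE) (\S+) \d+ ([\d.]+) (µs|ms|s) ")

TO_MS = {"µs": 0.001, "ms": 1.0, "s": 1000.0}

def slow_requests(lines, threshold_ms=1000.0):
    """Return (duration_ms, method, path) for requests over the threshold,
    slowest first."""
    hits = []
    for line in lines:
        m = LINE.search(line)
        if m:
            method, path, value, unit = m.groups()
            ms = float(value) * TO_MS[unit]
            if ms >= threshold_ms:
                hits.append((ms, method, path))
    return sorted(hits, reverse=True)
```

Run over the block above, it flags only the four `/api/search` calls, which lines up with @flamber's observation below.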

I'm the only user, and look at the Network In: I only opened the main page of Metabase. No dashboard, nothing:

Just to confirm, I'm the only user at 20:30 (8:30 PM):

The instance goes straight from startup to a degraded state, which is crazy.

We're going to try switching to a higher plan for the Heroku database to see if it changes anything.

Do you have other suggestions to try before we even consider switching to ECS?

Thanks a lot for your time on this issue.

@Louis I don't know if it is related, but `/api/search 200 9.4 s` looks very bad. Search should be very quick. Perhaps it's somehow connected to the spike, since Metabase makes queries to the appdb, which returns the results that Metabase then manipulates and shows to the end user.

@flamber
It's almost always the case that the search takes a few seconds before returning results.
I'm currently doing some testing. What is crazy to me is that I can load all the problematic dashboards now (and it's super fast), but we were totally unable to do the same thing a few hours ago.
Why does Metabase have so many ups and downs in terms of performance? I know we discussed this a lot, but we must be missing something big here.

Could it be because I do expensive work with dbt to refresh my models every 2 hours?

@Louis Post the row count and "physical" size of all tables in the application database.
Think you might be "abusing" Metabase in a way that wasn't intended.
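For reference, a query along these lines against the Metabase application database produces that listing (a sketch for Postgres; row counts come from the planner's statistics, so they are estimates, not exact counts):

```sql
-- Approximate row counts and on-disk sizes of all relations, largest first.
SELECT
    n.nspname || '.' || c.relname                 AS relation,
    pg_size_pretty(pg_total_relation_size(c.oid)) AS total_size,
    c.reltuples::bigint                           AS row_count_estimate
FROM pg_class c
JOIN pg_namespace n ON n.oid = c.relnamespace
WHERE n.nspname NOT IN ('pg_catalog', 'information_schema')
ORDER BY pg_total_relation_size(c.oid) DESC;
```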

I have seen huge installations with thousands of users running on a couple of instances, so I know Metabase can perform well, but I have also seen installations with just a couple of users that were dragged to their knees because of some obscure overload somewhere.

@Louis I was asking for the Metabase application database information, not your warehouse information.

Having everything in different locations just adds latency.

Please just go and create a trial of Metabase Cloud and see if that works for you: https://www.metabase.com/pricing/
Then we can see the connection information.

@flamber
I'm sorry I sent you the wrong data. Here are the Metabase application table row counts and sizes:

relation total_size row_count_estimate
public.query 3816 MB 1 373 200
public.query_execution 1589 MB 4 640 911
public.view_log 202 MB 686 947
public.query_cache 175 MB 72 277
public.metabase_field 116 MB 187 775
public.revision 60 MB 27 716
public.task_history 22 MB 96 311
public.report_dashboardcard 16 MB 12 536
public.activity 14 MB 31 834
public.metabase_fieldvalues 12 MB 33 335
public.report_card 11 MB 2 097
public.metabase_table 5024 kB 7 992
public.login_history 3560 kB 4 613
public.core_session 1584 kB 5 182
public.databasechangelog 1128 kB 501
public.report_dashboard 1104 kB 667
public.permissions 464 kB 941
public.qrtz_triggers 296 kB 12
public.setting 240 kB 35
public.collection 168 kB 281
public.qrtz_fired_triggers 160 kB 1
public.core_user 136 kB 57
public.qrtz_job_details 96 kB 8
public.permissions_group_membership 88 kB 113
public.metric 80 kB 2
public.native_query_snippet 80 kB 3
public.collection_permission_graph_revision 80 kB 1
public.permissions_revision 72 kB 19
public.spark_client_reports 64 kB 140
public.pulse_channel 64 kB 10
public.qrtz_cron_triggers 64 kB 12
public.pulse 64 kB 10
public.pulse_card 56 kB 10
public.qrtz_scheduler_state 56 kB 1
public.databasechangeloglock 56 kB 1
public.permissions_group 56 kB 4
public.spark_kpis 48 kB 19
public.spark_levels 48 kB 7
public.spark_platforms 48 kB 6
public.segment 40 kB 0
public.dependency 40 kB 0
public.data_migrations 40 kB 15
public.group_table_access_policy 32 kB 0
public.report_cardfavorite 32 kB 0
public.metabase_database 32 kB 4
public.metric_important_field 32 kB 0
public.card_label 32 kB 0
public.spark_clients 32 kB 59
public.dimension 32 kB 0
public.label 32 kB 0
public.dashboard_favorite 32 kB 0
public.qrtz_locks 24 kB 2
public.pulse_channel_recipient 24 kB 20
public.moderation_review 24 kB 0
public.dashboardcard_series 24 kB 0
public.computation_job_result 16 kB 0
public.qrtz_calendars 16 kB 0
public.qrtz_simprop_triggers 16 kB 0
public.qrtz_blob_triggers 16 kB 0
public.computation_job 16 kB 0
public.qrtz_simple_triggers 16 kB 0
public.task_history_id_seq 8192 bytes 1
public.view_log_id_seq 8192 bytes 1
public.activity_id_seq 8192 bytes 1
public.collection_permission_graph_revision_id_seq 8192 bytes 1
public.moderation_review_id_seq 8192 bytes 1
public.report_dashboardcard_id_seq 8192 bytes 1
public.group_table_access_policy_id_seq 8192 bytes 1
public.label_id_seq 8192 bytes 1
public.dependency_id_seq 8192 bytes 1
public.login_history_id_seq 8192 bytes 1
public.metabase_database_id_seq 8192 bytes 1
public.metabase_field_id_seq 8192 bytes 1
public.metabase_fieldvalues_id_seq 8192 bytes 1
public.metabase_table_id_seq 8192 bytes 1
public.metric_id_seq 8192 bytes 1
public.metric_important_field_id_seq 8192 bytes 1
public.native_query_snippet_id_seq 8192 bytes 1
public.permissions_group_id_seq 8192 bytes 1
public.permissions_group_membership_id_seq 8192 bytes 1
public.permissions_id_seq 8192 bytes 1
public.permissions_revision_id_seq 8192 bytes 1
public.pulse_card_id_seq 8192 bytes 1
public.pulse_channel_id_seq 8192 bytes 1
public.pulse_channel_recipient_id_seq 8192 bytes 1
public.pulse_id_seq 8192 bytes 1
public.qrtz_paused_trigger_grps 8192 bytes 0
public.dashboardcard_series_id_seq 8192 bytes 1
public.dashboard_favorite_id_seq 8192 bytes 1
public.core_user_id_seq 8192 bytes 1
public.query_execution_id_seq 8192 bytes 1
public.report_card_id_seq 8192 bytes 1
public.report_cardfavorite_id_seq 8192 bytes 1
public.computation_job_result_id_seq 8192 bytes 1
public.report_dashboard_id_seq 8192 bytes 1
public.computation_job_id_seq 8192 bytes 1
public.dimension_id_seq 8192 bytes 1
public.revision_id_seq 8192 bytes 1
public.collection_id_seq 8192 bytes 1
public.segment_id_seq 8192 bytes 1
public.spark_client_reports_id_seq 8192 bytes 1
public.spark_clients_id_seq 8192 bytes 1
public.card_label_id_seq 8192 bytes 1
public.pg_stat_statements 0 bytes 0
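The `query` and `query_execution` tables clearly dominate the size. If old execution history turns out to be the drag, something along these lines could trim it (a sketch only, assuming the `started_at` timestamp column and that 30 days of history is enough; back up the application database before trying this, since it permanently removes usage history):

```sql
-- Sketch: drop query execution history older than 30 days,
-- then reclaim the disk space. Note that VACUUM FULL takes an
-- exclusive lock on the table while it runs.
DELETE FROM query_execution
WHERE started_at < now() - interval '30 days';

VACUUM FULL query_execution;
```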

> Having everything in different locations just adds latency.

What do you mean? We created a Postgres RDS as recommended in the official documentation. Our warehouse is in fact located in the same region as the RDS and is also hosted on AWS.

We'd prefer to understand the problem in our configuration first.

Thanks in advance.