Problem with the Spark SQL driver

Metabase version: 0.32.9
OS: CentOS 7
Database: PostgreSQL

When I try to connect to the Hive database, I get the following error:
Timed out after 5,000 milliseconds.

Looking at the logs, I found that when I re-save an existing Spark connection or create a new one, Metabase automatically loads sparksql.metabase-driver.jar.

I tried the following methods:

  1. Deleting this jar, leaving only the old Spark driver package.
  2. Renaming the old Spark driver package to sparksql.metabase-driver.jar.

With the first method, the problem still exists in the CentOS 7 environment (it works under Windows 10 locally). With the second method, after renaming and restarting, Spark no longer appears in the list of database types.

At the moment I don't see any other way to deal with this, so please help me: I mainly use Spark SQL to get my data. Thank you!

GET /api/database/2 200 14 ms (3 DB calls) Jetty threads: 3/50 (4 idle, 0 queued) (41 total active threads) Queries in flight: 0

07-02 09:11:17 INFO metabase.driver :: Initializing driver :hive-like...

07-02 09:11:17 INFO plugins.classloader :: Added URL file:/usr/metabase/plugins/sparksql.metabase-driver.jar to classpath

07-02 09:11:17 DEBUG plugins.init-steps :: Loading plugin namespace metabase.driver.sparksql...

07-02 09:11:17 INFO metabase.driver :: Registered abstract driver :hive-like (parents: :sql-jdbc) 

07-02 09:11:17 INFO metabase.driver :: Registered driver :sparksql (parents: :hive-like) 

07-02 09:11:18 DEBUG plugins.jdbc-proxy :: Registering JDBC proxy driver for class metabase.driver.FixedHiveDriver...

Load lazy loading driver :hive-like took 1 s

07-02 09:11:18 INFO metabase.driver :: Initializing driver :sparksql...

07-02 09:11:18 DEBUG plugins.init-steps :: Loading plugin namespace metabase.driver.sparksql...

07-02 09:11:18 DEBUG plugins.jdbc-proxy :: Registering JDBC proxy driver for class metabase.driver.FixedHiveDriver...

Load lazy loading driver :sparksql took 909 µs

07-02 09:11:22 DEBUG middleware.log :: PUT /api/database/2 400 5 s (1 DB calls)

{:valid false, :dbname "Timed out after 5,000 milliseconds.", :message "Timed out after 5,000 milliseconds."}

Hi @jianbo
So if you replace sparksql.metabase-driver.jar with your own custom version, make sure to run touch sparksql.metabase-driver.jar to update the file timestamp, so it doesn't get replaced at the next startup.
You might have to edit the file metabase-plugin.yaml inside the jar file (it's just a simple zip file) and make sure the version is higher in your custom version, or that the name is different (I'm not sure how namespaces are handled).
Also, check the Metabase log at startup to make sure it actually pre-loads the correct driver.
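
For reference, a rough sketch of those steps on the CentOS box (the plugins path comes from your log; the editor and the exact contents of metabase-plugin.yaml are assumptions, so check the extracted file before changing anything):

  cd /usr/metabase/plugins
  # Pull the manifest out of the custom driver jar so it can be edited (a jar is just a zip archive).
  unzip -o sparksql.metabase-driver.jar metabase-plugin.yaml
  # Bump the version in metabase-plugin.yaml so it is higher than the bundled driver's,
  # then write the edited file back into the jar.
  vi metabase-plugin.yaml
  zip sparksql.metabase-driver.jar metabase-plugin.yaml
  # Refresh the timestamp so Metabase does not replace the jar at the next startup.
  touch sparksql.metabase-driver.jar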