Connecting Metabase with Databricks Catalogs: Override Default hive_metastore Connection?

Hi!

I'm trying to connect Metabase with Databricks catalogs, but the default connection seems to be limited to hive_metastore. Is there a way to switch to other catalogs?

Has anyone successfully connected Metabase with Databricks catalogs outside of hive_metastore? If you have any insights or workarounds, I'd greatly appreciate your help!

Thanks,
Kevin

Metabase can’t connect to Databricks natively yet; you’ll need to use the unofficial driver: GitHub - relferreira/metabase-sparksql-databricks-driver
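In case you haven’t set it up before: community drivers like this one are generally installed by downloading the driver’s .jar from the repository’s releases page, placing it in Metabase’s plugins/ directory next to metabase.jar (the exact jar name depends on the release you download), and then restarting Metabase so the driver is picked up.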

Hello Luiggi,

Thank you for your response.

I appreciate your suggestion of using the unofficial driver, but I want to clarify that I'm already using it. However, I'm still encountering the limitation of only being able to connect to the default hive_metastore.

To provide a clearer picture of the situation, let's consider the catalogs: catalog_A, catalog_B, and catalog_C, along with the default hive_metastore. Despite my efforts, I can only establish a connection with the hive_metastore and not with the specific catalogs.

I hope this clarifies the issue.

Best regards,
Kevin

Thanks for your feedback. Since we don’t control the development of that driver, could you check with the developer about fixing that issue?

Hi Kevin, I was facing the same problem today.

I believe you already have a JDBC URL with UID and PWD configured, right? If so, you also have to add ConnCatalog (and optionally ConnSchema) to the connection string, e.g. UID=token;PWD=<your-token>;ConnCatalog=<catalog>;ConnSchema=<schema>
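For example, with one of the catalogs you mentioned (catalog_A and default here are just placeholders for your own catalog and schema), the options appended to the JDBC URL might look something like this:

UID=token;PWD=<your-personal-access-token>;ConnCatalog=catalog_A;ConnSchema=default

ConnCatalog is the part that switches the connection away from the default hive_metastore; the rest of the URL (host, httpPath, etc.) stays the same.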

I hope this helps you!

Best regards.
Gustavo


Hi Gustavo,

Thank you for your support. It worked!

Best regards.
Kevin