Spark SQL datasource refresh

Hi.

I read from a Spark SQL datasource (via the Thrift server) to visualize data on my dashboard. When the underlying table changes, my Metabase reports crash because the table definition on the Thrift side is stale and needs to be refreshed.

Is there a way to make Metabase run REFRESH TABLE when it encounters such a problem? Thanks.

You can do this programmatically, but you will need a process that detects the problem and then triggers the sync. You can look into two options:

  1. Endpoint with a session token: /api/database/:id/sync_schema or /api/database/:id/rescan_values, authenticating as a user and passing the session token in the header.

  2. Endpoint with an API key: /api/notify/db/:id, passing an API key that you define in the MB_API_KEY environment variable (this endpoint was made for notifying Metabase to sync after an ETL finishes). A sketch of both options follows this list.
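
Here is a minimal sketch of both options in Python, assuming Metabase runs at http://localhost:3000 with database id 2; the host, credentials, and id are placeholders, and the requests library is assumed to be available:

```python
import os

import requests

METABASE = "http://localhost:3000"  # placeholder host
DB_ID = 2                           # placeholder database id

# Option 1: authenticate as a user to get a session token,
# then trigger a schema sync with that token in the header.
token = requests.post(
    f"{METABASE}/api/session",
    json={"username": "admin@example.com", "password": "secret"},  # placeholders
).json()["id"]

requests.post(
    f"{METABASE}/api/database/{DB_ID}/sync_schema",
    headers={"X-Metabase-Session": token},
).raise_for_status()

# Option 2: hit the notify endpoint with the key set in MB_API_KEY.
requests.post(
    f"{METABASE}/api/notify/db/{DB_ID}",
    headers={"X-METABASE-APIKEY": os.environ["MB_API_KEY"]},
).raise_for_status()
```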

But in your case it's not an ETL finishing but an error, so you need a process that catches the exception and then calls one of the endpoints above; a rough sketch follows.
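
For example, a small watchdog could probe the table through the Thrift server and, on failure, run REFRESH TABLE and then resync Metabase. This is only a sketch under assumptions: the PyHive client, the host and table names, and the choice of the notify endpoint are all placeholders to adapt.

```python
import os

import requests
from pyhive import hive  # assumes the PyHive client is installed

THRIFT_HOST = "thrift.example.com"  # placeholder Thrift server host
TABLE = "default.my_table"          # placeholder table
METABASE = "http://localhost:3000"  # placeholder Metabase host
DB_ID = 2                           # placeholder database id

def probe_and_repair():
    """Run a cheap probe query; on a stale-metadata error, refresh and resync."""
    cursor = hive.connect(host=THRIFT_HOST, port=10000).cursor()
    try:
        cursor.execute(f"SELECT 1 FROM {TABLE} LIMIT 1")
        cursor.fetchall()
    except Exception:
        # Ask Spark to reload the table's cached file metadata...
        fresh = hive.connect(host=THRIFT_HOST, port=10000).cursor()
        fresh.execute(f"REFRESH TABLE {TABLE}")
        # ...then tell Metabase to resync its schema (option 2 above).
        requests.post(
            f"{METABASE}/api/notify/db/{DB_ID}",
            headers={"X-METABASE-APIKEY": os.environ["MB_API_KEY"]},
        ).raise_for_status()

if __name__ == "__main__":
    probe_and_repair()
```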

Thanks. Option 2 didn't work; I waited a couple of minutes and it still didn't refresh the data. I think the problem is that the Thrift server caches table definitions for a specific session, so what we need is more of a reconnect than a refresh of the structure. Is there an API to reconnect to the database?

I am not sure that will fix the issue, because if you reconnect, won't the new connection still see the Thrift server's cached tables?

Is there a setting to clear the caches on the Thrift server? I don't have much experience with Hive, so I don't really understand what's going on.