GitHub, Deep Copy, Cloning Dashboards and Underlying Tables

We built a great report for one of our clients that leverages an aggregated data table in SQL Server. Before the concept of cascading filters arrived in v37, we had to create a separate aggregate table per client, add it to Metabase, and then rebuild all of the questions from a master example dashboard. I thought there must be an easier way. After looking through the discussion forum, I found a Python-based GitHub solution that allows me to deep copy dashboards and questions: https://github.com/vvaezian/metabase_api_python . Well, I'm not super technical, so I had one of our technical people help me set up my environment, and I was successfully able to clone and deep copy a dashboard. This created a new dashboard and independent copies of the questions on that dashboard. The problem is that because the data connection is different, there doesn't seem to be an easy way to update the table underlying each question without recreating the question from scratch.

I think a lot of the heavy lifting is solved by the GitHub solution (thank you vvaezian), but I needed to go into the underlying tables to update specific information, and this is where I got a little confused. I would welcome any help or direction on how best to update these underlying tables.

I then needed to identify the following:

  1. Database ID for both the original cloned database and the new connection.
    a. Cloned database_id = 1; the new connection, which we will call database_id = 2.
  2. Cloned table
    a. Cloned table_id = 1; the new table, table_id = 2.
  3. Cloned collection
    a. New cloned collection, collection_id = 2.

With this information, I looked up records in the report_card table so that I could update some key fields:

  1. In the dataset_query field, I did a search and replace of database":1 with database":2.
  2. In the dataset_query field, I did a search and replace of source-table":1 with source-table":2.
  3. I copied the value of the dataset_query field into a text editor and isolated all of the field-id values, since the existing values relate to the original data connection. I used each original field-id to look up the text name of that field, then used the text name to look up the new field-id for the same field in the new connection. Once this was complete, I did a search and replace on report_card.dataset_query, swapping each old field-id (e.g., field-id = 1 from database_id = 1) for the new field-id (e.g., field-id = 2 from database_id = 2).
  4. Updated report_card.database_id = 2.
  5. Updated report_card.table_id = 2.
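For anyone scripting this instead of editing by hand, the replacement steps above can be sketched in Python. This is a minimal sketch, assuming report_card.dataset_query holds a JSON string; the field_id_map (old id → new id) is assumed to have been built by matching field names across the two connections, as described in step 3:

```python
import re

def remap_dataset_query(dataset_query: str,
                        old_db: int, new_db: int,
                        old_table: int, new_table: int,
                        field_id_map: dict) -> str:
    """Rewrite a report_card.dataset_query JSON string so it points at
    the new connection. field_id_map maps old field-ids to new ones and
    is assumed to have been built by matching field names across the
    old and new databases."""
    q = dataset_query.replace(f'"database":{old_db}', f'"database":{new_db}')
    q = q.replace(f'"source-table":{old_table}', f'"source-table":{new_table}')
    # Swap each old field-id for its counterpart in the new database.
    # \b prevents field-id 1 from also matching field-id 10, 11, etc.
    for old_id, new_id in field_id_map.items():
        q = re.sub(rf'("field-id",\s*){old_id}\b', rf'\g<1>{new_id}', q)
    return q
```

For example, remap_dataset_query(query, 1, 2, 1, 2, {1: 2}) would turn "database":1, "source-table":1, and ["field-id",1] into their database-2 equivalents in one pass.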

This seemed to work well: I was able to open the dashboard and see the new questions properly formatted. There was still a little bit of wonkiness, but I was working through those details when I decided to look for other tables that might contain legacy database, table, or field values needing updates.

That is when I found the report_dashboardcard.parameter_mappings column.

It looks like this column contains values such as parameter_id, field-id, and card_id that most likely reference the legacy data connection. The field-id is relatively self-explanatory and just needs the same replace logic I used in the dataset_query field. I'm a little stumped on what parameter_id is, though. I thought at first it might be related to the dashboard filters, so I tried removing all filters and saving the dashboard to see if the underlying table dropped the parameter IDs, but it does not. I also know that the parameter IDs are shared between the original dashboard and the new clone, and my guess is that this will cause a problem down the road.
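If it helps, here is a rough Python sketch of how I imagine updating that column. The JSON shape shown here is my assumption from poking at the column (not documented behavior), and my reading of parameter_id is also a guess:

```python
import json

def remap_parameter_mappings(raw: str,
                             field_id_map: dict,
                             card_id_map: dict) -> str:
    """Rewrite a report_dashboardcard.parameter_mappings value for a
    new connection. Assumes the column holds JSON shaped roughly like:
      [{"parameter_id": "...", "card_id": 1,
        "target": ["dimension", ["field-id", 10]]}]
    parameter_id appears to tie the mapping to a filter widget defined
    on the dashboard itself rather than to the data connection, so it
    is left alone here."""
    mappings = json.loads(raw)
    for m in mappings:
        # Point the mapping at the cloned card's id.
        m["card_id"] = card_id_map.get(m["card_id"], m["card_id"])
        # Remap the field-id inside the dimension target, if present.
        target = m.get("target")
        if target and target[0] == "dimension":
            dim = target[1]
            if isinstance(dim, list) and dim[0] == "field-id":
                dim[1] = field_id_map.get(dim[1], dim[1])
    return json.dumps(mappings)
```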

My overall question is: what tables do I need to update simply to have new questions in a new dashboard pointing to a new connection? This assumes that I've added the connection and need to update the underlying tables, as I do not believe the Python script I have will update that information. If I'm missing something, please let me know. Thank you for any advice.

Pinging @King_Edward (GitHub vvaezian)


Hi @Slickrock22,

As I understand it, you have been able to deep copy the dashboard, and now you are having trouble doing something with the duplicated dashboard.
If you provide an example of the thing you want to do, it would make things easier.
For example you said

I needed to go into the underlying tables to be able to update specific information

Give an example for this.

P.S. Dashboards are comprised of cards (AKA questions). Cards are based on database tables.

Thanks for the response, @King_Edward. When you deep copy via the Python API, it retains all the same database number and field number values. I want to change the database that the copied cards point to, which means I must also update the field numbers in certain tables. I think I know most of the columns that need to be updated, but I'm a little stumped on what the report_dashboardcard.parameter_mappings field is used for. I find field-id values in that column that most likely need to be updated so they match the revised field numbers of the new database connection.

By replacing the database that the card points to, I'm not sure what data other than the database id needs to be changed as well. This can be tested with a minimal example.

As I understand it, you have a blueprint for a set of cards and a dashboard for those cards, and you want to be able to change the database that these cards are using.
Instead of changing the database id of the cards, a probably better approach is to create new cards with the new database id that you want to use (using the create_card function) and then replace the cards in the duplicated dashboard with the newly created cards. All of this, of course, can be automated.
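A rough sketch of that approach. The payload shape here mirrors what a card API response roughly looks like, but treat it (and the create_card call at the end) as assumptions; check the library's README for the exact API:

```python
def build_card_payload(card: dict, new_db_id: int,
                       new_table_id: int, collection_id: int) -> dict:
    """Build a create-card payload that copies `card` (a dict shaped
    like a GET /api/card/:id response) but points it at the new
    database. Field-id remapping inside the query is omitted here
    for brevity; it would follow the same lookup-by-name logic
    discussed earlier in the thread."""
    query = dict(card['dataset_query'])       # shallow copies, so the
    inner = dict(query.get('query', {}))      # original card is untouched
    query['database'] = new_db_id
    inner['source-table'] = new_table_id
    query['query'] = inner
    return {
        'name': card['name'],
        'display': card.get('display', 'table'),
        'dataset_query': query,
        'visualization_settings': card.get('visualization_settings', {}),
        'collection_id': collection_id,
    }

# Hypothetical usage with the metabase_api_python wrapper:
# from metabase_api import Metabase_API
# mb = Metabase_API('https://metabase.example.com', 'user', 'password')
# new_card = mb.create_card(custom_json=build_card_payload(card, 2, 2, 2))
```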

@King_Edward Thanks! My understanding of Metabase is that database fields (columns) get assigned field-ids that are unique for each database/field combination. That would mean you need to update both database_id and the field-ids (which I did, and it works). I will investigate create_card. Thanks for the suggestion.