Hi everyone,
I've been working with the metabase-sync Python script from GitHub and have added the ability to import and export a collection as JSON. It's working great so far!
However, the datasource has to exist in the target instance before the import, so I added a step at the start of the import that creates it via POST /api/database, and that part is working fine.
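For context, this is roughly the kind of call I'm making, as a simplified sketch: the URL, engine, and connection details are placeholders for my test setup, and it assumes session-token auth via POST /api/session.

```python
import requests

METABASE_URL = "https://metabase.example.com"  # placeholder target instance


def get_session_token(username: str, password: str) -> str:
    """Log in and return a session token for the X-Metabase-Session header."""
    resp = requests.post(
        f"{METABASE_URL}/api/session",
        json={"username": username, "password": password},
    )
    resp.raise_for_status()
    return resp.json()["id"]


def create_datasource(token: str) -> int:
    """Create the target database via POST /api/database and return its id.

    The engine and connection details are placeholders (my test case is
    Postgres); adjust them for your own datasource.
    """
    payload = {
        "name": "Imported datasource",
        "engine": "postgres",
        "details": {
            "host": "db.example.com",
            "port": 5432,
            "dbname": "analytics",
            "user": "metabase_reader",
            "password": "********",
        },
    }
    resp = requests.post(
        f"{METABASE_URL}/api/database",
        json=payload,
        headers={"X-Metabase-Session": token},
    )
    resp.raise_for_status()
    return resp.json()["id"]
```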
My issue is with the table and field synchronization that Metabase kicks off after the database is created. For a large database this can take a while; in my test case it's around 5 minutes. That's fine in itself, but I haven't found a way to know when it has finished.
After creating the database, my script proceeds to create the collections and cards. It looks up the new table and field IDs in the target Metabase instance and substitutes them for the IDs in the JSON. Of course, if the sync isn't finished, some table or field IDs aren't available yet and my code raises an exception.
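The lookup step is essentially the following sketch (reusing METABASE_URL and the session token from the snippet above). It assumes GET /api/database/:id/metadata is the source of the new IDs; if the sync hasn't populated a table yet, the maps are incomplete and the remapping fails.

```python
import requests

METABASE_URL = "https://metabase.example.com"  # same placeholder as above


def build_id_maps(token: str, db_id: int) -> tuple[dict, dict]:
    """Build name -> id maps for tables and fields of the new database.

    If the sync is still running, tables or fields can be missing here,
    which is what later causes KeyErrors when remapping the exported JSON.
    """
    resp = requests.get(
        f"{METABASE_URL}/api/database/{db_id}/metadata",
        headers={"X-Metabase-Session": token},
    )
    resp.raise_for_status()
    metadata = resp.json()

    table_ids = {}  # table name -> new table id
    field_ids = {}  # (table name, field name) -> new field id
    for table in metadata.get("tables", []):
        table_ids[table["name"]] = table["id"]
        for field in table.get("fields", []):
            field_ids[(table["name"], field["name"])] = field["id"]
    return table_ids, field_ids
```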
If I could tell when the sync is finished after the database is created, I could have the program wait for it to complete before starting the import.
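Something like the wait loop below is what I have in mind, but the only completeness check I can think of is a heuristic: keep polling the metadata endpoint until all the tables referenced by the export show up. That's not the same as knowing the sync has actually finished (fields could still be syncing), so treat it as a sketch, not a solution.

```python
import time

import requests

METABASE_URL = "https://metabase.example.com"  # same placeholder as above


def wait_for_sync(token: str, db_id: int, expected_tables: set[str],
                  timeout: int = 600, interval: int = 15) -> None:
    """Poll the database metadata until all expected tables are visible.

    Heuristic only: it infers the sync is "done enough" once every table
    referenced by the export appears, rather than reading an explicit
    sync-status flag (which is what I'm actually looking for).
    """
    deadline = time.time() + timeout
    synced: set[str] = set()
    while time.time() < deadline:
        resp = requests.get(
            f"{METABASE_URL}/api/database/{db_id}/metadata",
            headers={"X-Metabase-Session": token},
        )
        resp.raise_for_status()
        synced = {t["name"] for t in resp.json().get("tables", [])}
        if expected_tables <= synced:
            return
        time.sleep(interval)
    raise TimeoutError(
        f"Database {db_id} did not expose {expected_tables - synced} "
        f"within {timeout}s"
    )
```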
I haven't found a proper sync-status check in the API, though. Any help with this timing issue would be greatly appreciated!