XML files in temp directory take a lot of space


I am seeing XML files in the directory below, and they take a lot of space:

fe1abc3cab645eaba30a369ed/diff/tmp/poifiles# ls -lrt
total 25507808
-rw-r--r-- 1 2000 2000 1027047424 Dec 3 05:36 poi-sxssf-sheet5626920146125491702.xml
-rw-r--r-- 1 2000 2000 1057964032 Dec 3 05:36 poi-sxssf-sheet4126340088364832789.xml
-rw-r--r-- 1 2000 2000 1076785152 Dec 3 05:36 poi-sxssf-sheet1767818576025409865.xml
-rw-r--r-- 1 2000 2000 1064091648 Dec 3 05:36 poi-sxssf-sheet13417142868094272759.xml
-rw-r--r-- 1 2000 2000 1034092544 Dec 3 05:36 poi-sxssf-sheet7878143755364483219.xml
-rw-r--r-- 1 2000 2000 0 Dec 3 05:45 poi-sxssf-sheet11991267903715050451.xml
-rw-r--r-- 1 2000 2000 2121744384 Dec 7 05:59 poi-sxssf-sheet9040492408459142761.xml

Are these files due to the export? Does it create 2,000-row chunks and then, at the end, bundle them and send them to the client?

If yes, can we point that directory somewhere else on the drive, where we can clean it up automatically?

Hi @spanda
There should not be any leftovers after it has generated the XLSX export, so something else must be going wrong that causes the cleanup to fail.

Yes, those are related to the XLSX export, as I mentioned in Metabase Crashes After Performing Export.

And yes, you can change the Java temp directory: https://stackoverflow.com/questions/1924136/environment-variable-to-control-java-io-tmpdir
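As a sketch, you would override the `java.io.tmpdir` system property when launching the JAR (the path here is an example, not your actual layout):

```shell
# Point the JVM's temp directory (where POI writes its spill files)
# at a dedicated location; the directory must exist before startup.
mkdir -p /data/metabase-tmp
java -Djava.io.tmpdir=/data/metabase-tmp -jar metabase.jar
```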

But changing the temp directory is not the solution. You'll need to figure out what is causing the cleanup to fail.
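That said, if you need a stopgap while you investigate, stale spill files can be cleared on a schedule. A minimal sketch, assuming the default temp location (adjust the path to wherever your `java.io.tmpdir` points):

```shell
# Delete POI SXSSF spill files that have not been modified for 2+ hours.
# A live export keeps writing to its file, so a generous age threshold
# avoids deleting in-progress exports; still, only run this against the
# poifiles directory, never against all of /tmp.
find /tmp/poifiles -name 'poi-sxssf-sheet*.xml' -mmin +120 -delete
```

You could run this from cron, but it only hides the symptom; the leftover files themselves point at a failed or interrupted export.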


For a huge number of records, exporting to XLSX takes time.

If in the meantime you click, say, twice more (because there is no way in the UI to know that your download is in progress), it will create 3 XML files in the temp folder for 3 clicks.
Once the data has accumulated, the XLSX file is assembled and the download starts, which shows in the browser.

Now the problem is that not all 3 files get converted to XLSX; only one does.
The other 2 XML files stay behind, which is the problem.

What's your suggestion?

@spanda This one is tricky to reproduce. I can reproduce the temporary files when I download multiple times simultaneously, but after a few minutes those files are all removed.
The files are also removed if I shut down Metabase.
And all the temporary files are just 0 bytes.

When I look at your file sizes, several of them are 1 GB and one is 2 GB. That is insanely large.

  1. Please post "Diagnostic Info" from Admin > Troubleshooting.
  2. How much RAM can Metabase consume (this is stated right when you start Metabase)?
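For reference, if the heap turns out to be too small for exports of this size, it can be raised at launch. A sketch (the 4g value is just an example; tune it to your host):

```shell
# Give the JVM up to 4 GB of heap; Metabase prints the resulting
# max memory in its startup log.
java -Xmx4g -jar metabase.jar
```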