CCM upload fills up /tmp directory
Last week we saw some very strange CCM system behaviour in one of our customer environments.
This customer uses LVM on its SLES systems, and each configured directory gets only the space it needs. For example, the /tmp directory has only 512 MB of free space.
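On such an LVM layout the tight /tmp sizing is easy to verify; a quick sketch (the mount point is the only assumption):

```shell
# Show size, used and available space on the filesystem backing /tmp
df -h /tmp
```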
The customer then reported a problem when uploading files larger than 2 GB. (You can debate whether the system should be handling files of that size, but at least there is no official restriction, so it should be possible.)
The following error message came up:
After digging deeper into this problem my colleague Jan found out that while CCM files are uploaded, they are stored temporarily in the /tmp directory. Once the upload succeeds, the temporary file is moved to the shared filestore where CCM keeps its files, and the temporary copy is deleted.
But come on… that's a problem: uploading a 2 GB file onto a partition with only 512 MB of space.
We then tried setting IATEMPDIR to another directory (as described here), as we had done when installing CCM. But it seems that FileNet only uses the variable "java.io.tmpdir" for storing the temporary uploads. When you do not specify this variable in the JVM arguments, the default path /tmp is used.
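The difference between the two knobs can be sketched like this (the path is illustrative; the key point is that IATEMPDIR is only read by InstallAnywhere-based installers, while the running FileNet JVM only honours java.io.tmpdir):

```shell
# IATEMPDIR is picked up by InstallAnywhere-based installers
# (which is why it helped at CCM install time), but it has no
# effect on the FileNet server JVM at runtime.
export IATEMPDIR=/data/bigtmp   # path is an illustrative assumption

# The running JVM instead consults the standard Java system property
# java.io.tmpdir, which on Linux defaults to /tmp when it is not set
# explicitly via the JVM arguments.
echo "IATEMPDIR is now: $IATEMPDIR"
```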
The solution for this problem was to add a Generic JVM argument to the FileNet JVM arguments:
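The post does not reproduce the exact argument; a plausible sketch, based on the java.io.tmpdir property mentioned above, would be the following (the directory path is an assumption — it must exist, be writable by the application server user, and have enough free space for the largest expected upload):

```
-Djava.io.tmpdir=/data/ccm_tmp
```

In WebSphere this is typically entered under the server's Java Virtual Machine settings as a Generic JVM argument.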
After restarting the server, the files are temporarily stored in this directory, and uploads of files smaller than 2 GB started working.
Regarding support for files larger than 2 GB in CCM, we received the following official information from IBM:
Our finding are Connections itself does not support uploads of over 2GB. I was told this is because of a limitation with the Apache Library that is used for the upload (struts). What happens is that it chokes when trying to upload the file because it tries to upload the file fully first then afterwards checks the file size against the max file size allowed, but struts doesn't allow the file to be uploaded as it's over 2gb so it spits out the generic error you were seeing.
So one problem was fixed: uploads of files smaller than 2 GB work. But more than 2 GB is not possible. A restriction that would be worth some documentation!