File upload size limited by available system memory
Reported by: saul
File uploads appear to be limited by the amount of memory available on the system.
In particular, I receive a MemoryError from queue_file.write(submitted_file.read()) in the mediagoblin/submit/lib.py file.
This occurs when the uploaded file exceeds 1.7GB -- uploads of files smaller than 1.7GB work fine. I do not have any max_file_size specified, and coincidentally 1.7GB is the amount of unused RAM available on my server (including swap).
These errors occur during the upload phase of the publishing process, before any transcoding is attempted. Watching top shows memory usage growing steadily, including swap usage; if all memory is exhausted, the upload fails. If the transfer completes without running out of memory, all used memory is freed and transcoding proceeds.
This seems indicative of either a memory leak in the file copying process, or of the file copy reading the entire file into RAM before writing it back out. Regardless of the cause, it poses a severe limitation on MediaGoblin deployments on small-device servers and hosted slices, where one might be charged for RAM usage or have limits placed on it.
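If the cause is indeed a whole-file read(), a fix would be to copy in fixed-size chunks so that peak memory stays bounded regardless of upload size. A minimal sketch (copy_in_chunks is a hypothetical helper for illustration, not actual MediaGoblin code):

```python
CHUNK_SIZE = 64 * 1024  # 64 KiB per read keeps memory use flat


def copy_in_chunks(src, dst, chunk_size=CHUNK_SIZE):
    """Copy a file object in fixed-size chunks instead of a single
    whole-file read(), so peak memory stays near chunk_size."""
    while True:
        chunk = src.read(chunk_size)
        if not chunk:  # empty bytes means end of file
            break
        dst.write(chunk)
```

The standard library's shutil.copyfileobj(fsrc, fdst) does the same chunked copy and could be used directly in place of queue_file.write(submitted_file.read()).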
Note: I have since added an additional gigabyte of RAM to my server and was able to upload larger files, but I still encountered this bug when the file size exceeded 2.7GB.