Good Day All,
When uploading larger zip files (2.5 GB+ echo studies) via the web UI or REST API, I am running into this error. Does anyone know if there is a way to allocate more memory for this? Orthanc is the latest version, running on a Windows Server 2016 machine with 16+ GB of RAM.
E1017 12:18:31.062786 OrthancException.cpp:57] Bad file format: Cannot open ZIP archive from memory buffer
Thanks!
Hi Bryan,
I’ve just tried to compress 5 GB of data on my Windows machine. The zipped file is 2.1 GB and, indeed, Orthanc is not able to decompress it.
After stepping through the source code, it appears that the files are compressed using the Deflate64 method, which is not supported by zlib (https://zlib.net/zlib_faq.html#faq40).
It seems that Windows Explorer, which I used to compress the files, switches to Deflate64 as soon as the uncompressed data is “large” (probably > 2–4 GB). When I compress smaller files, it uses another compression method, which is supported by zlib and Orthanc.
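As a quick check before uploading, one can inspect which compression method a zip archive records for its entries. The sketch below (plain Python, not part of Orthanc) builds a small in-memory archive with standard deflate and reads the method code back; 8 means plain deflate (zlib-compatible), while 9 would indicate Deflate64:

```python
import io
import zipfile

# Build a small in-memory archive with plain deflate (method 8).
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", compression=zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("study/instance.dcm", b"\x00" * 1024)

# Read back the method recorded for each entry:
#   8 = deflate (zlib can decompress it)
#   9 = Deflate64 (zlib, and therefore Orthanc, cannot)
with zipfile.ZipFile(buf) as zf:
    methods = [info.compress_type for info in zf.infolist()]

print(methods)  # → [8]
```

Running the same inspection loop on an archive produced by Windows Explorer for a large data set should show method 9 on its entries.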
When I compress my files on Linux (with the `zip` command line), it uses another compression method and Orthanc can decompress it! The funny thing is that the resulting file is even 1% smaller!
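If re-zipping on Linux is not convenient, the same result can be obtained from Python’s `zipfile` module, which always writes plain deflate. A minimal sketch (the directory layout and file names are assumptions for illustration):

```python
import zipfile
from pathlib import Path

def zip_directory(src_dir: str, dst_zip: str) -> None:
    """Archive every file under src_dir using plain deflate (method 8),
    which zlib -- and therefore Orthanc -- can decompress."""
    src = Path(src_dir)
    with zipfile.ZipFile(dst_zip, "w",
                         compression=zipfile.ZIP_DEFLATED,
                         allowZip64=True) as zf:  # Zip64 needed for >4 GB of data
        for path in sorted(src.rglob("*")):
            if path.is_file():
                zf.write(path, path.relative_to(src))
```

The resulting archive can then be uploaded to Orthanc through the web UI or the REST API as usual.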
It’s not straightforward for us to add support for Deflate64, so, for now, the only thing I can do is add a more verbose error message: https://hg.orthanc-server.com/orthanc/rev/588fa6fb32ca
As a side note, regarding performance: it took 90 seconds to ingest the 5 GB of data (10,000 instances; 2.1 GB once zipped):
- 20 seconds to upload the file to Orthanc from the browser and
- 70 seconds for Orthanc to uncompress it and store it.
Note that I’m working on localhost with an SSD, so neither network nor disk bandwidth is a bottleneck.
HTH
Alain
Thanks for the info. If I were to send the file via WebDAV instead, Orthanc would presumably also decompress it with zlib, so that wouldn’t work either?
Yes, the same algorithm is used. Your only option is to change the zip compression method (or decompress the archive yourself).
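If you need to keep the Deflate64 archive, “decompress by yourself” can mean extracting it with a tool that understands Deflate64 (e.g. Windows Explorer or 7-Zip) and then uploading the extracted DICOM files one by one via Orthanc’s standard `POST /instances` endpoint. A rough sketch, assuming Orthanc listens on its default port 8042 and that the third-party `requests` package is installed:

```python
from pathlib import Path

ORTHANC_URL = "http://localhost:8042"  # assumed default Orthanc HTTP port

def iter_instance_files(root: str):
    """Yield every regular file found under root (the extracted archive)."""
    for path in sorted(Path(root).rglob("*")):
        if path.is_file():
            yield path

def upload_directory(root: str) -> None:
    """POST each file individually to Orthanc's /instances endpoint."""
    import requests  # third-party; assumed installed
    for path in iter_instance_files(root):
        with open(path, "rb") as f:
            r = requests.post(f"{ORTHANC_URL}/instances", data=f.read())
        r.raise_for_status()
```

This trades the single bulk upload for many small requests, but it sidesteps the zip format entirely.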