Hm, I don't think it has anything to do with the pipeline, because that was only the last step in my testing. First, I executed each step separately, generating a new file at each step. Because I suspected the problem had something to do with the size of the files I was generating on the server, I switched to the pipe method, so that a complete file never exists on the server at any point (streaming). The server's support team also said that there are no limits on file size or execution time. So I came to the conclusion that the problem can only be in the GnuPG function. Maybe there are some GnuPG settings I haven't taken into account, or a bug in GnuPG?
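(For reference, the pipe method I mean is roughly like the sketch below: the ciphertext is streamed through a decrypting filter straight into the output, so no complete intermediate file is ever written. The function name and the default gpg command line are just illustrative, not my actual code.)

```python
import subprocess

def decrypt_stream(encrypted_path, out_file,
                   cmd=("gpg", "--batch", "--decrypt")):
    """Run a stdin->stdout filter as a pipeline stage.

    The child process reads the ciphertext and writes its output
    directly into out_file, so the complete plaintext is never
    buffered in this process or stored as an intermediate file.
    `cmd` defaults to the gpg CLI, but any stdin->stdout filter
    (e.g. ("cat",) for testing) works the same way.
    """
    with open(encrypted_path, "rb") as src:
        subprocess.run(cmd, stdin=src, stdout=out_file, check=True)
```

With real keys in place you would call it as `decrypt_stream("backup.gpg", open("backup.tar", "wb"))`; the point is only that the data flows through a pipe instead of through a temporary file.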
I have also noticed that the GnuPG function runs through the complete file and only fails at the very end. So the resulting file is as large as the original, or even a little bit larger. Thanks for any more ideas!

--
View this message in context: http://old.nabble.com/Encrypted-large-files-cant-decrypt-tp33388747p33393790.html
Sent from the GnuPG - User mailing list archive at Nabble.com.

_______________________________________________
Gnupg-users mailing list
[email protected]
http://lists.gnupg.org/mailman/listinfo/gnupg-users
