I am helping out with an open data science project. The project has many
users, but the amount of data being dumped on the server is growing much
faster than the monetary support.


I was thinking that perhaps there is a way to use distributed file storage
(BitTorrent, ZeroNet, IPFS, GNUnet, etc.) here to reduce the load while
preserving the current file distribution architecture.


The server is set up to allow visitors to the site to click and save files
(sizes vary from a few MB to tens of GB). I am curious to know if there is
any open-source platform which can provide a similar frontend (being able
to click and save large files) while storing the files on a distributed
peer-to-peer network.
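To illustrate the kind of setup I have in mind (this is only a sketch, and the gateway URL and content IDs below are made up for illustration): if files lived on something like IPFS, the web frontend could keep serving plain HTTP download links by mapping each filename to its content hash via a public gateway, so visitors could still click and save as they do now.

```python
# Sketch: map dataset filenames to hypothetical IPFS content IDs (CIDs)
# and build ordinary HTTP download links through a gateway, so the
# browser-side "click and save" experience stays the same while the
# bytes are actually fetched from a peer-to-peer network.
GATEWAY = "https://ipfs.io/ipfs/"  # assumed public gateway; made up for this sketch

# made-up CIDs, for illustration only
datasets = {
    "survey-2019.csv.gz": "QmExampleCid1",
    "telescope-raw.tar": "QmExampleCid2",
}

def download_link(filename: str) -> str:
    """Return an HTTP URL that a browser can click and save directly."""
    cid = datasets[filename]
    # the filename query parameter is a hint so the browser saves the
    # file under its human-readable name rather than the hash
    return f"{GATEWAY}{cid}?filename={filename}"

for name in datasets:
    print(download_link(name))
```

The point is only that the frontend would not need to change much; the hard part is the backend platform that actually stores and serves the data peer-to-peer.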


I noticed ZeroNet, but at the moment it seems to be limited to 10 MB
files.

There is not much concern about privacy or anonymity, as these files are
not IP-protected.


Can this be done with GNUnet? Or is there a more appropriate platform I
should look into?


Sorry if I am using incorrect jargon; I am very new to file storage and
distribution.


Thanks


Paul
_______________________________________________
Help-gnunet mailing list
[email protected]
https://lists.gnu.org/mailman/listinfo/help-gnunet
