vlad Posted February 21, 2016
I guess everyone knows that downloading distros in general takes a lot of bandwidth, and the same goes for Armbian, even though an Armbian image is only a couple of hundred MB. So besides offering direct downloads, why not also offer p2p downloads, like Ubuntu and other distros already do, in order to save bandwidth? We could use the openbittorrent.com public tracker - I already have a Cubietruck which is powered 24/7 for my p2p needs, so I wouldn't mind seeding some Armbian images. What do you guys think?
zador.blood.stained Posted February 21, 2016
It was asked before: http://forum.armbian.com/index.php/topic/583-privacy/#entry4002
I guess the images don't generate too much traffic for now. BTW, who needs trackers when you have DHT and PEX?
tkaiser Posted February 21, 2016
> I guess the images don't generate too much traffic for now.
And you also get an idea of what users are downloading: http://mirror.igorpecovnik.com/stats.html Igor changed the interval a few days ago and it was quite interesting to see how many people were downloading our H3 images...
vlad Posted February 21, 2016
@zador.blood.stained - I didn't know it was asked before. I am not familiar with DHT and/or PEX, but how would you get a list of peers for a specific torrent if there is no tracker to keep track of them? I mean, I have read about DHT and it's supposed to be decentralized, but would that also work over the internet, or only on a local network?
@tkaiser - I know about that page, and it looks like we are already consuming more than 100G/day. I guess it really depends on what deal Igor got from his server provider.
zador.blood.stained Posted February 21, 2016
DHT is global; for local networks there are LPD (local peer discovery) and retrackers. If Igor ever wants to reduce the number of prebuilt images based on download statistics, it's easier to count HTTP downloads.
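A minimal sketch of that counting, assuming the stock nginx "combined" access log format; the log path and the .7z file pattern are assumptions, not Armbian specifics:

```typescript
// Tally completed image downloads per file from an nginx access log.
// Path and the .7z extension below are assumptions for illustration.
import { createReadStream } from 'fs';
import { createInterface } from 'readline';

const counts = new Map<string, number>();
const rl = createInterface({ input: createReadStream('/var/log/nginx/access.log') });

rl.on('line', (line) => {
  // Match successful GETs of image archives, e.g. "GET /some_image.7z HTTP/1.1" 200
  const m = line.match(/"GET (\S+\.7z) HTTP\/[^"]*" 200 /);
  if (m) counts.set(m[1], (counts.get(m[1]) ?? 0) + 1);
});

rl.on('close', () => {
  // Print most-downloaded images first
  for (const [file, n] of [...counts].sort((a, b) => b[1] - a[1])) {
    console.log(`${n}\t${file}`);
  }
});
```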
Toast Posted February 22, 2016
If he used torrent downloads and hosted the torrent file himself, he could still get stats. He could also use something like WebTorrent for a hybrid HTTP and torrent download, to ease the load on the server.
vlad Posted February 22, 2016
After reading more about the BitTorrent protocol, it looks like even though PEX and DHT can be used to make the swarm truly decentralized and to get better performance/efficiency, a tracker is still needed to form the initial swarm. See the BitTorrent protocol writeup, especially the PEX and DHT chapters: https://en.wikibooks.org/wiki/The_World_of_Peer-to-Peer_(P2P)/Networks_and_Protocols/BitTorrent
> If he used torrent downloads and hosted the torrent file himself, he could still get stats. He could also use something like WebTorrent for a hybrid HTTP and torrent download, to ease the load on the server.
Totally - Ubuntu and Fedora are already doing this (https://torrents.fedoraproject.org/ and http://torrent.ubuntu.com:6969/).
Toast Posted February 22, 2016
There are far more interesting things than plain torrent files these days: hybrid downloads (traditional HTTP mixed with p2p) built with nodejs.
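For illustration, a minimal sketch of that hybrid idea with the WebTorrent client; the .torrent URL is a placeholder, and this should be read as a sketch rather than a tested recipe:

```typescript
// Hybrid HTTP + p2p download with WebTorrent. If the .torrent embeds the
// download server as a web seed (BEP 19), the HTTP mirror acts as an
// always-on "peer": clients pull from it when the swarm is empty and from
// each other when it is not. The URL below is a placeholder.
import WebTorrent from 'webtorrent';

const client = new WebTorrent();

client.add('https://dl.example.org/armbian.torrent', (torrent) => {
  torrent.on('done', () => {
    console.log(`Fetched ${torrent.name} via ${torrent.numPeers} peer(s)`);
    client.destroy();
  });
});
```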
Igor Posted January 14, 2017
Bump. This problem is getting real, so we need to either add a few new mirrors or work on some p2p solution. Ideas for a simple, well-working solution?
zador.blood.stained Posted January 14, 2017
> Bump. This problem is getting real, so we need to either add a few new mirrors or work on some p2p solution.
Generate magnet links for stable releases automatically/semi-automatically with mktorrent, as per here? Obviously this won't reduce the load from the APT repositories or from nightly images, and obviously you'll need a torrent client running on the download server as the initial seed.
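A sketch of what that automation could look like, using the create-torrent and parse-torrent npm packages (their older CommonJS-style API) instead of mktorrent; the image name, tracker and web seed URL are placeholders:

```typescript
// Build a .torrent for a release image (with the download server as an HTTP
// web seed) and print its magnet link. File name, tracker and URL are
// placeholders; both packages come from the WebTorrent ecosystem.
import { writeFileSync } from 'fs';
import createTorrent from 'create-torrent';
import parseTorrent from 'parse-torrent';

const image = 'Armbian_5.24_Cubietruck_Debian_jessie.img.7z'; // hypothetical name

createTorrent(image, {
  announceList: [['udp://tracker.openbittorrent.com:80']], // public tracker
  urlList: [`https://dl.example.org/${image}`],            // HTTP fallback (web seed)
}, (err, torrentBuffer) => {
  if (err) throw err;
  writeFileSync(`${image}.torrent`, torrentBuffer);
  // magnet:?xt=urn:btih:<infohash>&dn=<name>&tr=<tracker>...
  console.log(parseTorrent.toMagnetURI(parseTorrent(torrentBuffer)));
});
```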
Christos Posted January 14, 2017
Is making the image files available as releases on GitHub a possible solution? If that can be done, it would off-load your own servers. Don't know if that is possible or permitted, though.
Igor Posted January 14, 2017
Load is not the only problem; bandwidth is too. Torrent is probably the simplest way to achieve download decentralisation, save bandwidth and give better download speed from anywhere... if there were enough seeders. The other way is some diplomacy: talk with folks around and establish "continental" mirrors.
zador.blood.stained Posted January 14, 2017
> Load is not the only problem; bandwidth is too. Torrent is probably the simplest way to achieve download decentralisation, save bandwidth and give better download speed from anywhere... if there were enough seeders. The other way is some diplomacy: talk with folks around and establish "continental" mirrors.
If you have bandwidth issues, you may also try limiting the number of connections per IP and the speed per connection in the nginx config of the file server; that way at least users with fast connections won't grab all the available bandwidth.
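A sketch of what that could look like in nginx; the zone name, location path and limit values are illustrative, not tested recommendations:

```nginx
# Per-IP connection and bandwidth limits for the image download location.
http {
    # Shared-memory zone keyed by client IP
    limit_conn_zone $binary_remote_addr zone=dl_per_ip:10m;

    server {
        location /images/ {
            limit_conn dl_per_ip 2;  # at most 2 simultaneous downloads per IP
            limit_rate 2m;           # cap each connection at ~2 MB/s
        }
    }
}
```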
Christos Posted January 14, 2017
> Load is not the only problem; bandwidth is too. Torrent is probably the simplest way to achieve download decentralisation, save bandwidth and give better download speed from anywhere... if there were enough seeders. The other way is some diplomacy: talk with folks around and establish "continental" mirrors.
Have you considered 'cloud' storage options, or possibly the 'unlimited bandwidth' hosting providers? -> https://alreadyhosts.com/ -> https://hostingfacts.com/
Igor Posted January 14, 2017
> Have you considered 'cloud' storage options, or possibly the 'unlimited bandwidth' hosting providers?
That's the last option to consider. We are a community project, and I have been offered such resources on several occasions - as contributions to the project. The real problem is to organise / talk / set things up, and I'd rather do that than just click a "buy now" button. I'll check the suggestions. Thanks.
P.S. I refreshed the download pages again yesterday. The missing data is because GitHub had severe problems yesterday and I hadn't yet planned for such a situation. Some more debugging and hardening...
martinayotte Posted January 14, 2017
> The missing data is because GitHub had severe problems yesterday...
Yes, yesterday GitHub was returning 504 for several hours for any kind of request, except web access.