Last night, @Sanjay started a conversation with
why don't browsers let you download the same file from multiple sources in parallel to speed downloads like Bittorrent? seems obvious.
— Sanjay Parekh (@sanjay) August 10, 2014
And it is an interesting question indeed. If the internet were peer-to-peer, multi-source, and multicast, it could potentially be faster.
This applies especially well to large binary files for software – Linux distributions, Android ROM updates, LibreOffice, iOS updates.
- To be useful to peers, the downloaded file has to remain usable after download, so it can't be encrypted for one-time use.
- To be useful for proprietary binaries, some encryption or DRM is needed (see Spotify).
- For adoption, this needs to be built into the browser.
- Seeding must be low-commitment for individuals (both from a cache and a bandwidth perspective).
@smileysteve also normal people don't use/understand bittorrent.
— Sanjay Parekh (@sanjay) August 11, 2014
A large part of why torrents aren't more mainstream is that they have a marketing problem. And a performance problem: ISPs kill connections, and torrents can overwhelm home routers with their hundreds of simultaneous connections.
Linux distributions, in particular, should all steer users toward torrent downloads (and a change to apt to support torrents would save fortunes). Those global mirror networks cost significant amounts of money. In the case of @sanjay downloading Mint, the download page isn't exactly designed to help users know which link to click.
Spotify is one of the few companies that has used a P2P model to deliver media (Netflix could sidestep its peering disputes if it did the same), and all sorts of streaming media would be better for it.
From a commercial standpoint though, there are issues with serving distributed content.
- Preventing man-in-the-middle tampering (signature verification can slow things down)
- Managing the number of simultaneous requests on the client side
- Per-user personalization of content is no longer possible
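The man-in-the-middle concern is usually handled by publishing a checksum alongside the file, so the client can verify the bytes no matter which peer served them. A minimal sketch (the file path and checksum here are placeholders):

```python
import hashlib


def verify_download(path: str, expected_sha256: str) -> bool:
    """Compare a downloaded file's SHA-256 digest to a published checksum.

    Reads in chunks so even multi-gigabyte ISO images don't need to fit
    in memory. Returns True only on an exact digest match.
    """
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_sha256.lower()
```

This is exactly what torrents do per-piece: the .torrent metadata carries a hash for every chunk, so a malicious peer can corrupt at most one piece before being detected.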
To some extent, using a load balancer and subdomains already distributes the load of web requests. For reasonably sized downloads, should browsers split the file into part files based on how many endpoints they can connect to?
On the Server Side
A similar technology from the server perspective (getting a web app onto all hosts quickly) is called Murder – I looked into it when I was at Pardot – https://github.com/lg/murder – it offers drastic speed and parallelization improvements over rsync when deploying to hundreds of servers.
S3 Has Torrents
Something I wish I had known – or that all Android ROM developers knew – is that S3 can serve any of its files as a torrent: just append the "?torrent" query string to the object's URL.
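Building that URL is trivial; a quick sketch using the virtual-hosted S3 URL style (the bucket and key names are made up for illustration):

```python
def s3_torrent_url(bucket: str, key: str) -> str:
    """Build the URL that asks S3 to return a .torrent for a public object.

    Appending ?torrent to a GET request makes S3 respond with torrent
    metadata instead of the object itself; S3 then acts as a seeder.
    """
    return f"https://{bucket}.s3.amazonaws.com/{key}?torrent"
```

So a ROM hosted at a hypothetical `roms` bucket could be offered as `s3_torrent_url("roms", "update.zip")`, letting downloaders share bandwidth instead of the bucket owner paying for every byte.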
More to Come
I’d like to look into this more. So what do you think? Do people still use international mirrors (or do CDNs handle that for us)? Do you wish your browser would download torrents? How do you make it commercially viable?