Bit Torrent, Updating Linux

Rajinder Yadav devguy.ca-Re5JQEeQqe8AvxtiuMwx3w at public.gmane.org
Thu Jul 9 19:26:07 UTC 2009


On Thu, Jul 9, 2009 at 2:39 PM, Lennart
Sorensen<lsorense-1wCw9BSqJbv44Nm34jS7GywD8/FfD2ys at public.gmane.org> wrote:
> On Thu, Jul 09, 2009 at 11:46:05AM -0400, Rajinder Yadav wrote:
>> I am not sure if anyone is already doing this. What I would like to
>> eventually provide for the Linux base I am working on putting together
>> is a way for the user to download modules with an integrated
>> BitTorrent download manager. (For security reasons the torrent
>> would only use known, trusted servers.)
>>
>> I am planning on building a core Linux base that could be quickly
>> downloaded. Then during the install phase or after, allow the user to
>> customize their download of extras using a BitTorrent server.
>
> People have talked about this for Debian many, many times in the past,
> and every time it is pointed out that BitTorrent is a lousy method for
> updates because:
>
> Packages are obviously compressed (it would be stupid not to).  This means
> that every time a new version of the package is made, there is nothing
> reusable from the old one to make BitTorrent more efficient (the same is
> true for rsync, unfortunately).  In the case of rsync there has been
> talk of extending gzip in a way that allows predictable blocks to occur,
> so that identical compressed blocks would result if only part of a
> package changes.  Not sure where that ever went.
>
> Who is going to keep all the packages around just to feed BitTorrent
> once they have installed them?  What a waste of disk space.
>
> So really, other than saving you bandwidth (assuming it is your
> distribution on your server), where is the benefit to anyone else?
>
> Oh and if you want to see what BitTorrent does to server load and
> resources, have a look at the info from last year's Linux Symposium from
> one of the admins of mirrors.kernel.org.  It showed just how horrible
> and wasteful BitTorrent is for distribution.  If you have the choice
> between having some decent http/ftp servers and using BitTorrent, the
> http/ftp servers will always be far faster and more efficient.  BitTorrent
> is only good when you have absolutely no way to run any real servers,
> or you are dealing with something not very popular, or at least very
> short-lived.
>
> --
> Len Sorensen

I am trying to understand what you're sharing; with my limited
knowledge I can only picture how things (should) work at a high
level.
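
Playing with the compression point actually helped it click for me.
Here is a toy sketch in Python (invented data, nothing to do with real
packages): once a file is gzipped, a single inserted line means
everything past the change point no longer matches byte-for-byte, so
neither BT pieces nor rsync blocks can reuse the old copy.

import gzip

def common_prefix(a, b):
    n = 0
    while n < min(len(a), len(b)) and a[n] == b[n]:
        n += 1
    return n

def common_suffix(a, b):
    n = 0
    while n < min(len(a), len(b)) and a[-1 - n] == b[-1 - n]:
        n += 1
    return n

# 20000 lines of filler standing in for a package's contents.
lines = [b"line %06d: nothing of note here\n" % i for i in range(20000)]
old = b"".join(lines)
# New release: one line inserted in the middle.
new = b"".join(lines[:10000] + [b"one security fix\n"] + lines[10000:])

for label, a, b in (("raw", old, new),
                    ("gzipped", gzip.compress(old), gzip.compress(new))):
    print(label, len(a), "bytes,",
          "shared prefix:", common_prefix(a, b),
          "shared suffix:", common_suffix(a, b))

The raw files still share nearly everything before and after the
inserted line; the gzipped files stop agreeing around the change and
share essentially nothing after it. As far as I know, this is what the
gzip extension you mention became: Debian carries an --rsyncable patch
that restarts the compressor on content-defined boundaries, so
unchanged regions compress to identical bytes again, at a small cost
in compression ratio.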

About wasted space, I am not quite sure I follow. If package A, say
gcc 4.3, is made downloadable, it has to reside on a server and one or
more mirrors, which is how things are done today. If you introduce
BitTorrent (BT), where does the wasted space come from?
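
Thinking out loud with invented numbers: if the answer is that every
client has to keep packages on disk after installing in order to keep
seeding, the waste shows up across users' machines instead of on the
mirrors:

package_mb = 500                # one large package or ISO
seeders = 10000                 # clients still seeding it
mirrors = 3                     # copies a plain http/ftp setup needs

print("swarm keeps %.1f GB on users' disks"
      % (package_mb * seeders / 1024.0))
print("mirrors keep %.1f GB in total"
      % (package_mb * mirrors / 1024.0))

Is that the waste you meant?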

The overall throughput of a BT download should be higher when pulling
from several slow peers than an ftp/http download pulling from a
single slow server.
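
Rough numbers for what I mean, all invented, including the catch that
the swarm can never beat my own downstream:

iso_mb = 600                    # download size
server_kbs = 50                 # one slow http/ftp server, KB/s
peers = 10                      # slow peers, each uploading to me
peer_kbs = 50                   # per-peer upload, KB/s
my_downstream_kbs = 500         # typical home connection cap, KB/s

def hours(mb, kbs):
    return mb * 1024.0 / kbs / 3600

single = min(server_kbs, my_downstream_kbs)
swarm = min(peers * peer_kbs, my_downstream_kbs)
print("single server: %.1f hours" % hours(iso_mb, single))
print("10-peer swarm: %.1f hours" % hours(iso_mb, swarm))

So the win only exists while the single server is the bottleneck; once
a server can fill my pipe on its own, adding peers buys nothing.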

Whether clients download from a slow server over ftp/http or over BT,
how does that make the server load any different? I am not saying it
does not, it just doesn't seem obvious to me =)
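
My best guess after mulling it over: an http/ftp server streams each
file start to finish, while a seed answers out-of-order piece requests
from many peers at once. A toy model of the read pattern, counting
non-contiguous reads as a crude stand-in for disk seeks:

import random

pieces = list(range(4096))      # a 1 GiB file in 256 KiB pieces

def noncontiguous(order):
    return sum(1 for a, b in zip(order, order[1:]) if b != a + 1)

http_order = pieces                             # one sequential pass
bt_order = random.sample(pieces, len(pieces))   # out-of-order requests

print("http-like:", noncontiguous(http_order), "seeks")
print("bt-like:  ", noncontiguous(bt_order), "seeks")

If that picture is right, the seed also juggles far more simultaneous
connections for the same bytes served, which would explain the load
difference.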

I can see the logistical issue of making sure all the mirror servers
are always in sync with the main package server. Otherwise BT will
not deliver on its throughput.
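
For what it's worth, checking that mirrors carry identical bits looks
scriptable; the mirror URLs and path below are made up:

import hashlib
import urllib.request

MIRRORS = ["http://mirror1.example.org", "http://mirror2.example.org"]
PATH = "/pool/g/gcc/gcc-4.3.tar.gz"

def sha256_of(url):
    h = hashlib.sha256()
    with urllib.request.urlopen(url) as resp:
        for chunk in iter(lambda: resp.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

sums = dict((m, sha256_of(m + PATH)) for m in MIRRORS)
if len(set(sums.values())) == 1:
    print("mirrors consistent")
else:
    print("mismatch:", sums)

Though if I understand the protocol right, the piece hashes in the
.torrent already protect against a stale mirror acting as a seed: its
data would simply fail verification instead of corrupting anyone's
download.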

About a decent server being faster than BT: would you know if it is
the norm today that "most" mirror servers out there are decent? I am
wondering what would constitute decent in KB/s; roughly the top speed
of a typical home internet connection? If so, then the argument for
BT is moot.

-- 
Kind Regards,
Rajinder Yadav