BitTorrent peers don’t act as ‘always up’ servers like Napster ones do.
A BitTorrent download only consumes bandwidth in either direction once it’s started, and it immediately stops uploading when it’s done downloading.
It tends to saturate your downlink, which is what you want, but it also tends to saturate your uplink while transferring. Call me selfish, but the last thing I need is 10,000 people trying to connect to and download files from my rig.
You know what? I personally don’t want all of my precious bandwidth consumed and my workstation crushed under the load just because some hottie released a brand new song, or because CNN has the latest on a new terrorist attack. And that’s setting the security aspects aside for a second, as if that weren’t hard enough.
I’d like to thank you for posting to Geek.com in response to our questions.
We don’t often get direct answers, usually just banal quips from clueless hacker dudes, so any and all information from the horse’s mouth is always greatly appreciated.

That said, a service that provides individualized content wouldn’t necessarily benefit from the distributed-load concept when it gets Slashdotted. Analysts point out that the Swarmcast and BitTorrent solutions only work when the huge number of surfers is requesting largely identical content. An IDC analyst pointed to IBM’s Linux-powered zSeries servers as a good fit in those cases, since one machine can run as multiple virtual Web servers; in those situations, more powerful hardware may again be the only solution.
Swarmcast was developed by OpenCola, a Canadian software company.
BitTorrent is Bram Cohen’s software approach to a similar solution; it shifts into P2P-style sharing only when the server’s load gets high.
When users install the Swarmcast client software on their PCs, the Swarmcast server-side software breaks requested content into pieces, which the client software then shares user to user, easing the server’s burden. The approach is actually extremely interesting, since the more people trying to get at particular content, the easier that sharing becomes.

The idea of a server pool has merit too. A host could just have the underutilized servers in its data center serve the Slashdotted sites via P2P. It could even all happen at the ISP, provided their pipe to the net was big enough.

But couldn’t people pulling content from you get a different story than they would have had they been able to access the actual site? TQBrady writes: how long before hackers write a script that changes your local copy of the content?
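The piece-splitting idea above can be sketched in a few lines. This is a minimal illustration, not Swarmcast’s actual code; the function names and the 256 KiB piece size are my own assumptions.

```python
def split_into_pieces(data: bytes, piece_size: int = 256 * 1024) -> list[bytes]:
    """Server side: break content into fixed-size pieces for peer-to-peer sharing."""
    return [data[i:i + piece_size] for i in range(0, len(data), piece_size)]

def reassemble(pieces: list[bytes]) -> bytes:
    """Client side: join the pieces back together, in order."""
    return b"".join(pieces)

content = b"x" * (600 * 1024)      # a 600 KiB file
pieces = split_into_pieces(content)
print(len(pieces))                 # 3 pieces: 256 KiB + 256 KiB + 88 KiB
assert reassemble(pieces) == content
```

The point of fixed-size pieces is that a downloader can fetch different pieces from different peers at once, then stitch them together locally.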
With BitTorrent, secure hashing information about the file to be downloaded is returned as part of the HTTP request that kicks off the download, and data that doesn’t match that hash is later rejected, so peers can’t fool you into downloading a different file than you meant to.
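A rough sketch of that per-piece check, using Python’s standard `hashlib`. BitTorrent does ship SHA-1 digests of each piece in its metadata, but the function names here are illustrative assumptions, not BitTorrent’s actual API.

```python
import hashlib

def piece_hashes(pieces: list[bytes]) -> list[bytes]:
    """What the metadata carries: one SHA-1 digest per piece."""
    return [hashlib.sha1(p).digest() for p in pieces]

def accept_piece(piece: bytes, expected: bytes) -> bool:
    """A downloader rejects any piece whose hash doesn't match the metadata."""
    return hashlib.sha1(piece).digest() == expected

pieces = [b"hello ", b"world"]
expected = piece_hashes(pieces)
print(accept_piece(b"hello ", expected[0]))   # True: genuine data
print(accept_piece(b"HELLO ", expected[0]))   # False: tampered data rejected
```

Since the hashes come from the original site over HTTP, a malicious peer would have to break the hash function itself to slip you altered content.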
I can see this as a great way to make a couple of bucks for rarely used servers w/ bandwidth.
You could even set up credits and accounts: if site A is hit and sites B and C pick it up, they each have credit with site A for the day when B or C need their own stuff picked up. Charge a little for all the usage. In my opinion the P2P-type solution makes total sense, and is a great example of thinking outside the box, as far as the software goes.
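That credit idea could be as simple as a pairwise byte ledger. This is a toy sketch of the scheme suggested above, entirely hypothetical; no such accounting exists in Swarmcast or BitTorrent.

```python
from collections import defaultdict

class CreditLedger:
    """Track bytes each site has served on another site's behalf."""

    def __init__(self) -> None:
        self.credits = defaultdict(int)   # (helper, helped) -> bytes served

    def record_help(self, helper: str, helped: str, nbytes: int) -> None:
        self.credits[(helper, helped)] += nbytes

    def balance(self, helper: str, helped: str) -> int:
        """Net bytes `helped` still owes `helper`."""
        return self.credits[(helper, helped)] - self.credits[(helped, helper)]

ledger = CreditLedger()
ledger.record_help("B", "A", 500)   # B serves 500 bytes of A's content
ledger.record_help("A", "B", 200)   # later, A serves 200 bytes for B
print(ledger.balance("B", "A"))     # 300: A still owes B
```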
For any site that’s offering up something new or that gets Slashdotted, if you’re looking for serious content delivery and want to avoid the Slashdot effect, P2P is the better solution. Any BitTorrent peer only uploads about as much as it downloads; everyone only has to be at about ‘break even’ for the system as a whole to work, since the upload burden is distributed across all downloaders.
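A back-of-the-envelope check of that ‘break even’ claim: if the origin seeds one full copy and peers supply the rest, each peer ends up uploading just under one file’s worth. The numbers are made up for illustration.

```python
file_size_mb = 100
peers = 1000

total_download = file_size_mb * peers          # every peer wants the whole file
origin_upload = file_size_mb                   # origin only seeds one full copy
peer_upload = total_download - origin_upload   # everything else comes from peers

per_peer_upload = peer_upload / peers
print(per_peer_upload)   # 99.9 MB: each peer uploads ~1 file's worth
```

So no single machine carries the load: the bigger the swarm, the closer each peer’s upload-to-download ratio sits to exactly 1.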