Solution to server download limits for new Puppies?

What features/apps/bugfixes needed in a future Puppy
Message
Author
User avatar
veronicathecow
Posts: 559
Joined: Sat 21 Oct 2006, 09:41

Solution to server download limits for new Puppies?

#1 Post by veronicathecow »

I know torrenting has been proposed before, but this is a variation and will require five scripts. It is a simple (to the user) way of helping distribute Puppy without having to understand anything, or having to remember to do anything at the next or subsequent boots.

Script 1. At install time you are asked the following question: "Would you like to help Puppy Linux by distributing Puppy Linux to others using a torrent?" (Please someone rephrase this so it is clear they have nothing to do other than press the yes button. Perhaps a simple explanation and a link to a full explanation?)

If yes, then at the end of the install the ISO is copied to the HDD, then verified, and a variable is added to the boot script.

If no then Puppy boots as normal.
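As a rough sketch, the tail end of Script 1 could look something like this. Every path and file name below is invented for illustration, and /tmp stands in for the real HDD locations so the demo can be run harmlessly:

```shell
#!/bin/sh
# Hypothetical sketch of "Script 1"'s final step: copy the ISO, verify the
# copy, and set the flag that the boot script will look for.  All names
# and paths here are made up for illustration.

ISO_SRC=/tmp/demo-puppy.iso          # where the installer left the ISO
ISO_DEST=/tmp/seed/demo-puppy.iso    # where the torrent client will seed from
FLAGFILE=/tmp/seed/torrent_helper_enabled

# demo stand-in for the real ISO
mkdir -p /tmp/seed
echo "pretend iso contents" > "$ISO_SRC"

cp "$ISO_SRC" "$ISO_DEST"

# verify the copy by comparing md5 sums of source and destination
SUM1=`md5sum "$ISO_SRC"  | cut -d' ' -f1`
SUM2=`md5sum "$ISO_DEST" | cut -d' ' -f1`
if [ "$SUM1" = "$SUM2" ]; then
    echo yes > "$FLAGFILE"    # the "variable" the boot script checks
else
    echo "copy failed verification, not enabling the torrent helper" >&2
    rm -f "$ISO_DEST"
fi
```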


Script 2. On boot, if the variable is there, then a script is checked for (in local.rc0??? where user files for booting are loaded), and if it is not there, this script is added. This script will automatically start up the torrent client with the installed ISO as the target.
The script could also display something to the effect of "Thank you for helping Puppy Linux expand", or perhaps a small icon showing that Puppy is being torrented, so people can see they are helping.
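A minimal sketch of Script 2 might look like the following. The flag file, the drop-in directory and the `torrentclient` command are all placeholders, and /tmp stands in for the real boot-script locations:

```shell
#!/bin/sh
# Hypothetical sketch of "Script 2", run at every boot.  The flag file,
# the drop-in directory and the "torrentclient" command are invented.

FLAGFILE=/tmp/seed/torrent_helper_enabled
STARTSCRIPT=/tmp/rc.local.d/torrent_helper

mkdir -p /tmp/seed /tmp/rc.local.d
echo yes > "$FLAGFILE"    # pretend the installer set the flag

# if the user opted in and the start-up script isn't there yet, add it
if [ -f "$FLAGFILE" ] && [ ! -f "$STARTSCRIPT" ]; then
    cat > "$STARTSCRIPT" <<'EOF'
#!/bin/sh
# start seeding the installed ISO ("torrentclient" is a placeholder command)
torrentclient --seed /tmp/seed/demo-puppy.iso &
echo "Thank you for helping Puppy Linux expand"
EOF
    chmod 755 "$STARTSCRIPT"
fi
```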

Script 3. A way of temporarily switching off this feature (until the next boot), i.e. just close the torrent client.

Plus a way of permanently switching it off, i.e. close the client and remove the start-up variable and script.

Script 4. A way of reversing Script 3, in case people later decide they would like to help again.

Script 5. A script to get the latest version of Puppy via torrent and install it.

If these scripts were added to the basic Puppy then it would be available to all derivatives and could significantly lower the load on servers and increase availability of all Puppy variants.

Any thoughts appreciated.
Tony (Aka Veronicathecow)

klu9
Posts: 344
Joined: Wed 27 Jun 2007, 16:02

#2 Post by klu9 »

I suspect there'd be a problem in that Puppy doesn't automagically set up networking; that would have to come first. Also, people might feel hassled that they have to deal with this issue of torrenting the ISO just to try out a LiveCD. The fewer clicks, the better.

On this subject of distributing puppies, there's a sticky thread in the Announcements forum where I just posted some info on a caching service called CoBlitz: I think this could be a solution to the problem that requires no action on Barry's or Puppy users' part.

Hosting for Large Files - Page 3

I think with a combination of a couple of originating mirrors (like puppylinux.ca and tuxfamily.org) and CoBlitz, the problem would be solved.

PS I'm not against torrenting Puppy; I'm seeding now a torrent I created for NOP 301 rev 1
http://linuxtracker.org/torrents-details.php?id=4831

User avatar
veronicathecow
Posts: 559
Joined: Sat 21 Oct 2006, 09:41

#3 Post by veronicathecow »

Hi klu9, thanks for the comments. I'm glad you mentioned auto-connecting to the internet, as I think that is something Puppy is overdue for.

The proposal wasn't for people running live CDs but for those installing. I suppose I always install, and I forgot that others might still use CDs on a daily basis to boot Puppy. (I wonder what percentage do?)

I think your idea re CoBlitz is interesting, but with torrents, the more people who download, the more nodes it becomes available from, and I'm not sure if that's the case with the system you are proposing?

Also, I have had trouble over a couple of years with mirrors coming and going and not being maintained. With a self-replicating Puppy, as long as there were a couple of trackers still active, all versions of Puppy that people are currently using would be available (if they keep their client open).

Perhaps it could be integrated into Puppy at a later stage so that anyone who has created a Puppy can publish it just via the torrent system from a simple script?

From an environmental viewpoint torrents also make sense, in that they reduce the amount of dedicated hosting that goes on around the world and hence save energy.

This is an extract from a recent EPA report.

"America's servers consumed about 61 billion kilowatt-hours of juice in 2006, about 1.5 percent of total electricity consumption in the country and representing about $4.5 billion in costs. Servers ate about as much electricity as all of the color televisions in the country, and about the same amount of power as 5.8 million typical households. This is about twice the amount of electricity that servers and data centers consumed in 2000."

klu9
Posts: 344
Joined: Wed 27 Jun 2007, 16:02

#4 Post by klu9 »

at install time
doh! my bad, I glossed over that bit or confused "install" with "boot". :oops:

Autoconnect
another vote from me for that :D

Vanishing Mirrors
I agree "auto-torrenting" (would that be the name for your invention?) could overcome the issue of mirrors coming and going, especially if "trackerless" torrenting works out; then you wouldn't even need a tracker staying up.

Vanishing mirrors are less of a problem for official Puppy files, because they get onto 'everlasting' sites like nluug & ibiblio. It would be nice if puplets could be got onto those sites too; then this issue would be moot.

Number of Nodes: BT vs. CoBlitz
A definite plus of BT is that the more downloaders there are, the more uploaders there are; I don't know if that happens with CoBlitz/CoDeeN nodes. But I suspect it doesn't really matter: the CoBlitz nodes aren't simple home users or cash-strapped volunteers but universities with fat pipes (and Internet2 connections between each other too).

But I haven't tried it yet... what the hell, here goes... [klu9 starts downloading FireHydrant puplet via CoBlitz]

If I hose ttuuxxx's server may I burn in hell, but I might just get to try out FireHydrant and find a (semi-)solution to the hosting issue.

Will report back on my progress using coblitz. BTW I think CoBlitz speed improves after 1 or 2 completed CoBlitzed downloads have happened; that way the CoBlitz nodes will have requested the file, received it and be ready to serve it.

Energy Efficiency
I hadn't thought about that. Servers have to be reliable and up 100% of the time, so I suspect BT has an advantage here.

User avatar
veronicathecow
Posts: 559
Joined: Sat 21 Oct 2006, 09:41

#5 Post by veronicathecow »

Hi klu9, I might try the FireHydrant download as well, but at the moment I have an intermittent connection (another reason I was interested in BT rather than straight downloads; I have tried download managers, but even they seem to fail me and I get odd-sized ISOs after hours of downloading).
Thanks for reminding me about "trackerless" torrents; I have not tried that yet and had forgotten about the idea.
This idea could also work for .sfs files and other add-ons.
I think even places like ibiblio have problems. I found out about it as I use PClinux and Puppy.
http://www.theinquirer.net/en/inquirer/ ... disappears

By torrenting we have total control over distribution, ensuring Puppy users are well supplied.
I appreciate your input to this idea.

klu9
Posts: 344
Joined: Wed 27 Jun 2007, 16:02

#6 Post by klu9 »

Torrenting
Torrenting would mean not being at the mercy of hosts. That ibiblio story's a bit of an eye-opener. :shock:

Firehydrant, Download manager & metalinks
There is a command-line *nix downloader that can download a file via many protocols (HTTP, FTP, BitTorrent): Aria2.

It's even available as a dot pet: aria2-0.11.4: file downloading utility

If you use it with a metalink, it will have corruption checking, multi-sourcing and resumption (in case of interruption), just like with BitTorrent.

Unfortunately, I can't attach a metalink here. If you want you could copy & paste the text below into a text editor, then just save it as a .metalink file. Then use it in aria2.

Code: Select all

<?xml version="1.0" encoding="UTF-8"?>
<metalink version="3.0"
  xmlns="http://www.metalinker.org/"
  generator="http://www.metamirrors.nl/"
  >
<files>
	<file name="firehydrant3.0.1a.iso">
		<verification>
			<hash type="md5">74b5e6d8e00a7aa7f812b9d163395b12</hash>
		</verification>
		<resources>
			<url location="ca" type="http">http://www.puppylinux.ca/puppyfiles/custom/firehydrant3.0.1a.iso</url>
			<url type="http">http://www.ttuuxxx.com/firehydrant/firehydrant3.0.1a.iso</url>
			<url type="http">http://coblitz.codeen.org/www.puppylinux.ca/puppyfiles/custom/firehydrant3.0.1a.iso</url>
			<url type="http">http://coblitz.codeen.org/www.ttuuxxx.com/firehydrant/firehydrant3.0.1a.iso</url>
		</resources>
	</file>
</files>
</metalink>
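Once the XML above is saved as a file, feeding it to aria2 is a one-liner. In the sketch below the actual aria2c call is commented out so nothing is fetched, and the `-M` (`--metalink-file`) flag is from aria2 manuals of that era; check the flag name against your installed version:

```shell
#!/bin/sh
# Sketch of using the metalink with aria2.  The placeholder file stands in
# for the real metalink text saved from the post above.

METALINK=/tmp/firehydrant.metalink
echo "save the XML above into this file" > "$METALINK"   # placeholder

# Real invocation (needs aria2 installed; -d picks the download directory):
# aria2c -M "$METALINK" -d /tmp
# aria2 then checks the md5 given in the metalink and resumes if interrupted.
```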
When I finish downloading FH, I might post a torrent of it but I warn you: my ISP throttles BitTorrent. :x

User avatar
veronicathecow
Posts: 559
Joined: Sat 21 Oct 2006, 09:41

#7 Post by veronicathecow »

Hi, I will have a further look into this in a day or so. Currently fighting with an Intel D201GLY2. So far only PCLinux and Windows will work with it.
Cheers
Tony

User avatar
Pizzasgood
Posts: 6183
Joined: Wed 04 May 2005, 20:28
Location: Knoxville, TN, USA

#8 Post by Pizzasgood »

Cool idea. Very busy, just skimmed, so I apologize in advance, and for my non-refined post too. For auto-start, put it in /etc/init.d. As long as its executable bit is set, it will be launched with <name> start each boot. When you reboot, it will be called as <name> stop, so be careful that it doesn't accidentally start a second instance when shutting down. To disable it, just remove the executable bit. It's easy to write a toggle script:

Code: Select all

#!/bin/sh
#toggles the executable status of /etc/init.x/conky_auto

if [ -f /etc/init.x/conky_auto ]; then
	if [ -x /etc/init.x/conky_auto ]; then
		chmod 644 /etc/init.x/conky_auto
	else
		chmod 755 /etc/init.x/conky_auto
	fi
fi
EXCEPT: That script is using init.x, which DOESN'T EXIST in Puppy. That's a custom addition I made on my own system, which mimics init.d, except it's in .xinitrc, so that it runs after X starts and can do X applications (like conky). So to use this, just change it from init.x to init.d, and change the "conky_auto" to the appropriate script's name.

Then just add a .desktop entry for that toggle script to provide a menu entry. Now the user can toggle it easily. Maybe add a dialog box asking them to confirm, which would also let them know whether they're enabling or disabling it.
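Such a menu entry could be generated like this. The Exec path, name and icon are all invented for illustration, and the demo writes under /tmp rather than the real menu directory so it can be run harmlessly:

```shell
#!/bin/sh
# Hypothetical .desktop entry for the toggle script.  The Exec path, the
# Name and the Icon are made-up examples; a real install would write to
# the system's applications directory instead of /tmp.

mkdir -p /tmp/applications
cat > /tmp/applications/torrent_helper_toggle.desktop <<'EOF'
[Desktop Entry]
Type=Application
Name=Toggle Puppy Torrent Helper
Comment=Enable or disable seeding of the installed Puppy ISO
Exec=/usr/sbin/torrent_helper_toggle
Icon=network
Categories=Network;
EOF
```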

I don't know if whatever torrent program you use will have issues if there's no network connection. It should probably ping google or something as a test, and if the network isn't up, drop out rather than continuing through the script and trying to run the torrent.

It could run a daemon that sleeps for three seconds or so, then checks again, and keeps doing that until there is a connection, so that it could cover people who start the connection by hand each time.
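That check-and-sleep loop could be sketched like this. Here `have_network` is a stub that succeeds on the third call so the sketch runs anywhere; on a real system its body would be something like `ping -c 1 google.com >/dev/null 2>&1`, and the launch line is a placeholder:

```shell
#!/bin/sh
# Sketch of the "keep checking until there is a connection" daemon idea.
# have_network is a stub for demonstration; see the comment for what the
# real test might look like.

TRIES=0
have_network() {
    TRIES=`expr $TRIES + 1`
    [ "$TRIES" -ge 3 ]    # pretend the link comes up on the third check
}

until have_network; do
    sleep 1    # a real daemon might sleep three seconds or so
done

echo "$TRIES" > /tmp/net_checks
echo "network is up, would now launch the torrent client"
```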


Okay, bed time for bonzo. G'night :)
[size=75]Between depriving a man of one hour from his life and depriving him of his life there exists only a difference of degree. --Muad'Dib[/size]
[img]http://www.browserloadofcoolness.com/sig.png[/img]

User avatar
veronicathecow
Posts: 559
Joined: Sat 21 Oct 2006, 09:41

#9 Post by veronicathecow »

Hi Pizzasgood, many thanks for your input; that makes it sound do-able even by a newbie like myself (with probably just 50-100 posts on the Puppy forum 8-) ).
I just downloaded SUSE 10.3 (to see if it will sort the D201GLY2 problem with video tearing) and found another potential problem:
automounting of the source drive.
Still, I'm sure there will be stuff on the forum somewhere...
Cheers

User avatar
Pizzasgood
Posts: 6183
Joined: Wed 04 May 2005, 20:28
Location: Knoxville, TN, USA

#10 Post by Pizzasgood »

You mean the drive with the puppy iso on it?
Assuming it's on /dev/hda1, the pup_save.2fs file is NOT on /dev/hda1, and /mnt/hda1 already exists:

Code: Select all

mount /dev/hda1 /mnt/hda1
That can be added to /etc/rc.d/rc.local. Technically, you should be able to edit /etc/fstab instead, which is the standard Linux method for automounting things, but I don't know whether that's working in Puppy (when I tried it recently in a heavily tweaked version it wasn't working, but I might have broken it).
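The fstab route could be sketched like this. It runs against a scratch copy in /tmp so nothing real is touched; on an actual system the file would be /etc/fstab, and hda1/ext3 are just example values:

```shell
#!/bin/sh
# Sketch of adding an automount line to fstab.  /tmp/fstab.demo stands in
# for the real /etc/fstab; the device, mount point and filesystem type are
# example values only.

FSTAB=/tmp/fstab.demo
: > "$FSTAB"    # stand-in for the real /etc/fstab

# append the mount line only if it isn't already present
LINE="/dev/hda1 /mnt/hda1 ext3 defaults 0 0"
grep -q "^/dev/hda1 " "$FSTAB" || echo "$LINE" >> "$FSTAB"
```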

If the pup_save.2fs file is on the same partition as the puppy iso, then that partition is already mounted (/mnt/home is the preferred method to get there, even though it's actually a symlink to the true mount point).

The catch would be knowing where it is. The path could be stored somewhere, though, and just loaded:

Code: Select all

#at this point, we determined the path (stored in $THE_PATH), and need to save it like this:
echo "$THE_PATH" > /etc/path_to_iso

Code: Select all

#This is in the program that runs each boot.  It loads the path from /etc/path_to_iso:
THE_PATH=`cat /etc/path_to_iso`
[size=75]Between depriving a man of one hour from his life and depriving him of his life there exists only a difference of degree. --Muad'Dib[/size]
[img]http://www.browserloadofcoolness.com/sig.png[/img]

User avatar
veronicathecow
Posts: 559
Joined: Sat 21 Oct 2006, 09:41

#11 Post by veronicathecow »

Hi Pizzasgood, I replied to your post the other day but I think the forum gremlins ate it!
I've never got automounting to work on any Linux distro; it's a real pain.
Why can't drives just be loaded? Perhaps it goes back to server days and security?

Perhaps also there could be a search facility (using the excellent Pfind), with the torrent controller offering other things like .pups, etc. (Getting ahead of myself now.)

I'm surprised there isn't more interest in this idea, as in my experience slow downloads, dead links and failed downloads are a common thing. Perhaps I have had more trouble than most?

twanj
Posts: 11
Joined: Fri 15 Sep 2006, 07:18

#12 Post by twanj »

I think this could be a great idea.

Does anyone else do it? I haven't seen it.

As long as people are given the chance to decline (the default should be No), and are made aware that they're sharing their bandwidth, it could be really cool!
-- Metalink: Easier, More Reliable, Self Healing downloading that harnesses the speed and power of P2P & traditional downloads in a single click...

User avatar
veronicathecow
Posts: 559
Joined: Sat 21 Oct 2006, 09:41

#13 Post by veronicathecow »

Hi twanj, I was thinking of: a start, a permanent stop, and a pause for this session; and if possible a throttle. Perhaps a little box in the notification area telling people how much they had contributed, and also showing that it was running.

Server farms use huge amounts of power (they now account for 1.2 percent of total U.S. electricity consumption), so by using our machines, which are already on, to serve files, we can do something to reduce that.

nic2109
Posts: 405
Joined: Mon 01 Jan 2007, 20:24
Location: Hayslope, near Middlemarch, Midlands, England

#14 Post by nic2109 »

Just found this thread - it's so clearly A Really Good Idea that I'm surprised it hasn't been publicised and widely adopted.

How far has it got, and can I try it out?

With the move of the main Puppy sites from Servage to Hostgator there was a suggestion that Servage be retained just as the primary source for downloads. If all the d/l files (.ISO, .PET and .PUP) were made available as torrents, and these scripts put into standard Puppy, then we could all benefit. And if sites like DistroWatch could point to a torrent source as well as http and ftp, it just gets better and better.

This is particularly important right now, as puppylinux.org is currently off-line because of server overload caused by 1000s of Dingo downloads. BT wouldn't have eliminated that problem but could have reduced it.

BTW: the latest version of MU's Muppy has automatic internet connection activated (wired certainly - not sure about wireless, and I've only tried the 'mini' version), so it is possible.
[color=darkblue][b][size=150]Nick[/size][/b][/color]

User avatar
HairyWill
Posts: 2928
Joined: Fri 26 May 2006, 23:29
Location: Southampton, UK

#15 Post by HairyWill »

nic2109 wrote:This is particularly important right now as puppylinux.org is currently off-line because of server overload caused by 1000's of Dingo downloads.
Just to dispel any misconception: there are no ISOs or any other significant-sized files at the new address. The intention is to keep big files off that site, so that downloads don't eat up the maximum transfer or bandwidth allowance.

The problem is CPU usage; Tom wasn't expecting the site to go live yet, and it was still configured for development, not large-scale access.
Will
contribute: [url=http://www.puppylinux.org]community website[/url], [url=http://tinyurl.com/6c3nm6]screenshots[/url], [url=http://tinyurl.com/6j2gbz]puplets[/url], [url=http://tinyurl.com/57gykn]wiki[/url], [url=http://tinyurl.com/5dgr83]rss[/url]

User avatar
Lobster
Official Crustacean
Posts: 15522
Joined: Wed 04 May 2005, 06:06
Location: Paradox Realm
Contact:

#16 Post by Lobster »

Nothing like large scale access to learn how to cope with large scale access. :)

HairyWill, I know, is asking for new editors to work on the new site.
If you did not get a personal invite and would like to help out, don't be shy - contact him and offer. Lots to learn by doing.
You will learn so much by involvement. It really is a privilege
to help. 8)

It is not unusual for Puppy to outgrow existing blogs, forums or web sites.

By maintaining secondary or backup resources we can cope with more.
For example, I did not read Puppy's internal documentation until after 3 months of heavy usage.

Tom I know is doing all he can to produce a top web site.
Hairy Will is kindly offering expertise in adding modules and Php coding.
Warren has been supportive from the start.
Raffy has been helping with transfer from the existing wiki.

Step right up. Your Puppy needs you. Thanks guys :)
Puppy Raspup 8.2Final 8)
Puppy Links Page http://www.smokey01.com/bruceb/puppy.html :D

User avatar
HairyWill
Posts: 2928
Joined: Fri 26 May 2006, 23:29
Location: Southampton, UK

#17 Post by HairyWill »

hear, hear

Anyone is welcome; it is a community site and you are the community. If I've missed anyone, please don't feel offended - just shout.
Will
contribute: [url=http://www.puppylinux.org]community website[/url], [url=http://tinyurl.com/6c3nm6]screenshots[/url], [url=http://tinyurl.com/6j2gbz]puplets[/url], [url=http://tinyurl.com/57gykn]wiki[/url], [url=http://tinyurl.com/5dgr83]rss[/url]

User avatar
SirDuncan
Posts: 829
Joined: Sat 09 Dec 2006, 20:35
Location: Ohio, USA
Contact:

#18 Post by SirDuncan »

I was just trying to visit the site, and kept getting a "not available" page. However, when I followed a bookmark to a different part of the site it loaded fine. Are you aware of this?

Not available:
http://www.puppylinux.org/

Available:
http://www.puppylinux.org/home


EDIT: Now I'm getting a directory tree from the first URL, so I assume that something is being worked on right now.
Be brave that God may help thee, speak the truth even if it leads to death, and safeguard the helpless. - A knight's oath

User avatar
HairyWill
Posts: 2928
Joined: Fri 26 May 2006, 23:29
Location: Southampton, UK

#19 Post by HairyWill »

yup,
Tom is on it.
It was up briefly but still flaky; he has to go out, so I'm not sure it will get sorted today.
Will
contribute: [url=http://www.puppylinux.org]community website[/url], [url=http://tinyurl.com/6c3nm6]screenshots[/url], [url=http://tinyurl.com/6j2gbz]puplets[/url], [url=http://tinyurl.com/57gykn]wiki[/url], [url=http://tinyurl.com/5dgr83]rss[/url]

nic2109
Posts: 405
Joined: Mon 01 Jan 2007, 20:24
Location: Hayslope, near Middlemarch, Midlands, England

#20 Post by nic2109 »

HairyWill wrote:
nic2109 wrote:This is particularly important right now as puppylinux.org is currently off-line because of server overload caused by 1000's of Dingo downloads.
Just to dispel any misconception: there are no ISOs or any other significant-sized files at the new address. The intention is to keep big files off that site, so that downloads don't eat up the maximum transfer or bandwidth allowance.

The problem is CPU usage; Tom wasn't expecting the site to go live yet, and it was still configured for development, not large-scale access.
Sorry 'bout that: the misconception was mine. I misinterpreted something whodo had said about 4000+ downloads.
[color=darkblue][b][size=150]Nick[/size][/b][/color]

Post Reply