Hosts for large files

raffy
Posts: 4798
Joined: Wed 25 May 2005, 12:20
Location: Manila

Size

#21 Post by raffy »

MU wrote:Size does not matter, we have 275 Gigabyte
You mean hosting space? That must be something new, and it's great news. :D

Which domain(s) are using this?
Puppy user since Oct 2004. Want FreeOffice? [url=http://puppylinux.info/topic/freeoffice-2012-sfs]Get the sfs (English only)[/url].

User avatar
MU
Posts: 13649
Joined: Wed 24 Aug 2005, 16:52
Location: Karlsruhe, Germany
Contact:

#23 Post by MU »

For Dotpups / Pupgets:
http://dotpups.de/dotpups
alternative URL:
http://puppyfiles.org/dotpupsde/dotpups

For Isos:
http://puppyisos.org

FTP Server: ftp.servage.net
Username: puppyuploads
Password: get it from me via Personal Message!

That is a temporary folder for fresh uploads.
Please DO create subfolders there.
Please add an xxx.htm telling people where to find more information about your package (usually just a link to the forum announcement).

Send me a personal message when you have uploaded, so that I can move your package to its final location.
Tell me in your message which subfolder I should move it to.
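If you would rather script the upload than use a graphical FTP client, a minimal sketch with Python's standard ftplib could look like this (the subfolder and file names are only placeholders, and the real password still has to be requested by PM):

[code]
# Rough sketch of a scripted upload with Python's standard ftplib.
# "mydotpup" and the file names are placeholders; get the real password via PM.
from ftplib import FTP

ftp = FTP("ftp.servage.net")
ftp.login(user="puppyuploads", passwd="ASK-VIA-PM")   # password via Personal Message

ftp.mkd("mydotpup")          # please DO create your own subfolder
ftp.cwd("mydotpup")

with open("mydotpup-1.0.pup", "rb") as f:
    ftp.storbinary("STOR mydotpup-1.0.pup", f)        # the package itself

with open("mydotpup.htm", "rb") as f:
    ftp.storbinary("STOR mydotpup.htm", f)            # link to the forum announcement

ftp.quit()
[/code]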

Mark
this message: http://murga-linux.com/puppy/viewtopic. ... 9400#99400
Last edited by MU on Mon 17 Oct 2011, 10:14, edited 4 times in total.

User avatar
Springer
Posts: 52
Joined: Tue 22 Aug 2006, 16:25
Location: Austin, TX

#24 Post by Springer »

I know this thread is a bit old, but I don't think anyone mentioned one of the obvious solutions for distributing Puppy ISOs, packages, podcasts, etc.: CoralCache (http://www.coralcdn.org/)

This free service caches frequently used content and makes it available from distributed servers around the globe (260 of them, as I write this...)

Using it requires only appending ".nyud.net:8080" to the hostname portion of an ordinary URL (like, say, puppyos.com/download/whatever.iso). At the first request, the Coral network will check to see if the file is in the cache. If not, it grabs the file from your server, and so long as it's accessed frequently enough to stay in the cache, it will be available *transparently* from distribution sites around the world.
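To make that concrete, here's a rough sketch in Python of the rewrite (the puppyos.com link is only the example above, not a real download, and this is not an official Coral tool):

[code]
# Sketch: insert ".nyud.net:8080" right after the hostname part of an ordinary URL.
# The example URL is only an illustration.
def coralize(url):
    rest = url[len("http://"):] if url.startswith("http://") else url
    host, _, path = rest.partition("/")
    return "http://" + host + ".nyud.net:8080/" + path

print(coralize("http://puppyos.com/download/whatever.iso"))
# prints: http://puppyos.com.nyud.net:8080/download/whatever.iso
[/code]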

It offers many of the benefits of BitTorrent, but works as a regular HTTP download URL, so there's no special client or management as there is with BT.

Coral has been in operation for years now, so it appears that it will keep working in the future, unlike many of the online file hosting services, and some Coral users distribute over 1 TB/month.

Given that Puppy is so much smaller than many distros, CoralCache might make a lot of sense for us...

User avatar
Springer
Posts: 52
Joined: Tue 22 Aug 2006, 16:25
Location: Austin, TX

#25 Post by Springer »

Just found this in the Coral FAQ - looks like Coral is still a great choice for packages, podcasts and the like, but only for ISOs under 50 MB:
Coral is not saving me bandwidth for my large file!

Because of bandwidth overuse, we temporarily capped off Coral to disallow transfers of files greater than 50 MB. Our current deployment has servers with 4 GB performing whole-file caching (and are running at cache capacity). If clients are pulling files on the orders of 100s of MB, the benefit of the system for hundreds of other websites is greatly reduced, as data would otherwise be quickly evicted from caches.

We're going to look at better techniques to do striped large-file caching to be able to handle larger files, but until we implement such functionality, large file transfers were otherwise killing our file caches.

Thus, instead of just returning some type of error message (like 403: Forbidden), we are transparently redirecting clients back to the origin site, where they at least have a possibility of downloading the file, and the server is not in worse shape than pre-Coral.

User avatar
Previously known as Guest
Posts: 240
Joined: Thu 29 Sep 2005, 00:39

#26 Post by Previously known as Guest »

As I've been hosting some Puppy files on http://www.pkagfiles.net/, I thought I'd make it official in this thread.

Make your directory, leave a description .htm file pointing to the forum thread of the announcement, and PM me.

ISOs & .sfs files.

Ftp Server: pkagfiles.net
Username: puppystuff@pkagfiles.net
Password: puppystuff987

Edit:
Password login to download
User: puppy
Password: puppylinux
Last edited by Previously known as Guest on Tue 03 Mar 2009, 03:54, edited 5 times in total.

klu9
Posts: 344
Joined: Wed 27 Jun 2007, 16:02

#27 Post by klu9 »

Just found this site:
http://www.tuxfamily.org/

They host "free-libre" files for "free-gratis". One of you Puplet creators might want to ask them if they'll host your ISO.

I looked around their website, FAQ, wiki & forums but couldn't find any mention of file size limits.
TuxFamily, free hosting for free people

TuxFamily is a non-profit organization that provides free services for projects and contents dealing with the free software philosophy (free as in free speech, not as in free beer). Any project licensed with any libre licence is accepted, for example GPL, BSD, CC-BY-SA, Art Libre, ...
Right now, about 4000 users and 900 projects are using TuxFamily.org's free hosting services. So don't wait, subscribe to TuxFamily!

User avatar
darrelljon
Posts: 551
Joined: Sun 08 Apr 2007, 11:10
Contact:

#28 Post by darrelljon »

Wikipedia compares one-click file hosters here. I'm thinking of uploading as many puplets as possible to one, but would like a service where files don't expire.

User avatar
MU
Posts: 13649
Joined: Wed 24 Aug 2005, 16:52
Location: Karlsruhe, Germany
Contact:

#29 Post by MU »

Those services where you don't pay usually don't survive very long.
This is why I have set up puppyisos.org (see messages further up).
It can currently be paid for a second year with the donations I have already received.
Mark

User avatar
darrelljon
Posts: 551
Joined: Sun 08 Apr 2007, 11:10
Contact:

#30 Post by darrelljon »

Ideally we would use a free sustainable server where files don't expire. I hope donations can sustain your ability to host ISOs for as long as possible. Perhaps a strong long-lasting free alternative will emerge if bandwidth prices fall.

klu9
Posts: 344
Joined: Wed 27 Jun 2007, 16:02

#31 Post by klu9 »

Well, Gray used tuxfamily.org (the one I mentioned above) and I downloaded NOP successfully from there. http://download.tuxfamily.org/nop/

I think that because it's not a 1-click "upload pirated stuff"-type site and is just for free-libre software, it will last longer.

Also, I think more use could be made of BitTorrent & Linuxtracker.org to distribute larger files, and of metalinks to spread the burden.

Caneri
Posts: 1513
Joined: Tue 04 Sep 2007, 13:23
Location: Canada

#32 Post by Caneri »

MU et al,

I have space and bandwidth for you to use at
www.puppylinux.ca

Email or PM me and I'll set up a secure FTP account for you... that way you also help me keep up to date with the ISOs, pets, etc.

Eric

klu9
Posts: 344
Joined: Wed 27 Jun 2007, 16:02

#33 Post by klu9 »

CoralCache has already been mentioned, but ruled out for most ISOs because of its 50 MB file size limit. But...

has anyone tried CoBlitz? It seems a very similar idea to CoralCache, and is designed with ISOs in mind, up to 20 GB (!) in size.
CoBlitz homepage wrote:How Does It Work?

You add the prefix http://coblitz.codeen.org/ to the URL you want to serve, and CoBlitz does the rest... To give a high-level description of how it operates:
  • When clients request a large file, they are really contacting a special agent that resides on the CDN node. This agent looks like a standard Web server.
  • The agent converts the single request from the client into a stream of requests for smaller pieces (chunks) of the file. These requests are spread, in parallel, to other peer CDN nodes.
  • These peers request the chunks from the origin server, using the byte-range support in HTTP. The peers not only send the chunks back to the original agent, but also cache them.
  • The agent reassembles the chunks and sends them back to the client in order, making it appear like one seamless download.
This approach has several benefits:
  • As peers join/leave the CDN, only the missing parts of the large file need to be re-requested, instead of doing whole-file caching.
  • Large files can be spread across the main memory of many nodes, reducing the memory pressure on any single node, and reducing the number of disk accesses needed to serve the file.
  • Since we use HTTP as the underlying protocol, no changes are required to clients or servers. All CoBlitz support is on the CDN itself.
It looks like all we'd have to do is put http://coblitz.codeen.org/ at the start of a regular HTTP iso link. The academic institutions running CoBlitz take care of the rest automatically. That's it, I think.
  • 1 special action on part of person posting a download link: just add http://coblitz.codeen.org/ to the start.
  • 0 special action required from the host (except maybe making sure they don't block CoBlitz nodes from downloading the file)
  • 0 special action required from the downloader: they just do what they usually do with a regular HTTP iso link. No special software required.
Someone correct me if I'm wrong. It looks almost too easy.
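For anyone curious what "byte-range support in HTTP" looks like in practice, here's a tiny sketch of fetching a single chunk of a file, the same mechanism the CoBlitz peers use behind the scenes (the URL is made up):

[code]
# Sketch: fetch the first 1 MiB of a file with an HTTP Range request,
# the same mechanism CoBlitz peers use to grab chunks of a large file.
# The URL is hypothetical.
import urllib.request

req = urllib.request.Request(
    "http://example.com/puppy.iso",
    headers={"Range": "bytes=0-1048575"},  # first 1 MiB only
)
with urllib.request.urlopen(req) as resp:
    chunk = resp.read()
    print(len(chunk), "bytes received, HTTP status", resp.getcode())
[/code]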

klu9
Posts: 344
Joined: Wed 27 Jun 2007, 16:02

#34 Post by klu9 »

PS here's some more info about CoBlitz from their web page.
  • File size limits - No files smaller than 100KB or larger than 20GB are served for the general public. Exceptions are provided for PlanetLab users and for US Educational sites.
  • Content types - We are focusing on serving large files, like ISO images, PDFs, etc. We automatically change the content type of '.iso' files to be 'application/octet-stream'. We do not serve Web pages, images, videos, or audio files for the general public. These restrictions do not apply to files hosted at US Educational sites, or to any downloads initiated at PlanetLab-affiliated addresses.
You can easily make a CoBlitz link out of a regular link. The canonical form is

http://coblitz.codeen.org/Original_URL

Note that the original URL can either contain "http://" or not. (wget complains if you include "http://" in the original URL. So, when you use wget, please either use "\" in the second "http://" like "http://coblitz.codeen.org/http:\/\/original_url", or strip off the second "http://", like "http://coblitz.codeen.org/original_url".)

Example:
http://coblitz.codeen.org/www.cs.prince ... igfile.zip
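And if anyone wants to generate such links automatically (for a download page, say), here's a rough sketch in Python that also strips the second "http://" so wget stays happy; the example URL is made up:

[code]
# Sketch: build a CoBlitz link from an ordinary download URL.
# Stripping the second "http://" avoids the wget complaint mentioned above.
# The example URL is hypothetical.
def coblitz(url):
    if url.startswith("http://"):
        url = url[len("http://"):]
    return "http://coblitz.codeen.org/" + url

print(coblitz("http://example.com/downloads/bigfile.iso"))
# prints: http://coblitz.codeen.org/example.com/downloads/bigfile.iso
[/code]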

openworld
Posts: 1
Joined: Tue 20 Nov 2007, 21:50

Hello everyone

#35 Post by openworld »

Hello and thank you for your distro release

I'd like to promote Puppy from my personal mirror website, an independent site intended to provide mirrors for Linux distributions.

It's located in Europe (France)

This proposal is free and is placed under GPL licensing.

If you agree, just tell me.
Thanks for your cooperation.

Webmaster@openxworld

mailto:openxworld @gmail.com

klu9
Posts: 344
Joined: Wed 27 Jun 2007, 16:02

Videos, Screencasts of your Puppy

#36 Post by klu9 »

What if you want to share your video of Puppy or a Puplet? Videos can be very big files and you probably shouldn't offer them from your personal webspace.

Try a video sharing site. Here's a table of different video sharing sites.
http://en.wikipedia.org/wiki/Comparison ... o_services

The most famous is YouTube, but there can be problems. For example, YouTube doesn't accept videos in the .ogg (Theora) format, the usual default for Linux screencasts as it is open-source/free/libre/gratis.

If you don't want to go through the hassle of converting your video to a format acceptable to YouTube, look for services that accept .ogg. The Wikipedia table includes some (scroll down to the "Files" table).

I also found this French site dedicated to Linux videos and using .ogg:
http://www.linuxvideos.fr/
(But note that you can't do it "automatically"; you have to send them an e-mail.)

klu9
Posts: 344
Joined: Wed 27 Jun 2007, 16:02

#37 Post by klu9 »

for files under 100mb in size, a good option seems to be mediafire

www.mediafire.com

Pros
  • Free
  • unlimited bandwidth
  • unlimited time
  • web-2.0-ish file management & upload progress viewing
  • easily share with others
  • downloaders can resume, use download managers, get more than 1 file at a time
Cons
  • file size limit of 100mb
  • no hotlinking (can't offer direct links)
  • upload only thru webpage (no FTP etc)
Last edited by klu9 on Sun 17 Feb 2008, 16:14, edited 1 time in total.

klu9
Posts: 344
Joined: Wed 27 Jun 2007, 16:02

#38 Post by klu9 »

another one I've been trying out is boxstr

www.boxstr.com

Pros
  • free
  • 1GB filesize limit
  • hotlinks allowed (link directly to files, without any page in between)
Cons
  • rolling bandwidth limit (512MB per 24 hours)
  • upload via webpage (or Windows software, limited in free version)
  • a bit ugly :lol:
  • Damn, just saw this in the TOS: "Do not mirror out file downloads" :(
  • no FAQ to answer all my questions

User avatar
Dingo
Posts: 1437
Joined: Tue 11 Dec 2007, 17:48
Location: somewhere at the end of rainbow...
Contact:

#39 Post by Dingo »

klu9 wrote:for files under 100mb in size, a good option seems to be mediafire

www.mediafire.com

Pros
  • Free
  • unlimited bandwidth
  • unlimited time
  • web-2.0-ish file management & upload progress viewing
  • easily share with others
  • downloaders can resume, use download managers, get more than 1 file at a time
Cons
  • file size limit of 100mb
  • no hotlinking (can't offer direct links)
  • upload only thru webpage (no FTP etc)
Yes, MediaFire is a good service; I have already used it for dokupuppy. But I have also read that they may change their file retention policy in the future (no longer unlimited); others say this policy has already changed. What can you say regarding this?
replace .co.cc with .info to get access to stuff I posted in forum
dropbox 2GB free
OpenOffice for Puppy Linux

klu9
Posts: 344
Joined: Wed 27 Jun 2007, 16:02

#40 Post by klu9 »

Dingo wrote:Yes, MediaFire is a good service; I have already used it for dokupuppy. But I have also read that they may change their file retention policy in the future (no longer unlimited); others say this policy has already changed. What can you say regarding this?
Sorry, I have no info regarding this.
BTW I've included dokupuppy in my experimental Google custom search engine, Puppy Linux Super Search.
