Good alternative for Uget downloader

Using applications, configuring, problems
nic007
Posts: 3408
Joined: Sun 13 Nov 2011, 12:31
Location: Cradle of Humankind

Good alternative for Uget downloader

#1 Post by nic007 »

Uget has all the features I need from a downloader (like importing text files), but it's a work in progress and buggy. Are there any alternatives with the same features?

smokey01
Posts: 2813
Joined: Sat 30 Dec 2006, 23:15
Location: South Australia :-(

#2 Post by smokey01 »


nic007
Posts: 3408
Joined: Sun 13 Nov 2011, 12:31
Location: Cradle of Humankind

#3 Post by nic007 »

Thanks smokey01. I'll check some of them.

smokey01
Posts: 2813
Joined: Sat 30 Dec 2006, 23:15
Location: South Australia :-(

#4 Post by smokey01 »

Have you tried wget? It comes with most distributions.

I just tried ProZilla and it's pretty good too: easy to compile, and it's a single binary. Both are CLI programs, but it's easy enough to knock a GUI together.
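
For reference, a basic wget invocation looks something like this (the URL is just a placeholder):

Code:

# resume a partial download (-c) and save it under /root (-P)
wget -c -P /root http://example.com/file.zip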

nic007
Posts: 3408
Joined: Sun 13 Nov 2011, 12:31
Location: Cradle of Humankind

#5 Post by nic007 »

I need something that allows the import of text/html files. Sometimes one wants to download a video that is chopped up into small chunks; this is when specifying all the links in a text file comes in handy.
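
For instance, such a list is just a plain text file with one chunk URL per line (the names below are made up):

Code:

# urls.txt - hypothetical example for a chunked video
http://example.com/video/seg001.ts
http://example.com/video/seg002.ts
http://example.com/video/seg003.ts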

smokey01
Posts: 2813
Joined: Sat 30 Dec 2006, 23:15
Location: South Australia :-(

#6 Post by smokey01 »

I just knocked together a really simple script for ProZilla, or proz as the binary is called.

Copy this code to a script and make it executable.

Code:

#!/bin/sh
# Download each URL passed to (or dropped onto) the script with proz.
for i in "$@"; do
	# quote the URL so characters such as ? and & don't break the command;
	# each download runs in its own xterm and is saved to /root
	xterm -e proz "$i" -r -P /root &
done
Now drag this script to your desktop.

Drag the URL from your browser to this script and your file will be downloaded and saved to /root. I just tried a text file and an html file and both downloaded fine.
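
The script can also be run from a terminal. Assuming it was saved as prozdl (the name is just an example), something like this should work:

Code:

chmod +x prozdl
# each URL opens its own xterm running proz; the file lands in /root
./prozdl http://example.com/file.zip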

nic007
Posts: 3408
Joined: Sun 13 Nov 2011, 12:31
Location: Cradle of Humankind

#7 Post by nic007 »

smokey01 wrote:I just knocked together a really simple script for ProZilla, or proz as the binary is called. [...] Drag the URL from your browser to this script and your file will be downloaded and saved to /root.
Thanks. I'm talking about a text file containing multiple URLs (as in 50+). And I want to use a GUI application that is easy on the eye.

smokey01
Posts: 2813
Joined: Sat 30 Dec 2006, 23:15
Location: South Australia :-(

#8 Post by smokey01 »

Ah, thought as much. Try Kiwi; it's very comprehensive. Download: https://sourceforge.net/projects/wget-gui-kiwi/

What OS do you use? I have just made a package for Fatdog64. It does need Qt4, but then so do many other things, so I always have it loaded.

smokey01
Posts: 2813
Joined: Sat 30 Dec 2006, 23:15
Location: South Australia :-(

#9 Post by smokey01 »

This is what the GUI looks like.
Attachments
1.png – 9.png (nine screenshots of the Kiwi GUI)

nic007
Posts: 3408
Joined: Sun 13 Nov 2011, 12:31
Location: Cradle of Humankind

#10 Post by nic007 »

Thanks for the pointers smokey01. I use a lot of Puppies. I don't think I'm going to download Qt4; it's a bit of an overkill as far as my personal usage is concerned. I think I'll stick with Uget for now and just put up with the odd behaviour every now and then. Actually, the only time I use Uget is on the odd occasion when I want to import a lot of URLs; otherwise I just use the browser's built-in downloader, which is efficient most of the time. I appreciate your contribution.

smokey01
Posts: 2813
Joined: Sat 30 Dec 2006, 23:15
Location: South Australia :-(

#11 Post by smokey01 »

The GUI above is for wget, so you can use wget without the GUI to do what you want. Run wget -h and take a look at all the options.

In particular, look at the following switches:
-i, --input-file=FILE download URLs found in local or external FILE.
-F, --force-html treat input file as HTML.
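
Putting that together, downloading everything listed in a text file would look something like this (urls.txt and the save directory are just examples):

Code:

# download every URL listed in urls.txt, resuming where possible,
# and save the files to /root
wget -c -P /root -i urls.txt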

Cheers
