Fix waiting for (webpage) freezes and speed up browsing

technosaurus
Posts: 4853
Joined: Mon 19 May 2008, 01:24
Location: Blue Springs, MO

Fix waiting for (webpage) freezes and speed up browsing

#1 Post by technosaurus »

I don't know if I will have time to implement this, but I have been getting annoying "waiting for ..." stalls in all of my browsers, with hosts including ajax.googleapis.com, platform.twitter.com, google-analytics.com, pagead2....

Some of these I just don't need, and I can add them to my /etc/hosts and redirect them to 127.0.0.1 (or 0.0.0.0). But others, like ajax.googleapis.com, are needed by a ton of websites for jQuery and other hosted JS libraries, so my options are to disable it and accept degraded or non-functioning sites, or to keep hitting reload every couple of minutes until it stops hiccuping.

I figured out a third option:
A) run a small webserver
B) create a directory tree that mimics the original hosts' layouts, including the needed files
C) use /etc/hosts to redirect the offenders to 127.0.0.1

e.g. put the latest jQuery at:
$HOME/Web-Server/ajax/libs/jquery/1/jquery.min.js (plus a symlink to jquery.js, and symlinks to the /1/ directory for all known versions)
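
A rough sketch of setting that tree up by hand (the jQuery version numbers and the download source here are only illustrative placeholders, and the download has to happen before the /etc/hosts redirect is in place):

Code: Select all

# build the mirror tree described above and grab one real copy of jquery
mkdir -p "$HOME/Web-Server/ajax/libs/jquery/1"
cd "$HOME/Web-Server/ajax/libs/jquery/1" || exit 1
wget -q "https://ajax.googleapis.com/ajax/libs/jquery/1.9.1/jquery.min.js" -O jquery.min.js
# unversioned name next to it
ln -sf jquery.min.js jquery.js
# make the other known version directories point at the same /1/ directory
cd ..
for v in 1.9.1 1.8.3 1.7.2; do
	ln -sfn 1 "$v"
done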

Configure the webserver to either log 404 (not found) errors or prompt the user about them, so we can use them to pre-cache the missing files locally (this would involve temporarily enabling the sites again so wget can download the files to the appropriate locations).

Any thoughts? Does anyone else get this problem? Is it worth it just for the speed increase?
Check out my [url=https://github.com/technosaurus]github repositories[/url]. I may eventually get around to updating my [url=http://bashismal.blogspot.com]blogspot[/url].

Bligh
Posts: 480
Joined: Sun 08 Jan 2006, 11:05
Location: California

#2 Post by Bligh »

I get that a lot on Comcast cable, particularly on older, slower computers.
Not so much lately, on updated Mozilla browsers.
Cheers

greengeek
Posts: 5789
Joined: Tue 20 Jul 2010, 09:34
Location: Republic of Novo Zelande

#3 Post by greengeek »

Sounds like a great idea to me. I get so sick of waiting for even simple websites to load, especially on older-spec machines with less CPU and slower internet connections. (Some of my machines are connected to the internet by IP wireless, which is faster than dial-up but nowhere near broadband speed.)

Turning off Flash helps a lot, but all the google-analytics redirects etc. seem like such a time-waster.

Barkin
Posts: 803
Joined: Fri 12 Aug 2011, 04:55

Re: Fix waiting for (webpage) freezes and speed up browsing

#4 Post by Barkin »

technosaurus wrote:... I have been getting annoying "waiting for ..." stalls in all of my browsers, with hosts including ajax.googleapis.com, platform.twitter.com, google-analytics.com, pagead2....

Some of these I just don't need, and I can add them to my /etc/hosts and redirect them to 127.0.0.1 (or 0.0.0.0). But others, like ajax.googleapis.com, are needed by a ton of websites for jQuery and other hosted JS libraries, so my options are to disable it and accept degraded or non-functioning sites, or to keep hitting reload every couple of minutes until it stops hiccuping ...
Using the add-ons NoScript and Adblock Plus in the Firefox browser stops all that unwanted stuff. Whitelisting is a feature of NoScript: just permit the stuff you want rather than trying to blacklist everything you don't (as with hosts).
Wildcards are permitted in Adblock Plus filters, so you can block a whole class of object rather than attempt to list every variant (as with hosts).
technosaurus wrote:Any thoughts? Does anyone else get this problem? Is it worth it just for the speed increase?
My browsing is much faster with NoScript and Adblock Plus enabled (in Firefox).
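
For anyone unfamiliar with the filter syntax, a few illustrative Adblock Plus lines (these particular patterns are my own examples, not recommendations from this thread):

Code: Select all

! block an ad host outright
||pagead2.googlesyndication.com^
! wildcard: block any URL containing an /ads/ path segment
/ads/*
! exception (whitelist) rule: never block the Google-hosted JS libraries
@@||ajax.googleapis.com^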

technosaurus
Posts: 4853
Joined: Mon 19 May 2008, 01:24
Location: Blue Springs, MO

#5 Post by technosaurus »

The "waiting for ..." is often a JavaScript library that is needed for the page to function properly; Adblock, NoScript or any other plugin can't magically fix that. I don't want to block these hosts, just not freeze the page while waiting for them to (sometimes never) load. I need to redirect to another URL that doesn't freeze and still gets me the right resources. I _could_ run a paid service that would allow customers to redirect to a hosted mirror ... or I could write free software to host it locally. Honestly, I wish BitTorrent were integrated into browsers other than just Opera, and that JavaScript could use it for resources.

technosaurus
Posts: 4853
Joined: Mon 19 May 2008, 01:24
Location: Blue Springs, MO

#6 Post by technosaurus »

I've set up a basic web server for this purpose. It disregards the subdirectories and just loads the file from the current directory (so, for instance, you only need one jquery.js in one location rather than a mess of directories for each CDN).
It probably still needs some work, specifically around /etc/hosts and/or /etc/nsswitch.conf, but this approach also means it can double as an ad blocker:
/etc/hosts

Code: Select all

127.0.0.1 localhost puppypc25346       #local machine = our webserver
127.0.0.1 ajax.googleapis.com          #a domain we mirror locally
0.0.0.0 pagead2.googlesyndication.com  #a site we want to block


The server (along with an automatic downloader for missing files):

Code: Select all

#!/bin/sh
# tiny caching web server: busybox nc listens on port 80 and hands each
# connection to the inline shell script below
nc -ll -p 80 -e sh -c '
while read -r A B DUMMY
do
	case "$A" in
		[Gg][Ee][Tt])
			# request line: "GET /path/to/file.js?query HTTP/1.x"
			FULL=$B
			F=${FULL##*/}	# keep only the file name...
			F=${F%%\?*}	# ...and strip any ?query string
			# already cached locally: serve it and finish
			[ -f "$F" ] && cat "$F" && break
		;;
		[Hh][Oo][Ss][Tt]*)
			[ -f "$F" ] && break
			# header line: "Host: ajax.googleapis.com" (drop the trailing CR)
			HOST=${B:0:$((${#B}-1))}
			# take "files" out of nsswitch.conf so wget does a real DNS
			# lookup instead of hitting our own 127.0.0.1 entry in /etc/hosts
			sed -i "s/hosts:\t\tfiles /hosts:\t\t/g" /etc/nsswitch.conf
			# -O "$F" so the cached name matches even if the URL had a query string
			wget -t 0 -q --no-dns-cache -O "$F" "$HOST$FULL"
			sed -i "s/hosts:\t\t/hosts:\t\tfiles /g" /etc/nsswitch.conf
			# hand the freshly fetched file to the browser
			cat "$F"
			break
		;;
	esac
done
'
This would be much simpler without the wget part, but then I'd have to come up with a list of files to download (I've already started one, but it's incomplete and I need to verify licenses).
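
As a rough end-to-end check (my own example, not part of the original post), you can fake a browser request by hand and watch the first bytes of the cached file come back, with the server fetching it via wget first if it isn't cached yet:

Code: Select all

printf 'GET /ajax/libs/jquery/1/jquery.min.js HTTP/1.1\r\nHost: ajax.googleapis.com\r\n\r\n' \
	| nc 127.0.0.1 80 | head -c 100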

Moose On The Loose
Posts: 965
Joined: Thu 24 Feb 2011, 14:54

#7 Post by Moose On The Loose »

technosaurus wrote: This would be much simpler without the wget part, but then I'd have to come up with a list of files to download (I've already started one, but it's incomplete and I need to verify licenses)
I think it should keep the wget part, and also publish the local web site for other machines on the local network to use. That way, if you share new work with someone who has a less modern OS, you can also make that machine use the local versions of those files.

We could perhaps make a list that grows organically: as people use the system, each wget adds the file's path to the list of files to be downloaded the next time we do a fresh start of the web server.

The logic can just be to grep the list for the path and add it if it isn't already there. That way it will cleanly recover if a file on the list gets deleted.
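
A minimal sketch of that grep-then-append idea (the list location and the helper name are placeholders I made up; each entry is assumed to be the host+path the server had to fetch):

Code: Select all

LIST=/var/local/mirror-list.txt

# add a URL to the list only if it is not already there
record_url() {
	grep -qxF "$1" "$LIST" 2>/dev/null || echo "$1" >> "$LIST"
}
# in the server, after a successful wget:  record_url "$HOST$FULL"

# on a fresh start of the web server, re-fetch everything on the list
# into the server's working directory
while read -r url; do
	wget -q "$url"
done < "$LIST"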

technosaurus
Posts: 4853
Joined: Mon 19 May 2008, 01:24
Location: Blue Springs, MO

#8 Post by technosaurus »

For other computers on the LAN, you could add an /etc/hosts entry for each mirrored site, such as:
/etc/hosts

Code: Select all

#our local "proxy" is @ 192.168.0.100
192.168.0.100 ajax.googleapis.com
0.0.0.0 pagead2.googlesyndication.com
TODO: output to stdout and then to the file, so we can use tee to get it to the browser on the first try.
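
For that tee idea, the wget-then-cat steps in the server script above could collapse into something like this (just a sketch, keeping the original wget options):

Code: Select all

# stream the download to the browser and into the cache file at the same
# time, instead of waiting for wget to finish before cat-ing the file
wget -t 0 -q --no-dns-cache -O - "$HOST$FULL" | tee "$F"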

Ted Dog
Posts: 3965
Joined: Wed 14 Sep 2005, 02:35
Location: Heart of Texas

#9 Post by Ted Dog »

Great ideas, guys. I have a large hosts file to stop most simple issues, but Google has such deep reach and complexity across so many sites that I can get a blank page at times when I visit social sites. Also consider a similar way to redirect social-media tie-ins: it's getting to the point where you can't close an over-screen social-media popup to see the page below without causing a massive data load from my social-media stuff. It takes minutes just to see what is playing at the movie theater. :x

technosaurus
Posts: 4853
Joined: Mon 19 May 2008, 01:24
Location: Blue Springs, MO

#10 Post by technosaurus »

TODO:
create a blacklist of files to skip
create a graylist of file patterns for updating symlinks (use the latest JS lib versions)
create a whitelist of URL patterns to allow (*/ajaxapis/* but not /ad/*...)
use my mods of musl libc's gethostbyname etc... to bypass /etc/hosts
use axTLS's httpd, because it supports HTTPS
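
A rough sketch of how the blacklist/whitelist checks might slot into the GET handler of the script above (the pattern-file path and the helper name are placeholders, nothing like this exists yet):

Code: Select all

# return success only for URLs we are willing to serve or fetch
allow_url() {
	# blacklist: refuse known ad paths outright
	case "$1" in
		*/ad/*|*pagead*) return 1 ;;
	esac
	# whitelist: only allow URLs matching a known-good pattern
	echo "$1" | grep -qf /etc/mirror/whitelist.txt
}

# usage inside the GET handler, before serving or fetching:
#   allow_url "$FULL" || break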
