Friday, September 04, 2009

squid reverse proxy (aka website accelerator) on ubuntu hardy

We had a bunch of machines all hitting the same URLs on one of our web servers, so we put up a squid reverse proxy in front of it so that those requests would get served and cached by squid instead.

sudo apt-get install squid squid-cgi # squid-cgi enables the cache manager web interface

Edit /etc/squid/squid.conf. It's very well documented, and we only had to modify a few lines from the default ubuntu hardy config:

Allow other local machines to use our cache (the address range below is just an example; substitute your own network):

acl our_networks src 192.168.0.0/16
http_access allow our_networks

instead of the default of:

http_access allow localhost
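
If your clients are spread across more than one subnet, src accepts a list of ranges on a single line (these networks are just examples):

acl our_networks src 192.168.0.0/16 10.0.0.0/8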

Have the cache listen on port 80 and forward all requests to the origin web server (origin.example.com below is a stand-in for your real origin host):

http_port 80
cache_peer origin.example.com parent 80 0 no-query originserver
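
For reference, the fields in that cache_peer line break down like this:

# cache_peer <host> <type> <http-port> <icp-port> [options]
#   parent       - fetch misses from this peer
#   80           - the peer's HTTP port
#   0            - its ICP port (0 disables ICP)
#   no-query     - don't send ICP queries to it
#   originserver - treat it as an origin web server, not another proxy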

In our case, we wanted to cache pages that included "GET parameters" in the URL (which is something you should only do in special cases):

# enable logging of the full URL, so you can see what's going on (though it's a potential privacy risk to your users)
strip_query_terms off
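
With strip_query_terms off, an entry in /var/log/squid/access.log looks something like this (the hosts and URL here are made up for illustration):

1252072800.123     45 192.168.0.7 TCP_MISS/200 5210 GET http://origin.example.com/lookup?q=foo - FIRST_UP_PARENT/origin.example.com text/html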

Comment out the lines that exclude cgi-bin and GET parameter URLs from being cached:

#acl QUERY urlpath_regex cgi-bin \?
#cache deny QUERY
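
Before restarting, it's worth having squid sanity-check the edited config:

sudo squid -k parse # complains about any syntax errors in /etc/squid/squid.conf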

Then we went to: http://localhost/cgi-bin/cachemgr.cgi to see how well our cache was working (blank login and password by default).

After doing an "/etc/init.d/squid restart", we found that we could hit the proxy and get the origin server's pages back, as expected.
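
You can also watch the cache work from the command line; squid tags responses with an X-Cache header (the hostname and URL here are hypothetical):

curl -s -D - -o /dev/null 'http://squid-box.example.com/lookup?q=foo' | grep -i X-Cache
# first request:  X-Cache: MISS from squid-box
# second request: X-Cache: HIT from squid-box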
