Daniele Duca



04/01/05 - Release 0.1

With this HOWTO you will (hopefully) be able to set up a transparent proxy with Squid that does virus scanning and content filtering. This is useful for two main things:
  1. Avoiding bandwidth waste. p0rn and warez consume a lot of bandwidth, but fortunately Squid can block many of these sites with blacklists.
  2. Preventing "guests" on your network with unpatched versions of Internet Exploder from getting infected by JPEG viruses. Probably you, like me, don't care much about the guests :), but the bad thing about an infected PC in your network is that virtually anything could happen if the virus installs a backdoor or turns the machine into a zombie.
So, there we go. You will need these things:
  1. A Linux box. I used FC3, but this should apply to any other distribution with a 2.4 or 2.6 kernel
  2. Squid
  3. Clam Antivirus
  4. Squid Clamav Redirector
  5. Netfilter with the REDIRECT target extension compiled.
  6. SRG (Optional, for log analysis)
  7. Apache with PHP (Optional, to see the analysis made by SRG and to use the little virus.php script)
In this little HOWTO I assume you already have Apache and PHP installed and working on your box.

Start by compiling Squid:

# tar xvfj squid-2.5.STABLEX.tar.bz2
# cd squid-2.5.STABLEX
# ./configure --enable-linux-netfilter
# make
# make install

Carefully edit the /usr/local/squid/etc/squid.conf file. READ the ACL section and add your own ACLs to make Squid work in "normal" mode.
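For reference, a minimal "normal mode" ACL set for a single LAN might look like this (just a sketch; the 192.168.0.0/24 subnet is an assumption, adjust it to your network):

```
acl our_network src 192.168.0.0/24
http_access allow our_network
http_access deny all
```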
Initialize and start the proxy

# /usr/local/squid/sbin/squid -z
# /usr/local/squid/bin/RunCache &

Test it by manually adding the proxy in your browser.
After you have successfully tested it, remove the manual configuration from the browser and add these lines to your squid.conf:

httpd_accel_host virtual
httpd_accel_port 80
httpd_accel_with_proxy on
httpd_accel_uses_host_header on

Then, configure the transparent proxy feature with iptables.

# echo 1 > /proc/sys/net/ipv4/ip_forward
# iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 3128

To test if the transparent feature is working, change the default gateway on your PC to the IP of the proxy ..

# route del default
# route add default gw ip.address.of.proxy

.. and try to browse the Internet. If you see your hits in the file /usr/local/squid/var/logs/access.log then it worked!

If you need to allow some IP addresses to surf the web without the proxy, do it with iptables:

# iptables -t nat -I PREROUTING 1 -s ip.that.needs.direct.access -p tcp -m tcp --dport 80 -j RETURN
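If several machines need this, a tiny loop can generate one RETURN rule per address. This sketch only prints the rules (the no_proxy.txt file name and the addresses are made up), so you can review them before piping the output to sh as root:

```shell
# List the IPs that should bypass the proxy (hypothetical file/addresses)
printf '%s\n' 192.168.0.5 192.168.0.23 > no_proxy.txt

# Print one bypass rule per IP; pipe to sh once you are happy with them
while read ip; do
  echo "iptables -t nat -I PREROUTING 1 -s $ip -p tcp -m tcp --dport 80 -j RETURN"
done < no_proxy.txt
```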

Now it's time to install Clam Antivirus.

# tar xvfz clamav-0.XX.tar.gz
# cd clamav-0.XX
# ./configure
# make
# make install

Edit the files /usr/local/etc/clamd.conf and /usr/local/etc/freshclam.conf (just comment out the line with the "Example" statement if you are in a hurry).
Launch clamd (the clam antivirus daemon) and freshclam (the virus database update tool)

# clamd
# freshclam
ClamAV update process started at Tue Jan 4 22:27:19 2005

Now install the SquidClamAV_Redirector. If you want the in-depth instructions, read the project's own documentation; otherwise the minimal survival commands are:

# cp SquidClamAV_Redirector.py /usr/local/bin
# cp SquidClamAV_Redirector.conf /usr/local/squid/etc
# chmod +x /usr/local/bin/SquidClamAV_Redirector.py
# chown squid.squid /usr/local/squid/etc/SquidClamAV_Redirector.conf

Edit the file /usr/local/squid/etc/SquidClamAV_Redirector.conf. Here is mine:

virusurl = http://ip.of.the.proxy/virus.php
cleancache = 300
ForceProtocol = http
MaxRequestsize = 2Mb
log_priority = LOG_INFO
log_facility = LOG_LOCAL6
acceptredirects = 300 301 302 303

Infected = true
Clean = true
Error = true
Ignored = true

pattern = .jpg .exe .zip .rar .ar .com .bzip .gz

http = http://ip.of.the.proxy:3128/

Test if the redirector works:

# /usr/local/bin/SquidClamAV_Redirector.py -c /usr/local/squid/etc/SquidClamAV_Redirector.conf
http://www.freshmeat.net/ FOO FOO BAR
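The "FOO FOO BAR" above is just filler for the extra fields Squid passes along. A redirector receives one request per line on stdin, in the form "URL client_ip/fqdn ident method" (fields Squid does not know are "-"). A more realistic test line, with a made-up client IP, would be:

```shell
# Build a test line in Squid's redirector input format and print it;
# paste it at the redirector's prompt (or pipe it into the script)
echo "http://www.freshmeat.net/ 192.168.0.10/- - GET"
```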

Look in your /var/log/messages (or whatever else you configured) to see if the hit has been logged. If you get errors, it probably means you are missing something needed by the redirector. Check the full installation instructions to solve the error. When you have the redirector working, add these lines to squid.conf:

redirect_program /usr/local/bin/SquidClamAV_Redirector.py -c /usr/local/squid/etc/SquidClamAV_Redirector.conf
redirect_children 5
redirector_access deny localhost

Copy the virus.php sample script into the Apache DocumentRoot and restart Squid:

# squid -k reconfigure

Try to browse somewhere; if everything goes OK, try to download this SAMPLE virus: http://www.eicar.com/download/eicar.com (SAMPLE means that it will not infect your computer, just trigger the AV engine). If everything worked, you will be redirected to the virus.php page, which will complain about an "Eicar-Test-Signature" virus, and you will see in your SquidClamAV_Redirector logs something like:

Jan 4 23:20:05 proxy SquidClamAV: Url: http://www.eicar.com/download/eicar.com Status Infected Eicar-Test-Signature
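If you cannot test from a browser, you can also exercise the signature locally: the EICAR test string is a plain 68-byte file that every AV engine recognizes. A sketch:

```shell
# Write the standard EICAR test string to a file (exactly 68 bytes;
# printf '%s' adds no trailing newline)
printf '%s' 'X5O!P%@AP[4\PZX54(P^)7CC)7}$EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*' > eicar.com
wc -c eicar.com
# ... then scan it: clamscan eicar.com  (should report Eicar-Test-Signature)
```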

This means that the antivirus is working. Add a line to the crontab to check daily for ClamAV updates. This is my cron entry:

# crontab -l
0 4 * * * /usr/local/bin/freshclam > /dev/null 2>&1

Now we are going to set up the blacklists to block porn and warez sites. I packed my current (4/1/2005) blacklists in this file (initially based on the blacklists from Squidguard.org). Download and unpack it into /usr/local/squid/etc:

# tar xvfj blacklist.tar.bz2
# chown -R squid.squid blacklists/*

Add these ACLs to your squid.conf file:

acl porndomains dstdomain "/usr/local/squid/etc/blacklists/porn/domains.ok"
acl pornips dst "/usr/local/squid/etc/blacklists/porn/ips.ok"
acl warezdomains dstdomain "/usr/local/squid/etc/blacklists/warez/domains.ok"
acl adsdomains dstdomain "/usr/local/squid/etc/blacklists/ads/domains.ok"
http_access deny warezdomains
http_access deny porndomains
http_access deny pornips
http_access deny adsdomains

The "domains.ok" files contain domains in the format ".domain.com"; the "ips.ok" files contain only IP addresses. You can also block URLs by using regular expressions:

acl pornurlregex url_regex -i ^http://this.is.a.porn/url
http_access deny pornurlregex
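To check offline whether a host would be caught, you can approximate Squid's dstdomain suffix matching with a couple of lines of shell. This sketch uses a one-entry stand-in for the real domains.ok file:

```shell
# dstdomain-style matching: an entry ".playboy.com" blocks the domain
# itself and every host under it
host=www.playboy.com
echo ".playboy.com" > domains.ok          # stand-in for the real blacklist
while read suffix; do
  case ".$host" in
    *"$suffix") echo "blocked: $host" ;;
  esac
done < domains.ok
```

Running it prints "blocked: www.playboy.com"; an unlisted host produces no output.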

Restart squid with squid -k reconfigure and try to go to a blocked URL, like http://www.playboy.com. You will get an "Access denied" page in the browser and a TCP_DENIED/403 line in Squid's access.log.

The last thing to do is to set up SRG for log analysis. To do that, unpack and compile SRG:

# tar xvfj srg-X.X.tar.bz2
# cd srg-X.X
# make
# make install

Create a /bin/proxyanalyze script, make it executable (chmod +x /bin/proxyanalyze) and add it to the crontab:

# cat /bin/proxyanalyze
#!/bin/sh
date=$(date -d yesterday +%d/%m/%Y-%d/%m/%Y)
/usr/local/sbin/srg -H -r -R -S -t $date -f /usr/local/squid/var/logs/access.log -g A -o /usr/local/apache2/htdocs/proxy
# crontab -l
10 0 * * * /bin/proxyanalyze > /dev/null 2>&1
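The -t argument in proxyanalyze hands SRG a start-end date range; since we only want yesterday's traffic, both ends of the range are the same day (this relies on GNU date, which FC3 has):

```shell
# Print yesterday twice in dd/mm/yyyy-dd/mm/yyyy form, e.g. a run on
# 4 January 2005 would print 03/01/2005-03/01/2005
date -d yesterday +%d/%m/%Y-%d/%m/%Y
```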

With this cron job you will have, every day, a report accessible at the URL http://ip.address.of.proxy/proxy containing traffic details for every IP that used the proxy in the last 24 hours. Very useful ;)

Well done! From now on you should only need to check SRG's reports and occasionally add new domains to the blacklists. Speaking of hardware, I used a PIII 600MHz with 128MB of RAM to handle traffic from about 100 users, and it works fine without bottlenecks. Just be careful when you do squid -k reconfigure (typically after adding new domains to the blacklists): the server will hang for some seconds because it needs to re-read the blacklist files, so navigation will slow down a little.

As usual, feel free to contact me if you need more information.


No liability for the contents of this document can be accepted. Use the concepts, examples and other content at your own risk. There may be errors and inaccuracies that may damage your system. Proceed with caution, and although this is highly unlikely, the author does not and cannot take any responsibility for any damage to your system that may occur as a direct or indirect result of information contained within this document. You are strongly recommended to make a backup of your system before proceeding and to adhere to the practice of backing up at regular intervals.

Information on this page is released under the GNU FDL License
This page last updated: 05/01/05