The Perfect SpamSnake - Ubuntu Jeos 12.04 LTS Precise Pangolin - Page 4

14. KAM

vi /etc/cron.daily/

with the following content:

 # Original version modified by Andrew MacLachlan (
 # Added additional MailScanner restarts on initial restart failure
 # Made script run silently for normal (successful) operation
 # Increased UPDATEMAXDELAY to 900 from 600
 # Insert a random delay up to this value, to spread virus updates round
 # the clock. 1800 seconds = 30 minutes.
 # Set this to 0 to disable it.
 if [ -f /opt/MailScanner/var/MailScanner ] ; then
 . /opt/MailScanner/var/MailScanner
 if [ "x$UPDATEMAXDELAY" = "x0" ]; then
 logger -p -t Delaying cron job up to $UPDATEMAXDELAY seconds
 perl -e "sleep int(rand($UPDATEMAXDELAY));"
 # JKF Fetch
 #echo Fetching
 cd /etc/mail/spamassassin
 rm -f
 wget -O > /dev/null 2>&1
 if [ "$?" = "0" ]; then
 #echo It completed and fetched something
 if ( tail -10 | grep -q '^#.*EOF' ); then
 # echo It succeeded so make a backup
 cp -f
 echo ERROR: Could not find EOF marker
 cp -f
 echo It failed to complete properly
 cp -f
 #echo Reloading MailScanner and SpamAssassin configuration rules
 /etc/init.d/mailscanner reload > /dev/null 2>&1
 if [ $? != 0 ] ; then
 echo "MailScanner reload failed - Retrying..."
 /etc/init.d/mailscanner force-reload
 if [ $? = 0 ] ; then
 echo "MailScanner reload succeeded."
 echo "Stopping MailScanner..."
 /etc/init.d/mailscanner stop
 echo "Waiting for a minute..."
 perl -e "sleep 60;"
 echo "Attemping to start MailScanner..."
 /etc/init.d/mailscanner start

Make it executable:

chmod +x /etc/cron.daily/
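The script opens with a random-delay idiom: because every SpamSnake running this cron.daily job would otherwise hit the download site at the same moment, it first sleeps a random number of seconds up to UPDATEMAXDELAY. A minimal sketch of that idiom, with a made-up value in place of the one sourced from /opt/MailScanner/var/MailScanner:

```shell
#!/bin/sh
# UPDATEMAXDELAY normally comes from /opt/MailScanner/var/MailScanner;
# the value here is for demonstration only.
UPDATEMAXDELAY=2

# Skip the delay entirely when it is set to 0; otherwise sleep a random
# number of seconds in [0, UPDATEMAXDELAY) to spread fetches around the clock.
if [ "x$UPDATEMAXDELAY" != "x0" ]; then
    perl -e "sleep int(rand($UPDATEMAXDELAY));"
fi
echo "update would run here"
```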


15. ScamNailer

vi /opt/MailScanner/bin/update_scamnailer

with the following content:

# (c) 2009 Julian Field ‹›
#          Version 2.05
# This file is the copyright of Julian Field ‹›,
# and is made freely available to the entire world. If you intend to
# make any money from my work, please contact me for permission first!
# If you just want to use this script to help protect your own site's
# users, then you can use it and change it freely, but please keep my
# name and email address at the top.
use strict;
use File::Temp;
use Net::DNS::Resolver;
use LWP::UserAgent;
use FileHandle;
use DirHandle;
# Filename of list of extra addresses you have added, 1 per line.
# Does not matter if this file does not exist.
my $local_extras = '/etc/MailScanner/ScamNailer.local.addresses';
# Output filename, goes into SpamAssassin. Can be over-ridden by just
# adding the output filename on the command-line when you run this script.
my $output_filename = '/etc/mail/spamassassin/';
# This is the location of the cache used by the DNS-based updates to the
# phishing database.
my $emailscurrent = '/var/cache/ScamNailer/';
# Set this next value to '' if you are not using MailScanner.
# Or else change it to any command you need to run after updating the
# SpamAssassin rules, such as '/sbin/service spamd restart'.
my $mailscanner_restart = '/etc/init.d/mailscanner force-reload';
# The SpamAssassin score to assign to the final rule that fires if any of
# the addresses hit. Multiple hits don't increase the score.
# I use a score of 0.1 with this in MailScanner.conf:
# SpamAssassin Rule Actions = SCAMNAILER=>not-deliver,store,forward, header "X-Anti-Phish: Was to _TO_"
# If you don't understand that, read the section of MailScanner.conf about the
# "SpamAssassin Rule Actions" setting.
my $SA_score = 4.0;
# How complicated to make each rule. 20 works just fine, leave it alone.
my $addresses_per_rule = 20;
my $quiet = (grep /quiet|silent/, @ARGV) ? 1 : 0;
if (grep /help/, @ARGV) {
  print STDERR "Usage: $0 [ --quiet ]\n";
  exit 0;
}
my($count, $rule_num, @quoted, @addresses, @metarules);
#local(*YPCAT, *SACF);
$output_filename = $ARGV[0] if $ARGV[0]; # Use filename if they gave one
# First do all the addresses we read from DNS and anycast and only do the
# rest if needed.
if (GetPhishingUpdate()) {
open(SACF, ">$output_filename") or die "Cannot write to $output_filename $!";
print SACF "# ScamNailer rules\n";
print SACF "# Generated by $0 at " . `date` . "\n";
# Now read all the addresses we generated from GetPhishingUpdate().
open(PHISHIN, $emailscurrent . 'phishing.emails.list')
  or die "Cannot read " . $emailscurrent . "phishing.emails.list, $!\n";
while(<PHISHIN>) {
  next if /^\s*$/;
  next unless /^[^@]+\@[^@]+$/;
  push @addresses, $_; # This is for the report
  s/[^0-9a-z_-]/\\$&/ig; # Quote every non-alnum
  s/\\\*/[0-9a-z_.+-]*/g; # Unquote any '*' characters as they map to .*
  # Find all the numbers just before the @ and replace them with digit wildcards
  #push @quoted, '(' . $_ . ')';
  push @quoted, $_;
  $count++;
  if ($count % $addresses_per_rule == 0) {
    # Put them in 10 addresses at a time
    # Put a start-of-line/non-address character at the front,
    # and an end-of-line /non-address character at the end.
    print SACF "header __SCAMNAILER_H$rule_num ALL =~ /" .
               '(^|[;:\s])(?:' . join('|',@quoted) . ')($|[^0-9a-z_.+-])' . "/i\n";
    push @metarules, "__SCAMNAILER_H$rule_num";
    print SACF "uri __SCAMNAILER_B$rule_num /" .
               '^mailto:(?:' . join('|',@quoted) . ')$' . "/i\n";
    push @metarules, "__SCAMNAILER_B$rule_num";
    undef @quoted;
    undef @addresses;
close PHISHIN;
# Put in all the leftovers, if any
if (@quoted) {
    print SACF "header __SCAMNAILER_H$rule_num ALL =~ /" .
               '(^|[;:\s])(?:' . join('|',@quoted) . ')($|[^0-9a-z_.+-])' . "/i\n";
    push @metarules, "__SCAMNAILER_H$rule_num";
    print SACF "uri __SCAMNAILER_B$rule_num /" .
               '^mailto:(?:' . join('|',@quoted) . ')$' . "/i\n";
    push @metarules, "__SCAMNAILER_B$rule_num";
print SACF "\n# ScamNailer combination rule\n\n";
print SACF "meta     SCAMNAILER " . join(' || ',@metarules) . "\n";
print SACF "describe SCAMNAILER Mentions a spear-phishing address\n";
print SACF "score    SCAMNAILER $SA_score\n\n";
print SACF "# ScamNailer rules ($count) END\n";
close SACF;
# And finally restart MailScanner to use the new rules
$mailscanner_restart .= " >/dev/null 2>&1" if $quiet;
system($mailscanner_restart) if $mailscanner_restart;
exit 0;
sub GetPhishingUpdate {
  my $cache = $emailscurrent . 'cache/';
  my $status = $emailscurrent . 'status';
  my $urlbase = "";
  my $target= $emailscurrent . 'phishing.emails.list';
  my $query="";
  my $baseupdated = 0;
  if (! -d $emailscurrent) {
    print "Working directory is not present - making....." unless $quiet;
    mkdir ($emailscurrent) or die "failed";
    print " ok!\n" unless $quiet;
  if (! -d $cache) {
    print "Cache directory is not present - making....." unless $quiet;
    mkdir ($cache) or die "failed";
    print " ok!\n" unless $quiet;
  if (! -s $target) {
    open (FILE,">$target") or die
      "Failed to open target file so creating a blank file";
    print FILE "# Wibble";
    close FILE;
  } else {
    # So that clean quarantine doesn't delete it!
    utime(time(), time(), $emailscurrent);
  my ($status_base, $status_update);
  if (! -s $status) {
    print "This is the first run of this program.....\n" unless $quiet;
  } else {
    print "Reading status from $status\n" unless $quiet;
    open(STATUS_FILE, $status) or die "Unable to open status file\n";
    my $line=<STATUS_FILE>;
    close (STATUS_FILE);
    # The status file is text.text
    if ($line =~ /^(.+)\.(.+)$/) {
  print "Checking that $cache$status_base exists..." unless $quiet;
  if ((! -s "$cache$status_base") && (!($status_base eq "-1"))) {
    print " no - resetting....." unless $quiet;
  print " ok\n" unless $quiet;
  print "Checking that $cache$status_base.$status_update exists..." unless $quiet;
  if ((! -s "$cache$status_base.$status_update") && ($status_update>0)) {
    print " no - resetting....." unless $quiet;
  print " ok\n" unless $quiet;
  my $currentbase = -1;
  my $currentupdate = -1;
  # Let's get the current version
  my $res = Net::DNS::Resolver->new();
  my $RR = $res->query($query, 'TXT');
  my @result;
  if ($RR) {
    foreach my $rr ($RR->answer) {
      my $text = $rr->rdatastr;
      if ($text =~ /^"emails\.(.+)\.(.+)"$/) {
  die "Failed to retrieve valid current details\n" if $currentbase eq "-1";
  print "I am working with: Current: $currentbase - $currentupdate and Status: $status_base - $status_update\n" unless $quiet;
  my $generate=0;
  # Create a user agent object
  my $ua = LWP::UserAgent->new;
  $ua->agent("UpdateBadPhishingSites/0.1 ");
  # Patch from
  if (!($currentbase eq $status_base)) {
    print "This is base update\n" unless $quiet;
    $status_update = -1;
    $baseupdated = 1;
    # Create a request
    #print "Getting $urlbase . $currentbase\n" unless $quiet;
    my $req = HTTP::Request->new(GET => $urlbase.$currentbase);
    # Pass request to the user agent and get a response back
    my $res = $ua->request($req);
    # Check the outcome of the response
    if ($res->is_success) {
      open (FILE, ">$cache/$currentbase") or die "Unable to write base file ($cache/$currentbase)\n";
      print FILE $res->content;
      close (FILE);
    } else {
      warn "Unable to retrieve $urlbase.$currentbase :".$res->status_line, "\n";
  } else {
    print "No base update required\n" unless $quiet;
  # Now see if the sub version is different
  if (!($status_update eq $currentupdate)) {
    my %updates=();
    print "Update required\n" unless $quiet;
    if ($currentupdate<$status_update) {
      # In the unlikely event we roll back a patch - we have to go from the base
      print "Error!: $currentupdate<$status_update\n" unless $quiet;
      $generate = 1;
      $status_update = 0;
    # If there are updates available and we haven't downloaded them
    # yet we need to reset the counter
    if ($currentupdate>0) {
      if ($status_update<1) {
      my $i;
      # Loop through each of the updates, retrieve it and then add
      # the information into the update array
      for ($i=$status_update+1; $i<=$currentupdate; $i++) {
        print "Retrieving $urlbase$currentbase.$i\n" unless $quiet;
        #print "Getting $urlbase . $currentbase.$i\n" unless $quiet;
        my $req = HTTP::Request->new(GET => $urlbase.$currentbase.".".$i);
        my $res = $ua->request($req);
        warn "Failed to retrieve $urlbase$currentbase.$i"
          unless $res->is_success;
        my $line;
        foreach $line (split("\n", $res->content)) {
          # Is it an addition?
          if ($line =~ /^\> (.+)$/) {
            if (defined $updates{$1}) {
              if ($updates{$1} eq "<") {
                delete $updates{$1};
            } else {
          # Is it an removal?
          if ($line =~ /^\< (.+)$/) {
            if (defined $updates{$1}) {
              if ($updates{$1} eq ">") {
                delete $updates{$1};
            } else {
      # OK do we have a previous version to work from?
      if ($status_update>0) {
        # Yes - we open the most recent version
        open (FILE, "$cache$currentbase.$status_update") or die
          "Unable to open base file ($cache/$currentbase.$status_update)\n";
      } else {                        # No - we open the the base file
        open (FILE, "$cache$currentbase") or die
          "Unable to open base file ($cache/$currentbase)\n";
      # Now open the new update file
      print "$cache$currentbase.$currentupdate\n" unless $quiet;
      open (FILEOUT, ">$cache$currentbase.$currentupdate") or die
        "Unable to open new base file ($cache$currentbase.$currentupdate)\n";
      # Loop through the base file (or most recent update)
      while (<FILE>) {
        my $line=$_;
        if (defined ($updates{$line})) {
          # Does the line need removing?
          if ($updates{$line} eq "<") {
          # Is it marked as an addition but already present?
          elsif ($updates{$line} eq ">") {
            delete $updates{$line};
        print FILEOUT $line."\n";
      close (FILE);
      my $line;
      # Are there any additions left
      foreach $line (keys %updates) {
        if ($updates{$line} eq ">") {
          print FILEOUT $line."\n" ;
      close (FILEOUT);
  # Changes have been made
  if ($generate) {
    print "Updating live file $target\n" unless $quiet;
    my $file="";
    if ($currentupdate>0) {
    } else {
    if ($file eq "") {
      die "Unable to work out file!\n";
    system ("mv -f $target $target.old");
    system ("cp $file $target");
    open(STATUS_FILE, ">$status") or die "Unable to open status file\n";
    print STATUS_FILE "$currentbase.$currentupdate\n";
    close (STATUS_FILE);
  my $queuedir = new DirHandle;
  my $file;
  my $match1 = "^" . $currentbase . "\$";
  my $match2 = "^" . $currentbase . "." . $currentupdate . "\$";
  $queuedir->open($cache) or die "Unable to do clean up\n";
  while(defined($file = $queuedir->read())) {
    next if $file eq '.' || $file eq '..';
    next if $file =~ /$match1/;
    next if $file =~ /$match2/;
    print "Deleting cached file: $file.... " unless $quiet;
    unlink($cache.$file) or die "failed";
    print "ok\n" unless $quiet;

Make it executable:

chmod +x /opt/MailScanner/bin/update_scamnailer

 Add it to cron:

@daily /opt/MailScanner/bin/update_scamnailer &> /dev/null #Update ScamNailer


16. Firewalling the SpamSnake with Firehol

Firehol is a stateful iptables packet-filtering firewall configurator. It is abstract, extensible, easy to use, and powerful. It can build any kind of firewall, but most importantly, it lets you configure the firewall the same way you think about it.

Install Firehol:

apt-get install firehol -y

vi /etc/default/firehol

and change the following:


vi /etc/firehol/firehol.conf

and add the following:

version 5
   # Accept all client traffic on any interface
   interface any internet
   protection strong
   server "icmp ping ssh http https telnet webmin dns dcc echo smtp" accept
   client all accept

This filters all incoming connections that are not related to the above services. If you want to be less polite, you can drop them by adding the following after 'protection strong': policy drop
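For example, a stricter variant of the same configuration with a default-drop policy could look like this (a sketch only; keep whichever service list actually matches your setup):

```
version 5

interface any internet
    protection strong
    policy drop                # silently drop anything not matched below
    server "icmp ping ssh http https telnet webmin dns dcc echo smtp" accept
    client all accept
```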

vi /usr/sbin/get-iana

with the following content:

 # $Id:,v 1.13 2010/09/12 13:55:00 jcb Exp $
   # $Log:,v $
   # Revision 1.13 2010/09/12 13:55:00 jcb
   # Updated for latest IANA reservations format.
   # Revision 1.12 2008/03/17 22:08:43 ktsaou
   # Updated for latest IANA reservations format.
   # Revision 1.11 2007/06/13 14:40:04 ktsaou
   # *** empty log message ***
   # Revision 1.10 2007/05/05 23:38:31 ktsaou
   # Added support for external definitions of:
   # in files under the same name in /etc/firehol/.
   # Only RESERVED_IPS is mandatory (firehol will complain if it is not  there,
   # but it will still work without it), and is also the only file that  firehol
   # checks how old is it. If it is 90+ days old, firehol will complain  again.
   # Changed the supplied script to generate the RESERVED_IPS  file.
   # FireHOL also instructs the user to use this script if the file is  missing
   # or is too old.
   # Revision 1.9 2007/04/29 19:34:11 ktsaou
   # *** empty log message ***
   # Revision 1.8 2005/06/02 15:48:52 ktsaou
   # Allowed to be in RESERVED_IPS
   # Revision 1.7 2005/05/08 23:27:23 ktsaou
   # Updated RESERVED_IPS to current IANA reservations.
   # Revision 1.6 2004/01/10 18:44:39 ktsaou
   # Further optimized and reduced PRIVATE_IPS using:
   # The supplied uses .aggregate. if it finds it in the path.
   # (aggregate is the name of this program when installed on Gentoo)
   # Revision 1.5 2003/08/23 23:26:50 ktsaou
   # Bug #793889:
   # Change #!/bin/sh to #!/bin/bash to allow FireHOL run on systems that
   # bash is not linked to /bin/sh.
   # Revision 1.4 2002/10/27 12:44:42 ktsaou
   # CVS test
   # Program that downloads the IPv4 address space allocation by IANA
   # and creates a list with all reserved address spaces.
 # The program will match all rows in the file which start with a number,
   # have a slash, followed by another number, and for which the following
   # pattern will also match on the same rows.
 # Which rows matched by the above should be ignored
   # (i.e. not included in RESERVED_IPS)?
   #IANA_IGNORE="(Multicast|Private use|Loopback|Local Identification)"
 AGGREGATE="`which aggregate 2>/dev/null`"
   if [ -z "${AGGREGATE}" ]
   AGGREGATE="`which aggregate 2>/dev/null`"
 if [ -z "${AGGREGATE}" ]
   echo >&2
   echo >&2
   echo >&2 "WARNING"
   echo >&2 "Please install 'aggregate' to shrink the list of  IPs."
   echo >&2
   echo >&2
 echo >&2
   echo >&2 "Fetching IANA IPv4 Address Space, from:"
   echo >&2 "${IPV4_ADDRESS_SPACE_URL}"
   echo >&2
 wget -O - --proxy=off "${IPV4_ADDRESS_SPACE_URL}" |\
   egrep " *[0-9]+/[0-9]+.*${IANA_RESERVED}" |\
   egrep -vi "${IANA_IGNORE}" |\
   sed -e 's:^ *\([0-9]*/[0-9]*\).*:\1:' |\
 while IFS="/" read range net
   if [ ! $net -eq 8 ]
   echo >&2 "Cannot handle network masks of $net bits  ($range/$net)"
 first=`echo $range | cut -d '-' -f 1`
   first=`expr $first + 0`
   last=`echo $range | cut -d '-' -f 2`
   last=`expr $last + 0`
   while [ ! $x -gt $last ]
   # test $x -ne 127 && echo "$x.0.0.0/$net"
   echo "$x.0.0.0/$net"
   x=$[x + 1]
   ) | \
   if [ ! -z "${AGGREGATE}" -a -x "${AGGREGATE}" ]
   ) >"${tempfile}"
 echo >&2
   echo >&2
   printf "RESERVED_IPS=\""
   for x in `cat ${tempfile}`
   i=$[i + 1]
   printf "${x} "
   printf "\"\n"
 if [ $i -eq 0 ]
   echo >&2
   echo >&2
   echo >&2 "Failed to find reserved IPs."
   echo >&2 "Possibly the file format has been changed, or I  cannot fetch the URL."
   echo >&2
 rm -f ${tempfile}
   exit 1
   echo >&2
   echo >&2
   echo >&2 "Differences between the fetched list and the list  installed in"
   echo >&2 "/etc/firehol/RESERVED_IPS:"
 echo >&2 "# diff /etc/firehol/RESERVED_IPS  ${tempfile}"
   diff /etc/firehol/RESERVED_IPS ${tempfile}
 if [ $? -eq 0 ]
   echo >&2
   echo >&2 "No  differences found."
   echo >&2
 rm -f ${tempfile}
   exit 0
 echo >&2
   echo >&2
   echo >&2 "Would you like to save this list to  /etc/firehol/RESERVED_IPS"
   echo >&2 "so that FireHOL will automatically use it from  now on?"
   echo >&2
   while [ 1 = 1 ]
   printf >&2 "yes or no > "
   read x
 case "${x}" in
   yes) cp -f /etc/firehol/RESERVED_IPS /etc/firehol/RESERVED_IPS.old  2>/dev/null
   cat "${tempfile}" >/etc/firehol/RESERVED_IPS || exit 1
   echo >&2 "New RESERVED_IPS written to  '/etc/firehol/RESERVED_IPS'."
   echo "Firehol will now be restart"
   sleep 3
   /etc/init.d/firehol restart
   echo >&2 "Saved nothing."
 *) echo >&2 "Cannot understand '${x}'."
 rm -f ${tempfile}

Make it executable:

chmod +x /usr/sbin/get-iana
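The row-expansion logic inside get-iana is worth seeing in isolation: the IANA file lists /8 allocations as zero-padded ranges such as 001-003, and the script emits one x.0.0.0/8 network per value in the range. A standalone sketch of that loop (the sample range is made up):

```shell
#!/bin/sh
# Expand an IANA-style /8 range into individual networks, as get-iana
# does for each matched row. "001-003" is a made-up sample range.
range="001-003"
net=8

first=`echo $range | cut -d '-' -f 1`
first=`expr $first + 0`     # strip the leading zeros
last=`echo $range | cut -d '-' -f 2`
last=`expr $last + 0`

x=$first
while [ ! $x -gt $last ]
do
    echo "$x.0.0.0/$net"
    x=`expr $x + 1`
done
```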

vi /usr/sbin/update-iana

with the following content:

 /usr/sbin/get-iana  < /etc/firehol/get-iana-answerfile

Make it executable:

chmod +x /usr/sbin/update-iana

vi /etc/firehol/get-iana-answerfile

with the following content:

yes

Run the script to update RESERVED_IPS:

/usr/sbin/update-iana

*Note: Now your server is set up to only accept connections for the services you allowed.

Add it to cron:

@monthly /usr/sbin/update-iana &> /dev/null #Update firehol reserved ips


17. Apply Relay Recipients (Optional)

The following directions are meant for people using Microsoft Exchange 2000 or Microsoft Exchange 2003.

This page describes how to configure your mail gateway to periodically fetch a list of valid recipient email addresses from your Exchange system. With that list in place, you can configure your server to automatically reject any email addressed to an invalid address. This reduces the load on your Exchange server, which no longer has to generate non-delivery reports, and on your Postfix server, which no longer has to spam- and virus-scan those messages.
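The mechanism behind this is a plain Postfix lookup table: relay_recipient_maps points at a hash map whose keys are the valid addresses, and mail to any address not listed is rejected during the SMTP dialogue. The map file the script generates looks like this (the addresses are invented examples):

```
# /etc/postfix/relay_recipients -- one valid recipient per line,
# compiled into a hash db with "postmap".
user1@example.com    OK
user2@example.com    OK
```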

Install the perl module Net::LDAP:

perl -MCPAN -e shell
install Net::LDAP

vi /usr/bin/

with the following content:

#!/usr/bin/perl -T -w
   # This script will pull all users' SMTP addresses from your Active Directory
   # (including primary and secondary email addresses) and list them in the
   # format " OK" which Postfix uses with relay_recipient_maps.
   # Be sure to double-check the path to perl above.
   # This requires Net::LDAP to be installed.  To install Net::LDAP, at a shell
   # type "perl -MCPAN -e shell" and then "install Net::LDAP"
   use Net::LDAP;
   use Net::LDAP::Control::Paged;
   use Net::LDAP::Constant ( "LDAP_CONTROL_PAGED" );
   # Enter the path/file for the output
   $VALID = "/etc/postfix/relay_recipients";
   open VALID, ">$VALID" or die "CANNOT OPEN $VALID $!";
   # Enter the FQDN of your Active Directory domain controllers below
   # Enter the LDAP container for your userbase.
   # The syntax is CN=Users,dc=example,dc=com
   # This can be found by installing the Windows 2000 Support Tools
   # then running ADSI Edit.
   # In ADSI Edit, expand the "Domain NC []" &
   # you will see, for example, DC=example,DC=com (this is your base).
   # The Users Container will be specified in the right pane as
   # CN=Users depending on your schema (this is your container).
   # You can double-check this by clicking "Properties" of your user
   # folder in ADSI Edit and examining the "Path" value, such as:
   # LDAP://,DC=example,DC=com
   # which would be $hqbase="cn=Users,dc=example,dc=com"
   # Note:  You can also use just $hqbase="dc=example,dc=com"
   # Enter the username & password for a valid user in your Active Directory
   # with username in the form cn=username,cn=Users,dc=example,dc=com
   # Make sure the user's password does not expire.  Note that this user
   # does not require any special privileges.
   # You can double-check this by clicking "Properties" of your user in
   # ADSI Edit and examining the "Path" value, such as:
   # LDAP://,CN=Users,DC=example,DC=com
   # which would be $user="cn=user,cn=Users,dc=example,dc=com"
   # Note: You can also use the UPN login: "user\"
   # Connecting to Active Directory domain controllers
   $noldapserver = 0;
   $ldap = Net::LDAP->new($dc1) or $noldapserver = 1;
   if ($noldapserver == 1)  {
   $ldap = Net::LDAP->new($dc2) or
   die "Error connecting to specified domain controllers $@ \n";
   }
   $mesg = $ldap->bind ( dn => $user,
   password =>$passwd);
   if ( $mesg->code()) {
   die ("error:", $mesg->error_text(), "\n");
   }
   # How many LDAP query results to grab for each paged round
   # Set to under 1000 for Active Directory
   $page = Net::LDAP::Control::Paged->new( size => 990 );
   @args = ( base     => $hqbase,
   # Play around with this to grab objects such as Contacts, Public Folders, etc.
   # A minimal filter for just users with email would be:
   # filter => "(&(sAMAccountName=*)(mail=*))"
   filter => "(& (mailnickname=*) (| (&(objectCategory=person)
   (objectCategory=group)(objectCategory=publicFolder) ))",
   control  => [ $page ],
   attrs  => "proxyAddresses",
   my $cookie;
   while(1) {
   # Perform search
   my $mesg = $ldap->search( @args );
   # Filtering results for proxyAddresses attributes
   foreach my $entry ( $mesg->entries ) {
   my $name = $entry->get_value( "cn" );
   # LDAP Attributes are multi-valued, so we have to print each one.
   foreach my $mail ( $entry->get_value( "proxyAddresses" ) ) {
   # Test if the Line starts with one of the following lines:
   # proxyAddresses: [smtp|SMTP]:
   # and also discard this starting string, so that $mail is only the
   # address without any other characters...
   if ( $mail =~ s/^(smtp|SMTP)://gs ) {
   print VALID $mail." OK\n";
   # Only continue on LDAP_SUCCESS
   $mesg->code and last;
   # Get cookie from paged control
   my($resp)  = $mesg->control( LDAP_CONTROL_PAGED ) or last;
   $cookie    = $resp->cookie or last;
   # Set cookie in paged control
   if ($cookie) {
   # We had an abnormal exit, so let the server know we do not want any more
   $ldap->search( @args );
   # Also would be a good idea to die unhappily and inform OP at this point
   die("LDAP query unsuccessful");
   # Add additional restrictions, users, etc. to the output file below.
   #print VALID "user\ OK\n";
   #print VALID "user\ 550 User unknown.\n";
   #print VALID " 550 User does not exist.\n";
  close VALID;

Make it executable:

chmod +x /usr/bin/

Edit the file to customize it for your specific domain. Since the file is read only, you will need to use :w! to save the file in vi.

1. Set $dc1 and $dc2 to the fully qualified domain names or IP addresses of 2 of your domain controllers.
2. Set $hqbase equal to the LDAP path to the container or organizational unit which holds the email accounts for which you wish to get the email addresses.
3. Set $user and $passwd to indicate which user account should be used to access this information. This account only needs to be a member of the domain, so it would be a good idea to setup an account specifically for this.

Try running the script. If it works correctly, it will create /etc/postfix/relay_recipients.

*Note: If your Postfix server is separated from your Active Directory domain controllers by a firewall, you will need to open TCP port 389 from the Postfix server to the domain controllers.

At this point, you may want to edit /etc/postfix/relay_recipients and edit out any unwanted email addresses as this script imports everything.

Postmap the file to create the hash db:

postmap /etc/postfix/relay_recipients
postfix reload

Finally, you may want to set up a cron job to periodically update and build the /etc/postfix/relay_recipients.db file. You can set up a script called /usr/bin/ (Optional)

vi /usr/bin/

with the following content:

#!/bin/sh
postmap /etc/postfix/relay_recipients
postfix reload

Make it executable:

chmod +x /usr/bin/

Don't forget to make sure the following is in your /etc/postfix/ file:

relay_recipient_maps = hash:/etc/postfix/relay_recipients

Add it to cron:

30 2 * * * /usr/bin/ #synchronize relay_recipients with Active Directory addresses

*Note: This cron job will run every day at 2:30 AM to update the database file. You may want to run yours more frequently or not depending on how often you add new email users to your system.


18. Install Webmin (Optional):

vi /etc/apt/sources.list

and add the following:

deb sarge contrib
deb sarge contrib

Install the GPG Key along with the package:

apt-key add jcameron-key.asc
apt-get update
apt-get install webmin -y

*Note: All dependencies should be resolved automatically.

Now to access webmin open your browser and enter: http://serverip:10000/


19. Automatically Add A Disclaimer To Outgoing Emails With alterMIME (Optional)

Install alterMIME:

apt-get install altermime -y

Next we create the user filter with the home directory /var/spool/filter - alterMIME will be run as that user:

useradd -r -c "Postfix Filters" -d /var/spool/filter filter
mkdir /var/spool/filter
chown filter:filter /var/spool/filter
chmod 750 /var/spool/filter

Afterwards we create the script /etc/postfix/disclaimer which executes alterMIME. Ubuntu's alterMIME package comes with a sample script that we can simply copy to /etc/postfix/disclaimer:

cp /usr/share/doc/altermime/examples/ /etc/postfix/disclaimer
chgrp filter /etc/postfix/disclaimer
chmod 750 /etc/postfix/disclaimer

Now the problem with this script is that it doesn't distinguish between incoming and outgoing emails - it simply adds a disclaimer to all mails. Typically you want disclaimers only for outgoing emails, and even then not for all sender addresses. Therefore I've modified the /etc/postfix/disclaimer script a little bit - we'll come to that in a minute.

vi /etc/postfix/disclaimer_addresses

which holds all sender email addresses (one per line) for which alterMIME should add a disclaimer:


vi /etc/postfix/disclaimer

and modify it as follows (I have marked the parts that I changed):

# Localize these.
####### Changed From Original Script #######
####### Changed From Original Script END #######
# Exit codes from <sysexits.h>
# Clean up when done or when aborting.
trap "rm -f in.$$" 0 1 2 3 15
# Start processing.
cd $INSPECT_DIR || { echo $INSPECT_DIR does not exist; exit $EX_TEMPFAIL; }
cat >in.$$ || { echo Cannot save mail to file; exit $EX_TEMPFAIL; }
####### Changed From Original Script #######
# obtain From address
from_address=`grep -m 1 "From:" in.$$ | cut -d "<" -f 2 | cut -d ">" -f 1`
if [ `grep -wi ^${from_address}$ ${DISCLAIMER_ADDRESSES}` ]; then
  /usr/bin/altermime --input=in.$$ \
                   --disclaimer=/etc/postfix/disclaimer.txt \
                   --disclaimer-html=/etc/postfix/disclaimer.txt \
                   --xheader="X-Copyrighted-Material: Please visit" || \
                     { echo Message content rejected; exit $EX_UNAVAILABLE; }
fi
####### Changed From Original Script END #######
$SENDMAIL "$@" <in.$$
exit $?
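The only non-obvious part of the modified script is the pipeline that pulls the sender's address out of the From: header so it can be matched against /etc/postfix/disclaimer_addresses. You can try it on a sample header (the address is invented):

```shell
#!/bin/sh
# The extraction pipeline from /etc/postfix/disclaimer: take the first
# "From:" line and keep what sits between "<" and ">".
header='From: Alice Example <alice@example.com>'
from_address=$(printf '%s\n' "$header" | grep -m 1 "From:" | cut -d "<" -f 2 | cut -d ">" -f 1)
echo "$from_address"
```

Note that this is a simple heuristic: it assumes the From: header uses angle brackets, so a bare "From: alice@example.com" line would not be reduced to just the address.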

Next we need the text file /etc/postfix/disclaimer.txt which holds our disclaimer text. Ubuntu's alterMIME package comes with a sample text that we can use for now (of course, you can modify it if you like):

cp /usr/share/doc/altermime/examples/disclaimer.txt /etc/postfix/disclaimer.txt

Finally we have to tell Postfix that it should use the /etc/postfix/disclaimer script to add disclaimers to outgoing emails.


and add -o content_filter=dfilt: to the smtp line:

# Postfix master process configuration file.  For details on the format
# of the file, see the master(5) manual page (command: "man 5 master").
# ==========================================================================
# service type  private unpriv  chroot  wakeup  maxproc command + args
#               (yes)   (yes)   (yes)   (never) (100)
# ==========================================================================
smtp      inet  n       -       -       -       -       smtpd
   -o content_filter=dfilt:

At the end of the same file, add the following two lines:

dfilt     unix    -       n       n       -       -       pipe
    flags=Rq user=filter argv=/etc/postfix/disclaimer -f ${sender} -- ${recipient}

Restart Postfix afterwards:

/etc/init.d/postfix restart

That's it! Now a disclaimer should be added to outgoing emails sent from the addresses listed in /etc/postfix/disclaimer_addresses.


20. Screenshots


You should now have a completely working SpamSnake.



From: Marco at: 2013-03-01 17:38:32

don't you need to add spf to  I got errors on policy-spf_time_limit until I added

policy-spf    unix  -     n     n     -     -   spawn

user=nobody argv=/usr/bin/policyd-spf

to /etc/postfix/


From: at: 2013-03-19 20:26:59

spf is added to later in the tutorial, however the line in the script to setup has an error. It says:

postconf -e "spf_policy = check_policy_service unix:private/policy" 

it should be:

postconf -e "spf_policy = check_policy_service unix:private/policy-spf"

 also the line about the RBL policy is commented out with #, which Postfix didn't like when I ran the script. I just added it manually when I set up RBL.

From: Anonymous at: 2013-09-18 03:20:37

In the edit for postfix, the next line says:

 vi /usr/src/ 

which makes no sense whatsoever ??

 Also, the username and password for mysql are not referenced elsewhere - I assume the user should be root, and the associated password?


From: at: 2014-01-24 16:15:31

You are creating a script to edit, so the command "vi /usr/src/" creates an empty file then you add the text indicated to the file.  Make sure to change the red text to match your setup.

 There is no need for your mysql password here.  You are simply adding the settings to postfix.  Make sure you remember the password you select for the baruwa db.  You will need to make sure it matches the password you set when you create that db.

From: at: 2013-11-25 21:58:53

I have been trying to set this up, and having problems. Even opened a thread up in forums:, however no reply so far.

 Is there anyone who can help to complete the setup ?

From: at: 2014-01-24 16:06:38

Rocky changed jobs so isn't as available as he once was.  What problems are you having?

From: at: 2014-02-09 19:38:27


Since I didn't get a notice on a reply, I didn't even notice your post. I had posted my issue on the forums ->, to which Rocky did reply. I can understand his unavailability. Is there an updated guide which I can use?

I wish to do inbound as well as outbound spam filtering, separately of course.


From: JR at: 2012-12-21 20:40:34

In addition to the instructions provided here, it was also necessary to create the MailScanner incoming directory:

mkdir /var/spool/MailScanner/incoming
chown postfix:www-data /var/spool/MailScanner/incoming

MailScanner started just fine after this.



From: Anonymous at: 2013-03-18 19:09:08

The DCC package dcc-common_1.3.130-0ubuntu1~ppa2~quantal1, as well as the client, seems to be unavailable. I installed 1.3.144-0ubuntu1~ppa1~precise1 instead, which seems to be working fine. So to install DCC, try this:

 wget$(uname -m | sed -e 's/x86_64/amd64/' -e 's/i686/i386/').deb && dpkg -i dcc-common_1.3.144-0ubuntu1~ppa1~precise1_$(uname -m | sed -e 's/x86_64/amd64/' -e 's/i686/i386/').deb
wget$(uname -m | sed -e 's/x86_64/amd64/' -e 's/i686/i386/').deb && dpkg -i dcc-client_1.3.144-0ubuntu1~ppa1~precise1_$(uname -m | sed -e 's/x86_64/amd64/' -e 's/i686/i386/').deb

NB I didn't run this by Rocky...  but it is working fine for me.
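The arch-mapping idiom repeated in those commands can be pulled out into a tiny helper. A sketch, assuming only x86_64 and i686 need translating to the Debian package-name suffixes:

```shell
# Map `uname -m` output to the Debian arch suffix used in the .deb
# file names above (assumption: only these two architectures matter).
deb_arch() {
    echo "$1" | sed -e 's/x86_64/amd64/' -e 's/i686/i386/'
}
deb_arch "$(uname -m)"
```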



From: Andy at: 2013-04-08 02:56:55

Another question on this... Is there any particular reason to install the "quantal" version on "precise", and why are we doing it this way instead of adding the PPA and installing it via apt? Not criticising, just asking the question, because I'm sure Rocky has a good reason, and I want to know :)



From: newbie at: 2013-04-15 04:17:06


DCC still failed to start based on your sources;

I got this unavailable error:

root@unknown:/tmp#  wget$(uname -m | sed -e 's/x86_64/amd64/' -e 's/i686/i386/').deb && dpkg -i dcc-common_1.3.144-0ubuntu1~ppa1~precise1_$(uname -m | sed -e 's/x86_64/amd64/' -e 's/i686/i386/').deb
--2013-04-15 12:06:21--
Resolving (
Connecting to (||:80... connected.
HTTP request sent, awaiting response... 404 Not Found
2013-04-15 12:06:21 ERROR 404: Not Found.

Please help, many thanks :)

From: at: 2013-09-27 15:52:34

Yeah, they changed the file name again. In this case you just need to change the instances of ppa1 in the name to ppa2. You can always go to  to see which versions of the files are currently available.
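If only the PPA revision changed, a quick substitution rewrites the stored file name. A sketch, assuming the rest of the version string is unchanged (verify against the PPA listing first):

```shell
# Rewrite the PPA revision inside a package file name
# (assumption: only "ppa1" -> "ppa2" changed).
OLD='dcc-common_1.3.144-0ubuntu1~ppa1~precise1_amd64.deb'
NEW=$(echo "$OLD" | sed 's/ppa1/ppa2/')
echo "$NEW"   # -> dcc-common_1.3.144-0ubuntu1~ppa2~precise1_amd64.deb
```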

From: jamesloker at: 2013-07-16 15:21:48

If you receive this error when running the SpamAssassin test, run the command:

#sa-learn --sync

Then try the SpamAssassin test again.

From: Anonymous at: 2013-11-22 21:05:50

You might need to run

 apt-get install libssl-dev 

before you can install Crypt::OpenSSL::RSA.

From: at: 2014-01-24 16:35:13

I had the same issue. I also had to install libmysqlclient-dev before I could install DBD::mysql.

From: at: 2014-02-14 07:16:25

Create the following directory to prevent an error in a lint test:

mkdir /var/www/.spamassassin

But I do not have /var/www now.



From: at: 2014-03-10 15:26:24

For this part, don't forget to run

apt-get install libmysqlclient-dev


From: Real at: 2012-12-19 15:10:55

Same problem with the FTP auth.

 But a very good howto - lots of detail, nice!

From: Dan at: 2012-12-19 05:09:16      


The above (or any variant) is not public. If it is your FTP, please set it to anonymous FTP or provide another way to download the file. Thanks for all the effort to put this together.

From: JhonKa at: 2012-12-27 23:30:08


I followed all your instructions to a T, and I'm getting an error when I start up nginx:

 /etc/init.d/uwsgi restart && /etc/init.d/nginx restart

* Restarting app server(s) uwsgi [ OK ] 

Restarting nginx: nginx: [emerg] unexpected end of file, expecting ";" or "}" in /etc/nginx/sites-enabled/baruwa.conf:7
nginx: configuration file /etc/nginx/nginx.conf test failed

I even used your SN packages from your Google Docs account. Could you help point me in the right direction?

From: at: 2012-12-28 08:40:20

What is on line 7 of /etc/nginx/sites-enabled/baruwa.conf?
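To see exactly what nginx is complaining about, print the reported line with a little context. Demonstrated here on a throwaway file; substitute /etc/nginx/sites-enabled/baruwa.conf for the real check:

```shell
# Print line 7 of a config file with two lines of context on each side.
# Demonstrated on a generated stand-in file.
F=/tmp/demo.conf
seq 1 10 | sed 's/^/line /' > "$F"
sed -n '5,9p' "$F"   # prints "line 5" through "line 9"
```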

From: at: 2012-12-28 15:50:47

You can ignore this!  I found out that the baruwa.ini and .conf were switched in the uwsgi and nginx files.

From: at: 2012-12-28 19:41:11

Fixed, nginx and uwsgi config files were mixed up.

From: Doug Thomas at: 2013-01-06 21:34:49

Can't use an undefined value as an ARRAY reference at /opt/MailScanner/lib/MailScanner/ line 2588, <DATA> line 500.

Is anyone else getting this error? 

From: at: 2013-04-09 09:48:39

 I do.


Did you find the solution?

From: Andy at: 2013-06-06 00:52:19

I have this too. My guess is it's something in MailScanner.conf, but line 500 is just a comment, and if you grep out all the comments there are fewer than 500 lines. I'm still looking into it, but if anyone could shed light on it, it would be appreciated :)

From: Andy at: 2013-06-06 01:25:37

I found this occurred because I hadn't commented out the following lines:

#Inline HTML Signature = htmlsigs.customize
#Inline Text Signature = textsigs.customize
#Signature Image Filename = sigimgfiles.customize
#Signature Image Filename = sigimgs.customize

as described above, AND I hadn't set up signatures properly (I'm not really sure how to do this at this stage).  So I commented out the lines, ran /opt/MailScanner/bin/MailScanner --lint, and things seem to be working a lot better.

Hope that helps someone.
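The manual edit described above can be done in one sed pass. A sketch that operates on a generated stand-in file (assumption: the four lines live in /etc/MailScanner/conf.d/baruwa.conf; inspect the result before applying the same sed to the real file):

```shell
# Comment out the four signature lines in one pass (GNU sed).
# Demonstrated on a temp file standing in for baruwa.conf.
CONF=/tmp/baruwa.conf.test
printf '%s\n' \
  'Inline HTML Signature = htmlsigs.customize' \
  'Inline Text Signature = textsigs.customize' \
  'Signature Image Filename = sigimgfiles.customize' \
  'Signature Image Filename = sigimgs.customize' > "$CONF"
sed -i -e 's/^\(Inline HTML Signature\|Inline Text Signature\|Signature Image Filename\)/#\1/' "$CONF"
grep -c '^#' "$CONF"   # -> 4
```

Then run /opt/MailScanner/bin/MailScanner --lint as described above to confirm the fix.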


From: SHL at: 2013-01-27 16:08:17


I believe I've followed the guide, but my SpamSnake isn't working when rbl_policy and spf_policy are added here: smtpd_recipient_restrictions = permit_mynetworks, permit_sasl_authenticated, reject_unknown_recipient_domain, reject_unauth_destination, whitelist_policy, grey_policy, permit

I'm getting these errors:

postfix/smtpd[6073]: warning: connect to private/policy: No such file or directory

warning: unknown smtpd restriction: "rbl_policy"

Could someone point me in the right direction? :) 

From: monopati at: 2013-03-24 18:10:07

I get 

 # baruwa-admin migrate
Unknown command: 'migrate'
Type 'baruwa-admin help' for usage.


 # baruwa-admin createsuperuser
Unknown command: 'createsuperuser'
Type 'baruwa-admin help' for usage.

Is there any replacement for these commands?


From: kup at: 2013-04-23 07:25:25

Rocky, thank you for this great howto. Please let us know how to migrate the Baruwa frontend to the next version (2.0), if you have a way to do it.

Many thanks!

From: at: 2013-08-06 17:39:07

There is no direct upgrade path to 2.0. Also, 2.0 currently only supports Exim as an MTA. I did get it working, rather poorly, with Postfix on a test server, but I wouldn't put it into production. I would wait until Postfix is supported.

From: kec at: 2013-04-23 07:34:22

First of all, Rocky, I have to say thank you for this great guide. I also want to ask: do you have a way to migrate the Baruwa frontend to the next version (2.0)?

From: kup at: 2013-04-23 10:30:21

Just one hint for those who want to see Baruwa translated:

#apt-get install gettext
#baruwa-admin compilemessages

Optionally, edit the main language of Baruwa:

#vi /etc/baruwa/




From: Mikacom at: 2013-11-21 21:20:10

Populating the database has changed to

#baruwa-admin migrate djcelery

From: Anonymous at: 2013-04-19 09:49:15

Nice guide - some minor problems at the start, but it works perfectly now!
Thanks a lot!!!


From: Anonymous at: 2013-05-20 15:22:41


Do you not plan to issue virtual images?


From: e3fi389 at: 2013-05-27 11:02:40

Small (but interesting!) error in the script; correct line 68 to:

  attrs  => ["proxyAddresses"],


From: at: 2014-05-04 06:50:09


Great tutorial.

In the, I was getting an error message regarding postmap and Postfix. There are two ways around it:

1) Put  and getadsmtp in the /usr/sbin directory and update the cron accordingly using crontab -e.
2) In the, change the following lines:

postmap /etc/postfix/relay_recipients
postfix reload

to:

/usr/sbin/postmap /etc/postfix/relay_recipients
/usr/sbin/postfix reload
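Option 2 is needed because cron runs with a minimal PATH. Resolving the binaries once at the top of the script makes it robust either way. A sketch, with standard Debian/Ubuntu locations assumed as the fallback:

```shell
# Resolve absolute paths up front so the script works under cron's
# minimal PATH (fallbacks assume standard Debian/Ubuntu locations).
POSTMAP=$(command -v postmap || echo /usr/sbin/postmap)
POSTFIX=$(command -v postfix || echo /usr/sbin/postfix)
echo "$POSTMAP"
echo "$POSTFIX"
# Then call: "$POSTMAP" /etc/postfix/relay_recipients && "$POSTFIX" reload
```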

From: Kevin Traas at: 2014-06-12 19:01:00

* Line 141:  replace '' with ''.
* Line 257:  replace the (invalid) less-than character.

From: Anonymous at: 2014-06-12 19:05:46

The suggested cron change to run update_scamnailer on a daily basis has the wrong path and is a bit ambiguous.

Instead, add the following to /etc/crontab (note that /etc/crontab entries need a user field):

 53 3 * * * root /opt/MailScanner/bin/update_scamnailer

From: kecup at: 2014-11-03 15:45:08

Thanks for the great howto, but let me ask: do you plan to update all of this? I mean, update this howto to Ubuntu 14.04? Thank you.

From: Michael at: 2015-02-24 21:32:36

Just having a small issue with the setup. I'm new to Linux but am trying to learn, so the question I have might be extremely basic, but I'm not sure.


I'm getting the following error when trying to start MailScanner via /etc/init.d/mailscanner start:

Can't use an undefined value as an ARRAY reference at /opt/MailScanner/lib/MailScanner/ line 2588, <DATA> line 500.

So I commented out the following in the file /etc/MailScanner/conf.d/baruwa.conf:

#Inline HTML Signature = htmlsigs.customize
#Inline Text Signature = textsigs.customize
#Signature Image Filename = sigimgfiles.customize
#Signature Image Filename = sigimgs.customize

and it appears that everything is working fine, until I click the Connect button in the Baruwa interface to test whether the connection to the Exchange server is valid.


It takes me to http://localhost/settings/hosts/2/test/ and says the page is unavailable:

"Sorry, the requested page is unavailable due to a server malfunction."

Anyone got any ideas?

From: Alexandro at: 2015-03-18 15:14:06

As of a 2015 installation, you just have to fix (as noted) most of the wget instructions to use up-to-date releases of the downloaded software, but that's a minor issue. The big problem seems to be that djcelery needs to be migrated with the command:

$ baruwa-admin migrate djcelery

after the Baruwa sync procedure; otherwise the djcelery-related tables won't be created in the Baruwa DB (preventing message preview/train/delete, the SMTP test connection, and a few other things via the web interface). Besides that... damn great guide <3