The Perfect SpamSnake - Ubuntu Jeos 12.04 LTS Precise Pangolin - Page 4

14. KAM

vi /etc/cron.daily/

with the following content:

 # Original version modified by Andrew MacLachlan ([email protected])
 # Added additional MailScanner restarts on initial restart failure
 # Made script run silently for normal (successful) operation
 # Increased UPDATEMAXDELAY to 900 from 600
 # Insert a random delay up to this value, to spread virus updates round
 # the clock. 1800 seconds = 30 minutes.
 # Set this to 0 to disable it.
 if [ -f /opt/MailScanner/var/MailScanner ] ; then
 . /opt/MailScanner/var/MailScanner
 if [ "x$UPDATEMAXDELAY" = "x0" ]; then
 logger -p -t Delaying cron job up to $UPDATEMAXDELAY seconds
 perl -e "sleep int(rand($UPDATEMAXDELAY));"
 # JKF Fetch
 #echo Fetching
 cd /etc/mail/spamassassin
 rm -f
 wget -O > /dev/null 2>&1
 if [ "$?" = "0" ]; then
 #echo It completed and fetched something
 if ( tail -10 | grep -q '^#.*EOF' ); then
 # echo It succeeded so make a backup
 cp -f
 echo ERROR: Could not find EOF marker
 cp -f
 echo It failed to complete properly
 cp -f
 #echo Reloading MailScanner and SpamAssassin configuration rules
 /etc/init.d/mailscanner reload > /dev/null 2>&1
 if [ $? != 0 ] ; then
 echo "MailScanner reload failed - Retrying..."
 /etc/init.d/mailscanner force-reload
 if [ $? = 0 ] ; then
 echo "MailScanner reload succeeded."
 echo "Stopping MailScanner..."
 /etc/init.d/mailscanner stop
 echo "Waiting for a minute..."
 perl -e "sleep 60;"
 echo "Attemping to start MailScanner..."
 /etc/init.d/mailscanner start

Make it executable:

chmod +x /etc/cron.daily/ 
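The randomized-delay trick at the top of the cron script can be tried on its own. This is a minimal sketch; the 5-second cap is a made-up test value (the real script uses UPDATEMAXDELAY, e.g. 1800 seconds):

```shell
#!/bin/sh
# Sleep a random number of seconds before doing any work, so that many
# servers running the same daily cron job do not all fetch at once.
# UPDATEMAXDELAY=5 is only for testing; the cron script reads it from
# its MailScanner defaults file.
UPDATEMAXDELAY=5
perl -e "sleep int(rand($UPDATEMAXDELAY));"
echo "update would start now"
```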


15. ScamNailer

vi /opt/MailScanner/bin/update_scamnailer

with the following content:

# (c) 2009 Julian Field <[email protected]>
#          Version 2.05
# This file is the copyright of Julian Field <[email protected]>,
# and is made freely available to the entire world. If you intend to
# make any money from my work, please contact me for permission first!
# If you just want to use this script to help protect your own site's
# users, then you can use it and change it freely, but please keep my
# name and email address at the top.
use strict;
use File::Temp;
use Net::DNS::Resolver;
use LWP::UserAgent;
use FileHandle;
use DirHandle;
# Filename of list of extra addresses you have added, 1 per line.
# Does not matter if this file does not exist.
my $local_extras = '/etc/MailScanner/ScamNailer.local.addresses';
# Output filename, goes into SpamAssassin. Can be over-ridden by just
# adding the output filename on the command-line when you run this script.
my $output_filename = '/etc/mail/spamassassin/';
# This is the location of the cache used by the DNS-based updates to the
# phishing database.
my $emailscurrent = '/var/cache/ScamNailer/';
# Set this next value to '' if you are not using MailScanner.
# Or else change it to any command you need to run after updating the
# SpamAssassin rules, such as '/sbin/service spamd restart'.
my $mailscanner_restart = '/etc/init.d/mailscanner force-reload';
# The SpamAssassin score to assign to the final rule that fires if any of
# the addresses hit. Multiple hits don't increase the score.
# I use a score of 0.1 with this in MailScanner.conf:
# SpamAssassin Rule Actions = SCAMNAILER=>not-deliver,store,forward [email protected], header "X-Anti-Phish: Was to _TO_"
# If you don't understand that, read the section of MailScanner.conf about the
# "SpamAssassin Rule Actions" setting.
my $SA_score = 4.0;
# How complicated to make each rule. 20 works just fine, leave it alone.
my $addresses_per_rule = 20;
my $quiet = 1 if grep /quiet|silent/, @ARGV;
if (grep /help/, @ARGV) {
  print STDERR "Usage: $0 [ --quiet ]\n";
  exit 0;
}
my($count, $rule_num, @quoted, @addresses, @metarules);
#local(*YPCAT, *SACF);
$output_filename = $ARGV[0] if $ARGV[0]; # Use filename if they gave one
# First do all the addresses we read from DNS and anycast and only do the
# rest if needed.
if (GetPhishingUpdate()) {
open(SACF, ">$output_filename") or die "Cannot write to $output_filename $!";
print SACF "# ScamNailer rules\n";
print SACF "# Generated by $0 at " . `date` . "\n";
# Now read all the addresses we generated from GetPhishingUpdate().
open(PHISHIN, $emailscurrent . 'phishing.emails.list')
  or die "Cannot read " . $emailscurrent . "phishing.emails.list, $!\n";
while(<PHISHIN>) {
  next if /^\s*$/;
  next unless /^[^@]+\@[^@]+$/;
  push @addresses, $_; # This is for the report
  s/[^0-9a-z_-]/\\$&/ig; # Quote every non-alnum
  s/\\\*/[0-9a-z_.+-]*/g; # Unquote any '*' characters as they map to .*
  # Find all the numbers just before the @ and replace them with digit wildcards
  #push @quoted, '(' . $_ . ')';
  push @quoted, $_;
  if ($count % $addresses_per_rule == 0) {
    # Put them in 10 addresses at a time
    # Put a start-of-line/non-address character at the front,
    # and an end-of-line /non-address character at the end.
    print SACF "header __SCAMNAILER_H$rule_num ALL =~ /" .
               '(^|[;:\s])(?:' . join('|',@quoted) . ')($|[^0-9a-z_.+-])' .
    push @metarules, "__SCAMNAILER_H$rule_num";
    print SACF "uri __SCAMNAILER_B$rule_num /" .
               '^mailto:(?:' . join('|',@quoted) . ')$' .
    push @metarules, "__SCAMNAILER_B$rule_num";
    undef @quoted;
    undef @addresses;
close PHISHIN;
# Put in all the leftovers, if any
if (@quoted) {
    print SACF "header __SCAMNAILER_H$rule_num ALL =~ /" .
               '(^|[;:\s])(?:' . join('|',@quoted) . ')($|[^0-9a-z_.+-])' .
    push @metarules, "__SCAMNAILER_H$rule_num";
    print SACF "uri __SCAMNAILER_B$rule_num /" .
               '^mailto:(?:' . join('|',@quoted) . ')$' .
    push @metarules, "__SCAMNAILER_B$rule_num";
print SACF "\n# ScamNailer combination rule\n\n";
print SACF "meta     SCAMNAILER " . join(' || ',@metarules) . "\n";
print SACF "describe SCAMNAILER Mentions a spear-phishing address\n";
print SACF "score    SCAMNAILER $SA_score\n\n";
print SACF "# ScamNailer rules ($count) END\n";
close SACF;
# And finally restart MailScanner to use the new rules
$mailscanner_restart .= " >/dev/null 2>&1" if $quiet;
system($mailscanner_restart) if $mailscanner_restart;
exit 0;
sub GetPhishingUpdate {
  my $cache = $emailscurrent . 'cache/';
  my $status = $emailscurrent . 'status';
  my $urlbase = "";
  my $target= $emailscurrent . 'phishing.emails.list';
  my $query="";
  my $baseupdated = 0;
  if (! -d $emailscurrent) {
    print "Working directory is not present - making....." unless $quiet;
    mkdir ($emailscurrent) or die "failed";
    print " ok!\n" unless $quiet;
  if (! -d $cache) {
    print "Cache directory is not present - making....." unless $quiet;
    mkdir ($cache) or die "failed";
    print " ok!\n" unless $quiet;
  if (! -s $target) {
    open (FILE,">$target") or die
      "Failed to open target file so creating a blank file";
    print FILE "# Wibble";
    close FILE;
  } else {
    # So that clean quarantine doesn't delete it!
    utime(time(), time(), $emailscurrent);
  my ($status_base, $status_update);
  if (! -s $status) {
    print "This is the first run of this program.....\n" unless $quiet;
  } else {
    print "Reading status from $status\n" unless $quiet;
    open(STATUS_FILE, $status) or die "Unable to open status file\n";
    my $line=<STATUS_FILE>;
    close (STATUS_FILE);
    # The status file is text.text
    if ($line =~ /^(.+)\.(.+)$/) {
  print "Checking that $cache$status_base exists..." unless $quiet;
  if ((! -s "$cache$status_base") && (!($status_base eq "-1"))) {
    print " no - resetting....." unless $quiet;
  print " ok\n" unless $quiet;
  print "Checking that $cache$status_base.$status_update exists..." unless $quiet;
  if ((! -s "$cache$status_base.$status_update") && ($status_update>0)) {
    print " no - resetting....." unless $quiet;
  print " ok\n" unless $quiet;
  my $currentbase = -1;
  my $currentupdate = -1;
  # Let's get the current version
  my $res = Net::DNS::Resolver->new();
  my $RR = $res->query($query, 'TXT');
  my @result;
  if ($RR) {
    foreach my $rr ($RR->answer) {
      my $text = $rr->rdatastr;
      if ($text =~ /^"emails\.(.+)\.(.+)"$/) {
  die "Failed to retrieve valid current details\n" if $currentbase eq "-1";
  print "I am working with: Current: $currentbase - $currentupdate and Status: $status_base - $status_update\n" unless $quiet;
  my $generate=0;
  # Create a user agent object
  my $ua = LWP::UserAgent->new;
  $ua->agent("UpdateBadPhishingSites/0.1 ");
  # Patch from [email protected]
  if (!($currentbase eq $status_base)) {
    print "This is base update\n" unless $quiet;
    $status_update = -1;
    $baseupdated = 1;
    # Create a request
    #print "Getting $urlbase . $currentbase\n" unless $quiet;
    my $req = HTTP::Request->new(GET => $urlbase.$currentbase);
    # Pass request to the user agent and get a response back
    my $res = $ua->request($req);
    # Check the outcome of the response
    if ($res->is_success) {
      open (FILE, ">$cache/$currentbase") or die "Unable to write base file ($cache/$currentbase)\n";
      print FILE $res->content;
      close (FILE);
    } else {
      warn "Unable to retrieve $urlbase.$currentbase :".$res->status_line, "\n";
  } else {
    print "No base update required\n" unless $quiet;
  # Now see if the sub version is different
  if (!($status_update eq $currentupdate)) {
    my %updates=();
    print "Update required\n" unless $quiet;
    if ($currentupdate<$status_update) {
      # In the unlikely event we roll back a patch - we have to go from the base
      print "Error!: $currentupdate<$status_update\n" unless $quiet;
      $generate = 1;
      $status_update = 0;
    # If there are updates available and we haven't downloaded them
    # yet we need to reset the counter
    if ($currentupdate>0) {
      if ($status_update<1) {
      my $i;
      # Loop through each of the updates, retrieve it and then add
      # the information into the update array
      for ($i=$status_update+1; $i<=$currentupdate; $i++) {
        print "Retrieving $urlbase$currentbase.$i\n" unless $quiet;
        #print "Getting $urlbase . $currentbase.$i\n" unless $quiet;
        my $req = HTTP::Request->new(GET => $urlbase.$currentbase.".".$i);
        my $res = $ua->request($req);
        warn "Failed to retrieve $urlbase$currentbase.$i"
          unless $res->is_success;
        my $line;
        foreach $line (split("\n", $res->content)) {
          # Is it an addition?
          if ($line =~ /^\> (.+)$/) {
            if (defined $updates{$1}) {
              if ($updates{$1} eq "<") {
                delete $updates{$1};
            } else {
          # Is it a removal?
          if ($line =~ /^\< (.+)$/) {
            if (defined $updates{$1}) {
              if ($updates{$1} eq ">") {
                delete $updates{$1};
            } else {
      # OK do we have a previous version to work from?
      if ($status_update>0) {
        # Yes - we open the most recent version
        open (FILE, "$cache$currentbase.$status_update") or die
          "Unable to open base file ($cache/$currentbase.$status_update)\n";
      } else {                        # No - we open the base file
        open (FILE, "$cache$currentbase") or die
          "Unable to open base file ($cache/$currentbase)\n";
      # Now open the new update file
      print "$cache$currentbase.$currentupdate\n" unless $quiet;
      open (FILEOUT, ">$cache$currentbase.$currentupdate") or die
        "Unable to open new base file ($cache$currentbase.$currentupdate)\n";
      # Loop through the base file (or most recent update)
      while (<FILE>) {
        my $line=$_;
        if (defined ($updates{$line})) {
          # Does the line need removing?
          if ($updates{$line} eq "<") {
          # Is it marked as an addition but already present?
          elsif ($updates{$line} eq ">") {
            delete $updates{$line};
        print FILEOUT $line."\n";
      close (FILE);
      my $line;
      # Are there any additions left
      foreach $line (keys %updates) {
        if ($updates{$line} eq ">") {
          print FILEOUT $line."\n" ;
      close (FILEOUT);
  # Changes have been made
  if ($generate) {
    print "Updating live file $target\n" unless $quiet;
    my $file="";
    if ($currentupdate>0) {
    } else {
    if ($file eq "") {
      die "Unable to work out file!\n";
    system ("mv -f $target $target.old");
    system ("cp $file $target");
    open(STATUS_FILE, ">$status") or die "Unable to open status file\n";
    print STATUS_FILE "$currentbase.$currentupdate\n";
    close (STATUS_FILE);
  my $queuedir = new DirHandle;
  my $file;
  my $match1 = "^" . $currentbase . "\$";
  my $match2 = "^" . $currentbase . "." . $currentupdate . "\$";
  $queuedir->open($cache) or die "Unable to do clean up\n";
  while(defined($file = $queuedir->read())) {
    next if $file eq '.' || $file eq '..';
    next if $file =~ /$match1/;
    next if $file =~ /$match2/;
    print "Deleting cached file: $file.... " unless $quiet;
    unlink($cache.$file) or die "failed";
    print "ok\n" unless $quiet;

Make it executable:

chmod +x /opt/MailScanner/bin/update_scamnailer

Add it to cron:

@daily /opt/MailScanner/bin/update_scamnailer &> /dev/null #Update ScamNailer


16. Firewalling the SpamSnake with Firehol

FireHOL is a stateful iptables packet-filtering firewall configurator. It is abstracted, extensible, easy to use, and powerful. It can handle any kind of firewall, but most importantly, it lets you configure the firewall the same way you think about it.

Install Firehol:

apt-get install firehol -y
vi /etc/default/firehol

and change the following:

vi /etc/firehol/firehol.conf

and add the following:

version 5

# Accept all client traffic on any interface
interface any internet
   protection strong
   server "icmp ping ICMP ssh http https telnet webmin dns dcc echo smtp" accept
   client all accept

This filters all incoming connections that are not related to the above services. If you want to be less polite, you can drop them by adding the following after 'protection strong': policy drop
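For example, a stricter variant of the configuration above with `policy drop` in place might look like this (the trimmed service list is purely illustrative, keep whatever services you actually need):

```
version 5

interface any internet
   protection strong
   policy drop
   server "smtp ssh webmin" accept
   client all accept
```

With `policy drop`, unmatched packets are silently discarded instead of being rejected with an ICMP reply.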

vi /usr/sbin/get-iana

with the following content:

 # $Id:,v 1.13 2010/09/12 13:55:00 jcb Exp $
   # $Log:,v $
   # Revision 1.13 2010/09/12 13:55:00 jcb
   # Updated for latest IANA reservations format.
   # Revision 1.12 2008/03/17 22:08:43 ktsaou
   # Updated for latest IANA reservations format.
   # Revision 1.11 2007/06/13 14:40:04 ktsaou
   # *** empty log message ***
   # Revision 1.10 2007/05/05 23:38:31 ktsaou
   # Added support for external definitions of:
   # in files under the same name in /etc/firehol/.
   # Only RESERVED_IPS is mandatory (firehol will complain if it is not there,
   # but it will still work without it), and is also the only file that firehol
   # checks how old is it. If it is 90+ days old, firehol will complain again.
   # Changed the supplied script to generate the RESERVED_IPS file.
   # FireHOL also instructs the user to use this script if the file is missing
   # or is too old.
   # Revision 1.9 2007/04/29 19:34:11 ktsaou
   # *** empty log message ***
   # Revision 1.8 2005/06/02 15:48:52 ktsaou
   # Allowed to be in RESERVED_IPS
   # Revision 1.7 2005/05/08 23:27:23 ktsaou
   # Updated RESERVED_IPS to current IANA reservations.
   # Revision 1.6 2004/01/10 18:44:39 ktsaou
   # Further optimized and reduced PRIVATE_IPS using:
   # The supplied uses .aggregate. if it finds it in the path.
   # (aggregate is the name of this program when installed on Gentoo)
   # Revision 1.5 2003/08/23 23:26:50 ktsaou
   # Bug #793889:
   # Change #!/bin/sh to #!/bin/bash to allow FireHOL run on systems that
   # bash is not linked to /bin/sh.
   # Revision 1.4 2002/10/27 12:44:42 ktsaou
   # CVS test
   # Program that downloads the IPv4 address space allocation by IANA
   # and creates a list with all reserved address spaces.
 # The program will match all rows in the file which start with a number, have a slash,
   # followed by another number, for which the following pattern will also match on the
   # same rows
 # which rows that are matched by the above, to ignore
   # (i.e. not include them in RESERVED_IPS)?
   #IANA_IGNORE="(Multicast|Private use|Loopback|Local Identification)"
 AGGREGATE="`which aggregate 2>/dev/null`"
   if [ -z "${AGGREGATE}" ]
   AGGREGATE="`which aggregate 2>/dev/null`"
 if [ -z "${AGGREGATE}" ]
   echo >&2
   echo >&2
   echo >&2 "WARNING"
   echo >&2 "Please install 'aggregate' to shrink the list of  IPs."
   echo >&2
   echo >&2
 echo >&2
   echo >&2 "Fetching IANA IPv4 Address Space, from:"
   echo >&2 "${IPV4_ADDRESS_SPACE_URL}"
   echo >&2
 wget -O - --proxy=off "${IPV4_ADDRESS_SPACE_URL}" |\
   egrep " *[0-9]+/[0-9]+.*${IANA_RESERVED}" |\
   egrep -vi "${IANA_IGNORE}" |\
   sed -e 's:^ *\([0-9]*/[0-9]*\).*:\1:' |\
 while IFS="/" read range net
   if [ ! $net -eq 8 ]
   echo >&2 "Cannot handle network masks of $net bits  ($range/$net)"
 first=`echo $range | cut -d '-' -f 1`
   first=`expr $first + 0`
   last=`echo $range | cut -d '-' -f 2`
   last=`expr $last + 0`
   while [ ! $x -gt $last ]
   # test $x -ne 127 && echo "$x.0.0.0/$net"
   echo "$x.0.0.0/$net"
   x=$[x + 1]
   ) | \
   if [ ! -z "${AGGREGATE}" -a -x "${AGGREGATE}" ]
   ) >"${tempfile}"
 echo >&2
   echo >&2
   printf "RESERVED_IPS=\""
   for x in `cat ${tempfile}`
   i=$[i + 1]
   printf "${x} "
   printf "\"\n"
 if [ $i -eq 0 ]
   echo >&2
   echo >&2
   echo >&2 "Failed to find reserved IPs."
   echo >&2 "Possibly the file format has been changed, or I  cannot fetch the URL."
   echo >&2
 rm -f ${tempfile}
   exit 1
   echo >&2
   echo >&2
   echo >&2 "Differences between the fetched list and the list  installed in"
   echo >&2 "/etc/firehol/RESERVED_IPS:"
 echo >&2 "# diff /etc/firehol/RESERVED_IPS  ${tempfile}"
   diff /etc/firehol/RESERVED_IPS ${tempfile}
 if [ $? -eq 0 ]
   echo >&2
   echo >&2 "No  differences found."
   echo >&2
 rm -f ${tempfile}
   exit 0
 echo >&2
   echo >&2
   echo >&2 "Would you like to save this list to  /etc/firehol/RESERVED_IPS"
   echo >&2 "so that FireHOL will automatically use it from  now on?"
   echo >&2
   while [ 1 = 1 ]
   printf >&2 "yes or no > "
   read x
 case "${x}" in
   yes) cp -f /etc/firehol/RESERVED_IPS /etc/firehol/RESERVED_IPS.old 2>/dev/null
   cat "${tempfile}" >/etc/firehol/RESERVED_IPS || exit 1
   echo >&2 "New RESERVED_IPS written to '/etc/firehol/RESERVED_IPS'."
   echo "FireHOL will now be restarted"
   sleep 3
   /etc/init.d/firehol restart
   echo >&2 "Saved nothing."
 *) echo >&2 "Cannot understand '${x}'."
 rm -f ${tempfile}

Make it executable:

chmod +x /usr/sbin/get-iana
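The range-expansion loop in the middle of get-iana can be exercised standalone. This sketch expands a hypothetical IANA range string like "000-002" into individual /8 networks, mirroring the script's logic (the range value is a made-up example):

```shell
#!/bin/sh
# Expand a range like "000-002" into /8 networks, as get-iana does.
range="000-002"
net=8
first=`echo $range | cut -d '-' -f 1`
first=`expr $first + 0`        # strip leading zeros: "000" -> 0
last=`echo $range | cut -d '-' -f 2`
last=`expr $last + 0`          # "002" -> 2
x=$first
while [ ! $x -gt $last ]
do
    echo "$x.0.0.0/$net"      # prints 0.0.0.0/8, 1.0.0.0/8, 2.0.0.0/8
    x=`expr $x + 1`
done
```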
vi /usr/sbin/update-iana

with the following content:

 /usr/sbin/get-iana < /etc/firehol/get-iana-answerfile

Make it executable:

chmod +x /usr/sbin/update-iana
vi /etc/firehol/get-iana-answerfile

with the following content (this answers the script's "yes or no" prompt so it runs unattended):

yes

Run the script to update RESERVED_IPS:

/usr/sbin/update-iana

*Note: Now your server is set up to only accept connections for the services you allowed.

Add it to cron:

@monthly /usr/sbin/update-iana &> /dev/null #Update firehol reserved ips


17. Apply Relay Recipients (Optional)

The following directions are meant for people using Microsoft Exchange 2000 or Microsoft Exchange 2003.

This page describes how to configure your mail gateway to periodically fetch a list of valid recipient email addresses from your Exchange system. With this list in place, your server can automatically reject any email addressed to invalid addresses. This reduces the load on your Exchange server, since it no longer has to process non-delivery reports, and on your Postfix server, since it won't have to perform spam and virus scanning on those messages.
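The recipient map built later in this section is a plain text file with one address per line in the format Postfix expects for relay_recipient_maps. A hypothetical example (the addresses are placeholders):

```
[email protected]     OK
[email protected]    OK
[email protected]      550 User unknown.
```

Addresses marked OK are accepted for relay; a 550 line rejects that recipient outright with the given message.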

Install the perl module Net::LDAP:

perl -MCPAN -e shell
install Net::LDAP

vi /usr/bin/

with the following content:

#!/usr/bin/perl -T -w
   # This script will pull all users' SMTP addresses from your Active Directory
   # (including primary and secondary email addresses) and list them in the
   # format "[email protected] OK" which Postfix uses with relay_recipient_maps.
   # Be sure to double-check the path to perl above.
   # This requires Net::LDAP to be installed.  To install Net::LDAP, at a shell
   # type "perl -MCPAN -e shell" and then "install Net::LDAP"
   use Net::LDAP;
   use Net::LDAP::Control::Paged;
   use Net::LDAP::Constant ( "LDAP_CONTROL_PAGED" );
   # Enter the path/file for the output
   $VALID = "/etc/postfix/relay_recipients";
   open VALID, ">$VALID" or die "CANNOT OPEN $VALID $!";
   # Enter the FQDN of your Active Directory domain controllers below
   # Enter the LDAP container for your userbase.
   # The syntax is CN=Users,dc=example,dc=com
   # This can be found by installing the Windows 2000 Support Tools
   # then running ADSI Edit.
   # In ADSI Edit, expand the "Domain NC []" &
   # you will see, for example, DC=example,DC=com (this is your base).
   # The Users Container will be specified in the right pane as
   # CN=Users depending on your schema (this is your container).
   # You can double-check this by clicking "Properties" of your user
   # folder in ADSI Edit and examining the "Path" value, such as:
   # LDAP://,DC=example,DC=com
   # which would be $hqbase="cn=Users,dc=example,dc=com"
   # Note:  You can also use just $hqbase="dc=example,dc=com"
   # Enter the username & password for a valid user in your Active Directory
   # with username in the form cn=username,cn=Users,dc=example,dc=com
   # Make sure the user's password does not expire.  Note that this user
   # does not require any special privileges.
   # You can double-check this by clicking "Properties" of your user in
   # ADSI Edit and examining the "Path" value, such as:
   # LDAP://,CN=Users,DC=example,DC=com
   # which would be $user="cn=user,cn=Users,dc=example,dc=com"
   # Note: You can also use the UPN login: "user\"
   # Connecting to Active Directory domain controllers
   $ldap = Net::LDAP->new($dc1) or $noldapserver=1;
   if ($noldapserver == 1)  {
   $ldap = Net::LDAP->new($dc2) or
   die "Error connecting to specified domain controllers $@ \n";
   }
   $mesg = $ldap->bind ( dn => $user,
   password =>$passwd);
   if ( $mesg->code()) {
   die ("error:", $mesg->error_text(), "\n");
   }
   # How many LDAP query results to grab for each paged round
   # Set to under 1000 for Active Directory
   $page = Net::LDAP::Control::Paged->new( size => 990 );
   @args = ( base     => $hqbase,
   # Play around with this to grab objects such as Contacts, Public Folders, etc.
   # A minimal filter for just users with email would be:
   # filter => "(&(sAMAccountName=*)(mail=*))"
   filter => "(& (mailnickname=*) (| (&(objectCategory=person)
   (objectCategory=group)(objectCategory=publicFolder) ))",
   control  => [ $page ],
   attrs  => ["proxyAddresses"],
   my $cookie;
   while(1) {
   # Perform search
   my $mesg = $ldap->search( @args );
   # Filtering results for proxyAddresses attributes
   foreach my $entry ( $mesg->entries ) {
   my $name = $entry->get_value( "cn" );
   # LDAP Attributes are multi-valued, so we have to print each one.
   foreach my $mail ( $entry->get_value( "proxyAddresses" ) ) {
   # Test if the Line starts with one of the following lines:
   # proxyAddresses: [smtp|SMTP]:
   # and also discard this starting string, so that $mail is only the
   # address without any other characters...
   if ( $mail =~ s/^(smtp|SMTP)://gs ) {
   print VALID $mail." OK\n";
   # Only continue on LDAP_SUCCESS
   $mesg->code and last;
   # Get cookie from paged control
   my($resp)  = $mesg->control( LDAP_CONTROL_PAGED ) or last;
   $cookie    = $resp->cookie or last;
   # Set cookie in paged control
   if ($cookie) {
   # We had an abnormal exit, so let the server know we do not want any more
   $ldap->search( @args );
   # Also would be a good idea to die unhappily and inform OP at this point
   die("LDAP query unsuccessful");
   # Add additional restrictions, users, etc. to the output file below.
   #print VALID "user\ OK\n";
   #print VALID "user\ 550 User unknown.\n";
   #print VALID " 550 User does not exist.\n";
 close VALID;

Make it executable:

chmod +x /usr/bin/

Edit the file to customize it for your specific domain. Since the file is read only, you will need to use :w! to save the file in vi.

1. Set $dc1 and $dc2 to the fully qualified domain names or IP addresses of 2 of your domain controllers.
2. Set $hqbase equal to the LDAP path to the container or organizational unit which holds the email accounts for which you wish to get the email addresses.
3. Set $user and $passwd to indicate which user account should be used to access this information. This account only needs to be a member of the domain, so it would be a good idea to setup an account specifically for this.
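Putting the three steps together, the variables near the top of the script would end up looking something like this; all values here are hypothetical placeholders for your own domain:

```perl
# Hypothetical example values -- replace with your own environment
$dc1 = "dc1.example.com";                          # first domain controller
$dc2 = "dc2.example.com";                          # second domain controller
$hqbase = "cn=Users,dc=example,dc=com";            # container holding the mailboxes
$user = "cn=ldapquery,cn=Users,dc=example,dc=com"; # ordinary domain member account
$passwd = "secret";                                # its (non-expiring) password
```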

Try running the script. If it works correctly, it will create /etc/postfix/relay_recipients.

*Note: If your Postfix server is separated from your Active Directory controllers by a firewall, you will need to open TCP port 389 from the Postfix server to the ADCs.

At this point, you may want to edit /etc/postfix/relay_recipients and edit out any unwanted email addresses as this script imports everything.

Postmap the file to create the hash db:

postmap /etc/postfix/relay_recipients
postfix reload

Finally, you may want to set up a cron job to periodically update and build the /etc/postfix/relay_recipients.db file. You can set up a script called /usr/bin/ (Optional)

vi /usr/bin/

with the following content:

postmap /etc/postfix/relay_recipients
postfix reload

Make it executable:

chmod +x /usr/bin/

Don't forget to make sure the following is in your /etc/postfix/ file:

relay_recipient_maps = hash:/etc/postfix/relay_recipients

Add it to cron:

30 2 * * * /usr/bin/ #synchronize relay_recipients with Active Directory addresses

*Note: This cron job will run every day at 2:30 AM to update the database file. You may want to run yours more frequently or not depending on how often you add new email users to your system.


18. Install Webmin (Optional)

vi /etc/apt/sources.list

and add the following:

deb sarge contrib
deb sarge contrib

Install the GPG Key along with the package:

apt-key add jcameron-key.asc
apt-get update
apt-get install webmin -y

*Note: All dependencies should be resolved automatically.

To access Webmin, open your browser and enter: http://serverip:10000/


19. Automatically Add A Disclaimer To Outgoing Emails With alterMIME (Optional)

Install alterMIME:

apt-get install altermime -y

Next we create the user filter with the home directory /var/spool/filter - alterMIME will be run as that user:

useradd -r -c "Postfix Filters" -d /var/spool/filter filter
mkdir /var/spool/filter
chown filter:filter /var/spool/filter
chmod 750 /var/spool/filter

Afterwards we create the script /etc/postfix/disclaimer which executes alterMIME. Ubuntu's alterMIME package comes with a sample script that we can simply copy to /etc/postfix/disclaimer:

cp /usr/share/doc/altermime/examples/ /etc/postfix/disclaimer
chgrp filter /etc/postfix/disclaimer
chmod 750 /etc/postfix/disclaimer

Now the problem with this script is that it doesn't distinguish between incoming and outgoing emails - it simply adds a disclaimer to all mails. Typically you want disclaimers only for outgoing emails, and even then not for all sender addresses. Therefore I've modified the /etc/postfix/disclaimer script a little bit - we'll come to that in a minute.

vi /etc/postfix/disclaimer_addresses

which holds all sender email addresses (one per line) for which alterMIME should add a disclaimer:

[email protected]
[email protected]
[email protected]

Now open /etc/postfix/disclaimer and modify it as follows (I have marked the parts that I've changed):

# Localize these.
####### Changed From Original Script #######
####### Changed From Original Script END #######
# Exit codes from <sysexits.h>
# Clean up when done or when aborting.
trap "rm -f in.$$" 0 1 2 3 15
# Start processing.
cd $INSPECT_DIR || { echo $INSPECT_DIR does not exist; exit $EX_TEMPFAIL; }
cat >in.$$ || { echo Cannot save mail to file; exit $EX_TEMPFAIL; }
####### Changed From Original Script #######
# obtain From address
from_address=`grep -m 1 "From:" in.$$ | cut -d "<" -f 2 | cut -d ">" -f 1`
if [ `grep -wi ^${from_address}$ ${DISCLAIMER_ADDRESSES}` ]; then
  /usr/bin/altermime --input=in.$$ \
                   --disclaimer=/etc/postfix/disclaimer.txt \
                   --disclaimer-html=/etc/postfix/disclaimer.txt \
                   --xheader="X-Copyrighted-Material: Please visit" || \
                    { echo Message content rejected; exit $EX_UNAVAILABLE; }
fi
####### Changed From Original Script END #######
$SENDMAIL "$@" <in.$$
exit $?
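The From: extraction in the modified script can be tried on its own. This sketch runs the same grep/cut pipeline against a hypothetical message file (the address and message content are made-up examples):

```shell
#!/bin/sh
# Demonstrate the From: address extraction used in /etc/postfix/disclaimer.
# The message file and address below are test placeholders.
msg=/tmp/disclaimer-test.$$
printf 'From: Alice Example <[email protected]>\nTo: [email protected]\n\nHello\n' > $msg
from_address=`grep -m 1 "From:" $msg | cut -d "<" -f 2 | cut -d ">" -f 1`
echo "$from_address"
rm -f $msg
```

Note that this simple pipeline only handles the "Name <address>" header form; a bare "From: [email protected]" header has no angle brackets, so the cut commands would pass the whole line through unchanged.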

Next we need the text file /etc/postfix/disclaimer.txt which holds our disclaimer text. Ubuntu's alterMIME package comes with a sample text that we can use for now (of course, you can modify it if you like):

cp /usr/share/doc/altermime/examples/disclaimer.txt /etc/postfix/disclaimer.txt

Finally we have to tell Postfix that it should use the /etc/postfix/disclaimer script to add disclaimers to outgoing emails.

vi /etc/postfix/master.cf

and add -o content_filter=dfilt: to the smtp line:

# Postfix master process configuration file.  For details on the format
# of the file, see the master(5) manual page (command: "man 5 master").
# ==========================================================================
# service type  private unpriv  chroot  wakeup  maxproc command + args
#               (yes)   (yes)   (yes)   (never) (100)
# ==========================================================================
smtp      inet  n       -       -       -       -       smtpd
   -o content_filter=dfilt:

At the end of the same file, add the following two lines:

dfilt     unix    -       n       n       -       -       pipe
    flags=Rq user=filter argv=/etc/postfix/disclaimer -f ${sender} -- ${recipient}

Restart Postfix afterwards:

/etc/init.d/postfix restart

That's it! Now a disclaimer should be added to outgoing emails sent from the addresses listed in /etc/postfix/disclaimer_addresses.


20. Screenshots


You should now have a completely working SpamSnake.
