I recently noticed that SpamAssassin has been letting a LOT more spam through than it used to. The downside to running my own server for so long is that sometimes I have a “set it and forget it” attitude, and frankly, tinkering with mailserver settings hasn’t been high on my list. But I was tired of getting hit so often, so I took a bit of time and refreshed my setup. Another way to look at it: I started fresh with SpamAssassin so that my config from 2008-ish wouldn’t be so out of date. Unfortunately, that alone wasn’t a good fix, so the process has been ongoing.
SpamAssassin setup
I’m not going to go into specifics on most of this because it will vary based on your environment, but here is the high-level view of what I attempted:
- apt-get purge spamassassin spamc
- re-installed the latest SpamAssassin (3.4.0 as of this writing)
- added Sought rules (http://taint.org/2007/08/15/004348a.html)
- Found missing Perl modules and installed them (spamassassin -D --lint 2>&1 | grep -i failed)
- Set up fake high/low MX records to trick poorly written bots (https://wiki.apache.org/spamassassin/OtherTricks)
- Adjusted postfix’s smtpd_recipient_restrictions to…be more restrictive
- Verified RBL checks were working properly by saving a previously missed spam message (with headers) and running it on the CLI as ‘spamassassin -t -D < spam.email 2>&1 | less’
- Created two files, one containing only good (ham) email and one containing spam that had been missed, then ran sa-learn against each with the respective spam/ham flags to help the Bayes filter along
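The training step at the end can be sketched like this (the mbox paths are just examples from my own sorting, not anything standard):

```shell
# Teach the Bayes filter from hand-sorted mailboxes (example paths)
sa-learn --mbox --spam /tmp/missed-spam.mbox
sa-learn --mbox --ham  /tmp/known-good.mbox

# Show how many tokens/messages the Bayes database has learned so far
sa-learn --dump magic
```

Worth knowing: by default Bayes won’t start scoring (the BAYES_* rules stay quiet) until it has learned at least 200 ham and 200 spam messages.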
But it doesn’t seem that is helping enough. Don’t get me wrong, my mailserver is happily rejecting lots of messages and properly flagging most of them as spam, but it is infuriating to watch such obvious crap get through. I even took the more drastic measure of increasing the RBL scores for a few of the lists to 4 points, but there are too many IPs out there that haven’t made it into those lists yet.
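For reference, that score bump lives in local.cf and looks something like the following (RCVD_IN_XBL and RCVD_IN_BL_SPAMCOP_NET are stock rule names; which rules and values you pick is a judgment call):

```
# local.cf: weight a few RBL hits more heavily than the stock scores
score RCVD_IN_XBL            4.0
score RCVD_IN_BL_SPAMCOP_NET 4.0
```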
Greylisting (the bane of my mailserver existence)
I hate the idea of greylisting. No, actually, I love the idea of it, but it is incredibly annoying… particularly when you are waiting on an automated email from a shopping site, or waiting for your password-reset email to come through and you don’t know if their server will retry in 5 minutes or 5 hours. It places the annoyance squarely on the recipient’s shoulders, and that isn’t how email should work!
But, what if we only greylisted items that made it into an RBL? Surely that would reduce a lot of spam… luckily, I am not the first person to think of this, so I didn’t have to write it from scratch.
http://giovanni.bajo.it/post/47121521214/grey-on-black-combining-greylisting-with-rbl has a nice overview of the process, and their GitHub account has seen a lot of very recent updates to it. Within a few minutes I had this up and running. I went with their suggestions initially, with only one change: I added ‘reject_rbl_client b.barracudacentral.org,’ to my main.cf’s ‘smtpd_recipient_restrictions’.
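For context, that change is a one-liner in main.cf; the restriction list ends up looking roughly like this (order matters, and the surrounding entries here are just a typical example, not my exact file):

```
smtpd_recipient_restrictions =
    permit_mynetworks,
    reject_unauth_destination,
    reject_rbl_client b.barracudacentral.org,
    ...
```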
The upside is that this all worked as expected, and I could see it doing its job… but the major downside is that it really didn’t help at all. While I think it’s an excellent approach overall, it’s just not an effective tool for combating spam (but again, better than many of the suggested alternatives).
Greylisting with SPF Checking
So, back to the drawing board. After more googling around I came across tumgreyspf. This is a slightly different approach: a combination of SPF record checking and greylisting. Again, it was very easy to put into place and I could see it working immediately… but all in all, this is only slightly different from just using postgrey. It is helpful that you can configure it to check SPF records first, and THEN greylist only if that check failed, but you’re still going to end up waiting around for some of those emails.
But yanno what really hurts? Seeing mail get greylisted and then STILL receiving some crappy spam with a “.link” TLD URL in it. Painful.
Defeating annoying .link spam
So after yet more googling, I found out that SpamAssassin uses a Perl module called RegistrarBoundaries.pm to help with its URI blacklisting. It was very recently updated to include many of the new TLDs and can be found at http://svn.apache.org/repos/asf/spamassassin/trunk/lib/Mail/SpamAssassin/Util/RegistrarBoundaries.pm. And if you’re running the latest 3.4.0 release, you can simply overwrite your locally installed version with this one and be done with it.
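Swapping in the updated module is just a file copy; something like the following, assuming a Debian-style Perl install path (check where your distro actually puts Mail/SpamAssassin/Util first):

```shell
# Path below is typical for Debian/Ubuntu; adjust for your system
cd /usr/share/perl5/Mail/SpamAssassin/Util
cp RegistrarBoundaries.pm RegistrarBoundaries.pm.orig   # keep a backup
wget -O RegistrarBoundaries.pm \
  'http://svn.apache.org/repos/asf/spamassassin/trunk/lib/Mail/SpamAssassin/Util/RegistrarBoundaries.pm'
spamassassin --lint   # make sure nothing is broken before restarting
```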
I also learned about the SEM_FRESH ruleset, which will flag any domains registered less than 5 days ago. By adding this rule to my local /etc/spamassassin/ directory, I should be able to add enough points to those sites to push them back over the spam threshold.
Putting it all together, I now have a configuration kinda like this:
tumgreyspf installed, added to master.cf as:
tumgreyspf unix - n n - - spawn
  user=nobody argv=/usr/bin/tumgreyspf
and these settings in main.cf:
tumgreyspf_time_limit = 3600
disable_vrfy_command = yes
smtpd_delay_reject = yes
smtpd_helo_required = yes
smtpd_helo_restrictions = permit_mynetworks,
smtpd_error_sleep_time = 1s
smtpd_soft_error_limit = 10
smtpd_hard_error_limit = 20
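Not shown above is the glue that actually routes mail through tumgreyspf: a check_policy_service entry in main.cf’s smtpd_recipient_restrictions pointing at the master.cf service, roughly like this (the socket name must match the service name; surrounding entries are an example):

```
smtpd_recipient_restrictions =
    permit_mynetworks,
    reject_unauth_destination,
    check_policy_service unix:private/tumgreyspf
```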
SpamAssassin’s local.cf, which includes:
skip_rbl_checks 1 ##doing this via greylisting, don't check twice!
and a new ruleset, /etc/spamassassin/10_semfresh.cf:
urirhssub SEM_FRESH fresh.spameatingmonkey.net. A 2
body SEM_FRESH eval:check_uridnsbl('SEM_FRESH')
describe SEM_FRESH Contains a domain registered less than 5 days ago
tflags SEM_FRESH net
score SEM_FRESH 2.5
Has this solved my problem? Only time will tell… and unfortunately, now I have to wait a bit longer, thanks to greylisting.