Sorry about the overnight outage. The site was hit by a “XANAX” comment spam attack, prompting a well-justified overnight lockdown from Site5. The spam came in at a rate of about 2-5 comments per second from a botnet — a network of infected PCs — with machines mostly in the
212.138.64 range. This meant multiple IPs from various locations were simultaneously hammering mt-comments.cgi, and since I publish static files, each comment triggered a database entry and a static-file rebuild. Doing that hundreds of times per minute put intense load on the machine, so the comment spam flood effectively operated like a DDoS attack, slowing down every site on the server.
Since old open comments are usually the target of these attacks, I’ve reinstalled MTCloseComments to close off entries older than a month, and scoured my access logs for all the IPs in the botnet to add to my blacklist — an admittedly imperfect solution, but returning 403s to the spam bots should be enough of a stopgap for now, I guess. Pity; I was hoping to keep older comments open for further discussion, but the spam is just too much trouble. Hence the tragedy-of-the-commons aspect of spam: spammers, in their overpowering thirst for inbound links, will drown out conversation and disrupt the flow of communication for the chance that 0.1% of their victims are gullible enough to buy cheap fake drugs or click links on their ugly, keyword-littered websites.
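The blacklist boils down to a prefix match on the client IP before any comment processing runs. Here’s a minimal sketch of that check in Python — the real rules live in my server config, and treating the botnet’s 212.138.64 range as a /24 is my own assumption:

```python
import ipaddress

# Assumed /24 for the botnet's range; the post only says "212.138.64".
BLACKLISTED_NETS = [ipaddress.ip_network("212.138.64.0/24")]

def is_blacklisted(ip: str) -> bool:
    """Return True if the client IP falls inside any blacklisted range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in BLACKLISTED_NETS)

# A comment CGI could then bail out early with a 403 before touching
# the database or rebuilding any static files, e.g.:
#   if is_blacklisted(os.environ.get("REMOTE_ADDR", "")):
#       print("Status: 403 Forbidden\r\n")
#       sys.exit(0)
```

The point of rejecting at this stage is that a 403 costs almost nothing, while letting the request through costs a database write plus a full static rebuild.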
What I’m wondering is this: if a weblog spammer’s aim is to boost a site’s search-engine rank with links to it from many pages, what good is it when the rampant posting of those links just brings down the target site, making it impossible for search engines to crawl and defeating the purpose of the spam anyway — never mind that I’m already using nofollow on all comment and trackback links? It just goes to show, as I’ve said before: weblog spam might make a lot of money for the spammers and botnet operators, but money can’t buy them brains.