In the process of writing a LALR(1) parser for NBTSWikiWiki, and then seeing the front page get spammed a second time, I wonder now if stricter parsing would have avoided the problem. If pages couldn’t contain invalid markup constructs, and you simply rejected any edit that didn’t parse, then that last spam would have failed, since NBTSWikiWiki’s syntax is a bit different from what the spammers were expecting.
It’s probably a bad idea, but it’s a thought. In a more geeky use of a wiki, it might be very smart.
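
For the curious, here’s a minimal sketch of the idea in Python: an edit handler that refuses to save anything that doesn’t parse. The grammar below is a toy stand-in (balanced `[[...]]` links and `''` bold pairs), not NBTSWikiWiki’s actual syntax, and `parses_cleanly` / `accept_edit` are names invented for illustration.

```python
def parses_cleanly(text: str) -> bool:
    """Return True only if the markup is well-formed under the toy grammar."""
    i, link_open, bold_open = 0, False, False
    while i < len(text):
        if text.startswith("[[", i):
            if link_open:              # no nested links
                return False
            link_open = True
            i += 2
        elif text.startswith("]]", i):
            if not link_open:          # stray closing bracket
                return False
            link_open = False
            i += 2
        elif text[i] in "[]":
            return False               # lone bracket: not valid markup
        elif text.startswith("''", i):
            bold_open = not bold_open  # toggle bold on/off
            i += 2
        else:
            i += 1
    return not link_open and not bold_open

def accept_edit(new_text: str) -> bool:
    """The whole policy: an edit is saved only if it parses."""
    return parses_cleanly(new_text)

if __name__ == "__main__":
    # A well-formed edit is accepted.
    print(accept_edit("A [[WikiLink]] and some ''bold'' text."))          # True
    # BBCode-style spam uses single brackets the grammar doesn't allow,
    # so the edit is rejected outright.
    print(accept_edit("[url=http://spam.example/]cheap pills[/url]"))     # False
```

Note how the second example plays out exactly like the scenario above: the spam is written in a markup the wiki doesn’t speak, so under strict parsing it never makes it onto the page.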