
Designing for Spam: A Challenge for the Web?

The increasing activity and hostility of spammers and the sophistication of their spamming tools are a growing concern for the web. A recent spam attack on Craigslist triggered many reactions in the blogosphere analyzing spammers' techniques and the implications of spam's spread.

John Nagle, quoted by Mike Masnick, describes, for instance, the way tools like CL Auto Posting Tool defeat Craigslist’s anti-spam techniques:

Craigslist tries to stop spamming by checking for duplicate submissions. They check for excessive posts from a single IP address. They require users to register with a valid E-mail address. They added a CAPTCHA to stop automated posting tools. And users can flag postings they recognize as spam.

Several commercial products are now available to overcome those little obstacles to bulk posting.  

[…]  

Random text is added to each spam message to fool Craigslist's duplicate message detector. IP proxy sites are used to post from a wide range of IP addresses. E-mail addresses for reply are Gmail accounts conveniently created by Jiffy Gmail Creator […] An OCR system reads the obscured text in the CAPTCHA. Automatic monitoring detects when a posting has been flagged as spam and reposts it.
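The duplicate-message detector mentioned above helps explain why random padding works so well. The following minimal sketch (hypothetical code, not Craigslist's actual implementation) contrasts an exact-hash duplicate check, which any appended random text defeats, with a normalized, shingle-based similarity check that still recognizes the padded posting as a near-duplicate:

```python
import hashlib
import re

def exact_fingerprint(text: str) -> str:
    # Naive duplicate check: any random padding changes the hash completely.
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def shingles(text: str, k: int = 5) -> set:
    # Normalize (lowercase, strip punctuation), then take overlapping
    # word k-grams ("shingles") as the posting's fingerprint.
    words = re.sub(r"[^a-z0-9 ]+", " ", text.lower()).split()
    return {" ".join(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def jaccard_similarity(a: str, b: str) -> float:
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

original = "Brand new iPhone for sale, great price, contact me today!"
padded = original + " xk93q zzwl7"   # random tokens appended by a spam tool

print(exact_fingerprint(original) == exact_fingerprint(padded))  # False: exact matching is defeated
print(jaccard_similarity(original, padded) > 0.5)                # True: still flagged as a near-duplicate
```

The threshold and shingle size here are arbitrary; a production system would tune them and combine the check with the other signals described above.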

Even the largest companies, like Google, with "thousands of employees and enormous budgets" at their disposal, are not safe from spammers' attacks. The Websense Security Labs blog describes the new techniques used to defeat Google's CAPTCHA (Completely Automated Public Turing test to tell Computers and Humans Apart) so that random Gmail accounts can be signed up and used for spamming purposes.

Two authors consider the implications of the growing threat of spam for the web. The author of the Discipline and Punish blog emphasizes that "this problem will only grow and grow as the web becomes the fundamental architectural and communication medium". He finds it rather surprising that, while many predictions are being made "about Web 5.0 and the Semantic Web few of these visions give much consideration to the threat of spam", whereas "spam is already a major factor in the viability of web 1.0 institutions" and Web 2.0 is even more vulnerable to spam given its focus on social features, collaboration, and aggregation. In his opinion, "the ability to resist the endless waves of spam" will define the viability of future distributed architectures; not taking this into consideration would be "a big mistake".

Also in response to the Craigslist attack, Jeff Atwood pointed out that spammers' activity "undermines the community's trust […] and devalues everyone's participation." He follows the same line as Discipline and Punish, arguing that "when you design your software, work under the assumption that some of your users will be evil" because "when you fail to design for evil, you have failed your community".
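One concrete, if simplified, illustration of designing for evil is the per-IP posting limit Craigslist is quoted above as enforcing. The sketch below is hypothetical code assuming a sliding-window limiter, not either author's implementation:

```python
import time
from collections import defaultdict, deque

class PostRateLimiter:
    """Reject posts from an IP that exceeds a fixed rate within a sliding window."""

    def __init__(self, max_posts: int = 5, window_seconds: int = 3600):
        self.max_posts = max_posts
        self.window = window_seconds
        self.history = defaultdict(deque)  # ip -> timestamps of recent posts

    def allow(self, ip: str, now: float | None = None) -> bool:
        now = time.time() if now is None else now
        timestamps = self.history[ip]
        # Drop timestamps that have fallen out of the sliding window.
        while timestamps and now - timestamps[0] > self.window:
            timestamps.popleft()
        if len(timestamps) >= self.max_posts:
            return False  # too many recent posts from this IP
        timestamps.append(now)
        return True

limiter = PostRateLimiter(max_posts=5, window_seconds=3600)
print(all(limiter.allow("203.0.113.7", now=i) for i in range(5)))  # True: first five posts pass
print(limiter.allow("203.0.113.7", now=6))                         # False: sixth post within the hour is rejected
```

As the quoted description notes, spammers route around exactly this kind of check with proxy networks, so a per-IP limit is only one layer among several.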

The Discipline and Punish blog highlights, however, the fact that in the Web 2.0 context spam is not necessarily the product of "bad guys". The author believes that, by encouraging spammy behavior from their users, "social networks like Facebook and super-aggregators like FriendFeed introduce a new type of social spam", one "that comes primarily from your 'friends'".

While several authors offer suggestions on how to fight spam coming from "bad guys", e.g. by developing new kinds of CAPTCHA or involving the community in spam control, no solutions have yet been put forward with regard to "social spam".
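For the community-based approach mentioned above, a minimal sketch (hypothetical, not any particular site's logic) might hide a posting once enough distinct users have flagged it as spam:

```python
from collections import defaultdict

class FlagTracker:
    """Hide a posting once enough distinct users have flagged it as spam."""

    def __init__(self, hide_threshold: int = 3):
        self.hide_threshold = hide_threshold
        self.flags = defaultdict(set)  # posting_id -> set of user_ids who flagged it

    def flag(self, posting_id: str, user_id: str) -> bool:
        # Storing flaggers in a set means repeated flags from one user count only once.
        self.flags[posting_id].add(user_id)
        return self.is_hidden(posting_id)

    def is_hidden(self, posting_id: str) -> bool:
        return len(self.flags[posting_id]) >= self.hide_threshold

tracker = FlagTracker(hide_threshold=3)
tracker.flag("post-42", "alice")
tracker.flag("post-42", "alice")         # duplicate flag from the same user is ignored
tracker.flag("post-42", "bob")
print(tracker.is_hidden("post-42"))      # False: only two distinct flaggers so far
print(tracker.flag("post-42", "carol"))  # True: third distinct flag hides the posting
```

As the quoted spammer tooling shows, flagged postings are simply monitored and reposted, so such a scheme would likely need to be paired with near-duplicate detection of the kind sketched earlier.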
