Spam 2.0: The Problem Ahead
Date: 2010

Abstract
Webspam is one of the most challenging problems faced by major search engines in the social computing arena. Spammers exploit weaknesses in search engine algorithms to place their websites in the top 10 search results, which yields higher traffic and increased revenue. The growth of web applications that accept user-contributed content has also increased spam, since many such applications (blogging tools, content management systems, etc.) are vulnerable to it. Spammers have developed targeted bots that automatically create accounts on these applications, add content and even leave comments. In this paper we introduce the field of webspam: what it refers to, how spambots are designed and propagated, and why webspam is becoming a serious problem. We then present an experiment showing how spambots can be identified without using CAPTCHA. We aim to improve general understanding of the webspam problem, which will assist web developers, software engineers and web engineers.
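As an illustration of CAPTCHA-free spambot detection (not the authors' specific method), one widely used approach combines a hidden "honeypot" form field, which humans never see but naive bots fill in, with form-submission timing, since bots typically submit far faster than a human could type. The field name and timing threshold below are hypothetical.

```python
# Illustrative sketch only: a CAPTCHA-free spambot heuristic using a
# honeypot field plus submission timing. Field names and thresholds
# are assumptions, not values from the paper.

MIN_HUMAN_SECONDS = 3.0  # assumed: humans need a few seconds to fill a form


def looks_like_spambot(form_data: dict, rendered_at: float, submitted_at: float) -> bool:
    """Return True if the submission matches simple bot signatures."""
    # 1. Honeypot check: the hidden field (invisible via CSS to humans)
    #    should stay empty; naive bots fill every field they find.
    if form_data.get("website_url_hp", ""):  # hypothetical hidden field name
        return True
    # 2. Timing check: suspiciously fast submissions are likely automated.
    if submitted_at - rendered_at < MIN_HUMAN_SECONDS:
        return True
    return False


# Example: a bot fills every field and submits in under a second.
bot_form = {"comment": "Buy now!", "website_url_hp": "http://spam.example"}
print(looks_like_spambot(bot_form, rendered_at=0.0, submitted_at=0.4))  # True
```

Heuristics like this are attractive because they add no friction for legitimate users, though determined spammers can learn to skip honeypot fields, so they are usually layered with content-based filtering.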
Related items
Showing items related by title, author, creator and subject.
-
Hayati, Pedram; Chai, Kevin; Potdar, Vidyasagar; Talevski, Alex (2010)Web spam is an escalating problem that wastes valuable resources, misleads people and can manipulate search engines in achieving undeserved search rankings to promote spam content. Spammers have extensively used Web robots ...
-
Goh, Kwang Leng (2013)Web spamming has tremendously subverted the ranking mechanism of information retrieval in Web search engines. It manipulates data source maliciously either by contents or links with the intention of contributing negative ...
-
Hayati, Pedram; Potdar, Vidyasagar; Talevski, Alex; Chai, Kevin (2011)The growth of spam in Web 2.0 environments not only reduces the quality and trust of the content but it also degrades the quality of search engine results. By means of web spambots, spammers are able to distribute spam ...