
80legs Is a Web Crawling Service

80legs uses Plura’s grid of over 50,000 computers to crawl over 2 billion pages a day. 80legs CEO Shion Deysarkar says their crawling services are typically requested by smaller search engines that cannot afford their own large-capacity grid, companies performing market research, organizations monitoring copyright infringement, and ad companies keeping an eye on what their competitors are doing.

The service is accessed on demand by setting up a job and executing it. As with any crawling process, the job needs a seed list, which can be supplied as a text file of up to 1 GB in size. The other job parameters, illustrated in the sketch after the list, are:

  • Outgoing links – specifies which of the links found on a seed page should be crawled
  • Depth level – the URL depth measured from a seed
  • Crawling type – crawl several depth levels at the same time or only one depth at a time
  • Number of URLs – the maximum number of URLs to crawl
  • MIME types – the page types to crawl
  • Analysis options – several analysis options such as keyword matching, regular expressions, or running custom code
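
To make the options concrete, the sketch below shows how such a job definition might look in Python. The field names are purely illustrative and only mirror the parameters above; the article does not document the actual 80legs job format.

    # Hypothetical job configuration mirroring the parameters listed above.
    # Field names are invented for illustration; they are not the real 80legs format.
    job_config = {
        "seed_list": "seeds.txt",             # text file of seed URLs, up to 1 GB
        "outgoing_links": "same_domain",      # which links found on a seed to crawl
        "depth_level": 3,                     # URL depth measured from a seed
        "crawl_type": "one_depth_at_a_time",  # or several depth levels at the same time
        "max_urls": 100_000,                  # maximum number of URLs to crawl
        "mime_types": ["text/html"],          # page types to crawl
        "analysis": {                         # keyword matching, regexes, or custom code
            "keywords": ["80legs", "crawler"],
            "regex": r"https?://[^\s\"']+",
        },
    }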

When a job runs, the crawler reads web pages, starting with the seeds and following the outgoing-links options, and analyzes the content of each page. Simple analysis is available by specifying keywords to match or by extracting information with regular expressions, while more complex analysis can be performed with a custom application or a pre-built 80legs application. Analysis applications need to be written in Java. 80legs plans to open an application store where developers can sell their applications at a price of their choosing and keep all the revenue, and it has launched a contest to attract developers.
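
The snippet below is a plain-Python illustration of the kind of per-page matching the built-in keyword and regular-expression options perform; it is not the 80legs application interface, since custom analysis applications have to be written in Java.

    import re

    # Illustrative only: match keywords and extract strings from a fetched page.
    def analyze_page(url, html, keywords, pattern):
        found = [kw for kw in keywords if kw.lower() in html.lower()]
        matches = re.findall(pattern, html)
        return {"url": url, "keywords": found, "matches": matches}

    result = analyze_page(
        "http://example.com",
        "<html><body>Contact us at info@example.com</body></html>",
        keywords=["contact", "careers"],
        pattern=r"[\w.+-]+@[\w-]+\.[\w.]+",   # e.g. extract e-mail addresses
    )
    # result -> {'url': ..., 'keywords': ['contact'], 'matches': ['info@example.com']}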

Paid subscriptions offer access to a Python API for interacting with the crawling engine; a Perl API is planned. Free subscribers can create and control their jobs through the 80legs Portal.
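
The sketch below shows what driving the crawling engine from Python might look like. The actual classes and methods of the 80legs Python API are not described here, so the client is a stand-in stub that only illustrates a create-start-poll workflow.

    import time

    class EightyLegsClient:
        """Illustrative stub; a real client would call the 80legs service."""
        def create_job(self, config):
            return "job-001"              # pretend the service assigned a job id
        def start_job(self, job_id):
            print(f"started {job_id}")    # hand the job off to the Plura grid
        def job_status(self, job_id):
            return "completed"            # a real client would query the service

    def run_crawl(client, config):
        job_id = client.create_job(config)    # register the job and its seed list
        client.start_job(job_id)
        while client.job_status(job_id) not in ("completed", "failed"):
            time.sleep(60)                    # poll until the crawl finishes

    run_crawl(EightyLegsClient(), {"seed_list": "seeds.txt", "max_urls": 100_000})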

There is a free plan with some limitations: 1 job at a time, 100,000 pages of at most 100 KB each, a 10 MB analysis application (Java JAR), no API access, and 1 hit per second per crawled domain. There are two paid subscriptions; the top one offers 5 concurrent, repeatable jobs with 10 million pages per job, 10 MB per page, a 10 MB JAR, and 10 hits per second per domain, priced at $2 per million pages crawled plus 3 cents per CPU-hour used.
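
For a sense of what the top plan costs in practice, here is a small worked example using the rates quoted above; the CPU-hour figure is an assumption, since actual CPU usage depends on the job.

    # Top plan rates: $2 per million pages crawled plus $0.03 per CPU-hour.
    pages_crawled = 10_000_000     # one full job at the 10M-page limit
    cpu_hours = 200                # assumed CPU time; varies with the analysis run
    cost = (pages_crawled / 1_000_000) * 2 + cpu_hours * 0.03
    print(f"${cost:.2f}")          # -> $26.00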
