HowtoForge Forums | HowtoForge - Linux Howtos and Tutorials


shinyjoy 29th October 2007 13:13

Web Spiders
Hi all,

Can anyone help me out with creating web spiders? I have been able to do it site-specifically using cURL, i.e. for one site, but I need to integrate several sites. Can anyone help me out? :(



edge 29th October 2007 20:45

Maybe this is of some use:

shinyjoy 30th October 2007 05:40

Thank you, but...

Originally Posted by edge
Maybe this is of some use:

But you see, that spider only performs searches. What I wanted was to log in to authorized sites using a username and password and retrieve data from those sites. It's possible using cURL; I did one site with it, but I have to handle several sites, hence the need to make a generalized one using a database and so on.
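The generalized, config-driven approach described above could be sketched like this in Python using only the standard library. The site name, URL, and form field names below are hypothetical stand-ins for what would come out of the database:

```python
import urllib.parse
import urllib.request

# Hypothetical per-site login configuration, as might be stored in a database.
# Each entry records the login URL and the form field names that site expects.
SITE_CONFIGS = {
    "example-site": {
        "login_url": "https://example.com/login",
        "user_field": "username",
        "pass_field": "password",
    },
}

def build_login_request(site, user, password):
    """Build a POST request for the given site's login form."""
    cfg = SITE_CONFIGS[site]
    form = {cfg["user_field"]: user, cfg["pass_field"]: password}
    data = urllib.parse.urlencode(form).encode("ascii")
    return urllib.request.Request(cfg["login_url"], data=data)

# In a real spider you would open this request with a cookie-aware opener
# (urllib.request.build_opener(urllib.request.HTTPCookieProcessor()))
# and then fetch the authorized pages through that same opener, so the
# session cookie from the login carries over.
req = build_login_request("example-site", "alice", "secret")
print(req.full_url)  # https://example.com/login
print(req.data)      # b'username=alice&password=secret'
```

Supporting a new site then means adding one configuration row rather than writing a new cURL script, which is the point of generalizing.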



falko 30th October 2007 18:36

It should be possible with cURL. wget and Snoopy ( ) might be other options.

leblanc 15th November 2007 21:07

How about JavaScript?
Raw-HTML web spiders are a thing of the past...

You need a full-blown browser API at your fingertips.

How about pages that modify the DOM after the page has already been loaded?

For example, Safari Books does this, just to make it difficult for the end user to simply strip the HTML.

See if you can use the WebClient class (System.Net); you can use C# and Visual Studio Express to test it out.

If not, you need to build Mozilla or hijack IE using DLLs. The goal is to work with a browser programmatically. See the Mono project for their Mozilla client API.

It's on my to-do list...
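Short of driving a full browser, a raw-HTML spider can at least detect when a page's content is script-generated rather than present in the markup it downloaded. A minimal heuristic sketch in Python (the sample HTML and the 20-character threshold are invented for illustration):

```python
from html.parser import HTMLParser

class ScriptDetector(HTMLParser):
    """Count <script> tags and visible static text in a page."""
    def __init__(self):
        super().__init__()
        self.script_tags = 0
        self.in_script = False
        self.visible_chars = 0

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.script_tags += 1
            self.in_script = True

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = False

    def handle_data(self, data):
        # Text inside <script> is code, not visible page content.
        if not self.in_script:
            self.visible_chars += len(data.strip())

def looks_script_generated(html):
    """Heuristic: scripts present but almost no static text in the markup."""
    d = ScriptDetector()
    d.feed(html)
    return d.script_tags > 0 and d.visible_chars < 20

# Invented sample: a page whose body is filled in by JavaScript after load.
page = '<html><body><div id="app"></div><script>render()</script></body></html>'
print(looks_script_generated(page))  # True
```

A spider could route pages that trip this heuristic to a real browser-based fetcher and handle everything else with plain HTTP.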

petter5 22nd February 2008 00:03

You can download OmniFind for free from
It will meet your requirements.

It's based on Nutch.

OmniFind is easy to use and can be installed by absolute noobs. ;)

It runs on:

* 32-bit Red Hat Enterprise Linux Version 5
* 32-bit SUSE Linux Enterprise 10
* 32-bit Windows XP SP2
* 32-bit Windows 2003 Server SP2

/ Petter

jonepain 23rd December 2009 07:21

There are a few ways to spider. The first, which I'll call general spidering, simply grabs a page and searches it for whatever you're looking for, for instance a search phrase. The second, specific spidering, grabs only a certain portion of a page. This scenario is useful in cases where you might want to grab news headlines from another site. If you want to get fancy, you can build in functionality to ignore links that are within the same site.

If you have used an ASP page, there are a few drawbacks, however. Normally, you can get around this issue by not allowing the ITC to use default values; specify the values every time. Another, more serious, problem involves licensing issues. ASPs do not have the ability to invoke the license manager. The license manager checks the key in the actual component and compares it to the one in the Windows registry. If they're not the same, the component won't work.
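The two styles described above can be sketched in a few lines of Python. The sample page and the `<h2 class="headline">` markup are invented for illustration; a real specific spider would match whatever markup the target site actually uses:

```python
import re

def general_spider(html, phrase):
    """General spidering: grab a whole page and search it for a phrase."""
    return phrase.lower() in html.lower()

def specific_spider(html):
    """Specific spidering: grab only a certain portion of a page,
    e.g. news headlines marked up (in this invented sample) as
    <h2 class="headline">."""
    return re.findall(r'<h2 class="headline">(.*?)</h2>', html)

page = """
<html><body>
  <h2 class="headline">Linux kernel 2.6.23 released</h2>
  <h2 class="headline">New vBulletin update available</h2>
  <p>Other page text about web spiders.</p>
</body></html>
"""

print(general_spider(page, "web spiders"))  # True
print(specific_spider(page))
# ['Linux kernel 2.6.23 released', 'New vBulletin update available']
```

For anything messier than this, a real HTML parser beats regular expressions, but the split between "search the whole page" and "extract one known region" stays the same.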


Powered by vBulletin® Version 3.8.7
Copyright ©2000 - 2014, vBulletin Solutions, Inc.