Need shell script for extracting website content for a list of given websites

Discussion in 'Programming/Scripts' started by unknowngeek, Aug 31, 2009.

  1. unknowngeek

    unknowngeek New Member


    I used bash scripting for some time in the past and know quite a
    bit of Linux, but I haven't used it in a long time.

    I am between jobs and took a data-entry job where I need to do some statistics work on a very long list of websites.

    I get these from a website called Quantcast, which gives
    statistics on other web sites when I enter each site from the list
    I already have into the SITENAME field there:

    Here's the URL format for stats on Quantcast:

    Here's an example with a real site name in place of SITENAME above:

    (The problem now is that the site gives the stats in images instead of text so I will have a hard time getting the values out of the images.)

    After getting this kind of list, it has to be input into an xls
    sheet. So I will need this information in some file format
    which can be imported into Excel easily, such as CSV.

    I know this can be done, but I will need to relearn quite a lot of
    bash scripting. I have not used it for some time.

    Can someone please help me make this script? I will be able to
    understand the basics for sure.

    This is what I had intended to do at first, when I didn't know that the stats are in images rather than text:

    What I had intended to do was in these steps:

    1. Copy / Paste the site list into a text file. (Each site is on a
    newline already.)

    2. Use something like sed/awk to insert the Quantcast URL prefix
    before each of these site names.

    3. To each result from (2) above, append "/demographics"
    at the end of the URL.

    4. The above will now be a file with a stats-page link for each of
    the sites. Using wget, download each page and save it under some
    name or number.

    5. Tidy the HTML, or somehow batch-convert it to plain text.

    6. From this plain-text file, extract the given field names and
    their values (that is, the numbers) and save this info for each
    site into a new file that also includes only the original site name
    (without the rest of the URL), using the cut command or something.

    7. Convert this file into a CSV or tab-delimited format.
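    For reference, steps 1–4 above could be sketched roughly like this in bash. The Quantcast base URL is my guess at the format (the thread doesn't show it), so adjust it before use; the two site names are just placeholders for the pasted list:

```shell
# Step 1: sites.txt holds one site name per line (placeholders here).
printf 'example.com\nexample.org\n' > sites.txt

# Steps 2 and 3: prepend the base URL, append /demographics.
# NOTE: the base URL below is an assumption; replace it with the real one.
sed -e 's|^|http://www.quantcast.com/|' -e 's|$|/demographics|' sites.txt > urls.txt

# Step 4: download each page under a numbered name.
n=0
while read -r url; do
    n=$((n + 1))
    # Uncomment to actually download:
    # wget -q -O "page-$n.html" "$url"
    echo "would fetch $url -> page-$n.html"
done < urls.txt
```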

    As I see it, the CSV conversion would be the last thing to do.
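    If the stats really were text (the original plan), steps 5–7 might look something like this. The sample page and the "Field: value" layout are invented for illustration; the real pages would need their own extraction pattern:

```shell
# A made-up sample page, standing in for one downloaded stats page.
cat > page-1.html <<'EOF'
<html><body>
<p>Male: 52%</p>
<p>Female: 48%</p>
</body></html>
EOF

# Step 5: strip the HTML tags, then drop blank lines.
sed -e 's/<[^>]*>//g' page-1.html | sed '/^ *$/d' > page-1.txt

# Steps 6 and 7: turn the "Field: value" lines into one CSV row,
# led by the original site name (without the rest of the URL).
site="example.com"
values=$(awk -F': ' '{print $2}' page-1.txt | paste -sd, -)
echo "$site,$values" >> stats.csv
```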

    But first, I need to try to get each page, save it with a numbered
    name, then extract the images.

    Save the images for each site in a separate folder.
    (The image names are the same for every site; for example,
    demograp.png, demograq.png, demograr.png, demogras.png.)
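    A rough sketch of that per-site layout, assuming the image links can be pulled out of each saved page with grep (the HTML snippet and paths here are invented, and the base URL in the wget line is a guess):

```shell
# A made-up downloaded page containing the demographic images.
cat > page-1.html <<'EOF'
<img src="/demog/demograp.png"><img src="/demog/demograq.png">
EOF
site="example.com"

# One folder per site, so the identically named images don't collide.
mkdir -p "images/$site"

# Pull the .png paths out of the page; in a real run, fetch each one.
grep -o 'src="[^"]*\.png"' page-1.html | sed -e 's/^src="//' -e 's/"$//' |
while read -r img; do
    # Uncomment to actually download (base URL is an assumption):
    # wget -q -P "images/$site" "http://www.quantcast.com$img"
    echo "would save $img into images/$site/"
done
```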

    And then extract the content of each image using any utility I can
    find. I hope this part really can be done.
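    One utility worth trying here is tesseract, an open-source OCR tool; it isn't mentioned in the thread, and whether it can read these particular chart images is untested, so this sketch only generates the commands rather than running them:

```shell
# Placeholder image files, standing in for one site's downloaded charts.
mkdir -p images/example.com
touch images/example.com/demograp.png images/example.com/demograq.png

# Generate one OCR command per image; tesseract writes its result
# to <out>.txt. NOTE: tesseract is a suggestion, not from the thread.
for img in images/example.com/*.png; do
    out="${img%.png}"
    echo "tesseract $img $out"
    # Pipe these lines to sh (or drop the echo) once tesseract is installed.
done
```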

    Any help is very much appreciated. Thanks!
  2. cfajohnson

    cfajohnson New Member

    Do the images not have an alt text that you could use?
  3. unknowngeek

    unknowngeek New Member

    Unfortunately, no.

    They don't have the text that is needed from the image, just generic text, the same for all the demographics.

    Anyway, I've been typing them all in and have completed about 3,000; another 7,000 remain.
  4. PatrickMc

    PatrickMc New Member

    Collecting data from web pages

    Yes, automating manual data entry with a scripting language is easy, and collecting and organizing data with biterscripting is very easy. I could write a custom script for you, but instead of me scratching my head, take a look at the Google group article , where they are doing the exact same thing and have posted some scripts for collecting business addresses, telephone numbers, etc. from web pages.

