
Thread: Extract all links of all pages in a website

  1. #1
    Del (Junior Registered)
    Join Date
    Jan 2004
    Posts
    2

    Extract all links of all pages in a website

    Hi guys,

    I am forwarding you the same message I have sent to Westech; please let me know if you can help me.

    Message:

    I work with translation services, and sometimes I need to extract all pages from a website in order to give clients a quote based on word count. I have used these "website mappers", but they don't work: they extract only the main page, so I have to access the other pages one by one.

    From what I have read, that is because the top menus on most new sites are contained in a Flash file (.swf), and spider programs cannot process Flash binary files, complex JavaScript, Java applets, cookies, etc.

    I don't have a specific website to extract the links from, because each new client brings a new website.

    Also, some people have told me that it is very, very difficult to do and that I won't find software for it in the meantime.

    My own website, www.terralingua.com.br, can serve as an example.

    Please let me know if you know of a program that could do it.


    Regards,

    Daniel
    Porto Alegre - R.S/Brazil
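
    As a rough illustration of what such a tool does under the hood (not any particular product), here is a minimal sketch in Python using only the standard library. The start URL is a placeholder, and, like the mappers Daniel mentions, it only follows plain HTML links, so anything reachable only through a Flash menu or JavaScript will be missed.

    Code:
    # Minimal same-site crawler sketch: collects pages reachable through plain
    # HTML <a href="..."> links and tallies a rough word count per page.
    # It deliberately ignores robots.txt, retries, JavaScript and Flash
    # navigation -- exactly the cases discussed in this thread.
    import re
    import urllib.request
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse

    class LinkParser(HTMLParser):
        """Collects href targets from <a> tags."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(start_url, max_pages=50):
        host = urlparse(start_url).netloc
        to_visit, seen, word_counts = [start_url], set(), {}
        while to_visit and len(seen) < max_pages:
            url = to_visit.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                with urllib.request.urlopen(url, timeout=10) as resp:
                    if "text/html" not in resp.headers.get("Content-Type", ""):
                        continue
                    html = resp.read().decode("utf-8", errors="replace")
            except Exception:
                continue
            # Very rough word count: strip tags, split on whitespace.
            text = re.sub(r"<[^>]+>", " ", html)
            word_counts[url] = len(text.split())
            parser = LinkParser()
            parser.feed(html)
            for link in parser.links:
                absolute = urljoin(url, link.split("#")[0])
                if urlparse(absolute).netloc == host and absolute not in seen:
                    to_visit.append(absolute)
        return word_counts

    if __name__ == "__main__":
        # Placeholder start URL -- substitute the client's site.
        for page, words in crawl("http://www.example.com/").items():
            print(f"{words:6d} words  {page}")

    Run against a real site, this would print a rough word count for every page it can reach; pages linked only from an SWF menu simply will not show up, which is exactly the gap described above.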

  2. #2
    Chronic Entrepreneur
    Join Date
    Nov 2003
    Location
    Tulsa, Oklahoma, USA
    Posts
    1,112
    Sorry, I don't know of any software that can follow Flash links.

    I'm curious as to why you picked me to send a private message about this, since I'm sure that many of the other users here have more knowledge about this than I do.

    Good luck in your search!

  3. #3
    Resident smart a$$
    Join Date
    Sep 2003
    Location
    UK
    Posts
    152
    You could use an offline browser.

    Offline Explorer Pro is one of the better programs that do this, IMO. Here's a comparison table they've published.

    One thing to note when using one of these programs is not to overload the site you're downloading - there's usually an option to limit the number of pages downloaded per connection.

    Originally posted by Del
    From what I have read, that is because the top menus on most new sites are contained in a Flash file (.swf), and spider programs cannot process Flash binary files
    Google and All The Web have no problem following links within SWF files, and the software mentioned above can follow them as well.
    Last edited by Best.Flash; 02-26-2004 at 11:39 PM.
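
    On the point about not overloading a site: whatever tool is used, the two knobs that matter are a delay between requests and a hard cap on how many pages get pulled. As a minimal sketch only (assuming Python; the URLs, delay and cap values are placeholders), the idea looks like this:

    Code:
    # Sketch of a polite fetch loop: a fixed delay between requests and a hard
    # cap on the number of pages, the same idea as the "limit pages per
    # connection" option in the offline-browser programs mentioned above.
    import time
    import urllib.request

    def fetch_politely(urls, delay_seconds=2.0, max_pages=100):
        """Fetch at most max_pages URLs, sleeping between requests."""
        pages = {}
        for url in urls[:max_pages]:
            try:
                with urllib.request.urlopen(url, timeout=10) as resp:
                    pages[url] = resp.read()
            except Exception:
                pages[url] = None          # record the failure, keep going
            time.sleep(delay_seconds)      # be gentle with the server
        return pages

    if __name__ == "__main__":
        # Placeholder URLs -- in practice these would come from a crawl.
        sample = ["http://www.example.com/", "http://www.example.com/about.html"]
        results = fetch_politely(sample, delay_seconds=1.0, max_pages=10)
        print(f"fetched {sum(1 for body in results.values() if body)} pages")

    The delay and cap values here are arbitrary; the commercial programs discussed in this thread expose the same idea as settings rather than code.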

  4. #4
    Del (Junior Registered)
    Join Date
    Jan 2004
    Posts
    2

    Reply

    Hi,

    I sent you that because I read a message of yours about extracting links, so...

    I appreciate your attention anyway.


    Regards,

    Daniel

