
Thread: Mixing static pages and CMS

  1. #1
    Senior Member agua's Avatar
    Join Date
    Sep 2005
    Location
    Pottsville, NSW
    Posts
    531

    Mixing static pages and CMS

    I have a client who has a site which is around 3 years old. It is made up of static pages - around 350.

    He doesn't want to pay for moving these pages into a CMS, but wants to start using a CMS for all new pages to the site.

    Are there any pitfalls in doing this, or is it the norm?
    I Do Website Design - but I am here to learn all about publishing

  2. #2
    I'm not completely sure it's possible the way most CMSs are set up. With the CMSs I've used, even though a page is "static" its content is still stored in a database, so I'm not sure how the CMS would handle 350 static HTML files simply uploaded to the root directory of the site. Maybe it'll be fine.

    As for pitfalls, I guess one would be that it's more work to back everything up, since the info wouldn't reside in a database. It wouldn't be that bad if the info on the 350 pages was in fact static and NEVER changed, but if something did change, the client would have to know how to back it up by hand to his local machine/CD/etc.
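    For the hand-backup concern above, even a simple dated archive would do. A sketch in Python; `SITE_ROOT` and `BACKUP_DIR` are placeholder paths for illustration, not the client's real layout:

    ```python
    import datetime
    import os
    import shutil

    # Placeholder paths for this sketch - point them at the real site.
    SITE_ROOT = "./site"
    BACKUP_DIR = "./backups"

    os.makedirs(SITE_ROOT, exist_ok=True)
    os.makedirs(BACKUP_DIR, exist_ok=True)

    # One dated .tar.gz of the whole static tree per run,
    # e.g. backups/static-pages-20240101.tar.gz
    stamp = datetime.date.today().strftime("%Y%m%d")
    archive = shutil.make_archive(
        os.path.join(BACKUP_DIR, f"static-pages-{stamp}"), "gztar", SITE_ROOT)
    ```

    Run on a schedule, that gives the client a copy he can burn to CD without knowing anything about databases.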

    I'm sure this is all stuff you've already thought of though.

  3. #3
    Administrator Chris's Avatar
    Join Date
    Feb 2003
    Location
    East Lansing, MI USA
    Posts
    7,055
    I can't see how you could avoid at least entering article titles & names into the CMS. It seems like a stupid thing to scrimp money on.
    Chris Beasley - My Guide to Building a Successful Website

  4. #4
    Site Contributor KLB's Avatar
    Join Date
    Feb 2006
    Location
    Saco Maine
    Posts
    1,181
    I'd agree with the others that not entering the static pages into the CMS seems like a really bad idea.
    Ken Barbalace - EnvironmentalChemistry.com (Environmental Careers, Blog)
    InternetSAR.org: Volunteers Assisting Search and Rescue via the Internet
    My Firefox Theme Classic Compact: Based on Firefox's classic theme but uses much less window space

  5. #5
    Registered
    Join Date
    Mar 2006
    Posts
    350
    Well, it wouldn't be too hard to scrape the website for data so that you can insert it into the CMS (and hopefully keep the link structure intact).

  6. #6
    Senior Member agua's Avatar
    Join Date
    Sep 2005
    Location
    Pottsville, NSW
    Posts
    531
    I thought as much

    I did think of spreading the cost across his hosting fees so it isn't a lump-sum payment.

    Scraping sounds good - I've heard lots about it, but don't really know what it is. It would speed up the process considerably and cut down costs.

    How do I "scrape"?
    I Do Website Design - but I am here to learn all about publishing

  7. #7
    mastermind michael_gersitz's Avatar
    Join Date
    Aug 2003
    Location
    Buffalo
    Posts
    749
    A PHP or Perl script that reads the text on a page and puts it into a database.
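    A minimal sketch of that idea, in Python rather than PHP or Perl so it's self-contained (stdlib `html.parser`, no network access). The markup in the example page and the title/body split are assumptions; a real run would read each file and INSERT the pair into the CMS's articles table:

    ```python
    import re
    from html.parser import HTMLParser

    class ArticleScraper(HTMLParser):
        """Pull the <title> and the visible body text out of one static page."""
        def __init__(self):
            super().__init__()
            self.title = ""
            self.body_parts = []
            self._in_title = False
            self._skip = 0  # depth inside <script>/<style>

        def handle_starttag(self, tag, attrs):
            if tag == "title":
                self._in_title = True
            elif tag in ("script", "style"):
                self._skip += 1

        def handle_endtag(self, tag):
            if tag == "title":
                self._in_title = False
            elif tag in ("script", "style") and self._skip:
                self._skip -= 1

        def handle_data(self, data):
            if self._in_title:
                self.title += data
            elif not self._skip:
                self.body_parts.append(data)

    def scrape(html):
        """Return (title, body_text) for one page's HTML."""
        p = ArticleScraper()
        p.feed(html)
        body = re.sub(r"\s+", " ", " ".join(p.body_parts)).strip()
        return p.title.strip(), body

    # One hypothetical static page, standing in for any of the 350.
    title, body = scrape(
        "<html><head><title>Widgets</title></head>"
        "<body><h1>Widgets</h1><p>All about widgets.</p></body></html>")
    ```

    The hard part in practice is that each template wraps the article in different navigation/boilerplate, so the extraction usually needs per-template tweaks.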

    -------------


    How are all the current 350 articles arranged on the site? All in one folder, like /articles/?

    and how are they named?

    like
    /article1.php ?

  8. #8
    Registered
    Join Date
    Jan 2006
    Location
    Italy/Thailand
    Posts
    27
    If you are not very proficient with PHP, you can always use a program to do the work for you. I'm currently using Robosuite by Kapow, but it's way too expensive to justify a purchase just for this simple job.
    I heard screenscraper is good enough and costs $90 or so.
    Or you can use something like macroexpress to scrape and save that stuff.

    ...or copy and paste each single page manually... heh

  9. #9
    Registered
    Join Date
    Aug 2006
    Location
    Sacramento, CA
    Posts
    208
    Yes, scraping the articles into a DB would be very easy, assuming they all use the same template, etc. Making the CMS use the same URL structure as the original files may be a little more difficult, but depending on the situation that may not be that important.
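    For the URL-structure point, one approach (assuming the CMS can accept an arbitrary slug per article) is to derive each slug from the old filename; if the CMS can't, a generated 301 redirect per page keeps old links and search rankings alive. Both helpers here are illustrative, not from any particular CMS:

    ```python
    import os.path

    def slug_for(old_path):
        """Derive a CMS slug from an old static URL so the public address
        doesn't change, e.g. /articles/widget-care.html -> articles/widget-care.
        Assumes the CMS routes /<slug> to the imported article."""
        root, _ext = os.path.splitext(old_path.lstrip("/"))
        return root

    def redirect_rule(old_path, new_url):
        """Fallback when old URLs can't be kept: emit an Apache mod_alias
        301 line for .htaccess, one per imported page."""
        return f"Redirect 301 {old_path} {new_url}"
    ```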

  10. #10
    Registered
    Join Date
    Mar 2006
    Posts
    350
    Scraping the website would consist of a fairly simple PHP script (or Perl, ASP, etc). Of course, it really depends on how complex the pages are and what CMS you plan on using.
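    Feeding such a script is mostly a matter of finding the files first, which matters here since the articles aren't all in one folder. A sketch in Python; the demo tree is a stand-in for a local copy of the real site:

    ```python
    import os
    import tempfile

    def find_static_pages(root):
        """Walk a local copy of the site and collect every static HTML
        file, wherever it sits in the directory tree."""
        pages = []
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                if name.lower().endswith((".html", ".htm")):
                    pages.append(os.path.join(dirpath, name))
        return sorted(pages)

    # Tiny demo tree standing in for the client's 350 scattered pages.
    demo = tempfile.mkdtemp()
    os.makedirs(os.path.join(demo, "articles"))
    for rel in ("index.html", os.path.join("articles", "one.htm")):
        open(os.path.join(demo, rel), "w").close()

    pages = find_static_pages(demo)
    ```

    Each collected path then gets opened, scraped, and inserted, one row per article.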

  11. #11
    I see mildly ill people. AmbulanceBlues's Avatar
    Join Date
    Aug 2006
    Location
    Houston
    Posts
    119
    I figured there had to be free or open-source options for this sort of thing. Try any of these:

    Sinon
    phpSpider
    Smutster (Hehe...)

  12. #12
    Registered
    Join Date
    Jan 2006
    Location
    Italy/Thailand
    Posts
    27
    Ah, yeah, good luck customizing that stuff for your needs.
    I mean, free stuff is good, but when you consider the time you spend learning how it works...
    Give me a well-written manual any day...
    Yes, I'm a PHP noob, shame on me.

  13. #13
    Senior Member agua's Avatar
    Join Date
    Sep 2005
    Location
    Pottsville, NSW
    Posts
    531
    Thanks for the info.

    The articles are in separate directories and use a variety of page templates with unique filenames - so this may not be as easy as it could be, but I'll give it a go.

    Downloading Screen Scraper and Sinon now.
    I Do Website Design - but I am here to learn all about publishing
