
Mixing static pages and CMS



agua
02-15-2007, 09:55 PM
I have a client who has a site which is around 3 years old. It is made up of static pages - around 350.

He doesn't want to pay for moving these pages into a CMS, but wants to start using a CMS for all new pages to the site.

Are there any pitfalls in doing this, or is it the norm?

deronsizemore
02-16-2007, 07:32 AM
I'm not completely sure it's possible the way most CMSs are set up. With the CMSs I've used, even though a page is "static" the information is still stored in a database, so I'm not sure how the CMS would handle 350 static HTML files simply uploaded to the root directory of your site. Maybe it'll be fine.

As for pitfalls, I guess one would be that it's more work to back everything up since the info wouldn't reside in a database. I guess it wouldn't be that bad if the info on the 350 pages was in fact static and NEVER changed, but if something did change the client would have to know how to back it up by hand to his local machine/CD/etc.

I'm sure this is all stuff you've already thought of though.

Chris
02-16-2007, 07:53 AM
I can't see how you could avoid at least entering article titles & names into the CMS. It seems like a stupid thing to skimp on.

KLB
02-16-2007, 08:09 AM
I'd agree with the others that not entering the static pages into the CMS seems like a really bad idea.

MaxS
02-16-2007, 04:28 PM
Well, it wouldn't be too hard to scrape the website for data so that you can insert it into the CMS (and hopefully keep the link structure intact).

agua
02-16-2007, 04:34 PM
I thought as much :(

I did have the idea of spreading the cost across his hosting fees so it isn't a lump-sum payment.

Scraping sounds good - I've heard lots about it, but don't really know what it is. It would speed up the process considerably and cut down costs.

How do I "scrape"?

michael_gersitz
02-16-2007, 05:31 PM
A PHP or Perl script that reads the text on a page and puts it into a database.
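
Roughly something like this (just a sketch - the "articles" table and its columns are invented, and a real script would want to cut out the shared header/footer markup rather than grab the whole body):

<?php
// Sketch: pull the title and body text out of one static HTML file and
// insert it into a database table. The table and column names are
// placeholders, not any particular CMS's schema.
$html = file_get_contents('old-site/article1.html');

$doc = new DOMDocument();
@$doc->loadHTML($html); // @ silences warnings from sloppy old markup

$title = $doc->getElementsByTagName('title')->item(0)->textContent;
$body  = $doc->getElementsByTagName('body')->item(0)->textContent;

$pdo  = new PDO('mysql:host=localhost;dbname=cms', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO articles (title, body) VALUES (?, ?)');
$stmt->execute(array(trim($title), trim($body)));
?>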

-------------


How are all the current 250 articles arranged on the site? All in one folder, like /articles/?

And how are they named - something like /article1.php?

mark1
02-16-2007, 05:57 PM
If you're not very proficient with PHP, you can always use a program to do the work for you. I'm currently using Robosuite by Kapow, but it's way too expensive to justify a purchase just for this simple job.
I've heard screenscraper is good enough and costs $90 or so.
Or you can use something like macroexpress to scrape and save that stuff.

...or copy and paste each single page manually... heh :eek:

rpanella
02-16-2007, 07:13 PM
Yes, scraping the articles into a DB would be very easy, assuming they all use the same template, etc. Making the CMS use the same URL structure as the original files may be a little more difficult, but depending on the situation it may not be that important.
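
If the old URLs do matter, one fairly common trick is a rewrite rule. This is only a sketch, assuming Apache with mod_rewrite, and "index.php?page=" is a stand-in for whatever the chosen CMS actually expects:

# Leave real files on disk alone, and map old-style article URLs onto
# the CMS for anything that has been migrated into the database.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^articles/(.+)\.html$ /index.php?page=$1 [L,QSA]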

MaxS
02-16-2007, 07:18 PM
Scraping the website would consist of a fairly simple PHP script (or Perl, ASP, etc). Of course, it really depends on how complex the pages are and what CMS you plan on using.

AmbulanceBlues
02-16-2007, 08:50 PM
I figured there had to be a free or open-source option for this sort of thing. Try any of these:

Sinon (http://sourceforge.net/projects/sinon/)
phpSpider (http://sourceforge.net/projects/phpspider/)
Smutster (Hehe...) :p (http://sourceforge.net/projects/smutster/)

mark1
02-17-2007, 10:53 AM
Ah, yeah, good luck customizing that stuff for your needs.
I mean, free stuff is good, but when you consider the time you'd spend learning how it works... :sick:
Give me a well-written manual any day...
Yes, I'm a PHP noob :bawling: shame on me :p

agua
02-18-2007, 07:13 PM
Thanks for the info.

The articles are in separate directories and on a variety of page templates with unique filenames - so this may not be as easy as it could be, but I'll give it a go.
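
I'm thinking something along these lines would at least find all the files, even spread across directories (just a sketch, assuming the old pages all end in .html or .htm - each path would then be fed into whatever script parses and inserts that page):

<?php
// Sketch: walk every directory under the old site root and print the
// path of each HTML file found. 'old-site' is a placeholder for the
// real document root; each path would go to the parse-and-insert step.
$files = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator('old-site')
);
foreach ($files as $file) {
    if (preg_match('/\.html?$/i', $file->getFilename())) {
        echo $file->getPathname(), "\n";
    }
}
?>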

Downloading Screen Scraper and Sinon now.