Flattening a website

Jamon Camisso jamon.camisso-H217xnMUJC0sA/PxXw9srA at public.gmane.org
Fri Apr 17 00:56:38 UTC 2009


Christopher Browne wrote:
> On 2009-04-16, Chris F.A. Johnson <cfaj-uVmiyxGBW52XDw4h08c5KA at public.gmane.org> wrote:
>> On Thu, 16 Apr 2009, William O'Higgins Witteman wrote:
>>
>>> I have a strange, interesting problem.  I have a dynamic website which
>>> is going to stop being dynamic - it will no longer be updated in the
>>> future, but we'd like to keep it up and running, rather than going dark.
>>>
>>> The dynamic nature of the site is pretty simple - just a content
>>> management system.
>>>
>>> I was wondering - could you put squid in front of this site, hit it with
>>> a crawler, and then replace the old site with the contents of the cache?
>>> Is this a sane approach?
>>>
>>> Any suggestions would be welcome.  Thanks!
>>     I'd use wget -r to duplicate the site locally and replace it with
>>     that.
> 
> I did this once with a "legacy" Bugzilla instance, where there were
> such massive changes between old and new versions that it was
> implausible to convert data from the old instance into the new one.
> 
> By pulling a copy via wget, we had an archive of the old bugs.  Not
> editable, of course, but they were *available*...

I'm in the same boat, with about 4-5 Bugzilla instances, all of
different outdated versions. For archiving I think I'll import them
into Trac, since it has an import feature, and then go from there into
Redmine.
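For anyone trying the wget approach suggested above, something along
these lines should produce a browsable static copy (a sketch only; the
URL is a placeholder, and the flags are standard GNU wget options --
check your version's man page):

```shell
# Placeholder URL -- substitute the real site being flattened.
URL="http://example.com/"

# --mirror           : recursive retrieval with timestamping
# --convert-links    : rewrite links so the copy works offline
# --page-requisites  : also fetch CSS, images, and scripts each page needs
# --adjust-extension : save HTML pages with a .html suffix so a plain
#                      static web server picks the right content type
CMD="wget --mirror --convert-links --page-requisites --adjust-extension -P archive $URL"

# Shown here rather than run, since mirroring a live site takes a while:
echo "$CMD"
```

Once the mirror is in ./archive, pointing any static web server's
document root at it should be enough to keep the old site up.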

Jamon
--
The Toronto Linux Users Group.      Meetings: http://gtalug.org/
TLUG requests: Linux topics, No HTML, wrap text below 80 columns
How to UNSUBSCRIBE: http://gtalug.org/wiki/Mailing_lists
