Publish a set of Wiki pages and attachments, as specified by the QueryPlugin, to a user-defined directory, on some user-defined schedule.  The directory could be made publicly accessible.  In essence, the Wiki could be used to author content that the PublishPlugin then (duh) publishes for read-only public consumption.

NOTE: {{baseURL}} would need to be stripped out or modified in the published HTML so relative links would work as expected.  Interwiki links could be problematic too.  Hmm, if the original Wiki was secured appropriately there wouldn't be any need of this plugin, would there?... Or would there?... maybe?... --JohnV

See [Ideas] for some discussions that give context to this plugin.

You could do this now with [wget].  Wget will walk a site and pull all the text and graphics.  I have used it to mirror a customer site to make sure we have all the HTML and graphics before we make changes.  It will accept a list of pages to look at, so you can pick and choose which pages you extract.  Wget is open source, and precompiled Unix/Linux/Windows versions are available.  -- [FosterSchucker]

Hmm, I use wget on a cron job to "ping" our running wikis.  It never occurred to me to actually use it for this... But... the files you get won't be usable, will they?  I mean, I couldn't point a browser at the downloaded files and read and follow links; all links would route me back to the live site.  I want something more javadoc-like where all links are relative: unzip to a directory and off you go.  Also, wget would pull all the live site's rendered "stuff" like the ~LeftMenu, the ~SearchBox, etc.  I want the page content only: no edit link, no more-history link, etc.  So wget is similar in spirit, but not quite what I'm after.  --JohnV

Wget will make a clone on disk and fix up all the URLs (A,HREF,IMG tags) or it will make a site someplace else that you can host.  It's one of those programs that has about 2.3 million options and they are all good.  If you had enough disk you could get the entire Internet on your laptop :-).  We've also used Wget to preload the cache on both webservers and reverse proxy servers. -- [FosterSchucker] 
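For reference, the clone-and-fix-up behavior described above maps to a handful of wget options.  This is only a sketch; the URL and output directory below are placeholders, not anything from this wiki:

```shell
# Sketch: mirror a wiki into ./published for offline browsing.
# The start URL and directory are illustrative placeholders.
#   --mirror            recursive download with timestamping
#   --convert-links     rewrite links (A HREF, IMG, etc.) to work locally
#   --page-requisites   also fetch images and stylesheets each page needs
#   --no-parent         do not ascend above the start URL
#   -i pages.txt        (optional) read the list of pages to fetch from a file
wget --mirror --convert-links --page-requisites --no-parent \
     --directory-prefix=published \
     http://wiki.example.com/Wiki.jsp
```

With `--convert-links`, the saved copy can be browsed from disk with relative links, which addresses JohnV's javadoc-like concern, though it still captures the rendered page chrome (menus, edit links) rather than bare content.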

On a semi-related note, I pull my site down to my Palm Pilot using AvantGo, so I have a read-only version with me.  It does not pull down the attachments.  -- [FosterSchucker]


This may be a stupid idea, but couldn't you just point two copies of JSPWiki at the same content directory?  Then you would just need to disable the ability to edit pages in one copy.  The editable copy would be internal, and the uneditable copy would be external.  I imagine you could just remove the 'Edit' links from the external version, and then remove whatever processes the changes when you press Save.

''Yes, you could.  Removing Edit.jsp should do the trick.  However, I wouldn't try this with 2.0; 2.2 should survive it fine. --JanneJalkanen''

''Actually, I believe some plugins, QueryPlugin among others, have problems when you update/remove pages: the "read-only" copy won't notice that the change has happened.  So as long as you avoid QueryPlugin and ReferringPagesPlugin, it should work.  I used a "live" Wiki, set up Tomcat security to only allow authenticated users to edit/delete content, and placed a little note on the page saying, "there are edit links, but you can't use them."  -- ErikAlm''
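The Tomcat setup ErikAlm describes can be sketched with a standard servlet {{security-constraint}} in the wiki's {{web.xml}}.  The role name and URL patterns below are illustrative assumptions, not his actual configuration:

```xml
<!-- Sketch: restrict edit/delete pages to authenticated users.
     Role name and URL patterns are assumptions; adjust for your setup. -->
<security-constraint>
  <web-resource-collection>
    <web-resource-name>Wiki editing</web-resource-name>
    <url-pattern>/Edit.jsp</url-pattern>
    <url-pattern>/Delete.jsp</url-pattern>
  </web-resource-collection>
  <auth-constraint>
    <role-name>wikieditor</role-name>
  </auth-constraint>
</security-constraint>
<login-config>
  <auth-method>BASIC</auth-method>
  <realm-name>Wiki Editors</realm-name>
</login-config>
<security-role>
  <role-name>wikieditor</role-name>
</security-role>
```

With this in place, anonymous readers can still see every page (and the edit links, as ErikAlm notes), but following an edit link prompts for credentials.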

!Current Status: ConceptPlugin