
Here's an idea: Let's define an XML-RPC or SOAP interface to Wiki. I don't exactly know what we could do with it, but at least we could do things like:

  • Automatic notification of page changes (someone would need to write a script that checks RecentChanges, then emails anyone interested).
  • Combining Wikis in a manner more efficient than InterWiki.

This would save us from having to implement all sorts of features in JSPWiki itself, and would allow people to make their own modular thingies.

Interface

I'm thinking of the following interfaces:

  • getRecentChanges( long timestamp ): Get list of changed pages since timestamp.
  • getPage( String pagename, int version ): Get the raw Wiki text of page.
  • getPageText( String pagename, int version ): Get page data rendered as plain text, with most of the formatting removed (this should be really good for when you actually send wiki pages via email or something).
  • getPageHTML( String pagename, int version ): Return page in rendered HTML. This is of course required because you can never know how the WikiText should be rendered, since it varies from Wiki to Wiki.

I don't know whether allowing writes would be such a good idea. But with these you could get somewhere anyway.

--JanneJalkanen


That's a very interesting thought. It allows any outside process to monitor and perform actions based on the current state of the Wiki, but without actually affecting the Wiki.

As specified above, the only way to get a list of pages is getRecentChanges(). It seems a bit limiting to think that all another process would care about is recently changed pages. What about:

  • getAllPages(): Get list of all Wiki pages.
  • getMatchingPages(String query): Get list of all Wiki pages that match the query. Of course, this then raises the question of what matching means. That's kind of annoying.

getMatchingPages() may be overkill. Perhaps these queries would be better implemented with a SOAPProvider, a WikiPageProvider based on SOAP. Then the matching SOAP service can store the pages any way it likes, be notified the instant that a page is stored, and query the page text any silly way it feels like. A SOAPProvider would also get around the limitation that the API specified above requires the interested party to "poll" the WikiEngine periodically. With a SOAPProvider, if you care about every save or read, you can track them yourself.

--MahlenMorris


This was almost too easy. JSPWiki 1.6.6-cvs now supports an XML-RPC interface at URL http://www.ecyrd.com/JSPWiki/RPC2/ (note the trailing slash).

To try this one out, go download the Apache XML-RPC package from http://xml.apache.org/dist/xmlrpc/, then type in the following command:

java -cp xmlrpc.jar org.apache.xmlrpc.XmlRpcClient http://www.ecyrd.com/JSPWiki/RPC2/ jspwiki.getPage Main

See below for the available methods.

(NB: You can't use the command line to get RecentChanges; you'll have to write a script of your own.)
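The same call can be made from a script. Here's a minimal sketch using Python's standard-library XML-RPC client instead of the Java command line (the endpoint and the jspwiki.getPage method name are taken from the text above; the actual fetch is left commented out since it needs network access to the server):

```python
import xmlrpc.client

# Endpoint from the announcement above; note the trailing slash.
server = xmlrpc.client.ServerProxy("http://www.ecyrd.com/JSPWiki/RPC2/")

# Method names carry the "jspwiki." prefix. The proxy resolves the
# dotted name lazily, so no network traffic happens until the call.
get_page = server.jspwiki.getPage

# Uncomment to actually fetch a page over the network:
# print(get_page("Main"))
```

This also sidesteps the command-line limitation above, since a script can pass a Date argument to getRecentChanges directly.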

Please test this and see how it works out.

--JanneJalkanen


Nifty! Works really well.

My only quibble is that you allow me to ask for a specific version, but I have no idea what versions are actually available. Its current behavior is nice in that if I use a non-existent version number it just uses the most recent, but wouldn't a call like:

  • int getLatestVersion(String pagename)
help? Just so my program doesn't have to guess at version numbers.

In fact, could you add a version number to the struct that getRecentChanges() returns? Then an email alerting program could diff between the oldest and the current within the time period, since a page may have changed multiple times in the time period requested (I know I have a tendency to save a page, and then immediately have another thought).
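The alerting idea above is easy to sketch client-side: once two versions of a page have been fetched (via getPage with different version numbers), a standard diff is all the email needs to contain. A minimal Python illustration, with the two page texts as hypothetical stand-ins for fetched versions:

```python
import difflib

# Hypothetical stand-ins for two versions fetched via getPage(name, version).
old_text = "JSPWiki rocks.\nTODO: fix dates.\n"
new_text = "JSPWiki rocks.\nTODO: fix dates.\nTODO: fix UTF-8.\n"

# Produce a unified diff between the oldest and the newest version;
# this is the body an alerting script would mail out.
diff = "\n".join(difflib.unified_diff(
    old_text.splitlines(), new_text.splitlines(),
    fromfile="TODOList v1", tofile="TODOList v2", lineterm=""))

print(diff)
```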

I agree on using XML-RPC. Much nicer. -- MahlenMorris


You can use just plain getPage( string ) to get the latest version.

--JanneJalkanen


The getPageInfo() method has been implemented now. The current API thus looks like this:

  • getRecentChanges( Date timestamp ): Get list of changed pages since timestamp. The result is an array, where each element is a struct:
    • name (string) : Name of the page.
    • lastModified (date) : Date of last modification.
    • author (string) : Name of the author (if available).
    • version (int) : Current version.
  • getPage( String pagename ): Get the raw Wiki text of page, latest version.
  • getPage( String pagename, int version ): Get the raw Wiki text of page.
  • getPageHTML( String pagename ): Return page in rendered HTML.
  • getPageHTML( String pagename, int version ): Return page in rendered HTML.
  • getAllPages(): Returns a list of all pages. The result is an array of strings.
  • struct getPageInfo( string pagename ) : returns a struct with elements
    • name (string): the canonical page name
    • lastModified (date): Last modification date
    • version (int): current version
    • author (string): author name
  • struct getPageInfo( string pagename, int version ) : returns a struct just like plain getPageInfo(), but this time for a specific version.

There is still a known problem with Date handling... It should probably be UTC, but at the moment all times seem to be local time.

--JanneJalkanen.


Two problems:

  1. Date handling.
  2. UTF-8.

First, dates: The XML-RPC spec has no way to specify the timezone. Which means that you actually would have to know the timezone in which the server resides... And which also means that the XML-RPC client can use whatever timezone it likes to interpret the incoming message. And that ain't fun. We could of course

  • a) Just send an int meaning seconds of UTC since EPOCH,
  • b) Send a long disguised as a String meaning milliseconds of UTC since EPOCH,
  • c) Send an ISO-format String as the date, or
  • d) Try to figure out a way to talk UTC under XML-RPC.
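Options (a) and (b) both amount to shipping an epoch offset computed in UTC, so neither side's local timezone matters. A sketch of what that computation looks like (Python is used here purely for illustration; the key point is using a UTC-aware conversion, the equivalent of avoiding mktime's local-time interpretation):

```python
import calendar
import datetime

# A timestamp both ends agree is UTC: 11-Feb-2002 18:45:00 UTC.
utc = datetime.datetime(2002, 2, 11, 18, 45, 0)

# Option (a): seconds since the epoch. calendar.timegm interprets the
# tuple as UTC regardless of the local timezone (unlike time.mktime).
seconds = calendar.timegm(utc.timetuple())

# Option (b): milliseconds since the epoch, disguised as a String so it
# survives XML-RPC's 32-bit int limit.
millis_str = str(seconds * 1000)

print(seconds, millis_str)
```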

Second, sending UTF-8. XML-RPC spec says "ASCII". Which sucks, again. Sending ISO-8859-1 seems to work, but is there any guarantee that the recipient can talk UTF-8? And how would he know the incoming character set? Or would that be the responsibility of the XML parser?

At the moment the Apache XML-RPC library seems to use Latin1 ONLY. The euro sign (€) gets killed on the way, as do all other multi-byte UTF-8 characters.

--JanneJalkanen


More updates. The UTF-8 issue seems to have been talked to death on the XML-RPC mailing list. The summary seems to be: "While many toolkits might support something else than ASCII in string values, the XML-RPC spec is frozen, and will never change. If you transport something else than ASCII, you're in violation of the spec. Use base64."

Using base64 would mean that all methods that now use strings should use base64 (because JSPWiki supports UTF-8 all across the board - in fact, even ISO-Latin1 is not supposed to go through XML-RPC strings). Which means more work for the application writer, since he has to encode/decode all stuff going back and forth. Gng. XML-RPC is not person-to-person interoperable - many people are unable to write their own names as strings.
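The base64 workaround is mechanical on both ends: the sender encodes the UTF-8 bytes of the text as base64 (which is pure ASCII on the wire), and the receiver reverses the steps. A minimal round-trip sketch, reusing the euro sign from the problem report above:

```python
import base64

# A page body containing the euro sign, which plain XML-RPC strings mangle.
page_text = "Price: 10 €"

# Sender: UTF-8 bytes wrapped in base64, which is pure ASCII on the wire.
wire = base64.b64encode(page_text.encode("utf-8")).decode("ascii")

# Receiver: reverse both steps to recover the original text intact.
decoded = base64.b64decode(wire).decode("utf-8")
print(decoded)
```

The cost is exactly the one lamented above: every application author has to remember to apply both layers, in the right order, on every string.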

I'm seriously considering SOAP at this point. Or breaking the XML-RPC spec knowingly and willingly; call it WikiRPC or something =). (XML-RPC is a registered trademark of Userland Software).

The date issue is easier, but java.util.Calendar and the XML-RPC library do not make it exactly simple to assume UTC. Need some shifting back and forth.

--JanneJalkanen


Be prepared: The API will change in the next release! Return values will be UTF-8 strings wrapped in base64, and input will likely be URL-encoded UTF-8 strings (to be compatible with the JSPWiki URL scheme). --JanneJalkanen


Here is now the new API as of v1.6.9 (the command prefix being jspwiki.):

  • getRecentChanges( Date timestamp ): Get list of changed pages since timestamp, which should be in UTC. The result is an array, where each element is a struct:
    • name (string) : Name of the page. The name is UTF-8 with URL encoding to make it ASCII.
    • lastModified (date) : Date of last modification, in UTC.
    • author (string) : Name of the author (if available). Again, name is UTF-8 with URL encoding.
    • version (int) : Current version.
  • base64 getPage( String pagename ): Get the raw Wiki text of page, latest version. Page name must be UTF-8, with URL encoding. Returned value is a binary object, with UTF-8 encoded page data.
  • base64 getPage( String pagename, int version ): Get the raw Wiki text of page. Returns UTF-8, expects UTF-8 with URL encoding.
  • base64 getPageHTML( String pagename ): Return page in rendered HTML. Returns UTF-8, expects UTF-8 with URL encoding.
  • base64 getPageHTML( String pagename, int version ): Return page in rendered HTML, UTF-8.

  • array getAllPages(): Returns a list of all pages. The result is an array of strings, again UTF-8 in URL encoding.
  • struct getPageInfo( string pagename ) : returns a struct with elements
    • name (string): the canonical page name, URL-encoded UTF-8.
    • lastModified (date): Last modification date, UTC.
    • version (int): current version
    • author (string): author name, URL-encoded UTF-8.
  • struct getPageInfo( string pagename, int version ) : returns a struct just like plain getPageInfo(), but this time for a specific version.

As you can see, all data is returned as a base64 type in UTF-8 encoding, regardless of what the JSPWiki encoding preference actually is. Also, all incoming and outgoing strings are really UTF-8, but they have been URL-encoded so that the XML-RPC requirement of ASCII is fulfilled.
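So a client has two conversions to get right: outgoing page names are URL-encoded UTF-8, and incoming page data is base64-wrapped UTF-8. A sketch of both directions (the page name and page text here are hypothetical examples, not real pages on this wiki):

```python
import base64
import urllib.parse

# Outgoing: a page name with a non-ASCII character, URL-encoded as UTF-8
# so it satisfies XML-RPC's ASCII-only string rule. (Hypothetical name.)
page_name = "Äiti"
encoded_name = urllib.parse.quote(page_name)

# Incoming: the server returns page data as base64-wrapped UTF-8 bytes.
# Simulate such a payload and decode it the way a client would.
payload = base64.b64encode("Tämä on sivu.".encode("utf-8"))
page_text = base64.b64decode(payload).decode("utf-8")

print(encoded_name, page_text)
```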

Let me know if this works...

--JanneJalkanen

MahlenMorris: It seems to be working fine. I'm turning the base64 back to a String by calling new String((byte[]) server.execute(GETPAGE, args), "UTF-8"); does that seem right? As for most ignorant Americans, I18N and character encodings are very mysterious to me :)

I'm still having trouble getting the time zone right, though. I've been looking at what you did in your code, but no matter what I do I get times that think they are PST but are in fact EET. For example, as I write this it thinks that the TODOList was last changed at 00:01 PST, when it was really 00:01 EET. If you were really sending UTC, I don't think I'd be getting that. Should that be working yet?

JanneJalkanen: Yeah, that's the correct way to get UTF-8. It's entirely possible that I screwed up something in the TimeZone thing... I didn't really test it properly. BTW, note that XML-RPC does not transport TimeZone information at all, and the Apache XML-RPC library always assumes your default TimeZone when it's reading the timestamp. You'll have to manipulate the result with the Calendar class to make sure it's UTC.

10-Feb-02: I think I fixed the date problem in 1.6.10. Please test it, o ye who live on other timezones. :-)


ErnoOxman: It would be nice to have something like setPage( String pagename, base64 text ) in the next version of the API.

Expect to see a J2ME client soon...

JanneJalkanen: Yes, it's probably a good idea to have one as well. I wanted to make sure the getting of pages work before allowing anyone to write a four-line script effectively deleting all pages :-).

A secondary note: Should the putPage API also include a username-password combination?

ErnoOxman: I guess authentication could be done with HTTP Basic Authentication header, which is supported at least in Apache XML-RPC package. If it doesn't feel suitable for some reason, username-password combination would be ok, too.
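HTTP Basic Authentication is indeed lightweight: it is just one request header carrying base64 of "username:password". A sketch of constructing that header (the credentials are hypothetical; note also that base64 here is for framing only and provides no secrecy):

```python
import base64

# Hypothetical credentials. HTTP Basic Auth is just this one header, so
# any HTTP-based XML-RPC transport can carry it on every request.
username, password = "erno", "secret"

token = base64.b64encode(
    f"{username}:{password}".encode("ascii")).decode("ascii")
auth_header = "Authorization: Basic " + token

print(auth_header)
```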


Funny, I was just thinking of implementing something like this for UseMod and / or TWiki. Seems like it wouldn't take too much to do, and it'd be nice to jump on a common interface bandwagon. Let's see... how quick can I throw it together...

(Oh yeah, and I wandered in off the street from http://www.scripting.com/)

Something else I was thinking of for my own Wiki RPC API, which might sound strange, was a wiki filter method. That is, accept text, process text for formatting and WikiWords, return content with links and formatting applied (sans wiki header/footer).

My first purpose in mind was to use this as a way to join a weblog and a wiki, where the weblog entries link to the wiki and can be written in the wiki's style.

-- LesOrchard

I'm scratching my head a bit on that API extension. I too am growing very fond of the idea of using Wiki TextFormattingRules to edit my web pages (I've even noticed myself accidentally using Wiki style in Word documents). But if you want to look at Wiki pages without the header/footer/left menu, why not just edit the .JSP page so it doesn't include those items? Aren't those other elements part of the page design? I'm not sure how TWiki is designed, so admittedly this comment may not be all that germane. Am I misunderstanding your desired effect? Using Wiki to create pages that are not editable by the masses?

This problem does point to one limitation of the current structure of the code. TranslatorReader thinks that the only .JSP that you'd want to view pages with is "wiki.jsp". But if I wanted to have two different views of the data, one that I use to view and potentially edit pages, and another that just shows the pages without suggesting the ability to edit them, there's no good method for that. I think this implies that there's too close a tie between the Model and the View. Maybe if TranslatorReader could take arguments saying which .JSP page should be used for viewing pages.

By the way, I know that the above situation could be partially implemented by using permissions (give me read/write permissions, but no one else). But the pages would still have the "Edit this page" links on them; they just wouldn't work for anyone else. A cleaner way to do that would be interesting.

But then, this latter seems counter to the Wiki Way. Maybe the real solution is to make a Weblog that had wiki-style editing!

(The above is Mahlen rambling on a topic; the above paragraphs are not believed to form a coherent idea) -- MahlenMorris


I agree that the current TranslatorReader is not as independent as it should be. Partly this is because I wanted to avoid the complications of making a completely generic Wiki translator - I figured nobody else would be interested in using the same kind of translator =).

You could do two things, though:

As for a Wiki&Weblog synthesis, PikiePikie is a sort of combination, I believe. I just couldn't really make heads or tails with it =).

Also, you can get the Wiki HTML by using the getPageHTML() methods of the XML-RPC API. That way you get it without the headers/footers.

--JanneJalkanen


Oh yeah - and making a common Wiki interface is a cool thing, I agree. Perhaps we should define a standard "wiki." -prefix for all commands, so that you could use "twiki." or "jspwiki." or whatever for app-specific thingies? :-) --JanneJalkanen


Yup, it looks like PikiePikie has something quite similar to what I'm thinking of: A weblog whose entries lead into the wiki itself. In the case of PikiePikie, the weblog is a trick of the wiki itself. What I'm thinking of is where something like BlogApp is used to post a weblog entry to a MovableType weblog, and via some filter (say, a BloggerAPIProxy) which calls on the WikiRPCInterface, that weblog entry is imbued with links to the wiki on the site before it reaches MovableType. So, a site would have a weblog for timely news and updates and a wiki for more long-term idea development. Sure, the wiki's RecentChanges could serve as a source of news and updates, but a weblog is a more explicit tool for that.

--LesOrchard

This particular version was published on 11-Feb-2002 18:45 by 65.42.33.202.