A read-only offline Wiki is for people in the field who need to browse and search content. (Topic started in OfflineWiki)
  • Pros:
    • Only web browser needed for browsing.
  • Cons:
    • Can't edit content while offline.
  • How:
    • Generate static HTML pages and a big index as a search replacement.
    • Copy the pages (only diff?) periodically to offline machines.

Related: OfflineWiki, ReadWriteOfflineWiki, WikiClusters, WebsitePublishing, NonWebAccessToWiki, TWikiWithCVS, TWikiXML, PublishAddOn

-- PeterThoeny - 23 May 2000


Use a cron job to generate a zip file of the entire site. If the site is too large and people update frequently, also generate an archive of only the changed files.

This way the user just needs the archive tool, and overwrites any existing files for incremental mode. Expecting people to have a patch program and know how to use it would be beyond many casual users.
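The cron job described above might look roughly like the following sketch. All directory and archive names are assumptions, and tar/gzip is used here for illustration (zip -@ would take the same file list on stdin); the demo builds a throwaway TWiki-like tree so it runs anywhere.

```shell
#!/bin/sh
# Sketch of a nightly full + incremental archive job.
TWIKI=$(mktemp -d)      # stands in for the TWiki installation
OUT=$(mktemp -d)        # where the archives are published
STAMP="$OUT/.last-archive"

mkdir -p "$TWIKI/data/Main" "$TWIKI/pub/Main"
echo "topic one" > "$TWIKI/data/Main/WebHome.txt"

# Full archive of data/ and pub/ for first-time downloads.
(cd "$TWIKI" && tar czf "$OUT/twiki-full.tar.gz" data pub)
touch "$STAMP"

# Later: a topic changes, so archive only files newer than the stamp.
sleep 1
echo "topic two" > "$TWIKI/data/Main/OfflineWiki.txt"
(cd "$TWIKI" && find data pub -type f -newer "$STAMP" | \
    tar czf "$OUT/twiki-incr.tar.gz" -T -)
touch "$STAMP"

# The incremental archive holds only the changed topic.
tar tzf "$OUT/twiki-incr.tar.gz"
```

Users who already have the full archive then only fetch and unpack the much smaller incremental one.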

-- MathewBoorman - 24 May 2000


An experiment - ReadOnlyOfflineWiki, the Palm Pilot, and AvantGo

Background
AvantGo is a PC client, handheld client (palm pilot for me), and website. The website records "channels" of information you want downloaded through the PC client to the palm pilot, information to help manage the size of the download, and scheduling information. These channels are sets of web pages linked hierarchically from a "home" page you identify. You specify the depth of link traversal. If you specify a "link depth" of 0 to AvantGo, only the primary page is fetched; 1 gets you the primary and all pages it links to; 2 fetches the primary, all pages it links to, and all pages they link to as well; etc.

In TWiki webs, the header and footer of pages generated using standard templates contain links to other webs, to the search script, the index page, and to edit, rdiff, etc. AvantGo wants to follow all of these, causing what appears to be an exponential increase in the number of links traversed (I say "appears" because, when trying to fetch a 200-topic TWiki web, I canceled after AvantGo had checked 2,200 pages).

Perhaps AvantGo assumes a hierarchical structure in which the navigational links generally point "up" and "down", and the everything-links-to-everything nature of Wiki webs may not be handled well. In any case I concluded I needed to limit what links were offered to AvantGo to traverse, while maintaining as much of the interwoven structure of the TWiki web as possible. I also wanted to be able to let AvantGo fetch the same pages I edit and view normally, with as few changes to the pages as possible.

To do that, I modified the standard file set:

  • I made a copy of the view script, calling it viewpalm
    • viewpalm uses the viewpalm template
    • viewpalm uses a modified version of the wiki.pm script, called wikipalm.pm
    • I modified viewpalm so that it does not create links to edit for topics that don't exist. AvantGo would want to follow those links to the edit script.
  • I made a copy of wiki.pm calling it wikipalm.pm to generate view links with viewpalm rather than view .
  • I created viewpalm.tmpl from view.tmpl in the suggested way
    • removed all links to other on-site webs
    • removed graphics and the copyright notice (for size reasons, will have to add proper attribution back).
    • removed the header and footer sections with all the links to edit, rdiff, etc.
  • Finally, I pointed AvantGo at site/viewpalm/web/WebHomePalm and told it to go 2 levels deep.

It was also necessary to modify individual pages, and this is something I would like to avoid. But as a first draft proof-of-concept:

  • I created a clone of the Web Home topic, calling it WebHomePalm.
    • removed everything but the basic "starting points", i.e. local, personal links.
  • I removed signature links from all the pages, since they point to the Main web.
  • I removed references to Web Home, because it has links to other on-site webs. (Notice that in this page I've split the name so a link is not created.)
  • I refactored topics that included pre HTML tags, e.g. the output of cal 6 2000, which I had dumped to a file to be INCLUDEd in other topics. I changed these to be narrow enough not to wrap on the palm pilot.
  • I created an index page that simply lists topic names. The AvantGo search-all-pages is too slow, and not all topics are easily reachable via links. A search mod or option to just list names?
    • technique: ls *.txt > topicindex.txt, then run sed against it to remove the file extensions and add br tags.

Current State: it works very nicely.

To do:

  • Modify viewpalm to mark any link outside of the immediate local web as notexist, but maintain the change that does not create a link to edit for these. This should obviate the need to change topics or have a separate home topic.
  • Think about tables - AvantGo seems to ignore table formatting. Maybe wikipalm could re-format tables into two-level bulleted lists?
  • Think about 404 errors, currently pointed at the registration page.
  • Think about the x technique for mono-spaced fonts, since fonts aren't rendered. Italics also don't come through; they are rendered as bold type.

-- SteveCarol - 26 Jun 2000

See the 15 Jun 2000 DevelopersNews item. This led NicholasLee, PeterThoeny and KevinKinnell to a rapid-fire discussion regarding a possible quick mirroring/backup method.

Summary

Generating a zip of the complete web (the data/ and pub/ directories) isn't economical. Currently this TWiki main site contains about 6.3 MB of files.

Extension of the .changes mechanism.

Just archive all files with a timestamp newer than the last archive (and preserve the directory structure in the archive). Since TWiki is file-based, this is granular enough for incremental backup/mirroring.

On the receiving side I would simply update (i.e. overwrite existing) files, and keep all archive files. In the remote chance that corrupt data gets mirrored, it is then a simple matter of manually restoring files from previous archives.
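The receiving side could be as simple as the following sketch. The directory layout and archive naming are invented for the demo, which fakes the incoming incremental archive so it runs standalone.

```shell
#!/bin/sh
# Receiving-side sketch: unpack an incremental archive over the mirror
# (tar overwrites existing files by default) and keep the archive so a
# bad update can be undone by hand from earlier archives.
MIRROR=$(mktemp -d)     # local read-only copy of the wiki
KEEP=$(mktemp -d)       # retained archives, for manual recovery

# The mirror holds a stale revision of a topic.
mkdir -p "$MIRROR/data/Main"
echo "old revision" > "$MIRROR/data/Main/WebHome.txt"

# Fake an incoming incremental archive carrying a fresher revision.
SRC=$(mktemp -d)
mkdir -p "$SRC/data/Main"
echo "new revision" > "$SRC/data/Main/WebHome.txt"
(cd "$SRC" && tar czf "$KEEP/incr-001.tar.gz" data)

# Apply the update: extract in place, overwriting what is there.
tar xzf "$KEEP/incr-001.tar.gz" -C "$MIRROR"
cat "$MIRROR/data/Main/WebHome.txt"
```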

Distribution via HTTP or SMTP and scripts.

One method would be to create a script cgi-bin/raw that takes a date and a web and publishes all changed topics in raw text form.

If cgi-bin/upload modified the .changes files as well, a separate script could extract these URLs and publish them in a similar manner to the raw script.
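Stripped of the CGI parameter parsing, the core of such a raw script might look like this sketch. The one-line header format and all paths are hypothetical, and a timestamp file stands in for the date parameter that a real query string would supply.

```shell
#!/bin/sh
# Emit every topic in a web changed since a reference time, as raw text
# with a header line per topic. Everything here is a stand-in: a real
# cgi-bin/raw would take the date and web from the query string.
DATA=$(mktemp -d)       # stands in for data/SomeWeb
STAMP=$(mktemp)         # stands in for the "since" date

echo "unchanged topic" > "$DATA/OldTopic.txt"
touch "$STAMP"
sleep 1
echo "changed topic" > "$DATA/NewTopic.txt"

# Publish only topics strictly newer than the reference time.
for f in $(find "$DATA" -name '*.txt' -newer "$STAMP"); do
    echo "== $(basename "$f" .txt) =="
    cat "$f"
done
```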

Some issues

  • Updates will, however, constantly grow, as topics generally get larger.
  • There is also the possible issue of topic and pub/ file removals.

A more complex method would involve keeping change lists and essentially copying a protocol similar to CVS.

-- NicholasLee - 15 Jul 2000


I've made a small script, twiki2html, which creates static HTML pages and a full copy of the pub subdirectories. Of course, searching and editing pages is impossible without the CGI scripts.

The usage is very simple. It creates a similar directory tree under the output directory.

Command line options:

  • --twikidir=dir: the installation directory; the default is /usr/local/httpd/twiki
  • --outdir=dir: the output directory for the static HTML files; the default is temp in the current directory

Perhaps someone will find it useful for offline access.

-- StefanScherer - 27 Dec 2000


I have patched StefanScherer's twiki2html so that it generates files ending in .html (and fixes up the internal links). The patches are fairly nasty but do work; email me for details. People can now easily get a read-only offline copy, generated by a daily cron job.

The reason for this is that the offline browser tools that I tried (NearSite, WebStripper, and WebCopier) all refused to download .../WebHome but worked with, say, .../WebHome.html.

-- RichardDonkin - 15 Feb 2001

Does twiki2html still work for current versions of TWiki? I've been struggling to get it to work: it is calling wiki.pm instead of TWiki.pm. Does anyone have an updated version?

-- MattMillard - 11 Oct 2002

I've created an OfflineBook topic. It looks like the normal WebIndex in book view, but the links are all relative (#.. format). This allows the one (large) HTML file to be saved for offline reading. It required a rewrite of the internalLink function and a few other patches.

It was actually created as part of TocAndReportGeneration to allow creation of a "book" with most of the content within the web.

-- StanleyKnutson - 26 Feb 2001

Whilst this topic is not particularly active, I think it's worth mentioning that with EricScouten's fantastic new PublishAddOn (a leap forward from GenHTMLAddon), ReadOnlyOfflineWiki becomes an almost brain-free process. Eric's add-on creates static HTML pages which are rendered using existing plugins and skins. This means that if you want "lightweight" offline HTML for palmtop devices (i.e. no chunky graphics on each page, etc.) you can publish using ?skin=plain or similar. Set up a cron job to publish your site and then save the static pages locally. Job done.

-- RossC - 13 Feb 2003

I have managed to make offline, HTML-browsable copies of a TWiki web using the great HTTrack program. Here is a quick note in my blog on how to do this, if anyone is interested: http://www-inf.int-evry.fr/~olberger/oldblog/040830.html

-- OlivierBerger - 30 Aug 2004

Another recipe for use with Plucker and a Palm powered device


   plucker-build  \
   --verbosity=1 \
   --doc-name="Fantasio TWiki"  \
   --doc-file=FantasioTWiki \
   --zlib-compression \
   --maxdepth=2 \
   --bpp 4 \
   --staybelow=http://localhost/twiki/bin/view/Projects \
   --home-url=http://localhost/twiki/bin/view/Projects/WebIndex

-- FrankHartmann - 02 Nov 2004
Topic attachments:
  • twiki2html (8.3 K, 2000-12-27, UnknownUser): small converter, based on the view script
Topic revision: r15 - 2006-03-02 - OlivierBerger
 