Refactoring Proposal: Centralise Topic Processing Heuristic
There are several places in the core code where topic text is processed by local heuristics to extract TML features: for example, the detection and extraction of verbatim blocks, the detection of pre blocks, the processing of WikiWords, and the detection and processing of headers.
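For illustration, here is a fragment of topic text exercising these features (standard TML markup):

```
---++ A Level-2 Heading
This WikiWord becomes a live link, but verbatim text does not:
<verbatim>
Inside here, WikiWords and ---++ headers are left alone.
</verbatim>
<pre>
Preformatted text is detected too.
</pre>
```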
Places where this is done include:
- Updating referring pages during a rename, where the code needs to detect "live" links to the topic being renamed
- Composition of topic summaries and change summaries
- TOC handling
- The main rendering loop
All of these could (and should) be handled by a single parser/processor that calls methods on a passed-in handler object. This would reduce code volume, and at the same time ensure the parse is correct and consistent in all places.
package RenderingCallback;

sub activeWikiLink { my ( $this, %args ) = @_; ... }   # url => "http://...", text => "LinkMe"
sub heading        { my ( $this, %args ) = @_; ... }   # level => 4, ...
sub plainText      { my ( $this, %args ) = @_; ... }   # insidePre => 1, noAutoLink => 1, ...
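To make the shape of the proposal concrete, here is a minimal sketch in Python. The method names follow the sketch above; the regexes and dispatch logic are illustrative assumptions, not the actual TWiki heuristics:

```python
import re

# Hypothetical sketch of the proposed single parser: the parser owns the TML
# heuristics, and a handler object receives a callback for each feature found.
WIKIWORD = re.compile(r'\b([A-Z][a-z]+(?:[A-Z][a-z]+)+)\b')  # simplified WikiWord rule
HEADING = re.compile(r'^---(\++)\s*(.*)$')                   # simplified TML heading rule

class RenderingCallback:
    """Base handler; renderers, link fixers etc. override the hooks they need."""
    def heading(self, level, text): pass
    def activeWikiLink(self, text): pass
    def plainText(self, text, insidePre=False): pass

def parse(tml, handler):
    inside_pre = False
    for line in tml.splitlines():
        stripped = line.strip()
        if stripped == '<verbatim>':
            inside_pre = True
        elif stripped == '</verbatim>':
            inside_pre = False
        elif inside_pre:
            # No WikiWord or heading processing inside verbatim blocks.
            handler.plainText(line, insidePre=True)
        else:
            m = HEADING.match(line)
            if m:
                handler.heading(len(m.group(1)), m.group(2))
                continue
            # Split the line into plain text and live WikiWord links.
            pos = 0
            for w in WIKIWORD.finditer(line):
                if w.start() > pos:
                    handler.plainText(line[pos:w.start()])
                handler.activeWikiLink(w.group(1))
                pos = w.end()
            if pos < len(line):
                handler.plainText(line[pos:])

# Example handler that simply records the callbacks it receives.
class Collector(RenderingCallback):
    def __init__(self):
        self.events = []
    def heading(self, level, text):
        self.events.append(('heading', level, text))
    def activeWikiLink(self, text):
        self.events.append(('link', text))
    def plainText(self, text, insidePre=False):
        self.events.append(('text', text, insidePre))

c = Collector()
parse("---++ Intro\nSee WebHome here\n<verbatim>\nraw WikiWord text\n</verbatim>", c)
```

The same `parse` routine would then serve rendering, summaries, TOC extraction and rename link-fixing, each with a different handler.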
Note how the current plugins architecture can be expressed in terms of this approach. However, future plugins could leverage it to become a lot more powerful, as a plugin author would have the option to override many more methods.
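As a sketch of that point (class and method names are hypothetical), a plugin becomes a callback subclass that overrides only the hooks it cares about, e.g. rewriting live links during a topic rename:

```python
# Hypothetical sketch: a "plugin" is just a callback subclass overriding the
# hooks it cares about; here, rewriting live links during a topic rename.

class RenderingCallback:
    # Default behaviour: return everything unchanged.
    def activeWikiLink(self, text):
        return text
    def plainText(self, text):
        return text

class RenameLinksPlugin(RenderingCallback):
    def __init__(self, old, new):
        self.old, self.new = old, new
    def activeWikiLink(self, text):
        # Override only the live-link hook; plainText is inherited untouched.
        return self.new if text == self.old else text

handler = RenameLinksPlugin('OldTopic', 'NewTopic')
# A central parser would invoke these callbacks as it walks the topic text:
parts = [handler.plainText('See '), handler.activeWikiLink('OldTopic'),
         handler.plainText(' and '), handler.activeWikiLink('OtherTopic')]
result = ''.join(parts)
```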
--
CrawfordCurrie - 29 Jan 2005