Question

We upgraded to the Dakar release a few weeks ago, and ever since, it has been filling /tmp with large twiki-stats.*.* files. These accumulate quickly until the partition is full, at which point all sorts of things start failing.

Are these supposed to be deleted automatically? Generating them also seems to chew up a fair amount of CPU. Have we misconfigured something somewhere?

Environment

TWiki version: TWikiRelease04x00x00
TWiki plugins: DefaultPlugin, EmptyPlugin, InterwikiPlugin, SpreadSheetPlugin, CommentPlugin, EditTablePlugin, PreferencesPlugin, SlideShowPlugin, TablePlugin, SmiliesPlugin, XpTrackerPlugin
Server OS: RedHat Linux, Fedora Core 4
Web server: Apache 2.0.54 (Fedora)
Perl version: 5.8.6
Client OS: N/A
Web Browser: N/A
Categories: Statistics, Performance

-- NicMcPhee - 27 Mar 2006

Answer

twiki-stats files are generated when a statistics job is run. This normally happens either when a user presses a button to refresh the statistics, or when a cron job runs in the background. The stats files are temporary, existing only for the duration of the statistics run, and should be unlinked by TWiki when it finishes. So something is killing either the apache process generating the statistics, or the cron job, depending on how the statistics are run.
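
For reference, if the cron route is used, the job is typically just a nightly run of the statistics script. A sketch of such a crontab entry, with an illustrative installation path, might look like:

     # run the TWiki statistics script nightly at midnight (path is illustrative)
     0 0 * * * (cd /path/to/twiki/bin && ./statistics >/dev/null 2>&1)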

Generating statistics involves processing the twiki access log, so if that file is very large, it could eat CPU.
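
If you suspect the log is the culprit, checking its size is a quick sanity test. Assuming the conventional data/logYYYYMM.txt naming and an illustrative installation path:

     # list the TWiki log files with their sizes
     ls -lh /path/to/twiki/data/log*.txt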

-- CrawfordCurrie - 28 Mar 2006

Could this be the result of a robot hitting the URL that regenerates the statistics? The run takes a while to process, so the robot times out and moves on before the script actually finishes.

-- MichaelAnderson - 28 Mar 2006

That is quite possible. You can exclude robots from running the statistics via the /robots.txt file, and/or by requiring a valid user in bin/.htaccess.
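
For example, a minimal /robots.txt that asks robots to stay out of all the scripts (assuming they are served under the usual /twiki/bin path) would be:

     User-agent: *
     Disallow: /twiki/bin/

Note that robots.txt only stops well-behaved robots; the .htaccess route is the enforceable one.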

-- PeterThoeny - 28 Mar 2006

Robots seem to have been the problem. Mike added

     require valid-user
to the twiki/bin/.htaccess file, and that appears to have fixed it.
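
If requiring a valid user for every script is too restrictive, a narrower sketch using Apache's standard Files directive would limit authentication to the statistics script alone (this assumes the auth directives such as AuthType and AuthUserFile are already set up in that .htaccess):

     <Files "statistics">
         require valid-user
     </Files>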

Thanks to everyone!

-- NicMcPhee - 30 Mar 2006

Apache will keep the Perl process running regardless of the client's actions (such as a robot timing out). We are seeing a similar issue, and it appears that file permissions may influence whether the script finishes running.
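
Whatever the root cause, a stopgap that keeps the partition from filling is to sweep stale stats files out of /tmp periodically. A sketch of an hourly cron entry (the one-hour age threshold is an arbitrary choice, assuming no legitimate statistics run takes that long):

     # delete twiki-stats temp files older than an hour (threshold is illustrative)
     0 * * * * find /tmp -name 'twiki-stats.*' -mmin +60 -exec rm -f {} \;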

-- EricHanson - 27 Sep 2007
