Question
We upgraded to the Dakar release a few weeks ago, and ever since it has been filling /tmp with large twiki-stats.*.* files. These quickly grow in number until they fill the partition, which causes all sorts of problems.
Are these supposed to be deleted automatically? Generating them also seems to chew up a fair amount of CPU. Have we misconfigured something somewhere?
--
NicMcPhee - 27 Mar 2006
Answer
twiki-stats files are generated when a statistics job runs. That normally happens in response to a user pressing a button to refresh the statistics, or to a cron job running in the background. The stats files are temporary, existing only for the duration of the statistics run, and should be unlinked by TWiki when it finishes. So something is killing either the Apache process generating the statistics, or the cron job, if that is how you run it.
Generating statistics involves processing the TWiki access log, so if that file is very large, the run can eat a lot of CPU.
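For reference, a scheduled statistics run is usually set up as a crontab entry along these lines (the installation path here is illustrative; adjust it to your setup):

    0 0 * * * (cd /path/to/twiki/bin && ./statistics >/dev/null 2>&1)

This runs the statistics script once a day at midnight and discards its output.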
--
CrawfordCurrie - 28 Mar 2006
Could this be the result of a robot hitting the page that triggers the statistics regeneration? The run takes a while to process, so the robot times out and moves on before the script actually finishes.
--
MichaelAnderson - 28 Mar 2006
That is quite possible. You can exclude robots from running the statistics in the /robots.txt file, and/or by requiring a valid user in bin/.htaccess.
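For example, a minimal robots.txt that keeps well-behaved robots out of all the TWiki scripts, including statistics, might look like this (assuming the scripts are served under /twiki/bin; adjust to your URL layout):

    User-agent: *
    Disallow: /twiki/bin/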
--
PeterThoeny - 28 Mar 2006
Robots seem to have been the problem.
Mike added require valid-user to the twiki/bin/.htaccess file, and that appears to have solved the problem.
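For anyone hitting the same problem, the relevant stanza in bin/.htaccess looks roughly like this; the AuthUserFile path and realm name below are examples and depend on your installation:

    AuthType Basic
    AuthName "TWiki"
    AuthUserFile /path/to/twiki/data/.htpasswd
    Require valid-user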
Thanks to everyone!
--
NicMcPhee - 30 Mar 2006
Apache will keep the Perl process running regardless of the client's actions (such as a robot timing out). We have a similar issue, and it appears that file permissions may influence whether the script finishes running.
--
EricHanson - 27 Sep 2007