Exception occurred during DumpLog
Printed From: LogSat Software
Category: Spam Filter ISP
Forum Name: Spam Filter ISP Support
Forum Description: General support for Spam Filter ISP
URL: https://www.logsat.com/spamfilter/forums/forum_posts.asp?TID=5592
Printed Date: 13 March 2025 at 1:21pm
Topic: Exception occurred during DumpLog
Posted By: lyndonje
Subject: Exception occurred during DumpLog
Date Posted: 27 April 2006 at 11:25am
Other than the obvious (file locked), do you know what might actually
have caused these errors to appear in my log? There is no AV installed
on the server, so it can't be AV locking the file for scanning... any
ideas? Currently using v3.0.1.555
04/27/06 16:18:49:203 -- Exception occurred during DumpLog: Cannot open
file "C:\Program Files\SpamFilter\logfiles\20060427.log". The process
cannot access the file because it is being used by another process
Thanks,
Lyndon.
|
Replies:
Posted By: LogSat
Date Posted: 27 April 2006 at 4:10pm
Can you check your SpamFilter.ini file for the option:
RealtimeDiskLogging=0
and ensure it is indeed set to 0? If not, SpamFilter will flush the logfile to disk every time a new line is appended, which can cause a lot of IO traffic and (in theory - it has never happened before...) locking issues. We use a value of 1 only for troubleshooting purposes.
If it was already at 0 (or if the entry does not exist - the default is 0), then locking by another application is honestly the only thing we can think of.
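If you want to confirm what the running copy is using, here is a minimal sketch, assuming SpamFilter.ini uses standard INI syntax and lives in the default install folder implied by the error above (both assumptions - adjust the path and section handling to match your install):

    # Minimal sketch: check the RealtimeDiskLogging value in SpamFilter.ini.
    # Assumptions: standard INI syntax, default install folder; the section
    # name is not known from this thread, so every section is searched.
    import configparser

    INI_PATH = r"C:\Program Files\SpamFilter\SpamFilter.ini"  # assumed location

    config = configparser.ConfigParser()
    config.read(INI_PATH)

    found = False
    for section in config.sections():
        if "RealtimeDiskLogging" in config[section]:
            found = True
            value = config[section]["RealtimeDiskLogging"]
            print(f"[{section}] RealtimeDiskLogging = {value}")
            if value.strip() != "0":
                print("Warning: set this back to 0 for buffered (60-second) logging.")

    if not found:
        print("RealtimeDiskLogging entry not present; the default of 0 applies.")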
------------- Roberto Franceschetti
http://www.logsat.com - LogSat Software
http://www.logsat.com/sfi-spam-filter.asp - Spam Filter ISP
|
Posted By: lyndonje
Date Posted: 27 April 2006 at 5:39pm
It was set to 0; I've changed it to 1. Does that mean the log will be saved every 60 seconds? And is this the recommended value?
Regards,
Lyndon.
|
Posted By: LogSat
Date Posted: 27 April 2006 at 7:00pm
You want it back at 0! If it is set to 1, SpamFilter will save the file every time a single line is added to the log, which in high-traffic scenarios can be dozens if not hundreds of times per second!
------------- Roberto Franceschetti
http://www.logsat.com - LogSat Software
http://www.logsat.com/sfi-spam-filter.asp - Spam Filter ISP
|
Posted By: Desperado
Date Posted: 27 April 2006 at 8:35pm
lyndonje,
If you are using something like SawMill to parse your logs, make sure it is set up to only parse inactive logs. I had the exact same errors when I parsed my active logs during high traffic.
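For what it's worth, here is a rough sketch of that idea in Python, assuming the daily YYYYMMDD.log naming seen in the error message and the default logfiles folder (both assumptions); it simply skips today's (active) log so whatever parses the list never touches the file SpamFilter still has open:

    # Sketch: collect only inactive (closed) SpamFilter logs for parsing.
    # Assumes daily logs named YYYYMMDD.log, as in the error message above,
    # and the logfiles folder from the default install path.
    import datetime
    import glob
    import os

    LOG_DIR = r"C:\Program Files\SpamFilter\logfiles"  # assumed location

    def inactive_logs(log_dir):
        """Return every .log file except the one for today's date."""
        today = datetime.date.today().strftime("%Y%m%d") + ".log"
        return [path for path in sorted(glob.glob(os.path.join(log_dir, "*.log")))
                if os.path.basename(path) != today]

    if __name__ == "__main__":
        for path in inactive_logs(LOG_DIR):
            print("safe to parse:", path)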
------------- The Desperado
Dan Seligmann.
Work: http://www.mags.net
Personal: http://www.desperado.com
|
Posted By: lyndonje
Date Posted: 28 April 2006 at 3:50am
Sorry Roberto, I must have misunderstood your post - it was late when I read it!
It's set back to 0 now.
With regard to SawMill, I've scheduled it to parse the logs at 19:00,
so does it also parse at other times? Also, I can't find a setting to
only parse inactive logs - which version is that feature found in?
|
Posted By: lyndonje
Date Posted: 28 April 2006 at 5:03am
Checking the logs, it looks like these errors have been appearing since before I installed SawMill.
I'm using File Monitor from Sysinternals to capture what processes are
accessing the logs folder. Wonder what I'll find...? Should be
interesting.
Another question for LogSat though: as there are multiple entries in my
logs with this error, if SF is unable to open and write to the log
file, how is it able to write the multiple entries saying the dump
failed? And ultimately, what happens to the data it is trying to dump?
Does it get buffered and retried later, or is the data lost?
Thanks,
Lyndon.
|
Posted By: LogSat
Date Posted: 28 April 2006 at 7:04am
lyndonje,
When you mentioned "Other than the obvious (file locked)" in the original post, I had assumed you were certain no other applications were trying to access the logs. Desperado is absolutely correct: if you have SawMill, or any other application, reading the logfiles, it will place a lock on them while they are open, and that will cause the error in the logs.
Please note that there is an option in the SpamFilter.ini file to rotate the logs every "n" hours, so you may want to use that to split the logs hourly for more up-to-date imports.
------------- Roberto Franceschetti
http://www.logsat.com - LogSat Software
http://www.logsat.com/sfi-spam-filter.asp - Spam Filter ISP
|
Posted By: lyndonje
Date Posted: 28 April 2006 at 7:43am
At present I don't know what could be opening the files. As I said,
SawMill is scheduled to parse the logs at 19:00, and that is not the
time when I'm seeing the error. Also, if I look at previous logs, I see
the error before I had even installed SawMill.
As there are multiple entries in my logs with this error, if SF is
unable to open and write to the log file, how is it able to write the
multiple entries saying the dump failed? And ultimately, what happens
to the data it is trying to dump? Does it get buffered and retried
later, or is the data lost?
I don't think I need to rotate my logs more often; at present they rotate on a daily basis.
|
Posted By: Desperado
Date Posted: 28 April 2006 at 7:55am
lyndonje,
Here's the deal... I am still on my first POT of coffee, so I am not quite with it yet, but what I do is split the logs as Roberto mentioned and use a Perl script to move the logs off that server based on date/time, then parse the moved logs. I sometimes do a manual parse of the "live" logs if I want immediate, up-to-date info while trying to find some new trend or other. Also, and this may not apply, I run Diskeeper to defrag my drives automatically, and I make sure my master log folder is excluded from that process. I log everything on my servers in a "master" log folder to make this, and my scripting, easier to keep track of.
Your idea of tracking what is accessing the files is a good one and I will be interested in any results you get.
A more severe issue I ran into was with my Web GUI. I write values to the spamfilter.ini file from the GUI, and at one point I got lazy in my coding and was not checking to see if the ini file was open. As luck would have it, I did a write at the exact same time SpamFilter was updating the stats in the file, and I wiped out the file. This had a very unfortunate effect, as you may guess. So... working with very active files can be tricky in unpredictable ways, and I am amazed that high-traffic servers work at all!
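Translated into Python (the actual script mentioned above is Perl and isn't quoted in the thread), the move-then-parse idea looks roughly like this; the folder names are placeholders, and only logs from earlier dates are moved, so the active file is never touched:

    # Rough sketch of the "move the closed logs, then parse the copies" approach.
    # Folder names are placeholders; adjust to your own layout.
    import datetime
    import glob
    import os
    import shutil

    SOURCE_DIR = r"C:\Program Files\SpamFilter\logfiles"  # assumed live log folder
    ARCHIVE_DIR = r"D:\logarchive\spamfilter"             # placeholder parse folder

    def move_closed_logs(source, destination):
        """Move every daily log except today's into the folder the parser reads."""
        os.makedirs(destination, exist_ok=True)
        today = datetime.date.today().strftime("%Y%m%d") + ".log"
        for path in glob.glob(os.path.join(source, "*.log")):
            if os.path.basename(path) != today:
                shutil.move(path, os.path.join(destination, os.path.basename(path)))

    if __name__ == "__main__":
        move_closed_logs(SOURCE_DIR, ARCHIVE_DIR)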
------------- The Desperado
Dan Seligmann.
Work: http://www.mags.net
Personal: http://www.desperado.com
|
Posted By: lyndonje
Date Posted: 28 April 2006 at 8:10am
As an idea, how many messages does your SF server process? From 23rd Apr - 27th Apr (Sun - Thu), mine totalled 135,914:
  #  Action               Messages      %       Bytes
  1  rejected                62,063   45.7 %   23.81 M
  2  quarantined             32,969   24.3 %  383.71 M
  3  accepted                22,891   16.8 %    2.52 G
  4  Dropped Connection      16,249   12.0 %       0 b
  5  Bypassed all rules       1,742    1.3 %  232.78 M
     Total                  135,914    100 %    3.15 G
So far, since running the file monitor, the DumpLog error has not yet occurred...
|
Posted By: Desperado
Date Posted: 28 April 2006 at 8:19am
This is noon to midnight yesterday (12 hours):

  #  Action               Messages      %       Bytes
  1  quarantined             68,502   52.2 %  301.28 M
  2  accepted                22,796   17.4 %  674.63 M
  3  Dropped Connection      17,437   13.3 %    3.42 M
  4  Bypassed all rules      13,355   10.2 %  390.76 M
  5  rejected                 4,308    3.3 %   10.93 M
  6  sent to NULL             4,258    3.2 %  243.00 k
  7  spam-tagged                658    0.5 %    5.33 M
     Total                  131,314    100 %    1.35 G
------------- The Desperado
Dan Seligmann.
Work: http://www.mags.net
Personal: http://www.desperado.com
|
Posted By: lyndonje
Date Posted: 28 April 2006 at 8:25am
I shouldn't need to worry about mine being a high-traffic server then!
I can see why you might want to rotate the logs more often now that you've explained it, but at the moment I don't mind if my stats are a day behind.
|
Posted By: lyndonje
Date Posted: 03 May 2006 at 4:44am
How strange. Ever since I've been running File Monitor, I've only seen
the DumpLog error once on the 28th Apr and once on the 2nd May, both
at 19:00, which is when SawMill runs, so that I would expect. Before
running File Monitor, and even before installing SawMill, there were
multiple DumpLog errors in the logs. I'll keep my eye on it...
|
Posted By: lyndonje
Date Posted: 03 May 2006 at 5:30am
I still have a question for LogSat. When a DumpLog error occurs, what
actually happens to the data that SF attempted to save to the log file?
Is the data buffered and retried, or is it lost? If it is lost, I have
another question... if SF was unable to write to the log file, thereby
causing the DumpLog error, how does it actually write the DumpLog error
to the file?
|
Posted By: LogSat
Date Posted: 03 May 2006 at 3:52pm
The memory buffer holding the log data to be saved is not cleared until *after* a successful save of the file. So if there is a failure, the same old buffer, plus the new entries in the log, will be written the next time the log is saved (by default every 60 seconds). When there are exceptions to be logged, we spawn a separate thread dedicated to saving the buffer, which increases the chances that the error is correctly logged even if there are problems. It helps a lot with troubleshooting...
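For readers wondering what that looks like in practice, here is an illustrative sketch of the same pattern (this is not SpamFilter's actual code, just the behaviour described above): the buffer is only cleared after a successful write, so a locked file merely delays the flush, and exceptions get their own flush attempt on a dedicated thread:

    # Illustrative sketch of the buffering behaviour described above; not
    # SpamFilter's code. The buffer is cleared only after a successful write,
    # so lines blocked by a file lock go out on the next flush instead.
    import threading

    class BufferedLog:
        def __init__(self, path):
            self.path = path
            self.buffer = []
            self.buffer_lock = threading.Lock()
            self.flush_lock = threading.Lock()

        def append(self, line):
            with self.buffer_lock:
                self.buffer.append(line)

        def flush(self):
            """Called on a timer (every 60 seconds by default, per the reply above)."""
            with self.flush_lock:              # one flush at a time
                with self.buffer_lock:
                    pending = list(self.buffer)
                if not pending:
                    return
                try:
                    with open(self.path, "a", encoding="utf-8") as handle:
                        handle.write("\n".join(pending) + "\n")
                except OSError:
                    return                     # file locked: keep the buffer, retry later
                with self.buffer_lock:
                    del self.buffer[:len(pending)]

        def log_exception(self, message):
            # Exceptions trigger an immediate flush on a separate thread, so the
            # error has a better chance of reaching disk even during problems.
            self.append(message)
            threading.Thread(target=self.flush, daemon=True).start()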
------------- Roberto Franceschetti
http://www.logsat.com - LogSat Software
http://www.logsat.com/sfi-spam-filter.asp - Spam Filter ISP
|
Posted By: lyndonje
Date Posted: 04 May 2006 at 3:31am
OK, that's brilliant! Thanks.
|