Log file size limitations

Notes, tips, and other useful things on how to use LogMX



Log file size limitations

Post by andkli » Tue May 28, 2019 10:51 am


I did search but didn't find an answer, so here we go:

Is there a known upper limit to the log file size that can gracefully be handled by LogMX?
I just started trying out the evaluation version of 7.8.0 using Java 8.0.211 (64 bit) on a MacBook Pro.
The log file I'm interested in analysing is about 350k lines / 57 MB, but it fails to load.
Reducing the scope to the first 60,000 lines works, but slowly; 100k seems quite a challenge.
I did manage to create a working RE parser, but I'd like to know whether I'm doing something
wrong or hitting an inherent tool limitation.
(Yes, that file is large; it's a complex infrastructure setup log in debug mode...)

Example line:
2019-05-23 02:02:09,559 INFO [scheduler-2] CommandLine:365 - Running command: git submodule update --init --force --recursive



Re: Log file size limitations

Post by admin » Tue May 28, 2019 9:40 pm


I would actually consider a 60 MB log file rather small. There is no file size limitation. But there are 2 cases:
  1. You want to load all the log events in memory to see them in the graphical user interface.
  2. You want to monitor only the end of a log file being updated in real-time (and maybe get alerts if some events happen).
For case #1, your limitation will be the memory available on your computer (RAM + swap). Yet, by default, LogMX will not use more than 1 GB of memory, just as a precaution for small configurations. To change that, you can edit the file "<LogMX_install_dir>/startup.conf" to change the value of "MAX_MEMORY" (for example "MAX_MEMORY=8G"). You can also see how much memory is used in LogMX at the top-right corner of the main window.
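For instance, assuming a default install layout, the edited line in "startup.conf" would simply read (8G is just an example value; pick whatever fits your machine's RAM):

```
MAX_MEMORY=8G
```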

For case #2, you don't usually have to worry about memory, unless you want to keep all the log events in memory (i.e. no limit on the AutoRefresh setting "max. number of entries").

But since you mentioned that it works "slowly" with 60,000 lines, I guess memory is not your only concern. And since you also mentioned that you're using an RE Parser (Regular Expression, I guess), I think the Regular Expression you are using is not efficient. I created a 60 MB log file with the log example you gave, and the RE parser I created loaded the file in less than a second (also, it used 280 MB of memory, so you shouldn't have memory issues).

Here is the Regexp I used:

Code:

(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d+?) (.*?) \[(.*?)\] (.*?):(\d+?) (.*)

One of the common mistakes when building a regexp is using the greedy quantifier .* when the reluctant quantifier .*? could/should be used instead. Notice how I used the greedy one only once: at the end, to capture ALL the remaining characters in the "Message" field. All the other quantifiers are "fixed" like \d{4} or "reluctant" like .*? or \d+?.
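To see the pattern in action outside LogMX, here is a minimal standalone Java sketch (the class name LogLineParser is mine, not part of LogMX) that applies the same regexp to the example line from the first post and prints each captured field:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class LogLineParser {
    // Same pattern as above: fixed-width date/time fields, reluctant (.*?)
    // groups for the level, thread, class, and line number, and a single
    // greedy (.*) at the end to capture the whole remaining message.
    static final Pattern LOG_LINE = Pattern.compile(
        "(\\d{4}-\\d{2}-\\d{2} \\d{2}:\\d{2}:\\d{2},\\d+?) (.*?) \\[(.*?)\\] (.*?):(\\d+?) (.*)");

    public static void main(String[] args) {
        String line = "2019-05-23 02:02:09,559 INFO [scheduler-2] CommandLine:365 "
                + "- Running command: git submodule update --init --force --recursive";
        Matcher m = LOG_LINE.matcher(line);
        if (m.matches()) {
            System.out.println("date    = " + m.group(1)); // 2019-05-23 02:02:09,559
            System.out.println("level   = " + m.group(2)); // INFO
            System.out.println("thread  = " + m.group(3)); // scheduler-2
            System.out.println("class   = " + m.group(4)); // CommandLine
            System.out.println("line    = " + m.group(5)); // 365
            System.out.println("message = " + m.group(6)); // - Running command: ...
        }
    }
}
```

Because every variable-width field before the message uses a reluctant quantifier, the engine never has to backtrack across the whole line, which is what keeps parsing fast on large files.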

To learn more about Regexp (and more precisely for LogMX), you can have a look at: https://logmx.com/docs/regex-parsers.html

Let me know if you still have trouble loading your log files (usually you shouldn't have any performance/memory issues with files smaller than 100 MB).

