Jörn Huxhorn | 21 Apr 19:45 2014

Lilith 0.9.44 has been released!

This release brings huge improvements to the usage of conditions. Take a look
at "Focus" and "Exclude" in the popup and "Search" menu.

Remember that you can save conditions using Cmd/Ctrl-I. Those will show up in
the "Saved conditions" section of "Focus" and "Exclude".

It will also most likely be the last Java 5 compatible version.
Development will continue on Java 8.

- The table of the view will now always receive the focus when the selected
  view changes.
- Added alternative behavior for Focus/Exclude actions.
  By default, those actions replace the current view's filter,
  if present, with the new combined filter.
  Hold Shift to create a new view instead.
- Status text is now properly updated in case of a replaced filter.
- Renamed "Named" in the find combo to "Saved".
- Renamed "Add condition..." to "Save condition..." and moved it from
  the "View" to the "Search" menu. Also added it to the popup menu.
- Significantly enhanced tooltips of various condition-related components.
  They now show a pretty-printed string representation of the condition.
- Enhanced "Focus" and "Exclude" popup menus.
- Added corresponding "Focus" and "Exclude" menus to the "Search" menu.
- Added two Substance look&feels to the mix.
- Status text of the main window is now properly updated when the
  whitelist/blacklist name changes.
- Enhanced profiling output of TracingAspect.
- Fixed off-by-one error in message renderer. Again.
- servlet-api dependency of de.huxhorn.lilith.logback.servlet is now
(Continue reading)

James Hutton | 21 Apr 14:21 2014

One appender, multiple layouts

Still trying to wrap my head around markers a bit, but I was wondering if
it is possible to use a marker to determine which layout an event is
formatted with while still sending it to the same appender. Any
documentation or examples would be helpful and most appreciated.
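For context, one way to get marker-dependent formatting in Log4j 2 is to route on the marker name with a RoutingAppender, assuming a version that provides the `marker` lookup; the marker name, file paths, and appender names below are illustrative, not from the original question:

```xml
<Routing name="Router">
  <Routes pattern="$${marker:}">
    <!-- Events carrying the JSON marker get one layout... -->
    <Route key="JSON">
      <File name="JsonFile" fileName="logs/app.json">
        <JsonLayout/>
      </File>
    </Route>
    <!-- ...all other events fall through to a pattern layout. -->
    <Route>
      <File name="PlainFile" fileName="logs/app.log">
        <PatternLayout pattern="%d %p %c [%t] %m%n"/>
      </File>
    </Route>
  </Routes>
</Routing>
```

Strictly speaking this delegates to two appenders behind one routing appender, rather than switching layouts inside a single appender, but the logger configuration only references "Router".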

huang | 20 Apr 09:18 2014

not lose log events while reload its configuration problem

Hi all,
         I am learning the source code of Log4j 2. I want to know how Log4j 2
can reload its configuration "without losing log events while reconfiguration
is taking place." How does it make sure no log events are lost while the
configuration is being reloaded?
      With regards.
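For context, a minimal sketch of the general idea (this is illustrative pseudostructure, not Log4j's actual code): the logging path reads the active configuration through an atomic reference, the new configuration is fully built off to the side, and only then swapped in with one atomic operation, so events logged during the rebuild still see the complete old configuration.

```java
import java.util.concurrent.atomic.AtomicReference;

public class ConfigSwap {
    // Stand-in for a logging configuration; the real one holds appenders etc.
    static final class Config {
        final String name;
        Config(String name) { this.name = name; }
        String route(String event) { return name + ":" + event; }
    }

    private final AtomicReference<Config> active =
            new AtomicReference<>(new Config("old"));

    // Logging path: always sees a fully initialized configuration.
    public String log(String event) {
        return active.get().route(event);
    }

    // Reconfiguration path: build first, swap second; the old configuration
    // would only be stopped after no new events can reach it.
    public void reconfigure(Config fresh) {
        Config previous = active.getAndSet(fresh);
        // previous.stop() would happen here in a real implementation
    }

    public static void main(String[] args) {
        ConfigSwap swap = new ConfigSwap();
        System.out.println(swap.log("e1")); // prints "old:e1"
        swap.reconfigure(new Config("new"));
        System.out.println(swap.log("e2")); // prints "new:e2"
    }
}
```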

Jeff Shaw | 16 Apr 21:55 2014

How to make a custom connection source available in log4j 2 config?

I made a custom connection source that I want to use. (Source follows
this message.) However, when I attempt to use my BoneCP connection
source in my config, I get the error, "ERROR JDBC contains an invalid
element or attribute "BoneCP"". What else do I need to do to make my
custom connection source available in the configuration?

I'm hoping the answer will also apply to a custom appender and manager
I've written, neither of which work because they also cannot be
instantiated from the configuration, however the error is a class not
found error.



/* Copyright (c) Bit Gladiator on 2014. */

@Plugin(name = "BoneCP", category = "Core",
    elementType = "connectionSource", printObject = true)
public class BoneCPConnectionSource implements ConnectionSource {
  private static final Logger LOGGER = StatusLogger.getLogger();

  private final BoneCP pool;

  private BoneCPConnectionSource(final BoneCP pool) {
    this.pool = pool;
(Continue reading)
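For context, Log4j 2 discovers plugins by scanning the packages listed in the configuration's `packages` attribute, so a custom plugin's package usually has to be declared there as well; the package name below is illustrative:

```xml
<Configuration status="WARN" packages="com.example.logging">
  ...
</Configuration>
```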

David KOCH | 15 Apr 15:10 2014

Issue with log4j and Glassfish


Glassfish 3.1.2 does not seem to find (some) log4j2 classes when the log4j2
dependencies are not directly included in the web application's pom.xml. In
my case, I have a separate artifact which contains a custom log appender
and all of the log4j2 stuff.

I already followed the advice in
LPS-21525<https://issues.liferay.com/browse/LPS-21525> to
help avoid some errors but I still get the following messages when trying
to deploy my application,

while trying to load Bean Class
org.apache.logging.log4j.core.async.RingBufferLogEvent$Factory :
java.lang.NoClassDefFoundError: com/lmax/disruptor/EventFactory|#]

See here <http://pastebin.com/RWRH2uxm> for a verbose list.

The classes are present in the web application's jar, albeit in:

How can I fix this? Any help is appreciated,
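For reference, the missing class `com.lmax.disruptor.EventFactory` comes from the LMAX Disruptor library, which Log4j 2 uses for its asynchronous loggers; one thing to verify is that the disruptor jar actually reaches the deployment classpath, e.g. as an explicit Maven dependency (the version shown is illustrative):

```xml
<dependency>
    <groupId>com.lmax</groupId>
    <artifactId>disruptor</artifactId>
    <version>3.2.1</version>
</dependency>
```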


Mahesh Dilhan | 10 Apr 04:24 2014

Catalina.out trace : memory leak


I get the following Catalina console output continuously when I try to stop
the web application.

Brief notes on the configuration:
*version: rc1*


<Configuration status="OFF">
    <RollingRandomAccessFile name="RollingFile-${web:contextPath}"
            immediateFlush="false" append="false">
        <PatternLayout>
            <Pattern>%d %p %c{1.} [%t] %m%n</Pattern>
        </PatternLayout>
        <TimeBasedTriggeringPolicy />
    </RollingRandomAccessFile>
    <Root level="INFO" includeLocation="false">
        <AppenderRef ref="RollingFile-${web:contextPath}"/>
    </Root>

(Continue reading)

Manuel Teira | 9 Apr 09:57 2014

Compressing only old rollover files

Hello all,

I'm evaluating a switch to log4j-2, since my application is required to
roll over files by age and size (for which the composite triggering
policies come in handy). The rolled-over files shall also be compressed,
but only those reaching a given age.

What would be the preferred approach to achieve that with log4j-2? Would
it be reasonable to write a custom rollover strategy, or is there another
out-of-the-box way that may work?
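For context, the age-plus-size triggering part can be expressed with composite policies, while compression in the stock setup is driven by the filePattern suffix (e.g. ".gz") and applies to every rollover, which is why compressing only old files likely needs a custom strategy; the paths and size below are illustrative:

```xml
<RollingFile name="Roller" fileName="logs/app.log"
             filePattern="logs/app-%d{yyyy-MM-dd}-%i.log.gz">
    <PatternLayout pattern="%d %p %m%n"/>
    <Policies>
        <TimeBasedTriggeringPolicy/>
        <SizeBasedTriggeringPolicy size="10 MB"/>
    </Policies>
</RollingFile>
```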

Thanks and best regards,

Matt Sicker | 6 Apr 03:10 2014

Slides for my upcoming Introduction to Log4j 2 talk at ApacheCon 2014

I submitted these a bit late due to not noticing when we were supposed to submit them, but I finished them! Attached is a PDF rendering of the slides (hopefully this works).

Matt Sicker <boards <at> gmail.com>

To unsubscribe, e-mail: log4j-user-unsubscribe <at> logging.apache.org
For additional commands, e-mail: log4j-user-help <at> logging.apache.org
Arkin Yetis | 4 Apr 21:04 2014

Flume Appender failure due to filesystem issue

We use the Flume Appender. Our logging stopped after a certain point in
time, and we noticed the exception at the end of this message in our
application logs. It looks like there was an issue with the filesystem.
Although the filesystem has recovered, the appender (or probably the
persistence mechanism it uses) was stuck in this state, and it took an
application restart for it to resume logging. It does not look like there
is a recovery mechanism, or if there is one, it failed.
Would you like me to open a Log4j JIRA ticket for this? Or is this
something that can be prevented by something simple you can share over
e-mail, such as a certain configuration setting?

- Arkin

Exception stack is:
1. Stale NFS file handle (java.io.IOException)
  java.io.RandomAccessFile:-2 (null)
2. Environment invalid because of previous exception: (JE 5.0.73)
/app/logs/abs-workflow/flumeDir java.io.IOException: Stale NFS file handle
LOG_READ: IOException on read, log is likely invalid. Environment is
invalid and must be closed. fetchTarget of 0x542/0x4af13c parent IN=5 IN
class=com.sleepycat.je.tree.BIN lastFullVersion=0x543/0x62d6c5
lastLoggedVersion=0x543/0x62d6c5 parent.getDirty()=true state=0
  com.sleepycat.je.log.FileManager:1883 (null)

Root Exception stack trace:java.io.IOException: Stale NFS file handle
    at java.io.RandomAccessFile.readBytes(Native Method)
    at java.io.RandomAccessFile.read(RandomAccessFile.java:338)
    at com.sleepycat.je.log.FileManager.readFromFile(FileManager.java:1869)
    at com.sleepycat.je.log.FileManager.readFromFile(FileManager.java:1807)
    at com.sleepycat.je.log.FileSource.getBytes(FileSource.java:56)
    at com.sleepycat.je.log.LogManager.getLogEntry(LogManager.java:848)
    at com.sleepycat.je.tree.IN.fetchTarget(IN.java:1412)
    at com.sleepycat.je.tree.BIN.fetchTarget(BIN.java:1251)
    at com.sleepycat.je.dbi.CursorImpl.fetchCurrent(CursorImpl.java:2261)
    at com.sleepycat.je.dbi.CursorImpl.getNext(CursorImpl.java:1593)
    at com.sleepycat.je.utilint.DaemonThread.run(DaemonThread.java:163)
    at java.lang.Thread.run(Thread.java:662)
Mohit Anchlia | 2 Apr 19:00 2014

Non blocking JMS appender

I am trying to configure log4j such that the jms appender is non blocking.
Does this configuration make it non blocking?

   <appender name="async" class="org.apache.log4j.AsyncAppender">
        <param name="BufferSize" value="4096" />
        <param name="blocking" value="false" />
        <appender-ref ref="search-indexer-async-jms" />
    </appender>

    <appender name="search-indexer-async-jms" class="...">
        <param name="InitialContextFactoryName"
               value="org.apache.activemq.jndi.ActiveMQInitialContextFactory" />
        <param name="ProviderURL" value="tcp://localhost:61616" />
        <param name="TopicBindingName" value="indexTopicEndpoint" />
        <param name="TopicConnectionFactoryBindingName" value="..." />
    </appender>

    <root>
        <appender-ref ref="async" />
    </root>