Benjamin Jaton | 3 Jul 00:53 2015

RollingFileAppender MaxSize + keep 30 days

Hello,

How can I define a RollingFileAppender that would roll when it reaches
100MB, and that would discard any log older than 30 days (not before)?

I need to keep 30 days of logs for auditing purposes, but I also want to
limit the logs to a certain size as well.

Thanks!
Ben
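
[If this is Log4j 2 (2.5 or later), a configuration along these lines should cover both requirements: a SizeBasedTriggeringPolicy rolls at 100 MB, and the Delete action on DefaultRolloverStrategy removes rolled files by age. The appender name, paths, and glob below are placeholders — a sketch, not a tested configuration:]

```xml
<RollingFile name="audit" fileName="logs/app.log"
             filePattern="logs/app-%d{yyyy-MM-dd}-%i.log.gz">
  <PatternLayout pattern="%d %-5p %c - %m%n"/>
  <Policies>
    <!-- Roll whenever the current file reaches 100 MB -->
    <SizeBasedTriggeringPolicy size="100 MB"/>
  </Policies>
  <DefaultRolloverStrategy>
    <!-- On each rollover, delete archives older than 30 days -->
    <Delete basePath="logs" maxDepth="1">
      <IfFileName glob="app-*.log.gz"/>
      <IfLastModified age="30d"/>
    </Delete>
  </DefaultRolloverStrategy>
</RollingFile>
```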
dyf6372 | 1 Jul 10:43 2015

Question about using log4j

We use log4j to write our system's log. Sometimes the computer crashes and reboots. Before the
computer goes down, log4j prints many "@" characters. I want to know what "@" means and when it
appears. Thank you very much!

My Best Wishes!
董一峰
15201346372
dyf6372@163.com, dyf6372@gmail.com
Saurabh Jain | 30 Jun 10:49 2015

Mapped diagnostic context

Hello list

I am implementing MDC because, per a requirement, I have to pass additional
parameters to the logger.

Can someone suggest the best way to implement MDC, or any other way
of achieving this requirement?

Thanks,
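
[In Log4j 2 the MDC is exposed through the ThreadContext map: a value stored with ThreadContext.put("userId", "jdoe") — the key name here is only an example — can be emitted by any layout via the %X conversion pattern. A minimal sketch:]

```xml
<!-- %X{userId} prints the ThreadContext value stored under the (example) key "userId";
     bare %X would print the whole context map -->
<PatternLayout pattern="%d %-5p [%X{userId}] %c - %m%n"/>
```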
Ralph Goers | 29 Jun 18:33 2015

Re: Hide *.jar

OK, so you only want to hide hibernate from the stack trace.  Then modify your pattern for the PatternLayout
to include %xEx{full,filters[org.hibernate]}
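
[In context, the pattern element might look something like the following — Ralph's filter expression is copied verbatim, and the rest of the pattern is a placeholder:]

```xml
<!-- %xEx with a package filter suppresses the org.hibernate frames from the trace -->
<PatternLayout pattern="%d %-5p %c - %m%n%xEx{full,filters[org.hibernate]}"/>
```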

Ralph

> On Jun 29, 2015, at 9:19 AM, Emi Lu <emilu@encs.concordia.ca> wrote:
> 
> Mailing list bounced my email. 
> 
> -------- Forwarded Message --------
> Subject:	Re: Hide *.jar
> Date:	Mon, 29 Jun 2015 12:16:34 -0400
> From:	Emi Lu <emilu@encs.concordia.ca>
> Reply-To:	emilu@encs.concordia.ca
> To:	log4j-user@logging.apache.org
> 
> On 06/29/2015 12:04 PM, Ralph Goers wrote:
>> Can you post the full output of the log event?
>> 
> Here it is. 
> 
> Red lines are the ones I want to hide. 
> 
> 2015-Jun-29 12:10:39.214 [http-8080-5] ERROR DBUtilUser - set gui_model Error: could not extract ResultSet  SELECT   ****** ORDER BY   1, 2, 3 
> 2015-Jun-29 12:10:39.215 [http-8080-5] ERROR DBUtilUser - org.hibernate.exception.SQLGrammarException: could not extract ResultSet
> 2015-Jun-29 12:10:39.217 [http-8080-5] ERROR org.apache.struts2.dispatcher.DefaultDispatcherErrorHandler - Exception occurred during processing request: could not extract ResultSet

Emi Lu | 29 Jun 16:45 2015

Hide *.jar

Hello List,

May I know how to hide the following *.jar info please?

     at org.hibernate.exception.internal.SQLStateConversionDelegate.convert(SQLStateConversionDelegate.java:123) ~[hibernate-core-4.3.8.Final.jar:4.3.8.Final]
     at org.hibernate.exception.internal.StandardSQLExceptionConverter.convert(StandardSQLExceptionConverter.java:49) ~[hibernate-core-4.3.8.Final.jar:4.3.8.Final]
     at org.hibernate.engine.jdbc.spi.SqlExceptionHelper.convert(SqlExceptionHelper.java:126) ~[hibernate-core-4.3.8.Final.jar:4.3.8.Final]
     at org.hibernate.engine.jdbc.spi.SqlExceptionHelper.convert(SqlExceptionHelper.java:112) ~[hibernate-core-4.3.8.Final.jar:4.3.8.Final]
     at org.hibernate.engine.jdbc.internal.ResultSetReturnImpl.extract(ResultSetReturnImpl.java:91) ~[hibernate-core-4.3.8.Final.jar:4.3.8.Final]
     at org.hibernate.loader.Loader.getResultSet(Loader.java:2066) ~[hibernate-core-4.3.8.Final.jar:4.3.8.Final]
     at org.hibernate.loader.Loader.executeQueryStatement(Loader.java:1863) ~[hibernate-core-4.3.8.Final.jar:4.3.8.Final]
     at org.hibernate.loader.Loader.executeQueryStatement(Loader.java:1839) ~[hibernate-core-4.3.8.Final.jar:4.3.8.Final]

Benjamin Jaton | 26 Jun 00:39 2015

SMTP appender + filter

Hello,

I am trying to create a filter at the appender level that would show all
the messages that contain "Show".
This is my test:

        LogManager.getLogger(Test.class).debug("Hide me!");
        LogManager.getLogger(Test.class).info("test");
        LogManager.getLogger(Test.class).debug("Show me (debug)");
        LogManager.getLogger(Test.class).fatal("Hide me! (fatal)");
        LogManager.getLogger(Test.class).info("Show me (info)");
        LogManager.getLogger(Test.class).info("Hide me!");

And this is the configuration:

<Configuration status="debug" name="MyApp" packages="">
  <Appenders>
    <SMTP name="Mail" subject="Error Log" to="..." from="..."
smtpProtocol="..." smtpHost="..." smtpPort="..." smtpUsername="..."
smtpPassword="..." bufferSize="1">
        <RegexFilter regex=".*Show.*" onMatch="ACCEPT" onMismatch="DENY"/>
        <PatternLayout>
            <pattern>%5p %m%n</pattern>
        </PatternLayout>
    </SMTP>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="Mail"/>
    </Root>

David KOCH | 22 Jun 14:22 2015

Checking if logger is enabled for a specific marker

Hello,

Is there any way to programmatically determine whether a logger is enabled for a
specific log level and marker? If so, how? There is a Logger#isEnabled(Level
level, Marker marker) method, but it seems to ignore the marker argument.

I would like to avoid carrying out relatively expensive operations required
for preparing the log message when it's not required. The alternative,
implementing Message#getFormattedMessage would be quite clumsy in this case
so I'd like to be able to pre-check based on marker and log level instead.

I tried this in log4j.xml:

<Logger name="com.xxxxx.rtb.log.LoggingBidInterceptor" level="info" additivity="false">
  <MarkerFilter marker="bid_req_proto" onMatch="DENY" onMismatch="DENY"/>
  <AppenderRef ref="Console"/>
  <AppenderRef ref="KafkaBidRequest" level="info"/>
  <AppenderRef ref="KafkaBidResponse" level="off"/>
</Logger>

but logger.isInfoEnabled(MarkerManager.getMarker("bid_req_proto"))
evaluates to "true" no matter how I set up the MarkerFilter.

Regards,

/David
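
[One thing that may be worth trying: a filter attached to a logger or an appender reference is only consulted when an event is actually routed, whereas the isEnabled checks also consult context-wide filters, i.e. filters declared directly under <Configuration>. Moving the MarkerFilter to that level might make the pre-check behave as expected — a sketch, untested:]

```xml
<Configuration status="WARN">
  <!-- Context-wide filter: DENY events carrying this marker, pass everything else -->
  <MarkerFilter marker="bid_req_proto" onMatch="DENY" onMismatch="NEUTRAL"/>
  <Appenders>
    <Console name="Console" target="SYSTEM_OUT">
      <PatternLayout pattern="%d %-5p %marker %m%n"/>
    </Console>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="Console"/>
    </Root>
  </Loggers>
</Configuration>
```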
Serdyn du Toit | 21 Jun 23:54 2015

flushing async loggers

Hi,

How do I flush async loggers?

I'm currently testing some code with multiple threads - when the test is
finished, almost none of the log statements have been printed.  This also
makes me worried about my production code's logging when the server
gracefully shuts down (will it just not log, as is the case in my test, or
will it actually flush?).

The only clue I've found on how to do this is the following stackoverflow
thread:

http://stackoverflow.com/questions/3060240/how-do-you-flush-a-buffered-log4j-fileappender

But the solution just doesn't seem to work for me.

I don't think it's relevant to the question, but here is my log4j.xml

<?xml version="1.0" encoding="UTF-8"?>
<configuration status="WARN">
  <appenders>
    <Console name="CONSOLE" target="SYSTEM_OUT">
      <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n" />
    </Console>
    <RollingRandomAccessFile name="FILE" fileName="logs/webapplog.log"
        filePattern="logs/$${date:yyyy-MM}/webapplog-%d{yyyy-MM-dd}-%i.log.gz">
      <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} -

Mohan Bhargava | 19 Jun 22:51 2015

Fwd: Error while logging , using DbAppender

I am trying to log to an Oracle 11g database using log4j's `DBAppender` (part
of the Apache Extras project for log4j). I am using log4j 1.2.17.

I have created the required tables by modifying the Oracle.sql script. I had to
tweak the logging_event_id_seq_trig trigger to populate the event_id column in
the `logging_event` table. This was to avoid a SQLException resulting from
null values in the event_id column.

    CREATE SEQUENCE logging_event_id_seq MINVALUE 1 START WITH 1;

    CREATE TABLE logging_event
      (
        sequence_number   NUMBER(20) NOT NULL,
        timestamp         NUMBER(20) NOT NULL,
        rendered_message  VARCHAR2(4000) NOT NULL,
        logger_name       VARCHAR2(254) NOT NULL,
        level_string      VARCHAR2(254) NOT NULL,
        ndc               VARCHAR2(4000),
        thread_name       VARCHAR2(254),
        reference_flag    NUMBER(5),
        caller_filename   VARCHAR2(254) NOT NULL,
        caller_class      VARCHAR2(254) NOT NULL,
        caller_method     VARCHAR2(254) NOT NULL,
        caller_line       CHAR(4) NOT NULL,
        event_id          NUMBER(10) PRIMARY KEY
      );

    CREATE TRIGGER logging_event_id_seq_trig
      BEFORE INSERT ON logging_event
      FOR EACH ROW

lomax0000@gmail.com | 12 Jun 20:32 2015

log4j configuration ignored for inner classes

Hi all,

I am trying to cut back on some of the logging noise from the Spark / Kafka
frameworks by editing log4j.properties. However, it seems like
package-specific customisations are ignored for inner classes.
For example, the following configuration lines mostly work as expected,
suppressing most logs from these packages below WARN level:
log4j.logger.org.apache.spark=WARN
log4j.logger.kafka.utils=WARN

However, I am still getting log messages at INFO from classes like:
org.apache.spark.Logging$class
kafka.utils.Logging$class

I suspect it's because these are inner classes. It still happens even when
I go up a level and add configurations like "log4j.logger.org=WARN".

Is this a known bug in log4j? Is there any known way to suppress these,
ideally through configuration rather than programmatically?

Many thanks
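
[If those really are the logger names (rather than caller-class location info printed by a %C pattern), naming them literally is worth a try — in log4j 1.x, `$` is not a hierarchy separator, so an exact match is the safest bet. A sketch, untested:]

```properties
# Target the exact logger names seen in the output (hypothetical fix)
log4j.logger.org.apache.spark.Logging$class=WARN
log4j.logger.kafka.utils.Logging$class=WARN
```

[That said, if even `log4j.logger.org=WARN` has no effect, the likelier cause is that a different log4j.properties elsewhere on the classpath is winning; running with `-Dlog4j.debug` shows which file log4j 1.x actually loads.]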
Adam Retter | 9 Jun 21:08 2015

Write log files relative to log4j2.xml?

Hi there,

Is it somehow possible to specify that the log4j2 log files should be
written somewhere relative to the configuration file?

For example my config file is here - /somewhere/my-app/logj2.xml

I would like to ensure that regardless of which folder my application
is started from then my log files are always written to -
/somewhere/my-app/var/log

My log4j2.xml has something like this -

        <RollingRandomAccessFile name="my-app" fileName="var/log/my-app.log"
                filePattern="var/log/my-app.log.gz">
            <Policies>
                <SizeBasedTriggeringPolicy size="10MB"/>
            </Policies>
            <DefaultRolloverStrategy max="14"/>
            <PatternLayout pattern="%d [%t] %-5p (%F [%M]:%L) - %m %n"/>
        </RollingRandomAccessFile>

However, var/log/my-app.log is always written relative to the folder
where my-app is started from and not relative to where the config file
is.

Any ideas?
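
[As far as I know, Log4j 2 has no built-in "directory of the config file" lookup, but the same effect is achievable by passing the application home as a system property at launch and referencing it with the ${sys:...} lookup. The property name app.home below is only an example:]

```xml
<RollingRandomAccessFile name="my-app"
        fileName="${sys:app.home}/var/log/my-app.log"
        filePattern="${sys:app.home}/var/log/my-app.log.gz">
```

[Started with e.g. `java -Dapp.home=/somewhere/my-app ...`, the logs then land under /somewhere/my-app/var/log regardless of the working directory.]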

-- 
Adam Retter

