The problem I have is that when I run several processes in parallel with Log4j, a log file is created for each process, but all of the output ends up in just one of them, chosen seemingly at random. The code is written in Groovy.
This is my .properties:
log4j.rootLogger=ERROR, INFO, DEBUG
# Define which packages use which appenders
# Custom Appender
log4j.appender.customLogger=org.apache.log4j.RollingFileAppender
log4j.appender.customLogger.File=/dummy
log4j.appender.customLogger.MaxFileSize=3000KB
log4j.appender.customLogger.MaxBackupIndex=15
log4j.appender.customLogger.layout=org.apache.log4j.PatternLayout
log4j.appender.customLogger.layout.ConversionPattern=[%5p] %d [%c{1}] (%F:%M:%L)%n%m%n%n
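(As far as I know, log4j 1.x expects the rootLogger value to be a single level followed by appender names, e.g. log4j.rootLogger=INFO, customLogger, rather than a list of levels, although my Groovy code below overrides this property at runtime anyway.)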
And this is my logger.groovy:
import org.apache.commons.logging.Log
import org.apache.commons.logging.LogFactory
import org.apache.log4j.PropertyConfigurator

class CustomLogin {
    def conf
    def log4javaProp

    def getCustomLogger(conf, name) {
        Log log = LogFactory.getLog(org.apache.log4j.ConsoleAppender);
        Properties props = new Properties();
        try {
            if (conf['LOG_CONFIG']['LOG4JAVA'] != null) {
                log4javaProp = "/" + conf['LOG_CONFIG']['LOG4JAVA']
            } else {
                log4javaProp = "/CustomLog4j.properties"
            }
            InputStream configStream = getClass().getResourceAsStream(log4javaProp);
            props.load(configStream);
            configStream.close();
        } catch (IOException e) {
            throw new IOException("FATAL: CustomLog4j.properties NOT FOUND");
        }
        props.setProperty("log4j.rootLogger", "INFO, customLogger");

        // Get the PID of the current JVM
        java.lang.management.RuntimeMXBean runtime = java.lang.management.ManagementFactory.getRuntimeMXBean();
        java.lang.reflect.Field jvm = runtime.getClass().getDeclaredField("jvm");
        jvm.setAccessible(true);
        sun.management.VMManagement mgmt = (sun.management.VMManagement) jvm.get(runtime);
        java.lang.reflect.Method pid_method = mgmt.getClass().getDeclaredMethod("getProcessId");
        pid_method.setAccessible(true);
        int pid = (Integer) pid_method.invoke(mgmt);
        // End of PID lookup

        // Set the name of the file the log will be written to
        def filepath = "log/" + conf['LOG_CONFIG']['LOG_DIR'] + "/";
        props.setProperty("log4j.appender.customLogger.File", filepath + name + "_" + pid + "_" + System.currentTimeMillis() + ".log");
        //LogManager.resetConfiguration();
        PropertyConfigurator.configure(props);
    }

    def destroyCustomLogger(conf) {
        org.apache.log4j.Logger.getRootLogger().removeAppender("customLogger");
    }
}
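For context, this is (simplified) how each parallel job gets its logger; the conf map and the job name here are just placeholders for my real setup:

def customLog = new CustomLogin()
customLog.getCustomLogger(conf, "Sqoop_x1")   // each job passes its own name
// ... run the Sqoop import for this job ...
customLog.destroyCustomLogger(conf)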
To test it, I'm launching a Sqoop job that runs 3 imports in parallel, and these are the log files it writes:
426 Jun 22 11:01 main
0 Jun 22 11:01 Sqoop x1.log
0 Jun 22 11:01 Sqoop x2.log
8356 Jun 22 11:02 Sqoop x3.log
What I want is for each of the parallel imports to write its output to its own log file, instead of all three outputs ending up together in one of them, chosen at random on each execution.
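In case it helps to show what I mean, this is the kind of per-job setup I think I need. It is just a sketch, assuming the three imports run as threads inside the same JVM; the method and logger names are made up:

import org.apache.log4j.Level
import org.apache.log4j.Logger
import org.apache.log4j.PatternLayout
import org.apache.log4j.RollingFileAppender

// Sketch: give every parallel job its own Logger with its own appender,
// instead of reconfiguring the shared root logger from each job.
Logger buildJobLogger(String jobName, String logDir) {
    Logger jobLogger = Logger.getLogger("sqoop." + jobName)   // hypothetical per-job logger name
    jobLogger.setLevel(Level.INFO)
    jobLogger.setAdditivity(false)                             // don't bubble events up to the root logger

    RollingFileAppender appender = new RollingFileAppender(
            new PatternLayout("[%5p] %d [%c{1}] (%F:%M:%L)%n%m%n%n"),
            logDir + "/" + jobName + "_" + System.currentTimeMillis() + ".log")
    appender.setMaxFileSize("3000KB")
    appender.setMaxBackupIndex(15)
    jobLogger.addAppender(appender)
    return jobLogger
}

Is something along these lines the right direction, or is there a way to keep the PropertyConfigurator approach and still get one file per job?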