This vignette presents benchmarks for log4r against the general-purpose logging packages available on CRAN: futile.logger, logging, logger, and lgr.
Each logging package features slightly different capabilities, but these benchmarks focus on the two situations common to all of them: (1) logging a simple message to the console, and (2) deciding not to log a message because it falls below the current threshold.
The first of these is likely the most common kind of logging done by end users, although some may choose to log to files, over HTTP, or to the system log (among others). Yet a benchmark of these other scenarios would largely show the relative expense of those operations rather than the overhead of the logic performed by the logging packages themselves.
The second measures the performance impact of leaving logging messages in running code, even if they are below the current threshold of visibility. This is another measure of overhead for each logging package.
cat()
As a reference point, we can measure how long it takes R itself to write a simple message to the console:
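Something like the following can serve as that baseline (a sketch: the exact message format is only meant to mirror the loggers' output shown later, and cat_info() and cat_debug() are the wrapper names used in the benchmarks below):

cat_info <- function() {
  # Format and print a line resembling the loggers' output.
  cat(sprintf("INFO [%s] Info message.\n", format(Sys.time(), "%Y-%m-%d %H:%M:%S")))
}

cat_debug <- function() {
  # Print nothing, mirroring a suppressed debug message.
  cat()
}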
The following is a typical log4r setup:
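(A sketch of such a setup, assuming the default console appender and an INFO threshold; log4r_info() and log4r_debug() are the wrapper names used in the benchmarks below.)

log4r_logger <- log4r::logger(threshold = "INFO")

log4r_info <- function() {
  log4r::info(log4r_logger, "Info message.")
}

log4r_debug <- function() {
  log4r::debug(log4r_logger, "Debug message.")
}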
The following is a typical futile.logger setup:
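(A sketch, assuming futile.logger's default INFO threshold and console output; fl_info() and fl_debug() are the wrapper names used in the benchmarks below.)

requireNamespace("futile.logger")

fl_info <- function() {
  futile.logger::flog.info("Info message.")
}

fl_debug <- function() {
  futile.logger::flog.debug("Debug message.")
}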
The following is what I believe to be a typical logging setup:
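(A sketch; basicConfig() installs the package's default console handler at the INFO level, and logging_info() and logging_debug() are the wrapper names used in the benchmarks below.)

requireNamespace("logging")
logging::basicConfig()

logging_info <- function() {
  logging::loginfo("Info message.")
}

logging_debug <- function() {
  logging::logdebug("Debug message.")
}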
The following is what I believe to be a typical logger setup:
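(A sketch, assuming the package's default console output and INFO threshold; logger_info() and logger_debug() are the wrapper names used in the benchmarks below.)

requireNamespace("logger")

logger_info <- function() {
  logger::log_info("Info message.")
}

logger_debug <- function() {
  logger::log_debug("Debug message.")
}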
The following is what I believe to be a typical lgr setup:
requireNamespace("lgr")
#> Loading required namespace: lgr
lgr_logger <- lgr::get_logger("perf-test")
lgr_logger$set_appenders(list(cons = lgr::AppenderConsole$new()))
# Disable propagation to the root logger so messages are not printed twice.
lgr_logger$set_propagate(FALSE)

lgr_info <- function() {
  lgr_logger$info("Info message.")
}

lgr_debug <- function() {
  lgr_logger$debug("Debug message.")
}
Debug messages should print nothing.
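For instance, calling each of the debug functions defined above should produce no output at the INFO threshold:

cat_debug()
log4r_debug()
logging_debug()
fl_debug()
logger_debug()
lgr_debug()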
Info messages should print to the console. Small differences in output format are to be expected.
log4r_info()
#> INFO [2019-09-04 17:46:26] Info message.
cat_info()
#> INFO [2019-09-04 17:46:26] Info message.
logging_info()
#> 2019-09-04 17:46:26 INFO::Info message.
fl_info()
#> INFO [2019-09-04 17:46:26] Info message.
logger_info()
#> INFO [2019-09-04 17:46:26] Info message.
lgr_info()
#> INFO [17:46:26.100] Info message.
The following code benchmarks all of the loggers defined above:
info_bench <- microbenchmark::microbenchmark(
  cat = cat_info(),
  log4r = log4r_info(),
  futile.logger = fl_info(),
  logging = logging_info(),
  logger = logger_info(),
  lgr = lgr_info(),
  times = 500,
  control = list(warmup = 50)
)
debug_bench <- microbenchmark::microbenchmark(
  cat = cat_debug(),
  log4r = log4r_debug(),
  futile.logger = fl_debug(),
  logging = logging_debug(),
  logger = logger_debug(),
  lgr = lgr_debug(),
  times = 500,
  control = list(warmup = 50)
)
print(info_bench, order = "median")
#> Unit: microseconds
#>           expr    min      lq      mean  median      uq     max neval
#>            cat   38.5   52.40   73.9786   57.65   71.55  3005.2   500
#>          log4r   34.2   57.65   73.1090   65.30   76.10  1620.4   500
#>         logger  234.1  277.45  323.6790  320.50  353.60  1624.0   500
#>        logging  347.6  415.75  512.0916  458.75  507.90 10692.0   500
#>            lgr 1373.2 1505.25 1712.7456 1562.90 1633.50 39488.1   500
#>  futile.logger 2451.5 2701.70 2906.5282 2794.80 2910.15  5774.6   500
print(debug_bench, order = "median")
#> Unit: microseconds
#>           expr   min     lq     mean median     uq    max neval
#>            cat   1.6   4.60   8.2034   6.10   6.90  913.7   500
#>          log4r   7.9  14.80  22.9410  19.25  22.90 1128.0   500
#>        logging  24.8  40.80  59.8054  49.20  59.45 2129.2   500
#>         logger  35.9  57.45  72.7706  64.65  79.90 1056.4   500
#>            lgr 214.6 280.30 343.3326 321.00 364.50 4310.1   500
#>  futile.logger 761.5 874.00 938.2286 910.60 947.10 3980.3   500