logger
quantpylib.logger is a thread-safe logging module for quantpylib, featuring custom formatters and handlers for recording structured log messages and reducing log latency. It is designed for Python applications that require both structured log analysis and low-latency logging.

In particular, quantpylib.logger.formatters contains formatters extending the logging.Formatter class, designed for structured logging that can be consumed by downstream log aggregation and analysis tools. quantpylib.logger.handlers contains handlers extending the logging.Handler class, designed to improve log write latencies, such as buffered handlers that reduce I/O operations. These can be used separately in your own logging setup.

Alternatively, the quantpylib.logger.Logger object pre-configures a singleton, named logger. It allows the user to log messages to stdout, to files, or to both, using the custom handlers and formatters. We demonstrate with examples first; the documentation follows.
Examples
The examples are simple. Let us start with some imports: the logging library, quantpylib.logger's Logger and JSONFormatter objects, and argparse so that we can pass arguments into our Python script.
```python
import logging
import argparse

from quantpylib.logger.logger import Logger
from quantpylib.logger.formatters import JSONFormatter

parser = argparse.ArgumentParser()
parser.add_argument('-c', '--config', type=str, help='Specify logging config')
args = parser.parse_args()
```
We want to log different kinds of information about our Python application - let's create a function that generates the logs. We will use the same function for all configurations, and see how the different specifications change logging behavior.
```python
def log():
    logging.debug("debug log", extra={"my debug key": "my debug value"})
    logging.info("info log", extra={"my info key": "my info value"})
    logging.warning("warning log", extra={"my warning key": "my warning value"})
    try:
        1/0
    except ZeroDivisionError:
        logging.exception("caught zero denominator", extra={"my exception key": "my exception value"})
    # input('hold')
```

Note the input('hold') line: we keep it commented out for now, and will come back to it later.
Let us take a look at the configuration options for our Logger object (see the Documentation section below):
```python
class Logger:
    # ... some stuff
    def __new__(
        cls,
        name='root',
        register_handlers=[],
        stdout_register=True,
        stdout_level=logging.INFO,
        stdout_formatter_cls=JSONFormatter,
        stdout_formatter_kwargs={},
        file_register=True,
        filename="app.log",
        logs_dir="./logs/",
        file_level=logging.INFO,
        file_formatter_cls=JSONFormatter,
        file_formatter_kwargs={},
        file_handler_cls=BufferedFileHandler,
        file_handler_kwargs={"buffer_size": 64},
    ):
```
By default, the singleton configures the root logger. stdout_register and file_register are both set to True, meaning we log both to the console and to a file. Both levels are set to logging.INFO, meaning we should only see messages at the logging.INFO level and above. The remaining arguments are fairly self-explanatory, so let us try different configurations:
Default Python Logger
Let us begin with no arguments, so that the default Python root logger is used. We run and get:

```
using default python logger
WARNING:root:warning log
ERROR:root:caught zero denominator
Traceback (most recent call last):
  File "/Users/admin/Desktop/projects/quantpylib/examples/example_logger.py", line 16, in log
    1/0
    ~^~
ZeroDivisionError: division by zero
```
We do not see the logging.DEBUG and logging.INFO level messages (the default root logger level is logging.WARNING), and we get a nice, compact message format. However, this can be rather difficult to parse in a log ingestion stack. Additionally, we see that the extra values in the log records are ignored.
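The extra mapping is not actually lost: the logging machinery merges it into the LogRecord as attributes, but the default %-style format string simply never references those attributes. A quick stdlib-only check (independent of quantpylib) makes this visible:

```python
import logging

captured = []

class CaptureHandler(logging.Handler):
    """Store raw LogRecord objects so we can inspect them."""
    def emit(self, record):
        captured.append(record)

lg = logging.getLogger("extra_demo")
lg.addHandler(CaptureHandler())
lg.warning("warning log", extra={"my warning key": "my warning value"})

# The extra keys live on the record, even though the default
# formatter's "%(message)s" pattern never prints them.
print(captured[0].__dict__["my warning key"])
```

A structured formatter only has to read those attributes back off the record to surface them.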
Default Configurations
Let us try the default configs in the Logger:

```python
example_configs = {
    "name": "root",
    "register_handlers": [],
    "stdout_register": True,
    "stdout_level": logging.INFO,
    "file_register": True,
    "file_level": logging.INFO,
}

if args.config:
    if args.config == "d":
        print("using default quantpylib logger")
        Logger()
```

We run and get:

```
{"level": "INFO", "time": "2024-06-15T15:48:17.562138+00:00", "message": "info log", "my info key": "my info value"}
{"level": "WARNING", "time": "2024-06-15T15:48:17.562633+00:00", "message": "warning log", "my warning key": "my warning value"}
{"level": "ERROR", "time": "2024-06-15T15:48:17.562706+00:00", "message": "caught zero denominator", "exc_info": "Traceback (most recent call last):\n  File \"/Users/admin/Desktop/projects/quantpylib/examples/example_logger.py\", line 16, in log\n    1/0\n    ~^~\nZeroDivisionError: division by zero", "my exception key": "my exception value"}
```

We also get a ./logs/app.log file with the same content. Here, we see that the key-value pairs in the extra parameter are written as part of the logs.
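The JSON lines above can be produced by any formatter that serializes the record together with its non-standard attributes. The following is a minimal stdlib-only sketch of that idea, not quantpylib's actual JSONFormatter implementation:

```python
import json
import logging
from datetime import datetime, timezone

class MiniJSONFormatter(logging.Formatter):
    """Sketch of a structured JSON formatter (not quantpylib's actual code)."""
    # attribute names present on every LogRecord; anything else came from extra=
    _BUILTIN = set(vars(logging.LogRecord("", 0, "", 0, "", (), None))) | {"message", "asctime"}

    def format(self, record):
        payload = {
            "level": record.levelname,
            "time": datetime.fromtimestamp(record.created, tz=timezone.utc).isoformat(),
            "message": record.getMessage(),
        }
        if record.exc_info:
            payload["exc_info"] = self.formatException(record.exc_info)
        # merge in the key-values passed via extra=
        payload.update({k: v for k, v in vars(record).items() if k not in self._BUILTIN})
        return json.dumps(payload)
```

Attaching it to a handler via handler.setFormatter(MiniJSONFormatter()) yields one JSON object per line, which is the shape log aggregation stacks typically expect.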
Configuring our Logger
We can, of course, configure the logger. Say we want console logs formatted by logging.Formatter, while file logs are formatted by JSONFormatter. Furthermore, in the JSONFormatter, we also want to include more information from the logging.LogRecord: we choose funcName and filename from the possible attributes listed in the Python logging documentation.
```python
    elif args.config == "jf":
        print("using quantpylib logger with json formatter")
        Logger(
            stdout_formatter_cls=logging.Formatter,
            file_formatter_cls=JSONFormatter,
            file_formatter_kwargs={"include_attrs": ["funcName", "filename"]},
            file_handler_cls=logging.FileHandler,
            file_handler_kwargs={},
            **example_configs
        )
```
We run and get the following on the console (formatted by logging.Formatter):

```
using quantpylib logger with json formatter
info log
warning log
caught zero denominator
Traceback (most recent call last):
  File "/Users/admin/Desktop/projects/quantpylib/examples/example_logger.py", line 16, in log
    1/0
    ~^~
ZeroDivisionError: division by zero
```
and the following in the ./logs/app.log file (formatted by JSONFormatter with the funcName and filename attributes):

```
{"level": "INFO", "time": "2024-06-15T16:26:01.515551+00:00", "message": "info log", "filename": "example_logger.py", "funcName": "log", "my info key": "my info value"}
{"level": "WARNING", "time": "2024-06-15T16:26:02.830514+00:00", "message": "warning log", "filename": "example_logger.py", "funcName": "log", "my warning key": "my warning value"}
{"level": "ERROR", "time": "2024-06-15T16:26:02.830820+00:00", "message": "caught zero denominator", "exc_info": "Traceback (most recent call last):\n  File \"/Users/admin/Desktop/projects/quantpylib/examples/example_logger.py\", line 16, in log\n    1/0\n    ~^~\nZeroDivisionError: division by zero", "filename": "example_logger.py", "funcName": "log", "my exception key": "my exception value"}
```

Note that in this configuration we passed file_handler_cls=logging.FileHandler; the default, however, is the BufferedFileHandler, a logging.FileHandler that only flushes logs to file when its buffer is full or on application exit.
If you now uncomment the input('hold') line and run again, you will notice that the logs have not yet been written to file when execution pauses waiting for input. You can, of course, adjust this buffer size.
We can also use the BufferedRotatingFileHandler, so that we can store logs across different backup files; this extends logging.handlers.RotatingFileHandler. Let us try:
```python
    elif args.config == "brfh":
        print("using quantpylib logger with buffered rotating file handler")
        from quantpylib.logger.handlers import BufferedRotatingFileHandler
        Logger(
            stdout_formatter_cls=logging.Formatter,
            file_formatter_cls=JSONFormatter,
            filename="brfh.log",
            logs_dir="./logs/",
            file_handler_cls=BufferedRotatingFileHandler,
            file_handler_kwargs={"maxBytes": 64},
            **example_configs
        )
```
Since this extends RotatingFileHandler, we can set the number of backup files, the maximum-bytes condition for rotation, and so on. It behaves like a combination of the BufferedFileHandler we described and the RotatingFileHandler it extends; the only difference is that rotation is only checked when the buffer is flushed, with the rotate condition evaluated on the last record in the buffer.
We can run it, and since we set maxBytes to only 64, flushing the three log messages gives us

```
{"level": "INFO", "time": "2024-06-15T16:04:58.394926+00:00", "message": "info log", "my info key": "my info value"}
{"level": "WARNING", "time": "2024-06-15T16:05:02.579663+00:00", "message": "warning log", "my warning key": "my warning value"}
```

in brfh.log.1, and the last record

```
{"level": "ERROR", "time": "2024-06-15T16:05:02.579820+00:00", "message": "caught zero denominator", "exc_info": "Traceback (most recent call last):\n  File \"/Users/admin/Desktop/projects/quantpylib/examples/example_logger.py\", line 16, in log\n    1/0\n    ~^~\nZeroDivisionError: division by zero", "my exception key": "my exception value"}
```

in brfh.log.
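For contrast, the stdlib RotatingFileHandler that it extends evaluates the rollover condition on every single record, not just at flush time. A stdlib-only demonstration (independent of quantpylib):

```python
import logging
import logging.handlers
import os
import tempfile

logs_dir = tempfile.mkdtemp()
path = os.path.join(logs_dir, "rot.log")

# rotate whenever the current file would exceed 64 bytes; keep up to 2 backups
handler = logging.handlers.RotatingFileHandler(path, maxBytes=64, backupCount=2)
lg = logging.getLogger("rotation_demo")
lg.setLevel(logging.INFO)
lg.addHandler(handler)

for _ in range(6):
    lg.info("x" * 30)  # each record is 31 bytes including the newline
handler.close()

# rot.log plus rotated backups rot.log.1 and rot.log.2
print(sorted(os.listdir(logs_dir)))
```

Because the check happens per record, two 31-byte records fit under the 64-byte limit and the third triggers a rollover, so six records produce two backups plus the live file.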
Documentation
Logger
A configurable singleton logger class that sets up logging for the application.
This logger allows for the registration of custom handlers and configuration of console (stdout) and file handlers. It supports various configuration options including log levels, formatters, and buffer sizes for file handling.
Attributes:

Name | Type | Description |
---|---|---|
_logger | Logger | The singleton logger instance. |
```python
__new__(name='root', register_handlers=[], stdout_register=True, stdout_level=logging.WARNING, stdout_formatter_cls=JSONFormatter, stdout_formatter_kwargs={}, file_register=True, filename='app.log', logs_dir='./logs/', file_level=logging.WARNING, file_formatter_cls=JSONFormatter, file_formatter_kwargs={}, file_handler_cls=BufferedFileHandler, file_handler_kwargs={'buffer_size': 64}, *args, **kwargs)
```

Creates and configures the logger instance.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
name | str | The name of the logger. | 'root' |
register_handlers | list | A list of additional logging handlers to register. | [] |
stdout_register | bool | Whether to register a console (stdout) handler. | True |
stdout_level | int | The logging level for the console handler. | logging.WARNING |
stdout_formatter_cls | type | The formatter class for the console handler. | JSONFormatter |
stdout_formatter_kwargs | dict | Keyword arguments for the console handler formatter. | {} |
file_register | bool | Whether to register a file handler. | True |
filename | str | The name of the log file. | 'app.log' |
logs_dir | str | The directory where log files are stored. | './logs/' |
file_level | int | The logging level for the file handler. | logging.WARNING |
file_formatter_cls | type | The formatter class for the file handler. | JSONFormatter |
file_formatter_kwargs | dict | Keyword arguments for the file handler formatter. | {} |
file_handler_cls | type | The handler class for file logging. | BufferedFileHandler |
file_handler_kwargs | dict | Keyword arguments for the file handler. | {'buffer_size': 64} |
*args | | Additional positional arguments passed to the logging configuration. | () |
**kwargs | | Additional keyword arguments passed to the logging configuration. | {} |
Returns:

Type | Description |
---|---|
logging.Logger | The configured singleton logger instance. |
```python
app(message, *args, **kwargs)
```

Logs a message with the custom APP level.
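The numeric value of quantpylib's APP level is not documented here, but custom levels in stdlib logging are generally registered along these lines. The value 25 and the helper below are illustrative assumptions, not quantpylib's actual implementation:

```python
import logging

APP = 25  # assumed: a value between INFO (20) and WARNING (30); illustrative only
logging.addLevelName(APP, "APP")

def app(self, message, *args, **kwargs):
    """Log 'message' at the custom APP level (sketch, not quantpylib's code)."""
    if self.isEnabledFor(APP):
        self._log(APP, message, args, **kwargs)

logging.Logger.app = app
```

After this, any logger obtained via logging.getLogger(...) exposes .app(...), and the emitted records carry the levelname "APP", so level-based filtering and formatting treat them like any built-in level.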