JsonLogging format #680
This is something that I do in all of my projects. Currently, I am running gunicorn with the uvicorn worker and passing a custom logging config file like so: `$ gunicorn example:app -w 2 -k uvicorn.workers.UvicornWorker --log-config logging_config.conf`

Note that for high throughput, the uvicorn docs mention turning off access logging, which I have not done here.
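For reference, a minimal sketch of what a JSON-oriented `logging_config.conf` could contain; this is an assumption (the original poster's file is not shown) and it relies on the `python-json-logger` package being installed:

```ini
; Hypothetical fileConfig-style config for gunicorn's --log-config
; (not the original poster's file). Requires python-json-logger.
[loggers]
keys=root, gunicorn.error, gunicorn.access

[handlers]
keys=console

[formatters]
keys=json

[logger_root]
level=INFO
handlers=console

[logger_gunicorn.error]
level=INFO
handlers=console
propagate=0
qualname=gunicorn.error

[logger_gunicorn.access]
level=INFO
handlers=console
propagate=0
qualname=gunicorn.access

[handler_console]
class=StreamHandler
formatter=json
args=(sys.stdout,)

[formatter_json]
; Emits each record as a single JSON object per line.
class=pythonjsonlogger.jsonlogger.JsonFormatter
format=%(asctime)s %(levelname)s %(name)s %(message)s
```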
Thanks a lot @erewok!
@erewok, about the last comment ("the uvicorn docs mention turning off access logging"): thanks again!
I myself am running uvicorn programmatically and serializing logs as JSON with loguru. Everything is contained in one Python script:

```python
import os
import logging
import sys

from uvicorn import Config, Server
from loguru import logger

LOG_LEVEL = logging.getLevelName(os.environ.get("LOG_LEVEL", "DEBUG"))
JSON_LOGS = True if os.environ.get("JSON_LOGS", "0") == "1" else False


class InterceptHandler(logging.Handler):
    def emit(self, record):
        # Get corresponding Loguru level if it exists
        try:
            level = logger.level(record.levelname).name
        except ValueError:
            level = record.levelno

        # Find caller from where originated the logged message
        frame, depth = logging.currentframe(), 2
        while frame.f_code.co_filename == logging.__file__:
            frame = frame.f_back
            depth += 1

        logger.opt(depth=depth, exception=record.exc_info).log(level, record.getMessage())


def setup_logging():
    # intercept everything at the root logger
    logging.root.handlers = [InterceptHandler()]
    logging.root.setLevel(LOG_LEVEL)

    # remove every other logger's handlers
    # and propagate to root logger
    for name in logging.root.manager.loggerDict.keys():
        logging.getLogger(name).handlers = []
        logging.getLogger(name).propagate = True

    # configure loguru
    logger.configure(handlers=[{"sink": sys.stdout, "serialize": JSON_LOGS}])


if __name__ == '__main__':
    server = Server(Config("my_project.main:app", log_level=LOG_LEVEL))

    # setup logging last, to make sure no library overwrites it
    # (they shouldn't, but it happens)
    setup_logging()

    server.run()
```

Export a `JSON_LOGS=1` environment variable to get JSON-serialized output. It's working well in my Linux VM; unfortunately, logs are still duplicated (one line as JSON, one line as text) on OpenShift... Still not sure why.

EDIT: ah, I just needed to upgrade my uvicorn version. It requires at least 0.11.6.
Hello all, since it looks like this is mostly a logging configuration question and some solutions were provided above, I'm going to close this off for housekeeping purposes. Thanks!
Reviving this thread, as the answer above keeps the "color_message" field, which includes special characters (see the end of the log below):

Any recommendation on how to avoid this? Maybe by subclassing the formatter?
@armand-sauzay That's exactly what I did, and it seems to work. Snippet below if it helps you:

```python
from pythonjsonlogger import jsonlogger


class MyCustomFormatter(jsonlogger.JsonFormatter):
    def __init__(self, *args, **kwargs):
        # Treat "color_message" as a reserved attribute so it is
        # dropped from the JSON output.
        reserved_attrs = ("color_message",) + jsonlogger.RESERVED_ATTRS
        super().__init__(
            reserved_attrs=reserved_attrs,
            *args,
            **kwargs,
        )
```
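Not part of the original replies, but here is a sketch of one way such a formatter might be wired into uvicorn, by passing a dictConfig-style dictionary as `log_config`; the import paths `my_project.logging_utils.MyCustomFormatter` and `my_project.main:app` are hypothetical placeholders.

```python
# Sketch: route uvicorn's loggers through a python-json-logger based
# formatter via a dictConfig passed as log_config. Module and app paths
# below are placeholders, not names from this thread.
import uvicorn

LOG_CONFIG = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "json": {
            "()": "my_project.logging_utils.MyCustomFormatter",  # hypothetical import path
            "fmt": "%(asctime)s %(levelname)s %(name)s %(message)s",
        },
    },
    "handlers": {
        "default": {
            "class": "logging.StreamHandler",
            "formatter": "json",
            "stream": "ext://sys.stdout",
        },
    },
    "loggers": {
        # Send uvicorn's own loggers through the JSON handler.
        "uvicorn": {"handlers": ["default"], "level": "INFO", "propagate": False},
        "uvicorn.error": {"handlers": ["default"], "level": "INFO", "propagate": False},
        "uvicorn.access": {"handlers": ["default"], "level": "INFO", "propagate": False},
    },
}

if __name__ == "__main__":
    uvicorn.run("my_project.main:app", log_config=LOG_CONFIG)
```

With `propagate` set to False, records handled by the uvicorn loggers are not re-emitted by any root handler, which avoids duplicate lines when a root handler is also configured.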
Hi folks, would it be possible to have a JSON format logger? I'm using Elasticsearch to ingest the logs, and JSON is very intuitive.

There's a logger, but I don't know how to use it with uvicorn: