Logging¶
We use structlog
to format the logs generated by the application. We redirect the logs to STDOUT
as JSON. You need to use an external system to collect and aggregate the logs.
Lego contains a middleware that binds request information to each log message. There is no need to add things like the request path and authenticated user to the log message; this is done automatically.
Import and use the logger¶
from structlog import get_logger

log = get_logger()
# The first argument is the event name; extra keyword arguments become structured fields.
log.warning('important_message', message_recipient=user.username)
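The middleware binds the request context for you, but you can also bind extra context yourself with structlog's bind(). A minimal sketch, assuming illustrative event and field names that are not part of Lego's API:

from structlog import get_logger

log = get_logger()
# bind() returns a new logger that includes the given key/value pairs
# in every subsequent log entry.
log = log.bind(event_id=42)
log.info('registration_created', user='illustrative_username')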
Celery and structlog¶
If you also want to add the request context to log messages in Celery tasks, you have to use a
special task base and bind the task. Also remember to call the setup_logger
function.
from lego.utils.tasks import AbakusTask

@celery_app.task(bind=True, base=AbakusTask)
def task_name(self, normal_argument, logger_context=None):
    self.setup_logger(logger_context)
    # other work...
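When calling such a task from application code you do not pass logger_context yourself; the AbakusTask base class supplies it (see below). A hedged usage sketch, assuming the task above is registered with the application's celery_app and 'some_value' is an illustrative argument:

# The caller only passes its own arguments; logger_context is injected
# by the AbakusTask base class when the message is sent.
task_name.delay('some_value')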
- class lego.utils.tasks.AbakusTask¶
Bases: Task
This base task supplies the logger_context to the underlying worker.
@celery_app.task(bind=True, base=AbakusTask)
def task_name(self, logger_context=None):
    self.setup_logger(logger_context)
    # other work...
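To illustrate what "supplies the logger_context to the underlying worker" means, here is a rough sketch of how a base task could pass the caller's logging context along. This is not the actual lego implementation; ContextPassingTask and current_logger_context are hypothetical names used only for the example.

import celery
from structlog import get_logger

log = get_logger()


def current_logger_context():
    # Hypothetical helper standing in for whatever the request middleware
    # bound for this request; returns plain key/value pairs.
    return {'request_id': 'illustrative-request-id'}


class ContextPassingTask(celery.Task):
    # Sketch only: inject the caller's logging context as a keyword
    # argument so the worker can re-bind it.
    def apply_async(self, args=None, kwargs=None, *arguments, **keyword_arguments):
        kwargs = dict(kwargs or {})
        kwargs.setdefault('logger_context', current_logger_context())
        return super().apply_async(args, kwargs, *arguments, **keyword_arguments)

    def setup_logger(self, logger_context):
        # Re-bind the received context on the worker side.
        self.logger = log.new(**(logger_context or {}))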
- apply_async(args=None, kwargs=None, *arguments, **keyword_arguments)¶
Apply tasks asynchronously by sending a message.
- Arguments:
args (Tuple): The positional arguments to pass on to the task.
kwargs (Dict): The keyword arguments to pass on to the task.
- countdown (float): Number of seconds into the future that the
task should execute. Defaults to immediate execution.
- eta (datetime.datetime): Absolute time and date of when the task should be executed. May not be specified if countdown is also supplied.
- expires (float, datetime.datetime): Datetime or number of seconds in the future when the task should expire. The task won't be executed after the expiration time.
- shadow (str): Override task name used in logs/monitoring. Default is retrieved from shadow_name().
- connection (kombu.Connection): Re-use existing broker connection instead of acquiring one from the connection pool.
- retry (bool): If enabled, sending of the task message will be retried in the event of connection loss or failure. Default is taken from the task_publish_retry setting. Note that you need to handle the producer/connection manually for this to work.
- retry_policy (Mapping): Override the retry policy used. See the task_publish_retry_policy setting.
- time_limit (int): If set, overrides the default time limit.
- soft_time_limit (int): If set, overrides the default soft
time limit.
- queue (str, kombu.Queue): The queue to route the task to.
This must be a key present in the task_queues setting, or task_create_missing_queues must be enabled. See the routing guide for more information.
- exchange (str, kombu.Exchange): Named custom exchange to send the task to. Usually not used in combination with the queue argument.
- routing_key (str): Custom routing key used to route the task to a worker server. If used in combination with a queue argument, it is only used to specify custom routing keys to topic exchanges.
- priority (int): The task priority, a number between 0 and 9. Defaults to the priority attribute.
- serializer (str): Serialization method to use. Can be pickle, json, yaml, msgpack or any custom serialization method that's been registered with kombu.serialization.registry. Defaults to the serializer attribute.
- compression (str): Optional compression method to use. Can be one of zlib, bzip2, or any custom compression methods registered with kombu.compression.register(). Defaults to the task_compression setting.
- link (Signature): A single, or a list of, task signatures
to apply if the task returns successfully.
- link_error (Signature): A single, or a list of task signatures
to apply if an error occurs while executing the task.
- producer (kombu.Producer): custom producer to use when publishing
the task.
- add_to_parent (bool): If set to True (default) and the task is applied while executing another task, then the result will be appended to the parent task's request.children attribute. Trailing can also be disabled by default using the trail attribute.
- ignore_result (bool): If set to False (default) the result of a task will be stored in the backend. If set to True the result will not be stored. This can also be set using the ignore_result option in the app.task decorator.
- publisher (kombu.Producer): Deprecated alias to producer.
- headers (Dict): Message headers to be included in the message.
- Returns:
celery.result.AsyncResult: Promise of future evaluation.
- Raises:
- TypeError: If not enough arguments are passed, or too many arguments are passed. Note that signature checks may be disabled by specifying @task(typing=False).
- kombu.exceptions.OperationalError: If a connection to the transport cannot be made, or if the connection is lost.
- Note:
Also supports all keyword arguments supported by kombu.Producer.publish().
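For orientation, a hedged example of calling apply_async on the task defined earlier with a few of the options documented above. The queue name 'default' is an assumption and must exist in task_queues.

result = task_name.apply_async(
    args=('some_value',),    # positional arguments for the task
    countdown=30,            # execute roughly 30 seconds from now
    queue='default',         # assumed queue name; must be routable
    priority=5,              # task priority between 0 and 9
)
# apply_async returns a celery.result.AsyncResult promise; with
# ignore_result enabled (see below) only the task id is meaningful here.
print(result.id)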
- ignore_result = True¶
If enabled the worker won't store task state and return values for this task. Defaults to the task_ignore_result setting.
- priority = None¶
Default task priority.
- rate_limit = None¶
Rate limit for this task type. Examples: None (no rate limit), '100/s' (hundred tasks a second), '100/m' (hundred tasks a minute), '100/h' (hundred tasks an hour).
- reject_on_worker_lost = None¶
Even if acks_late is enabled, the worker will acknowledge tasks when the worker process executing them abruptly exits or is signaled (e.g., KILL/INT, etc.). Setting this to true allows the message to be re-queued instead, so that the task will execute again by the same worker, or another worker.
Warning: Enabling this can cause message loops; make sure you know what you’re doing.
- request_stack = <celery.utils.threads._LocalStack object>¶
Task request stack, the current request will be the topmost.
- serializer = 'json'¶
The name of a serializer that is registered with kombu.serialization.registry. Default is 'json'.
- store_errors_even_if_ignored = False¶
When enabled errors will be stored even if the task is otherwise configured to ignore results.
- track_started = True¶
If enabled the task will report its status as ‘started’ when the task is executed by a worker. Disabled by default as the normal behavior is to not report that level of granularity. Tasks are either pending, finished, or waiting to be retried.
Having a ‘started’ status can be useful for when there are long running tasks and there’s a need to report what task is currently running.
The application default can be overridden using the task_track_started setting.
- typing = True¶
Enable argument checking. You can set this to false if you don't want the signature to be checked when calling the task. Defaults to app.strict_typing.
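The attributes above can be overridden per task via the task decorator. A hedged sketch combining a few of them; the specific values and the task name are illustrative:

@celery_app.task(
    bind=True,
    base=AbakusTask,
    ignore_result=False,   # store results in the backend for this task
    rate_limit='100/m',    # at most a hundred tasks a minute
    track_started=False,   # skip the 'started' state for this task
)
def rate_limited_task(self, logger_context=None):
    self.setup_logger(logger_context)
    # other work...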