Module Description
Append a field to your logs that you can ingest directly into Elasticsearch, with no dissect or parsing magic.

Usage: In your services file, append it as a Monolog processor. For example, if your service parameters look like this:

parameters:
  monolog.channel_handlers:
    default: ['file']
  monolog.processors: ['message_placeholder', 'current_user', 'request_uri', 'ip', 'referer']

Change it to this:

parameters:
  monolog.channel_handlers:
    default: ['file']
  monolog.processors: ['message_placeholder', 'current_user', 'request_uri', 'ip', 'referer', 'elasticsearch_date']

This will give you an additional field called extra.elasticsearch_date. If you are sending this field directly to Elasticsearch from Filebeat, you can use it like this (example):

processors:
  - timestamp:
      ignore_missing: true
      ignore_failure: true
      field: extra.elasticsearch_date
      timezone: "Europe/Oslo"
      layouts:
        - '2006-01-02T15:04:05Z'
        - '2020-06-02T13:20:50.516Z'
      test:
        - '2019-06-22T16:33:51Z'
        - '2020-06-02T13:20:50.516Z'
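For readers curious what such a processor looks like under the hood, the sketch below is an illustration only, not the module's actual source: it assumes Monolog 2's array-based record format, and the namespace and class name are placeholders. It simply writes a UTC timestamp matching the '2006-01-02T15:04:05Z' layout from the Filebeat example into extra.elasticsearch_date.

<?php

namespace Drupal\monolog_elasticsearch_date_processor\Logger\Processor;

use Monolog\Processor\ProcessorInterface;

/**
 * Illustrative sketch: adds an Elasticsearch-friendly date to log records.
 *
 * Hypothetical class and namespace; the shipped module may differ.
 */
class ElasticsearchDateProcessor implements ProcessorInterface {

  /**
   * Adds extra.elasticsearch_date (Monolog 2 array-based records).
   */
  public function __invoke(array $record) {
    // UTC, formatted to match the '2006-01-02T15:04:05Z' Filebeat layout.
    $record['extra']['elasticsearch_date'] = gmdate('Y-m-d\TH:i:s\Z');
    return $record;
  }

}

With a processor like this registered (the 'elasticsearch_date' entry in monolog.processors above), every record shipped by Filebeat carries a machine-parseable date without any dissect or grok step.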
Project Usage
64
Security Covered
Covered By Security Advisory
Version Available
Production
Module Summary
This module appends a field to your logs for direct ingestion into Elasticsearch, without the need for dissect or parsing magic.
Data Name
monolog_elasticsearch_date_processor
