
Logging Service Core Module

The logging service is for all those times you wish you could tell what is going on inside your software, whether you are debugging errors or gathering usage data.

Logging errors is (or should be) standard practice in software engineering. High test coverage helps to ensure that processes will work as expected, but it does not at all cover those unexpected and hard-to-find problems. The challenge of such unexpected errors is that they tend to happen long after the code has been deployed and engineers have forgotten the details of the code. Error logs can serve as a trigger for alerts to the development team as well as give them quick insight into exactly where to start troubleshooting.

Logging can also be handy for collecting data about how a page or a service is being used. This can be a part of user experience development, A/B testing, collecting data download stats, or collecting information about which tools are seeing the most use. Usage statistics can be collected from either the front-end or the back-end code.

The Logging Service

To view current logs and get a feel for what logging does, navigate to the BioConnect Elastic cluster and explore what's there.

http://34.148.52.185:5601

From the Kibana home page, click to expand the navigation menu at the top left.

From the navigation menu, click on "Discover".

Kibana discover

Select an index to view from the list on the left side of the screen. An index is like a channel: a stream of log messages that all pertain to a common context, such as a single application, or even a particular part of an application like the login view, sending all of its messages to one place. When setting up a new logging index, or adding new data to an existing channel, be sure to respect this need for consistent context and avoid adding messages that are not relevant to that channel. Steps for setting up a new index are given in a later section.

Kibana select index

Once you have chosen an index, select a time frame for exploration.

Kibana select time frame

To create a new specialized logging channel, an index pattern needs to be created. To get started, from the home navigation menu, click on "Overview" in the "Analytics" section, then click on the "Manage" button at the top right. Click on "Index Patterns" in the left menu, and finally, click on "Create index pattern".

Kibana create index pattern
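For example, to match the "my-awesome-api" index used in the Python examples below, you would enter the index pattern "my-awesome-api*" (the trailing wildcard lets the pattern match any rolling indices that share the prefix).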

Logging in your code

Generating messages to send to the logging service is a snap with the BioConnect logging library. Here we describe the logging process for applications written in Python. A later installment will detail the process for logging in JavaScript code running in a browser.

Library installation

The BioConnect logging library is available from PyPI, the Python packaging index.

https://test.pypi.org/project/bioconnect-lib

Installation can be done in the usual ways: by adding the repository to your Poetry pyproject.toml file, or to requirements.txt for pip. With pip from the command line, do:

pip install -i https://test.pypi.org/simple/ bioconnect-lib
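Equivalently, you can record the dependency in requirements.txt and point pip at the TestPyPI index (a minimal sketch; pin a version as appropriate):

# requirements.txt
--index-url https://test.pypi.org/simple/
bioconnect-lib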

Configure logging in a Python application

Using the BioConnect logging library in your Python application requires that you import both the standard logging package and the BioconnectLogHandler.

import logging

from bioconnect_lib.log.log_handler import BioconnectLogHandler

# Attach the BioConnect handler to the root logger; loggers created with
# logging.getLogger(__name__) in any module will inherit it.
logging.basicConfig(level="INFO", handlers=[BioconnectLogHandler()])

Then use logging as you normally would. For example:

logger = logging.getLogger(__name__)

# log as usual
logger.info("logging info")

# log an error with a traceback; `authorizer` here stands in for your own code
try:
    response = authorizer.is_access_allowed()
except Exception:
    logger.error("Failed to authorize", exc_info=True)

# add more custom information to the log
more_info = {
    "user_id": "a_user",
    "sample_name": "a_sample",
    "project_id": "a_project",
}
logger.info("log_more_user_info", extra={"tags": more_info})

Configure logging in a Django application

To use the BioConnect log handler in a Django application, the only file that needs to change is settings.py. First, import the handler:

from bioconnect_lib.log.log_handler import BioconnectLogHandler

Then, add the handler to your logging configuration:

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'simple': {
            'format': '{levelname} {asctime} {name}: {message}',
            'style': '{',
        }
    },    
    'handlers': {
        'bioconnect-handler': {
            'class': 'bioconnect_lib.log.log_handler.BioconnectLogHandler',
            'formatter': 'simple'
        }
    },
    'loggers': {
        '': {
            'handlers': ['bioconnect-handler'], 
            'level': 'INFO'
        },
    },
}

Use logging as normal, and the messages will automagically be transmitted to the logging service backend (Elastic) and be viewable in the client application (Kibana).

logger = logging.getLogger(__name__)

# regular log
logger.info("logging info")

# add more info from Django request
logger.info("logging info with more tag info", extra=BioconnectLogHandler.djang_request_to_log_extra(request))

Logging to a specific index and other configurations

The index that log messages are sent to (a.k.a. the channel) is configured using environment variables, via the variable "LOGGING_ELASTIC_INDEX_NAME". For example, if the environment variable in your application is set to

LOGGING_ELASTIC_INDEX_NAME="my-awesome-api"

then select the my-awesome-api index in the Kibana "Discover" section to view the latest log messages.

Additional configurable environment variables are listed below.

LOGGING_ELASTIC_INDEX_NAME (required, no default)
The logging channel, corresponding to an index name in Elasticsearch; a unique name identifying the logging. It must be lower case and may contain only alphanumeric characters plus "-" and "_". Multiple applications can share the same logging channel.

LOGGING_SAVE_TO_ELASTIC (required, default "false")
A flag controlling whether logging is saved to Elasticsearch: "true" to save, "false" not to save.

LOGGING_TO_ES_MODULES (required, no default)
Fine-tunes which Python modules are saved to Elasticsearch. Prefixes are supported: the value "bioconnect.apps.data_package" covers that app only, while "bioconnect.apps" includes all of its sub-apps ("bioconnect.apps.data_package", "bioconnect.apps.metadata_repository", and so on), i.e. it behaves like "bioconnect.apps*".

LOGGING_TYPE (optional, no default)
An optional field to better organize the logging within a logging channel. Because a channel can be written to by more than one application, LOGGING_TYPE can be used to identify which application is writing.
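For example, a typical configuration for a single API might be (values are illustrative):

LOGGING_ELASTIC_INDEX_NAME="my-awesome-api"
LOGGING_SAVE_TO_ELASTIC="true"
LOGGING_TO_ES_MODULES="bioconnect.apps"
LOGGING_TYPE="my-awesome-api"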

Logging in an Angular App

To log data to a server from an Angular UI application, "ngx-logger" has been tested and is easy to configure: https://github.com/dbfannin/ngx-logger

Configuration

The only configuration needed is to set “serverLoggingUrl” to point to our Elasticsearch instance:

  • http://35.231.253.113:9200/logs/xxxxxx

where “xxxxxx” is a unique string identifying the UI application; this becomes the Elasticsearch index name. In the demo app, the unique string is “ui-demo-app”, and the full URL is:

  • http://35.231.253.113:9200/logs/ui-demo-app

import { NgModule } from '@angular/core';
import { HttpClientModule } from '@angular/common/http';
import { LoggerModule, NgxLoggerLevel } from 'ngx-logger';

import { AppComponent } from './app.component';

@NgModule({
  declarations: [AppComponent, ...],
  imports:
  [
    // HttpClientModule is only needed if you want to log on server or if you want to inspect sourcemaps
    HttpClientModule,
    LoggerModule.forRoot({
      serverLoggingUrl: 'http://35.231.253.113:9200/logs/xxxxxx',
      level: NgxLoggerLevel.DEBUG,
      serverLogLevel: NgxLoggerLevel.ERROR
    }),
    ...
  ],
  bootstrap: [AppComponent]
})
export class AppModule {
}

View the Logging

  1. Open browser: http://34.148.52.185:5601
  2. Select “Discover” by expanding the top left navigation menu
  3. Click on the down arrow to select the “log*” index
  4. Set the time frame to the last 15 minutes

See the sections above for detailed screenshots.