Log Everything as JSON

Logging and monitoring are like Tony Stark and his Iron Man suit: they belong together. Each complements the other, and you get the most value when you use them side by side.

For many years, logs have been an essential part of troubleshooting application and infrastructure performance. But over time we have realized that logs are not only meant for troubleshooting; they can also drive business dashboards and performance analysis.

So logging application data to a file is great, but we need more.

Why is JSON logging the best choice?

To understand the power of JSON logging, consider this exchange between Anuj (a System Engineer) and Kartik (a Business Analyst): Kartik asks Anuj for a business dashboard built on top of the application logs, and Anuj obliges by writing a custom parser for the application's log format.

A few days later, Kartik complains that the web interface is broken. Anuj scratches his head, looks at the logs, and realizes that a developer has added an extra field to the log lines, which broke his custom parser.

I am sure many of us have faced a similar situation.

In this case, if the developer had designed the application to write logs as JSON, creating a parser would have been a piece of cake for Anuj: he could simply look up fields by their JSON keys, and it wouldn't matter how many new fields were added to the log lines.

The biggest benefit of logging in JSON is that the format is structured. This makes it possible to analyze application logs the way you would any Big Data set: the log is not just readable text but effectively a database that can be queried on each and every field. And every programming language can parse JSON.
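For example, a generic parser needs nothing more than a standard JSON decoder. Here is a minimal sketch in Go, with an illustrative log line:

package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	// A JSON log line parses into a generic map; there is no custom,
	// position-based parser to break when new fields are added.
	line := `{"level":"info","msg":"order placed","employee_name":"Anuj","time":"2020-01-01T10:00:00Z"}`

	var fields map[string]interface{}
	if err := json.Unmarshal([]byte(line), &fields); err != nil {
		panic(err)
	}
	fmt.Println(fields["level"], fields["msg"], fields["employee_name"])
}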

Magic with JSON logging

Recently, we created a sample Golang application to get hands-on experience with the build, test, and deployment phases of Golang applications. While writing this application, we incorporated the functionality to write logs as JSON.
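A minimal sketch of how that can look with logrus (the employee_name and employee_city fields are illustrative, not taken verbatim from the application):

package main

import (
	log "github.com/sirupsen/logrus"
)

func main() {
	// Emit every log line as a JSON object instead of plain text.
	log.SetFormatter(&log.JSONFormatter{})

	// Structured fields become JSON keys automatically.
	log.WithFields(log.Fields{
		"employee_name": "Anuj",
		"employee_city": "Delhi",
	}).Info("employee record processed")
}

Each call then produces a self-describing log line such as:

{"employee_city":"Delhi","employee_name":"Anuj","level":"info","msg":"employee record processed","time":"2020-01-01T10:00:00+05:30"}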

And while integrating ELK for log analysis, the only parsing configuration we had to add in Logstash is:

filter {
    json {
        # Parse the whole "message" field as JSON; every key in the
        # log line automatically becomes its own searchable field.
        source => "message"
    }
}

After this, we don't require any further parsing, and we can add as many fields to the log lines as we like.

As you can see, all the fields, such as employee name and employee city, are available in Kibana, and we did not have to add any complex parsing in Logstash or any other tool. We can also build a beautiful business dashboard on top of this data.

Application Repository Link:
https://github.com/opstree/Opstree-Go-WebApp

Conclusion

Migrating from text logging to JSON logging will not take long, as JSON-capable logging libraries are available for most programming languages. I am sure JSON logging will add flexibility to your current logging system.
If your organization is using a log management platform like Splunk or ELK, JSON logging makes an excellent companion to it.

Some popular logging libraries that support JSON output are:

Golang: https://github.com/sirupsen/logrus
Python: https://github.com/thangbn/json-logging-python
Java: https://howtodoinjava.com/log4j2/log4j-2-json-configuration-example/
PHP: https://github.com/nekonomokochan/php-json-logger

I hope we now have a good understanding of JSON logging. Now it's time to choose your logging format wisely.


That's all I have. Thanks for reading; I'd really appreciate your feedback, so please leave a comment below if you have any suggestions or queries.

Cheers till next time!!


Logstash Timestamp

Introduction

A few days back I ran into a simple but painful issue. I am using ELK to parse my application logs and generate some meaningful views, and I noticed that Logstash was inserting my logs into Elasticsearch with the current timestamp instead of the actual time of log generation.
This made a mess of generating graphs with the correct time values in Kibana.
So I dug around and found a way to overcome this: I made some changes to my Logstash configuration to replace Logstash's default timestamp with the actual timestamp of my logs.

Logstash Filter

Add the following snippet to the filter section of Logstash's configuration file. It makes Logstash insert logs into Elasticsearch with the actual timestamp from your logs instead of Logstash's own (current) timestamp.

date {
  locale => "en"
  timezone => "GMT"
  # "MM" matches the month (lowercase "mm" would mean minutes);
  # the trailing "+0000" is matched as literal text.
  match => [ "timestamp", "yyyy-MM-dd HH:mm:ss +0000" ]
}
In my case, the timezone of my logs was GMT. Replace the pattern "yyyy-MM-dd HH:mm:ss +0000" with the date pattern that matches the actual timestamps in your logs.
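As an aside, if your application writes JSON logs as described in the previous section, the json filter has already turned the log's time field into an event field, and logrus's default timestamps are ISO 8601, so a sketch of the equivalent filter (assuming the field is named time) is simply:

date {
  # "ISO8601" is a built-in shorthand for standard timestamps
  # like "2020-01-01T10:00:00+05:30"
  match => [ "time", "ISO8601" ]
}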

Description

The date plugin overrides Logstash's own timestamp with the timestamp from your logs, so Kibana now shows each log at its correct time.
(Note: Kibana adjusts UTC time to your browser's timezone.)

Stunnel: A Proxy to Ship Logs over SSL

Introduction


A few days ago I got a task to create an SSL connection between the Logstash redis plugin and Azure Redis. We ship logs from several data centers to Azure Redis, so the logs must travel over an SSL connection. However, the Logstash redis plugin has no provision for SSL, so it cannot connect to Azure Redis directly. To resolve this, we set up stunnel as a proxy in front of the Logstash redis plugin: stunnel creates the SSL connection to Azure Redis and exposes a non-SSL endpoint for the Logstash redis plugin.

Azure Redis provides two types of connections, SSL on port 6380 and non-SSL on port 6379, and also provides a primary and a secondary key (password).

Installation

Install stunnel on Ubuntu:

$ sudo apt-get install stunnel  

Configuration

Create a configuration file for stunnel at /etc/stunnel/stunnel.conf and put the following lines into it:

; Run stunnel as root
setuid = root
setgid = root
pid = /var/run/stunnel-azureredis.pid

; Verbose logging while setting things up
debug = 7
output = /var/log/stunnel4/azureredis.log

; Disable legacy SSL protocols
options = NO_SSLv2
options = NO_SSLv3

[azureredis]
; Local non-SSL endpoint for the logstash redis plugin
accept = 127.0.0.1:6379
; Put your Azure Redis hostname in front of the colon
connect = :6380
client = yes
TIMEOUTidle = 30


Restart stunnel4:

$ service stunnel4 restart

Install redis-cli to test the connection, then ping Azure Redis through the tunnel; it should reply with PONG:

$ redis-cli -a <primary-or-secondary-key>
127.0.0.1:6379> ping
PONG

Now the Logstash redis plugin can connect to Redis locally on 127.0.0.1:6379, and stunnel will carry the traffic to Azure Redis over SSL.
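For completeness, here is a minimal sketch of a Logstash redis output pointed at the local stunnel endpoint (the data_type, key, and password values are illustrative assumptions, not taken from the actual setup):

output {
  redis {
    # stunnel's local non-SSL endpoint; traffic is forwarded
    # to Azure Redis over SSL
    host => "127.0.0.1"
    port => 6379
    data_type => "list"
    key => "logstash"
    password => "<primary-or-secondary-key>"
  }
}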