Recently we received an interesting project from one of our long-term customers. They run more than 100 servers for their business applications, which generate a large volume of logs every day, all stored on common storage servers.
They wanted a system to analyze and visualize this log data. The logs come from different kinds of systems. Quilltez went through all the sample log files provided by the client and noticed that some contained well-formatted log messages, some contained unformatted messages, and some contained CSV data.
We decided to use the ELK Stack for this problem. The ELK Stack consists of Elasticsearch, Logstash, and Kibana, plus Filebeat (which, despite its role, is not part of the "ELK" abbreviation). Filebeat closely monitors the configured log files and, whenever changes happen, ships them to Logstash. In Logstash we can parse the unformatted data and map the extracted fields into custom variables. Elasticsearch then indexes all the data formatted by Logstash, providing fast search functionality. Finally, Kibana visualizes the data coming from Elasticsearch; it serves as an analytics tool for end users.
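As a rough sketch of the shipping side of this pipeline, a minimal Filebeat configuration might look like the following. The paths and the Logstash host are illustrative assumptions, not the client's actual setup:

```yaml
filebeat.inputs:
  - type: log
    # Hypothetical path; point this at the real log directories
    paths:
      - /var/log/app/*.log

output.logstash:
  # Assumes Logstash is listening on its default beats port 5044
  hosts: ["localhost:5044"]
```

With this in place, Filebeat tails each matching file and forwards new lines to Logstash as they are written.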
Formatting the data and mapping it correctly to the index is the main part of this functionality. Logstash uses a grok filter for this. A grok filter is a collection of named patterns that extract the needed fields from text, working much like regular expressions. This free online tool helps a lot in finding and testing the correct grok filter patterns.
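To illustrate, a Logstash filter along these lines could parse a typical unformatted line such as `2024-01-15T10:32:01 ERROR disk quota exceeded` into named fields. The field names here are hypothetical examples, not the client's actual schema:

```conf
filter {
  grok {
    # Built-in grok patterns: timestamp, log level, and the rest of the line
    match => {
      "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:log_message}"
    }
  }
}
```

After this filter runs, Elasticsearch receives structured fields (`timestamp`, `level`, `log_message`) instead of one opaque string, which is what makes the quick-search and Kibana visualizations possible. For the CSV files, Logstash's csv filter plays a similar role.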
Quilltez did great work and delivered the project within the deadline, with no issues reported back. The client was very happy with the result. It is a great milestone for Quilltez, and it reflects our QA efforts.
If you have a similar project, let’s talk. We would be interested in looking into it.