Analyse Large Log Files Using ELK

Ahmed Musaad

Elasticsearch, Logstash, and Kibana (usually abbreviated as ELK) are three powerful open-source tools that are widely used for log collection, indexing, and analysis across all sorts of use cases. It's been a while since I last spun up an ELK instance, so when I recently had to analyse some logs, I decided to give it another go. This short blog post outlines how to spin up a local instance and load your data, and offers some pointers on analysing Okta logs using KQL and Sigma rules.

Spin A Local ELK Instance

You only need to do this if you don't already have a working ELK instance. I use the docker-elk project to spin up local instances whenever I need one.

  • Install Docker and Docker Compose.
  • Clone the repo: git clone https://github.com/deviantony/docker-elk.git
  • Switch to the docker-elk directory.
  • Run docker-compose up (the full sequence is shown below).
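
Put together, the whole setup is just a few commands. A minimal sketch, assuming Docker and Docker Compose are installed (docker-elk ships with default credentials of elastic / changeme, which you should change for anything beyond a throwaway local instance):

    # Clone the project and start the stack in the background
    git clone https://github.com/deviantony/docker-elk.git
    cd docker-elk
    docker-compose up -d

    # Give the stack a minute, then confirm Elasticsearch is up
    curl -u elastic:changeme http://localhost:9200

    # Kibana will be available at http://localhost:5601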

Increase File Size Limit

Kibana's file upload feature only accepts files up to 100 MB by default, which large log exports will easily exceed. Before importing, raise the fileUpload:maxFileSize advanced setting (under Stack Management → Advanced Settings); it can go up to a maximum of 1 GB.
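
If you prefer the command line, Kibana exposes its advanced settings over an HTTP API. A sketch, assuming the default docker-elk credentials and ports:

    # Raise the file upload limit to the 1 GB maximum
    curl -X POST "http://localhost:5601/api/kibana/settings" \
      -H "kbn-xsrf: true" \
      -H "Content-Type: application/json" \
      -u elastic:changeme \
      -d '{"changes":{"fileUpload:maxFileSize":"1GB"}}'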

Import Your Logs

Uploading your logs is ridiculously easy, especially if they are in a CSV file. Here are the steps:

  • Open Kibana, go to Machine Learning → Data Visualizer (or use the "Upload a file" option on the home page), and select your log file.
  • The system will take a moment to parse your file. Once it's done, it will show you some stats about the different fields and how much data coverage there is for each one. Click the Import button in the lower left-hand corner to proceed with the data import.
  • You will be asked to provide an index name before the import can complete. Once you click Import, the process will take some time, depending on how much data you are importing.
  • Click View Data in Discover to see your imported data in the Discover tab. From there, you can search and analyse it using KQL, or build some visualizations.
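
Once the import finishes, you can sanity-check that the index exists and holds the expected number of documents (okta-logs below is just whatever index name you chose during the import):

    # List the index and its document count
    curl -u elastic:changeme "http://localhost:9200/_cat/indices/okta-logs?v"

    # Pull back one document to eyeball the field mapping
    curl -u elastic:changeme "http://localhost:9200/okta-logs/_search?size=1&pretty"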

Example: Analysing Okta Logs

Many blue teams are having a rough week after the news of the security breach affecting Okta (and other companies). One good piece of advice that's been circulating online is to download the system logs from Okta (as they are rotated on a 90-day window) and review them for any malicious or suspicious activity.

Okta logs a lot of information, and three months' worth of logs could easily run to hundreds of thousands (if not millions) of entries. You can use ELK to import these logs and run all the important queries on the data in a simple and painless manner. Follow the same steps I outlined above, and you should end up with a local instance that has all of your logs ready for analysis.
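
As a starting point, here are a couple of KQL searches you might run in Discover. The field names assume the standard Okta System Log schema; adjust them to match however your export was flattened during import. Failed sign-in attempts:

    eventType: "user.session.start" and outcome.result: "FAILURE"

Password resets and MFA factor deactivations, which are worth reviewing after a breach:

    eventType: "user.account.reset_password" or eventType: "user.mfa.factor.deactivate"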

Ideally, you would be streaming these logs to your SIEM all the time and wouldn't need this manual import at all, but if you don't have a SIEM or can't use it for any reason, this method should help you load and analyse your logs in a simple way.

A couple of people on Twitter pointed out that Sigma (a generic signature format for SIEM systems) has some good rules that can help you find suspicious or malicious activity in your Okta logs. You can either use sigmac to convert these rules into KQL queries that can be executed in your ELK instance, or use uncoder.io to do the same via a web browser.
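
For example, converting one of the Okta rules from the Sigma repository into an Elasticsearch query-string search might look like this (the rule filename is a placeholder; pick any rule from rules/cloud/okta):

    # Install the Sigma tooling and grab the rules
    pip install sigmatools
    git clone https://github.com/SigmaHQ/sigma.git

    # Convert an Okta rule to an Elasticsearch query string
    # (replace <rule> with the rule file you want to convert)
    sigmac -t es-qs sigma/rules/cloud/okta/<rule>.yml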

sigma/rules/cloud/okta at master · SigmaHQ/sigma: https://github.com/SigmaHQ/sigma/tree/master/rules/cloud/okta
Uncoder.IO | Universal Sigma Rule Converter for SIEM, EDR, and NTDR: https://uncoder.io

Final Thoughts

There are many tools available for log analysis, but I really like the simplicity and friendly user experience of ELK, and how easy it is to spin up a local instance, load the data, and perform all types of analysis on it.


