
How to parse vertical/multi-line logs using SPLUNK for investigations



Problem statement: Find the user who deleted mails from a common mailbox, such as info@example.com, that is used by a team of people. We have the Windows mailbox audit logs to investigate.


From the above image you can notice that each event starts with the field "RunspaceId".

You need to tell Splunk where each event starts and ends, and also where the timestamp is.




Regex to identify the timestamp field: LastAccessed\s+\D\s
Pattern identifying the start of an event: Runspace
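
Based on these two pieces, a minimal props.conf sketch for the sourcetype could look like the following. The sourcetype name mailbox_audit and the lookahead value are placeholders, not from the original setup; you can set the same options through the "Add Data" UI instead of editing the file directly.

[mailbox_audit]
# Merge lines into multi-line events
SHOULD_LINEMERGE = true
# Start a new event whenever a line begins with "Runspace"
BREAK_ONLY_BEFORE = Runspace
# The timestamp sits right after the "LastAccessed" label
TIME_PREFIX = LastAccessed\s+\D\s
# How many characters after TIME_PREFIX to scan for the timestamp
MAX_TIMESTAMP_LOOKAHEAD = 40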

Once the data is ingested, we need to parse out the interesting fields.

^FolderPathName\s+\D\s+(?P<path>.*?)$

Here *? is the non-greedy (lazy) qualifier, and * allows the captured value to be empty. If you use +? instead of *?, there will be inconsistencies in the results: when the field is empty, the capture will pick up the next field's value.

I am using +? for the other fields because those fields can't be empty.



Similarly, you can parse the other fields:

^Operation\s+\D\s+(?P<Action>.+?)$
^FolderPathName\s+\D\s+(?P<path>.*?)$
^OperationResult\s+\D\s+(?P<result>.+?)$
^SourceItemSubjectsList\s+\D\s(?P<subject>.*?)$
^LogonUserDisplayName\s+\D\s+(?P<username>.+?)$
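
If you would rather keep these as search-time extractions in configuration instead of the field extraction UI, a sketch of the matching props.conf stanza is below (again assuming the sourcetype is named mailbox_audit; depending on your version you may need a (?m) prefix so the ^ and $ anchors match per line inside the multi-line event):

[mailbox_audit]
EXTRACT-Action   = ^Operation\s+\D\s+(?P<Action>.+?)$
EXTRACT-path     = ^FolderPathName\s+\D\s+(?P<path>.*?)$
EXTRACT-result   = ^OperationResult\s+\D\s+(?P<result>.+?)$
EXTRACT-subject  = ^SourceItemSubjectsList\s+\D\s(?P<subject>.*?)$
EXTRACT-username = ^LogonUserDisplayName\s+\D\s+(?P<username>.+?)$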

When I started investigating, I used the below Splunk query, which made my investigation easier.

index=test   | timechart cont=false span=1d count(Action) values(Action) values(path) by username

This gave me a beautiful chart to pinpoint the activity. It shows you details such as which actions were taken on which folder, along with the counts.





Now you can play with the query at your convenience and filter the results as needed.
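
For example, to narrow the results down to one type of action on one folder (the Action and path values below are only illustrative; use whatever values actually appear in your extracted fields):

index=test Action="SoftDelete" path="*Inbox*"
| stats count by username, path

This quickly shows which users touched that folder and how often.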

Note: action=soft delete doesn't necessarily mean the user performed a deletion. Even if you move an email from one folder to another, it will create two events, action=soft delete and action=create. This can be validated with the query and chart below.

When you move a mail from one folder to another, the difference between the two events is a matter of seconds.
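
A rough way to check this is to look for a create event that follows a soft delete by the same user within a few seconds. A sketch, assuming the Action values appear as SoftDelete and Create in your data, with a 10-second threshold picked arbitrarily:

index=test (Action="SoftDelete" OR Action="Create")
| sort 0 _time
| streamstats current=f last(Action) as prev_action last(_time) as prev_time by username
| eval gap = _time - prev_time
| where Action="Create" AND prev_action="SoftDelete" AND gap < 10
| table _time, username, prev_action, Action, gap, path, subject

streamstats carries forward the previous event's action and time per user, so any create that lands within seconds of a soft delete by the same user is most likely a move rather than a real deletion.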


