Tracking an Insider Threat: TryHackMe "Have a Break" Writeup
So I just wrapped up the "Have a Break" room on TryHackMe, and honestly, this one was a blast. It's not your typical CTF — instead, you're thrown into an insider threat investigation with actual corporate artifacts to analyze. A truck carrying 400,000 KitKats disappeared somewhere between Italy and Poland, and someone on the inside leaked the route. Your job? Find out who did it.
The Setup
The room gives you 6 files to work with:
- A briefing memo (PDF)
- An anonymous tip email (.eml file)
- A dashcam image from the crime scene
- Employee directory (employees.csv)
- Access log of file activity (access_log.csv)
- Internal messaging export (comms_export.txt)
The investigation basically chains these together — you find one clue that leads to another. Let me walk through how I solved each question.
Question 1: Which VPN was used?
The anonymous email was the first clue. I needed to figure out which VPN service the whistleblower used.
I opened the .eml file and scrolled down to the email headers. Here's a trick I learned — always read email headers from bottom to top. The very first Received: header (at the bottom) shows the actual IP the email was sent from.
Found it:
Received: from [193.32.249.132] ([193.32.249.132])
So the sender's IP was 193.32.249.132. I threw it into an IP lookup tool (I used ipinfo.io), and it came back with ASN AS39351 owned by 31173 Services AB.
That rang a bell — 31173 Services AB is the data center that hosts Mullvad VPN exit nodes. So the answer was Mullvad.
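The bottom-to-top header trick is easy to automate. Here's a minimal Python sketch using the stdlib email module; the two-hop sample message below is synthetic, not the actual exhibit from the room:

```python
import re
from email import policy
from email.parser import BytesParser

def earliest_received_ip(raw_bytes):
    """Return the bracketed IPv4 from the bottom-most Received: header."""
    msg = BytesParser(policy=policy.default).parsebytes(raw_bytes)
    received = msg.get_all("Received") or []
    # Each mail hop *prepends* its Received: header, so the last entry
    # in the list is the earliest hop -- closest to the real sender.
    for header in reversed(received):
        m = re.search(r"\[(\d{1,3}(?:\.\d{1,3}){3})\]", header)
        if m:
            return m.group(1)
    return None

# Synthetic two-hop message for illustration (not the real exhibit):
sample = (
    b"Received: from relay.example.com ([10.0.0.5])\r\n"
    b"Received: from [193.32.249.132] ([193.32.249.132])\r\n"
    b"Subject: tip\r\n"
    b"\r\n"
    b"body\r\n"
)
print(earliest_received_ip(sample))  # -> 193.32.249.132
```

From there, the ASN lookup still happens out-of-band (ipinfo.io or similar), but at least the IP extraction is repeatable.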
Question 2: Petrol station address
This one tripped me up at first. The dashcam image (exhibit_b.png) showed an ORLEN petrol station with road signs saying "Olomouc 27 km / Brno 45 km." I almost went down a rabbit hole searching every ORLEN station on the D1 highway.
Then I re-read the briefing memo. It mentioned the dashcam was recovered from a vehicle stopped near Hulín. That was the key — the station had to be near Hulín.
I searched Google Maps for "ORLEN Hulín" and boom — found it right away:
Kroměřížská 1281, 768 24 Hulín, Czechia
I double-checked against the format hint (*********** ****, *** ** *****, *******) and it matched perfectly. The truck was last seen here on March 26, 2026 at 22:31 (from the dashcam timestamp).
Question 3: When did the suspicious action happen?
Now I had to dig into the access logs. I opened access_log.csv and filtered for March 25 (the day before the truck departed).
Most entries were normal — people viewing or editing files during work hours. But one entry stood out like a sore thumb:
2026-03-25, 22:14:09, BR-0291, ROUTE_IT_PL_Q1_2026.pdf, EXPORT
This was the only EXPORT in the entire log. Someone exported the Italy-Poland route file at 10pm the night before departure. That's not normal business activity — exporting means they pulled a copy out of the system, probably to send it somewhere.
Answer: 22:14:09
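Filtering the log by action type is a one-liner worth keeping around. A small sketch, assuming a column layout of date, time, user ID, filename, action (my guess at the CSV schema); only the EXPORT row is from the room, the VIEW rows are invented padding:

```python
import csv
import io

# Hypothetical column layout: date, time, user_id, filename, action.
# The EXPORT row matches the writeup; the VIEW rows are made up.
log = """2026-03-25, 09:02:11, BR-0188, ROUTE_IT_PL_Q1_2026.pdf, VIEW
2026-03-25, 14:37:55, BR-0291, ROUTE_IT_PL_Q1_2026.pdf, VIEW
2026-03-25, 22:14:09, BR-0291, ROUTE_IT_PL_Q1_2026.pdf, EXPORT"""

rows = [[field.strip() for field in row]
        for row in csv.reader(io.StringIO(log))]

# EXPORT pulls a copy out of the system, so it gets extra scrutiny.
exports = [row for row in rows if row[4] == "EXPORT"]
for date, time_, user, filename, action in exports:
    print(f"{date} {time_}  {user}  {action}  {filename}")
```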
Question 4: Who sent the anonymous email?
The whistleblower mentioned in the email that they were working late that night and saw the unusual activity. So I looked at the access log around the time of that export (22:14 on March 25).
Right after BR-0291's export, I found this:
2026-03-25, 23:41:17, BR-0312, DRIVER_SCHEDULE_WK13.xlsx, EDIT
BR-0312 was editing the driver schedule at 11:41pm — they were genuinely working late. They would've seen the audit trail showing BR-0291's weird export. Since they couldn't trust internal channels (someone in the company was compromised), they went external — sent an anonymous tip via Mullvad VPN.
If you check the email headers again, the User-Agent shows K-9 Mail for Android, which makes sense for someone quickly sending a tip from their phone.
Answer: BR-0312
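The "who else was online that night" question can also be answered mechanically. A sketch under the same assumed CSV layout (date, time, user ID, filename, action), keeping only the two rows cited above:

```python
import csv
import io
from datetime import datetime

# Same hypothetical layout: date, time, user_id, filename, action.
log = """2026-03-25, 22:14:09, BR-0291, ROUTE_IT_PL_Q1_2026.pdf, EXPORT
2026-03-25, 23:41:17, BR-0312, DRIVER_SCHEDULE_WK13.xlsx, EDIT"""

rows = [[f.strip() for f in r] for r in csv.reader(io.StringIO(log))]

suspect = "BR-0291"
# Anyone else still active after 22:00 could have seen the export.
witnesses = {
    r[2] for r in rows
    if datetime.strptime(r[1], "%H:%M:%S").hour >= 22 and r[2] != suspect
}
print(witnesses)  # -> {'BR-0312'}
```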
Question 5: Who leaked the route details?
This one was already pretty clear from Question 3. BR-0291 was the only person who exported the route file. But there was more evidence:
- The export itself — 22:14:09 on March 25, the ROUTE_IT_PL file
- Failed authentication — On March 24 at 07:11:03, BR-0291 got AUTH_FAILED trying to access the same file. They were really pushing to get this document.
- The personal email attempt — In comms_export.txt, there's a message from IT (BR-0255) on March 24: "a request was received from an external address (kraliknovak09@gmail.com) to access files in the route planning shared folder. The request was blocked."
That email address looked suspicious. I checked BR-0291's employee record — their hometown is Králice nad Oslavou. "Kralik" in the email is a reference to that town. Classic mistake — using personal info in your burner email.
Answer: BR-0291
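When evidence against one user piles up across dates, a per-user timeline makes the pattern obvious. A sketch, again assuming the date/time/user/filename/action layout, seeded with the three log events cited in this writeup:

```python
import csv
import io
from collections import defaultdict

# Hypothetical layout; the three events are the ones from the writeup.
log = """2026-03-24, 07:11:03, BR-0291, ROUTE_IT_PL_Q1_2026.pdf, AUTH_FAILED
2026-03-25, 22:14:09, BR-0291, ROUTE_IT_PL_Q1_2026.pdf, EXPORT
2026-03-25, 23:41:17, BR-0312, DRIVER_SCHEDULE_WK13.xlsx, EDIT"""

timeline = defaultdict(list)
for date, time_, user, filename, action in (
        [f.strip() for f in row] for row in csv.reader(io.StringIO(log))):
    timeline[user].append((f"{date} {time_}", action, filename))

# Print each user's events in chronological order.
for user in sorted(timeline):
    print(user)
    for when, action, filename in sorted(timeline[user]):
        print(f"  {when}  {action:<12} {filename}")
```

Grouping like this turns "a few odd rows" into a story: one failed grab, then a successful export a day later, all from the same account.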
Question 6: What's the culprit's real name?
Now for the final piece — I had the email address kraliknovak09@gmail.com from the leaked access attempt. Time for some OSINT.
I used Epieos (epieos.com), which does Google account lookups. I entered the email, and it pulled up the associated Google profile.
The name came back as Radovan Blšťák. I also saw they had Google Maps reviews under that account, which confirmed it was their real identity.
This was the opsec failure — BR-0291 tried to use their personal Gmail to access corporate files. The moment they did that, they connected their insider activity to their real Google account.
Answer: Radovan Blšťák
What I learned from this room
This room was great practice for thinking like an actual analyst. Here's what stuck with me:
- Email headers tell the truth — Always read from bottom to top. The first Received: header is where the real sender IP shows up.
- Read the briefing carefully — The memo mentioning "Hulín" saved me hours of blind searching. OSINT is about narrowing down your search space, not brute-forcing it.
- Action types matter in logs — EXPORT is very different from VIEW. When you see someone exporting sensitive files at odd hours, that's a red flag.
- The bystander is often the witness — BR-0312 was active in the system that night, which is why they noticed BR-0291's suspicious export. Access logs don't just show suspects — they show witnesses too.
- Personal + corporate = identity leak — The moment kraliknovak09@gmail.com hit the corporate access request system, BR-0291's real identity was exposed. Mixing personal accounts with insider activity is how people get caught.
- Format hints are your friend — Those asterisks in the answer format tell you exactly how many characters each word should be. Use them to validate before submitting.
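That last point is also scriptable. A small sketch: matches_mask is my own helper (not a TryHackMe tool), treating each asterisk in the format hint as exactly one non-space character; the mask itself is the one from Question 2:

```python
import re

def matches_mask(answer, mask):
    """Each '*' in the mask must line up with one non-space character."""
    pattern = "".join(r"\S" if ch == "*" else re.escape(ch) for ch in mask)
    return re.fullmatch(pattern, answer) is not None

mask = "*********** ****, *** ** *****, *******"
print(matches_mask("Kroměřížská 1281, 768 24 Hulín, Czechia", mask))  # True
print(matches_mask("Some Wrong Answer", mask))                        # False
```

Running a candidate answer through a check like this before submitting catches off-by-one typos that would otherwise burn an attempt.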
If you're preparing for SOC analyst roles or learning incident response, this room is 100% worth doing. It's realistic, it chains multiple investigation techniques together, and it shows you how small opsec mistakes create big leads.
That's it for this walkthrough. If you found this helpful, feel free to reach out. Always happy to talk OSINT and threat hunting.
— Aniket