TryHackMe CTF Pickle Rick

This was my first CTF with TryHackMe. It is classified as easy and uses some basic tools to exploit a web server through command injection.

I would recommend being familiar with tools like nmap, Nikto, and DIRB. I had heard of these tools and done some labs with them, but had no idea how to use them in a real setting. In this write-up, along with walking through the CTF, I will also add some information and links about the tools used.

Nmap (“Network Mapper”) is a free and open source (license) utility for network discovery and security auditing. Many systems and network administrators also find it useful for tasks such as network inventory, managing service upgrade schedules, and monitoring host or service uptime. Nmap uses raw IP packets in novel ways to determine what hosts are available on the network, what services (application name and version) those hosts are offering, what operating systems (and OS versions) they are running, what type of packet filters/firewalls are in use, and dozens of other characteristics. It was designed to rapidly scan large networks, but works fine against single hosts. Nmap runs on all major computer operating systems, and official binary packages are available for Linux, Windows, and Mac OS X. In addition to the classic command-line Nmap executable, the Nmap suite includes an advanced GUI and results viewer (Zenmap), a flexible data transfer, redirection, and debugging tool (Ncat), a utility for comparing scan results (Ndiff), and a packet generation and response analysis tool (Nping). In this CTF, we will use nmap to discover the target's open ports and services.

Nikto is an Open Source (GPL) web server scanner which performs comprehensive tests against web servers for multiple items, including over 6700 potentially dangerous files/programs, checks for outdated versions of over 1250 servers, and version specific problems on over 270 servers. It also checks for server configuration items such as the presence of multiple index files, HTTP server options, and will attempt to identify installed web servers and software. Scan items and plugins are frequently updated and can be automatically updated.

Nikto is not designed as a stealthy tool. It will test a web server in the quickest time possible, and is obvious in log files or to an IPS/IDS. However, there is support for LibWhisker’s anti-IDS methods in case you want to give it a try (or test your IDS system).

Not every check is a security problem, though most are. There are some items that are “info only” type checks that look for things that may not have a security flaw, but the webmaster or security engineer may not know are present on the server. These items are usually marked appropriately in the information printed. There are also some checks for unknown items which have been seen scanned for in log files.

DIRB is a Web Content Scanner. It looks for existing (and/or hidden) Web Objects. It basically works by launching a dictionary based attack against a web server and analyzing the response.

DIRB comes with a set of preconfigured attack wordlists for easy usage, but you can also use your own custom wordlists. DIRB can sometimes be used as a classic CGI scanner as well, but remember it is a content scanner, not a vulnerability scanner.

DIRB's main purpose is to help in professional web application auditing, especially in security-related testing. It covers some holes not covered by classic web vulnerability scanners: DIRB looks for specific web objects that other generic CGI scanners can't look for. It doesn't search for vulnerabilities, nor does it look for web contents that can be vulnerable.

As the CTF brief indicates, the purpose is to exploit a web server to identify three important ingredients, so we will start with nmap.

Use the IP provided when the machine becomes active. In my case, I was using the machine in TryHackMe's browser.
It is important to use the right case in Linux. As you can see above, the lowercase -a gave a response of “ambiguous possibilities”; -sV stands for version scan.
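As a sketch, the version scan looks like this (MACHINE_IP is a placeholder for the address TryHackMe assigns you; the -oN flag is optional and just saves the results to a file):

```shell
# Version scan: -sV probes open ports for the service name and version.
# MACHINE_IP is a placeholder for your target's address.
nmap -sV MACHINE_IP

# Same scan, saving the results for later review (optional):
nmap -sV -oN nmap-results.txt MACHINE_IP
```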

We see from the above screen shot that port 22 and port 80 are open. Port 22 is secure shell or ssh port.

If you are new to cybersecurity like me, you may be wondering what is ssh. Here is a brief description of what ssh is:

The SSH protocol, also referred to as Secure Shell, is a technique for secure and reliable remote login from one computer to another. It offers several options for strong authentication, and it protects the security and integrity of connections and communications with strong encryption. It is a secure alternative to unprotected login protocols (such as telnet and rlogin) and insecure file transfer methods (such as FTP). By default, ssh listens on port 22, which means that if an attacker identifies port 22 as open, they can try attacks against it in order to connect to the host machine. Check this site out for more details on some of the ways ssh can be abused by attackers:

Next we move to Port 80: Port 80 is the port number assigned to commonly used internet communication protocol, Hypertext Transfer Protocol (HTTP). It is the port from which a computer sends and receives Web client-based communication and messages from a Web server and is used to send and receive HTML pages or data.

Going back to our nmap scan, we see that port 80 and port 22 are open.

Next we run Nikto:

Use nikto -Help or man nikto for the full list of options
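A minimal invocation looks like this (MACHINE_IP is again a placeholder for the target address):

```shell
# Scan the web server on port 80 for known dangerous files and
# misconfigurations; -h specifies the host to test.
nikto -h http://MACHINE_IP
```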

We see above that a login.php page was found. Although not relevant to this CTF, I also noticed that the anti-clickjacking X-Frame-Options header is not present. The robots.txt file was also retrieved and doesn't have any ‘disallow’ entries.

A robots.txt file tells search engine crawlers which URLs the crawler can access on your site. This is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a web page out of Google, you should block indexing with noindex or password-protect the page.
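You can also fetch robots.txt directly from the command line (MACHINE_IP is a placeholder; the Disallow example in the comment is illustrative, not from this box):

```shell
# Retrieve the target's robots.txt with curl.
curl http://MACHINE_IP/robots.txt

# A site that did restrict crawlers would have entries such as:
#   User-agent: *
#   Disallow: /admin/
```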

If you want to read more about robots.txt file see this link:

The output of these commands can be examined in more detail by saving it to a text file.
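One simple way to do that is shell redirection; tee both displays the output and writes a copy to a file. Here echo stands in for a long-running scan such as nmap or nikto:

```shell
# `tee` prints a command's output and saves a copy at the same time.
# `echo` is a stand-in for a real scan command like nmap or nikto.
echo "scan results here" | tee results.txt

# Review the saved copy later:
cat results.txt
```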

The next command I ran was DIRB.
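A basic run looks like this (MACHINE_IP is a placeholder; the wordlist path in the second command is the usual Kali location and is an assumption):

```shell
# Scan the web server with DIRB's default wordlist.
dirb http://MACHINE_IP/

# Or point DIRB at a specific wordlist (common Kali path, assumed):
dirb http://MACHINE_IP/ /usr/share/wordlists/dirb/common.txt
```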

Here we see the index.html and robots.txt files both responding with code 200, which is good.

HTTP response codes indicate whether a specific HTTP request has been successfully completed. In the above screenshot, code 200 is a successful response, while code 403 (Forbidden) is a client-error response. Check out this site for more HTTP codes.

This completes the basic enumeration.

Now let’s review the web application and see what else we can find.

To access the web application, I entered the IP in the URL bar:

I tried Burp Suite a few times, with no luck.

There is nothing on this web page that seems relevant. I then looked at the source of the web page by right-clicking and selecting “View Page Source”.

Right Click to see Page Source.
Username is visible in the source

Hurray! We have a username.

I then checked to see if there was anything in the robots.txt file.

Looks like a password

As you can see in the screenshot, the username and password have been found. We saw from the enumeration exercise that the login.php page is accessible.

Login page is visible

Using the username and password obtained from previous steps, I was able to login to the web application.

We are in the application

Start by running the ls command

Output of ls command

Using the first text file, I was able to find the clue for the first question. I did try the other text files to see their output; however, only the response from this one matched the answer format.

Answer to Question #1

For clue #2, let’s see what other hints we have from the files found.

web page clue.txt

The clue.txt page tells us to look around in the file system. I did a directory traversal by checking whether I had access to the root directory.

Accessing the root directory
Root directory screen shot

Looking at the root, nothing stuck out to me except for the home directory.

Entering the home directory

Interestingly, just doing cd /home did not give me any results. This is because each command submitted through the panel runs in a fresh shell, so a bare cd has no lasting effect; the commands have to be chained together on one line.

Output of cd /home; ls -al; pwd
cd /home/rick; ls -al; pwd
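This behaviour can be reproduced locally: each sh -c invocation below is a fresh shell, just like each command injected through the panel.

```shell
# Each sh -c invocation is a brand-new shell, so a bare `cd`
# has no effect once that shell exits.
sh -c 'cd /home'        # the directory change dies with this shell
sh -c 'pwd'             # still the original directory, not /home

# Chaining with semicolons keeps the commands in ONE shell:
sh -c 'cd /home; ls -al; pwd'   # ls and pwd now run inside /home
```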

Navigate to the rick directory to find answer to the second question.

The cat command didn't work to open the second-ingredients file.
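When cat appears to be blocked, it is worth knowing that several other standard commands will print a file. Here is a local sketch using a stand-in file (the real file name on the box may differ):

```shell
# Create a stand-in for the ingredient file:
printf 'second ingredient\n' > second_ingredients.txt

# Any of these print a file's contents when `cat` is disabled:
less second_ingredients.txt      # pager; dumps to stdout when piped
head second_ingredients.txt      # first 10 lines
tail second_ingredients.txt      # last 10 lines
awk '{print}' second_ingredients.txt
```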

Flag for Question #2

Let's navigate to the “assets” directory that DIRB found earlier when scanning the web server for interesting paths.


I didn't find anything significant here. I read some write-ups where people opened the .jpg and .gif files to look for hidden steganographic objects or clues.

Next, I tried to check whether sudo would work to elevate privileges and access root.

Are you wondering what sudo is, or why I used -l? The -l flag lists the commands the current user is allowed to run with sudo.

Before we can access this folder, let’s run the sudo -l command to see what privileges we have on this box.

Executing this command, we see that we can execute ANY command on this box WITHOUT a password ((ALL) NOPASSWD: ALL). This surely is NOT good security! Let's use the sudo command with a listing (ls) of the root folder and see what we can find.
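Given the (ALL) NOPASSWD: ALL entry, the final steps look roughly like this sketch (file names follow the write-up; less is used instead of cat since cat was blocked earlier):

```shell
sudo -l                    # list our sudo privileges: (ALL) NOPASSWD: ALL
sudo ls /root              # list the root folder, no password prompt
sudo less /root/3rd.txt    # read the third ingredient (cat is blocked)
```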

Executing the command, we see that there are two files in the /root folder. One of them is named 3rd.txt. Could this be our third ingredient? Let’s see.

executing sudo command for 3rd.txt

Hurray, and here is our 3rd flag.


