During an approved red team engagement for Huge Logistics, our team found a public IP address, 13.43.144.61, and AWS credentials hard-coded in a shipping application. The goal of this exercise is to use this information to explore their cloud environment further and demonstrate the potential impact by reaching sensitive data.
Let’s begin with an Nmap scan.
nmap -Pn -sC -sV -A --top-ports=1000 13.43.144.61
The Nmap scan shows that the target system is online and reachable. Almost all ports are filtered, but port 3000 is open. This port is running a web application built with the Node.js Express framework. When accessing it, the page title shows “Huge Logistics > Home,” which confirms it is the main application.
The server appears to be running on a Linux system and is hosted on an AWS EC2 instance in the eu-west-2 region. The operating system version could not be identified with full confidence, but it looks like a Linux 4.x based system. Overall, this tells us there is a single exposed web service on port 3000 that is likely the main entry point for further testing.
Opening the IP address in a web browser loads the Huge Logistics homepage.
http://13.43.144.61:3000/
The site includes a login page, but trying common usernames and passwords does not work. There also do not appear to be any other features that stand out as easy to exploit. At this point, the next step is to review the website’s source code for anything useful.
http://13.43.144.61:3000/login?
<header class="main-header">
<section class="container main-hero-container">
<div class="row">
<div class="col-12 col-lg-6">
<h1 class="Heading display-5 align-items-center">
Huge Logistics
</h1>
<img src="https://hugelogistics-data.s3.eu-west-2.amazonaws.com/truck.png" class="truckimg" alt="" srcset="">
<h2 class="sub-Heading">Everyday Anytime Anywhere</h2>
<p class="main-hero-para">
Your Ultimate Logistics Partner. With a focus on precision and efficiency,
we redefine supply chain management. Seamlessly connecting your shipments across distances,
our advanced solutions ensure your cargo's safe and timely arrival.
Our technology-driven approach empowers you with real-time tracking.
This shows that the site is loading static files from an S3 bucket called hugelogistics-data.
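If you want to pull every S3 reference out of the page source quickly, a simple curl-and-grep one-liner does the job (a minimal sketch, assuming curl and GNU grep are available):

curl -s http://13.43.144.61:3000/ | grep -oE 'https://[a-zA-Z0-9.-]+\.s3[a-zA-Z0-9.-]*\.amazonaws\.com[^"]*'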
Next, we set up the AWS credentials using aws configure.
aws configure
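Equivalently, the credentials can be set non-interactively with aws configure set (placeholder values shown; substitute the keys recovered from the application):

aws configure set aws_access_key_id AKIAXXXXXXXXXXXXXXXX   # placeholder access key
aws configure set aws_secret_access_key REPLACE_WITH_SECRET   # placeholder secret key
aws configure set region eu-west-2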
I try to list the contents of the S3 bucket, but neither attempt succeeds. The first command checks whether any authenticated AWS user has permission to list the bucket. Since I am using AWS CLI v2, all requests are signed by default, so the second command uses the --no-sign-request option to send the request without credentials; this tests whether the bucket allows anonymous public access rather than access granted to my specific user.
aws s3 ls hugelogistics-data
aws s3 ls hugelogistics-data --no-sign-request
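The unsigned check can also be reproduced without the AWS CLI, since an anonymous bucket listing is just an HTTPS GET against the bucket endpoint; a denied listing comes back as an AccessDenied XML error:

curl -s https://hugelogistics-data.s3.eu-west-2.amazonaws.com/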
I might try to download all the files from the bucket using aws s3 cp s3://hugelogistics-data . --recursive, but this fails as well: the recursive copy first has to list the bucket to discover which keys exist, and listing requires the s3:ListBucket permission even when individual objects might be readable.

I move on and check for other permissions.

It is worth checking whether I can view the S3 bucket ACL (Access Control List). Bucket ACLs are an older and simpler permission system than bucket policies; they grant read or write access to the bucket, or to the objects inside it, to predefined Amazon S3 groups.

aws s3api get-bucket-acl --bucket hugelogistics-data

I do not have permission to read the bucket ACL. Instead, I check whether I can view the bucket policy. Bucket policies are attached directly to the bucket and specify which actions are allowed or denied for which users or roles.

aws s3api get-bucket-policy --bucket hugelogistics-data

Success! The command returns the full JSON policy, but it comes back as a single line and is hard to read. To make it readable, I re-run it and pipe the output through jq.

aws s3api get-bucket-policy --bucket hugelogistics-data | jq -r '.Policy' | sed 's/\\//g' | jq
I retrieve the bucket policy for hugelogistics-data and see that it allows some interesting access. The policy contains two main statements: any principal ("AWS": "*") is allowed to read the objects backup.xlsx and background.png and to view their ACLs, and the bucket policy itself is readable. This means that even without full access, I can read specific files and see the bucket’s policy.
Even though I cannot list the bucket contents, I am still able to access specific files and leak their data. The next step is to download the Excel file to my local system.
aws s3 cp s3://hugelogistics-data/backup.xlsx .
When I try to open the file in LibreOffice or Microsoft Office, it asks for a password. The next step is to see if I can crack that password.
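Before trying to crack it, a quick local check confirms the file really is encrypted; the file utility reports encrypted OOXML documents as CDFV2 Encrypted:

file backup.xlsx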
First, I download the Python script office2john.py.
wget https://raw.githubusercontent.com/openwall/john/bleeding-jumbo/run/office2john.py
This script takes a Microsoft Office file as input and extracts a password hash in a format that reflects the document’s Office version and encryption scheme.
Different Office versions use different encryption methods, so tools like hashcat need the correct mode to crack them. Office 2007 uses SHA-1 with AES-128. Office 2010 moved to SHA-512 with AES-128 or AES-256. Office 2013 and newer improved security further by using SHA-512 with AES-256 by default, adding a random salt, and increasing the number of hashing rounds. Because of this, cracking passwords from newer Office files takes much more time and computing power.
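For reference, these schemes map to the following hashcat modes:

# -m 9400 : MS Office 2007
# -m 9500 : MS Office 2010
# -m 9600 : MS Office 2013 and newer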
I run the command below to generate a hash of the document so it can be used for an offline brute force attack.
python3 office2john.py backup.xlsx > backup_hash.txt
The generated hash is already in a format that John the Ripper can use, but it needs a small modification to work with hashcat. From the output, I can also see that the Excel file was automatically detected as an Office 2013 or newer version.
Remove the filename backup.xlsx: from the hash and save it.
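One way to strip the prefix in place is a simple sed one-liner (a minimal sketch; adjust the pattern if your filename differs):

sed -i 's/^backup.xlsx://' backup_hash.txt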
Next, I download the rockyou.txt wordlist to use in the dictionary attack.

wget https://github.com/brannondorsey/naive-hashcat/releases/download/data/rockyou.txt
I run the command below and successfully crack the spreadsheet password in less than two minutes.
hashcat -a 0 -m 9600 backup_hash.txt rockyou.txt
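hashcat caches cracked hashes in its potfile, so the recovered password can be displayed again later without re-running the attack:

hashcat -m 9600 backup_hash.txt --show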
Opening the spreadsheet reveals a list of credentials for various internal systems:

Username            Password           System
[email protected]   5w8=U5taN]V7       WebCRM
[email protected]   {&6Um-aC5@9        HR portal
mflear              Michelle123!       Email
mflear              Password01!        Cisco VPN
admin               Admin1234          Redshift
michellef           HugeLogistics2023  ADP
michellef           DataEngineer123    Azure
[email protected]   MichelleFleur!     Slack
mflear              12345678ABC!       Harvest
michellef           Password01!        SharePoint
[email protected]   Welcome01!         JIRA
admin               admin              MongoDB Atlas

At this point, the website is the only attack surface available, so I continue enumerating it to see whether any other files or directories are hosted on it. For this, I use Gobuster. If it is not already installed, it can be installed with the following command.
apt install gobuster

Next, I use a Dirbuster wordlist to fuzz the server and check for hidden files or directories.
wget https://raw.githubusercontent.com/daviddias/node-dirbuster/master/lists/directory-list-lowercase-2.3-medium.txt
gobuster dir -u http://13.43.144.61:3000/ -w directory-list-lowercase-2.3-medium.txt

Running Gobuster reveals a directory called /crm that I have not seen before. I open it in the browser to see what it contains.
This leads to another login page. Remembering the WebCRM credentials found in the spreadsheet, I try them here, and the login succeeds.
http://13.43.144.61:3000/crm
[email protected]  5w8=U5taN]V7  WebCRM
After logging in, I see a dashboard with options to manage shipments, users, and invoices.
http://13.43.144.61:3000/dashboard
When I click View Invoices Status, a page loads showing users’ credit card details. This clearly demonstrates sensitive data exposure and confirms the full impact of the compromise.
http://13.43.144.61:3000/invoices
By clicking on Export Data, I am able to download the full customer dataset.
hugelogistics-data/crm directory

Impact: Full compromise of sensitive customer and financial data.
I hope you enjoyed this writeup! Happy Hacking :)
Subscribe to me on Medium and be sure to turn on email notifications so you never miss out on my latest walkthroughs, write-ups, and other informative posts.
1. Instagram: reju.kole.9
2. Check my TryHackMe profile: TryHackMe | W40X
3. Twitter | X: @Mr_W40X
4. GitHub: W40X | Reju Kole | Security Researcher
In case you need any help feel free to message me on my social media handles.