JerryGamblin.com (https://jerrygamblin.com)
Security Advocate. Problem Solver. Hacker. Ebullient Communicator.

Quickly Building A Cloud Virtual Lab
https://jerrygamblin.com/2017/06/12/quickly-building-a-cloud-virtual-lab/
Mon, 12 Jun 2017

Often while doing research I need temporary access to a bunch of different virtual machines. While it is possible to do this on my MacBook using VMware Fusion or VirtualBox, the overhead seems unnecessary for something I will delete in under a week.

My go-to solution is a virtualization stack of:
16GB DigitalOcean Droplet + Wok + Kimchi

Here is the shell script I use to build it:

#!/bin/bash 
apt-get update &&  apt-get upgrade -y
apt-get -y install qemu qemu-kvm libvirt-bin ubuntu-vm-builder bridge-utils \
  nginx python-cherrypy3 python-jsonschema python-m2crypto python-ldap \
  python-psutil fonts-font-awesome texlive-fonts-extra python-configobj \
  python-parted sosreport python-imaging websockify novnc nfs-common \
  python-ethtool open-iscsi python-guestfs libguestfs-tools spice-html5 \
  python-paramiko
wget http://kimchi-project.github.io/kimchi/downloads/latest/kimchi.noarch.deb
wget http://kimchi-project.github.io/wok/downloads/latest/wok.noarch.deb
wget http://kimchi-project.github.io/gingerbase/downloads/latest/ginger-base.noarch.deb
dpkg -i wok.noarch.deb
apt-get install -f -y
dpkg -i ginger-base.noarch.deb
apt-get install -f -y
dpkg -i kimchi.noarch.deb
apt-get install -f -y
# The web interface uses the system root password (passwd lets you reset it).
reboot

After the server reboots, you can access the web interface at https://<server-ip>:8001.
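The droplet takes a minute or two to come back from the reboot. A small helper can poll until the UI port answers; this is a sketch of mine, not part of Wok, and it uses bash's /dev/tcp redirection, so it is bash-only:

```shell
# Poll a TCP port until it accepts connections or we run out of tries.
wait_for_port() {
  local host="$1" port="$2" tries="${3:-30}"
  while [ "$tries" -gt 0 ]; do
    # The subshell opens (and implicitly closes) fd 3 against host:port.
    if (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; then
      return 0
    fi
    tries=$((tries - 1))
    sleep 2
  done
  return 1
}
# Usage: wait_for_port <droplet-ip> 8001 && echo "Wok is up"
```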

The next step is to add the templates you want to build VMs from.

You can use these commands to grab newer ISOs (there is a feature request to automate this):

cd /var/lib/kimchi/isos
wget -c http://cdimage.kali.org/kali-2017.1/kali-linux-2017.1-amd64.iso
wget -c http://releases.ubuntu.com/17.04/ubuntu-17.04-desktop-amd64.iso
wget -c http://releases.ubuntu.com/17.04/ubuntu-17.04-server-amd64.iso
wget -c http://releases.ubuntu.com/16.04/ubuntu-16.04.2-desktop-amd64.iso
wget -c http://releases.ubuntu.com/16.04/ubuntu-16.04.2-server-amd64.iso
wget -c ftp://opensuse.mirrors.ovh.net/opensuse/distribution/13.2/iso/openSUSE-13.2-DVD-x86_64.iso
wget -c http://slackware.mirrors.ovh.net/ftp.slackware.com/slackware64-14.2-iso/slackware64-14.2-install-dvd.iso
wget -c http://archlinux.mirrors.ovh.net/archlinux/iso/2016.09.03/archlinux-2016.09.03-dual.iso
wget -c https://download.fedoraproject.org/pub/fedora/linux/releases/25/Workstation/x86_64/iso/Fedora-Workstation-Live-x86_64-25-1.3.iso
wget -c https://az792536.vo.msecnd.net/vms/VMBuild_20150801/VirtualBox/MSEdge/Windows/Microsoft%20Edge.Win10.For.Windows.VirtualBox.zip
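Rather than pasting each wget line by hand, a hedged sketch: keep the URLs above in a file (isos.txt is my own name for it, nothing standard) and loop over it. Since wget -c resumes partial downloads, the loop is safe to re-run after an interruption.

```shell
# Fetch every URL listed in a file, one per line, resuming partial downloads.
fetch_isos() {
  local list="$1" url
  while IFS= read -r url; do
    # Skip blank lines; -c resumes an interrupted download.
    [ -n "$url" ] && wget -c "$url"
  done < "$list"
}
# Usage: fetch_isos isos.txt
```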

Once you are done with that, it is amazingly easy to spin up VMs and manage them in the browser.

I use this virtualization stack a lot in my research and it is amazing. If you have any questions, feel free to reach out to me on Twitter.

Reminder: Operational Security Is Hard
https://jerrygamblin.com/2017/06/11/reminder-operational-security-is-hard/
Sun, 11 Jun 2017

I love OWASP (I wanted to get that out of the way), but they let their TLS certificate expire yesterday:


Should it have happened to an organization whose whole goal is to secure web applications?

No.

There are a million reasons why their TLS certificate could have expired and plenty of reasons it shouldn't have (OWASP uses Let's Encrypt, which can automatically renew certificates and sends you email when they are close to expiring).

Is it forgivable?
Yes.

Expired certificates, missing patches, and unknown cloud services haunt every security organization. Some people look at these things as *easy* to fix, and assume that if you miss them you don't care about security. Most of those people have never worked in operational security.

Why did it happen?
Operational Security Is Hard.

Being perfect is impossible. Stephen Curry (arguably the best shooter in the NBA) makes only about 90% of his free throws. So everyone is going to miss a patch, let a certificate expire, and have unknown cloud services. It. Is. Going. To. Happen.

What can we learn from this?
A lot. 

How would your organization have handled this on a Saturday morning? Would you have been able to update your certificate within an hour? If you are not sure of the answers, pick a tweet from @badthingsdaily and work through the scenario with your team.

Let me know your thoughts on Twitter.

Build Your Own Honeypot Network In Under An Hour
https://jerrygamblin.com/2017/05/29/build-your-own-honeypot-network-in-under-an-hour/
Mon, 29 May 2017

Have you ever wanted to control your own small network of honeypots but only had an hour and about $40 a month to spend on the project? So did I! With the help of DigitalOcean and Anomali's Modern Honey Network (MHN), we can now do it.

For a basic distributed Cowrie network you will need:
1 × $20-a-month DigitalOcean Droplet for the MHN server.
4 × $5-a-month DigitalOcean Droplets for the Cowrie honeypots.

Configuring The MHN Server:

Setting up the server is as easy as running these commands on your controller droplet and waiting about 10 minutes:

sudo apt update
sudo apt upgrade -y
cd /opt/
sudo git clone https://github.com/threatstream/mhn.git
cd mhn/
sudo ./install.sh

After it installs everything it needs, it will ask you the following questions:

Do you wish to run in Debug mode?: y/n n
Superuser email: jerry.gamblin@gmail.com
Superuser password:
Superuser password: (again):
Server base url ["http://honeypot.jgamblin.com"]:
Honeymap url [":3000"]: http://honeypot.jgamblin.com:3000
Mail server address ["localhost"]:
Mail server port [25]:
Use TLS for email?: y/n n
Use SSL for email?: y/n n
Mail server username [""]:
Mail server password [""]:
Mail default sender [""]:
Path for log file ["/var/log/mhn/mhn.log"]:
Would you like to integrate with Splunk? (y/n)n
Would you like to install ELK? (y/n)n

Once that is done you now have a working MHN server:

Configuring The Honeypots:

At the time of writing, MHN supports 17 honeypots for easy deployment.

I have used Cowrie in the past and like it a lot, so I decided to use it for this blog post. You can deploy Cowrie honeypots that report to your MHN server with the following commands:

sudo apt update
sudo apt upgrade -y
sudo apt install python -y
wget "https://gist.githubusercontent.com/jgamblin/e2c5432fa4518876c0536b625f90f8da/raw/67f792b549198a9bff15fd863e4e0cca6ae50b37/cowrie.sh" -O deploy.sh && sudo bash deploy.sh http://yourmhnserver yourcode
#An update broke the deployment script and there is a proposed fix.
#I copied the proposed fix to the gist used here. 
#wget "http://yourmhnserver/api/script/?text=true&script_id=14" -O deploy.sh && sudo bash deploy.sh http://honeypot.jgamblin.com yourcode wget

This script moves your *real* SSH port to 2222 and starts the honeypot on ports 22 (SSH) and 23 (Telnet).

Once the script is complete, the honeypots show up in your MHN server.

Looking at the Data:

Within minutes you will have data to look at. My honeypots were up for under 30 minutes and I already had a lot of data.

Next Steps?

There are 16 other types of honeypots you can run. WordPot is an amazing WordPress honeypot, and Dionaea is a great way to capture your own malware samples. I will likely run both, and a few more, as I keep playing with this project.

Have any questions? Reach out to me on Twitter at @jgamblin.

Anti-Vaxxers
https://jerrygamblin.com/2017/05/16/anti-vaxxers/
Tue, 16 May 2017

In the last couple of years the anti-vaccination crowd in the United States has started to make inroads, with more and more people deciding that the perceived risk of the vaccination outweighs the known risk of the disease.

When you ask them why they don't vaccinate, they always have anecdotal evidence of how the vaccination could hurt them, how they know someone who got a vaccination five years ago and it made them *really sick*, or how they have an amazing supplement that works much better than the vaccination would.


I am not talking about parents who put their children at risk of getting measles; I am talking about IT shops that put their companies, customers, and data at risk by not taking proven preventative measures to secure their systems.

After 15 years in security, I have heard all the excuses for not vaccinating systems:

It *might* break something.
We have a $500,000 Next-Generation ██████ Box (unconfigured).
We have not had a *serious* outbreak yet.

The problem is that when you bring proven and tested solutions like the CIS Critical Security Controls and the anti-vaxxers bring an anecdote, you are going to lose. My favorite mentor told me a long time ago that “you can't debate an anecdote and win.”

This is normally where I like to end my blog posts with a great solution we can all use. The problem is there isn't a good solution to make people vaccinate their children, and there isn't one to make people vaccinate their systems.

Until then I am just happy I don't have to deal with polio or WannaCry.

Finding and Mapping Domains With R
https://jerrygamblin.com/2017/05/03/finding-and-mapping-domains-with-r/
Thu, 04 May 2017

As I continue trying to learn R, I am building tools that other people might find useful. Tonight, with the help of Bob Rudis, I built a script that finds domains containing a keyword using DomainPunch, does a GeoIP lookup, and maps each one that is online.

Since it is time to start thinking about DEF CON this summer, I decided to use it as the keyword for the demo.

Here are all 544 live IPs for domains with “defcon” in them, mapped:

Link to the full screen map.
Here is a CSV of the data.

Here is the source code:
View the code on Gist.

As a reminder, if you want to play along at home there is an RStudio Docker container, so all you need to do is:

docker run -d -p 8787:8787 -e USER=<username> -e PASSWORD=<password> rocker/rstudio

Learning R is turning out to be more fun than I thought it would be, so expect more blog posts! Here is a picture semi-related to this blog post to make it look pretty when I share it on social media.

Finding Additions To The Umbrella DNS Popularity List
https://jerrygamblin.com/2017/04/30/finding-additions-to-the-umbrella-dns-popularity-list/
Sun, 30 Apr 2017

Since I started looking at the Umbrella DNS Popularity List, I have been interested in seeing how much the data changes day to day. I fired up RStudio and wrote some terrible code, but with some help I finally got it to work.

Yesterday there were 80,937 DNS names on the list that were not on it the day before.
(Update: here is a CSV of the 169,366 domains that were not on the April 1st list but were on the May 1st list.)

Here are the new additions on a map:

Link to the full screen map.

Here is a CSV of the data with GeoIP information added.

Here is code I ended up with if you want to build your own:
View the code on Gist.

Up next: run these domains through VirusTotal to see if any of them are bad.

Here is a picture semi-related to this blog post to make it look pretty when I share it on social media.

Big Data'ing The Umbrella DNS Popularity List
https://jerrygamblin.com/2017/04/29/big-dataing-the-umbrella-dns-popularity-list/
Sat, 29 Apr 2017

Recently I started looking at the Umbrella DNS Popularity List and did a blog post about it here. The data seemed valuable and lacking at the same time, so I spent my *limited* free time this week learning about R and RStudio.

Protip: if you want to play along at home there is an RStudio Docker container, so all you need to do is:

docker run -d -p 8787:8787 -e USER=<username> -e PASSWORD=<password> rocker/rstudio

Getting today’s list loaded into R is as simple as:

# Get today's list
library(readr)
fn <- "top-1m.csv"
if (file.exists(fn)) file.remove(fn)
temp <- tempfile()
download.file("http://s3-us-west-1.amazonaws.com/umbrella-static/top-1m.csv.zip", temp)
unzip(temp, fn)
today <- read_csv(fn, col_names = FALSE)
unlink(temp)

Now you have the Top 1 million DNS requests from Umbrella ready to be “big data’ed”.

At the start of this project I wanted to do the following:
Search the DNS names for keywords. (Done).
Map all the DNS records on a map. (Done, Kinda).
Compare today’s and yesterday’s records for new DNS records.
Check all the DNS records against Censys and record open ports, and software.
Check all the DNS records against VirusTotal and see if any of them are known bad.
Check all the DNS records against SSLLabs and record SSL grade.
Take a nap.
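The third goal (comparing today's and yesterday's lists) doesn't strictly need R at all; here is a hedged shell sketch with comm, assuming two downloaded files in the list's rank,domain format (the filenames are my own):

```shell
# Print domains on today's list that were not on yesterday's.
# comm -13 suppresses lines unique to file 1 and lines common to both,
# leaving only lines unique to file 2 (today's new domains).
new_domains() {
  comm -13 <(cut -d, -f2 "$1" | sort) <(cut -d, -f2 "$2" | sort)
}
# Usage: new_domains top-1m-yesterday.csv top-1m-today.csv
```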

My limited results so far follow with hopefully more to come.

Search The DNS Names

I wanted to be able to search the list for a keyword and build a table and a map of the data. This was fairly easy; with the help of leaflet and datatables, here is the output of searching today's data for “cisco”.

Here is the map:

Here is a link to the data. 

Here is the R code I wrote:

View the code on Gist.

Map All The DNS Records On A Map.

I got started on this and quickly realized that looking up the GeoIP information and mapping a million DNS records was going to take a week, so I decided to do the top 25,000 as a proof of concept and come back and do all 1,000,000 later (maybe).

Here is the 25,000 Map:

Here is the R code I wrote:
View the code on Gist.

I also built a map with the top 100K on it, but it is huge (load at your own risk).

…More to come.

I will be spending some more time on this over the next couple of weeks, but I can't thank @EngelhardtCR and @hrbrmstr enough for all the help they have given me over the last week. They are true data scientists and I am just a hacker with a blog. : )

If you have any questions or suggestions, please let me know on Twitter at @jgamblin.

Here is a picture semi related to this blog post to make it look pretty when I share it on social media. 

Exploring Cisco's Top 1 Million Domains Data
https://jerrygamblin.com/2017/04/24/exploring-ciscos-top-1-million-domains-data/
Tue, 25 Apr 2017

Cisco offers a daily list of the million most-queried domain names from Umbrella (OpenDNS) users. I had some time this weekend, so I decided to spend it playing around with the data to see what I could find. I spun up a Lightsail server and got to work.

Grabbing the file is as simple as:
wget http://s3-us-west-1.amazonaws.com/umbrella-static/top-1m.csv.zip

You can retrieve a specific date like this:
wget http://s3-us-west-1.amazonaws.com/umbrella-static/top-1m-yyyy-mm-dd.csv.zip
(It looks like 2017-01-20 is the earliest date they have online.)
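A tiny helper keeps the date-stamped URL consistent (a sketch; umbrella_url is my own name, not an official tool):

```shell
# Build the archive URL for a given date in yyyy-mm-dd format.
umbrella_url() {
  printf 'http://s3-us-west-1.amazonaws.com/umbrella-static/top-1m-%s.csv.zip\n' "$1"
}
# Usage: wget "$(umbrella_url 2017-01-20)"
```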

Once you get that downloaded and unzipped (unzip top-1m.csv.zip) you can start exploring.

You can pull out the top 10 domains with this command:
head -n 10 top-1m.csv

1,google.com
2,www.google.com
3,microsoft.com
4,facebook.com
5,doubleclick.net
6,g.doubleclick.net
7,clients4.google.com
8,googleads.g.doubleclick.net
9,apple.com
10,fbcdn.net

(Full Output)

You can search for keywords with this command:
grep "opendns" top-1m.csv

437,opendns.com
719,hydra.opendns.com
720,sync.hydra.opendns.com
1314,disthost.opendns.com
2756,api.opendns.com
4565,cacerts.opendns.com
5569,ipf.opendns.com
5699,block.opendns.com
7024,updates.opendns.com
8482,bpb.opendns.com

(Full Output)
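One caveat with the plain grep: it matches anywhere in the line, including the rank column. A hedged variant (my own wrapper, not from the original post) anchors the keyword to the domain field with awk:

```shell
# Match the keyword only against the domain field (field 2 of rank,domain).
search_domains() {
  awk -F, -v kw="$1" '$2 ~ kw' "$2"
}
# Usage: search_domains opendns top-1m.csv | wc -l
```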

To count the domain levels use this command:
awk -F, '{count=split($2,a,"."); print count}' top-1m.csv | sort | uniq -c | awk '{print $2,$1}' | sort -k1,1n

1 1086
2 263509
3 469756
4 193802
5 54281
6 13698
7 2952
8 689
9 172
10 16
11 26
12 2
13 1
14 1
15 1
16 1
17 1
18 1
19 1
20 1
21 1
22 1
23 1

(Full Output)
Notice anything strange here? Hint: A domain name requires at least two levels to be valid.

To find the broken DNS names in this list this command works:
awk -F, 'BEGIN {file="top-1m.csv" ; while ((getline line < file) > 0) {if (line ~ /#/) continue; tld[tolower(line)] = 1}} {foo=split($2,a,"."); if (foo == 1) {if (!(a[1] in tld)) {print $0}}}' top-1m.csv

1200,home
1490,local
2082,za
3916,lan
6350,url
10173,belkin
10869,uop
11187,localdomain
12887,localhost

(Full Output)
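A blunter sketch gets most of the way there: any entry whose name contains no dot can't be a registrable domain. Unlike the one-liner above it skips the TLD check, so bare TLD queries would also show up:

```shell
# Print every single-label entry: rows whose domain field contains no dot.
single_labels() {
  awk -F, '$2 !~ /\./' "$1"
}
# Usage: single_labels top-1m.csv
```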

Find domains added to the list today.
I wrote a script that downloads the last two days of files and compares them for new domains:
View the code on Gist.

You can find the output for April 24, 2017 here.

Overall I am really impressed with this data and will be using it to do more research and to track trends across the internet. They have some more work to do, but it is an amazingly valuable free tool.

Also, recently I have fallen in love with sprunge for pushing data to an ad-free “pastebin” from the command line:

cat file.txt | curl -F 'sprunge=<-' http://sprunge.us
Burp Settings File
https://jerrygamblin.com/2017/04/17/burp-settings-file/
Mon, 17 Apr 2017

I am a huge fan of Tim Tomes and his Burp Suite Configuration Suggestions blog post. The problem is that I only use Burp a couple of times a month, so I end up facing this screen and having to re-configure Burp on every launch:

So I built burpsettings.json that:

  • Disables Browsers XSS Protection
  • Disables Burp Collaborator Server
  • Disables Intercept by Default
  • Changes Scan Mode to Thorough
  • Turns Off Anonymous Feedback

This makes my Burp startup a lot faster, and I thought I would share the config file in case it helps someone else too.

Newly Registered Domain Name Keyword Search
https://jerrygamblin.com/2017/04/13/newly-registered-domain-name-search/
Thu, 13 Apr 2017

Today I was asked if it was possible to generate a list of domain names registered every day that contain a keyword (a company name, city, trademark, etc.). There are a few paid services that do this, and domainpunch.com has a web-based tool, but I wanted to automate it so I could use it with a Slackbot, so I put together this four-line bash script:

View the code on Gist.

Usage:
./domains.sh keyword

Output:
This is a super simple script, but as they say, “simplicity is the ultimate sophistication.”
