Posted in ZAP, Security, Elasticsearch

Continuous security testing of your application with OWASP ZAP and Elasticsearch

This year I traveled to the OWASP AppSec EU conference in Rome, Italy. At the conference I attended the lightning training "Security Automation using ZAP" with Vaibhav Gupta. During the training Vaibhav demonstrated how to automate tasks in ZAP with Python. I found the training very enjoyable, and I really like the concept of running ZAP against some (or better yet, all) of your applications on a continuous basis. In this blog post I will show you how you can automate ZAP with Python to scan your applications. Since we at Monokkel love Elasticsearch, I will also show how to store the scan results in an Elasticsearch instance, where you can query the resulting data. So let's get started!

Zed Attack Proxy - introduction

OWASP Zed Attack Proxy (ZAP) is a popular open source intercepting proxy. If you are new to ZAP I would recommend that you head over to their wiki to learn the basics. ZAP comes with a lot of features, like an active and a passive scanner, a spider, scripting support, a fuzzer and a lot more. I could probably write multiple blog posts describing all the features, but I will instead concentrate on two of them: the spider and the active scan functionality. I will also show how to automate these with Python. If you want to learn more about ZAP, check out their website. To follow along with my examples you do not need to download ZAP, since we are exploring its API and will be running ZAP in Docker.

Spidering

You can use ZAP to discover new URLs on a target web application. To start, the spider needs one or more seeds. A seed is a starting URL for the spidering. The spider extracts hyperlinks from the seed(s), follows them, and discovers new links from the newly visited pages. The pages that ZAP discovered are then displayed with a spider icon next to the URL. This feature is very useful when you have a large site that you cannot do manual discovery on. Just fire off the spider to find some interesting spots to start your security testing.

Active scan

Now that you have used the spider to find a lot of pages in the web app, you can run an active scan against it. The ZAP scanner is able to find a lot of different vulnerabilities and will probably catch many of the "low hanging" security vulnerabilities on your web page. It is very useful to run a tool like ZAP's active scanner at an early stage of development and just fix what it finds, instead of paying a team of penetration testers to find these vulnerabilities. Let them focus on the hard vulnerabilities that you do not have the time to find.

ZAP API

ZAP provides an API available in JSON, HTML and XML. After starting ZAP and proxying your browser through it, a web UI is available at http://zap or http://localhost:8090. The important thing to know about the API is that it is protected by an API key, so before we can start using the API we need to grab the API key from ZAP.

Hands on

So, let us get started!

Prerequisite

If you would like to follow along and test this yourself you will need the following installed on your machine:

  • Docker (with docker-compose)
  • curl
  • Python 2.7.X (I tested with version 2.7.12)
  • The OWASP ZAP Python module. Run pip install python-owasp-zap-v2.4 and you should be set.
  • The Elasticsearch Python client. Run pip install elasticsearch to install it.
  • A REST Client (I recommend Insomnia - https://insomnia.rest/)

So to get things running, we need a little setup. We need three things:
1. A running instance of Zed Attack Proxy
2. An Elasticsearch instance to store scan results in
3. A web application to scan. It would also be nice if we actually have a vulnerable web application such that we get some results from ZAP.

To fulfil requirements 1 to 3, I have created a small docker compose file to get you up and running fast if you want to follow along. For the vulnerable web application I have chosen the BodgeIt Store, a deliberately vulnerable web application. The Python code I will show has been tested using Python 2.7.12.

Setup

Ok, so let's fire things up. We start with the docker compose file:
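The original compose file is roughly like the following. This is a minimal sketch; the image tags (elasticsearch:2.4, psiinon/bodgeit, owasp/zap2docker-stable) and the ZAP startup flags are my assumptions, so adjust them to your environment. The two api.addrs config options tell ZAP to accept API calls from outside the container:

```yaml
version: '2'
services:
  elasticsearch:
    image: elasticsearch:2.4
    ports:
      - "9200:9200"
  bodgeit:
    image: psiinon/bodgeit
    container_name: bodgeit
    ports:
      - "8888:8080"
  zap:
    image: owasp/zap2docker-stable
    container_name: zap
    command: zap.sh -daemon -host 0.0.0.0 -port 8090 -config api.addrs.addr.name=.* -config api.addrs.addr.regex=true
    ports:
      - "8090:8090"
```

Note the container_name entries: they let us reach the containers by name later, both from docker exec and from inside the ZAP container.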

Download the above compose to docker-compose.yml and run:

docker-compose -f docker-compose.yml up  

Now grab some coffee while you wait for Docker to download and fire up the three containers for us: Elasticsearch, BodgeIt and ZAP.

When Docker finishes downloading and starting up, we should test that everything is ok. Elasticsearch is mapped to port 9200, so we should be able to reach it using curl:

curl localhost:9200  

and we should see something like this:
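Something along these lines (abbreviated; the exact name and version numbers depend on the image you run):

```json
{
  "name" : "...",
  "cluster_name" : "elasticsearch",
  "version" : {
    "number" : "2.4.0"
  },
  "tagline" : "You Know, for Search"
}
```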

Ok, we now know that our Elasticsearch instance is up and running. Let us create an index for our scan results with a type called scan. To do this I have created a mapping for our results. Start your favorite REST client and do a PUT request to localhost:9200/scans/ with the following body:
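The body is small, since the only property we need to map explicitly is the nested alerts field:

```json
{
  "mappings": {
    "scan": {
      "properties": {
        "alerts": {
          "type": "nested"
        }
      }
    }
  }
}
```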

If all is ok you should get the following response:
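For a successful index creation, Elasticsearch responds with:

```json
{
  "acknowledged": true
}
```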

To keep the mapping short, I have only created one property on the scan type, and that is a nested type named "alerts". Since a scan result can consist of more than one alert, I need a nested mapping on this field to be able to query the individual alerts in the results.

Let us check that the BodgeIt Store is up and running. Navigate your browser to http://localhost:8888/bodgeit/ and you should see the page for "The BodgeIt Store".

And last, let us check that ZAP is up and running. Since we mapped ZAP to port 8090, navigate your browser to http://localhost:8090 and you should see "Welcome to the OWASP Zed Attack Proxy (ZAP)".

Python scripting

Ok, now that we have a running instance of ZAP, Elasticsearch and a vulnerable web application, let us start with the fun part! I found something odd: ZAP does not create an API key on startup. In order to call ZAP from Python we need the API key, so we first have to make a request against ZAP to trigger the key creation. I just selected a random endpoint from http://localhost:8090/. So if you run:

curl http://localhost:8090/JSON/ajaxSpider/view/numberOfResults/?zapapiformat=JSON  

we get the response:
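Roughly this (ZAP's JSON API returns the count as a string):

```json
{
  "numberOfResults": "0"
}
```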

And by some magic ZAP will generate an API key. We can find it in the "/home/zap/.ZAP/config.xml" file in the ZAP docker container. Just run:

docker exec zap cat /home/zap/.ZAP/config.xml | grep key  

and you will see the key inside the <key> tag. Copy the key, because we need it in our Python script.

Now that we have the API key, we can start to do some coding. First we do our imports, then we read the input target from the command line and set the API key for ZAP:

Let us connect to ZAP, and make ZAP open the input_target url.
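A minimal version of this step, using the python-owasp-zap-v2.4 client (apikey and input_target come from the previous snippet):

```python
from zapv2 import ZAPv2

# Point the client at the ZAP proxy, for both http and https traffic
zap = ZAPv2(apikey=apikey,
            proxies={'http': 'http://localhost:8090',
                     'https': 'http://localhost:8090'})

# Make ZAP open the target url, and give it a little time to fetch the page
zap.urlopen(input_target)
time.sleep(2)
```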

Here we create the zap object with the location of our ZAP proxy, for both http and https. Then we open the url in ZAP and sleep for two seconds, to give ZAP some time to fetch the page. Now that we have visited the page, let us fire up the ZAP spider to discover some pages in the web application:
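Roughly like this (the progress printing is my own addition):

```python
# Start the spider; ZAP returns an id we can use to poll for progress
scanid = zap.spider.scan(input_target)

# Wait until the spider reports 100 percent completion
while int(zap.spider.status(scanid)) < 100:
    print('Spider progress: ' + zap.spider.status(scanid) + '%')
    time.sleep(2)
```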

In this snippet we start the spider and get the id for the scan. We then use this id to check the status of the spider; we do not want to start anything else against the web application before the spider discovery is done. When the spider is done, we fire away an active scan against the application:
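The active scan step could look like this (the poll interval is my choice, an active scan usually takes a while):

```python
# Start the active scan against everything the spider discovered
scanid = zap.ascan.scan(input_target)

# Poll the active scan status until it is done
while int(zap.ascan.status(scanid)) < 100:
    print('Scan progress: ' + zap.ascan.status(scanid) + '%')
    time.sleep(5)
```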

Same thing here as with the spider: we start the scan and get the scan id, which we use to check the status of the scan, and then wait for it to finish. When the scan is done, we can retrieve the results from ZAP and push them to our Elasticsearch instance:
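A sketch of this last step; the exact shape of the scanjson document is up to you, as long as the alerts end up in the nested field we mapped earlier:

```python
import datetime
from elasticsearch import Elasticsearch

# Build the document we want to index: scan id, date, url and the alerts
scanjson = {
    'scanid': scanid,
    'date': datetime.datetime.now().isoformat(),
    'url': input_target,
    'alerts': zap.core.alerts(baseurl=input_target),
}

# The default constructor talks to localhost:9200, where our instance runs
es = Elasticsearch()
es.index(index='scans', doc_type='scan', body=scanjson)
```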

This code creates our scanjson object and adds the scan id, date, url and the alerts from ZAP to it. We then create a default instance of Elasticsearch(), and since we run on localhost port 9200 we do not need to pass any host or port to the constructor. We then push the document to the scans index.

Save all the Python code snippets to scanner.py (or scroll down to the bottom, where you will find the whole script) and let us run it against the BodgeIt Store to check if we find any vulnerabilities and whether they are inserted into our Elasticsearch index. From the command line, type the following:

python scanner.py http://bodgeit:8080/bodgeit/  

Since we added a container_name of "bodgeit" in our docker compose file, the ZAP container will be able to reach the web application by this name. When the script has finished we can run a simple query to check how many potential vulnerabilities ZAP has found. Fire off a POST request to http://localhost:9200/scans/scan/_search using the following body:
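A query along these lines does the job (assuming the alert objects from ZAP carry their usual risk and confidence fields):

```json
{
  "size": 0,
  "aggs": {
    "alerts": {
      "nested": { "path": "alerts" },
      "aggs": {
        "risks": {
          "terms": { "field": "alerts.risk" }
        },
        "confidences": {
          "terms": { "field": "alerts.confidence" }
        },
        "high_risk": {
          "filter": { "term": { "alerts.risk": "high" } },
          "aggs": {
            "details": {
              "top_hits": {
                "_source": ["alerts.risk", "alerts.name", "alerts.url", "alerts.param", "alerts.description"]
              }
            }
          }
        }
      }
    }
  }
}
```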

This query creates a nested aggregation on the field "alerts", which holds the alert objects we created (remember the mapping). We then do a terms aggregation on the field risk, to see how many vulnerabilities we get at each risk level, and a terms aggregation on the field confidence, to check the distribution of confidence across the results. The last aggregation is a filter aggregation, where we only keep results with high risk. On the high risk results we do a top hits sub aggregation, extracting the risk, name, url, param and description of each vulnerability. The result of this query will look something like this:
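Abbreviated, with the counts from my run against BodgeIt (the elided parts hold the usual hits metadata and the full top hits documents):

```json
{
  "aggregations": {
    "alerts": {
      "doc_count": 311,
      "risks": {
        "buckets": [
          { "key": "low", "doc_count": 177 },
          { "key": "medium", "doc_count": 132 },
          { "key": "high", "doc_count": 2 }
        ]
      },
      "confidences": {
        "buckets": [
          { "key": "medium", "doc_count": 311 }
        ]
      },
      "high_risk": {
        "doc_count": 2
      }
    }
  }
}
```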

From the result we can see that ZAP has identified 2 high risk vulnerabilities, 132 medium and 177 low. We can also see that all 311 findings have medium confidence. The result also shows detailed information about the high risk vulnerabilities.

If you start doing this continuously on your applications you can do some really fun stuff. You will start to see trends over time or across your different applications. And if you put Kibana on top of your Elasticsearch instance, you also get some fancy graphs to play with.

If you have any comments or questions, feel free to contact me. I am always interested in talking security or Elasticsearch related stuff. My twitter account is @webhak or you can ping me at thomas [theAtSign] monokkel.io

Here is a full version of scanner.py:
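This is the snippets above stitched together; the API key placeholder is an assumption you must replace with your own:

```python
import sys
import time
import datetime

from zapv2 import ZAPv2
from elasticsearch import Elasticsearch

# The target to scan, e.g. http://bodgeit:8080/bodgeit/
input_target = sys.argv[1]

# Paste the key you copied from /home/zap/.ZAP/config.xml here
apikey = 'change-me-to-your-zap-api-key'

# Point the client at the ZAP proxy, for both http and https traffic
zap = ZAPv2(apikey=apikey,
            proxies={'http': 'http://localhost:8090',
                     'https': 'http://localhost:8090'})

# Make ZAP open the target url, and give it a little time to fetch the page
zap.urlopen(input_target)
time.sleep(2)

# Spider the application to discover pages
scanid = zap.spider.scan(input_target)
while int(zap.spider.status(scanid)) < 100:
    print('Spider progress: ' + zap.spider.status(scanid) + '%')
    time.sleep(2)

# Actively scan everything the spider discovered
scanid = zap.ascan.scan(input_target)
while int(zap.ascan.status(scanid)) < 100:
    print('Scan progress: ' + zap.ascan.status(scanid) + '%')
    time.sleep(5)

# Collect the alerts and push them to the scans index in Elasticsearch
scanjson = {
    'scanid': scanid,
    'date': datetime.datetime.now().isoformat(),
    'url': input_target,
    'alerts': zap.core.alerts(baseurl=input_target),
}

es = Elasticsearch()
es.index(index='scans', doc_type='scan', body=scanjson)
```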