Category Archives: apachebench

ApacheBench 401 error with Keystone OpenStack token validation

So I am trying to run the simplest, most basic one-line ApacheBench test, to get a better understanding of how to use the tool. I want to create a token, delete it, and then validate it, in order to benchmark how long token validation takes in Keystone.

I created a file containing the admin credentials (project, password, domain, and so on). It is essentially the default information that devstack sets up.

Here is the line I use to create a token and extract the token information (my setup is an all-in-one node using devstack and Fernet tokens):

http -v POST $IP:35357/v3/auth/tokens @auth_request_admin.json | grep "X-Subject-Token:"

This gives me back something that looks like this:

X-Subject-Token: gRIFJWOI3LFlkjsdfo20F83G2gG5VnsRT7_fI0M135n...(you get the idea)

I then copy the token and paste it into the headers of the delete request, like so:

http -v DELETE $IP:35357/v3/auth/tokens X-Subject-Token:gRIFJWOI3LFlkjsdfo20F83G... X-Auth-Token:gRIFJWOI3LFlkjsdfo20F83G...

This successfully deletes the token and creates an entry in the revocation list (a list of deleted tokens).

Here is where I run into trouble. When I then run an ab one-liner to benchmark validation, I get back 401 Unauthorized errors, and I don't understand why.

ab -r -c 1 -v 2 -n 1 -T 'application/json' -H "X-Auth-Token: gRIFJWOI3LFlkjsdfo20F83G..." -H "X-Subject-Token: gRIFJWOI3LFlkjsdfo20F83G..." http://$IP:35357/v3/auth/tokens

The output includes errors that look like this:

{"error": {"message": "The request you have made requires authentication.", "code": 401, "title": "Unauthorized"}}
WARNING: Response code not 2xx (401)
LOG: header received:
HTTP/1.1 401 Unauthorized
Date: Thu, 11 Aug 2016 15:39:59 GMT
Server: Apache/2.4.7 (Ubuntu)
Vary: X-Auth-Token
x-openstack-request-id: req-27ef7b6d-9207-4ac7-ae70-106621106e52
WWW-Authenticate: Keystone uri=""
Content-Length: 114
Connection: close
Content-Type: application/json

{"error": {"message": "The request you have made requires authentication.", "code": 401, "title": "Unauthorized"}}
WARNING: Response code not 2xx (401)

I can't figure out why, although I have a valid token to run the test with, I get a 401 Unauthorized error, even when the token I am using is the admin token. Is there something I am missing?
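
One thing worth noticing in the sequence above: the token is deleted before it is validated, and validating a revoked token is exactly what produces a 401. A toy in-memory model (purely illustrative, not Keystone's actual implementation; the class and method names are made up) shows the same lifecycle:

```python
# Toy model of the token lifecycle; names are illustrative, not Keystone's API.
class TokenStore(object):
    def __init__(self):
        self.tokens = set()
        self.revoked = set()

    def issue(self, token):
        # POST /v3/auth/tokens returns a new X-Subject-Token
        self.tokens.add(token)

    def revoke(self, token):
        # DELETE /v3/auth/tokens adds the token to the revocation list
        self.tokens.discard(token)
        self.revoked.add(token)

    def validate(self, token):
        # GET /v3/auth/tokens: a revoked or unknown token fails auth
        if token in self.tokens and token not in self.revoked:
            return 200
        return 401

store = TokenStore()
store.issue("gRIFJW...")
print(store.validate("gRIFJW..."))   # 200 while the token is live
store.revoke("gRIFJW...")
print(store.validate("gRIFJW..."))   # 401 once revoked, as in the ab run above
```

If the goal is to benchmark validation of a live token, the revocation step would need to come after the ab run rather than before it.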

ApacheBench and Windows Authentication

I'm trying to use the ApacheBench tool in a Windows-authenticated Web API environment. I am not able to authenticate and get 401 errors. Is it possible to pass Windows credentials with this tool? Including a username and password, as below, doesn't work. I really like the simplicity of this tool and will be disappointed if I can't use it in this environment. I'm a Microsoft guy and have no experience in this development realm.

ab -A username:password -c 50 -n 200 -v 2 http://localhost:17325/api/test
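
For context: ab's -A flag sends HTTP Basic authentication only, a single static base64-encoded Authorization header. Windows authentication (NTLM/Negotiate) requires a multi-step challenge/response handshake that ab does not implement, which is why the server answers 401. A small sketch of what -A actually puts on the wire:

```python
import base64

def basic_auth_header(username, password):
    # ab -A user:pass produces exactly this one static header;
    # NTLM, by contrast, needs a per-connection challenge/response
    # exchange that ab has no support for.
    credentials = "%s:%s" % (username, password)
    return "Authorization: Basic " + base64.b64encode(credentials.encode()).decode()

print(basic_auth_header("username", "password"))
# Authorization: Basic dXNlcm5hbWU6cGFzc3dvcmQ=
```

A server configured for Windows auth only will reject that header outright; enabling Basic auth on the endpoint (or fronting it with something that does) is the usual way to make ab usable in such an environment.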

Apache and Node intermittently hanging under load from Apache Bench

For some quick background, I'm trying to test Apache/PHP against Node to see which can serve more requests faster. I'm using ApacheBench to send lots of concurrent requests. I'm just trying to get a 'hello world'-level test going for now.

So the problem is that I'm trying to get consistent results from tests on our server, and during any given test run, with either Apache or Node, the web server will intermittently hang (or slow down dramatically) mid-test and then magically come back to life. In other words, if I tell ApacheBench to make 10,000 requests, 5,000 will go through very quickly (in about a second), then Apache (or Node) will hang (or slow down) for about five to fifteen seconds, then return to full steam and complete all remaining requests in another second or so. Sometimes, with no rhyme or reason, a complete run of 10,000 requests will go straight through and finish in about two seconds with no hang.

At first I suspected Node or Apache settings, but no longer. Now I suspect either something in the server (machine) configuration or possibly the network administration. Here's why:

I was running an Apache test and it hung. While it was hanging, I tried a Node test and the Node test was blocked completely while the Apache test was hanging. I confirmed this behavior and it was consistent in multiple attempts.

Similarly, Node tests randomly hang as well, and while they hang Apache tests are blocked.

However, network traffic in general is not the culprit; inbound and outbound traffic are not affected during these server hangs, verified in part by my uninterrupted remote session into the machine.

Worth noting, Apache and Node are running on separate ports (Apache on 80, Node on 3000).

Any ideas would be greatly appreciated; I feel the tests I'm running at this point are generating useless data, because I can't trust the server as long as it is behaving like this.

In the meantime I'm suspending testing and I'll try to duplicate these issues on a virtual machine at home and see if I get similar behavior.

Apache Bench – a lot of failures due to length

On a local domain, we tried to test an Apache server using ApacheBench. Running 1000 requests with a concurrency of 50, the results show 920 failed requests, all of them due to length. What does this mean, and how can we fix it? (Testing environment: MacBook Air 2014, OS X El Capitan 10.11.3, MAMP Pro 3.4, PHP 5.6)

*****:~ *****$ ab -n 1000 -c 50 http://test.local/
This is ApacheBench, Version 2.3 <$Revision: 1663405 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd,
Licensed to The Apache Software Foundation,

Benchmarking test.local (be patient)
Completed 100 requests
Completed 200 requests
Completed 300 requests
Completed 400 requests
Completed 500 requests
Completed 600 requests
Completed 700 requests
Completed 800 requests
Completed 900 requests
Completed 1000 requests
Finished 1000 requests

Server Software:        Apache
Server Hostname:        test.local
Server Port:            80

Document Path:          /
Document Length:        251 bytes

Concurrency Level:      50
Time taken for tests:   1.474 seconds
Complete requests:      1000
Failed requests:        920
   (Connect: 0, Receive: 0, Length: 920, Exceptions: 0)
Total transferred:      943551 bytes
HTML transferred:       252551 bytes
Requests per second:    678.20 [#/sec] (mean)
Time per request:       73.724 [ms] (mean)
Time per request:       1.474 [ms] (mean, across all concurrent requests)
Transfer rate:          624.92 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    1   1.7      0      12
Processing:    21   70  25.4     63     266
Waiting:       21   69  25.2     63     263
Total:         23   70  25.3     64     266

Percentage of the requests served within a certain time (ms)
  50%     64
  66%     66
  75%     69
  80%     73
  90%     88
  95%    104
  98%    188
  99%    195
 100%    266 (longest request)
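
As background: ab takes the body length of the first response as the reference Document Length and counts every subsequent response whose body length differs as a Length failure, even when the status is 200. A dynamic PHP page (session IDs, CSRF tokens, timestamps in the markup) produces slightly different bodies on each request, so almost every response gets flagged. A sketch of the check ab effectively performs:

```python
def count_length_failures(body_lengths):
    # ab treats the first response's length as the expected document
    # length and flags any later response whose length differs.
    expected = body_lengths[0]
    return sum(1 for n in body_lengths[1:] if n != expected)

# e.g. a dynamic page whose body length varies between requests:
lengths = [251, 251, 252, 253, 251, 252]
print(count_length_failures(lengths))  # 3
```

If the page is intentionally dynamic, the failures are harmless; newer ab builds also have a -l option that accepts a variable document length instead of reporting these as errors.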

Apache Bench Load Testing for REST Web Service

I am load testing my REST web service using ApacheBench.

This works for me now:

ab -n 300 -c 10 "http://server:9535/rest/1_0/query/X?accountId=A&startDate=2014-01-01&endDate=2016-01-01"

but I want to parameterize the values of accountId, startDate, and endDate. Rather than hardcoding these values in the URL, is it possible to specify a file containing lots of permutations of input values?
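
As far as I know, ab itself has no option to read query parameters from a file, but a small wrapper can generate one ab invocation per row of a values file. The file name and column layout below are assumptions for illustration:

```python
import csv
import io

# params.csv is assumed to hold one "accountId,startDate,endDate" per row.
BASE = "http://server:9535/rest/1_0/query/X"

def build_commands(csv_text, n=300, c=10):
    commands = []
    for account_id, start, end in csv.reader(io.StringIO(csv_text)):
        url = "%s?accountId=%s&startDate=%s&endDate=%s" % (BASE, account_id, start, end)
        # Quote the URL so the shell does not treat '&' as a background operator.
        commands.append('ab -n %d -c %d "%s"' % (n, c, url))
    return commands

sample = "A,2014-01-01,2016-01-01\nB,2015-06-01,2016-06-01\n"
for cmd in build_commands(sample):
    print(cmd)
```

Each generated line can then be run in sequence from a shell script. Note that this benchmarks each parameter set separately; ab cannot rotate URLs within a single run.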

Apache benchmark HTTP GET location

I am using ApacheBench (ab) for stress testing my GlassFish server. The static content (which is to be downloaded) is a 50 MB binary file. I have been firing thousands of requests using ab. My questions are:

  • Does the file actually get downloaded on the client from which the ab command is triggered?
  • If yes, at what location do the files get physically stored (/tmp)?
  • I am worried that the client might soon run out of hard-disk space because of these requests.

I am running ab on CentOS.

Benchmarking a Python HTTP server using the Apache benchmarking tool

So I am trying to benchmark this simple HTTP server written in Python, using non-blocking sockets and select:

import socket
from select import select

class TCPServer:

    def __init__(self, host, port): = host
        self.port = port
        self.write_socks = []
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        print 'Socket Created'

        self.sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            self.sock.bind((, self.port))
        except socket.error as msg:
            print 'Bind failed. Error Code: ' + str(msg[0]) + ' Message: ' + msg[1]

        print 'Socket Bind Complete'

        self.sock.listen(10)  # backlog of pending connections for accept()
        self.connections = [self.sock]

        print 'Socket now listening'

    def write_to_client(self, conn):
        # Content-Length matches the 7-byte body "Pong!\r\n"
        conn.sendall("HTTP/1.0 200 OK\r\nContent-Length: 7\r\n\r\nPong!\r\n")
        print "closing connection"

    def handle_client(self, conn):
        data = conn.recv(40)
        if not data:
            # Empty read: the client closed its end of the connection
            print "Closing connection"
            if conn in self.write_socks:
            # Request received: mark the socket as ready for a response
            if conn not in self.write_socks:

    def shut_down(self):
        print "shutting down server, closing all connections"
        for s in self.connections:
        print "all connections closed"

    def run_forever(self):
        while 1:
                read_sockets, write_sockets, error_sockets = select(
                    self.connections, self.write_socks, self.connections, 2.0)

                for sock in read_sockets:
                    if sock == self.sock:
                        sockfd, addr = self.sock.accept()
                        print "Connection accepted from client: " + addr[0] + " " + str(addr[1])

                for sock in write_sockets:

                for sock in error_sockets:
                    print 'handling exceptional condition for', sock.getpeername()
                    if sock == self.sock:
                        print "server socket failed due to an error, shutting down server"
                        if sock in self.connections:
                        if sock in self.write_socks:
            except socket.error as msg:
                print msg

if __name__ == '__main__':
    server = TCPServer('', 8888)

When I run the Apache benchmark tool like this:

ab -n 5000 -c 1 -v 2 -q

it seems to run fine most of the time, but every now and then it will get stuck after a few hundred or a few thousand requests and then take a long time to complete, and in the stats I see that a few requests took far too much time. For example:

Server Software:        
Server Hostname:
Server Port:            8888

Document Path:          /
Document Length:        7 bytes

Concurrency Level:      1
Time taken for tests:   4.838 seconds
Complete requests:      5000
Failed requests:        0
Total transferred:      225000 bytes
HTML transferred:       35000 bytes
Requests per second:    1033.53 [#/sec] (mean)
Time per request:       0.968 [ms] (mean)
Time per request:       0.968 [ms] (mean, across all concurrent requests)
Transfer rate:          45.42 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    1  11.6      1     818
Processing:     0    0   0.1      0       9
Waiting:        0    0   0.1      0       9
Total:          0    1  11.6      1     818

Percentage of the requests served within a certain time (ms)
  50%      1
  66%      1
  75%      1
  80%      1
  90%      1
  95%      1
  98%      1
  99%      1
 100%    818 (longest request)

See how 99% of the requests took just 1 ms, but under 1% took 818 ms; the breakdown shows most of that time was spent in the connection phase. If I increase the number of requests or concurrent threads, the problem seems to get worse, with 1% still taking far more time than the rest; I can actually see it stop and then start working again after some seconds. I also tried other tools, like wrk, with similar results.

Any ideas about what could be wrong with my code that is causing this arbitrary behaviour for a small number of requests?

How can I do an HTTP benchmark (like ab or siege) to a host with multiple IPs?

I'm trying to test my web servers, and I have an A record with multiple IPs (each belonging to a different server, as a form of load balancing). When I connect with my browser it randomly connects to each server, but ApacheBench and Siege only use the first IP in the pool, which skews my tests.

I really need to force the benchmark to use all the IPs.
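
Since ab resolves the hostname once and sticks with the first address, one workaround is to resolve every A record yourself and run a separate ab against each IP, passing the original hostname in a Host header so name-based virtual hosting still works. A sketch of generating those per-backend commands (the hostname here is just an example):

```python
import socket

def ab_commands_per_ip(hostname, port=80, n=1000, c=50):
    # Collect every distinct IPv4 address behind the A record.
    infos = socket.getaddrinfo(hostname, port, socket.AF_INET, socket.SOCK_STREAM)
    ips = sorted({info[4][0] for info in infos})
    # One ab run per backend; -H "Host: ..." keeps name-based vhosts working.
    return ['ab -n %d -c %d -H "Host: %s" http://%s:%d/' % (n, c, hostname, ip, port)
            for ip in ips]

for cmd in ab_commands_per_ip("localhost"):
    print(cmd)
```

This benchmarks each backend in isolation rather than interleaving them, which is often more informative anyway; to approximate round-robin load, the generated commands can be run in parallel from a shell.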