Category Archives: benchmarking

Permission error occurred while executing the yardstick-ignite framework

This console output is printed multiple times while running the yardstick-ignite framework examples. Can you let me know how to run the default yardstick example, in case I made a mistake while running it?

<16:56:46><yardstick> Starting driver config '...-cn query -sn IgniteNode -ds Ignite-sql-query-put-1-backup' on localhost
Permission denied (publickey,password).
Permission denied (publickey,password).
Permission denied (publickey,password).

Run steps:
1) Clone the GitHub repository (git clone https://github.com/yardstick-benchmarks/yardstick-ignite)
2) Run mvn package to compile the project and unpack the scripts
3) Change the IPs of the driver and server hosts in benchmark.properties
4) Run ./benchmark-run-all.sh
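The repeated "Permission denied (publickey,password)" lines suggest the problem is not yardstick itself: yardstick starts the driver and server processes over SSH, so every host listed in benchmark.properties (including localhost) must accept a non-interactive SSH login. A minimal sketch of setting that up; the user and host names below are placeholders, not values from the question:

```shell
# Generate a key pair if you do not already have one (no passphrase).
mkdir -p ~/.ssh
test -f ~/.ssh/id_ed25519 || ssh-keygen -t ed25519 -N '' -f ~/.ssh/id_ed25519

# Copy the public key to every driver/server host from benchmark.properties,
# then verify that a password-less login works (placeholder host):
# ssh-copy-id user@10.0.0.1
# ssh user@10.0.0.1 true   # should return without prompting for a password
```

Note that even a localhost-only run goes through SSH, so the same check against localhost (or appending the key to your own ~/.ssh/authorized_keys) may be what is missing.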

Load-testing an application with a Java back-end and Angular front-end

I want to load test my application, which has a Java back-end and web pages served by Angular. I read about ab and how it can be used to load test an application, and put together a sample ab command like:

ab -A YWRtaW5Abth2aXMuY29tOmFkbWlu -n 1000 -c 100 http://example.com/

From the documentation I gathered that:

-A takes username:password for HTTP Basic Authentication; ab base64-encodes it on the wire itself, so you pass the plain string, not a pre-encoded one.
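Whatever you pass to -A, the Authorization header ab ultimately sends is "Basic " followed by the base64 encoding of username:password. You can reproduce the value in a shell to sanity-check what goes on the wire (the credentials here are made up):

```shell
# Basic auth header value for a made-up user; ab builds the same string
# internally from the username:password you give it.
CRED=$(printf 'admin@example.com:admin' | base64)
echo "Authorization: Basic $CRED"
```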

-c: ("Concurrency"). Indicates how many clients (people/users) will be hitting the site at the same time. While ab runs, there will be -c clients hitting the site. This is what actually decides the amount of stress your site will suffer during the benchmark.

-n: indicates how many requests are going to be made in total. This just decides the length of the benchmark.

My doubts

  1. How does ab apply the load: does it only exercise the front-end, or does it also load my Java server, since it performs page loads? (I am pretty confused here.)

  2. Do I need to supply the server's login credentials in the ab request? Without credentials, how does it log in for each user and then perform multiple page loads during the test?

  3. If I set -c to 1000, how do I supply multiple encoded username:password strings?

  4. Also, when I do a trial run with and without the auth credentials, the tests run successfully either way and the results (with respect to the time taken) are similar.

Why is node.js so much slower than apache 2.4

A few years ago I tested node.js vs. apache 2. The results were impressive: node.js was really fast, especially at high concurrency.

Yesterday I wanted to demonstrate that to someone and... ouch, apache 2.4 was much faster.

The setup:

Node.js (node 6.2.2, plain http module)

const http = require('http');

const hostname = '127.0.0.1';
const port = 3000;

const server = http.createServer((req, res) => {
  res.statusCode = 200;
  res.setHeader('Content-Type', 'text/plain');
  res.end('Hello World\n');
});

server.listen(port, hostname, () => {
  console.log(`Server running at http://${hostname}:${port}/`);
});

Apache 2.4 (serving a php file)

<?php
  $foo = "Hello";
  $bar = "World";
  echo "$foo $bar";
?>

I launched apache on port 80, then launched the node.js app on port 3000, and tested both with ApacheBench (ab).

ab -r -n 10000 -c 10 http://127.0.0.1/

Results:

Server Software:        Apache/2.4.18
Server Hostname:        127.0.0.1
Server Port:            80

Document Path:          /
Document Length:        11 bytes

Concurrency Level:      10
Time taken for tests:   4.439 seconds
Complete requests:      10000
Failed requests:        0
Total transferred:      1980000 bytes
HTML transferred:       110000 bytes
Requests per second:    2252.97 [#/sec] (mean)
Time per request:       4.439 [ms] (mean)
Time per request:       0.444 [ms] (mean, across all concurrent requests)
Transfer rate:          435.63 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    1   1.3      0      12
Processing:     1    3   1.8      3      88
Waiting:        0    3   1.5      3      38
Total:          1    4   1.8      4      91

Percentage of the requests served within a certain time (ms)
  50%      4
  66%      4
  75%      5
  80%      5
  90%      6
  95%      7
  98%      9
  99%     10
 100%     91 (longest request)
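A note on reading the report above: the two "Time per request" lines are related by the concurrency level. 4.439 ms is the mean latency a single simulated client sees; dividing by the 10 concurrent clients gives the 0.444 ms mean across all requests, which is the same as total time over total requests. A quick check with the values taken from the report:

```shell
# mean per-client latency / concurrency = mean across all concurrent requests
awk 'BEGIN { printf "%.3f ms\n", 4.439 / 10 }'
# cross-check: total time / total requests gives the same figure
awk 'BEGIN { printf "%.4f ms\n", 4.439 / 10000 * 1000 }'
```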

Node.js

ab -r -n 10000 -c 10 http://127.0.0.1:3000/

Results:

Server Software:
Server Hostname:        127.0.0.1
Server Port:            3000

Document Path:          /
Document Length:        19 bytes

Concurrency Level:      10
Time taken for tests:   8.513 seconds
Complete requests:      10000
Failed requests:        0
Non-2xx responses:      10000
Total transferred:      4020000 bytes
HTML transferred:       190000 bytes
Requests per second:    1174.64 [#/sec] (mean)
Time per request:       8.513 [ms] (mean)
Time per request:       0.851 [ms] (mean, across all concurrent requests)
Transfer rate:          461.14 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.3      0       7
Processing:     1    8   4.4      8      69
Waiting:        0    8   4.3      7      69
Total:          2    8   4.4      8      69

Percentage of the requests served within a certain time (ms)
  50%      8
  66%      9
  75%     10
  80%     10
  90%     12
  95%     15
  98%     20
  99%     23
 100%     69 (longest request)

The same happens if I test with n=1000, c=100 ... or higher: Apache is always about twice as fast.

Has something changed? Did they speed up apache 2.4 massively, or did node.js get old and slow?

I clearly remember node.js being faster as soon as concurrency rose above 5 or 10...

Am I wrong? Any comments appreciated.

Kind regards Martin

UPDATE

I found this article on the web: http://zgadzaj.com/benchmarking-nodejs-basic-performance-tests-against-apache-php

I cannot reproduce those results. Apache is faster when I try the same settings.

Siege [error] socket: connection timed out

I am developing a web application firewall and benchmarking it to see the performance degradation when the firewall is enabled.
I am using siege -c3 XXX.XXX.XXX.XXX, but after about 400 requests I get this error:

[error] socket: -1311799552 connection timed out.: Connection timed out
[error] socket: -1303406848 connection timed out.: Connection timed out
[error] socket: -1295014144 connection timed out.: Connection timed out

Note that I use only 3 concurrent connections, so I am wondering why the socket is timing out. I am using a default apache configuration and have not installed mod_evasive explicitly. I have no idea what is causing this.
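One knob worth checking before blaming the concurrency level: siege reports this error when its socket timeout expires, and a firewall that tarpits or rate-limits suspicious clients can easily push responses past it. The timeout can be raised in siege's resource file (~/.siege/siege.conf or ~/.siegerc, depending on the version; the value below is just an example, not a recommendation):

```
# give slow (possibly firewalled) responses more time before declaring a timeout
timeout = 60
```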

Laravel lag spikes

When I use ApacheBench against my Laravel app, I get serious lag on random requests. Sometimes an error occurs telling me the script took too long to run.

It comes from this file:

Vendor\Symfony\Component\Finder\Iterator\DateRangeFilterIterator.php

In this function:

public function accept()
{
    $fileinfo = $this->current();

    if (!file_exists($fileinfo->getRealPath())) {
        return false;
    }

    $filedate = $fileinfo->getMTime();
    foreach ($this->comparators as $compare) {
        if (!$compare->test($filedate)) {
            return false;
        }
    }

    return true;
}

Here is the report

And the report when I set the accept function to return true;

If I replace the function body with just return true;, the lag stops, but this function presumably serves some purpose, no? I hope you have a solution, thanks.
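For what it's worth, one common way to hit DateRangeFilterIterator on every request in older Laravel versions is file-based session garbage collection: Symfony Finder date-filters every file in the session directory, so tens of thousands of stale session files make accept() (and its per-file getMTime() call) dominate random requests. A sketch of checking for and clearing old files, assuming the file session driver and a standard storage layout (your path may differ; Laravel 4 used app/storage/sessions):

```shell
# Count, then remove, session files older than two days (path assumed).
mkdir -p storage/framework/sessions   # no-op on a real app; keeps the sketch runnable
find storage/framework/sessions -type f -mtime +2 | wc -l
find storage/framework/sessions -type f -mtime +2 -delete
```

If that turns out to be the cause, tuning the session garbage-collection lottery or switching the session driver (e.g. to a database or cache) avoids the sweep entirely.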

Don’t know what my apache benchmark results indicate?

Noob alert! I'm following this tutorial: https://easyengine.io/tutorials/nginx/block-wp-login-php-bruteforce-attack/

I'm up to the part where I'm supposed to simulate an attack, so I ran the following command: ab -n 100 -c 10 http://example.com/wp-login.php
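For context, the protection that tutorial sets up is an nginx limit_req rule on wp-login.php, something along these lines (the zone name, rate, and burst below are assumptions for illustration, not copied from the tutorial):

```nginx
# throttle hits to the login page per client IP
limit_req_zone $binary_remote_addr zone=wplogin:10m rate=1r/s;

server {
    location = /wp-login.php {
        limit_req zone=wplogin burst=2 nodelay;
        # ...normal fastcgi_pass / proxy_pass config here...
    }
}
```

With a rule like this in place, most of the 100 benchmark requests should be rejected with an error status, so a high Non-2xx count in the ab output is a sign the block is doing its job.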

And these were my results:

Server Software:        nginx
Server Hostname:        mydomainwasputhere.com
Server Port:            80

Document Path:          /wp-login.php
Document Length:        162 bytes

Concurrency Level:      10
Time taken for tests:   0.110 seconds
Complete requests:      100
Failed requests:        2
   (Connect: 0, Receive: 0, Length: 2, Exceptions: 0)
Non-2xx responses:      98
Total transferred:      38780 bytes
HTML transferred:       21744 bytes
Requests per second:    911.61 [#/sec] (mean)
Time per request:       10.970 [ms] (mean)
Time per request:       1.097 [ms] (mean, across all concurrent requests)
Transfer rate:          345.24 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    1   1.8      1       7
Processing:     0    3  14.7      1     108
Waiting:        0    3  14.4      0     103
Total:          1    4  15.2      1     109

Percentage of the requests served within a certain time (ms)
  50%      1
  66%      2
  75%      2
  80%      2
  90%      7
  95%      7
  98%    109
  99%    109
 100%    109 (longest request)

I don't know whether those are good results or not. If they're not, what should I do?