# Benchmarks

#### 1 Introduction

This page summarizes the benchmark tests that were run against the game.
It is very difficult (arguably impossible) to fully automate these tests: a client must connect and perform some actions, and the server has to be deployed on Heroku.
Deploying to the Heroku server takes minutes (because of compilation) and must be repeated for every code change. Moreover, it is the production server, so measuring times is difficult unless the measurement is implemented as a hidden, toggleable feature in the server code (so that time measurement, or running tests from the client, can only be enabled when you are an admin).
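A hidden, toggleable measurement feature could look like the sketch below. This is only an illustration, not the game's actual code: `BENCH_ENABLED` and `withTiming` are hypothetical names, and the idea is simply that timing is controlled by a flag (here an environment variable), so it can stay dormant on the production server.

```javascript
// Sketch: time measurement that can be toggled without changing code paths.
// BENCH_ENABLED and withTiming are hypothetical names for illustration only.
const BENCH_ENABLED = process.env.BENCH_ENABLED === '1';

function withTiming(label, fn) {
  if (!BENCH_ENABLED) return fn(); // measurement disabled: just run the work
  const start = process.hrtime.bigint();
  const result = fn();
  const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;
  console.log(`${label}: ${elapsedMs.toFixed(1)} ms`);
  return result;
}

// Usage: wrap the expensive step, e.g. serializing items for a new client.
const payload = withTiming('serialize items', () =>
  JSON.stringify(Array.from({ length: 1000 }, (_, i) => ({ id: i }))));
```

An admin-only toggle (checked per request instead of per process) would work the same way; the flag check is the only part that changes.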

#### 2 What is benchmark?

Given an application or algorithm, a benchmark is a test in which the execution time of the algorithm is measured. The input data is then altered and the time is measured again.
Usually the point is to discover how the system behaves with more input data: how much slower does it get? How much more memory does it consume?
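The loop described above can be sketched in a few lines: run the same function on inputs of growing size and record each runtime. The sorting target here is only an example workload.

```javascript
// Minimal benchmark sketch: run one function on growing inputs and
// record how the runtime scales with input size.
function benchmark(fn, sizes) {
  return sizes.map((n) => {
    const input = Array.from({ length: n }, () => Math.random());
    const start = process.hrtime.bigint();
    fn(input);
    const ms = Number(process.hrtime.bigint() - start) / 1e6;
    return { n, ms };
  });
}

// Example workload: sorting an array of n random numbers.
const results = benchmark((arr) => [...arr].sort((a, b) => a - b),
                          [1000, 10000, 100000]);
console.table(results);
```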

##### 3.1 Items count

For this test, one item is chosen and copied. How much data is sent over the network depends on the item's stats (because of the default-value optimization), so these numbers only give a rough idea.
The data was measured on game version dev-1.2.1, on localhost (transfer between two PCs over Wi-Fi).

| Items | Time | Note |
| --- | --- | --- |
| 0 | 383 ms | |
| 1 000 | 341 ms | |
| 3 000 | 455 ms | |
| 4 000 | 422 ms | |
| 5 000 | 357 ms | |
| 6 000 | 420 ms | connection crashes and needs to be refreshed |
| 6 800 | 391 ms | |
| 8 000 | 427 ms | |
| 50 000 | 963 ms | with printing |
| 100 000 | 3 s 292 ms | |
| 500 000 | 5 s 79 ms | |
| 1 000 000 | 9 s 747 ms | |
| 1 500 000 | 14 s 955 ms | |
| 5 000 000 | n/a | server takes forever to serialize the data |
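The serialization cost behind these numbers can be reproduced with a small sketch: time how long the server would take to serialize N copies of one item before emitting them to a newly connected client. The item shape below is illustrative only, not the game's real item schema.

```javascript
// Sketch of the item-count measurement: serialize N copies of one item
// and record the elapsed time and payload size. Item shape is made up.
function measureSerialize(n) {
  const item = { id: 0, name: 'sword', attack: 3, durability: 10 };
  const items = Array.from({ length: n }, (_, i) => ({ ...item, id: i }));
  const start = process.hrtime.bigint();
  const payload = JSON.stringify(items);
  const ms = Number(process.hrtime.bigint() - start) / 1e6;
  return { n, ms, bytes: payload.length };
}

for (const n of [1000, 10000, 100000]) {
  const { ms, bytes } = measureSerialize(n);
  console.log(`${n} items: ${ms.toFixed(1)} ms, ${bytes} bytes`);
}
```

Note that this captures only serialization time; the measurements in the table also include network transfer and client-side processing.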

##### 3.2 Conclusion

As the number of items sent increases, times grow longer, and the system (Socket.IO) even crashes or times out. Considering that the test was performed only on localhost, the situation on a real deployment server would be worse: longer loading times and more crashes/timeouts.

Therefore, sending all items/monsters to every newly connected client is not a feasible solution for large amounts of data.