rousan / rust-web-frameworks-benchmark

A hello world benchmark for the available Rust Web Frameworks: hyper vs gotham vs actix-web vs warp vs rocket

License: MIT License
Hello, first of all thank you for this benchmark comparison.
I have a question regarding the performance I got on my machine:
- I ran the Actix server in release mode
- I'm using the same wrk command
- My machine is a MacBook Pro, 2.3 GHz 8-core Intel Core i9, 16 GB DDR4
My benchmark shows 130k req/s while yours shows 562k req/s.
I'm wondering what could cause such a gap. Which OS are you using?
Thanks
I tried running the benchmark against the master branch of https://github.com/SergioBenitez/Rocket (v5) to see if there is any speed improvement, since it's already async.
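Pointing Cargo at the master branch can be done with a git dependency; a minimal sketch of the Cargo.toml change (branch name assumed to be master):

```toml
[dependencies]
# pull Rocket straight from the repository's master branch (pre-release v0.5)
rocket = { git = "https://github.com/SergioBenitez/Rocket", branch = "master" }
```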
I ran the benchmark against warp first, to see my machine's capability.
❯ cd warp
❯ cargo build --release
❯ ./target/release/warp
❯ wrk --latency -t4 -c200 -d8s http://127.0.0.1:3030
Running 8s test @ http://127.0.0.1:3030
4 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 817.75us 1.11ms 17.93ms 93.77%
Req/Sec 72.54k 7.78k 95.50k 68.12%
Latency Distribution
50% 519.00us
75% 844.00us
90% 1.41ms
99% 6.21ms
2314469 requests in 8.09s, 180.99MB read
Non-2xx or 3xx responses: 2314469
Requests/sec: 286126.89
Transfer/sec: 22.38MB
❯ wrk --latency -t4 -c200 -d8s http://127.0.0.1:3030
Running 8s test @ http://127.0.0.1:3030
4 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 0.86ms 1.03ms 16.82ms 93.42%
Req/Sec 65.72k 6.99k 83.28k 63.75%
Latency Distribution
50% 580.00us
75% 0.93ms
90% 1.48ms
99% 5.55ms
2097890 requests in 8.09s, 164.06MB read
Non-2xx or 3xx responses: 2097890
Requests/sec: 259381.43
Transfer/sec: 20.28MB
❯ wrk --latency -t4 -c200 -d8s http://127.0.0.1:3030
Running 8s test @ http://127.0.0.1:3030
4 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 0.93ms 1.10ms 17.71ms 93.35%
Req/Sec 60.40k 7.81k 82.04k 65.00%
Latency Distribution
50% 632.00us
75% 1.01ms
90% 1.62ms
99% 6.10ms
1926600 requests in 8.09s, 150.66MB read
Non-2xx or 3xx responses: 1926600
Requests/sec: 238068.17
Transfer/sec: 18.62MB
❯ cd rocket
❯ cargo +nightly build --release
❯ ./target/release/rocket
❯ wrk --latency -t4 -c200 -d8s http://127.0.0.1:8000
Running 8s test @ http://127.0.0.1:8000
4 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 786.81us 445.55us 14.41ms 70.86%
Req/Sec 18.22k 734.43 22.02k 78.75%
Latency Distribution
50% 799.00us
75% 1.09ms
90% 1.28ms
99% 1.96ms
580046 requests in 8.05s, 80.76MB read
Socket errors: connect 0, read 580030, write 0, timeout 0
Requests/sec: 72045.45
Transfer/sec: 10.03MB
❯ wrk --latency -t4 -c200 -d8s http://127.0.0.1:8000
Running 8s test @ http://127.0.0.1:8000
4 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 0.92ms 3.56ms 205.67ms 99.80%
Req/Sec 17.09k 0.91k 19.49k 68.44%
Latency Distribution
50% 809.00us
75% 1.16ms
90% 1.38ms
99% 2.08ms
544051 requests in 8.04s, 75.75MB read
Socket errors: connect 0, read 544036, write 0, timeout 0
Requests/sec: 67683.32
Transfer/sec: 9.42MB
❯ wrk --latency -t4 -c200 -d8s http://127.0.0.1:8000
Running 8s test @ http://127.0.0.1:8000
4 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 0.98ms 2.43ms 209.81ms 99.43%
Req/Sec 15.55k 1.48k 19.28k 90.62%
Latency Distribution
50% 0.91ms
75% 1.29ms
90% 1.53ms
99% 2.69ms
495140 requests in 8.04s, 68.94MB read
Socket errors: connect 0, read 495120, write 0, timeout 0
Requests/sec: 61560.47
Transfer/sec: 8.57MB
❯ cd rocket
❯ # change Cargo.toml
❯ # change main.rs to work with v5
❯ cargo build --release
❯ ./target/release/rocket
@@ -1,5 +1,3 @@
-#![feature(proc_macro_hygiene, decl_macro)]
-
 #[macro_use]
 extern crate rocket;
@@ -8,6 +6,8 @@ fn index() -> &'static str {
     "Hello, world!"
 }
 
-fn main() {
-    rocket::ignite().mount("/", routes![index]).launch();
+#[launch]
+fn rocket() -> _ {
+    rocket::build()
+        .mount("/", routes![index])
 }
❯ wrk --latency -t4 -c200 -d8s http://127.0.0.1:8000
Running 8s test @ http://127.0.0.1:8000
4 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 1.30ms 1.03ms 26.44ms 86.44%
Req/Sec 39.56k 3.79k 48.86k 67.19%
Latency Distribution
50% 1.08ms
75% 1.61ms
90% 2.33ms
99% 5.05ms
1261745 requests in 8.08s, 175.68MB read
Requests/sec: 156093.58
Transfer/sec: 21.73MB
❯ wrk --latency -t4 -c200 -d8s http://127.0.0.1:8000
Running 8s test @ http://127.0.0.1:8000
4 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 1.38ms 0.95ms 28.54ms 79.82%
Req/Sec 36.52k 3.16k 45.96k 67.81%
Latency Distribution
50% 1.18ms
75% 1.74ms
90% 2.46ms
99% 4.83ms
1164811 requests in 8.07s, 162.18MB read
Requests/sec: 144278.36
Transfer/sec: 20.09MB
❯ wrk --latency -t4 -c200 -d8s http://127.0.0.1:8000
Running 8s test @ http://127.0.0.1:8000
4 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 1.43ms 0.99ms 21.53ms 79.25%
Req/Sec 35.24k 3.21k 48.04k 69.38%
Latency Distribution
50% 1.22ms
75% 1.81ms
90% 2.57ms
99% 5.05ms
1124770 requests in 8.08s, 156.61MB read
Requests/sec: 139211.46
Transfer/sec: 19.38MB
❯ cd rocket
❯ # change Cargo.toml
❯ # change main.rs to work with v5
❯ cargo build --release
❯ env ROCKET_ENV=prod ./target/release/rocket
❯ wrk --latency -t4 -c200 -d8s http://127.0.0.1:8000
Running 8s test @ http://127.0.0.1:8000
4 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 1.21ms 0.92ms 21.33ms 84.57%
Req/Sec 42.26k 3.27k 50.97k 74.38%
Latency Distribution
50% 1.02ms
75% 1.50ms
90% 2.15ms
99% 4.61ms
1348461 requests in 8.08s, 187.75MB read
Requests/sec: 166937.68
Transfer/sec: 23.24MB
❯ wrk --latency -t4 -c200 -d8s http://127.0.0.1:8000
Running 8s test @ http://127.0.0.1:8000
4 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 1.32ms 0.94ms 23.26ms 81.14%
Req/Sec 38.34k 3.62k 47.96k 69.38%
Latency Distribution
50% 1.12ms
75% 1.66ms
90% 2.37ms
99% 4.79ms
1223178 requests in 8.08s, 170.31MB read
Requests/sec: 151402.49
Transfer/sec: 21.08MB
❯ wrk --latency -t4 -c200 -d8s http://127.0.0.1:8000
Running 8s test @ http://127.0.0.1:8000
4 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 1.45ms 1.00ms 19.32ms 79.40%
Req/Sec 34.91k 3.04k 44.97k 71.25%
Latency Distribution
50% 1.24ms
75% 1.83ms
90% 2.59ms
99% 5.15ms
1113370 requests in 8.06s, 155.02MB read
Requests/sec: 138150.06
Transfer/sec: 19.24MB
Rocket v5 (with the async keyword):
❯ cd rocket
❯ # change Cargo.toml
❯ # change main.rs to work with v5
❯ cargo build --release
❯ env ROCKET_ENV=prod ./target/release/rocket
❯ wrk --latency -t4 -c200 -d8s http://127.0.0.1:8000
Running 8s test @ http://127.0.0.1:8000
4 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 1.26ms 0.94ms 18.77ms 83.92%
Req/Sec 40.41k 3.72k 49.53k 72.50%
Latency Distribution
50% 1.07ms
75% 1.57ms
90% 2.23ms
99% 4.80ms
1289197 requests in 8.08s, 179.50MB read
Requests/sec: 159601.93
Transfer/sec: 22.22MB
❯ wrk --latency -t4 -c200 -d8s http://127.0.0.1:8000
Running 8s test @ http://127.0.0.1:8000
4 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 1.41ms 0.99ms 18.31ms 80.39%
Req/Sec 35.79k 3.46k 46.59k 73.44%
Latency Distribution
50% 1.21ms
75% 1.78ms
90% 2.52ms
99% 5.11ms
1142153 requests in 8.08s, 159.03MB read
Requests/sec: 141319.52
Transfer/sec: 19.68MB
❯ wrk --latency -t4 -c200 -d8s http://127.0.0.1:8000
Running 8s test @ http://127.0.0.1:8000
4 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 1.52ms 1.09ms 29.17ms 80.94%
Req/Sec 33.37k 3.86k 45.75k 82.81%
Latency Distribution
50% 1.29ms
75% 1.92ms
90% 2.73ms
99% 5.45ms
1065163 requests in 8.07s, 148.31MB read
Requests/sec: 131921.73
Transfer/sec: 18.37MB
warp: 286_126.89, 259_381.43, 238_068.17
Rocket v4: 72_045.45, 67_683.32, 61_560.47
Rocket v5: 156_093.58, 144_278.36, 139_211.46
Rocket v5 (ROCKET_ENV=prod): 166_937.68, 151_402.49, 138_150.06
Rocket v5 (with async keyword): 159_601.93, 141_319.52, 131_921.73
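Averaging the three runs makes the comparison easier to read; a quick sketch over the numbers above:

```rust
// Average the three wrk runs for each server (numbers from the sessions above).
fn avg(runs: &[f64]) -> f64 {
    runs.iter().sum::<f64>() / runs.len() as f64
}

fn main() {
    let warp = avg(&[286_126.89, 259_381.43, 238_068.17]);
    let rocket_v4 = avg(&[72_045.45, 67_683.32, 61_560.47]);
    let rocket_v5 = avg(&[156_093.58, 144_278.36, 139_211.46]);
    // Rocket v5 more than doubles v4's throughput but still trails warp.
    println!("warp      ~{:.0} req/s", warp);
    println!("rocket v4 ~{:.0} req/s", rocket_v4);
    println!("rocket v5 ~{:.0} req/s ({:.1}x v4, {:.0}% of warp)",
             rocket_v5, rocket_v5 / rocket_v4, 100.0 * rocket_v5 / warp);
}
```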
I made sure my machine was quiet before running the benchmark.
❯ macchina
Machine — LENOVO ThinkPad L390 20NSS0TA00
Kernel — Linux 4.19.0-14-amd64
Distro — Debian GNU/Linux
WM — i3
CPU — Intel® Core™ i7-8565U
Memory — 3.3 GB/15.9 GB
IMHO, to make the command wrk --latency -t4 -c200 -d8s http://127.0.0.1:8080
consistent, we need to set each web framework's port to 8080.
There are some obvious problems with the bench.
actix-web and warp return a String while the others return a &'static str. This means they incur an additional allocation to generate the payload while the others don't.
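The cost difference is easy to see outside any framework; a std-only sketch with hypothetical handler names, mirroring the two return types:

```rust
// Hypothetical handlers mirroring the two styles in the benchmark.

// Allocates a fresh heap buffer on every call, like the actix-web/warp handlers.
fn handler_owned() -> String {
    "Hello, world!".to_string()
}

// Hands out a pointer into the binary's read-only data, so there is no
// per-request allocation, like the &'static str handlers.
fn handler_static() -> &'static str {
    "Hello, world!"
}

fn main() {
    // Same bytes on the wire, different allocation behavior per request.
    assert_eq!(handler_owned(), handler_static());
    println!("both handlers return: {}", handler_static());
}
```

Making every server return the same type (ideally &'static str) would remove this per-request allocation from the comparison.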
hyper, being low level, does not set a Content-Type header, and I believe all the other tests set it to "text/plain" or "text/plain; charset=utf-8". So it should set one to be on par.
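For reference, this is the kind of response the other servers send; a std-only sketch (no hyper) that serves a single request with an explicit Content-Type header, so the payloads can be compared byte for byte:

```rust
use std::io::{Read, Write};
use std::net::{TcpListener, TcpStream};
use std::thread;

// Serve exactly one request with an explicit Content-Type header,
// the way the non-hyper servers in this benchmark respond.
fn serve_one(listener: TcpListener) {
    let (mut stream, _) = listener.accept().unwrap();
    let mut req = [0u8; 1024];
    let _ = stream.read(&mut req); // drain the request before replying
    let body = "Hello, world!";
    let response = format!(
        "HTTP/1.1 200 OK\r\nContent-Type: text/plain; charset=utf-8\r\nContent-Length: {}\r\nConnection: close\r\n\r\n{}",
        body.len(),
        body
    );
    stream.write_all(response.as_bytes()).unwrap();
}

// Spin the server up on an ephemeral port and fetch the raw response.
fn fetch() -> String {
    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
    let addr = listener.local_addr().unwrap();
    let server = thread::spawn(move || serve_one(listener));
    let mut stream = TcpStream::connect(addr).unwrap();
    stream
        .write_all(b"GET / HTTP/1.1\r\nHost: localhost\r\nConnection: close\r\n\r\n")
        .unwrap();
    let mut raw = String::new();
    stream.read_to_string(&mut raw).unwrap();
    server.join().unwrap();
    raw
}

fn main() {
    let raw = fetch();
    assert!(raw.contains("Content-Type: text/plain; charset=utf-8"));
    println!("{}", raw);
}
```

Adding the equivalent header in the hyper handler would make its responses comparable to the other four servers.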
With the test method mentioned in README.md, warp does not generate a meaningful response because it cannot find the route you defined. This makes it return a 404 response that has no payload, so it bypasses the response body entirely. This is mentioned in issue #2.
I noticed that warp was the only server with non-2xx or 3xx responses, and not just one of them but all of them. This might have skewed the benchmark.
It would be good to add a benchmark for tide here as well; I would like to see how it compares with rocket. https://github.com/http-rs/tide
Show the results sorted by latency in the Conclusion section.