Comments (20)
Yes! In 2023, the big features we plan to work on:
- IDE support (starting with VSCode)
- this feature (launch test in parallel)
- Official apt support
We're going to work in this order, as 1. might force us to rewrite the parser code. The parallel feature is (in my mind at least) one of the main features we should address (it's been on our minds since the beginning, practically one of the first issues submitted). It's really well suited to a CLI tool like Hurl. We use Hurl daily ourselves, and I'm sure parallel tests are going to not only shorten test duration but also find bugs in our web app / APIs.
from hurl.
@lepapareil @fabricereix I've done some initial work on how we could render hurl --test
while running in parallel => doc here https://github.com/Orange-OpenSource/hurl/blob/master/docs/spec/runner/parallel.md#--test-output and prototype (asciinema) here => https://jcamiel.github.io/parallel/. For discussion
from hurl.
Hi all,
This is much overdue, but we've implemented a --parallel
option to execute Hurl files in parallel, on master. This is an optional flag in the 4.3.0 version (released next week, but already available on master), and it will be officially supported in Hurl 5.0.0 (the version after 4.3.0).
The model used is similar to GNU parallel. To run some tests in parallel, you can just run:
$ hurl --test --parallel *.hurl
The parallelism used is multithreaded sync: a thread pool is instantiated for the whole run, and each Hurl file is run in its own thread, synchronously. We've not gone the full multithreaded async route, for implementation simplicity. Moreover, there is no additional dependency, only the Rust standard library. @ppaulweber we've chosen not to expose a kind of "thread affinity" inside a Hurl file, once again for simplicity of implementation. The only user option is --jobs
to set the size of the thread pool. By default, the size is roughly the number of available CPUs.
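For readers curious about the model, here is a minimal Rust sketch of this kind of synchronous thread pool, using only the standard library. `run_file` and `run_parallel` are hypothetical names for illustration, not Hurl's actual internals:

```rust
use std::sync::{mpsc, Arc, Mutex};
use std::thread;

// Hypothetical stand-in for executing one Hurl file; the real runner
// returns rich results, this just returns a status line.
fn run_file(path: &str) -> String {
    format!("{}: Success", path)
}

// A fixed-size pool of worker threads pulls files off a shared queue;
// each file is executed synchronously on its worker thread.
fn run_parallel(files: Vec<String>, jobs: usize) -> Vec<String> {
    let (work_tx, work_rx) = mpsc::channel::<String>();
    let work_rx = Arc::new(Mutex::new(work_rx));
    let (res_tx, res_rx) = mpsc::channel::<String>();

    let mut workers = Vec::new();
    for _ in 0..jobs {
        let work_rx = Arc::clone(&work_rx);
        let res_tx = res_tx.clone();
        workers.push(thread::spawn(move || loop {
            // Take the next file, or stop when the queue is closed.
            let path = match work_rx.lock().unwrap().recv() {
                Ok(p) => p,
                Err(_) => break,
            };
            res_tx.send(run_file(&path)).unwrap();
        }));
    }

    let n = files.len();
    for f in files {
        work_tx.send(f).unwrap();
    }
    drop(work_tx); // close the queue so idle workers exit
    drop(res_tx); // keep only the workers' clones alive

    let mut results: Vec<String> = res_rx.iter().take(n).collect();
    for w in workers {
        w.join().unwrap();
    }
    results.sort(); // completion order is nondeterministic
    results
}

fn main() {
    let files = vec!["a.hurl".into(), "b.hurl".into(), "c.hurl".into()];
    for line in run_parallel(files, 2) {
        println!("{}", line);
    }
}
```

The channel-behind-a-mutex work queue is the classic `std`-only pattern: workers exit naturally once the sender side of the queue is dropped.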
Regarding stdout/stderr, we've, once again, followed the GNU parallel model: standard output and error are buffered during the execution of a file, and only displayed once the file has been executed. As a consequence, debug logs can be a little delayed, but logs are never intermixed between Hurl files.
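A minimal sketch of this buffering scheme, assuming a hypothetical `run_file` that logs into a caller-supplied buffer (not Hurl's actual code):

```rust
use std::io::Write;
use std::sync::Mutex;
use std::thread;

// Hypothetical stand-in for running one file: it appends its log lines
// to a private buffer instead of printing them directly.
fn run_file(path: &str, log: &mut String) {
    log.push_str(&format!("* Executing {}\n", path));
    log.push_str(&format!("* {}: Success\n", path));
}

fn main() {
    let stdout = Mutex::new(std::io::stdout());
    let stdout = &stdout;
    thread::scope(|s| {
        for path in ["a.hurl", "b.hurl", "c.hurl"] {
            s.spawn(move || {
                // Logs accumulate in a per-file buffer while the file runs...
                let mut buf = String::new();
                run_file(path, &mut buf);
                // ...and are flushed in one locked write once the file is
                // done: output can be slightly delayed, but lines from
                // different files never interleave.
                stdout.lock().unwrap().write_all(buf.as_bytes()).unwrap();
            });
        }
    });
}
```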
One can enable debugging for a particular file with an [Options]
section and everything should work as intended:
GET https://foo.com
[Options]
verbose: true
HTTP 200
In test mode, the progress bar is a little different from the non-parallel run; it will be harmonised for the official release (the sequential test progress will look like running hurl --test --parallel --jobs 1).
Regarding reports, the HTML, TAP, and JUnit reports are not affected: reported tests, in parallel or in sequential mode, appear in the command-line order of the files. For instance:
$ hurl --test --report-tap a.hurl b.hurl c.hurl
will always produce this TAP report, in this order, no matter which file finishes first:
TAP version 13
1..3
ok 1 - a.hurl
ok 2 - b.hurl
ok 3 - c.hurl
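One way to get this stable ordering (a sketch, not necessarily Hurl's implementation) is to give each file a result slot matching its command-line position, so workers can finish in any order while the report stays deterministic:

```rust
use std::thread;

// Hypothetical pass/fail runner for one file; the real runner is richer.
fn run_file(path: &str) -> bool {
    path.ends_with(".hurl")
}

// Run files in parallel, but write each result into the slot matching the
// file's position on the command line, so the report order is stable.
fn tap_report(files: &[&str]) -> String {
    let mut results = vec![false; files.len()];
    thread::scope(|s| {
        for (slot, path) in results.iter_mut().zip(files) {
            s.spawn(move || {
                *slot = run_file(path);
            });
        }
    });
    let mut report = format!("TAP version 13\n1..{}\n", files.len());
    for (i, (ok, path)) in results.iter().zip(files).enumerate() {
        let status = if *ok { "ok" } else { "not ok" };
        report.push_str(&format!("{} {} - {}\n", status, i + 1, path));
    }
    report
}

fn main() {
    print!("{}", tap_report(&["a.hurl", "b.hurl", "c.hurl"]));
}
```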
What's next:
- a lot of tests: we really want to be sure that everything is OK
- maybe some options for the first version: like GNU parallel's --keep-order option, to output standard output in the command-line order of the files. After this first version, we'll add more options of course (for repeating sequences etc.), based on usage and feedback
- add a throttle on terminal display: cargo does this and we'll add it too, as the refresh rate can be very high for the terminal
- feedback! We'll really be happy to have feedback on the new feature: it's really exciting. Hurl is already fast; with parallel execution it's incredibly fast!
@muffl0n I'd be super happy if you could test it; I'm interested to know if this could potentially replace your usage of parallel!
(same announcement made on #88)
from hurl.
My first test looks amazing! I ran our 2520 tests (in 20 files) and the results speak for themselves:
- default (non-parallel) execution: 48s
- parallel execution, 1 job per file: 8s
hurl /tests/*.hurl --test --parallel --jobs $(ls -1 /tests/*.hurl | wc -l)
I chose to run 1 job per file because the default ("By default, the size is roughly the number of available CPUs") is too conservative in my opinion. My underpowered test VM only has 1 CPU, so it would use one thread. But it shows that it can easily manage the 20 jobs I use in my test.
TAP, JUnit and HTML reports look good and in order.
Gonna give it a try in our CI tomorrow, but I do not expect any problems here.
Thank you very much! This is just awesome! β€οΈβπ₯
PS: I built a Docker image for the current master (8dbd6f67) if anyone has the need:
- ghcr.io/muffl0n/hurl:master
- ghcr.io/muffl0n/hurl:8dbd6f67
from hurl.
Workaround I'm using
parallel -j $(ls -1 *.hurl | wc -l) -i sh -c "hurl {} --test" -- *.hurl
echo "retval: $?"
Spawns one process for each file matching *.hurl. The output is not perfect, because you have to find the error output somewhere in the overall output. But it boosts our CI run quite nicely, from several minutes to some seconds.
from hurl.
It's a high priority on our todo list!
from hurl.
https://jcamiel.github.io/parallel/ parallel console output updated after discussion with @lepapareil / @fabricereix
from hurl.
Old school async: dump a bunch of threads on the OS and let the scheduler deal with it π€·. I could be going out on a limb here, but I doubt curl's C API was designed to work with a Rust async runtime, and I'd guess that shoehorning it in would be non-trivial to say the least. So I'm glad that throwing threads at it seems to work. π
from hurl.
Was about to open this feature request. Hurl has been great and as the number of .hurl files grows it would be great to run them in parallel.
from hurl.
Yes, we're really going to take inspiration from parallel
to implement these features
from hurl.
Was just about to raise this feature request myself. I'm investigating deprecating our Runscope suite and migrating to Hurl and this would be a killer feature! <3
from hurl.
"I chose to run 1 job per file because the default is too conservative in my opinion"
Good feedback, we'll certainly adjust it
from hurl.
Thanks a lot for the tests @muffl0n, do you get a speed improvement in your CI as well?
from hurl.
Yes, the speedup is about the same as in my previous tests.
from hurl.
This feature could indeed be useful to reduce the total test duration. It does not impact the Hurl format or any existing semantics. It cannot be implemented for the next release, but maybe the one after.
from hurl.
Great news :)
from hurl.
Would love to do some alpha testing. Let me know if I can be of any help!
from hurl.
Initialisation of an architecture document here => /docs/spec/runner/parallel.md
from hurl.
My tests in our CI also went well: Reports are displayed/parsed like before.
from hurl.
I'm closing this issue; we'll open new, more specific issues from now on.
from hurl.