
Comments (20)

jcamiel commented on September 21, 2024

Yes! In 2023, the big features we plan to work on are:

  1. IDE support (starting with VSCode)
  2. this feature (launch test in parallel)
  3. Official apt support

We're going to work in this order, as 1. might force us to rewrite the parser code. The parallel run is (in my mind at least) one of the main features we should address (it's been on our minds since the beginning; it was practically one of the first issues submitted). It's really well suited to a CLI tool like Hurl. We use Hurl daily ourselves, and I'm sure parallel tests will not only reduce test duration but also surface bugs in our web app / APIs.

jcamiel commented on September 21, 2024

@lepapareil @fabricereix I've done some initial work on how we could render hurl --test output while running in parallel => doc here https://github.com/Orange-OpenSource/hurl/blob/master/docs/spec/runner/parallel.md#--test-output and prototype (asciinema) here => https://jcamiel.github.io/parallel/. For discussion.

jcamiel commented on September 21, 2024

Hi all,

This is much overdue, but we've implemented a --parallel option to execute Hurl files in parallel, on master. It is an optional flag in version 4.3.0 (released next week, but already available on master), and it will be officially supported in Hurl 5.0.0 (the version after 4.3.0).

The model used is similar to GNU parallel; to run tests in parallel, you can just run:

$ hurl --test --parallel *.hurl

The parallelism is multithreaded and synchronous: a thread pool is instantiated for the whole run, and each Hurl file is run in its own thread, synchronously. We've not gone the fully multithreaded async route, for implementation simplicity. Moreover, there is no additional dependency, only the Rust standard library. @ppaulweber, we've chosen not to expose a kind of "thread affinity" inside a Hurl file, once again for simplicity of implementation. The only user option is --jobs, which sets the size of the thread pool. By default, the size is roughly the number of available CPUs.

Regarding stdout/stderr, we've, once again, followed the GNU parallel model: standard output and error are buffered during the execution of a file, and only displayed when a file has been executed. As a consequence, the debug logs can be a little delayed, but logs are never intermixed between Hurl files.
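
To illustrate this model (a minimal sketch using only the Rust standard library, not Hurl's actual implementation; run_file is a hypothetical stand-in for executing one file), a fixed-size pool of worker threads can pull files from a shared queue, buffer each file's output, and print it only once that file has finished:

use std::sync::{mpsc, Arc, Mutex};
use std::thread;

// Hypothetical worker: runs one Hurl file synchronously and returns its
// buffered standard output/error as a single string.
fn run_file(path: &str) -> String {
    format!("tests in {path}: ok")
}

fn main() {
    let files = vec!["a.hurl".to_string(), "b.hurl".to_string(), "c.hurl".to_string()];
    // Pool size: roughly the number of available CPUs, like the default --jobs.
    let jobs = thread::available_parallelism().map(|n| n.get()).unwrap_or(1);

    // Shared queue of (index, file); the index records command-line order.
    let queue = Arc::new(Mutex::new(
        files.into_iter().enumerate().collect::<Vec<_>>(),
    ));
    let (tx, rx) = mpsc::channel();

    let mut workers = Vec::new();
    for _ in 0..jobs {
        let queue = Arc::clone(&queue);
        let tx = tx.clone();
        workers.push(thread::spawn(move || loop {
            // Take the next file, or stop when the queue is empty.
            let next = queue.lock().unwrap().pop();
            match next {
                Some((index, path)) => {
                    let output = run_file(&path); // buffered while the file runs
                    tx.send((index, output)).unwrap();
                }
                None => break,
            }
        }));
    }
    drop(tx); // close the channel once only the workers hold senders

    // Print each file's buffered output as soon as that file completes,
    // so logs from different files are never intermixed.
    for (_index, output) in rx {
        println!("{output}");
    }
    for worker in workers {
        worker.join().unwrap();
    }
}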

One can enable debugging for a particular file with an [Options] section, and everything should work as intended:

GET https://foo.com
[Options]
verbose: true
HTTP 200

In test mode, the progress bar is a little different from the non-parallel run; it will be harmonised for the official release (the sequential test progress will look like running hurl --test --parallel --jobs 1).

Regarding reports, the HTML, TAP and JUnit reports are not affected: tests are reported in the same order whether they run in parallel or in sequential mode, i.e. the order of the files on the command line. For instance:

$ hurl --test --report-tap a.hurl b.hurl c.hurl

will always produce this TAP report, in this order, no matter which file is executed first:

TAP version 13
1..3
ok 1 - a.hurl
ok 2 - b.hurl
ok 3 - c.hurl
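
Purely as an illustrative continuation of the earlier sketch (again, not Hurl's actual reporting code), one way to report in command-line order regardless of completion order is to slot each (index, output) result into a vector keyed by its input index, and only then write the report:

// Collect (index, output) pairs from the worker threads, then emit a
// TAP-like report in the original command-line order of the files.
fn write_report(results: Vec<(usize, String)>, file_count: usize) {
    let mut ordered: Vec<Option<String>> = vec![None; file_count];
    for (index, output) in results {
        ordered[index] = Some(output);
    }
    println!("TAP version 13");
    println!("1..{file_count}");
    for (i, entry) in ordered.iter().enumerate() {
        if let Some(output) = entry {
            println!("ok {} - {}", i + 1, output);
        }
    }
}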

What's next:

  • a lot of testing: we really want to be sure that everything is OK
  • maybe some options for the first version: like GNU parallel, a --keep-order option to print standard output in the command-line order of the files. After this first version, we'll add more options of course (for repeating sequences, etc.), based on usage and feedback
  • add a throttle on the terminal display: cargo does this and we'll add it too, as the refresh rate can be very high for the terminal (see the sketch after this list)
  • feedback! We'll really be happy to have feedback on the new feature: it's really exciting, Hurl is already fast; with parallel execution it is incredibly fast!
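
A terminal-refresh throttle can be as simple as the following sketch (illustrative only, not tied to Hurl's internals): remember when the progress display was last redrawn, and skip redraws that arrive sooner than the chosen refresh interval.

use std::time::{Duration, Instant};

// Redraws the progress display at most once per `interval`, no matter how
// often `maybe_redraw` is called.
struct Throttle {
    interval: Duration,
    last_draw: Option<Instant>,
}

impl Throttle {
    fn new(interval: Duration) -> Self {
        Throttle { interval, last_draw: None }
    }

    fn maybe_redraw(&mut self, draw: impl FnOnce()) {
        let now = Instant::now();
        let due = match self.last_draw {
            None => true,
            Some(last) => now.duration_since(last) >= self.interval,
        };
        if due {
            draw();
            self.last_draw = Some(now);
        }
    }
}

fn main() {
    let mut throttle = Throttle::new(Duration::from_millis(100));
    for completed in 0..10_000 {
        // Called very often, but actually redrawn at most ~10 times per second.
        throttle.maybe_redraw(|| eprint!("\rfiles completed: {completed}"));
    }
    eprintln!();
}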

@muffl0n I'd be super happy if you could test it; I'm interested to know if this could potentially replace your usage of parallel!

(same announcement made on #88)

muffl0n commented on September 21, 2024

My first test looks amazing! I ran our 2520 tests (in 20 files) and the results speak for themselves:

  • default (non-parallel) execution: 48s
  • parallel execution, 1 job per file: 8s
hurl /tests/*.hurl --test --parallel --jobs $(ls -1 /tests/*.hurl | wc -l)

I chose to run 1 job per file because the default

By default, the size is roughly the number of available CPUs.

is too conservative in my opinion. My underpowered test VM has only 1 CPU, so it would use just one thread. But this shows that it can easily handle the 20 jobs I use in my test.

TAP, JUnit and HTML reports look good and in order.

Gonna give it a try in our CI tomorrow, but I do not expect any problems here.

Thank you very much! This is just awesome! ❤️‍🔥

PS: I built a Docker image for the current master (8dbd6f67) if anyone needs it:

  • ghcr.io/muffl0n/hurl:master
  • ghcr.io/muffl0n/hurl:8dbd6f67

muffl0n commented on September 21, 2024

The workaround I'm using:

parallel -j $(ls -1 *.hurl | wc -l) -i sh -c "hurl {} --test" -- *.hurl
echo "retval: $?"

This spawns one process for each file matching *.hurl. The output is not perfect, because you have to find the error output somewhere in the overall output. But it nicely speeds up our CI run from several minutes to a few seconds.

jcamiel commented on September 21, 2024

It's a high priority on our todo list!

jcamiel commented on September 21, 2024

The parallel console output at https://jcamiel.github.io/parallel/ has been updated after discussion with @lepapareil / @fabricereix.

infogulch commented on September 21, 2024

Old school async: dump a bunch of threads on the OS and let the scheduler deal with it 🤷. I could be going out on a limb here, but I doubt curl's C API was designed to work with a Rust async runtime, and I'd guess that shoehorning it in would be non-trivial, to say the least. So I'm glad that throwing threads at it seems to work. 👍

gbourne1 commented on September 21, 2024

Was about to open this feature request. Hurl has been great, and as the number of .hurl files grows, it would be great to run them in parallel.

jcamiel commented on September 21, 2024

Yes, we're really going to take inspiration from parallel to implement these features.

indy-singh commented on September 21, 2024

Was just about to raise this feature request myself. I'm investigating deprecating our Runscope suite and migrating to Hurl and this would be a killer feature! <3

jcamiel commented on September 21, 2024

I chose to run 1 job per file because the default
is too conservative in my opinion

Good feedback, we'll certainly adjust it.

jcamiel commented on September 21, 2024

Thanks a lot for the tests @muffl0n, do you get a speed improvement in your CI as well?

muffl0n commented on September 21, 2024

Yes, the speedup is about the same as in my previous tests.

fabricereix commented on September 21, 2024

This feature could indeed be useful in order to reduce the total test duration. It does not impact the Hurl format or any existing semantics. It cannot be implemented for the next release, but maybe the one after.

lepapareil commented on September 21, 2024

Great news :)

muffl0n commented on September 21, 2024

Would love to do some alpha testing. Let me know if I can be of any help!

jcamiel commented on September 21, 2024

Initial version of an architecture document here => /docs/spec/runner/parallel.md

muffl0n commented on September 21, 2024

My tests in our CI also went well: Reports are displayed/parsed like before.

jcamiel commented on September 21, 2024

I'm closing this issue; we'll open new, more specific issues from now on.
