Comments (8)
Based on the log I don't see a test failure, just a compilation failure.
Searching for Failures: 1 through Failures: 9
did not result in any matches (all logs report Failures: 0).
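That search can be sketched locally like this (the directory and log line are made up for the example; real logs would be downloaded from the workflow run first):

```shell
# Sketch: scan downloaded workflow logs for any non-zero "Failures:" count.
# The directory and its log content are fabricated to keep the example self-contained.
mkdir -p /tmp/junit_log_demo
echo "Tests run: 3143, Failures: 0, Errors: 0, Skipped: 13" > /tmp/junit_log_demo/build.log
# grep -E matches "Failures: " followed by a non-zero number; the fallback echo
# fires when no log contains a failure
grep -rE 'Failures: [1-9][0-9]*' /tmp/junit_log_demo || echo "all logs report Failures: 0"
```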
from action-junit-report.
Sorry, I referred to the wrong runs. This is the correct one:
https://github.com/FgForrest/evitaDB/actions/runs/4880097093/jobs/8707336172
It contains:
Error: Failures:
Error: FilterIndexTest.deliberateTestFailure:64 Deliberate test failure.
[INFO]
Error: Tests run: 3143, Failures: 1, Errors: 0, Skipped: 13
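For context, a summary line like the one above corresponds to a `<failure>` element in the Surefire `TEST-*.xml` report that the action parses. A rough sketch of the shape (attribute values and the failure type are assumptions, not copied from the real report):

```xml
<!-- Sketch only: element shape of a Surefire report entry; values are assumed -->
<testsuite name="FilterIndexTest" failures="1">
  <testcase name="deliberateTestFailure" classname="FilterIndexTest">
    <failure message="Deliberate test failure." type="java.lang.AssertionError"/>
  </testcase>
</testsuite>
```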
Yet this is what I see in the following report-generation step: https://github.com/FgForrest/evitaDB/actions/runs/4880097093/jobs/8707457826
I'm uploading the output XML and downloading it in the other workflow, but I double-checked it and it should download the proper one. I also manually opened the artifact: https://github.com/FgForrest/evitaDB/suites/12666156144/artifacts/679089727 and I see the failure recorded there.
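For context, the upload side of that hand-off looks roughly like this (a sketch; the step name and path pattern are assumptions, not copied from the actual evitaDB workflow):

```yaml
# Sketch of the upload step in the test workflow; the path pattern is an assumption
- name: Upload test results
  uses: actions/upload-artifact@v3
  if: success() || failure()
  with:
    name: test-results # must match the name used by the download step
    path: '**/target/surefire-reports/TEST-*.xml'
```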
Looking through it, it seems the wrong report may be picked up. 🤔
From the failed report I can see this order:
However, this is what the action saw:
I don't see a reason why it would not pick it up.
If you place the publish step right after the tests were run (without the downloading) does it work?
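For reference, that would look roughly like this (a sketch, assuming a Maven build; the test step is a placeholder):

```yaml
# Sketch (assumption): publish immediately after the test step, in the same job,
# so no artifact upload/download is involved
- name: Run tests
  run: mvn -B test
- name: Publish Test Report
  uses: mikepenz/action-junit-report@v3
  if: success() || failure()
  with:
    report_paths: '**/TEST-*.xml'
```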
The failure build is here:
https://github.com/FgForrest/evitaDB/actions/runs/4880097093/jobs/8707336172
However, it seems to have picked up a report with 4892719755, which seems to be a different run id (perhaps)?
Ok, I'll try to move the job into the same workflow to avoid artifact passing, and I'll let you know the result.
I'm still a newbie with GitHub Actions, so there is a high chance I missed something; most of my workflow code is copy & paste from various sources plus guesswork.
Thank you very much for your time!
I tried adding it to the main workflow, but the result processing took an extremely long time, so I had to cancel it to stop burning my free GitHub minutes: https://github.com/FgForrest/evitaDB/actions/runs/4908927186/jobs/8765006012
The workflow was simple: https://github.com/FgForrest/evitaDB/actions/runs/4908927186/workflow
What could take so long to process a few XML files? The build with tests took around 7 minutes; processing (until I cancelled it) took over 15 minutes:
I minified the plugin configuration to:
- name: Publish Test Report
  uses: mikepenz/action-junit-report@v3
  if: success() || failure() # always run even if the previous step fails
  with:
    report_paths: '**/TEST-*.xml'
    fail_on_failure: true
And now the parsing took only 7 seconds and it found the failing test correctly: https://github.com/FgForrest/evitaDB/actions/runs/4909112592/jobs/8765311158
Perhaps the annotations took that long? I don't know.
But your suggestion is most probably correct and the plugin downloads something other than what was uploaded. I'll investigate further. Thanks.
I got it! The problem was a missing run_id in the download action's configuration. The correct configuration is:
- name: Download test results # download the test-results artifact if the workflow we react to was successful
  uses: dawidd6/action-download-artifact@v2
  with:
    workflow: ${{ github.event.workflow_run.workflow_id }}
    run_id: ${{ github.event.workflow_run.id }}
    name: test-results # artifact name
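For completeness, `github.event.workflow_run.id` is only populated when the report workflow is triggered by the `workflow_run` event, roughly like this (a sketch; the referenced workflow name is an assumption):

```yaml
# Sketch: the trigger that makes github.event.workflow_run.id available;
# the name of the test workflow ("CI") is an assumption
on:
  workflow_run:
    workflows: ["CI"]
    types:
      - completed
```

Without run_id, the download action falls back to a run other than the one that triggered the event, which explains the mismatched run id seen earlier.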
Thank you for your support. I'd like to buy you a coffee; I hope you don't mind.
Thank you very much @novoj