
ldp-testsuite's Introduction

Archived

This repository is not being maintained anymore. It's only available for historical purposes. There's no follow-up for any security issues that may arise in its use of Maven, its code, or any related packages it uses. Use and install at your own risk.

ldp-testsuite

Test Suite for Linked Data Platform 1.0

ldp-testsuite's People

Contributors

ajs6f, catch-point, cbeer, deiu, fserena, ja-fra, jkbzh, plehegar, sandhawke, sspeiche, wikier


ldp-testsuite's Issues

Dependency between tests of different groups

The test method testPreconditionRequiredStatusCode belongs to the MUST group but depends on testPutRequiresIfMatch, which belongs to the SHOULD group. If we only want to run the MUST tests, this doesn't work, which makes little sense: it should be possible to positively validate a server that implements only the MUSTs.

Proposal
The operations common to these dependent tests could be extracted and implemented as generic helpers, as sketched below.
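A minimal sketch of that idea, assuming REST-assured is used the same way as elsewhere in the suite (the class name, helper name, and group string here are hypothetical, not the suite's actual code):

import com.jayway.restassured.RestAssured;
import com.jayway.restassured.response.Response;
import org.testng.annotations.Test;

public class PreconditionTestsSketch {

    // Shared operation extracted from testPutRequiresIfMatch so other tests
    // can exercise conditional PUTs without depending on that test.
    private Response putWithIfMatch(String uri, String etag, String turtle) {
        return RestAssured.given()
                .contentType("text/turtle")
                .header("If-Match", etag)
                .body(turtle)
                .put(uri);
    }

    // The MUST-level test no longer needs
    // dependsOnMethods = "testPutRequiresIfMatch".
    @Test(groups = { "MUST" })
    public void testPreconditionRequiredStatusCode() {
        // ... call putWithIfMatch(...) directly and assert on the status code ...
    }
}

With the conditional PUT in its own helper, the MUST-level test calls it directly and the dependency on the SHOULD-level test can be dropped.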

"Configuration Failures" not printed to the console

Sometimes all tests are skipped due to "Configuration Failures." These are difficult to diagnose because no stack trace or error details are printed to the console, and nothing helpful appears in the HTML report.

===============================================
LDP Test Suite
Total tests run: 88, Failures: 0, Skips: 88
Configuration Failures: 2, Skips: 3
===============================================
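One possible way to surface these failures, assuming the suite lets you register an additional TestNG listener (for example in testng.xml or via TestNG#addListener); this is a sketch, not existing code:

import org.testng.IConfigurationListener;
import org.testng.ITestResult;

// Prints configuration failures (e.g. failed @BeforeSuite/@BeforeClass methods)
// to the console so they are no longer silent.
public class ConfigurationFailureLogger implements IConfigurationListener {

    @Override
    public void onConfigurationFailure(ITestResult result) {
        System.err.println("[CONFIGURATION FAILURE] " + result.getMethod());
        if (result.getThrowable() != null) {
            result.getThrowable().printStackTrace();
        }
    }

    @Override
    public void onConfigurationSuccess(ITestResult result) { }

    @Override
    public void onConfigurationSkip(ITestResult result) { }
}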

To ignore bad etags or not

RFC 7232 says:

"A recipient MUST ignore the If-Modified-Since header field if the received field-value is not a valid HTTP-date, or if the request method is neither GET nor HEAD."

Shouldn't servers behave the same way with bad ETags? Currently, testPutBadETags expects a 412 when the If-Match value is not well formed. In that case, I think LDP servers should instead ignore the If-Match header and return a 428 (Precondition Required) or a 2xx.
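A hedged sketch of one possible relaxation of the assertion (the class name, resource URI, and request body are placeholders, not the suite's actual testPutBadETags):

import static org.hamcrest.Matchers.anyOf;
import static org.hamcrest.Matchers.both;
import static org.hamcrest.Matchers.greaterThanOrEqualTo;
import static org.hamcrest.Matchers.is;
import static org.hamcrest.Matchers.lessThan;

import com.jayway.restassured.RestAssured;
import org.testng.annotations.Test;

public class PutBadETagSketch {

    // Placeholders, not the suite's real configuration.
    private final String resourceUri = "http://localhost:8080/ldp/resource1";
    private final String turtle = "<> a <http://www.w3.org/ns/ldp#RDFSource> .";

    @Test
    public void putWithMalformedETag() {
        RestAssured.given()
                .contentType("text/turtle")
                .header("If-Match", "this is not a valid ETag")
                .body(turtle)
            .when()
                .put(resourceUri)
            .then()
                // Accept 412 (current behaviour), 428, or any 2xx.
                .statusCode(anyOf(
                        is(412),
                        is(428),
                        both(greaterThanOrEqualTo(200)).and(lessThan(300))));
    }
}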

Error - NoWriterForLangException: JSON-LD

I see this error when POSTing JSON-LD content from the command line. (I don't see the error when using the Eclipse TestNG plugin.)

[FAILURE] BasicContainerTest.testPostJsonLd

LDP servers SHOULD accept a request entity body with a request header of
Content-Type with value of application/ld+json [JSON-LD].

com.hp.hpl.jena.shared.NoWriterForLangException: JSON-LD
    at com.hp.hpl.jena.rdf.model.impl.RDFWriterFImpl.getWriter(RDFWriterFImpl.java:109)
    at com.hp.hpl.jena.rdf.model.impl.ModelCom.getWriter(ModelCom.java:293)
    at org.w3.ldp.testsuite.mapper.RdfObjectMapper.serialize(RdfObjectMapper.java:54)
    at com.jayway.restassured.mapper.ObjectMapper$serialize.call(Unknown Source)
    at com.jayway.restassured.internal.RequestSpecificationImpl.body(RequestSpecificationImpl.groovy:593)
    at org.w3.ldp.testsuite.test.CommonContainerTest.testPostJsonLd(CommonContainerTest.java:814)
    at org.w3.ldp.testsuite.LdpTestSuite.run(LdpTestSuite.java:308)
    at org.w3.ldp.testsuite.LdpTestSuite.executeTestSuite(LdpTestSuite.java:337)
    at org.w3.ldp.testsuite.RunLdpTestSuite.main(RunLdpTestSuite.java:53)
... Removed 21 stack frames
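Model.getWriter() only knows the core Jena writers, so the lookup fails when no JSON-LD writer is registered; the Eclipse vs. command-line difference suggests the shaded JAR may be losing that registration. One possible workaround (a sketch under the assumption that jena-arq/RIOT 2.11+ is on the classpath and its registrations survive shading, not a confirmed fix) is to serialize through RIOT's RDFDataMgr instead:

import java.io.ByteArrayOutputStream;

import org.apache.jena.riot.Lang;
import org.apache.jena.riot.RDFDataMgr;

import com.hp.hpl.jena.rdf.model.Model;

public final class JsonLdSerializerSketch {

    private JsonLdSerializerSketch() { }

    // Serialize through RIOT instead of Model.getWriter("JSON-LD"), which
    // throws NoWriterForLangException when the JSON-LD writer isn't registered.
    public static byte[] toJsonLd(Model model) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        RDFDataMgr.write(out, model, Lang.JSONLD);
        return out.toByteArray();
    }
}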

EARL reports don't match test cases

For example, the manifest defines only one entry per Java class, such as:

CommonResource-ETagHeadersGet

but the execution results expect these test cases:

NonRDFSource-ETagHeadersGet
MemberResource-ETagHeadersGet
BasicContainer-ETagHeadersGet

MemberResourceTest.java's memberResource is NOT Always Assigned

The @BeforeSuite annotation needs to be replaced with @BeforeTest, or the field it initializes needs to be made static.

When running a test suite, many of the MemberResourceTest tests are incorrectly skipped. This happens because many test instances have memberResource set to null and never run the setup/tearDown methods. Because of the @BeforeSuite annotation on the setup method, it is only run on a single test instance (which might not be reused), and all other MemberResourceTest instances are then skipped. Either memberResource should be changed to a static field (so all MemberResourceTest instances share the same value), or the annotation should be changed to @BeforeTest, so each instance gets its own value.
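A minimal sketch of the two options (the field and method names follow the issue; the real setup signature, parameters, and body are omitted):

import org.testng.annotations.BeforeTest;

public class MemberResourceTestSketch {

    // Option 1: make the field static so every TestNG-created instance of the
    // class sees the value assigned by the single @BeforeSuite run.
    private static String memberResource;

    // Option 2: run setup once per <test> rather than once per suite, so each
    // instance gets its own value.
    @BeforeTest(alwaysRun = true)
    public void setup(/* parameters omitted */) {
        // ... create the test resource and assign memberResource here ...
        memberResource = "http://localhost:8080/ldp/member"; // placeholder
    }
}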

[testRelativeUriResolutionPut] Relative URI resolution and Jena models

The way the test modifies the resource just before PUTting it is not correct. Although the goal is to test how the server handles relative URIs, it is not right to modify the Jena model by adding a new property with an empty ("") subject: doing so effectively defines a new resource. When Jena deserializes input, it always assigns subjects their absolute URIs, regardless of whether they were relative in the representation generated by the server.

Proposal
Replace the line updateResource(model.getResource("")); with updateResource(model.getResource(this.getResourceUri()));

Perhaps a better way to test relative URI resolution would be to manually inspect the resource representation before deserializing it: look for evidence of the relative URI in the raw response, then verify the resulting model after deserialization (see the sketch below).
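A rough sketch of that manual check, assuming a Turtle response ("<>" is just one possible relative form, and the class and method names are hypothetical):

import java.io.StringReader;

import com.hp.hpl.jena.rdf.model.Model;
import com.hp.hpl.jena.rdf.model.ModelFactory;
import com.jayway.restassured.response.Response;

public class RelativeUriCheckSketch {

    boolean serverUsesRelativeUris(Response response, String resourceUri) {
        String rawTurtle = response.getBody().asString();

        // Evidence of a relative URI reference in the server's own serialization.
        boolean relativeFormPresent = rawTurtle.contains("<>");

        // After parsing, Jena resolves every subject against the base URI, so
        // the model can only be inspected for the absolute form.
        Model model = ModelFactory.createDefaultModel();
        model.read(new StringReader(rawTurtle), resourceUri, "TURTLE");
        boolean absoluteFormPresent = model.containsResource(model.getResource(resourceUri));

        return relativeFormPresent && absoluteFormPresent;
    }
}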

Add simple near 1 line description of test in annotation

Currently you need to find and read the source to understand what a test does. It would be good to provide a short description of what the test does or how the requirement is exercised, i.e. the testing approach.

It probably makes the most sense to put this in the @SpecTest annotation, like:
approach = "Provide some simple RDF on POST, check for 201 response code and Location header."
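A hypothetical sketch of the extension (the existing elements of @SpecTest are elided; only the new approach element is shown):

import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

@Retention(RetentionPolicy.RUNTIME)
public @interface SpecTest {

    // ... existing elements elided ...

    /** One-line description of how the requirement is exercised by the test. */
    String approach() default "";
}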

Separate projects for LDP and LDP paging test suites?

What is the right structure to cover both specifications? We could keep the paging tests under a different Java package. Or we could have separate projects, one for LDP and another for LDP paging.

I'm thinking separate projects makes sense (or a Maven multi-module project). Then we have different JARs and Maven artifacts for each test suite. We might need a common project in this case for things like the EARL reporter.

I'm assuming here we'd want to keep everything in the same Github repository.

@wikier @sspeiche Thoughts?

Propose a set of test cases for the LDP WG to approve

The test suite now has the ability to use annotations to mark up each test as either approved or not. The test case generator keys off this annotation as well.

The task is to propose a set and generate the report to take forward for WG review and approval.

Address problems with different indentations

We are getting contributions in many different formatting styles, which makes the GitHub review tools show EVERYTHING as different. We are also getting contributions where the alignment and indentation used to make certain REST-assured calls more readable gets squashed.

Proposal: Use only tabs for indentation; coders can set editor preferences to view it however they like. Research GitHub hooks/services to enforce this.

tabs vs. spaces

Surprisingly, everything else (naming, git-flow, etc.) worked quite smoothly without even talking about it, which is good :-)

But not indentation. While I use 4 spaces, @sspeiche and @spadgett use tabs, and that's causing a lot of commits with nothing but formatting changes.

Can we agree on using one? (And now is when we each push for our own ;-)

If the server doesn't require If-Match cannot test bad ETags

I am using an LDP server implementation that doesn't require clients to send an If-Match header. Due to the test dependency between testPutRequiresIfMatch and testPutBadETag, I cannot run the latter because the former fails: TestNG skips tests after a dependency failure.

Is it really necessary to keep this dependency?

run individual test

Is there any way to run just a single test (e.g. testPatchMethod) from the command line? It would be handy for debugging.

How to handle tests that are indirectly covered by the test suite?

The following requirements are covered by the test suite indirectly. There are no tests referencing them, however, so they look like they're missing in the coverage report. What's the best way to handle these?

  • 4.3.1.1 Each LDP RDF Source MUST also be a conforming LDP Resource as defined in section 4.2 Resource, along with the restrictions in this section. LDP clients MAY infer the following triple: one whose subject is the LDP-RS, whose predicate is rdf:type, and whose object is ldp:Resource, but there is no requirement to materialize this triple in the LDP-RS representation.
  • 5.2.1.1 Each Linked Data Platform Container MUST also be a conforming Linked Data Platform RDF Source. LDP clients MAY infer the following triple: one whose subject is the LDPC, whose predicate is rdf:type, and whose object is ldp:RDFSource, but there is no requirement to materialize this triple in the LDPC representation.
  • 5.4.1.1 Each LDP Direct Container MUST also be a conforming LDP Container in section 5.2 Container along the following restrictions. LDP clients MAY infer the following triple: whose subject is the LDP Direct Container, whose predicate is rdf:type, and whose object is ldp:Container, but there is no requirement to materialize this triple in the LDP-DC representation.

I believe these are the only requirements without a corresponding @SpecTest in the test suite.

[testRelativeUriResolutionPut] Forcing LDP servers to persist new triples

The result of the last check (verifyUpdatedResource(...)) is ignored (it returns void). In any case, it is not this test's job to verify whether the added triples remain after the PUT. Furthermore, the LDP specification does not make this a MUST, so if the server decides not to persist new triples, the test must still pass (after checking for the proper 4xx). See 4.2.4.4 in LDP 1.0.

Proposal
Consider the two possible outcomes: the server chooses to persist the triple, or it does not. If the PUT fails (4xx), check that a subsequent GET does not reflect the modification. If the PUT succeeds, verify that the new triple was kept.
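A hedged sketch of that two-branch check (the class name, readModel-style helpers, and the placeholder hasAddedTriple check are hypothetical, and the request setup is simplified):

import com.jayway.restassured.RestAssured;
import com.jayway.restassured.response.Response;
import org.testng.Assert;

public class RelativeUriPutSketch {

    void checkPutOutcome(String resourceUri, String updatedTurtle) {
        Response put = RestAssured.given()
                .contentType("text/turtle")
                .body(updatedTurtle)
                .put(resourceUri);

        String after = RestAssured.get(resourceUri).asString();

        if (put.getStatusCode() >= 400 && put.getStatusCode() < 500) {
            // The server chose not to persist the new triple: the
            // representation must be unchanged.
            Assert.assertFalse(hasAddedTriple(after),
                    "rejected PUT must not alter the resource");
        } else {
            // The server accepted the PUT: the new triple should now be present.
            Assert.assertTrue(hasAddedTriple(after),
                    "accepted PUT must keep the new triple");
        }
    }

    // Hypothetical helper: true if the representation contains the triple the
    // test added before the PUT.
    private boolean hasAddedTriple(String turtle) {
        return turtle.contains("dcterms:title"); // placeholder check
    }
}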

Move client-only tests out of testng (Java)

Since we will NEVER run these through TestNG, they keep showing up as skipped, which isn't helpful.

Instead, we'll just document the client-only tests in RDF and remove the Java code. The reporting tools will be updated to look for this client-only manifest.

Extend the test subject

As part of issue #17, earl-report requires some information about the concrete implementation; without it, it fails with the following error:

Test subject info not found for ../testsuite/EarlTestSuiteReportTurtle.ttl, expect DOAP description of project solving the following query:

PREFIX doap: <http://usefulinc.com/ns/doap#>
PREFIX foaf: <http://xmlns.com/foaf/0.1/>

SELECT DISTINCT ?uri ?name ?doapDesc ?homepage ?language ?developer ?devName ?devType ?devHomepage
WHERE {
  ?uri a doap:Project; doap:name ?name; doap:developer ?developer .
  OPTIONAL { ?uri doap:homepage ?homepage . }
  OPTIONAL { ?uri doap:description ?doapDesc . }
  OPTIONAL { ?uri doap:programming-language ?language . }
  OPTIONAL { ?developer a ?devType .}
  OPTIONAL { ?developer foaf:name ?devName .}
  OPTIONAL { ?developer foaf:homepage ?devHomepage .}
}

Right now the generated test subjects are just:

<http://www.w3.org/ns/earl#subject>
      [ a       <http://www.w3.org/ns/earl#TestSubject> ;
        <http://purl.org/dc/terms/title>
                "http://localhost:8080/ldp"
      ] ;

The minimum project information would be something like:

<http://marmotta.apache.org> a doap:Project ;
  doap:name "Apache Marmotta" ;
  doap:homepage <http://marmotta.apache.org> ;
  doap:programming-language "Java" ;
  doap:developer [ foaf:name "Sergio Fernández" ] .

I'll take care of the CLI stuff, but I think it's better if whoever implemented the EARL report generator extends it in this direction.

EARL manifest reporter should create unique test case per type of run

Currently it references only one test case, even though that test case may be run against various types of resources. Take GetResource, for example: it is actually run three times against different kinds of resources (configurations), so we should separate those runs out with unique URIs. It would also make sense to link them back to where the test is defined.

Also, need to group based on kinds of conformance: for now, we can just have DC, BC and IC.

testPostResourceAndCheckAssociatedResource SKIPPED - why?

While fighting with #174, I noticed some weird behaviour in the NonRDFSourceTest:

When running the code as-is (f41298c):

$ java -jar target/ldp-testsuite-1.0.0-SNAPSHOT-shaded.jar --server http://localhost:8080/ldp --basic --non-rdf --test testPostResourceAndCheckAssociatedResource
[TestNG] Running:
  LDP Test Suite

testPostResourceAndCheckAssociatedResource         NonRDFSource      Skipped  [MAY]                0ms

Total Time: 0.03s

===============================================
LDP Test Suite
Total tests run: 1, Failures: 0, Skips: 1
Configuration Failures: 0, Skips: 1
===============================================

Now, if I add two System.out.println() calls to the setup() method, the test is no longer skipped:

$ java -jar target/ldp-testsuite-1.0.0-SNAPSHOT-shaded.jar --server http://localhost:8080/ldp --basic --non-rdf --test testPostResourceAndCheckAssociatedResource --httpLogging
[TestNG] Running:
  LDP Test Suite

Start of setup()
Running test against http://localhost:8080/ldp
testPostResourceAndCheckAssociatedResource         NonRDFSource      Failed   [MAY]             1736ms

Total Time: 1.81s

[FAILURE] NonRDFSourceTest.testPostResourceAndCheckAssociatedResource

...

and that's the full diff:

diff --git a/src/main/java/org/w3/ldp/testsuite/test/NonRDFSourceTest.java b/src/main/java/org/w3/ldp/testsuite/test/NonRDFSourceTest.java
index 7436a7e..5ec14f3 100644
--- a/src/main/java/org/w3/ldp/testsuite/test/NonRDFSourceTest.java
+++ b/src/main/java/org/w3/ldp/testsuite/test/NonRDFSourceTest.java
@@ -47,6 +47,7 @@ public class NonRDFSourceTest extends CommonResourceTest {
        @Parameters({ "basicContainer", "directContainer", "indirectContainer" })
        @BeforeSuite(alwaysRun = true)
        public void setup(@Optional String basicContainer, @Optional String directContainer, @Optional String indirectContainer) {
+        System.out.println("Start of setup()");
                if (StringUtils.isNotBlank(basicContainer)) {
                        container = basicContainer;
                } else if (StringUtils.isNotBlank(directContainer)) {
@@ -56,7 +57,7 @@ public class NonRDFSourceTest extends CommonResourceTest {
                } else {
                        throw new SkipException("No root container provided in testng.xml. Skipping LDP Non-RDF Source (LDP-NR) tests.");
                }
-
+        System.out.println("Running test against " + container);
                final String slug = "test",
                                file = slug + ".png",
                                mimeType = "image/png";

Any hint/idea about this?

Missing Preference-Applied response header shouldn't necessarily be a failure

We fail servers that don't include a Preference-Applied response header, but neither LDP nor RFC 7240 requires it to be present. We should remove this check when it's obvious the preference was honored. See:

testPreferContainmentTriples()
testPreferMembershipTriples()

From RFC 7240:

Use of the Preference-Applied header is only necessary when it is
not readily and obviously apparent that a server applied a given
preference and such ambiguity might have an impact on the client's
handling of the response.

From the LDP spec:

Non-normative note: [RFC7240] recommends that server implementers
include a Preference-Applied response header when the client cannot
otherwise determine the server's behavior with respect to honoring
hints from the response content. Examples illustrates some cases
where the header is unnecessary.

(By the way, @sspeiche, minor grammar mistake: it should be, "Examples illustrate...")
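Putting the two quotes together, a hedged sketch of the relaxed assertion (containsContainmentTriples is a hypothetical helper and the class name is a placeholder):

import org.testng.Assert;

import com.jayway.restassured.response.Response;

public class PreferenceAppliedSketch {

    void assertContainmentPreferenceHonored(Response response) {
        String preferenceApplied = response.getHeader("Preference-Applied");

        if (preferenceApplied != null) {
            // Header present: it must reflect the requested preference.
            Assert.assertTrue(preferenceApplied.contains("return=representation"));
        } else {
            // Header absent: that's fine per RFC 7240, as long as the response
            // content makes it obvious the preference was honoured.
            Assert.assertTrue(containsContainmentTriples(response),
                    "preference not obviously applied and no Preference-Applied header");
        }
    }

    // Hypothetical helper: inspect the body for ldp:contains triples.
    private boolean containsContainmentTriples(Response response) {
        return response.asString().contains("http://www.w3.org/ns/ldp#contains");
    }
}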

[testIsHttp11Server] Verification of HTTP/1.1 conformant servers

The specification of HTTP/1.1 says:

“An application that sends a request or response message that includes HTTP-Version of "HTTP/1.1" MUST be at least conditionally compliant with this specification. Applications that are at least conditionally compliant with this specification SHOULD use an HTTP-Version of "HTTP/1.1" in their messages, and MUST do so for any message that is not compatible with HTTP/1.0.”

This means that not every HTTP/1.1-conformant server will include an HTTP-Version of "HTTP/1.1", and even if a server does, that alone does not make it conformant. Passing the test (i.e., finding "HTTP/1.1" in the status line) only shows that the version number appears there, not that the server is actually HTTP/1.1 conformant.

Should we fail servers that add additional triples on PUT?

Eclipse Lyo LDP adds the triple

<> dcterms:contributor <current-user>

to resources on PUT. This causes RdfSourceTest.putReplaceResource (a MUST test) to fail, since that test explicitly asserts that no extra triples were added. Is the test correct?

This is what the spec says:

4.2.4.1 If a HTTP PUT is accepted on an existing resource, LDP servers must replace the entire persistent state of the identified resource with the entity representation in the body of the request. LDP servers may ignore server-managed properties such as dcterms:modified and dcterms:creator if they are not under client control. Any LDP servers that wish to support a more sophisticated merge of data provided by the client with existing state stored on the server for a resource must use HTTP PATCH, not HTTP PUT.

I'm not sure how to interpret it. It does mention "server-managed properties," but only says they can be ignored; it says nothing about adding server-managed properties.

If the Lyo behavior is wrong, it's easy enough to fix. If adding triples is OK, we should change the test.

I have a related fix for other issues (#141).

Document manual tests

We should give guidance on how to manually test requirements that aren't automated, possibly using Java annotations on the unimplemented test methods.

Link header in all responses to requests made to LDPRs

The specification says:

4.2.1.4 LDP servers exposing LDPRs must advertise their LDP support by exposing a HTTP Link header with a target URI of http://www.w3.org/ns/ldp#Resource, and a link relation type of type (that is, rel='type') in all responses to requests made to an LDPR's HTTP Request-URI [RFC5988].

As far as I've seen, the basic tests that perform GET and/or OPTIONS requests check for this header in the responses, but the same doesn't happen for the other methods (POST, PUT, ...), except in the NonRDFSource tests.

Is there any reason why this is so (assuming I'm right)?
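If the checks were extended, a sketch of what a POST-side assertion could look like (containsLinkHeader here is a hypothetical naive helper for RFC 5988 Link headers, and the class name, containerUri, and turtle are placeholders):

import java.util.List;

import org.testng.Assert;

import com.jayway.restassured.RestAssured;
import com.jayway.restassured.response.Header;
import com.jayway.restassured.response.Response;

public class LinkHeaderOnPostSketch {

    void postAndCheckLinkHeader(String containerUri, String turtle) {
        Response post = RestAssured.given()
                .contentType("text/turtle")
                .body(turtle)
                .post(containerUri);

        Assert.assertTrue(
                containsLinkHeader(post, "http://www.w3.org/ns/ldp#Resource", "type"),
                "4.2.1.4: POST response is missing Link: <...ldp#Resource>; rel=\"type\"");
    }

    // Hypothetical helper: naive scan of all Link headers for a target URI and
    // relation type.
    private boolean containsLinkHeader(Response response, String uri, String rel) {
        List<Header> links = response.getHeaders().getList("Link");
        for (Header link : links) {
            if (link.getValue().contains("<" + uri + ">")
                    && link.getValue().contains("rel=\"" + rel + "\"")) {
                return true;
            }
        }
        return false;
    }
}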
