
CDN Acceptance Tests

Acceptance tests for our Content Delivery Network(s).

These are written in Go, using its testing package for basic assertions and the standard library's rich HTTP client/server support.

Methodology

The single Go process acts as both the client and the origin server so that it can inspect the input and output of the CDN.

                   +---------+
         +-------> |         |---------+
         |         |   CDN   |         |
         | +-------|         | <-----+ |
         | |       +---------+       | |
         | |                         | |
 request-| |-response                | |
         | |                         | |
         | |   +-----------------+   | |
         | +-> |     go test     |---+ |
         |     |                 |     |
         +-----| client ¦ server | <---+
               +-----------------+

When testing a real CDN, the tests must be run on a server that the CDN can connect to.

The tests will not configure the CDN service for you; you'll need to do that yourself, pointing the service at the machine that will be running the tests.

For more information, see the accompanying post on the GDS Technology blog.

Running

You will need the Go 1.x runtime installed. To install this on OS X:

brew install go

To run all the tests:

go test -edgeHost cdn-vendor.example.com -vendor cdn-vendor

...where -edgeHost specifies the hostname of the CDN edge and -vendor identifies the CDN provider under test.

To run a subset of tests based on a regex:

go test -edgeHost cdn-vendor.example.com -run 'Test(Cache|NoCache)' -vendor cdn-vendor

To see all available command-line options:

go test -usage

Adapting the tests to your own configuration

You may need to make some changes to adapt the tests to your specific configuration.

  • The tests disregard all HEAD requests, treating them as healthcheck probes. You may need to modify this, or filter on other HTTP request headers, depending on how your edge sends healthcheck probes (see the sketch below).
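
As a minimal, self-contained sketch, an origin handler that treats HEAD requests as probes might look like the following. The HEAD condition is the part to adapt; this is an illustration, not the suite's actual handler:

package main

import (
    "fmt"
    "net/http"
)

// Hypothetical standalone origin illustrating the probe filtering: HEAD
// requests are assumed to be healthcheck probes and are acknowledged
// without being treated as test traffic.
func main() {
    http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
        if r.Method == "HEAD" {
            w.WriteHeader(http.StatusOK)
            return // probe: acknowledge and ignore
        }
        fmt.Fprintln(w, "test response") // real test traffic
    })
    http.ListenAndServe(":8080", nil)
}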

Writing tests

When writing new tests please be sure to:

  • group the test in a file with other tests of similar behaviour e.g. "custom failover"
  • use a consistent naming prefix for the functions so that they can be run as a group e.g. func TestCustomFailover…(…)
  • always call ResetBackendsInOrder() at the beginning of each test to ensure that all of the backends are running and have their handlers reset from previous tests.
  • use the helpers such as NewUniqueEdgeGET() and RoundTripCheckError() which do a lot of the work, such as error checking, for you.
  • define static inputs such as "number of requests" or "time between requests" at the beginning of the test so that they're easy to locate. Use constants where possible to indicate that they won't be changed at runtime.
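
Putting those conventions together, a new test might look like this sketch. The helper signatures are assumptions based on how they're used above; check the suite's helpers for the real ones:

// Sketch only: ResetBackendsInOrder, NewUniqueEdgeGET and
// RoundTripCheckError are called with assumed signatures.
func TestCustomFailoverExample(t *testing.T) {
    ResetBackendsInOrder()

    // Static inputs defined up-front so they're easy to locate.
    const requestCount = 3
    const waitBetweenRequests = time.Second

    req := NewUniqueEdgeGET(t)

    for i := 0; i < requestCount; i++ {
        resp := RoundTripCheckError(t, req)
        if resp.StatusCode != http.StatusOK {
            t.Errorf("Received status %d, expected 200", resp.StatusCode)
        }
        resp.Body.Close()
        time.Sleep(waitBetweenRequests)
    }
}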

Mock CDN virtual machine

You can develop new tests against a Vagrant VM which uses Varnish to simulate a CDN. Nginx and stunnel are used to terminate/initiate TLS and inject headers.

               +---------------------------+
         +---> |        Vagrant VM         |-----+
         |     |                           |     |
         | +---| Nginx ¦ Varnish ¦ stunnel | <-+ |
         | |   +---------------------------+   | |
         | |                                   | |
 request-| |-response                          | |
         | |                                   | |
         | |        +-----------------+        | |
         | +------> |     go test     |--------+ |
         |          |                 |          |
         +----------| client ¦ server | <--------+
                    +-----------------+

You may need to modify the configuration of the VM in mock_cdn_config/ to account for new tests.

To bring up the VM and point the tests at it:

vagrant up && vagrant provision
go test -edgeHost 172.16.20.10 -skipVerifyTLS -vendor fastly

Please note that this is not a complete substitute for the real thing. You must test against a real CDN before submitting any pull requests.

cdn-acceptance-tests's People

Contributors

alexmuller, dcarley, mattbostock, mikepea, rjw1, samjsharpe

cdn-acceptance-tests's Issues

Helper functions to reset all backends

StartBackendsInOrder should be modified so that it can be called multiple times to reset all backends back to a default state, including setting their default handler.

We could call this at the beginning of every test to ensure that we don't have any side effects from previous tests. The current test could then stop any backends it wishes.

It will need to check that each backend variable is bound to an active CDNServeMux that is listening. If any of them aren't, it would need to Stop() any of the backends above that "level" and bring them back up, checking the responses through the edge for each (as the tests currently do at the beginning of a run).
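
A rough sketch of that behaviour, presumably the ResetBackendsInOrder() mentioned in the README. The backends slice, Running(), SwitchHandler() and the start/stop helpers are all assumptions about the suite's internals:

func ResetBackendsInOrder() {
    // backends is assumed ordered origin-first, then each mirror in turn.
    for i, backend := range backends {
        if backend == nil || !backend.Running() {
            // Stop everything at and above this "level", then bring it
            // back up in order, verifying each through the edge as the
            // tests currently do at startup.
            stopBackends(backends[i:])
            startBackends(backends[i:])
            break
        }
    }
    // Restore default handlers so no side effects leak between tests.
    for _, backend := range backends {
        backend.SwitchHandler(defaultHandler)
    }
}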

Use HTTPS

We're currently using HTTP for the fake origin server to simplify development. We should switch to http.ListenAndServeTLS() and remove the hacks from the CDN service config.
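
A minimal sketch of what the TLS origin could look like; cert.pem and key.pem are placeholders, and a self-signed pair would do for development:

package main

import "net/http"

func main() {
    handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
        w.Write([]byte("origin over TLS\n"))
    })
    err := http.ListenAndServeTLS(":8443", "cert.pem", "key.pem", handler)
    if err != nil {
        panic(err)
    }
}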

Implement remaining 5xx back-off test

TestFailoverOrigin5xxBackOff needs to be implemented.

Based on the behaviour of TestFailoverOrigin5xxUseFirstMirror I expect it will fail against the real CDN service. Technically the "origin should only receive one request" assertion could be removed from that other test when this is implemented, but I think it's still useful to keep.

Improve PURGE test

The test for PURGE requests should probably check that the object hasn't been evicted from the edge's cache, rather than just ensuring that we get a non-200 response, because it's quite conceivable that we could get a 403 and the purge could still have taken effect.

Document how to find CLI flags

go test takes some useful arguments like -run REGEX. We also have some custom flags like -skipVerifyTLS. It would be good to document how to find these (but not necessarily duplicate a list of them) in the README.

Normally you'd just pass -help to see the usage that the flag package generates. However, that seems to be overridden for go test -help. The only way I've found so far is to pass an invalid argument, like:

➜  cdn-acceptance-tests git:(master) go test -wut
flag provided but not defined: -wut
Usage of /var/folders/4f/b_tpfp_s3hn63td00b2v45ym0000gn/T/go-build054563818/_/Users/dcarley/projects/govuk/cdn-acceptance-tests/_test/cdn-acceptance-tests.test:
  -edgeHost="www.gov.uk": Hostname of edge
  -originPort=8080: Origin port to listen on for requests
  -skipVerifyTLS=false: Skip TLS cert verification if set
  -test.bench="": regular expression to select benchmarks to run
  -test.benchmem=false: print memory allocations for benchmarks
  -test.benchtime=1s: approximate run time for each benchmark
  -test.blockprofile="": write a goroutine blocking profile to the named file after execution
  -test.blockprofilerate=1: if >= 0, calls runtime.SetBlockProfileRate()
  -test.coverprofile="": write a coverage profile to the named file after execution
  -test.cpu="": comma-separated list of number of CPUs to use for each test
  -test.cpuprofile="": write a cpu profile to the named file during execution
  -test.memprofile="": write a memory profile to the named file after execution
  -test.memprofilerate=0: if >=0, sets runtime.MemProfileRate
  -test.outputdir="": directory in which to write profiles
  -test.parallel=1: maximum test parallelism
  -test.run="": regular expression to select tests and examples to run
  -test.short=false: run smaller test suite to save time
  -test.timeout=0: if positive, sets an aggregate time limit for all tests
  -test.v=false: verbose: print additional output
exit status 2
FAIL    _/Users/dcarley/projects/govuk/cdn-acceptance-tests     0.025s

Pull repeated logic surrounding `client.RoundTrip(req)`

We are repeatedly calling client.RoundTrip(req), with common checks following it:

  • check err from RoundTrip itself.
  • check the response's StatusCode.

We are also optionally doing some repeated logic before it:

url := fmt.Sprintf("https://%s/%s", *edgeHost, NewUUID())
req, _ := http.NewRequest("GET", url, nil)

... and potentially sleeping beforehand too.

We should break this logic out into a helper.
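
One possible shape for such a helper, reusing the client, edgeHost and NewUUID from the snippet above. The name and signature here are hypothetical:

func roundTripExpectStatus(t *testing.T, wait time.Duration, expectedStatus int) *http.Response {
    if wait > 0 {
        time.Sleep(wait)
    }

    url := fmt.Sprintf("https://%s/%s", *edgeHost, NewUUID())
    req, err := http.NewRequest("GET", url, nil)
    if err != nil {
        t.Fatal(err)
    }

    resp, err := client.RoundTrip(req)
    if err != nil {
        t.Fatal(err)
    }
    if resp.StatusCode != expectedStatus {
        t.Errorf("Received status %d, expected %d", resp.StatusCode, expectedStatus)
    }
    return resp
}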

Clarify and implement remaining serve stale test

TestFailoverOriginDownServeStale is not implemented.

While looking at it and reviewing the original tests from Matt, it seems that we need to split this test into two. If origin is down and the cached response is beyond its normal TTL:

  • and the health check has expired then it will serve the stale response.
  • and the health check hasn't expired then it will serve a response from mirror and that will replace the stale object.

Additionally it seems that 5xx responses from the mirrors might have the default TTL applied and be cached in place of a good response from origin.

Prevent switching edge regions during tests

Fastly's DNS returns a single record with a low TTL that is geo-located to one region. When the TTL expires it's quite likely that you'll hit another region (e.g. first AMS and then FRA). This is made worse by the fact that, I think, each HTTP request will do its own lookup.

Override this to make our tests more reliable.
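
One way to do this (a sketch, not how the suite currently works) is to resolve the edge once at startup and pin every connection to that address:

// Assuming net, net/http, crypto/tls and log are imported.
ips, err := net.LookupIP(*edgeHost)
if err != nil || len(ips) == 0 {
    log.Fatalf("could not resolve %s: %v", *edgeHost, err)
}
edgeAddr := net.JoinHostPort(ips[0].String(), "443")

transport := &http.Transport{
    // Ignore whatever the hostname resolves to per-request; always
    // dial the address pinned above.
    Dial: func(network, addr string) (net.Conn, error) {
        return net.Dial(network, edgeAddr)
    },
    // Still verify the certificate against the real edge hostname.
    TLSClientConfig: &tls.Config{ServerName: *edgeHost},
}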

Additional tests for standard CDN behaviour

I think we should add some additional tests for standard CDN behaviour.

Does not cache requests with:

  • Cookie header
  • Authorization header
  • POST method

Does cache responses with:

  • Cache-Control: max-age=n
  • Expires: n
  • 404 status code (because this always seems to surprise people)
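
As a sketch of the first group, using the same assumed helper signatures as above (SwitchHandler on the origin is also an assumption about the suite's internals):

// Requests carrying an Authorization header should not be cached, so
// every request should reach origin.
func TestNoCacheHeaderAuthorization(t *testing.T) {
    ResetBackendsInOrder()

    const requestCount = 2
    requestsReceived := 0
    originServer.SwitchHandler(func(w http.ResponseWriter, r *http.Request) {
        requestsReceived++
    })

    req := NewUniqueEdgeGET(t) // same URL for every round trip
    req.Header.Set("Authorization", "Basic dXNlcjpwYXNzd29yZA==")

    for i := 0; i < requestCount; i++ {
        resp := RoundTripCheckError(t, req)
        resp.Body.Close()
    }

    if requestsReceived != requestCount {
        t.Errorf("Origin received %d requests, expected %d", requestsReceived, requestCount)
    }
}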

Split tests out to separate files

It would be good to group the tests in separate files. Partly because the file size is growing as we add more tests, making it harder to find the thing you want. But more importantly so that we can separate the stuff that's specific to www.gov.uk from the generic CDN behaviours, in order to:

  • Re-use them for our other services like assets.digital
  • Make it easier for third-parties to use and improve our open source code

We'd have to figure out how this will play with our use of flag and init().

Add Test for X-Cache 'follow on' behaviour?

Raising this issue as a discussion point. When developing [#39] we pondered whether to add a subsequent test to check that the [HIT, MISS] X-Cache header would turn to [HIT, HIT] on a subsequent request for the now-cached object.

This is more complicated than the base checks, since it requires a pause before the check to ensure the CDN cache has been distributed locally -- and we're not sure it's worth the extra effort/test-time-increase required.
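
If we did add it, the check might look something like this sketch; the pause duration is a guess, and the helper signatures are assumed as elsewhere:

req := NewUniqueEdgeGET(t)

first := RoundTripCheckError(t, req)
first.Body.Close()

// Allow time for the cached object to be distributed within the edge.
time.Sleep(5 * time.Second)

second := RoundTripCheckError(t, req)
second.Body.Close()

if xCache := second.Header.Get("X-Cache"); xCache != "HIT, HIT" {
    t.Errorf("Expected X-Cache of 'HIT, HIT', got %q", xCache)
}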

Ability to run without failover tests and additional backends

It should be possible to run the tests without including the failover tests and setting up the additional "backup" backends.

We will want to use this when testing Bouncer or Assets, which don't currently do mirroring. Other open source users would also want this ability.

Might be related to #50.
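
Something like this, perhaps (the flag name is hypothetical): a flag that failover tests check before starting, so Bouncer/Assets runs can skip them and their backup backends.

var skipFailover = flag.Bool("skipFailover", false,
    "Skip failover tests and their backup backends")

func TestFailoverOrigin5xxUseFirstMirror(t *testing.T) {
    if *skipFailover {
        t.Skip("failover tests disabled by -skipFailover")
    }
    // ... existing test body ...
}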
