Basic HTTP testing

This is the Vespa Cloud Testing reference for basic HTTP tests. These tests verify the behaviour of a Vespa application by using its HTTP interfaces. Refer to automated deployments for context. Basic HTTP tests are written in JSON; to write more advanced tests, see the Java testing reference.

The Vespa CLI performs endpoint discovery and runtime-dependent authentication against the Vespa deployment under test. It can run one or more tests against an application in the Vespa Cloud, or against a self-hosted Vespa deployment:

$ vespa test src/test/application/tests/system-test/feed-and-search-test.json
Running feed-and-search-test.json: .... OK

1 test completed successfully

Test format

Each test is described by a single JSON file, and may include other files using relative paths. The placement of a test file under the src/test/application/tests/ directory determines which test suite it belongs to, and hence in which production deployment phase it runs. The various test suites are described further down. The possible sub-directories are system-test, staging-setup, staging-test, and production-test.

$ ls -1 src/test/application/tests/*/*
src/test/application/tests/production-test/metrics-test.json
src/test/application/tests/staging-setup/set-up-old-documents.json
src/test/application/tests/staging-test/verify-search-still-works.json
src/test/application/tests/system-test/feed-and-search-test.json
src/test/application/tests/system-test/ranking-test.json

When a Vespa application is submitted to production, all files under src/test/application/ are packaged as a separate test artifact to be submitted alongside the actual application.

Test file structure

Each .json file directly under any of the directories listed above describes one test. Each test consists of a series of steps, and each step specifies an HTTP request to run, and some assertions about the response to obtain. Some additional properties may also be specified on both the test and step levels. A full example, with comments:

{
    "name": "my test",
    "defaults": {
        "cluster": "default",
        "parameters": {
            "timeout": "1.618s"
        }
    },
    "steps": [
        {
            "name": "clear existing documents",
            "request": {
                "method": "DELETE",
                "uri": "/document/v1/",
                "parameters": {
                    "cluster": "music",
                    "selection": "true"
                }
            }
        },
        {
            "name": "feed foo",
            "request": {
                "method": "POST",
                // should contain payload as expected by /document/v1/
                "body": "foo/body.json",
                // specify only the path and query for Vespa requests
                "uri": "/document/v1/test/music/docid/foo?timeout=8s",
                // JSON object file; merged with query from "uri"
                "parameters": "foo/parameters.json"
            }
            // no response spec: just assert code 200
        },
        {
            "name": "query for foo",
            "request": {
                // no "uri": defaults to "/search/"
                "parameters": {
                    "query": "artist: foo"
                }
            },
            "response": {
                "body": {
                    "root": {
                        "children": [
                            // assert "children" has a single element ...
                            {
                                // ... which has the field "fields" ...
                                "fields": {
                                    // ... where the field "artist" is "Foo Fighters" ...
                                    "artist": "Foo Fighters"
                                },
                                // ... and the field "relevance" close to 0.381862383599
                                "relevance": 0.381862383599
                            }
                        ]
                    }
                }
            }
        }
    ]
}

Test JSON specification

A full list of fields, with descriptions:

  • name — parent: root or step; type: string; default: the file name for the test, "step n" for steps. Name used for display purposes in the test report. Steps are 1-indexed.
  • defaults — parent: root; type: object. Default settings for all steps in this test. May be overridden in each step.
  • steps — parent: root; type: array. The non-empty list of steps that constitute this test.
  • request — parent: step; type: object. A specification of a request to send, either to Vespa or to an external service.
  • cluster — parent: defaults or request; type: string. The name of the Vespa cluster to send the request to, as specified in services.xml. If this is not specified, and the application has a single container cluster, that cluster is used.
  • method — parent: request; type: string; default: "GET". The HTTP method to use for the request.
  • uri — parent: request; type: string; default: "/search/". When this is only a path + (encoded) query, the host is determined by the specified cluster; otherwise, it must be an absolute URI (with scheme), and its host is used. Query parameters specified here override those specified in the defaults.
  • parameters — parent: defaults or request; type: string or object. HTTP request query parameters. The values should not be encoded. These are merged with parameters from the specified URI, and override those specified in the defaults. If the value is a string, it must be a relative file reference to a JSON object of parameters.
  • body — parent: request or response; type: string or object. The body for a request, or the partial body (see matching) for a response. If the value is a string, it must be a relative file reference to a JSON object to be used in its place.
  • response — parent: step; type: object. A specification of assertions to make on the HTTP response obtained by executing the HTTP request in the same step.
  • code — parent: response; type: number; default: 200. The status code the response should have.
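To illustrate the file-reference variants of parameters and body, the hypothetical step below reads both from files located next to the test file (the bar/ file names are made up for this sketch):

```json
{
    "name": "feed from files",
    "request": {
        "method": "POST",
        "uri": "/document/v1/test/music/docid/bar",
        "parameters": "bar/parameters.json",
        "body": "bar/body.json"
    }
}
```

Here, bar/parameters.json would contain a JSON object of query parameters, e.g., { "timeout": "5s" }, and bar/body.json the document payload expected by /document/v1/.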

JSON matching

All requests and responses must be in JSON format. The tests allow simple JSON verification, by describing what should be present in the actual response. This is done by specifying a JSON structure, a template, for each response, and requiring each field present in the template to match the corresponding field in the actual response. Template fields that fail to match result in test failure, according to the following rules:

  • Objects must contain all listed fields, and may also contain unlisted ones.
  • Arrays must match element-by-element.
  • Numbers must match within precision 1e-9.
  • All other values must match exactly.

Note that the empty object { } matches any other object, and can be used to fill elements of an array that require no further validation.
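As a sketch of these rules, a response whose root.children array holds exactly two hits could be verified with a template like the one below: the empty first element accepts any hit, while the second asserts a single field (the field values here are illustrative):

```json
{
    "root": {
        "children": [
            { },
            {
                "fields": {
                    "artist": "Bar Fighters"
                }
            }
        ]
    }
}
```

Because arrays match element-by-element, this template also asserts that children contains exactly two elements.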

System tests

Tests located under src/test/application/tests/system-test are system tests. A system test is a functional test of a Vespa deployment, and each test in a system test suite should be self-contained. To achieve this, the first step should be to clear all existing documents. The next steps then set up a particular state, which is then verified by the final steps.
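A minimal self-contained system test following this clear–set up–verify pattern could look like the sketch below (the document type and field names are hypothetical):

```json
{
    "name": "self-contained system test",
    "steps": [
        {
            "name": "clear existing documents",
            "request": {
                "method": "DELETE",
                "uri": "/document/v1/",
                "parameters": { "cluster": "music", "selection": "true" }
            }
        },
        {
            "name": "feed a document",
            "request": {
                "method": "POST",
                "uri": "/document/v1/test/music/docid/baz",
                "body": { "fields": { "artist": "Baz Quartet" } }
            }
        },
        {
            "name": "verify it is searchable",
            "request": {
                "parameters": { "query": "artist: baz" }
            },
            "response": {
                "body": {
                    "root": {
                        "children": [
                            { "fields": { "artist": "Baz Quartet" } }
                        ]
                    }
                }
            }
        }
    ]
}
```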

To run a system test, configure the Vespa CLI as in the getting started guide, and then run:

$ vespa deploy --wait 600
$ vespa test src/test/application/tests/system-test/feed-and-search-test.json

Refer to an example system test in feed-and-search-test.json, or check the example above.

System tests in Vespa Cloud pipeline

During automated tests, a fresh deployment is made to the test environment. The system test suite is then run against the endpoints of the test deployment. The application package and Vespa runtime version are the same as those to be deployed to production; however, each test cluster is reduced in size to 1 node.
Staging tests

Tests under src/test/application/tests/staging-test/ and src/test/application/tests/staging-setup/ together comprise a staging test suite. These are run in the automated staging test job, also against a fresh deployment. Unlike the system test suite, the goal of the staging tests is not to ensure the new deployment satisfies its functional specifications; rather, it is to ensure the upgrade of an existing production cluster is safe, and compatible with the behaviour expected by existing clients.

A staging test may, for instance, test an upgrade from application package X to X+1, and from platform version Y to Y+1. The staging test then consists of the following steps:

  1. Deploy the initial pair X, Y to the staging environment.
  2. Populate the deployment with data, making it reasonably similar to a production deployment. This is done by the staging-setup test files, which typically feed a set of static documents.
  3. Upgrade the deployment to the target pair X+1, Y+1.
  4. Verify the deployment works as expected after the upgrade. This is done by the staging-test tests.
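For instance, after a staging-setup file has fed a fixed set of documents before the upgrade, a staging-test file could verify that they are still searchable afterwards (the query and field values below are illustrative):

```json
{
    "name": "verify search still works",
    "steps": [
        {
            "request": {
                "parameters": { "query": "artist: foo" }
            },
            "response": {
                "body": {
                    "root": {
                        "children": [
                            { "fields": { "artist": "Foo Fighters" } }
                        ]
                    }
                }
            }
        }
    ]
}
```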

The HTTP requests in the staging setup and tests represent the clients of the application, and should be kept in sync with those clients. A staging test can be used to verify either an upgrade of the clients or of the application itself: to verify a client upgrade, change only the staging-test files, and run both phases against the same application; to verify an application upgrade, keep the tests fixed, but change the application, and deploy that change between running the staging-setup files and the staging-test files:

$ vespa deploy --wait 600
$ vespa test src/test/application/tests/staging-setup
$ # make changes to the application
$ vespa deploy --wait 600
$ vespa test src/test/application/tests/staging-test

Staging tests in Vespa Cloud pipeline

The three phases of the staging test (feed, upgrade, verify) are automatically run in sequence by the system. The sizes of clusters in staging are by default reduced to 10% of the size specified in services.xml, or at least 2 nodes.

Production tests

Tests under src/test/application/tests/production-test/ are executed in a production test step. See deployment.xml for details on how to run these. Unlike the system and staging tests, production tests do not have access to the Vespa endpoints, for security reasons. A production test could instead, e.g., verify high-level metrics of the production deployment using some external service. The release pipeline stops if the tests fail, but regions that were already upgraded remain on the version where the test failed.

{
    "steps": [
        {
            "request": {
                "uri": "https://my.external.service/metrics/?query=customer-engagement"
            }
        }
    ]
}
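Since such a request targets an external service, its URI must be absolute; response assertions work the same way as for Vespa responses. A variant with a response check could look like this (my.external.service and the metric structure are hypothetical):

```json
{
    "steps": [
        {
            "request": {
                "uri": "https://my.external.service/metrics/?query=customer-engagement"
            },
            "response": {
                "code": 200,
                "body": {
                    "engagement": { }
                }
            }
        }
    ]
}
```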

A production test can be configured to run later than the actual deployment; this is expressed in deployment.xml. The example deployment.xml below runs production tests for a zone 3 hours after deployment has completed:

  <deployment version="1.0">
    <instance id="default">
      <prod>
        <steps>
          <region active="true">aws-us-west-2d</region>
          <delay hours="3" />
          <test>aws-us-west-2d</test>
        </steps>
      </prod>
    </instance>
  </deployment>