Testing RPDE feeds
Please note that all feeds must pass the OpenActive Test Suite in order to be usable by the largest number of data users. As described below, the interactive OpenActive Validator only validates the first 10 items / first 20 pages of a feed, whereas the OpenActive Test Suite runs the same checks over all items, in all pages, of all feeds.
The interactive online OpenActive Validator allows the user to perform two types of validation:
RPDE validation: attempts to consume only the first 20 RPDE pages of a feed and checks for common errors in the RPDE implementation
Model validation: depending on the data supplied, checks either (i) an individual item, (ii) only the first 10 JSON-LD items in a single feed page, or (iii) an Open Booking API request/response for validity
The OpenActive Validator is useful during development for instant feedback on the basic data structure and feed implementation; however, it is not designed to comprehensively validate a complete implementation.
The downloadable OpenActive Test Suite includes an option for comprehensively validating OpenActive open data feeds. Behind the scenes it runs the same checks as the OpenActive Validator for every RPDE page in the feed, and for every JSON-LD item in the feed.
The OpenActive Test Suite produces a results page that allows the user to explore specific errors with individual data items, by loading those items into the interactive OpenActive Validator.
Clone the test suite repository locally, and install its dependencies.
Node.js version 14 or above is required.
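For example, assuming the test suite's standard GitHub location:

```
git clone https://github.com/openactive/openactive-test-suite.git
cd openactive-test-suite
npm install
```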
Validation mode does not require any specific configuration; simply run the following command with the URL of the dataset site to be tested:
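A sketch of the command, using a placeholder dataset site URL; the validate-feeds script name is taken from the test suite README at time of writing, so check the repository if it has changed:

```
npm run validate-feeds "https://example.com/openactive"
```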
Things to check to ensure the feed is implemented correctly.
Misreading the query in the specification is the single most common cause of incorrect implementation. Please read it carefully and ensure that brackets and comparators are used correctly: > not >=, for example (the specification's ordering query is sketched below).
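For reference, the ordering query from the RPDE specification ("Modified Timestamp and ID" ordering) takes this general form; table and column names will differ per implementation:

```sql
-- Return the next page of items strictly after the (@afterTimestamp, @afterId) cursor.
-- Note the > comparators: an item exactly matching the cursor is never returned again.
SELECT *
  FROM items
 WHERE (modified = @afterTimestamp AND id > @afterId)
    OR (modified > @afterTimestamp)
 ORDER BY modified, id
 LIMIT 500
```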
Does the next URL work as expected and return a valid page? It should never result in a 500 error.
"modified
" must always be an integer, afterTimestamp
/ afterChangeNumber
must also be an integer.
Does the afterTimestamp or afterChangeNumber of the next URL always increase with each new page? If not, the query has likely been implemented incorrectly.
There should be "deleted" items in the feed. If these are missing, it is likely the feed has not been implemented correctly (an example deleted item is shown below).
The next URL should be an absolute URL, not a relative one.
Are all responses returned with the header Content-Type: application/json?
Check for duplicate IDs: items should not appear more than once in the feed if the source data is unchanging.
Pages should contain at least 500 items (this is a warning rather than an error).
Are the next URL parameters urlencoded? (See the example below.)
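For example, a parameter value containing reserved characters must be percent-encoded (hypothetical values shown):

```
https://example.com/feed?afterTimestamp=1453931101&afterId=ABC%2F123
```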
Is the next URL present on the last page? The next URL on the last page should match the URL of the current page.
Check that the items array of the last page is empty (see the sketch below).
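A valid last page therefore looks like this sketch, with an empty items array and a next URL matching the page's own URL (the endpoint and cursor values are hypothetical):

```json
{
  "next": "https://example.com/feed?afterTimestamp=1453931101&afterId=1402CBP20150217",
  "items": [],
  "license": "https://creativecommons.org/licenses/by/4.0/"
}
```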
To quickly access the last page:
afterChangeNumber: Put a high integer in for afterChangeNumber to return the last page.
Does the feed include all historical data from the beginning of time and not just data in the future or from today's date?
Does the endpoint without any parameters return the first page (from the beginning of time)?
Does each page contain a "license" key?
Please ensure that afterTimestamp has been implemented correctly:
afterTimestamp: If "modified" is an integer, put a high integer in for afterTimestamp to return the last page. (N.B. the spec currently allows for strings to be used for "modified", but a future version of the spec will likely deprecate string values.)
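As a quick check, requesting the feed with a very high cursor value should immediately return the empty last page (the endpoint URL is a placeholder):

```
GET https://example.com/feed?afterTimestamp=9999999999
```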