It's really easy to write poor-quality API documentation but much harder to write the good stuff. That's a shame, because your API specification and any supporting documentation are a very visible and important part of your API product. In fact, in the early stages they might be the only visible part of your product.
This advice is based on our experience at NHS Digital over the last couple of years building the developer hub and APIs, as well as in previous lives at HMRC. According to our user research, the software development organisations consuming our APIs to develop their applications really value good API documentation. They also really struggle to use an API if the documentation is poor.
So, how can you find out how good your API documentation is now, and where you can improve it? The answer is to use tried and trusted techniques that have been applied to designing government services for decades. In other words, follow the GDS agile delivery process, doing the hard work to make it simple.
1. User research
User research isn't traditionally associated with API documentation, but it works well here, usually in the form of contextual testing. With developers, we've found you can do this remotely using screen sharing, so you can see what the person is doing and note any pain points. The simplest way is to set tasks for your developers that help investigate your hypotheses.
For example, if we want to find out how easy it is for developers to find the right API, we might ask them to show us "how to search for a patient using their NHS number". Then we watch how they use our documentation to find the answer. We'll be watching for things like how quickly they can find:
- our developer hub (with their favourite search engine)
- its API catalogue
- the API catalogue filters (and whether they use them)
- our Personal Demographics Services (PDS) API specifications
- our most modern PDS FHIR API specification (and if they don't choose it, then why not?)
- the right API endpoint to use to search for a patient
If you can stand on the shoulders of giants, and re-use user research into similar target audiences from, say, HMRC, then so much the quicker.
2. Web page feedback
The NHS Digital website that hosts our API documentation uses a tool called Hotjar, and yours might have something similar. In addition to the general traffic monitoring we do with Google Analytics, we use Hotjar to:
- create heatmaps and recordings of key pages to see the general visitor browsing flow
- set up pop-up polls on specific pages to ask specific questions, like "did you find what you were looking for?"
- capture feedback on documentation down to the specific sentence or paragraph
- request email addresses for any follow up
3. A well-managed support mailbox
Rather than outsource this to a separate organisation, we wanted to be close to the questions our developers ask as part of their help and support needs. So we manage our own support mailbox within the team. This means we can actively use these questions to clarify and iterate our documentation, sometimes even on the same day, which can really impress a developer.
For example, one developer had an issue understanding our docs on how to generate a JSON Web Token (JWT) key pair, which is needed to securely transmit information between parties. We explained what to do, and the developer explained why they got confused. So we made a quick change to the page to replace bullets with numbered steps which made it clearer. The developer said, "This is amazing! Thank you so much for considering and implementing my feedback, this is such a useful edit."
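The authentication pattern described here relies on an asymmetric key pair: the developer signs each JWT with their private key, and the API platform uses the registered public key to verify the signature. As a rough sketch of the key generation step (the file names and key size here are illustrative, and the exact requirements and registration steps will be in the API's own security documentation), a pair can be generated with OpenSSL:

```shell
# Generate an RSA private key for signing JWTs
# (file name and 4096-bit key size are illustrative)
openssl genrsa -out jwt-signing-key.pem 4096

# Extract the matching public key, which would be registered
# with the API platform so it can verify your signed JWTs
openssl rsa -in jwt-signing-key.pem -pubout -out jwt-signing-key.pem.pub
```

The private key never leaves the API consumer; only the public half is shared, so the platform can verify signatures without ever seeing the signing key.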
It just shows you can't get everything right first time, and that these feedback loops are vital.
4. Interactive backlog
We set one up to help us prioritise the APIs and features (including documentation) that we need to deliver our vision. Developers can see what's in our backlog, make suggestions and comments on features, and vote on their importance.
5. Regular surveys
In the first couple of years, we ran an annual survey of our registered developer (API consumer) population and the wider NHS Digital developer community, as an overall benchmark of how we're doing.
We asked them to score us on areas like ease of "learning" (including our documentation), "design and build", "testing", "onboarding" and "help and support". We saw our scores increase from averages of 2 to 3 out of 5 over the first year.
This year, we’re focussing on help and support. Rather than gathering this kind of information annually, we’ve introduced more regular ways of inviting feedback on our interactions with developers. This includes "exit surveys" as API consumers complete their software integration with our APIs.
We've also started exit surveys of our "API producers" - the internal teams that develop the APIs on our platform - once their API is live to see how easy we made things for them.
By including "ease of learning" in all our surveys, we get incredibly useful feedback on documentation that we take care to put into action.
At some point, with our API consumers' help, we'd like to survey their end users. This is trickier to pull off logistically, but we know from our experience at HMRC that it can yield valuable results.
So there you have it, 5 practical ways to iteratively raise the quality of your API documentation from poor to good, with the help of those developers with a vested interest in helping you improve. It's a win-win scenario.
Last edited: 4 May 2022 3:26 pm