How Would You Approach Exploratory API Testing?

Day 2 of MoT’s 30 Days of Testing asks the question “How would you approach exploratory API Testing?”.  From the perspective of someone who has never done what I’d consider ‘real’ exploratory testing, this is a tough question to answer.  I’m going to approach this as “You’ve been told to test an API you have no documentation for.  You work at the company creating the API.”

There are a couple of things I should tell you before getting into the exploratory testing:

  1. The company I work for is a software developer with many different departments. We’re all friendly, and learning more about something is as easy as going to someone’s desk and asking if they’ve got time to help you.
  2. We’re big into writing specs for things, and having test scripts based on those specs.

The second point typically means that exploratory testing is pushed to the sidelines.  We will do some loose exploratory testing as we follow the script – based on any quirks we notice in the implementation – but 95% of the time we’re executing test cases that were defined well in advance of any code being delivered.

With that out of the way, how would I go about exploratory testing an API with no information about it?

Find Out More

APIs are tricky to test exploratively by their very nature.  There’s no GUI to lead you along an exploratory path.  How can we make it easier on ourselves as people in direct contact with those creating the API?  Talk to them.  Even without a solid spec to work from, business analysts should have an idea of the problem the client wants to solve with the API.  The first thing I do is talk to the BAs and make notes during that conversation.  I translate those notes into a mind map later on.  This gets me one part of the puzzle: what we intended to make.

After speaking with the BAs, the next port of call is the developers who worked on implementing the API.  Armed with the knowledge of what needed to be made, I ask the developers how they implemented solutions to each of the problems they were presented with.  This will typically result in a list of methods/paths that allow access to those solutions via the API.  At this point, I ask them to provide sample requests, along with every parameter it’s possible to set in each request.  This is important because there are often optional parameters that developers may not think to include in an example request.  Optional parameters can change how a request is processed, and lead to failures or exceptions that you would not be able to find without knowing the parameter existed in the first place.
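To make that concrete, here’s a minimal sketch of the kind of thing I mean, written with RestAssured.  The endpoint and the includeHistory parameter are hypothetical; the point is simply that an optional parameter can send the same request down a completely different code path.

```java
import static io.restassured.RestAssured.given;

public class OptionalParamExploration {
    // Endpoint and parameter names below are hypothetical, purely for illustration.
    private static final String BASE_URI = "https://api.example.com";

    public static void main(String[] args) {
        // Without the optional parameter: the default behaviour.
        given()
            .baseUri(BASE_URI)
        .when()
            .get("/users/123/score")
        .then()
            .statusCode(200);

        // With the optional parameter: the server may take a different code
        // path, and can fail in ways the default request never would.
        given()
            .baseUri(BASE_URI)
            .queryParam("includeHistory", true)
        .when()
            .get("/users/123/score")
        .then()
            .statusCode(200);
    }
}
```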

Document Stuff

Having had those discussions, I go back to my desk and think about what I’ve learned.  My basic pre-testing discovery period is now complete.  I spend some time attempting to match up what was implemented with what was intended.  If I see any areas where I think the two have significantly diverged, I mark those in some way for testing as soon as I’m confident the very basics of the API are up and running.  This allows for any potential implementation problems to be found and talked about quickly.

I also make sure that what I’ve learned is available to my colleagues.  We keep test scripts in Excel, and maintain an internal Wiki where we write articles about what each project is, where its documentation lives, and any information likely to be helpful for someone picking up testing on the project for the first time.

Tools

My tool of choice for any kind of API testing is RestAssured.  I’m primarily an automated tester, so I’m more comfortable in an IDE than I am in Postman or SoapUI.  Any checks I noted earlier that seem like good long-term tests are written using RestAssured, where they can be re-run any time we need to do a release of the API.
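To give a flavour of what that looks like, here’s a minimal sketch of a RestAssured test.  The base URI, endpoint, payload, and assertions are all invented for illustration rather than taken from a real API.

```java
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.notNullValue;

import org.junit.jupiter.api.Test;

public class UserApiTest {

    // Hypothetical base URI and endpoint, for illustration only.
    private static final String BASE_URI = "https://api.example.com";

    @Test
    public void creatingAUserReturnsItsId() {
        given()
            .baseUri(BASE_URI)
            .contentType("application/json")
            .body("{\"name\": \"Test User\"}")
        .when()
            .post("/users")
        .then()
            .statusCode(201)                     // created
            .body("id", notNullValue())          // the server assigned an id
            .body("name", equalTo("Test User")); // the payload was stored as sent
    }
}
```

Because it’s just JUnit code, it sits in the same build as everything else and can be run as part of any release.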

Conclusion

The easiest way to deal with testing an API that has no specification is to ask people about it.  This may all be cheating when it comes to exploratory testing, I’m not sure.  I can only come at this from a perspective I’m familiar with, based on the culture where I work.  If we don’t know, we ask.

I’m looking forward to the other posts about this subject and hopefully learning about what people do when there’s nobody to ask.

What is API Testing?

To preface this, I’ve never really read a great deal about API testing.  My opinion here is based on a little bit of reading, and a lot of doing.

If you Google the definition of an API, you get the following:

a set of functions and procedures that allow the creation of applications which access the features or data of an operating system, application, or other service.

That is a little broad.  It encompasses so many different things that it would be tricky to address them in a post that doesn’t make everyone fall asleep.

So let’s narrow the scope of our definition here to what I think the MoT “30 Days of API Testing” event is focusing on: Web API Testing.

What is a Web API?

A web API is essentially a service provided by a company that wishes to give external third parties access to certain functions or data over the web.

Let me use a couple of real-world examples here.  Where I work, we provide a client-facing API that allows clients to make some basic calls over the Internet.  Those calls include:

  • Making a new user (alters the state on our system)
  • Requesting score data for a user (reads existing data on our system)

So a client typically makes a new user over the API.  That user then takes various actions that influence their score (outside of the API, via an app).  The client can then request to see the scores for the same user.
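Sketched out with RestAssured (using made-up paths and JSON fields, since the real ones aren’t something I can share), that flow looks roughly like this:

```java
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.notNullValue;

public class CreateThenReadFlow {
    // Paths and JSON fields are made up; they only illustrate the flow.
    // We assume the server returns the new user's id as a string.
    private static final String BASE_URI = "https://api.example.com";

    public static void main(String[] args) {
        // 1. The client creates a new user (this alters state on our system).
        String userId =
            given()
                .baseUri(BASE_URI)
                .contentType("application/json")
                .body("{\"name\": \"Test User\"}")
            .when()
                .post("/users")
            .then()
                .statusCode(201)
                .body("id", notNullValue())
            .extract()
                .path("id");

        // 2. Later, the client requests that user's score (reads existing data).
        given()
            .baseUri(BASE_URI)
        .when()
            .get("/users/" + userId + "/score")
        .then()
            .statusCode(200)
            .body("score", notNullValue());
    }
}
```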

Let’s go into a little more detail about what’s happening in each of those examples.  Web APIs are all about requests, responses, and changes. I won’t go into too much detail here, as there are much better resources available online for these things.  As a quick overview:

Requests

As in life, a request to an API is essentially saying “I would like you to do this for me”.  In plain English this could be as simple as “Hey API, how long have you been awake?”.

Responses

These are what an API sends back to you after a request.  There’s a wide range of status codes that can be returned, and the content will vary based on which code it is.  Continuing our example above, the response here would be a 200 status code with a body of “I’ve been awake for 7 days”.
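Playing along with that example, checking a request and its response might look something like this in RestAssured.  The /uptime endpoint and its body are, of course, made up.

```java
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.containsString;

public class UptimeCheck {
    public static void main(String[] args) {
        // A made-up endpoint, purely to mirror the example in the text.
        given()
            .baseUri("https://api.example.com")
        .when()
            .get("/uptime")                                        // the request
        .then()
            .statusCode(200)                                       // the response code
            .body(containsString("I've been awake for 7 days"));   // the response body
    }
}
```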

Changes

Sometimes the requests we make change the “state” on the API’s server/database.  Requests that create something, delete something, or update something are examples of this.  Requests that change the system often influence the responses to other requests, so they need to be tested in conjunction with those other calls.
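As a rough, hypothetical sketch of testing a state-changing call in conjunction with another call: delete a user, then check that a follow-up read reflects the change.  The paths and status codes here are assumptions for illustration.

```java
import static io.restassured.RestAssured.given;

public class StateChangeCheck {
    // A hypothetical example: a delete changes state, and a later read reflects it.
    private static final String BASE_URI = "https://api.example.com";

    public static void main(String[] args) {
        // State-changing request: remove the user.
        given()
            .baseUri(BASE_URI)
        .when()
            .delete("/users/123")
        .then()
            .statusCode(204);   // no content: the deletion succeeded

        // Follow-up read: the earlier change should influence this response too.
        given()
            .baseUri(BASE_URI)
        .when()
            .get("/users/123")
        .then()
            .statusCode(404);   // the deleted user is no longer there
    }
}
```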

What is Web API Testing?

With the above information in mind, let’s think about what testing for something like that would be like.

Testing an API includes doing the following things:

  • Making calls directly to it, in the same way that the client will be making those calls.
  • Checking that the response to a call is what is defined in the spec, including
    • The status code
    • The response body (if any)
    • Any data types are correct (e.g. float, integer, date-time)
  • Checking error codes are returned when appropriate
  • Checking that changes occur where appropriate
  • Checking that appropriate security is in place
  • Checking that paths/URLs are as defined
  • Checking combinations of calls to see if any state-changing ones break the others

A good spec document should contain information helping you to test all of the above, and make your API testing experience quite pleasant.
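Against a spec like that, a couple of those checks might look roughly like this in RestAssured.  The paths, fields, and expected values are invented purely for the sake of the sketch.

```java
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.instanceOf;

public class SpecChecks {
    // Paths, fields, and expected values here are invented for the sketch.
    private static final String BASE_URI = "https://api.example.com";

    public static void main(String[] args) {
        // Status code, response body, and data types against the (imagined) spec.
        given()
            .baseUri(BASE_URI)
        .when()
            .get("/users/123/score")
        .then()
            .statusCode(200)
            .body("userId", equalTo("123"))           // value as specified
            .body("score", instanceOf(Number.class)); // data type as specified

        // Error codes where appropriate, e.g. requesting a user that does not exist.
        given()
            .baseUri(BASE_URI)
        .when()
            .get("/users/does-not-exist/score")
        .then()
            .statusCode(404);
    }
}
```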