What is Prismic's API response time?

1. Queries to retrieve the API Master Ref

These /api or /api/v2 queries retrieve the API Master Ref, which your application needs in order to fetch the most up-to-date version of the API with all of your latest document updates. These requests should not be cached, so that the client application always obtains the latest API Ref before each query. These requests hit Prismic's AWS facilities in the US East (N. Virginia) region.
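As an illustration, here is a minimal sketch of how a client might extract the Master Ref from an /api/v2 payload. The sample payload below is a trimmed, hypothetical example of the JSON shape the endpoint returns; in practice you would fetch it fresh (uncached) before every query, e.g. with `requests.get("https://<your-repo>.prismic.io/api/v2")`.

```python
# Hypothetical, trimmed /api/v2 response payload:
sample_api_response = {
    "refs": [
        {"id": "master", "ref": "Yh8examplemasterref", "isMasterRef": True, "label": "Master"},
        {"id": "release-1", "ref": "Yh8examplereleaseref", "isMasterRef": False, "label": "My release"},
    ]
}

def master_ref(api_response: dict) -> str:
    """Return the ref flagged as the Master Ref in an /api/v2 payload."""
    for ref in api_response["refs"]:
        if ref.get("isMasterRef"):
            return ref["ref"]
    raise ValueError("No master ref found in API response")

print(master_ref(sample_api_response))
```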

Observed response times for these queries typically range from 100ms to 350ms, depending on network connectivity and the location of the server performing the query.

2. Search queries

Search queries are queries that include search predicates, i.e. requests whose path starts with /api/v2/documents/search?.
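For example, a search query URL combines the ref with a predicate as query-string parameters. This is a sketch; the repository name, ref, and predicate below are hypothetical placeholders:

```python
from urllib.parse import urlencode

def build_search_url(repo: str, ref: str, predicate: str) -> str:
    """Build a /api/v2/documents/search? URL for a given ref and predicate."""
    params = urlencode({"ref": ref, "q": predicate})
    return f"https://{repo}.prismic.io/api/v2/documents/search?{params}"

# Hypothetical repository, ref, and predicate:
url = build_search_url("my-repo", "Yh8examplemasterref", '[[at(document.type,"page")]]')
print(url)
```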

Search queries are cached by Prismic's CDN (AWS CloudFront). The first time a search query is performed, it must travel to the US East (N. Virginia) AWS facilities, after which the response is cached by the CDN. The observed response time for a non-cached search query (the first time it is performed) ranges from 100ms to 350ms.

Once a search query is cached by the CDN, the response is served from CloudFront's nearest edge location. The average response time observed for cached queries is around 25ms, though this can still vary depending on the nature of the search request and the client's network connectivity.

Important note: the response time displayed on the Prismic status page is an average response time for non-cached queries performed from different locations around the world.

To get a more accurate answer to the response time question, we recommend that you perform your own tests on your end.
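As a starting point for such tests, a simple timing harness like the one below reports latency statistics over several runs, which is more informative than a single average. This is a sketch: the `send_request` callable is a stand-in for an actual HTTP call to your repository (here it just sleeps ~20ms so there is something to measure).

```python
import time
import statistics

def time_request(send_request, runs: int = 10) -> dict:
    """Time `send_request` (a zero-argument callable performing one query)
    over several runs and report latency statistics in milliseconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        send_request()
        samples.append((time.perf_counter() - start) * 1000.0)
    return {
        "min_ms": min(samples),
        "median_ms": statistics.median(samples),
        "max_ms": max(samples),
    }

# Stand-in for a real query (e.g. requests.get on your search URL):
stats = time_request(lambda: time.sleep(0.02), runs=5)
print(stats)
```

Running the same search query twice also lets you compare the first (uncached) response time with subsequent CDN-cached ones.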

Let us know if you'd like more guidance on how to perform such tests so that they are representative of your production use. We hope to extend this guide to help you get a sense of the actual response times you'll observe when using Prismic's API.
