How to display more than 100 documents with graphql?

Hi, in my project I have more than 100 documents per custom type to display, but Prismic's GraphQL API only returns a maximum of 100 documents and the rest cannot be fetched.

I found this documentation explaining how to paginate the results: https://prismic.io/docs/technologies/paginate-your-results-graphql, but that approach requires duplicating the query and supplying a cursor id manually, which is not very flexible for a big project...

I currently use the old gatsby-source-prismic-graphql plugin, but I haven't seen anything about this in the new gatsby-source-prismic plugin either.

Is there a way to automate the pagination, or to loop within a GraphQL query? Is this possible with the new gatsby-source-prismic plugin?

Thank you for your help!

Hello @baptiste, thanks for reaching out.

With the gatsby-source-prismic-graphql plugin you can only return 20 documents per request, because the plugin is based on Prismic's GraphQL API. In that case, the documentation you need to read is the following:

The gatsby-source-prismic plugin, on the other hand, fetches the data using the REST API, which does not have the limitation that the GraphQL API has. Here is the documentation for creating pagination with Gatsby and this plugin:

If you need it, we have a dedicated guide for migrating from one plugin to the other.


Hello, I'm seeing a similar issue to the original question (I'm not using Gatsby, but the question is actually related to the GraphQL API).

Obviously, switching to the REST API fixed the issue for @baptiste, but the question 'How to display more than 100 documents with GraphQL?' is still unresolved.

When running a GraphQL query for multiple documents that should return more than 100 results, the totalCount field is always capped at 100.

I'm aware of the max 20 documents limit and how to use cursors and pagination to retrieve all the results. But the problem is that there is this mystery limit of 100 that isn't documented anywhere. The closest mention I can find is for @prismicio/client, which says the 'default and maximum page size is 100 documents' for standard get methods like getByIDs, but that get-all methods like getAllByIDs can be used to avoid this limit.
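For context, the "get-all" pattern those client docs describe is just a loop over pages until none remain. Here is a minimal, hypothetical sketch of that pattern (the `getAll` and `fetchPage` names are mine, not the client's; the `results`/`total_pages` fields mirror the REST API's paged response shape):

```javascript
// Hypothetical sketch of the "get-all" pattern: a single get is capped
// at one page (max 100 docs), while a get-all helper keeps requesting
// pages until it has them all. `fetchPage(page)` stands in for the real
// paged REST call and must resolve to { results, total_pages }.
async function getAll(fetchPage) {
  const results = [];
  let page = 1;
  let totalPages = 1;

  while (page <= totalPages) {
    // Each response carries its documents plus the total page count.
    const response = await fetchPage(page);
    results.push(...response.results);
    totalPages = response.total_pages;
    page += 1;
  }

  return results;
}
```

The real client's getAllByIDs handles this loop internally; the sketch is only to show why it is not subject to the 100-document page cap.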

I'm using id_in to retrieve all matching documents of a custom type. Is there a method similar to getAllByIDs available for the GraphQL API, or a way to increase the limit above 100?

Our repo is metier.prismic.io and this is a reduced test case query:

query ProductCard($ids: [String!]) { 
  allProducts(id_in: $ids) {
    totalCount
    pageInfo {
      hasNextPage
      hasPreviousPage
      startCursor
      endCursor
    }
    edges {
      node {
        title
      }
    }
  }
}

When this array of 157 valid document ids is passed in, it always returns a totalCount of exactly 100.

{
  "ids": ["X6CGThAAACAApYU9","YT--lhEAACAAoT_h","X6CIdBAAACAApY7l","X6CI5hAAAB4ApZDw","X6CIBhAAACIApYzy","X6CGsxAAACAApYcA","X6CC7xAAACIApXYp","YT_A-BEAACAAoUqj","X6CENRAAACAApXvV","X6CD6xAAAB4ApXqN","YbsNYBIAACIAHhST","YZzFPBAAACEAkrSJ","X6BOoxAAACEApIlT","YFxrThIAACIAhXc4","YKpe3hEAACIAiwND","YZrUiBAAACEAigNB","YFxodxIAACIAhWpV","YZzGmBAAACEAkrrS","YFxjHBIAACIAhVJR","YFxnexIAACEAhWXU","X6CiLxAAACAApgNR","X6CiyhAAACAApgYb","X6Ci-RAAACIApgby","X6CiYRAAACEApgQz","X6Ch_hAAACIApgJ1","X6ChrBAAACIApgD_","X6ChMhAAACEApf7X","YZzEgxAAACQAkrEz","X6CMKxAAACIApZ-3","X6CMrxAAACIApaIX","YbsO1BIAACAAHhcW","X37wsRUAACgAkfc3","X6CNKRAAACAApaQ9","YT_CvREAAERtoVKG","X37wfxUAACkAkfZD","X6CQbhAAACAApbKD","X37tcxUAACkAkegn","X37vSxUAACkAkfCy","YGSHkhIAACIAp9q8","YZ52NhAAACQAqRwU","X6CRHRAAACAApbWo","X6CRXBAAACAApbbM","X6CSWxAAACIApbtM","X6CSrhAAACAApbzK","X6CS9hAAACEApb4R","YZ5-FhAAACEAqUA0","X6CV3RAAACAApcso","X6CWIxAAACEApcxn","X6CWyBAAACIApc9b","YAYfMxAAACUAQ0h2","X6CkpRAAACEApg6D","X6CjyRAAACEApgqa","X6Ck2hAAACAApg9x","YF3nLBIAACEAi_3w","X6ClPxAAAB4AphEB","YOTtwxIAAGkettny","YT_HgREAACAAoWe3","YOTouRIAACIAtsOV","X6FJDBAAAB4AqPsh","X0_MhRAAAGYIRNMN","X6Cc0hAAACAApese","X6CbahAAACAApeSm","X6CbJxAAACEApeNp","X6CfJxAAAB4ApfWN","X6CfdxAAACEApfb0","X19rixEAACMAd86Y","X6Ce3BAAACAApfRD","X9C-8xYAACoA2pzZ","X9DEmxYAACoA2raF","YZ2JXhAAACIAlgDI","YZ4fwxAAACIAmJ9F","YZ4gtRAAACIAmKOV","YZ4iZhAAACQAmKsm","YG9WJBMAACIAi2k5","YJ8TIBAAACEA_h9G","YG9VNRMAACAAi2Ts","YG9UWxMAACIAi2D7","YG9XKxMAACMAi24T","YG9WtBMAACAAi2vd","X6HsjhAAACEAq-QK","X6HrrxAAAB4Aq-Am","X6HsPRAAACAAq-Ki","X6HquRAAACAAq9vO","YG9q8BMAACEAi8ig","YG9psRMAACMAi8KK","YG9qPxMAACMAi8V6","YG9o6xMAACIAi75r","YG9nWBMAACMAi7dL","YG9oaxMAACIAi7wg","YG9J-hMAACAAizEi","X6KnJxAAACIAry0i","X6KmyxAAACEAryt4","YYB0GhEAAB8ARGjI","X6PnfRMAACIA4PY9","X6PnNRMAACMA4PT0","YZ4MPRAAACMAmElD","YZ4PTRAAACIAmFTR","X6PpcRMAACQA4P9l","YZ4IVxAAACEAmDc-","X6PqcBMAACQA4QQb","X6Pp8RMAACMA4QHD","X6PqNRMAACIA4QMF","YZ4y-xAAACIAmPdI","YZ4zaRAAACIAmPlL","X6PoNxMAACMA4Pmr","X6Pn_RMAACEA4PiX","YZvYkxAAACMAjpD2","YZvZOhAAACIAjpP6","X6HnkRAAAB4Aq811","X6HichAAACIAq7Y_","X6HjVhAAACEAq7pJ","X6HfmRAAACEAq6m-","X6Hf2xAAAB4Aq6ro","X6HjphAAACIAq7u2","X6HmYRAAACIAq8gk","X6HmKRAAACAAq8ce","YVZLVxEAACIAZ6BP","X6Hb_BAAACIAq5l9","YZrlNRAAACEAikzI","YZrglRAAACMAijgs","YZrd5BAAACEAiiw3","YZrfuRAAACMAijRe","YZrhJBAAACQAijqp","YN2lkhAAACIALJ4x","YN2m3RAAACEALKGt","YWA1rhAAACUAie8r","YWA2aRAAACMAifJw","YWA1CxAAACMAiexV","YWA0ZRAAACUAiel3","YWAzexAAACUAieVY","YWAyrRAAACQAieHj","X9Va7BYAACsA7sCN","X6KbQhAAACIArveG","YNx_MhIAACEAya7G","X6KezhAAACEArwgO","X6KfpxAAAB4Arwv4","X6KZ_BAAAB4ArvFl","YNuNXRIAAI8SxYW-","YN2bCBAAACIALHg7","YN2dQxAAACAALIIH","X6Kd0xAAAB4ArwOF","X6KemxAAAB4Arwce","X6KfWxAAACIArwqd","X6KNYxAAACEArrUk","X9VgFhYAACoA7tdn","X9VfUBYAACoA7tP3","X_GxnxAAAMJH6tWg","X_GwaxAAACUA6tA9","YP7_8xAAACMApIeC","YN21exAAACAALNPo","X_GtsBAAACUA6sQZ","YZ2BZBAAACQAldzy","X6KozhAAACAArzTG","X6KpmxAAACIArzh6","X6KpBRAAACIArzW_","X6KpOBAAACAArzan","X6KpbBAAAB4ArzeV"]
}

As @baptiste mentioned, we could run multiple queries 100 items at a time using a manual cursor id but that seems inflexible for a big project.

Thanks in advance


Hey there!

I had a chat with the development team. It's now possible to retrieve up to 100 documents per request. We've just updated the documentation to reflect this change.

Here's how the same query would look:

query ProductCard($ids: [String!]) { 
  allProducts(id_in: $ids, first: 100) {
    totalCount
    pageInfo {
      hasNextPage
      hasPreviousPage
      startCursor
      endCursor
    }
    edges {
      node {
        title
      }
    }
  }
}

Thanks, @Pau, it's very useful to know that we can now retrieve up to 100 documents per request.

I'm still confused about the limit of 100 being applied to the overall query, though. This isn't to do with pagination limits but an invisible limit being applied to the overall number of documents matched by the query.

The array that I'm passing to the query includes 157 unique and valid document ids, but you can see that the totalCount being returned is 100.

If you delete some ids from the beginning of the array and rerun the query, it still returns a totalCount of 100.

Unfortunately, this means that I can't paginate to get the rest, as hasNextPage is false: the response "thinks" it has returned all the results when I know there are still more that should be returned.

If you run the query without the id_in filter to retrieve all documents, it returns a totalCount of 471, so I know that in theory it should be possible to match over 100 documents in a query. It's only when using id_in to filter the results that this invisible limit kicks in.

I'll raise this question with the relevant team. We'll get back to you as soon as we have an answer.


OK so, the solution is pagination, as explained in this section of the same doc. Use the after argument to get the documents that come after the first batch:
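In code, that cursor pagination is a loop: request a page, read pageInfo.endCursor, and pass it as the after argument on the next request until hasNextPage is false. A minimal sketch, where `fetchPage` is a hypothetical stand-in for whatever function actually sends the GraphQL query:

```javascript
// Generic cursor-pagination loop over a GraphQL connection.
// `fetchPage(after)` is a hypothetical stand-in for the real request;
// it must resolve to a connection shaped like the allProducts result:
// { edges: [{ node }], pageInfo: { hasNextPage, endCursor } }.
async function fetchAllPages(fetchPage) {
  const nodes = [];
  let after = null; // no cursor on the first request

  for (;;) {
    const connection = await fetchPage(after);
    nodes.push(...connection.edges.map((edge) => edge.node));

    if (!connection.pageInfo.hasNextPage) break;
    // Use the last cursor of this page as the `after` of the next one.
    after = connection.pageInfo.endCursor;
  }

  return nodes;
}
```

The loop's stopping condition is exactly why the bug discussed below matters: if hasNextPage is wrongly false after 100 documents, this kind of loop stops early.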

The team created a sample project to exemplify pagination. You can check it out here:

Hi Pau,

I've already got pagination set up and working; that's not the issue here.

There's a limit somewhere on Prismic's side that's causing the query to return incomplete results. It's not in the documentation and it breaks the pagination example you shared, so I can only assume it's a bug.

Pagination in Prismic uses hasNextPage to decide whether there are still more documents to fetch. This code snippet is taken from AllPostsLoader.js in the react-graphql-pagination-example project.

But after retrieving the first 100 documents from my query, hasNextPage is false, meaning the script won't trigger another query.

I've tried querying the first 99 results, taking the endCursor from that result and passing it in as the after variable for the next query, and it only returns 1 document. So totalCount returns 100 and hasNextPage is false after retrieving 100 documents.

It's clear that only 100 documents are being returned and I can't retrieve any more using pagination.

Please could you test using the info I've provided for our repo (metier.prismic.io)? It's easy to spot it happening in the GraphQL explorer with these steps:

  1. Run the query and you'll see totalCount is 100 and hasNextPage is false.
  2. Delete several ids from the variables array and rerun the query: you'll see totalCount is still 100 and hasNextPage is false.

So that means valid document ids after the first 100 are being ignored. The first query should have returned totalCount = 157 and hasNextPage = true, meaning I could use pagination to retrieve the remaining documents.

Either there's a limit on the size of the variables array that can be submitted, or there's a limit being imposed on the documents that are returned.

query ProductCard($ids: [String!]) { 
  allProducts(id_in: $ids, first: 100) {
    totalCount
    pageInfo {
      hasNextPage
      hasPreviousPage
      startCursor
      endCursor
    }
    edges {
      node {
        title
      }
    }
  }
}
{
  "ids": ["X6CGThAAACAApYU9","YT--lhEAACAAoT_h","X6CIdBAAACAApY7l","X6CI5hAAAB4ApZDw","X6CIBhAAACIApYzy","X6CGsxAAACAApYcA","X6CC7xAAACIApXYp","YT_A-BEAACAAoUqj","X6CENRAAACAApXvV","X6CD6xAAAB4ApXqN","YbsNYBIAACIAHhST","YZzFPBAAACEAkrSJ","X6BOoxAAACEApIlT","YFxrThIAACIAhXc4","YKpe3hEAACIAiwND","YZrUiBAAACEAigNB","YFxodxIAACIAhWpV","YZzGmBAAACEAkrrS","YFxjHBIAACIAhVJR","YFxnexIAACEAhWXU","X6CiLxAAACAApgNR","X6CiyhAAACAApgYb","X6Ci-RAAACIApgby","X6CiYRAAACEApgQz","X6Ch_hAAACIApgJ1","X6ChrBAAACIApgD_","X6ChMhAAACEApf7X","YZzEgxAAACQAkrEz","X6CMKxAAACIApZ-3","X6CMrxAAACIApaIX","YbsO1BIAACAAHhcW","X37wsRUAACgAkfc3","X6CNKRAAACAApaQ9","YT_CvREAAERtoVKG","X37wfxUAACkAkfZD","X6CQbhAAACAApbKD","X37tcxUAACkAkegn","X37vSxUAACkAkfCy","YGSHkhIAACIAp9q8","YZ52NhAAACQAqRwU","X6CRHRAAACAApbWo","X6CRXBAAACAApbbM","X6CSWxAAACIApbtM","X6CSrhAAACAApbzK","X6CS9hAAACEApb4R","YZ5-FhAAACEAqUA0","X6CV3RAAACAApcso","X6CWIxAAACEApcxn","X6CWyBAAACIApc9b","YAYfMxAAACUAQ0h2","X6CkpRAAACEApg6D","X6CjyRAAACEApgqa","X6Ck2hAAACAApg9x","YF3nLBIAACEAi_3w","X6ClPxAAAB4AphEB","YOTtwxIAAGkettny","YT_HgREAACAAoWe3","YOTouRIAACIAtsOV","X6FJDBAAAB4AqPsh","X0_MhRAAAGYIRNMN","X6Cc0hAAACAApese","X6CbahAAACAApeSm","X6CbJxAAACEApeNp","X6CfJxAAAB4ApfWN","X6CfdxAAACEApfb0","X19rixEAACMAd86Y","X6Ce3BAAACAApfRD","X9C-8xYAACoA2pzZ","X9DEmxYAACoA2raF","YZ2JXhAAACIAlgDI","YZ4fwxAAACIAmJ9F","YZ4gtRAAACIAmKOV","YZ4iZhAAACQAmKsm","YG9WJBMAACIAi2k5","YJ8TIBAAACEA_h9G","YG9VNRMAACAAi2Ts","YG9UWxMAACIAi2D7","YG9XKxMAACMAi24T","YG9WtBMAACAAi2vd","X6HsjhAAACEAq-QK","X6HrrxAAAB4Aq-Am","X6HsPRAAACAAq-Ki","X6HquRAAACAAq9vO","YG9q8BMAACEAi8ig","YG9psRMAACMAi8KK","YG9qPxMAACMAi8V6","YG9o6xMAACIAi75r","YG9nWBMAACMAi7dL","YG9oaxMAACIAi7wg","YG9J-hMAACAAizEi","X6KnJxAAACIAry0i","X6KmyxAAACEAryt4","YYB0GhEAAB8ARGjI","X6PnfRMAACIA4PY9","X6PnNRMAACMA4PT0","YZ4MPRAAACMAmElD","YZ4PTRAAACIAmFTR","X6PpcRMAACQA4P9l","YZ4IVxAAACEAmDc-","X6PqcBMAACQA4QQb","X6Pp8RMAACMA4QHD","X6PqNRMAACIA4QMF","YZ4y-xAAACIAmPdI","YZ4zaRAAACIAmPlL","X6PoNxMAACMA4Pmr","X6Pn_RMAACEA4PiX","YZvYkxAAACMAjpD2","YZvZOhAAACIAjpP6","X6HnkRAAAB4Aq811","X6HichAAACIAq7Y_","X6HjVhAAACEAq7pJ","X6HfmRAAACEAq6m-","X6Hf2xAAAB4Aq6ro","X6HjphAAACIAq7u2","X6HmYRAAACIAq8gk","X6HmKRAAACAAq8ce","YVZLVxEAACIAZ6BP","X6Hb_BAAACIAq5l9","YZrlNRAAACEAikzI","YZrglRAAACMAijgs","YZrd5BAAACEAiiw3","YZrfuRAAACMAijRe","YZrhJBAAACQAijqp","YN2lkhAAACIALJ4x","YN2m3RAAACEALKGt","YWA1rhAAACUAie8r","YWA2aRAAACMAifJw","YWA1CxAAACMAiexV","YWA0ZRAAACUAiel3","YWAzexAAACUAieVY","YWAyrRAAACQAieHj","X9Va7BYAACsA7sCN","X6KbQhAAACIArveG","YNx_MhIAACEAya7G","X6KezhAAACEArwgO","X6KfpxAAAB4Arwv4","X6KZ_BAAAB4ArvFl","YNuNXRIAAI8SxYW-","YN2bCBAAACIALHg7","YN2dQxAAACAALIIH","X6Kd0xAAAB4ArwOF","X6KemxAAAB4Arwce","X6KfWxAAACIArwqd","X6KNYxAAACEArrUk","X9VgFhYAACoA7tdn","X9VfUBYAACoA7tP3","X_GxnxAAAMJH6tWg","X_GwaxAAACUA6tA9","YP7_8xAAACMApIeC","YN21exAAACAALNPo","X_GtsBAAACUA6sQZ","YZ2BZBAAACQAldzy","X6KozhAAACAArzTG","X6KpmxAAACIArzh6","X6KpBRAAACIArzW_","X6KpOBAAACAArzan","X6KpbBAAAB4ArzeV"]
}

Hello @pete1, we're reviewing your use case with the dev team. We'll get back to you shortly.


Hello @pete1, we've sent you a DM asking for your project details. Let us know if you got it.

OK so, in your query you'll need to pass an ID to the after argument; that tells the query where to place the cursor, so it knows where to look for the next pages:

query ProductCard{
  allProducts(after: "X6HsjhAAACEAq-QK" , first: 100) {
    totalCount
    pageInfo {
      hasNextPage
      hasPreviousPage
      startCursor
      endCursor
    }
    edges {
      node {
        title
        _meta {
          id
        }
      }
    }
  }
}

I've explained several times that I know how to use pagination and already have it set up correctly, but every answer so far has just been telling me how to use pagination rather than addressing my point!

The example you shared does indeed work using after. But it's querying allProducts without the id_in filter, so it returns every single product, which is not what I need.

I explained previously that I've tried this, and it works as expected when id_in is not used.

Please try the following in our repo's GraphiQL editor, using the info I've already shared, to see what I mean:

  1. Run the initial query with no after and copy the endCursor. Note that totalCount is 100 and hasNextPage is false.

  2. Run the query again with the endCursor passed in as the after variable. You'll see there are no results, even though more than 100 valid ids were passed in to the query.

@pete1 I got some more info that may be helpful.
I was able to replicate the error with the id_in operator: it's true that the cursors don't work along with it, so pagination won't work. What you can do instead is paginate through your ids in batches of 100. Here's an example of how to do it:

// A function that crawls the GraphQL endpoint by batching ids in id_in

const queryAll = async (ids) => {
  // Copy the ids array so we don't mutate the caller's
  const bucketOfIds = [...ids]; // Length 157 in your case

  const results = [];

  // While there are ids left in the bucket
  while (bucketOfIds.length) {

    // Take the first 100 of them out of the bucket
    const batchOfIds = bucketOfIds.splice(0, 100);

    // Query the batch from GraphQL (passing the ids as the id_in
    // variables) and add the documents to our results array
    results.push(...(await runGraphQLQuery(batchOfIds)));
  }

  // Return everything we got from GraphQL
  return results;
};
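For completeness, runGraphQLQuery is left undefined above. Here is a hypothetical sketch of what it might look like against the repo's GraphQL endpoint; the endpoint URL, query shape, and injectable fetchImpl parameter are my assumptions for illustration, and a real Prismic GraphQL request must also send the current repository ref, which is omitted here:

```javascript
// Hypothetical sketch of the runGraphQLQuery helper used above.
// Prismic's GraphQL API is queried over GET with the query and variables
// in the URL. A real request must also carry the current repository ref;
// that part is omitted here for brevity.
const ENDPOINT = 'https://metier.prismic.io/graphql';

const QUERY = `query ProductCard($ids: [String!]) {
  allProducts(id_in: $ids, first: 100) {
    edges { node { title } }
  }
}`;

// `fetchImpl` is injectable so the helper can be exercised without a network.
const runGraphQLQuery = async (ids, fetchImpl = fetch) => {
  const url =
    ENDPOINT +
    '?query=' + encodeURIComponent(QUERY) +
    '&variables=' + encodeURIComponent(JSON.stringify({ ids }));

  const response = await fetchImpl(url);
  const { data } = await response.json();

  // Unwrap the connection into plain document nodes.
  return data.allProducts.edges.map((edge) => edge.node);
};
```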

Let us know if this is helpful.
I reported the issue to our product team.


Great, thanks for confirming and for suggesting a way around the limit. I'll try splitting the ids into batches as suggested.
