Fetch more than 20 documents with one query in GraphQL

Hello, I'm building a filter for all of the Course documents I've created. For this I need to fetch all of the documents up front so there's no loading time when clicking through the different categories. I use the following code to fetch my Course document info.

async function fetchAPI(query, { previewData, variables } = {}) {
  const prismicAPI = await PrismicClient.getApi()
  const res = await fetch(
    `${GRAPHQL_API_URL}?query=${query}&variables=${JSON.stringify(variables)}`,
    {
      headers: {
        'Prismic-Ref': previewData?.ref || prismicAPI.masterRef.ref,
        'Content-Type': 'application/json',
        'Accept-Language': API_LOCALE,
        Authorization: `Token ${API_TOKEN}`,
      },
    }
  )

  if (res.status !== 200) {
    console.log(await res.text())
    throw new Error('Failed to fetch API')
  }

  const json = await res.json()
  if (json.errors) {
    console.error(json.errors)
    throw new Error('Failed to fetch API')
  }
  return json.data
}

export async function getAllCourses() {
  const data = await fetchAPI(`
    {
      allCursos {
        edges {
          node {
            _meta {
              uid
            }
            title
            description
            grade
            level
          }
        }
      }
    }
  `)
  return data?.allCursos?.edges
}

The problem is that this query returns only 20 documents, and I need to get all of them. I saw this in the documentation about pagination, but I don't understand how that would look with my current implementation.

I also saw this code example, but I'm having trouble wrapping my head around it.

Here's a link to my GitHub repo. I would appreciate any help with this. Thank you! :slightly_smiling_face:

Hi @jprzpam,

You found the correct example, but I understand that it's difficult to wrap your head around, since it contains a few different examples. You're looking for a way to recursively fetch your posts: you fetch 20 posts, and if there are more posts left in the repo, you fetch another 20, and so on, until there are none left. Here's the function responsible for that behavior in the example project, broken down with comments:

export const recursivelyFetchAllPosts = async (currentCursor = null) => {
  // The cursor keeps track of where you are in the repo. This queries
  // posts, starting at the beginning (no cursor).
  const response = await queryPosts(currentCursor);

  // These are the posts returned by the current query (up to 20)
  const currentPosts = response.data.allPosts.edges.map((edge) => edge.node);

  // If you're at the end of the query, return the posts you have so far
  if (!response.data.allPosts.pageInfo.hasNextPage) {
    return currentPosts;
  }

  // If you're not at the end of the query, set the cursor to the new position
  const newCursor = response.data.allPosts.pageInfo.endCursor;

  // Run the query again
  const newPosts = await recursivelyFetchAllPosts(newCursor);

  // Add all of the new posts (fetched recursively) to the posts you have so far
  return [...currentPosts, ...newPosts];
};
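
To fetch everything, you just call it once with no arguments, and it keeps going until hasNextPage is false:

const allPosts = await recursivelyFetchAllPosts();
console.log(`Fetched ${allPosts.length} posts in total`);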

You will also need to include the query function, which is defined in the same example project.
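
The key part of that query function is the shape of the GraphQL query it sends: it takes the cursor and a page size as variables, and asks for pageInfo so you know whether another page exists. As a rough sketch (just to show the shape, not the exact query from the example), it looks something like this:

const blogPostsQuery = `
  query blogPosts($currentCursor: String, $itemsPerPage: Int) {
    allPosts(after: $currentCursor, first: $itemsPerPage) {
      # pageInfo tells you whether there are more documents after this page
      pageInfo {
        hasNextPage
        endCursor
      }
      edges {
        node {
          _meta {
            uid
          }
          title
        }
      }
    }
  }
`;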

Let me know if this helps, or if you have more questions.

Best,
Sam


Thank you for the answer. I did see the recursivelyFetchAllPosts function and kind of understood what was going on there. I tried implementing it that way, but I got confused by the definition of the query function.

export const queryPosts = async (currentCursor = null, itemsPerPage = maxItemsPerPage) => {

  // Not sure what is going on here.
  return client.query({
    query: gql`
      ${blogPostsQuery}
      ${blogPostsFragment}
    `,
    variables: { currentCursor, itemsPerPage },
  });
};

I see that it returns the result of client.query, where client is defined like this:

export const client = new ApolloClient({
  link: PrismicLink({ uri: apiEndpoint }),
  cache: new InMemoryCache({ fragmentMatcher }),
});

but I do not use ApolloClient, so I got lost here. I also did not understand what fragmentMatcher or blogPostsFragment are. My query function looks like this, and I'm not sure how to make it take variables.

export async function getAllCourses() {
  const data = await fetchAPI(`
    {
      allCursos {
        edges {
          node {
            _meta {
              uid
            }
            title
            description
            grade
            level
          }
        }
      }
    }
  `)
  return data?.allCursos?.edges
}

Do I need to add ApolloClient to my project, or how would this look with my current implementation? I'm sorry, but I'm truly lost and a bit frustrated by this :sweat_smile:

Thank you and I appreciate your help!

Hi again @samlittlefair, currently I do my getAllCourses query like this:

export const PrismicClient = Prismic.client(REF_API_URL, {
  accessToken: API_TOKEN,
})

async function fetchAPI(query, { previewData, variables } = {}) {
  const prismicAPI = await PrismicClient.getApi()
  const res = await fetch(
    `${GRAPHQL_API_URL}?query=${query}&variables=${JSON.stringify(variables)}`,
    {
      headers: {
        'Prismic-Ref': previewData?.ref || prismicAPI.masterRef.ref,
        'Content-Type': 'application/json',
        'Accept-Language': API_LOCALE,
        Authorization: `Token ${API_TOKEN}`,
      },
    }
  )

  if (res.status !== 200) {
    console.log(await res.text())
    throw new Error('Failed to fetch API')
  }

  const json = await res.json()
  if (json.errors) {
    console.error(json.errors)
    throw new Error('Failed to fetch API')
  }
  return json.data
}

export async function getAllCourses() {
  const data = await fetchAPI(`
    {
      allCursos {
        edges {
          node {
            _meta {
              uid
            }
            title
            description
            grade
            level
          }
        }
      }
    }
  `)
  return data?.allCursos?.edges
}

I'm truly lost and confused about how to get more than 20 documents; it seems like it should not be this hard :sweat_smile:

Hey @jprzpam,

It's hard for me to give more suggestions without knowing more about your code. Could you share your codebase as either a ZIP file or GitHub repo? You can send it in a DM if you like.
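
In the meantime, here's a very rough, untested sketch of how the same recursive approach could look with the fetchAPI function you posted, without ApolloClient. I'm assuming that allCursos accepts the standard after and first arguments and exposes pageInfo the same way allPosts does in the example, so treat this as a starting point rather than working code:

export async function getAllCourses(cursor = null) {
  // Assumes allCursos supports `after`/`first` and exposes pageInfo,
  // like allPosts in the example project
  const data = await fetchAPI(
    `
      query courses($cursor: String) {
        allCursos(after: $cursor, first: 20) {
          pageInfo {
            hasNextPage
            endCursor
          }
          edges {
            node {
              _meta {
                uid
              }
              title
              description
              grade
              level
            }
          }
        }
      }
    `,
    { variables: { cursor } }
  )

  const currentCourses = data.allCursos.edges

  // If this is the last page, stop here
  if (!data.allCursos.pageInfo.hasNextPage) {
    return currentCourses
  }

  // Otherwise fetch the next page, starting after the last document returned
  const nextCourses = await getAllCourses(data.allCursos.pageInfo.endCursor)

  return [...currentCourses, ...nextCourses]
}

Depending on how your endpoint handles the raw URL, you may also need to wrap the query and JSON.stringify(variables) in encodeURIComponent inside fetchAPI so that the variables survive being passed as URL parameters.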

Sam