feat(graphql): add paginated errors collection query (#36149)
* feat(graphql): add paginated errors collection query

  - Add new GraphQL query field 'errors' with cursor-based pagination
  - Add UUID id column to content.error table for cursor pagination
  - Implement error collection resolver with forward/backward pagination
  - Add comprehensive test suite for pagination functionality
  - Update database types and schema to support new error collection
  - Add utility functions for handling collection queries and errors
  - Add seed data for testing pagination scenarios

  This change allows clients to efficiently paginate through error codes using
  cursor-based pagination, supporting both forward and backward traversal. The
  implementation follows the Relay connection specification and includes proper
  error handling and type safety.

* docs(graphql): add comprehensive GraphQL architecture documentation

  Add detailed documentation for the docs GraphQL endpoint architecture, including:
  - Modular query pattern and folder structure
  - Step-by-step guide for creating new top-level queries
  - Best practices for error handling, field optimization, and testing
  - Code examples for schemas, models, resolvers, and tests

* feat(graphql): add service filtering to errors collection query

  Enable filtering error codes by Supabase service in the GraphQL errors collection:
  - Add optional service argument to errors query resolver
  - Update error model to support service-based filtering in database queries
  - Maintain pagination compatibility with service filtering
  - Add comprehensive tests for service filtering with and without pagination

* feat(graphql): add service filtering and fix cursor encoding for errors collection

  - Add service parameter to errors GraphQL query for filtering by Supabase service
  - Implement base64 encoding/decoding for pagination cursors in error resolver
  - Fix test cursor encoding to match resolver implementation
  - Update GraphQL schema snapshot to reflect new service filter field

* docs(graphql): fix codegen instruction
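The base64 cursor scheme mentioned in the message can be sketched as follows. This is an illustrative round-trip only; the helper names are hypothetical, not the actual resolver code:

```typescript
// Illustrative sketch of base64 cursor encoding over a UUID id column.
// encodeCursor/decodeCursor are hypothetical names for this sketch.
function encodeCursor(id: string): string {
  return Buffer.from(id, 'utf8').toString('base64')
}

function decodeCursor(cursor: string): string {
  return Buffer.from(cursor, 'base64').toString('utf8')
}

const id = '00000000-0000-0000-0000-000000000001'
const cursor = encodeCursor(id)
console.log(decodeCursor(cursor) === id) // prints "true": cursors round-trip to ids
```

Encoding the raw id keeps cursors opaque to clients while remaining trivially comparable server-side after decoding.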
.cursor/rules/docs-graphql.mdc (new file, 267 lines)
@@ -0,0 +1,267 @@
---
description: Docs GraphQL Architecture
globs: apps/docs/resources/**/*.ts
alwaysApply: false
---

# Docs GraphQL Architecture

## Overview

The `/apps/docs/resources` folder contains the architecture for the docs GraphQL endpoint at `/api/graphql`. It follows a modular pattern where each top-level query is organized into its own folder with a consistent file structure.

## Architecture Pattern

Each GraphQL query follows this structure:

```
resources/
├── queryObject/
│   ├── queryObjectModel.ts      # Data models and business logic
│   ├── queryObjectSchema.ts     # GraphQL type definitions
│   ├── queryObjectResolver.ts   # Query resolver and arguments
│   ├── queryObjectTypes.ts      # TypeScript interfaces (optional)
│   └── queryObjectSync.ts       # Functions for syncing repo content to the database (optional)
├── utils/
│   ├── connections.ts           # GraphQL connection/pagination utilities
│   └── fields.ts                # GraphQL field selection utilities
├── rootSchema.ts                # Main GraphQL schema with all queries
└── rootSync.ts                  # Root sync script for syncing to the database
```

## Example queries

1. **searchDocs** (`globalSearch/`) - Vector-based search across all docs content
2. **error** (`error/`) - Error code lookup for Supabase services
3. **schema** - GraphQL schema introspection

## Key Files

### `rootSchema.ts`

- Main GraphQL schema definition
- Imports all resolvers and combines them into the root query
- Defines the `RootQueryType` with all top-level fields

### `utils/connections.ts`

- Provides `createCollectionType()` for paginated collections
- `GraphQLCollectionBuilder` for building collection responses
- Standard pagination arguments and edge/node patterns

### `utils/fields.ts`

- `graphQLFields()` utility to analyze requested fields in resolvers
- Used to optimize data fetching based on which fields are actually requested
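A rough sketch of what the connection utilities do — trimming a page out of an id-ordered list and reporting whether more items exist. This assumes nothing about the real `connections.ts` API; all names here are illustrative:

```typescript
// Hypothetical sketch of cursor pagination over an id-ordered list.
// The real implementation lives in utils/connections.ts; names are illustrative.
type Item = { id: string }

function paginate(items: Item[], first: number, after?: string) {
  const ordered = [...items].sort((a, b) => a.id.localeCompare(b.id))
  const startAt = after ? ordered.findIndex((i) => i.id > after) : 0
  // Fetching first + 1 items is a common trick to detect whether a next page exists.
  const window = startAt === -1 ? [] : ordered.slice(startAt, startAt + first + 1)
  const hasNextPage = window.length > first
  const page = window.slice(0, first)
  return {
    edges: page.map((node) => ({ node, cursor: node.id })),
    pageInfo: { hasNextPage, endCursor: page.at(-1)?.id ?? null },
  }
}

const data = [{ id: 'a' }, { id: 'b' }, { id: 'c' }]
const page1 = paginate(data, 2)
// page1.edges has 2 entries and hasNextPage is true
const page2 = paginate(data, 2, page1.pageInfo.endCursor ?? undefined)
// page2.edges has 1 entry ('c') and hasNextPage is false
```

In the real endpoint the window comes from a database query (`gt('id', after)` plus a limit) rather than an in-memory array, but the shape of the result is the same.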
## Creating a New Top-Level Query

To add a new GraphQL query, follow these steps:

### 1. Create Query Folder Structure

```bash
mkdir resources/newQuery
touch resources/newQuery/newQueryModel.ts
touch resources/newQuery/newQuerySchema.ts
touch resources/newQuery/newQueryResolver.ts
```

### 2. Define GraphQL Schema (`newQuerySchema.ts`)

```typescript
import { GraphQLObjectType, GraphQLString } from 'graphql'

export const GRAPHQL_FIELD_NEW_QUERY = 'newQuery' as const

export const GraphQLObjectTypeNewQuery = new GraphQLObjectType({
  name: 'NewQuery',
  description: 'Description of what this query returns',
  fields: {
    id: {
      type: GraphQLString,
      description: 'Unique identifier',
    },
    // Add other fields...
  },
})
```

### 3. Create Data Model (`newQueryModel.ts`)

> [!NOTE]
> The data model should be agnostic to GraphQL. It may import argument types
> from `~/__generated__/graphql`, but otherwise all functions and classes
> should be unaware of whether they are called for GraphQL resolution.

> [!TIP]
> The types in `~/__generated__/graphql` for a new endpoint will not exist
> until the code generation is run (step 6 below).

```typescript
import { type RootQueryTypeNewQueryArgs } from '~/__generated__/graphql'
import { convertPostgrestToApiError, type ApiErrorGeneric } from '~/app/api/utils'
import { Result } from '~/features/helpers.fn'
import { supabase } from '~/lib/supabase'

export class NewQueryModel {
  constructor(
    public readonly data: {
      id: string
      // other properties...
    }
  ) {}

  static async loadData(
    args: RootQueryTypeNewQueryArgs,
    requestedFields: Array<string>
  ): Promise<Result<NewQueryModel[], ApiErrorGeneric>> {
    // Implement data fetching logic
    const result = new Result(
      await supabase()
        .from('your_table')
        .select('*')
      // Add filters based on args
    )
      .map((data) => data.map((item) => new NewQueryModel(item)))
      .mapError(convertPostgrestToApiError)

    return result
  }
}
```
### 4. Create Resolver (`newQueryResolver.ts`)

```typescript
import { GraphQLError, GraphQLNonNull, GraphQLString, type GraphQLResolveInfo } from 'graphql'
import { type RootQueryTypeNewQueryArgs } from '~/__generated__/graphql'
import { convertUnknownToApiError, type ApiErrorGeneric } from '~/app/api/utils'
import { Result } from '~/features/helpers.fn'
import { graphQLFields } from '../utils/fields'
import { NewQueryModel } from './newQueryModel'
import { GRAPHQL_FIELD_NEW_QUERY, GraphQLObjectTypeNewQuery } from './newQuerySchema'

async function resolveNewQuery(
  _parent: unknown,
  args: RootQueryTypeNewQueryArgs,
  _context: unknown,
  info: GraphQLResolveInfo
): Promise<NewQueryModel[] | GraphQLError> {
  return (
    await Result.tryCatchFlat(resolveNewQueryImpl, convertUnknownToApiError, args, info)
  ).match(
    (data) => data,
    (error) => {
      console.error(`Error resolving ${GRAPHQL_FIELD_NEW_QUERY}:`, error)
      return new GraphQLError(error.isPrivate() ? 'Internal Server Error' : error.message)
    }
  )
}

async function resolveNewQueryImpl(
  args: RootQueryTypeNewQueryArgs,
  info: GraphQLResolveInfo
): Promise<Result<NewQueryModel[], ApiErrorGeneric>> {
  const fieldsInfo = graphQLFields(info)
  const requestedFields = Object.keys(fieldsInfo)
  return await NewQueryModel.loadData(args, requestedFields)
}

export const newQueryRoot = {
  [GRAPHQL_FIELD_NEW_QUERY]: {
    description: 'Description of what this query does',
    args: {
      id: {
        type: new GraphQLNonNull(GraphQLString),
        description: 'Required argument description',
      },
      // Add other arguments...
    },
    type: GraphQLObjectTypeNewQuery, // or createCollectionType() for lists
    resolve: resolveNewQuery,
  },
}
```
### 5. Register in Root Schema

In `rootSchema.ts`, add your resolver:

```typescript
// Import your resolver
import { newQueryRoot } from './newQuery/newQueryResolver'

// Add to the query fields
export const rootGraphQLSchema = new GraphQLSchema({
  query: new GraphQLObjectType({
    name: 'RootQueryType',
    fields: {
      ...introspectRoot,
      ...searchRoot,
      ...errorRoot,
      ...newQueryRoot, // Add this line
    },
  }),
  types: [
    GraphQLObjectTypeGuide,
    GraphQLObjectTypeReferenceCLICommand,
    GraphQLObjectTypeReferenceSDKFunction,
    GraphQLObjectTypeTroubleshooting,
  ],
})
```

### 6. Update TypeScript Types

Run the GraphQL codegen to update the generated TypeScript types:

```bash
pnpm run -F docs codegen:graphql
```
## Best Practices

1. **Error Handling**: Always use the `Result` class, defined in `apps/docs/features/helpers.fn.ts`
2. **Field Optimization**: Use `graphQLFields()` to fetch only the requested data
3. **Collections**: Use `createCollectionType()` for paginated lists
4. **Naming**: Use `GRAPHQL_FIELD_*` constants for field names
5. **Documentation**: Add GraphQL descriptions to all fields and types
6. **Database**: Use the `supabase()` client for database operations, with `convertPostgrestToApiError`
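The Result pattern referenced above can be approximated with a minimal sketch. This is illustrative only; the real class in `apps/docs/features/helpers.fn.ts` has a richer API (`tryCatchFlat`, `join`, `unwrap`, and so on):

```typescript
// Minimal illustrative Result type; not the actual helpers.fn.ts implementation.
class MiniResult<Ok, Err> {
  private constructor(
    private readonly okValue: Ok | undefined,
    private readonly errValue: Err | undefined,
    private readonly isOkFlag: boolean
  ) {}

  static ok<Ok, Err>(value: Ok): MiniResult<Ok, Err> {
    return new MiniResult<Ok, Err>(value, undefined, true)
  }

  static error<Ok, Err>(error: Err): MiniResult<Ok, Err> {
    return new MiniResult<Ok, Err>(undefined, error, false)
  }

  // Transform the success value; errors pass through untouched.
  map<Next>(fn: (value: Ok) => Next): MiniResult<Next, Err> {
    return this.isOkFlag
      ? MiniResult.ok(fn(this.okValue as Ok))
      : MiniResult.error(this.errValue as Err)
  }

  // Collapse to a single value by handling both branches explicitly.
  match<Out>(onOk: (value: Ok) => Out, onError: (error: Err) => Out): Out {
    return this.isOkFlag ? onOk(this.okValue as Ok) : onError(this.errValue as Err)
  }
}

const doubled = MiniResult.ok<number, string>(21).map((n) => n * 2)
console.log(doubled.match((n) => `ok: ${n}`, (e) => `error: ${e}`)) // prints "ok: 42"
```

The point of the pattern is that resolvers never throw across module boundaries: every fallible step returns a `Result`, and the single `match` at the top of the resolver decides whether to return data or a `GraphQLError`.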
## Testing

Tests are located in `apps/docs/app/api/graphql/tests`. Each top-level query
should have its own test file, located at `<queryName>.test.ts`.

### Test data

Test data uses a local database, seeded with the file at `supabase/seed.sql`. Add
any data required for running your new query.

### Integration tests

Integration tests import the `POST` function defined in
`apps/docs/app/api/graphql/route.ts`, then make a request to this function.

For example:

```ts
import { POST } from '../route'

it('test name', async () => {
  const query = `
    query {
      ...
    }
  `
  const request = new Request('http://localhost/api/graphql', {
    method: 'POST',
    body: JSON.stringify({ query }),
  })

  const result = await POST(request)
})
```

Include at least the following tests:

1. A test that requests all fields (including nested fields) on the new query
   object, and asserts that there are no errors and the requested fields are
   properly returned.
2. A test that triggers an error, and asserts that a GraphQL error is properly
   returned.
@@ -134,6 +134,24 @@ type RootQueryType {

  """Get the details of an error code returned from a Supabase service"""
  error(code: String!, service: Service!): Error

  """Get error codes that can potentially be returned by Supabase services"""
  errors(
    """Returns the first n elements from the list"""
    first: Int

    """Returns elements that come after the specified cursor"""
    after: String

    """Returns the last n elements from the list"""
    last: Int

    """Returns elements that come before the specified cursor"""
    before: String

    """Filter errors by a specific Supabase service"""
    service: Service
  ): ErrorCollection
}

"""A collection of search results containing content from Supabase docs"""

@@ -177,5 +195,44 @@ enum Service {
  AUTH
  REALTIME
  STORAGE
}

"""A collection of Errors"""
type ErrorCollection {
  """A list of edges containing nodes in this collection"""
  edges: [ErrorEdge!]!

  """The nodes in this collection, directly accessible"""
  nodes: [Error!]!

  """Pagination information"""
  pageInfo: PageInfo!

  """The total count of items available in this collection"""
  totalCount: Int!
}

"""An edge in a collection of Errors"""
type ErrorEdge {
  """The Error at the end of the edge"""
  node: Error!

  """A cursor for use in pagination"""
  cursor: String!
}

"""Pagination information for a collection"""
type PageInfo {
  """Whether there are more items after the current page"""
  hasNextPage: Boolean!

  """Whether there are more items before the current page"""
  hasPreviousPage: Boolean!

  """Cursor pointing to the start of the current page"""
  startCursor: String

  """Cursor pointing to the end of the current page"""
  endCursor: String
}"
`;
apps/docs/app/api/graphql/tests/errors.collection.test.ts (new file, 356 lines)
@@ -0,0 +1,356 @@
import { describe, expect, it } from 'vitest'
import { supabase } from '~/lib/supabase'
import { POST } from '../route'

describe('/api/graphql errors collection', () => {
  it('returns a list of errors with pagination info', async () => {
    // Get the expected order of errors from the database
    const { data: dbErrors } = await supabase()
      .schema('content')
      .from('error')
      .select('id, code, ...service(service:name), httpStatusCode:http_status_code, message')
      .is('deleted_at', null)
      .order('id', { ascending: true })

    const errorsQuery = `
      query {
        errors(first: 2) {
          totalCount
          pageInfo {
            hasNextPage
            hasPreviousPage
            startCursor
            endCursor
          }
          edges {
            cursor
            node {
              code
              service
              httpStatusCode
              message
            }
          }
          nodes {
            code
            service
            httpStatusCode
            message
          }
        }
      }
    `
    const request = new Request('http://localhost/api/graphql', {
      method: 'POST',
      body: JSON.stringify({ query: errorsQuery }),
    })

    const result = await POST(request)
    const {
      data: { errors },
      errors: queryErrors,
    } = await result.json()

    expect(queryErrors).toBeUndefined()
    expect(errors.totalCount).toBe(3)
    expect(errors.edges).toHaveLength(2)
    expect(errors.nodes).toHaveLength(2)
    expect(errors.pageInfo.hasNextPage).toBe(true)
    expect(errors.pageInfo.hasPreviousPage).toBe(false)
    expect(errors.pageInfo.startCursor).toBeDefined()
    expect(errors.pageInfo.endCursor).toBeDefined()

    // Compare against the first error from the database
    expect(dbErrors).not.toBe(null)
    const firstDbError = dbErrors![0]
    const firstError = errors.nodes[0]
    expect(firstError.code).toBe(firstDbError.code)
    expect(firstError.service).toBe(firstDbError.service)
    expect(firstError.httpStatusCode).toBe(firstDbError.httpStatusCode)
    expect(firstError.message).toBe(firstDbError.message)

    const firstEdge = errors.edges[0]
    expect(firstEdge.cursor).toBeDefined()
    expect(firstEdge.node).toEqual(firstError)
  })

  it('supports cursor-based pagination', async () => {
    const firstPageQuery = `
      query {
        errors(first: 1) {
          edges {
            cursor
            node {
              code
            }
          }
          pageInfo {
            hasNextPage
            hasPreviousPage
            startCursor
            endCursor
          }
        }
      }
    `
    const firstRequest = new Request('http://localhost/api/graphql', {
      method: 'POST',
      body: JSON.stringify({ query: firstPageQuery }),
    })

    const firstResult = await POST(firstRequest)
    const firstJson = await firstResult.json()
    expect(firstJson.errors).toBeUndefined()
    expect(firstJson.data.errors.edges).toHaveLength(1)
    expect(firstJson.data.errors.pageInfo.hasNextPage).toBe(true)
    expect(firstJson.data.errors.pageInfo.hasPreviousPage).toBe(false)

    const firstCursor = firstJson.data.errors.edges[0].cursor

    const secondPageQuery = `
      query {
        errors(first: 1, after: "${firstCursor}") {
          edges {
            cursor
            node {
              code
            }
          }
          pageInfo {
            hasNextPage
            hasPreviousPage
            startCursor
            endCursor
          }
        }
      }
    `
    const secondRequest = new Request('http://localhost/api/graphql', {
      method: 'POST',
      body: JSON.stringify({ query: secondPageQuery }),
    })

    const secondResult = await POST(secondRequest)
    const secondJson = await secondResult.json()
    expect(secondJson.errors).toBeUndefined()
    expect(secondJson.data.errors.edges).toHaveLength(1)
    expect(secondJson.data.errors.pageInfo.hasPreviousPage).toBe(true)

    expect(firstJson.data.errors.edges[0].node.code).not.toBe(
      secondJson.data.errors.edges[0].node.code
    )
  })

  it('returns empty list when paginating past available results', async () => {
    // Base64 encode a UUID that's guaranteed to be after any real data
    const afterCursor = Buffer.from('ffffffff-ffff-ffff-ffff-ffffffffffff', 'utf8').toString(
      'base64'
    )
    const query = `
      query {
        errors(first: 1, after: "${afterCursor}") {
          edges {
            cursor
            node {
              code
            }
          }
          pageInfo {
            hasNextPage
            hasPreviousPage
            startCursor
            endCursor
          }
        }
      }
    `
    const request = new Request('http://localhost/api/graphql', {
      method: 'POST',
      body: JSON.stringify({ query }),
    })

    const result = await POST(request)
    const json = await result.json()
    expect(json.errors).toBeUndefined()
    expect(json.data.errors.edges).toHaveLength(0)
    expect(json.data.errors.pageInfo.hasNextPage).toBe(false)
    expect(json.data.errors.pageInfo.hasPreviousPage).toBe(true)
  })

  it('supports backward pagination with last', async () => {
    const lastPageQuery = `
      query {
        errors(last: 1) {
          edges {
            cursor
            node {
              code
            }
          }
          pageInfo {
            hasNextPage
            hasPreviousPage
            startCursor
            endCursor
          }
        }
      }
    `
    const lastRequest = new Request('http://localhost/api/graphql', {
      method: 'POST',
      body: JSON.stringify({ query: lastPageQuery }),
    })

    const lastResult = await POST(lastRequest)
    const lastJson = await lastResult.json()
    expect(lastJson.errors).toBeUndefined()
    expect(lastJson.data.errors.edges).toHaveLength(1)
    expect(lastJson.data.errors.pageInfo.hasNextPage).toBe(false)
    expect(lastJson.data.errors.pageInfo.hasPreviousPage).toBe(true)
    const lastCursor = lastJson.data.errors.edges[0].cursor

    const beforeLastQuery = `
      query {
        errors(last: 1, before: "${lastCursor}") {
          edges {
            cursor
            node {
              code
            }
          }
          pageInfo {
            hasNextPage
            hasPreviousPage
            startCursor
            endCursor
          }
        }
      }
    `
    const beforeLastRequest = new Request('http://localhost/api/graphql', {
      method: 'POST',
      body: JSON.stringify({ query: beforeLastQuery }),
    })
    const beforeLastResult = await POST(beforeLastRequest)

    const beforeLastJson = await beforeLastResult.json()
    expect(beforeLastJson.errors).toBeUndefined()
    expect(beforeLastJson.data.errors.edges).toHaveLength(1)
    expect(beforeLastJson.data.errors.pageInfo.hasNextPage).toBe(true)
    expect(beforeLastJson.data.errors.edges[0].node.code).not.toBe(
      lastJson.data.errors.edges[0].node.code
    )
  })

  it('filters by service when service argument is provided', async () => {
    // First, get all errors to check we have errors from different services
    const allErrorsQuery = `
      query {
        errors {
          nodes {
            service
          }
        }
      }
    `
    const allRequest = new Request('http://localhost/api/graphql', {
      method: 'POST',
      body: JSON.stringify({ query: allErrorsQuery }),
    })
    const allResult = await POST(allRequest)
    const allJson = await allResult.json()
    expect(allJson.errors).toBeUndefined()

    // Verify we have errors from multiple services
    const services = new Set(allJson.data.errors.nodes.map((e: any) => e.service))
    expect(services.size).toBeGreaterThan(1)

    // Test filtering by AUTH service
    const authErrorsQuery = `
      query {
        errors(service: AUTH) {
          totalCount
          nodes {
            code
            service
          }
        }
      }
    `
    const authRequest = new Request('http://localhost/api/graphql', {
      method: 'POST',
      body: JSON.stringify({ query: authErrorsQuery }),
    })

    const authResult = await POST(authRequest)
    const authJson = await authResult.json()
    expect(authJson.errors).toBeUndefined()

    // Verify all returned errors are from AUTH service
    expect(authJson.data.errors.nodes.length).toBeGreaterThan(0)
    expect(authJson.data.errors.nodes.every((e: any) => e.service === 'AUTH')).toBe(true)
  })

  it('supports service filtering with pagination', async () => {
    const firstPageQuery = `
      query {
        errors(service: AUTH, first: 1) {
          edges {
            cursor
            node {
              code
              service
            }
          }
          pageInfo {
            hasNextPage
            endCursor
          }
        }
      }
    `
    const firstRequest = new Request('http://localhost/api/graphql', {
      method: 'POST',
      body: JSON.stringify({ query: firstPageQuery }),
    })

    const firstResult = await POST(firstRequest)
    const firstJson = await firstResult.json()
    expect(firstJson.errors).toBeUndefined()

    // Verify the returned error is from AUTH service
    expect(firstJson.data.errors.edges[0].node.service).toBe('AUTH')

    // If there are more AUTH errors, test pagination
    if (firstJson.data.errors.pageInfo.hasNextPage) {
      const cursor = firstJson.data.errors.pageInfo.endCursor
      const secondPageQuery = `
        query {
          errors(service: AUTH, first: 1, after: "${cursor}") {
            edges {
              node {
                code
                service
              }
            }
          }
        }
      `
      const secondRequest = new Request('http://localhost/api/graphql', {
        method: 'POST',
        body: JSON.stringify({ query: secondPageQuery }),
      })

      const secondResult = await POST(secondRequest)
      const secondJson = await secondResult.json()
      expect(secondJson.errors).toBeUndefined()

      // Verify the second page also returns AUTH errors
      expect(secondJson.data.errors.edges[0].node.service).toBe('AUTH')
      // And it's a different error
      expect(secondJson.data.errors.edges[0].node.code).not.toBe(
        firstJson.data.errors.edges[0].node.code
      )
    }
  })
})
@@ -84,6 +84,30 @@ export class MultiError<ErrorType = unknown, Details extends ObjectOrNever = never
  }
}

export class CollectionQueryError extends Error {
  constructor(
    message: string,
    public readonly queryErrors: {
      count?: PostgrestError
      data?: PostgrestError
    }
  ) {
    super(message)
  }

  public static fromErrors(
    countError: PostgrestError | undefined,
    dataError: PostgrestError | undefined
  ): CollectionQueryError {
    const fetchFailedFor =
      countError && dataError ? 'count and collection' : countError ? 'count' : 'collection'
    return new CollectionQueryError(`Failed to fetch ${fetchFailedFor}`, {
      count: countError,
      data: dataError,
    })
  }
}

export function convertUnknownToApiError(error: unknown): ApiError {
  return new ApiError('Unknown error', error)
}
@@ -137,4 +137,24 @@ export class Result<Ok, Error> {
    if (this.isOk()) return onOk(this.internal.data!)
    return onError(this.internal.error!)
  }

  unwrap(): Ok {
    if (!this.isOk()) {
      throw new Error(`Unwrap called on Err: ${this.internal.error}`, {
        cause: this.internal.error,
      })
    }
    return this.internal.data!
  }

  join<OtherOk, OtherError>(
    other: Result<OtherOk, OtherError>
  ): Result<[Ok, OtherOk], [Error, OtherError]> {
    if (!this.isOk() || !other.isOk())
      return Result.error([this.internal.error, other.internal.error]) as Result<
        [Ok, OtherOk],
        [Error, OtherError]
      >
    return Result.ok([this.internal.data!, other.internal.data!])
  }
}
@@ -33,6 +33,7 @@
    "sync": "tsx --conditions=react-server ./resources/rootSync.ts",
    "test": "pnpm supabase start && pnpm run test:local && pnpm supabase stop",
    "test:local": "vitest --exclude \"**/*.smoke.test.ts\"",
    "test:local:unwatch": "vitest --exclude \"**/*.smoke.test.ts\" --run",
    "test:smoke": "pnpm run codegen:references && vitest -t \"prod smoke test\"",
    "troubleshooting:sync": "node features/docs/Troubleshooting.script.mjs",
    "typecheck": "tsc --noEmit"
@@ -1,6 +1,13 @@
|
||||
import { ApiErrorGeneric, convertPostgrestToApiError, NoDataError } from '~/app/api/utils'
|
||||
import { type PostgrestError } from '@supabase/supabase-js'
|
||||
import {
|
||||
ApiErrorGeneric,
|
||||
CollectionQueryError,
|
||||
convertPostgrestToApiError,
|
||||
NoDataError,
|
||||
} from '~/app/api/utils'
|
||||
import { Result } from '~/features/helpers.fn'
|
||||
import { supabase } from '~/lib/supabase'
|
||||
import { type CollectionFetch } from '../utils/connections'
|
||||
|
||||
export const SERVICES = {
|
||||
AUTH: {
|
||||
@@ -15,24 +22,29 @@ export const SERVICES = {
|
||||
} as const
|
||||
|
||||
type Service = keyof typeof SERVICES
|
||||
type ErrorCollectionFetch = CollectionFetch<ErrorModel, { service?: Service }>['fetch']
|
||||
|
||||
export class ErrorModel {
|
||||
public id: string
|
||||
public code: string
|
||||
public service: Service
|
||||
public httpStatusCode?: number
|
||||
public message?: string
|
||||
|
||||
constructor({
|
||||
id,
|
||||
code,
|
||||
service,
|
||||
httpStatusCode: httpStatusCode,
|
||||
httpStatusCode,
|
||||
message,
|
||||
}: {
|
||||
id: string
|
||||
code: string
|
||||
service: Service
|
||||
httpStatusCode?: number
|
||||
message?: string
|
||||
}) {
|
||||
this.id = id
|
||||
this.code = code
|
||||
this.service = service
|
||||
this.httpStatusCode = httpStatusCode
|
||||
@@ -50,19 +62,25 @@ export class ErrorModel {
|
||||
await supabase()
|
||||
.schema('content')
|
||||
.from('error')
|
||||
.select('code, ...service(service:name), httpStatusCode:http_status_code, message')
|
||||
.select('id, code, service(name), httpStatusCode:http_status_code, message')
|
||||
.eq('code', code)
|
||||
.eq('service.name', service)
|
||||
.is('deleted_at', null)
|
||||
.single<{
|
||||
id: string
|
||||
code: string
|
||||
service: Service
|
||||
service: {
|
||||
name: Service
|
||||
}
|
||||
httpStatusCode?: number
|
||||
message?: string
|
||||
}>()
|
||||
)
|
||||
.map((data) => {
|
||||
return new ErrorModel(data)
|
||||
return new ErrorModel({
|
||||
...data,
|
||||
service: data.service.name,
|
||||
})
|
||||
})
|
||||
.mapError((error) => {
|
||||
if (error.code === 'PGRST116') {
|
||||
@@ -71,4 +89,107 @@ export class ErrorModel {
|
||||
return convertPostgrestToApiError(error)
|
||||
})
|
||||
}
|
||||
|
||||
static async loadErrors(
|
||||
args: Parameters<ErrorCollectionFetch>[0]
|
||||
): ReturnType<ErrorCollectionFetch> {
|
||||
const PAGE_SIZE = 20
|
||||
const limit = args?.first ?? args?.last ?? PAGE_SIZE
|
||||
const service = args?.additionalArgs?.service as Service | undefined
|
||||
|
||||
const [countResult, errorCodesResult] = await Promise.all([
|
||||
fetchTotalErrorCount(service),
|
||||
fetchErrorDescriptions({
|
||||
after: args?.after ?? undefined,
|
||||
before: args?.before ?? undefined,
|
||||
reverse: !!args?.last,
|
||||
limit: limit + 1,
|
||||
service,
|
||||
}),
|
||||
])
|
||||
|
||||
return countResult
|
||||
.join(errorCodesResult)
|
||||
.map(([count, errorCodes]) => {
|
||||
const hasMoreItems = errorCodes.length > limit
|
||||
const items = args?.last ? errorCodes.slice(1) : errorCodes.slice(0, limit)
|
||||
|
||||
return {
|
||||
items: items.map((errorCode) => new ErrorModel(errorCode)),
|
||||
totalCount: count,
|
||||
hasNextPage: args?.last ? !!args?.before : hasMoreItems,
|
||||
hasPreviousPage: args?.last ? hasMoreItems : !!args?.after,
|
||||
}
|
||||
})
|
||||
.mapError(([countError, errorCodeError]) => {
|
||||
return CollectionQueryError.fromErrors(countError, errorCodeError)
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
async function fetchTotalErrorCount(service?: Service): Promise<Result<number, PostgrestError>> {
|
||||
const query = supabase()
|
||||
.schema('content')
|
||||
.from('error')
|
||||
.select('id, service!inner(name)', { count: 'exact', head: true })
|
||||
.is('deleted_at', null)
|
||||
|
||||
if (service) {
|
||||
query.eq('service.name', service)
|
||||
}
|
||||
|
||||
const { count, error } = await query
|
||||
if (error) {
|
||||
return Result.error(error)
|
||||
}
|
||||
return Result.ok(count ?? 0)
|
||||
}
|
||||
|
||||
type ErrorDescription = {
|
||||
id: string
|
||||
code: string
|
||||
service: Service
|
||||
httpStatusCode?: number
|
||||
message?: string
|
||||
}
|
||||
|
||||
async function fetchErrorDescriptions({
  after,
  before,
  reverse,
  limit,
  service,
}: {
  after?: string
  before?: string
  reverse: boolean
  limit: number
  service?: Service
}): Promise<Result<ErrorDescription[], PostgrestError>> {
  const query = supabase()
    .schema('content')
    .from('error')
    .select('id, code, service!inner(name), httpStatusCode: http_status_code, message')
    .is('deleted_at', null)
    .order('id', { ascending: reverse ? false : true })

  if (service) {
    query.eq('service.name', service)
  }
  if (after != undefined) {
    query.gt('id', after)
  }
  if (before != undefined) {
    query.lt('id', before)
  }
  query.limit(limit)

  const result = await query
  return new Result(result).map((results) => {
    const transformedResults = (reverse ? results.toReversed() : results).map((error) => ({
      ...error,
      service: error.service.name,
    }))
    return transformedResults as ErrorDescription[]
  })
}
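The keyset logic above — filter `id > after` for forward pages, `id < before` for backward pages, scan descending when paginating backward, then flip back to ascending — can be sketched in memory. The names here are illustrative, not part of the PR:

```typescript
// In-memory sketch of the keyset pagination in fetchErrorDescriptions.
type Row = { id: string }

function pageByKeyset(
  rows: Row[],
  opts: { after?: string; before?: string; reverse: boolean; limit: number }
): Row[] {
  // Order by id, descending when paginating backward (matches the .order() call).
  let result = [...rows].sort((a, b) =>
    opts.reverse ? b.id.localeCompare(a.id) : a.id.localeCompare(b.id)
  )
  // Keyset filters, as in query.gt('id', after) / query.lt('id', before).
  if (opts.after !== undefined) result = result.filter((r) => r.id > opts.after!)
  if (opts.before !== undefined) result = result.filter((r) => r.id < opts.before!)
  result = result.slice(0, opts.limit)
  // A backward page is scanned in reverse, then flipped back to ascending order.
  return opts.reverse ? result.slice().reverse() : result
}
```

Keyset (cursor) pagination stays efficient on large tables because each page is an indexed range scan rather than an `OFFSET` skip.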

@@ -1,14 +1,40 @@
 import { GraphQLError, GraphQLNonNull, GraphQLResolveInfo, GraphQLString } from 'graphql'
-import { type RootQueryTypeErrorArgs } from '~/__generated__/graphql'
-import { convertUnknownToApiError } from '~/app/api/utils'
+import type {
+  ErrorCollection,
+  RootQueryTypeErrorArgs,
+  RootQueryTypeErrorsArgs,
+  Service,
+} from '~/__generated__/graphql'
+import { ApiError, convertUnknownToApiError } from '~/app/api/utils'
+import { Result } from '~/features/helpers.fn'
+import {
+  createCollectionType,
+  GraphQLCollectionBuilder,
+  paginationArgs,
+  type CollectionFetch,
+} from '../utils/connections'
 import { ErrorModel } from './errorModel'
+import {
+  GRAPHQL_FIELD_ERROR_GLOBAL,
+  GRAPHQL_FIELD_ERRORS_GLOBAL,
+  GraphQLEnumTypeService,
+  GraphQLObjectTypeError,
+} from './errorSchema'
+
+/**
+ * Encodes a string to base64
+ */
+function encodeBase64(str: string): string {
+  return Buffer.from(str, 'utf8').toString('base64')
+}
+
+/**
+ * Decodes a base64 string back to the original string
+ */
+function decodeBase64(base64: string): string {
+  return Buffer.from(base64, 'base64').toString('utf8')
+}
 
 async function resolveSingleError(
   _parent: unknown,
   args: RootQueryTypeErrorArgs,
@@ -26,6 +52,51 @@ async function resolveSingleError
   )
 }
 
+async function resolveErrors(
+  _parent: unknown,
+  args: RootQueryTypeErrorsArgs,
+  _context: unknown,
+  _info: GraphQLResolveInfo
+): Promise<ErrorCollection | GraphQLError> {
+  return (
+    await Result.tryCatchFlat(
+      async (...args) => {
+        const fetch: CollectionFetch<ErrorModel, { service?: Service }, ApiError>['fetch'] = async (
+          fetchArgs
+        ) => {
+          const result = await ErrorModel.loadErrors({
+            ...fetchArgs,
+            additionalArgs: {
+              service: args[0].service ?? undefined,
+            },
+          })
+          return result.mapError((error) => new ApiError('Failed to resolve error codes', error))
+        }
+        return await GraphQLCollectionBuilder.create<ErrorModel, { service?: Service }, ApiError>({
+          fetch,
+          args: {
+            ...args[0],
+            // Decode base64 cursors before passing to fetch function
+            after: args[0].after ? decodeBase64(args[0].after) : undefined,
+            before: args[0].before ? decodeBase64(args[0].before) : undefined,
+          },
+          getCursor: (item) => encodeBase64(item.id),
+        })
+      },
+      convertUnknownToApiError,
+      args
+    )
+  ).match(
+    (data) => data as ErrorCollection,
+    (error) => {
+      console.error(`Error resolving ${GRAPHQL_FIELD_ERRORS_GLOBAL}:`, error)
+      return error instanceof GraphQLError
+        ? error
+        : new GraphQLError(error.isPrivate() ? 'Internal Server Error' : error.message)
+    }
+  )
+}
+
 export const errorRoot = {
   [GRAPHQL_FIELD_ERROR_GLOBAL]: {
     description: 'Get the details of an error code returned from a Supabase service',
@@ -41,3 +112,18 @@ export const errorRoot = {
     resolve: resolveSingleError,
   },
 }
+
+export const errorsRoot = {
+  [GRAPHQL_FIELD_ERRORS_GLOBAL]: {
+    description: 'Get error codes that can potentially be returned by Supabase services',
+    args: {
+      ...paginationArgs,
+      service: {
+        type: GraphQLEnumTypeService,
+        description: 'Filter errors by a specific Supabase service',
+      },
+    },
+    type: createCollectionType(GraphQLObjectTypeError),
+    resolve: resolveErrors,
+  },
+}
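The resolver above treats cursors as opaque to clients: incoming `after`/`before` values are base64-decoded before querying, and outgoing cursors are base64-encoded row ids. A sketch of the round trip using Node's `Buffer`, plus a hypothetical client query (field and argument names follow the schema in this diff; the `AUTH` value comes from the seed data):

```typescript
// Cursor handling as in resolveErrors: opaque cursors are base64-encoded row ids.
function encodeCursor(id: string): string {
  return Buffer.from(id, 'utf8').toString('base64')
}

function decodeCursor(cursor: string): string {
  return Buffer.from(cursor, 'base64').toString('utf8')
}

// Illustrative client query against the new 'errors' field.
const exampleQuery = `
  query {
    errors(first: 2, service: AUTH) {
      edges { cursor node { code } }
      pageInfo { hasNextPage endCursor }
    }
  }
`
```

To fetch the next page, a client passes `pageInfo.endCursor` back as the `after` argument unchanged; the server decodes it into the underlying UUID.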

@@ -8,6 +8,7 @@ import {
 import { SERVICES } from './errorModel'
 
 export const GRAPHQL_FIELD_ERROR_GLOBAL = 'error' as const
+export const GRAPHQL_FIELD_ERRORS_GLOBAL = 'errors' as const
 
 export const GraphQLEnumTypeService = new GraphQLEnumType({
   name: 'Service',

@@ -32,7 +32,8 @@ async function resolveSearch
     info
   )
 ).match(
-  (data) => GraphQLCollectionBuilder.create({ items: data }),
+  // Building a collection from an array is infallible
+  async (data) => (await GraphQLCollectionBuilder.create({ items: data })).unwrap(),
   (error) => {
     console.error(`Error resolving ${GRAPHQL_FIELD_SEARCH_GLOBAL}:`, error)
     return new GraphQLError(error.isPrivate() ? 'Internal Server Error' : error.message)

@@ -50,7 +50,8 @@ export const GraphQLObjectTypeGuide = new GraphQLObjectType({
       }),
       description:
         'The subsections of the document. If the document is returned from a search match, only matching content chunks are returned. For the full content of the original document, use the content field in the parent Guide.',
-      resolve: (node: GuideModel) => GraphQLCollectionBuilder.create({ items: node.subsections }),
+      resolve: async (node: GuideModel) =>
+        (await GraphQLCollectionBuilder.create({ items: node.subsections })).unwrap(),
     },
   },
 })

@@ -6,7 +6,7 @@ import {
   printSchema,
 } from 'graphql'
 import { RootQueryTypeResolvers } from '~/__generated__/graphql'
-import { errorRoot } from './error/errorResolver'
+import { errorRoot, errorsRoot } from './error/errorResolver'
 import { searchRoot } from './globalSearch/globalSearchResolver'
 import { GraphQLObjectTypeGuide } from './guide/guideSchema'
 import { GraphQLObjectTypeReferenceCLICommand } from './reference/referenceCLISchema'
@@ -37,6 +37,7 @@ export const rootGraphQLSchema = new GraphQLSchema({
       ...introspectRoot,
       ...searchRoot,
       ...errorRoot,
+      ...errorsRoot,
     },
   }),
   types: [

@@ -1,5 +1,6 @@
 import {
   GraphQLBoolean,
+  GraphQLError,
   GraphQLInt,
   GraphQLList,
   GraphQLNonNull,
@@ -7,6 +8,7 @@ import {
   type GraphQLOutputType,
   GraphQLString,
 } from 'graphql'
+import { Result } from '~/features/helpers.fn'
 import { nanoId } from '~/features/helpers.misc'
 
 /**
@@ -185,7 +187,7 @@ export function createCollectionType(
 /**
  * Interface for standard pagination arguments for a GraphQL connection
  */
-interface IPaginationArgs {
+export interface IPaginationArgs {
   first?: number | null
   after?: string | null
   last?: number | null
@@ -219,21 +221,26 @@ export const paginationArgs = {
  * GraphQL query. Takes standard pagination args and returns standard page
  * information.
  */
-interface CollectionFetch<ItemType, FetchArgs = unknown> {
+export interface CollectionFetch<ItemType, FetchArgs = unknown, ErrorType = Error> {
   fetch: (
     args?: IPaginationArgs & {
       additionalArgs?: FetchArgs
     }
-  ) => Promise<{
-    items: Array<ItemType>
-    totalCount: number
-    hasNextPage?: boolean
-    hasPreviousPage?: boolean
-  }>
+  ) => Promise<
+    Result<
+      {
+        items: Array<ItemType>
+        totalCount: number
+        hasNextPage?: boolean
+        hasPreviousPage?: boolean
+      },
+      ErrorType
+    >
+  >
   args?: IPaginationArgs & {
     additionalArgs?: FetchArgs
   }
-  getCursor?: (item: ItemType, idx?: number) => string
+  getCursor: (item: ItemType, idx?: number) => string
   items?: never
 }
 
@@ -248,41 +255,59 @@ interface CollectionInMemory<ItemType> {
   getCursor?: never
 }
 
+interface GraphQLCollection<ItemType> {
+  edges: Array<{ node: ItemType; cursor: string }>
+  nodes: Array<ItemType>
+  totalCount: number
+  pageInfo: {
+    hasNextPage: boolean
+    hasPreviousPage: boolean
+    startCursor: string | null
+    endCursor: string | null
+  }
+}
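Building a `GraphQLCollection`-shaped value from an in-memory array (what the builder's `items` branch does) can be sketched as follows; the helper name and cursor scheme are illustrative, not the PR's implementation:

```typescript
type Edge<T> = { node: T; cursor: string }

// Build a Relay-style collection from an in-memory array, mirroring the
// GraphQLCollection shape above. Cursor generation here is illustrative.
function collectionFromArray<T>(items: T[], getCursor: (item: T, idx: number) => string) {
  const edges: Edge<T>[] = items.map((node, idx) => ({ node, cursor: getCursor(node, idx) }))
  return {
    edges,
    nodes: items,
    totalCount: items.length,
    pageInfo: {
      // A full in-memory array has no further pages in either direction.
      hasNextPage: false,
      hasPreviousPage: false,
      startCursor: edges.length > 0 ? edges[0].cursor : null,
      endCursor: edges.length > 0 ? edges[edges.length - 1].cursor : null,
    },
  }
}
```

Note that `startCursor`/`endCursor` are `null` for an empty collection, matching the nullable cursor fields in the interface.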

 /**
  * Union type for parameters to build a collection. Can be a remote collection
  * that needs to be fetched or a local one in memory.
  */
-type CollectionBuildArgs<ItemType, FetchArgs = unknown> =
-  | CollectionFetch<ItemType, FetchArgs>
+type CollectionBuildArgs<ItemType, FetchArgs = unknown, ErrorType = Error> =
+  | CollectionFetch<ItemType, FetchArgs, ErrorType>
   | CollectionInMemory<ItemType>
 
 export class GraphQLCollectionBuilder {
-  static async create<ItemType, FetchArgs = unknown>(
-    options: CollectionBuildArgs<ItemType, FetchArgs>
-  ) {
+  static async create<ItemType, FetchArgs = unknown, ErrorType = Error>(
+    options: CollectionBuildArgs<ItemType, FetchArgs, ErrorType>
+  ): Promise<Result<GraphQLCollection<ItemType>, GraphQLError | ErrorType>> {
     const { fetch, args = {}, getCursor, items } = options
 
     if (items) {
-      return GraphQLCollectionBuilder.paginateArray({ items, args })
+      return Result.ok(GraphQLCollectionBuilder.paginateArray({ items, args }))
     }
 
-    const result = await fetch(args)
-    const { items: fetchedItems, totalCount, hasNextPage = false, hasPreviousPage = false } = result
-    const edges = fetchedItems.map((item) => {
-      return { node: item, cursor: getCursor?.(item) ?? '' }
-    })
-
-    return {
-      edges,
-      nodes: fetchedItems,
-      totalCount,
-      pageInfo: {
-        hasNextPage,
-        hasPreviousPage,
-        startCursor: edges.length > 0 ? edges[0].cursor : null,
-        endCursor: edges.length > 0 ? edges[edges.length - 1].cursor : null,
-      },
+    if (args.first && args.last) {
+      return Result.error(new GraphQLError('Cannot specify both first and last arguments'))
+    }
+
+    return (await fetch(args)).map(
+      ({ items: fetchedItems, totalCount, hasNextPage = false, hasPreviousPage = false }) => {
+        const edges = fetchedItems.map((item) => {
+          return { node: item, cursor: getCursor(item) }
+        })
+
+        return {
+          edges,
+          nodes: fetchedItems,
+          totalCount,
+          pageInfo: {
+            hasNextPage,
+            hasPreviousPage,
+            startCursor: edges.length > 0 ? edges[0].cursor : null,
+            endCursor: edges.length > 0 ? edges[edges.length - 1].cursor : null,
+          },
+        }
+      }
+    )
   }
 
   private static paginateArray<T>({ items, args }: CollectionInMemory<T>) {

@@ -9,6 +9,7 @@ export type Database = {
         created_at: string | null
         deleted_at: string | null
         http_status_code: number | null
+        id: string
         message: string | null
         metadata: Json | null
         service: string
@@ -19,6 +20,7 @@ export type Database = {
         created_at?: string | null
         deleted_at?: string | null
         http_status_code?: number | null
+        id?: string
         message?: string | null
         metadata?: Json | null
         service: string
@@ -29,6 +31,7 @@ export type Database = {
         created_at?: string | null
         deleted_at?: string | null
         http_status_code?: number | null
+        id?: string
         message?: string | null
         metadata?: Json | null
         service?: string

@@ -15,4 +15,11 @@ values
   (select id from content.service where name = 'AUTH'),
   500,
   'This is a test error message'
-);
+),
+('test_code2', (select id from content.service where name = 'AUTH'), 429, 'Too many requests'),
+(
+  'test_code3',
+  (select id from content.service where name = 'REALTIME'),
+  500,
+  'A realtime error message'
+);