* Move the TypeScript version definition to pnpm-workspace (sketched below).
* Bump TypeScript to 5.9.
* Minor fixes to satisfy the compiler.
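For reference, a minimal sketch of the catalog approach, assuming pnpm's catalogs feature (the package globs and exact version range here are illustrative):

```yaml
# pnpm-workspace.yaml (sketch)
packages:
  - "packages/*"

catalog:
  # single source of truth for the compiler version across the workspace
  typescript: ^5.9.0
```

Each package's devDependencies then reference it with `"typescript": "catalog:"` instead of pinning a version.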
---------
Co-authored-by: Raúl Barroso <code@raulb.dev>
strictNullChecks was off for docs, which lets errors slip through and
leads to incorrect required/optional typing on Zod-inferred types. This
PR enables strictNullChecks and fixes all the existing violations.
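A minimal sketch of the failure mode, with a made-up schema for illustration:

```ts
import { z } from "zod";

// Hypothetical schema: description is genuinely optional.
const PageMeta = z.object({
  title: z.string(),
  description: z.string().optional(),
});
type PageMeta = z.infer<typeof PageMeta>;

function describe(meta: PageMeta): string {
  // With strictNullChecks off, `meta.description` is typed as plain
  // `string`, so this compiles and then throws at runtime when the field
  // is absent. With strictNullChecks on, it is `string | undefined` and
  // this line is a compile error until you narrow it.
  return meta.description.toUpperCase();
}
```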
# Before
Crawler pages that used MDX components with defined attributes errored.
Example: /docs/reference/kotlin/installing
# After
The pages do not error.
The end of the Great Migration is here!!!
Moves the last pages over, deleting pages like the FAQ that we don't use
and that contain duplicated information anyway.
The dev secret auth page URL had to change: App Router treats
underscore-prefixed path segments as private folders and excludes them
from routing.
Also fixes the not-found recommendations to use the proper Next.js
not-found page so it returns a 404 as it should. (I was under the
erroneous impression that I couldn't get the pathname in not-found.tsx;
that turned out to be wrong, so this works better.)
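For the record, the pattern is roughly this (a sketch; the markup and the recommendations logic are illustrative):

```tsx
// app/not-found.tsx -- must be a client component to read the pathname
"use client";

import { usePathname } from "next/navigation";

export default function NotFound() {
  // usePathname still reports the URL the visitor actually requested,
  // even though Next.js is rendering the not-found page for it.
  const pathname = usePathname();

  return (
    <main>
      <h1>Page not found</h1>
      {/* Hypothetical: recommendations keyed off the requested path */}
      <p>Nothing lives at {pathname}.</p>
    </main>
  );
}
```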
Migrates client SDK References to App Router. (Management and CLI API references aren't migrated yet, nor are self-hosting config references.)
Some notes:
Big changes to the way crawler pages are built and individual section URLs (e.g., javascript/select) are served. All of these used to be SSG-generated pages, but the number of heavy pages was just too much to handle: builds were slow as molasses, my laptop sounded like it was taking off, and CI sometimes failed to build them at all.
Tried various tricks with caching and pre-generating data but no dice.
So I switched to building only one copy of each SDK+version page and serving the sub-URLs through a response rewrite. That covers the actual user-visible pages.
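Roughly, the rewrite works like this (a sketch assuming Next.js middleware; the /docs/reference path shape is illustrative):

```ts
// middleware.ts (sketch): map every section sub-URL onto the single
// prebuilt SDK+version page, which then shows the right section itself.
import { NextResponse, type NextRequest } from "next/server";

export function middleware(request: NextRequest) {
  // e.g. /docs/reference/javascript/v2/select -> /docs/reference/javascript/v2
  const match = request.nextUrl.pathname.match(
    /^(\/docs\/reference\/[^/]+\/[^/]+)\/.+$/,
  );
  if (match) {
    const url = request.nextUrl.clone();
    url.pathname = match[1];
    // The browser keeps the original URL; only the page served changes.
    return NextResponse.rewrite(url);
  }
  return NextResponse.next();
}

export const config = {
  matcher: ["/docs/reference/:path*"],
};
```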
For the bot pages, each sub-URL needs to be its own page, but prebuilding them doesn't work, and rendering on demand from React components is too slow (we're looking for a super-fast response here for SEO). Instead I switched to an API route that serves very minimal, hand-crafted HTML. It looks ugly, but it's purely for the search bots.
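The bot route is roughly this shape (a sketch; the route path, params handling, and loader are stand-ins, not the real code):

```ts
// app/crawlers/[sdk]/[section]/route.ts (hypothetical path)
import { NextResponse } from "next/server";

// Stand-in for however the docs content is actually loaded.
async function getSectionContent(
  sdk: string,
  section: string,
): Promise<{ title: string; bodyHtml: string } | null> {
  return { title: `${sdk} / ${section}`, bodyHtml: "<p>…</p>" };
}

export async function GET(
  _request: Request,
  { params }: { params: Promise<{ sdk: string; section: string }> },
) {
  const { sdk, section } = await params; // Next 15-style async params
  const content = await getSectionContent(sdk, section);
  if (!content) return new NextResponse("Not found", { status: 404 });

  // Bare-bones markup on purpose: fast to serve, trivial for bots to parse.
  const html = `<!doctype html>
<html><head><title>${content.title}</title></head>
<body><h1>${content.title}</h1>${content.bodyHtml}</body></html>`;

  return new NextResponse(html, {
    headers: { "content-type": "text/html; charset=utf-8" },
  });
}
```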
You can test what bots see by running:

```bash
curl --user-agent "Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) AppleWebKit/536.26 (KHTML, like Gecko) Version/6.0 Mobile/10A5376e Safari/8536.25 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" <URL_OF_PAGE>
```
Also added some smoke tests to run against prod for the crawler routes, since we don't keep an eye on those regularly, and Vercel config changes could surprise-break them. Tested the meta images on Open Graph and all seems to work fine.
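The smoke tests are roughly this shape (a sketch using Vitest; the URL list and assertions are illustrative):

```ts
// crawler-routes.smoke.test.ts -- run against prod, outside the normal CI build
import { describe, expect, it } from "vitest";

const GOOGLEBOT_UA =
  "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";

// Illustrative sample; the real list would cover each SDK and section type.
const CRAWLER_URLS = ["https://example.com/docs/reference/javascript/select"];

describe("crawler routes", () => {
  it.each(CRAWLER_URLS)("serves bot HTML for %s", async (url) => {
    const response = await fetch(url, {
      headers: { "user-agent": GOOGLEBOT_UA },
    });
    expect(response.status).toBe(200);
    expect(response.headers.get("content-type")).toContain("text/html");
    // The bot pages are hand-crafted HTML, so a title tag must be present.
    expect(await response.text()).toContain("<title>");
  });
});
```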
With this approach, full production builds are really fast: ~5 minutes.
Starts using the new type spec handling, which is better at finding params automatically, so I could remove some of the manually written ones from the spec files.