Website changes 2023
As mentioned in my previous post, I recently overhauled this website. It has been completely rebuilt, again. I moved from Astro, the best choice for a static site, back to Next.js, specifically the new app directory (still in beta as of writing). In this post, I'll go over why I switched, what I switched, and how I switched it.
Firstly, I'll mention a few changes since the previous post.
- I have dropped Sanity as my CMS in favour of `.mdx` files combined with Contentlayer. What are `.mdx` files? MDX is extended Markdown that combines Markdown with JSX, so you can import React components inside Markdown files. (For the non-developers out there: Markdown is a lightweight markup language that provides a simple way to format plain text using a set of conventions. You've probably experienced it through apps like WhatsApp when emboldening text; if you're keen to know more, ask ChatGPT.)

But why did I decide to move? The short answer is that I'm a magpie: I love the new (relative to me) and shiny. The less short answer is that I've wanted to learn how MDX works for a while but never had the opportunity at work, so I created one. The move lets me own my data, as it's no longer on a server owned and controlled by Sanity, and makes my data portable, as it's not stuck in a proprietary database. It also makes it easier to add one-off features to a post, as I can quickly make a component and embed it in the file, whereas before I'd also have to develop the same thing on the Sanity side.
And finally, it provides a better workflow. I'm typing this post in Ulysses, which uses Markdown by default, so I can simply copy it over and I'm done. Previously, everything had to be reformatted in Sanity's WYSIWYG editor (I know there's a Markdown plugin for Sanity, but I didn't want to go back and redo everything). Next, how did I get my data from Sanity to MDX? Before looking into it, I was worried that I would have to do it manually for each blog post 😮💨, but then I thought: why don't I ask ChatGPT (GPT-4)?
> How would I export my data from Sanity.io into Markdown files?
Damn. It's so good. ChatGPT provided a Node script to query data from Sanity, format it as Markdown and then save it as Markdown files. Obviously, this didn't work as-is for my data and needs, but it got me a lot of the way there. Together with ChatGPT, I was able to write a few different scripts (posts, categories, projects) to get all my data from Sanity into Markdown. Finally, I needed to use my content in my TypeScript Next app. That's where Contentlayer came into play: it basically provides a type-safe interface between your content and your application. It's remarkable. (A sketch of the kind of export script is included after this list.)
- I have changed the blog directory from `/writes` to `/blog` because I decided to be normal. I have set up redirects so any legacy links go to the right place (see the config sketch after this list).
- I have removed the like button from blog posts because I'm not concerned with vanity metrics, and I could not be bothered to rewrite the mutation that was saving data in Sanity for the new setup described above.
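For the curious, here's a minimal sketch of the kind of export script ChatGPT helped me arrive at. It assumes `@sanity/client` and a `post` document type with field names like `title`, `slug` and `body`; my actual schema and serialisation were more involved, so treat this as illustrative rather than the exact script.

```ts
import fs from "node:fs/promises";
import { createClient } from "@sanity/client";

const client = createClient({
  projectId: "your-project-id", // placeholder
  dataset: "production",
  apiVersion: "2023-01-01",
  useCdn: false,
});

// Naive Portable Text -> Markdown: handles plain paragraph blocks only.
// Real content (marks, images, code blocks) needs a proper serialiser.
function portableTextToMarkdown(blocks: any[]): string {
  return blocks
    .filter((block) => block._type === "block")
    .map((block) => block.children.map((child: any) => child.text).join(""))
    .join("\n\n");
}

async function exportPosts() {
  // GROQ query for every post (field names are assumptions)
  const posts = await client.fetch(
    `*[_type == "post"]{ title, "slug": slug.current, publishedAt, body }`
  );

  for (const post of posts) {
    const frontmatter = [
      "---",
      `title: "${post.title}"`,
      `date: "${post.publishedAt}"`,
      "---",
    ].join("\n");

    await fs.writeFile(
      `content/posts/${post.slug}.mdx`,
      `${frontmatter}\n\n${portableTextToMarkdown(post.body)}\n`
    );
  }
}

exportPosts();
```

And the blog redirect is only a handful of lines in `next.config.js`, assuming slug-for-slug parity between the old and new paths:

```js
/** @type {import('next').NextConfig} */
const nextConfig = {
  async redirects() {
    return [
      {
        source: "/writes/:slug*",
        destination: "/blog/:slug*",
        permanent: true,
      },
    ];
  },
};

module.exports = nextConfig;
```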
Astro is a relatively new and exciting front-end framework, designed to build faster websites while shipping less JavaScript to the client, and it's truly wonderful. However, as projects evolve, you might find yourself wanting to switch to a more established framework like Next. Which I did. I also wanted to build with the new app dir. To convert my Astro project to Next, I started by creating a new branch, moved the entire Astro project into an astro directory so I could easily reference the files, then installed Next:
```sh
pnpx create-next-app@latest --experimental-app
```
Once Next was set up, along with supporting packages like TailwindCSS, I began migrating files from my Astro project that I wanted to keep or that needed only minor changes. This included my query hooks for getting data from Sanity (now redundant) and most of my components. Most components just needed to be renamed from `.astro` to `.tsx` and reformatted.
One of the biggest reasons to use the new app dir is to trial React Server Components (RSC). RSC is the next step change for React, in the same way hooks were. You can read up on them here, but in short: by default, pages and components are server-rendered, including any data fetches. Previously, components in Next couldn't fetch data without doing it client-side, so this is a massive DX improvement; there's no more fetching in a page and then prop-drilling into the desired component, or reaching for Context. Component needs data? Fetch it there and then, and the fetch happens server-side. The main components utilising RSC are my activity widgets. Spotify is a client component, but more on that later. My Audible and health widgets are both RSC, and each is supported by a secondary service that I've set up, which I won't go into right now. In each component, there's an async function to fetch data, which is then called inside the component. See the example below:
```tsx
async function getBook(): Promise<Book | null> {
  // fetch
  const res = await fetch(ENDPOINT_URL, {
    next: {
      revalidate: 60 * 60 * 24, // this will invalidate the data every 24 hours
    },
  });

  // handle error
  if (!res.ok) {
    throw new Error(`HTTP error! status: ${res.status}`);
  }

  // get data off response
  const data = await res.json();

  // if no data, bail
  if (!data) {
    return null;
  }

  return data[0];
}

export default async function Audible() {
  // call the fetch function directly inside the server component
  const data = await getBook(); // this is typed as Book | null
  // ...render the widget with `data`
}
```
The mental model presented in this example just makes so much sense to me. I'm colocating my data fetch with its usage, and the types are inherited, which is so much nicer than assigning them multiple times as I've had to do in the past.
To use client components, you must declare `"use client"` at the top of the file. I'm not a fan of this implementation, but I don't have a better solution to offer, so 🤐. As mentioned above, my Spotify widget is a client component, as it fetches data on the fly and uses hooks. I'm using SWR to hit my API endpoint every 6 seconds (the duration of the playing animation) to get the currently playing track, if any. It's using `https://api.spotify.com/v1/me/player/currently-playing` behind the scenes, but I'm also cleaning up the data, so I only get what I need on the client. I'm still using the pages directory for my API routes because the new route handlers weren't available at the time of building, but I will be migrating at some point soon.
"use client";
export default function NowPlaying(...) {
const fetcher = (url: URL) => fetch(url).then((r) => r.json());
const { data, isLoading } = useSWR<NowPlaying>(ENDPOINT_URL, fetcher, {
refreshInterval: 6000,
});
...
}
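The API route behind that endpoint looks roughly like the sketch below. The route path, the shape of the cleaned-up payload and the exact helper code are assumptions for illustration; the Spotify endpoint and its 204 no-content behaviour are real.

```ts
// pages/api/now-playing.ts (path is an assumption)
import type { NextApiRequest, NextApiResponse } from "next";

const SPOTIFY_URL = "https://api.spotify.com/v1/me/player/currently-playing";

// Standard Spotify refresh-token flow; credentials come from env vars
async function getAccessToken(): Promise<string> {
  const basic = Buffer.from(
    `${process.env.SPOTIFY_CLIENT_ID}:${process.env.SPOTIFY_CLIENT_SECRET}`
  ).toString("base64");

  const res = await fetch("https://accounts.spotify.com/api/token", {
    method: "POST",
    headers: {
      Authorization: `Basic ${basic}`,
      "Content-Type": "application/x-www-form-urlencoded",
    },
    body: new URLSearchParams({
      grant_type: "refresh_token",
      refresh_token: process.env.SPOTIFY_REFRESH_TOKEN!,
    }),
  });
  const { access_token } = await res.json();
  return access_token;
}

export default async function handler(
  _req: NextApiRequest,
  res: NextApiResponse
) {
  const accessToken = await getAccessToken();
  const response = await fetch(SPOTIFY_URL, {
    headers: { Authorization: `Bearer ${accessToken}` },
  });

  // Spotify returns 204 when nothing is playing
  if (response.status === 204) {
    return res.status(200).json({ isPlaying: false });
  }

  const song = await response.json();

  // strip the response down to only what the client needs
  return res.status(200).json({
    isPlaying: song.is_playing,
    title: song.item.name,
    artist: song.item.artists.map((a: { name: string }) => a.name).join(", "),
    url: song.item.external_urls.spotify,
  });
}
```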
For the new design, I was inspired by prominent front-end devs and designers I follow on Twitter. I thought about doing a mockup in Figma first, but I always get fatigued because I'm not quick in Figma, so I moved straight to designing in the browser with Tailwind. Its speed and flexibility make it perfect for rapid prototyping: building, iterating and experimenting directly on the design.
My blog is the section with the most new features: I added pagination, topics (categories), series and month/year archives. Pagination allows for smaller, quicker pages; before, the blog was just one long stream, which would have become annoying over time. Topics group similar content; series group posts that come in parts, like this one; and archives make it easy to catch up on a month's or year's posts.
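Under the hood, these features are mostly just functions over Contentlayer's generated, type-safe data. As a rough sketch (assuming a `Post` document type with a `date` field; the page size is made up):

```ts
import { allPosts, type Post } from "contentlayer/generated";

const PAGE_SIZE = 10; // illustrative, not necessarily what the site uses

// posts sorted newest-first
function sortedPosts(): Post[] {
  return [...allPosts].sort(
    (a, b) => new Date(b.date).getTime() - new Date(a.date).getTime()
  );
}

// one page of posts plus the total page count, for the pagination UI
export function getPage(page: number): { posts: Post[]; totalPages: number } {
  const posts = sortedPosts();
  return {
    posts: posts.slice((page - 1) * PAGE_SIZE, page * PAGE_SIZE),
    totalPages: Math.ceil(posts.length / PAGE_SIZE),
  };
}
```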
I removed the snippets section as it wasn't very useful to me. Originally, I thought it would be handy to have a central location to reference snippets of code, but since Copilot I haven't had the need.
The last thing I want to mention is Open Graph (OG) images. I'm using the vercel/og library to generate images on the fly and cache them at the edge. Buzzwords! That just means they're dynamically generated from some static assets and my data, then cached, so they're super quick and reduce compute.
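A minimal sketch of such a route, using `@vercel/og`'s `ImageResponse` on the edge runtime (the route path, query param and styling are placeholders, not my actual design):

```tsx
// pages/api/og.tsx (path and params are assumptions)
import { ImageResponse } from "@vercel/og";

export const config = { runtime: "edge" };

export default function handler(request: Request) {
  const { searchParams } = new URL(request.url);
  const title = searchParams.get("title") ?? "My website";

  return new ImageResponse(
    (
      <div
        style={{
          display: "flex",
          alignItems: "center",
          justifyContent: "center",
          width: "100%",
          height: "100%",
          fontSize: 64,
          background: "#000",
          color: "#fff",
        }}
      >
        {title}
      </div>
    ),
    { width: 1200, height: 630 } // standard OG image dimensions
  );
}
```

Repeat requests for the same image are then served from the edge cache, so they don't pay the render cost again.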
To wrap up, this website revamp journey has been an exciting exploration of new technologies and approaches. The switch from Astro to Next.js, and the embrace of MDX files with Contentlayer, have opened up fresh possibilities for both the design and functionality of the site. The enhancements to the blog section, along with the removal of snippets and implementation of dynamic Open Graph images, demonstrate a continuous drive to refine and optimise the user experience as well as my skills. As the digital landscape continues to evolve, so too will this website, ever-adapting and growing to better serve its audience and me.