Svelte rendering and SEO

This goes beyond Svelte: SEO is a concern no matter which fancy JavaScript tool you use. Never forget that.

Svelte is strong. It generates efficient Vanilla JavaScript from a hybrid HTML-like syntax.

Even if Svelte’s ecosystem is still young, you’ll find many resources to boost your project and save time. But how do you handle SEO? Before we can answer that question, we need to explore essential concepts.


We won’t see how to super-boost your ranking. The idea is to avoid harming indexation and to understand what’s at stake.

What is rendering?

The rendering engine parses your documents and displays the parsed content on the screen.

There are various rendering engines. WebKit and its fork Blink are probably the most widespread.

Before you can see something on the screen, there are many steps (layers) and calculations. That is why you need to tune many things, including stylesheets and scripts, to optimize global rendering and prevent render-blocking issues.

Why use JavaScript for rendering?

JavaScript-powered frameworks are increasingly popular. Websites and apps are fast and smooth, and thanks to beautiful tools such as Svelte, the setup is easier than ever.

You know, simplicity is a feature. If something is too complicated, it’s rarely the right way. Keeping things stupid and simple requires expertise and accuracy.

With JavaScript websites, the rendering can be excellent. The difference is that you use the browser, not a server, to render your project. I know it’s way more subtle than that, but let’s keep it basic.

As a result, you get more interactivity (and reactivity ^^) and tons of great features.

Client-Side Rendering (CSR)

Roughly speaking, it’s rendering with the browser. You send some basic HTML structure with a relatively small piece of JavaScript to the browser, and the “magic” happens in there.

The JavaScript puts the puzzle (your content) together. Keep in mind that not all bots can run JavaScript.
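To make that concrete, here is a minimal sketch of CSR in plain JavaScript (the shell markup and the render function are illustrative, not Svelte’s actual output):

```javascript
// Sketch: what a CSR response looks like vs. what the browser ends up with.
// The server only ships an empty shell; the bundled script builds the content.
const shell = `
<!doctype html>
<html>
  <head><title>My app</title></head>
  <body>
    <div id="app"></div>
    <script src="/bundle.js"></script>
  </body>
</html>`;

// Simplified stand-in for what bundle.js does in the browser:
function render(data) {
  return `<h1>${data.title}</h1><p>${data.body}</p>`;
}

const page = render({ title: "Hello", body: "Rendered on the client." });
// A crawler that cannot run JavaScript only ever sees `shell`,
// where the actual content is nowhere to be found.
```

The point: the HTTP response alone contains none of the content a crawler needs.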

Googlebot does read JavaScript, but not the way you might think. First, it reads your robots.txt file to see whether the URL is allowed. Then it knows whether to skip or fetch your content.

If it’s “pure” HTML/CSS, all the content is in the HTTP response. If it’s a JavaScript-powered website, Googlebot parses the HTML and comes back later to render the JavaScript.

It defers JavaScript rendering because such rendering has a high cost. It needs resources to compute things, but resources are not infinite, so it has a queue mechanism. Likewise, the queue is not infinite, so Googlebot cannot render everything every day.

The bot might index your content after several days or even weeks. Google calls those steps the initial and second waves of indexing.


Server-Side Rendering (SSR)

Roughly speaking, you need a server this time, but with SSR all the content is directly indexable.

JavaScript SSR happens when the server builds the markup (e.g., with React) and sends the resulting HTML/CSS/JS to the browser.

A lot of frameworks, such as Next.js, follow that process. The caveat is that every single request makes the server do the work all over again.

Besides, it often has a high infrastructure cost.

Dynamic rendering

It’s a workaround for Googlebot and other crawlers.

The server reads the CSR contents and sends a simplified version to search engines and crawlers while humans still get the CSR part. There are some caveats too.

It requires a lot of resources, and you have to detect crawlers precisely. There are some tools for that, but it’s not easy to set up and maintain correctly.
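For illustration, the detection step often boils down to user-agent sniffing, something like this sketch (the bot list is illustrative and deliberately incomplete; keeping it accurate is exactly the maintenance burden mentioned above):

```javascript
// Sketch of the crawler-detection step in dynamic rendering.
const BOT_PATTERN = /googlebot|bingbot|yandex|duckduckbot|baiduspider/i;

function isCrawler(userAgent) {
  return BOT_PATTERN.test(userAgent || "");
}

function handleRequest(userAgent) {
  return isCrawler(userAgent)
    ? "prerendered-static-html" // simplified version served to bots
    : "csr-shell";              // normal client-side app served to humans
}
```

A missed bot gets the empty CSR shell, and a misidentified human gets the degraded version, which is why precise detection matters so much here.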

Hybrid rendering

In a nutshell, you ship the most critical part of your content as static HTML, and the JavaScript adds everything else. This time there is neither crawler detection nor separate versions.

As a result, it acts like SSR, and you can still leverage the benefits of CSR.
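A tiny sketch of the idea (the markup and the enhance function are illustrative): the critical content is already in the static HTML, and JavaScript only layers extras on top, for bots and humans alike:

```javascript
// Hybrid rendering sketch: the content that matters for SEO ships as
// plain static HTML; client-side JavaScript only adds non-critical extras.
const staticShell = `
<article>
  <h1>My critical, indexable headline</h1>
  <p>The main content crawlers must see, shipped as plain HTML.</p>
  <div id="comments"></div>
</article>`;

// Runs in the browser after load; crawlers that skip JS lose nothing essential.
function enhance(html, comments) {
  const widget = `<ul>${comments.map((c) => `<li>${c}</li>`).join("")}</ul>`;
  return html.replace('<div id="comments"></div>', `<div id="comments">${widget}</div>`);
}
```

There is a single version of the page: crawlers index the static part, while browsers get the enhanced one.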

How to modify the head section with Svelte

If you look at the basic source code provided by the Svelte template, you might be afraid: the head section is almost empty, and nothing handles SEO out of the box.


There are frameworks built upon Svelte that bring kick-ass features, including SEO (e.g., Sapper). I won’t talk about them specifically, but please have a look at them. You could save a lot of time.

If you prefer handling that yourself, you can start with the head section using <svelte:head>. This element allows you to add elements to the document head, so in your .svelte file, you can do the following:

	<svelte:head>
		<!-- your meta here -->
		<title>My page title</title>
		<meta name="description" content="A short description of this page." />
	</svelte:head>

Once you have a robust <head>, it’s relatively easy to add routing, for example, with the svelte-routing package or with any framework powered by Svelte.

I strongly recommend using prerendering techniques, especially if you have a lot of content and pages.


Here is what Netlify says about prerendering:

Google has marked their Standard for Ajax Crawling as deprecated. They’re still following the standard, but recommend that single page app authors just rely on Google’s built-in capacity for interpreting JavaScript applications. In our experience that’s often still not enough and prerendering is often still a necessity.


If you don’t know how to start with prerendering, some hosts make it quite easy. For example, Netlify has a beta feature called “prerendering” in the post-processing options. Enable it and enjoy \o/.

There are also efficient external services for that.

To test that everything works fine, you can do simple things like (replace the URL with one of your pages):

	curl -A "Googlebot" https://www.example.com/

It will give you what Googlebot gets, but be aware that Google may serve cached versions of your pages. It’s also a good idea to check your pages in Google Search Console.

Wrap up

Prerendering and hybrid approaches are probably the easiest AND most recommended ways to handle SEO with JavaScript-powered websites.

Svelte is impressive: high-traffic websites such as Spotify or The New York Times use Svelte in production. However, be extra careful with the SEO part when migrating from any other tool or starting a new fantastic project.