Remix Search Engine Optimization
How to improve SEO in your Remix application
Thursday, March 17, 2022
TL;DR
Here is the Remix SEO checklist
Introduction: Why bother?
SEO stands for search engine optimization, which is a set of practices designed to improve the appearance and positioning of web pages in organic search results. Because organic search is the most prominent way for people to discover and access online content, a good SEO strategy is essential for improving the quality and quantity of traffic to your website.
Although Remix is a great framework, we still need to manually configure SEO to reach more users. In this blog, I will discuss how to improve SEO in your Remix application.
Check this blog in case you are not familiar with Remix.
High-quality content
This is not Remix-specific: as with any website, the best way to improve your SEO is to have good content.
Now, let's move on to the things we, as developers, can control in our Remix application.
Using meta tags
Meta tags are specific snippets of text and image content that summarize a web page. Often meta-tag data shows up whenever someone shares a link on social media, in messaging, or in your business chat software.
To render the meta tags that we will declare in our routes, we first need to add the `Meta` component in the `head` of your `app/root` file.
app/root.jsx

```jsx
import { Links, LiveReload, Meta, Outlet, Scripts } from "remix";

export default function App() {
  return (
    <html lang="en">
      <head>
        <Meta />
        <Links />
      </head>
      <body>
        <Outlet />
        <Scripts />
        <LiveReload />
      </body>
    </html>
  );
}
```
Check this link to see a sample usage of the above code.
You can technically add meta tags directly to the `head` of the `html` template. However, it is recommended to add "unique" information such as `title` and `description` on every route.
Using Vanilla JavaScript
app/routes/[routeName].jsx

```jsx
export const meta = () => ({
  title: "Title of the page",
  description: "Description of the page",
});
```
Using TypeScript
app/routes/[routeName].tsx

```tsx
import type { MetaFunction } from "remix";

export const meta: MetaFunction = () => ({
  title: "Title of the page",
  description: "Description of the page",
});
```
Remix is smart enough to convert a `meta` property into the appropriate tag with the correct value, as shown in the upcoming examples.
Must-have meta tags
Title
A title tag is the second most important factor for on-page SEO, only trailing high-quality content.
.jsx

```jsx
export const meta = () => ({
  title: "Title of the page",
});
```
.html

```html
<head>
  <title>Title of the page</title>
</head>
```
Description
The meta description often serves as a pitch to people who find your website on Google or social media sites. While it's not required, and Google may use text from your website instead of what you specify in the metadata, it's better to control the description text where you can.
.jsx

```jsx
export const meta = () => ({
  description: "Description of the page",
});
```
.html

```html
<head>
  <meta name="description" content="Description of the page" />
</head>
```
Image
Given the visual nature of the web, your meta tag image is the most valuable graphic content you can create to encourage users to click and visit your website.
.jsx

```jsx
export const meta = () => ({
  "og:image": "https://example.com/image.png",
});
```
.html

```html
<head>
  <meta property="og:image" content="https://example.com/image.png" />
</head>
```
Social media meta tags
Although it is not required, a good social media presence can attract more users, which will organically increase your search ranking.
Adding og:info
Open Graph meta tags are snippets of code that control how URLs are displayed when shared on social media.
They're part of Facebook's Open Graph protocol and are also used by other social media sites, including LinkedIn and Twitter (if Twitter Cards are absent).
.jsx

```jsx
export const meta = () => ({
  "og:type": "website",
  "og:url": "https://example.com",
  "og:title": "Title of the page",
  "og:description": "Description of the page",
  "og:image": "https://example.com/image.png",
});
```
.html

```html
<head>
  <meta property="og:type" content="website" />
  <meta property="og:url" content="https://example.com" />
  <meta property="og:title" content="Title of the page" />
  <meta property="og:description" content="Description of the page" />
  <meta property="og:image" content="https://example.com/image.png" />
</head>
```
Adding twitter:info
These are used by Twitter to display information about your website.
You don't need to define all of these, as Twitter will reuse some `og` meta tags. In case of an overlap (`og:description` and `twitter:description`), Twitter will pick the Twitter-specific information.
.jsx

```jsx
export const meta = () => ({
  "twitter:card": "summary_large_image",
  "twitter:site": "@yourHandle",
  "twitter:creator": "@yourHandle",
  "twitter:title": "Title of the page",
  "twitter:description": "Description of the page",
  "twitter:image": "https://example.com/image.png",
});
```
.html

```html
<head>
  <meta name="twitter:card" content="summary_large_image" />
  <meta name="twitter:site" content="@yourHandle" />
  <meta name="twitter:creator" content="@yourHandle" />
  <meta name="twitter:title" content="Title of the page" />
  <meta name="twitter:description" content="Description of the page" />
  <meta name="twitter:image" content="https://example.com/image.png" />
</head>
```
Putting all the meta tags together
.jsx

```jsx
export const meta = () => ({
  title: "Title of the page",
  description: "Description of the page",
  "og:type": "website",
  "og:url": "https://example.com",
  "og:title": "Title of the page",
  "og:description": "Description of the page",
  "og:image": "https://example.com/image.png",
  "twitter:card": "summary_large_image",
  "twitter:site": "@yourHandle",
  "twitter:title": "Title of the page",
  "twitter:description": "Description of the page",
  "twitter:image": "https://example.com/image.png",
});
```
Validators
Here are some validators that you can use to test your meta tags.
- One Stop Shop validator: https://metatags.io/
- Facebook: https://developers.facebook.com/tools/debug
- Twitter: https://cards-dev.twitter.com/validator
- LinkedIn: https://www.linkedin.com/post-inspector/inspect/
- Pinterest: https://developers.pinterest.com/tools/url-debugger/
- Structured Data: https://developers.google.com/search/docs/advanced/structured-data
Sitemap.xml
A sitemap is a file where you provide information about the pages, videos, and other files on your site, and the relationships between them. Search engines like Google read this file to crawl your site more efficiently. A sitemap tells Google which pages and files you think are important in your site and provides valuable information about them: for example, when a page was last updated and any alternate language versions. Learn more
Create a `sitemap.xml` inside the `public` directory:
sitemap.xml

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2022-03-17</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog</loc>
    <lastmod>2022-03-17</lastmod>
  </url>
</urlset>
```
If you have a lot of dynamic pages, you can create a sitemap generator.
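As a rough sketch of what such a generator might look like (the helper name and data shape here are illustrative, not from the original post), you could build the XML body from a list of page entries and then serve the resulting string from a route:

```javascript
// Illustrative sketch: build a sitemap.xml body from a list of pages.
// In a Remix app, a resource route could return this string with a
// Content-Type of application/xml.
function generateSitemap(pages, baseUrl) {
  const entries = pages
    .map((page) =>
      [
        "  <url>",
        `    <loc>${baseUrl}${page.path}</loc>`,
        `    <lastmod>${page.lastmod}</lastmod>`,
        "  </url>",
      ].join("\n")
    )
    .join("\n");
  return [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    entries,
    "</urlset>",
  ].join("\n");
}

console.log(
  generateSitemap(
    [
      { path: "/", lastmod: "2022-03-17" },
      { path: "/blog", lastmod: "2022-03-17" },
    ],
    "https://example.com"
  )
);
```

The list of pages could come from wherever your dynamic content lives (a CMS, a database, or the file system), so the sitemap stays in sync with your routes automatically.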
Robots.txt
A robots.txt file tells search engine crawlers which URLs the crawler can access on your site. This is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. Learn more
Create a `robots.txt` inside the `public` directory:
robots.txt

```txt
User-agent: *
Allow: /
```
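It is also common, although optional, to point crawlers at your sitemap directly from `robots.txt` via the `Sitemap` directive, which the major search engines support (the URL below is a placeholder for your own domain):

```txt
User-agent: *
Allow: /
Sitemap: https://example.com/sitemap.xml
```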