ReactJS, the Next ‘Supreme’ Front End Framework/Library?
There are several types of JS frameworks/libraries available to developers, with varying levels of experience and enthusiasm surrounding them. In my opinion, the rising ‘Supreme’ that SEOs should keep their eye on in the next few years is React.
What this means for SEOs in particular: Though only 18% of the top 10,000 websites are currently built with React, this number has been steadily climbing over the past three years. This data suggests that SEOs will become increasingly likely to work on a React app at some point in their career.
This is especially true for SEOs who aspire to work with enterprise-level companies. In fact, many impressive companies (that SEOs may like to work at one day 🤓) have already joined the React bandwagon – including CNN, Instagram, Facebook, FOX, Netflix, and the New York Times.
SEO-Optimizing Your ReactJS App
When performing an initial SEO audit on a React app, there are five items that work a little differently than on a traditional site and deserve especially close attention. Check them out below:
1) Don’t Block Important JS Files from Googlebot (Accidentally or on Purpose)
If search engines are blocked from crawling your site’s important JS files, Googlebot is unable to “see” the same thing as the end user – resulting in a loss of page authority and rankings in search.
Solution: Check your website’s robots.txt file to ensure that no important JS files are disallowed.
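As a hypothetical illustration (the directory names here are made up, not taken from any real site), a robots.txt rule like the first one below would block every crawler from your JS bundles, while the second explicitly allows crawlers to fetch them:

```
# BAD: blocks all JS bundles from every crawler
User-agent: *
Disallow: /static/js/

# BETTER: explicitly allow crawlers to fetch JS resources
User-agent: *
Allow: /*.js$
```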
I also recommend referencing Google Search Console’s URL Inspection Tool > page resources section, which shows you whether any page resources are blocked by robots.txt.
If your team deems an external resource important enough, a solid workaround I’ve found is to copy the JS file to your own server and link to it there. For this particular file, we’ve kept it as is.
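For example (the file and host names here are hypothetical), self-hosting changes the script reference from an external origin you don’t control to a relative path governed by your own robots.txt:

```html
<!-- Before: external resource that crawlers may be blocked from fetching -->
<script src="https://cdn.example-vendor.com/widget.js"></script>

<!-- After: the same file copied to your own server -->
<script src="/js/widget.js"></script>
```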
2) Check for (and Fix) JavaScript Errors

- Syntax errors: caused by typos or spelling errors in your code that prevent the program from running, or stop it partway through.
- Logic errors: the syntax is correct, but the code does not do what you intended. The program runs successfully but gives incorrect results.
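A minimal sketch of the difference (illustrative code, not from any real site): a syntax error stops the script from even parsing, while a logic error parses and runs but produces the wrong value.

```javascript
// Syntax error: a typo ("funtcion") would stop the whole script from parsing.
// funtcion add(a, b) { return a + b; }   // SyntaxError if uncommented

// Logic error: this parses and runs, but averages incorrectly
// because division happens before addition (missing parentheses).
function brokenAverage(a, b) {
  return a + b / 2; // intended: (a + b) / 2
}

function fixedAverage(a, b) {
  return (a + b) / 2;
}

console.log(brokenAverage(4, 6)); // 7 -- runs "successfully" but wrong
console.log(fixedAverage(4, 6));  // 5 -- the intended result
```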
You will notice, however, that there is a warning regarding ‘Unsupported Browser’. Using deductive reasoning (and confirming with one of our in-house developers 😉), we can see that it is related to a third-party customer survey software (note the ‘Delighted’ hint). It shows this message to users running an older version of a web browser. This is fine.
3) Make Sure Internal Links Are Implemented via Anchor Tags
Solution: Rely on good old-fashioned anchor tags (`<a href="URL">`) to communicate most effectively with Googlebot.
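As an illustration (hypothetical markup), Googlebot reliably discovers the first link below, but may never follow the second, which only navigates via a JavaScript click handler:

```html
<!-- Crawlable: a real anchor tag with an href -->
<a href="/entertainment">Entertainment</a>

<!-- Not reliably crawlable: there is no href for Googlebot to follow -->
<span onclick="window.location='/entertainment'">Entertainment</span>
```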
4) Maintain an Organized URL Structure, Free of Fragment Identifiers
To ensure that Google knows you want it to crawl two different pages and index them separately, it is important to have two separate URLs rather than fragments (denoted with a # that changes the content of a page).
Hypothetical Fragment Page: FOX.com#entertainment
SEO-optimized Page: FOX.com/entertainment
Solution: Work with your developer to replace fragment-based navigation with separate, crawlable URLs wherever a fragment changes the content of the page.
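A quick way to see why fragments are invisible to the server (and largely to crawlers): the hash never forms part of the path. This sketch uses Node’s built-in URL class:

```javascript
// The fragment (#entertainment) is handled entirely by the browser;
// it is not sent to the server and does not create a distinct path.
const fragmentUrl = new URL('https://www.fox.com/#entertainment');
console.log(fragmentUrl.pathname); // "/"
console.log(fragmentUrl.hash);     // "#entertainment"

// A real path gives Google a separate, indexable URL.
const crawlableUrl = new URL('https://www.fox.com/entertainment');
console.log(crawlableUrl.pathname); // "/entertainment"
```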
5) Double Check That Google Can See Your Menu Links and Tabbed Content
Next, navigate to the Elements panel in your browser’s developer tools and find the div element that holds one of your menu or tabbed links/content. If you can find it there, Google has access to it!
Another way to spot-check that your VIP content is being indexed is to search with the following formula: “site:example.com [content name].” As we can see below, Google has indexed the Masked Singer clip requested, which is good.
Bonus Tip: Dynamic Rendering of the Page
If you’re in need of a work around to help bots render your page correctly, consider implementing dynamic rendering.
“[With dynamic rendering] requests from crawlers are routed to a renderer, [whereas] requests from users are served normally.” Martin Splitt, Webmaster Trends Analyst
This can be a good solution when there are no clearly blocked resources, when JS takes a long time to load, or when you’re otherwise stumped. FOX.com’s filter pages (e.g., /comedy) are one example.
Solution: To learn more about how to implement dynamic rendering on your ReactJS site, check out this great resource from Google on the topic.
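At its core, dynamic rendering means branching on who is asking for the page. The sketch below shows a simplified, hypothetical version of that routing decision based on the User-Agent header; real deployments (e.g., Rendertron-style setups) use longer, maintained crawler lists and an actual pre-rendering service.

```javascript
// Hypothetical list of crawler signatures for illustration only.
const BOT_PATTERNS = [/googlebot/i, /bingbot/i, /duckduckbot/i, /baiduspider/i];

function isBot(userAgent) {
  return BOT_PATTERNS.some((pattern) => pattern.test(userAgent || ''));
}

// In a real server, you would branch on this result: crawlers get
// pre-rendered HTML, while regular users get the client-side app.
function chooseResponse(userAgent) {
  return isBot(userAgent) ? 'serve pre-rendered HTML' : 'serve client-side app';
}

console.log(chooseResponse('Googlebot/2.1 (+http://www.google.com/bot.html)'));
console.log(chooseResponse('Mozilla/5.0 (Windows NT 10.0) Chrome/120.0'));
```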