Does using JS on your website harm your Organic Search rankings? It depends.
There are few hard-and-fast rules when it comes to SEO, which is why diagnosing what’s boosting (or hindering) your rankings can feel like a dependency puzzle. The most common way SEOs answer questions like this is “It depends.” It depends on whether the JS is hindering your specific website from meeting its goals.
JS is how modern websites are built, and Google has spent years adapting to that. However, the way JS is implemented on a page can still be detrimental to organic search rankings. Our SEO team keeps an eye out for these potential issues, since we have many different teams touching client websites for digital marketing campaigns and redesigns.
- Google can and does crawl and render JS
- Googlebot does not necessarily crawl and render JS frequently
- Usually, if a JS element requires a mouse click to activate a button or a link, Googlebot is not performing that interaction
- JS can be used to speed up a website a great deal
- JS can also slow down a website… a lot
Answer the following questions to know whether the JS implementation on your site is getting in the way of SEO success.
- Is new content appearing in search results within about a week?
- Are fewer than 10% of your Site’s Pages Excluded from the Index in Google Search Console?
- Is Your Pagespeed Rated “Good” in Core Web Vitals?
- Are your Videos and Podcasts Being Indexed?
- Are Your Pages Ranking for Lots of Long-Tail Terms for Content Deeper on the Page?
Is New Content Appearing in Google Search Results Within about a Week?
If your website is ranking well, new content is indexed and cached quickly, and the updates you make to existing content appear in Google search results within a day or two, you have nothing to worry about in terms of JS and SEO.
Google your website with the “site:” operator, followed by the target term for the page (e.g. site:example.com “target term”). This lets you quickly see all of the indexed URLs that include that term. If new content is there within about a week after publication, you probably don’t have a problem.
The URL Inspection tool in Google Search Console surfaces page-specific JS errors, if there are any. It can be helpful while troubleshooting pages that are not ranking well.
How Many Pages are Excluded from the Index in Google Search Console?
Many clients find their Google Search Console Index Coverage reports list hundreds of blog posts as “Discovered – currently not indexed” or “Crawled – currently not indexed”. This is an indication that there may be a problem.
Usually, these URLs are not indexed because Googlebot doesn’t have enough contextual information about the pages – and because the mobile, text-only version of the bot can’t locate internal links pointing to them that would provide that additional context.
Even if your content is indexed, if it takes more than a week or two for optimizations of existing content to be updated in the Google search results, there may be a problem.
Internal links on most standard websites live in the top nav and in the pagination of resources and blog posts. If any of the above issues with indexation speed or completeness are occurring, consider the following actions:
- The top-level navigation should not rely on JS to function, and its URLs should be available as plain HTML text links. If the nav code prevents this, use the noscript tag to include link elements for all of the pages in the top nav. (Even popular React sites like Tesla’s use this workaround.)
- Pagination of blog posts, blog categories, and resource center categories and types should not rely on a JS “view more” button. Each page of results should live at a unique URL so that the list of text links is surfaced to an HTML-only bot. If that’s not possible, consider linked breadcrumbs on posts that lead back to more specific landing pages summarizing relevant content for each category without paginating.
- Manually crosslink to the most important pages within the body copy of your content with rich, specific anchor text.
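As a sketch of the first two workarounds, a JS-driven nav and a “view more” list can each be mirrored with plain HTML links. All URLs and labels below are hypothetical placeholders:

```html
<!-- JS-driven navigation an HTML-only bot may never expand -->
<nav id="main-nav" data-menu="products,resources,about"></nav>
<!-- Plain-text fallback links for no-JS crawlers and visitors -->
<noscript>
  <nav>
    <a href="/products/">Products</a>
    <a href="/resources/">Resources</a>
    <a href="/about/">About</a>
  </nav>
</noscript>

<!-- Pagination as crawlable links instead of a JS “view more” button -->
<nav aria-label="Blog pagination">
  <a href="/blog/">1</a>
  <a href="/blog/page/2/">2</a>
  <a href="/blog/page/3/">3</a>
  <a href="/blog/page/2/">Next</a>
</nav>
```

The fallback links don’t need to look like anything; they just need to exist in the HTML so a text-only bot can follow them.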
Including these deeper links in an XML sitemap without making internal links available to the text-only bot is usually what creates these “Discovered – currently not indexed” errors. The sitemap reinforces the information architecture; it doesn’t replace it.
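For reference, a sitemap entry only announces that a deep URL exists – it gives the bot none of the crawlable context that an in-content link does. The URL and date here are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Tells Google this URL exists, but provides no anchor text,
       no context, and no signal of how important the page is -->
  <url>
    <loc>https://example.com/blog/deep-post/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```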
Is Your Pagespeed Rated “Good” in Core Web Vitals?
We all know it’s not usually possible for developers to spend time creating unique, bespoke JS bundles for each template of a website. Usually, the JS that serves the homepage is the same JS served on every page of the site. Google PageSpeed Insights would love it if that weren’t the case, so that each payload stays small and crisp.
JS techniques that can help speed up page load times:
- Make sure no JS “blocks” the initial render of the H1 and the hero image – the largest element of content above the fold on the mobile screen (your Largest Contentful Paint)
- Lazy-load everything below the fold on a mobile screen (including a facade on the chatbot widget)
- Preconnect to or prefetch third-party tools so the browser doesn’t have to wait for their resources
- Minify JS
- Regularly strip out any JS no longer used on the site
- Ensure all JS is secure and up to date
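Several of the items above can be handled with a few lines of markup. This is a sketch only – the file names and the third-party domain are hypothetical:

```html
<!-- Hero image: load eagerly and at high priority; never lazy-load the LCP element -->
<img src="/img/hero.webp" alt="Hero" fetchpriority="high">

<!-- Below-the-fold media: lazy-load so it doesn't compete with the initial render -->
<img src="/img/case-study.webp" alt="Case study" loading="lazy">

<!-- Warm up the connection to a third-party origin (e.g. a chatbot widget) early -->
<link rel="preconnect" href="https://widget.example-chat.com">

<!-- Non-critical JS: defer so it doesn't block rendering the H1 and hero -->
<script src="/js/analytics.js" defer></script>
```

The `loading="lazy"`, `fetchpriority`, `preconnect`, and `defer` attributes are all standard browser features, so no extra JS library is required for this pattern.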
Are your Videos and Podcasts Being Indexed?
Each video or podcast episode should have its own unique URL so that Google can crawl, contextualize, and index that specific URL.
(For best results, include chapter bookmarks, Schema markup, and transcripts for each asset on that unique URL!)
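One common way to provide that markup is a JSON-LD VideoObject block on the episode’s unique URL. Every value below is a placeholder:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "Example episode title",
  "description": "A one-paragraph summary of the episode.",
  "thumbnailUrl": "https://example.com/thumbs/episode-12.jpg",
  "uploadDate": "2024-01-15",
  "contentUrl": "https://example.com/media/episode-12.mp4",
  "duration": "PT18M32S"
}
</script>
```

Because JSON-LD sits in the static HTML, this context reaches Googlebot even on a first, render-free crawl of the page.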
Are Your Pages Ranking for Lots of Long-Tail Terms for Content Deeper on the Page?
Sometimes, to save real estate on a page, website designers will “tab” or hide content behind a read-more click – these elements are usually implemented with JS. They aren’t a problem, per se, but if the hidden content is important for the page to rank, it’s important to ensure that Googlebot can reach it in text-only form.
The noscript tag is useful here. The alternative is simply ensuring that all of the text on the page is present in the HTML.
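A crawler-safe “read more” toggle ships the full text in the HTML and only hides it visually, as in this sketch (the IDs and labels are hypothetical):

```html
<button aria-expanded="false" aria-controls="more-details">Read more</button>

<!-- The full text is in the HTML, so a text-only bot can read it;
     the site's JS merely removes the `hidden` attribute on click -->
<div id="more-details" hidden>
  <p>The deeper content that should still be indexable lives here.</p>
</div>
```

The key design choice: the content is hidden with an attribute the browser toggles, rather than being injected by JS only after the click.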
JS and SEO are a moving target
Over the past several years, we have seen Google index and cache JS more and more frequently. In time, we believe this will be a non-issue. For now, however, it is still very much a gray area.
It’s worth watching for the key elements noted above and ensuring that content is consistent across desktop and mobile, JS and HTML. That consistency is something Google can see and measure, and to most of us SEO experts, it signals a potential future ranking factor. Our team constantly looks out for this in B2B marketing, as discrepancies often crop up with redesigns, third-party resources, and more.