Is JavaScript Hurting your SEO?

April 18, 2023
SEO Javascript

Does using JavaScript (JS) on your website harm your organic search rankings? It depends.

There are few hard and fast rules when it comes to SEO. That’s why working out what boosts (or hinders) your SEO can feel like a dependency puzzle. The most common way SEOs answer questions like this is “It depends.” It depends on whether it’s hindering your specific website from meeting its goals.

JS is how modern websites are built, and Google has spent years adapting to that. However, the way JS is implemented on a page can still hold back organic search rankings. Our SEO team keeps an eye out for these potential discrepancies, as we have many different teams touching client websites for digital marketing campaigns and redesigns.

Here’s what we know for certain about JS and SEO:

  • Google can and does crawl and render JS
  • Googlebot does not necessarily crawl and render JS frequently
  • If a JS element requires a mouse-click to activate a button or a link, Googlebot usually does not perform that interaction
  • JS can be used to speed up a website a great deal
  • JS can also slow a website down a great deal

Answer the following questions to know whether the JS implementation on your site is getting in the way of SEO success.

  • Is new content appearing in Google search results within about a week?
  • Are fewer than 10% of your site’s pages excluded from the index in Google Search Console?
  • Is page speed good in Core Web Vitals?
  • Are your videos and podcasts being indexed?
  • Are your pages ranking for lots of long-tail terms for content deeper on the page?

If you can answer “Yes” to all of these questions, then your JavaScript implementation is not impeding SEO!

Is New Content Appearing in Google Search Results Within about a Week?

If your website is ranking well, new content is indexed and cached quickly, and the updates you make to existing content appear in Google search results within a day or two, you have nothing to worry about in terms of JS and SEO. 

Google your website with the “site” operator, followed by the target term for the page (e.g. site:example.com “target term”). This lets you quickly see all of the indexed URLs that include that term. If new content is there within about a week after publication, you probably don’t have a problem.

The URL Inspection tool in Google Search Console surfaces page-specific JS errors, if there are any. It can be helpful while troubleshooting pages that are not ranking well.

How Many Pages are Excluded from the Index in Google Search Console?

Many clients find their Google Search Console Index Coverage reports list hundreds of blog posts as “Discovered – currently not indexed” or “Crawled – currently not indexed”. This is an indication that there may be a problem.

 Usually, these URLs are not indexed because Googlebot doesn’t have enough contextual information about the pages – and because the mobile, text-only version of the robot can’t locate internal links pointing to them for that additional context. 

Even if your content is indexed, if it takes more than a week or two for optimizations of existing content to be updated in the Google search results, there may be a problem. 

On most standard websites, internal links live in the top navigation and in the pagination of resource and blog listings. If you are seeing any of the above issues with indexation speed or completeness, consider the following actions:

  1. The top-level navigation should not rely upon JS to function, and the URLs should be available as text-only HTML links. If the nav code prevents this, then use the noscript tag to include the link elements for all of the pages in the top nav (even popular React sites like Tesla use this workaround) – see the sketch after this list.
  2. Pagination of blog posts, blog categories, resource center categories and types should not rely upon a JS “view more” button. Pagination should use unique URLs so that the list of text links is surfaced to an HTML-only robot. If that’s not possible, consider linked breadcrumbs on posts that lead back to more specific landing pages that summarize relevant content for each category without paginating.
  3. Manually crosslink to the most important pages within the body copy of your content with rich, specific anchor text.

    Including these deeper links in an XML sitemap without making internal links available to the text-only bot is usually what creates those “discovered but not indexed” errors. The sitemap reinforces the information architecture; it doesn’t replace it.
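Here’s a minimal sketch of what points 1 and 2 can look like in practice. The URLs, labels, and class names below are placeholders, not taken from any real site:

    <!-- Fallback nav links for a JS-rendered menu -->
    <nav id="main-nav"><!-- menu injected by JS at runtime --></nav>
    <noscript>
      <ul>
        <li><a href="/blog/">Blog</a></li>
        <li><a href="/resources/">Resources</a></li>
        <li><a href="/contact/">Contact</a></li>
      </ul>
    </noscript>

    <!-- Crawlable pagination: plain links to unique URLs instead of a JS "view more" button -->
    <nav class="pagination">
      <a href="/blog/page/2/">2</a>
      <a href="/blog/page/3/">3</a>
      <a href="/blog/page/2/">Next</a>
    </nav>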

Is JavaScript Impacting Your Pagespeed in Core Web Vitals?

We all know that it’s not usually realistic for developers to spend time creating a unique, bespoke JS bundle for each template of a website. Usually, the JS that serves the homepage is the same JS on every page of the site. Google PageSpeed Insights would love it if that weren’t the case, to keep each payload small and crisp.

JS-related steps that can help speed up page load times (see the sketch after this list):

  • Make sure no elements “block” the initial render of the H1 and the hero image – the largest piece of content above the fold on the mobile screen
  • Lazy-load everything below the fold on a mobile screen (including a facade for the chatbot widget)
  • Preconnect to (or prefetch) third-party tools so the browser isn’t left waiting for their resources
  • Minify JS
  • Regularly strip out any JS no longer used on the site
  • Ensure all JS is secure and up to date
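A minimal sketch of a few of these patterns in plain HTML; the file names and the third-party origin are placeholders:

    <head>
      <!-- Warm up the connection to a third-party origin used later on the page -->
      <link rel="preconnect" href="https://chat.example-widget.com">
      <!-- Defer non-critical JS so it doesn't block the first render of the H1 and hero -->
      <script src="/js/site.min.js" defer></script>
    </head>
    <body>
      <h1>Headline rendered from plain HTML, not injected by JS</h1>
      <img src="/img/hero.webp" alt="Hero image" fetchpriority="high">
      <!-- Native lazy-loading for everything below the fold -->
      <img src="/img/case-study.webp" alt="Case study" loading="lazy">
    </body>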

Are Your Videos and Podcasts Being Indexed?

Google is prioritizing podcast and video content more with each update to the algorithm. If your website has this kind of multimedia available and it’s not getting organic search traffic, JS might be your problem. The problem happens when all of these assets appear on a single JavaScript “player” page, rather than driving to unique URLs.

Each video or podcast episode should have its own unique URL so that Google can crawl, contextualize and index that specific URL.

(For best results, include chapter bookmarks, Schema markup, and transcripts for each asset on that unique URL!)
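As one illustration, here’s a minimal sketch of VideoObject Schema markup placed on a dedicated episode URL; every name, URL, and date below is a placeholder:

    <!-- On https://example.com/videos/episode-12/ -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "VideoObject",
      "name": "Episode 12: Placeholder title",
      "description": "Short summary of what the episode covers.",
      "thumbnailUrl": "https://example.com/img/episode-12.jpg",
      "uploadDate": "2023-04-18",
      "duration": "PT14M30S",
      "contentUrl": "https://example.com/media/episode-12.mp4"
    }
    </script>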

Are Your Pages Ranking for Lots of Long-Tail Terms for Content Deeper on the Page?

Sometimes, to save real estate on a page, website designers will “tab” or hide content behind a read-more click – these elements are usually coded and implemented with JS. They aren’t a problem, per se, but if the hidden content is important for the page to rank, you need to ensure that Googlebot can get to it in text-only form.

The noscript tag is useful here. The alternative is simply ensuring that all of the text on the page shows up in the HTML.
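A minimal sketch of that second approach, assuming a simple tab pattern; the labels and IDs are placeholders:

    <!-- All tab copy lives in the HTML; JS only toggles the "hidden" attribute -->
    <div class="tabs">
      <button data-tab="overview">Overview</button>
      <button data-tab="details">Details</button>
      <section id="overview">Full overview copy sits here in the HTML…</section>
      <section id="details" hidden>Full details copy also sits here in the HTML…</section>
    </div>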

JS and SEO are a moving target

Over the past months and years, we have seen Google index and cache JS more and more frequently. In time, we believe this will be a non-issue. However, right now this is still very much a gray area. 

It’s worth watching for the key elements noted above, and ensuring that context stays consistent across desktop and mobile, JS and HTML. That consistency is something Google can see and measure, and to most of us SEO experts, that signals a potential future ranking factor. Our team is constantly looking out for this in B2B marketing, as discrepancies often crop up with redesigns, third-party resources, and more.
