
How to Diagnose and Solve JavaScript SEO Issues in 6 Steps


Step 1: Use the URL Inspection tool to check whether Google can render your content

The URL Inspection tool (formerly Google Fetch and Render) is an excellent free tool that lets you verify whether Google can render your web pages.

The URL Inspection tool requires your site to be connected to Google Search Console. If you don't have an account yet, see the Google Help page.

Open Google Search Console and click "URL inspection."

In the URL field, enter the full URL of the page you want to examine.
Then click the Screenshot tab to view the rendered page.

Scroll through the screenshot to make sure your page is rendered properly. Ask yourself these questions:

  • Can you see the most important content?
  • Can Google see user-generated content, such as comments?
  • Can Google access areas like similar articles or related products?
  • Can Google see other crucial elements of your page?

Why might the screenshot look different from what you see in your browser? Here are a few possible causes:

  • Google ran into timeouts while rendering.
  • Errors occurred while rendering. You may have used features that aren't supported by the Google Web Rendering Service (Google uses the four-year-old Chrome 41 for rendering, which doesn't support many modern features).
  • You blocked crucial JavaScript files for Googlebot.

Step 2: Make sure you didn't block JavaScript files by mistake

If Google can't render your page properly, make sure you didn't block crucial JavaScript files for Googlebot in robots.txt.

Robots.txt is a simple text file that tells Googlebot and other bots whether they are allowed to request a page or resource.
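To make this concrete, here are two hypothetical robots.txt variants (the paths are made up). The first blocks an entire scripts directory and can stop Googlebot from rendering the page; the second keeps private areas blocked while leaving rendering-critical files reachable:

```
# Variant 1 - risky: hides every script under /assets/js/ from all bots,
# which can prevent Googlebot from rendering the page
User-agent: *
Disallow: /assets/js/

# Variant 2 - safer: blocks only private areas, JS stays crawlable
User-agent: *
Disallow: /admin/
Allow: /assets/js/
```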

Fortunately, the URL Inspection tool can reveal every resource of a rendered page that is blocked by robots.txt.

How can you tell whether a blocked resource is crucial from a rendering point of view?

You have two options: basic and advanced.

Basic

In most cases, it's best to simply ask your developers about it. They built your website and should know it well.

Naturally, if a script is named content.js or productListing.js, it's probably relevant and shouldn't be blocked.

Unfortunately, for now, the URL Inspection tool doesn't tell you how important a blocked JS file is for rendering. The old Google Fetch and Render offered this option.

Advanced

Instead, we can now use Chrome Developer Tools for this.

For educational purposes, let's check the following page:

Open the page in the most recent version of Chrome and go to Chrome Developer Tools. Then click on the Network tab and refresh the page.

Select the resource you want to check (in our case, it's YouShallNotPass.js), right-click it, and choose Block request URL.

Refresh the page and check whether any important content has disappeared. If it has, you should consider removing the corresponding rule from your robots.txt file.

Step 3: Use the URL Inspection tool to fix JavaScript errors

If the URL Inspection tool shows that your page isn't rendered properly, it may be due to JavaScript errors that occurred while rendering.

To find the cause, check the More Info tab in the URL Inspection tool.

A single error in a single line of JavaScript can stop rendering for Google, which in turn can make your website unindexable.

Your website might work flawlessly in modern browsers, but if it crashes in older browsers (Google's Web Rendering Service is based on Chrome 41), your Google rankings could drop.
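As a hypothetical illustration (the file and function names are made up), a single modern method call can be enough: Array.prototype.includes doesn't exist in Chrome 41, so the first variant below throws an error there, and if rendering the product list depends on this script, Googlebot may end up seeing an empty page:

```javascript
// products.js (hypothetical)
var featured = ['phone', 'tablet', 'laptop'];

// Works in modern Chrome, but in Chrome 41 this throws
// "featured.includes is not a function" and stops the script.
if (featured.includes('phone')) {
  renderProductList(featured); // hypothetical rendering function
}

// Chrome 41-safe alternative using indexOf:
if (featured.indexOf('phone') !== -1) {
  renderProductList(featured);
}
```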

Do you need some examples?

  • A single error in the official Angular documentation made Google unable to render our test Angular website.
  • At one point, Google dropped some pages of Angular.io, the official website of Angular 2+, from its index.

If you want to know why this happened, check out my Ultimate Guide to JavaScript SEO.

Note: If for some reason you can't use the URL Inspection tool to debug JavaScript errors, you can use Chrome 41 instead.

Personally, I prefer using Chrome 41 for debugging because it's more universal and offers more flexibility. However, the URL Inspection tool more accurately reflects Google's Web Rendering Service, which is why I recommend it to people who are new to JavaScript SEO.

Step 4: Make sure your content has been indexed in Google

Checking whether Google can render your pages isn't enough. You also have to make sure Google has actually indexed your content. The best method for this is to use the site: command.

It's an extremely simple yet powerful technique. The syntax is: site:[URL of a website] "[fragment to be searched]". Just make sure you don't put a space between site: and the URL.

Let's suppose you want to check whether Google indexed the phrase "Develop across all platforms," which appears on the Angular.io home page.
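Using the syntax above, the query would look like this:

```
site:angular.io "Develop across all platforms"
```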

As you can see, Google indexed that content, which is what you want. However, that's not always the case.

Takeaway:

  • Use the site: command whenever possible.
  • Check different page templates to make sure your whole website works properly. Don't stop at a single page!

If everything is fine, move on to the next step. If not, there may be a few reasons why it's happening:

  • Google still hasn't rendered your content. It can happen up to a few days or even weeks after Google visits the URL. If the nature of your website requires your content to be indexed quickly, implement server-side rendering (SSR).
  • Google encountered timeouts while rendering the page. Are your scripts fast? Do they stay responsive even when the server load is high?
  • Google is still requesting your old JS files. Google tries to cache aggressively to save computing power, so your CSS and JS files may be cached for a long time. If you have fixed all the JavaScript errors but Google still can't render your website properly, it may be using old, cached JS and CSS files. To solve the problem, you can embed a version number in the filename, for example bundle3424323.js (see the snippet below this list). For more information, read the Google guides on HTTP Caching.
  • While indexing, Google may skip some resources if it decides that they don't contribute to the essential content of the page.
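As a minimal sketch of the versioning trick mentioned above (the file names are made up), changing the filename on every release forces Google to fetch the fresh file instead of reusing a stale cached copy:

```html
<!-- Before: Google may keep using an old cached copy of bundle.js -->
<script src="/js/bundle.js"></script>

<!-- After: the version number changes with every release,
     so the stale cached copy can no longer be reused -->
<script src="/js/bundle3424323.js"></script>
```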

Step 5: Make sure Google can discover your internal links

There are a few simple rules to follow:

  1. Google needs proper, traditional "href" links to discover the URLs on your website.
  2. If your links are added to the DOM only when somebody clicks a button, Google won't see them.

As simple as these rules are, many big companies still break them.

Proper link structure

To crawl a website, Googlebot needs traditional "href" links. If they aren't there, many of your pages will simply be unreachable for Googlebot!

I think this was explained well by Tom Greenway (a Google representative) during the Google I/O conference.
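A minimal sketch of the difference (the URL and function name are made up): Googlebot follows plain href links, but it doesn't click buttons, so a URL that only appears inside a click handler stays invisible to it.

```html
<!-- Crawlable: a traditional link with an href attribute -->
<a href="/phones?page=2">Next page</a>

<!-- Not crawlable: the URL exists only inside a click handler,
     so Googlebot never discovers /phones?page=2 -->
<span onclick="loadNextPage()">Next page</span>
```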

A common mistake made by developers: Googlebot can't access the second and further pages of pagination

Failing to let Googlebot discover pages beyond the first page of pagination is a mistake I see over and over again.

If you visit the mobile versions of Gearbest, AliExpress, and IKEA, you will notice that they don't let Googlebot access their pagination links, which is quite bizarre. When Google enables mobile-first indexing for these websites, they will suffer.

How do you test it yourself?

If you haven't downloaded Chrome 41 yet, you can get it from Ele.ph/chrome41.

Navigate to the page you want to check. For this demonstration, I'm using the mobile version of AliExpress.com. For educational purposes, it's a good idea to follow the same steps.

As you can see, there are no traditional "href" links pointing to the second page of pagination.

There are more than 2,000 products in the mobile phones category on AliExpress.com. Mobile Googlebot can access only twenty of them, which is less than 1%!

The vast majority of the products in that category are invisible to mobile Googlebot. That's insane!

These errors are caused by improper implementation of lazy loading. Many other websites make similar mistakes. You can read more in my article "Popular Websites that May Fail in Mobile First Indexing".

TL;DR: Using link rel="next" on its own is too weak a signal for Google

Note: It's common to use link rel="next" to indicate a pagination series. However, the findings of Kyle Blanchette seem to show that link rel="next" alone is too weak a signal for Google and should be reinforced with traditional <a href> links.

John Mueller discussed this more:

"We can see the pages that are connected using rel="next" and rel="previous", but if there are no links to those pages at all, it's really difficult for us to crawl from page to page. (…) Using rel="next" and rel="previous" in the head of a page is a great way to tell us how the pages are connected, but you really need regular, on-page HTML links as well."
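A minimal sketch of what that combination can look like (the URLs are made up): the rel="next" and rel="prev" annotations in the head describe the pagination series, while the plain <a href> links in the body give Googlebot something it can actually crawl.

```html
<head>
  <!-- Describes the pagination series, but is a weak signal on its own -->
  <link rel="prev" href="https://example.com/phones?page=1">
  <link rel="next" href="https://example.com/phones?page=3">
</head>
<body>
  <!-- Regular, crawlable links that Googlebot can follow -->
  <a href="https://example.com/phones?page=1">Previous page</a>
  <a href="https://example.com/phones?page=3">Next page</a>
</body>
```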

Checking whether Google can see your menu links

Another step in auditing a JavaScript website is to verify that Google can see your menu links. To check this, use Chrome 41.

For this example, we'll use Target.com:

The result? Fortunately, Google can pick up the menu links on Target.com.

Step 6: Check whether Google can discover content hidden under tabs

I have often observed that, on many online stores, Google cannot discover and index content that is hidden under tabs (product descriptions, customer reviews, similar products, and so on). That's bad, but it's still a common occurrence.

Making sure Google can see content hidden under tabs is an essential part of any SEO audit.
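A minimal sketch of the two tab patterns (the element IDs and the loadReviews helper are made up): content that is already in the initial HTML and only hidden with CSS can be indexed, while content fetched only after a click may never be seen by Googlebot, which doesn't click.

```html
<!-- Indexable: the description is in the DOM from the start, just hidden -->
<div id="description" class="tab-panel" hidden>
  The full product description lives here in the initial HTML.
</div>

<!-- Risky: the reviews are injected only after the user clicks the tab -->
<button id="reviews-tab">Reviews</button>
<div id="reviews" class="tab-panel"></div>
<script>
  document.getElementById('reviews-tab').onclick = function () {
    // loadReviews is a hypothetical helper that fetches the reviews HTML
    loadReviews(function (html) {
      document.getElementById('reviews').innerHTML = html;
    });
  };
</script>
```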

Open Chrome 41 and navigate to any product on Boohoo.com, for instance the Muscle Fit Vest.
