In this guide, we’ll cover:
- Six easy ways to test how search engines view your website
When Google crawls a website, it needs to understand the content and structure of the site to index its pages.
For JavaScript-heavy pages, Google must also render them, which allows it to index the dynamically generated content.
If Google cannot crawl an element of a page, that section of content will not be added to Google’s index.
In some cases, this can be the entire content of a page, meaning no chance of rankings!
An uncrawlable page not only loses its own ability to rank but can also inhibit a site’s SEO more broadly.
Any links on a page that a crawler has failed to render will not be crawled either, so no link equity from that page will pass on to internal or external sources.
With significant internal links missed by crawlers, a search engine’s understanding of your site’s navigation may become skewed, and important pages may even end up treated as orphan pages.
- Images being dropped from the index, reducing image traffic.
- Dynamically-generated page titles may be overlooked, affecting search rankings and click-through rates.
Well, there are actually plenty of reasons why you would want to use JavaScript for a web development project.
It can also improve the performance and speed of your website. You can optimize your site using lazy loading, code splitting, caching, and service workers.
You can also create dynamic and personalized content based on user behavior, preferences, and location.
The web has moved from plain HTML – as an SEO you can embrace that. Learn from JS devs & share SEO knowledge with them. JS’s not going away.
— John Mueller (official) · #StaplerLife (@JohnMu) August 8, 2017
Here are six easy ways to do it:
Once the page is loaded in Google Chrome, click the padlock in the address bar and select ‘Site settings.’ In the permissions list, find ‘JavaScript’ and set it to ‘Block.’
When you return to the tab, Chrome will suggest reloading the page. Click Reload.
Can you still see the content?
If not, Google likely won’t be able to crawl that content either.
In some cases, only certain features, such as ad banners, will be missing on the page.
You’re looking to see that all the content you want to be crawled is visible. For example, if Google cannot crawl entire paragraphs, it will miss key information, jeopardizing the page’s ranking potential.
It’s not just the page’s body content at risk, either…
2. Check How An SEO Analysis Tool Views The Page
By mimicking the crawling process, you can get some indication of how search engines understand the page.
Now, analyze the on-page SEO of the webpage using a browser add-on such as SEO Minion.
A sidebar will launch. Now, select ‘Analyze On-Page SEO.’
Use the generated report and compare it to what you can actually see on the page:
- What is the word count of the page?
- Are all the Heading tags present?
- Is the number of images correct?
- Are any other elements appearing differently according to the analysis tool?
In this example, we can see the word count for the page is just one word, which would make for a very thin article!
There is only one heading tag (the H1) on the page, and this appears to be pulling through fine. But, if there were any H2 or H3 tags, they probably wouldn’t pull through, as the word count indicates that the body of the content is not rendering correctly.
The page has three images, but this tool shows that only one is visible.
When downloading the image file, we found this was the site logo (pulling from the header).
If what you can see clashes with what the report ‘sees,’ then this is another indication that content within the body of the page cannot be crawled properly.
3. Google Mobile-Friendly Test
You can also use Google’s Mobile-Friendly Test tool to check the rendered HTML of your web page.
Enter the URL you want to check, and you’ll see a screenshot of the page on the right-hand side of the results page.
This is a visual representation of how Googlebot sees your page. You can compare the screenshot to your page and check for missing elements.
If you notice anything irregular, check the rendered HTML and identify whether anything missing from the screenshot is also missing from the HTML code.
Note: Google will remove this tool at the end of 2023. Not to worry, though: SEO Kristina Azarenko has a great workaround you can use both now and after the tool is retired:
Google is dropping the Mobile-Friendly Test tool at the end of this year.
For a long time, many technical SEOs have been using the Mobile-Friendly Test tool to check rendered HTML of a page. It’s super handy when you don’t have access to Google Search Console’s URL Inspect tool…
— Kristina Azarenko (@azarchick) June 12, 2023
This leads us perfectly on to…
4. Google Rich Results Tool
Enter your URL and run a test to see a screenshot revealing how Googlebot sees the page.
5. Google Search Console URL Inspection
If you have Search Console access for the site, use the URL Inspection tool to see how Google crawled a page. Running a live test will show you the rendered HTML and a screenshot of the page as Googlebot sees it.
6. Use A Specialist Rendering Tool
- Fetch & Render – this tool mimics the rendering process (similar to the Google inspection tools) but allows you to test using different user agents.
- Pre-rendering Testing Tool allows you to compare the pre-rendering information from different crawlers. Simply enter your URL and select a user agent, and it’ll allow you to compare what content is being served to different crawlers ahead of the rendering process.
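If you don’t have a specialist tool to hand, you can approximate this comparison from the command line. The sketch below (the URL is a placeholder, and the user-agent strings are examples – check Google’s documentation for the current Googlebot strings) fetches the same page as a regular browser and as Googlebot, then diffs the raw responses:

```shell
# Placeholder URL – swap in the page you want to test.
URL="https://example.com/"

# Example user-agent strings (illustrative, not authoritative).
BROWSER_UA="Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36"
GOOGLEBOT_UA="Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

# Fetch the same page with each user agent.
curl -s --max-time 15 -A "$BROWSER_UA" "$URL" -o browser.html
curl -s --max-time 15 -A "$GOOGLEBOT_UA" "$URL" -o googlebot.html

# Any difference means the crawler is being served different content.
diff browser.html googlebot.html >/dev/null 2>&1 \
  && echo "Same content served to both user agents" \
  || echo "Content differs between user agents (or a fetch failed)"
```

Note this only compares the pre-rendered HTML served to each user agent; it won’t show you what JavaScript generates after rendering.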
Plenty of really powerful Technical SEO software is available today, which is especially useful for large-scale or enterprise-level projects.
Whichever tool you use, make sure the following elements are being rendered correctly to ensure crawlability and indexability and that your content is ranking as best it can be from a technical standpoint:
- Copy on the page
- Canonical tag
- Title & meta description
- Meta robots tag
- Structured data
- Heading tags
- Content within interactive elements (i.e. accordion features)
Luckily, plugins exist to automatically create an XML sitemap, with different options for the major frameworks you are likely to use.
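Whether generated by a plugin or by hand, the sitemap itself is a simple XML file. A minimal example (the URLs and dates are placeholders) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2023-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/javascript-seo/</loc>
    <lastmod>2023-06-12</lastmod>
  </url>
</urlset>
```

Submit the sitemap URL in Google Search Console so Googlebot can discover your pages even if some internal links fail to render.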
Use HTML anchor tags with a href attribute for your internal and external links.
Search engines recognize and understand <a> tags as links. Googlebot pulls these links and adds them to the crawl queue.
Use descriptive anchor texts to help Google understand the linked page’s content. Avoid generic phrases and opt for a natural, keyword-rich anchor text that accurately represents the destination page.
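To illustrate (the URLs and anchor text below are placeholder examples), a crawlable link looks like the first snippet; the patterns after it are ones Googlebot may not follow or understand:

```html
<!-- Crawlable: a standard anchor with an href and descriptive anchor text -->
<a href="https://www.example.com/technical-seo-guide/">technical SEO guide</a>

<!-- Risky: no href attribute, so there is nothing to add to the crawl queue -->
<span onclick="location.href='/technical-seo-guide/'">technical SEO guide</span>

<!-- Weak: generic anchor text tells Google nothing about the destination -->
<a href="https://www.example.com/technical-seo-guide/">click here</a>
```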
Include descriptive alt tags for your images. Alt tags provide alternative text that describes the image content.
This helps search engines understand the image’s relevance to the surrounding content.
Using descriptive and keyword-rich file names for your images is also recommended.
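Putting both recommendations together (the file name and alt text here are illustrative), an image tag might look like:

```html
<!-- Descriptive file name plus alt text describing the image content -->
<img src="/images/blue-running-shoes.jpg"
     alt="Pair of blue lightweight running shoes on a forest trail" />
```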
You can use structured data markup like schema.org’s ImageObject to provide additional context about your images.
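As a sketch (the URL, caption, and dimensions are placeholder values), ImageObject markup can be added to the page as JSON-LD:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ImageObject",
  "contentUrl": "https://www.example.com/images/blue-running-shoes.jpg",
  "caption": "Blue lightweight running shoes on a forest trail",
  "width": 1200,
  "height": 800
}
</script>
```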
Server-Side Rendering or Dynamic Rendering
With client-side rendering, the rendering of the web page takes place in the user’s browser.
The issue is that search engine crawlers may not be able to understand the content – they may effectively see a blank page.
With server-side rendering (SSR), the server delivers fully rendered HTML instead, allowing Google to directly access and index the content.
But, SSR can be expensive and resource-heavy.
A workaround is to use dynamic rendering.
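At its core, dynamic rendering is a user-agent check: crawlers get a pre-rendered HTML snapshot, while regular users get the client-side app. A rough sketch (the bot list is illustrative and not exhaustive, and `dynamicRendering` is an Express-style middleware shape, not an official API):

```javascript
// Minimal sketch of dynamic rendering. Names and the bot list here are
// illustrative assumptions, not an official API or a complete crawler list.
const BOT_PATTERNS = [/Googlebot/i, /bingbot/i, /DuckDuckBot/i, /Baiduspider/i];

function isSearchBot(userAgent) {
  // Treat a missing user agent as a regular user.
  return BOT_PATTERNS.some((pattern) => pattern.test(userAgent || ""));
}

// Express-style middleware factory. servePrerenderedHtml stands in for a
// handler that returns a static snapshot generated ahead of time, e.g.
// with a headless browser.
function dynamicRendering(servePrerenderedHtml) {
  return (req, res, next) => {
    if (isSearchBot(req.headers["user-agent"])) {
      servePrerenderedHtml(req, res); // crawlers get the snapshot
    } else {
      next(); // regular users get the client-side rendered app
    }
  };
}
```

This gives crawlers indexable HTML without the cost of rendering every request on the server.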
SEO goes beyond optimizing for keywords and securing backlinks. As you move into technical SEO, it also involves considering how your website is rendered and presented to search engine crawlers.