Rendering AJAX-crawling pages

The AJAX crawling scheme was introduced as a way of making JavaScript-based webpages accessible to Googlebot, and we’ve previously announced our plans to turn it down. Over time, Google engineers have significantly improved rendering of JavaScript for Googlebot. Given these advances, in the second quarter of 2018, we’ll be switching to rendering these pages on Google’s side, rather than requiring that sites do this themselves. In short, we’ll no longer be using the AJAX crawling scheme.

As a reminder, the AJAX crawling scheme accepts pages with either a “#!” in the URL or a “fragment meta tag” on them, and then crawls them with a “?_escaped_fragment_=” in the URL (the mapping between the two URL forms is sketched after the checklist below). That escaped version needs to be a fully-rendered and/or equivalent version of the page, created by the website itself.

With this change, Googlebot will render the #! URL directly, making it unnecessary for the website owner to provide a rendered version of the page. We’ll continue to support these URLs in our search results.

We expect that most AJAX-crawling websites won’t see significant changes with this update. Webmasters can double-check their pages as detailed below, and we’ll be sending notifications to any sites with potential issues.

If your site is currently using either #! URLs or the fragment meta tag, we recommend:

- Verify ownership of the website in Google Search Console to gain access to the tools there, and to allow Google to notify you of any issues that might be found.
- Test with Search Console’s Fetch & Render. Compare the results of the #! URL and the escaped URL to see any differences. Do this for any significantly different part of the website (a headless-browser version of this comparison is sketched after this list).
- Check our developer documentation for more information on supported APIs, and see our debugging guide when needed.
- Use Chrome’s Inspect Element to confirm that links use “a” HTML elements and include a rel=nofollow where appropriate (for example, in user-generated content); a console snippet for this check also follows the list.
- Use Chrome’s Inspect Element to check the page’s title and description meta tag, any robots meta tag, and other meta data. Also check that any structured data is available on the rendered page (see the metadata snippet below).
- Convert content in Flash, Silverlight, or other plugin-based technologies to either JavaScript or “normal” HTML if it should be indexed in search.

We hope that this change makes things a bit easier for your website, and reduces the need to render pages on your end. Should you have any questions or comments, feel free to drop by our webmaster help forums, or to join our JavaScript sites working group.
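For reference, here’s a minimal sketch of the URL mapping the scheme defined, going from a “pretty” #! URL to the “ugly” escaped URL that crawlers requested. The handful of escaped characters follows the scheme’s original specification; the example URL is purely illustrative. Pages that opted in via the fragment meta tag (`<meta name="fragment" content="!">`) were instead fetched with an empty “?_escaped_fragment_=” appended.

```javascript
// Sketch of the URL mapping defined by the (now-retired) AJAX crawling scheme.
// A "pretty" URL like  https://example.com/app#!key=value   maps to the
// "ugly" URL           https://example.com/app?_escaped_fragment_=key=value
// that crawlers requested instead.
function escapedFragmentUrl(prettyUrl) {
  const i = prettyUrl.indexOf('#!');
  if (i === -1) return null; // not an AJAX-crawling URL
  const base = prettyUrl.slice(0, i);
  // Escape the reserved characters the scheme called out.
  // '%' must be escaped first to avoid double-escaping.
  const fragment = prettyUrl
    .slice(i + 2)
    .replace(/%/g, '%25')
    .replace(/#/g, '%23')
    .replace(/&/g, '%26')
    .replace(/\+/g, '%2B');
  // Append to an existing query string with '&', otherwise start one with '?'.
  const sep = base.includes('?') ? '&' : '?';
  return `${base}${sep}_escaped_fragment_=${fragment}`;
}

console.log(escapedFragmentUrl('https://example.com/app#!page=about'));
// -> https://example.com/app?_escaped_fragment_=page=about
```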
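For the Fetch & Render comparison in the checklist, a headless browser can serve as a rough local stand-in: render the #! URL the way a JavaScript-capable crawler would, fetch the site’s own escaped version, and compare the two. The sketch below uses Puppeteer; the tool choice, the example URLs, and the decision to compare titles and body text are illustrative assumptions, not part of Search Console.

```javascript
// Sketch: compare the client-rendered #! page against the site's own
// pre-rendered escaped version. Puppeteer is a stand-in for Fetch & Render.
const puppeteer = require('puppeteer');

async function compare(prettyUrl, escapedUrl) {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  // Render the #! URL as a JavaScript-capable crawler would.
  await page.goto(prettyUrl, { waitUntil: 'networkidle0' });
  const renderedTitle = await page.title();
  const renderedText = await page.evaluate(() => document.body.innerText);

  // The escaped version is pre-rendered HTML served by the site itself.
  await page.goto(escapedUrl, { waitUntil: 'networkidle0' });
  const escapedTitle = await page.title();
  const escapedText = await page.evaluate(() => document.body.innerText);

  await browser.close();
  console.log('Titles match:    ', renderedTitle === escapedTitle);
  console.log('Body text matches:', renderedText === escapedText);
}

compare(
  'https://example.com/app#!page=about',
  'https://example.com/app?_escaped_fragment_=page=about'
);
```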
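To spot-check the link markup mentioned in the checklist without clicking through the whole page, a short snippet in Chrome’s DevTools Console can list the usual problem cases. The onclick heuristic for flagging non-anchor “links” is an illustrative assumption; the underlying point is that crawlers only follow real “a” elements with an href.

```javascript
// Sketch: run in the DevTools Console on the rendered page.
const anchors = [...document.querySelectorAll('a')];

// Anchors without an href can't be followed by a crawler.
console.log('Anchors without href:',
  anchors.filter(a => !a.hasAttribute('href')));

// Links that carry rel="nofollow" (e.g. in user-generated content).
console.log('Nofollow links:',
  anchors.filter(a => (a.rel || '').includes('nofollow')));

// Heuristic: clickable elements that aren't anchors at all.
console.log('Non-anchor click handlers:',
  [...document.querySelectorAll('[onclick]')].filter(el => el.tagName !== 'A'));
```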
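Similarly, the rendered page’s metadata and structured data can be spot-checked from the Console. The fields below match the ones called out in the checklist; which values are right for a given page is, of course, site-specific.

```javascript
// Sketch: run in the DevTools Console on the rendered page.
const meta = name =>
  document.querySelector(`meta[name="${name}"]`)?.content ?? '(missing)';

console.log('Title:      ', document.title);
console.log('Description:', meta('description'));
console.log('Robots:     ', meta('robots'));

// Structured data: list any JSON-LD blocks present after rendering
// (assumes the blocks contain well-formed JSON).
const jsonLd = [...document.querySelectorAll('script[type="application/ld+json"]')]
  .map(s => JSON.parse(s.textContent));
console.log('JSON-LD blocks:', jsonLd);
```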
Posted by John Mueller, Google Switzerland
Source: Google Webmaster Central Blog