
SEO Best Practices From Google Starter Guide

Google publishes an SEO Starter Guide that contains a wealth of information for anyone who wants to learn more about search engine optimization and Google's guidelines. It covers the basics of SEO based on best practices formulated by Google itself.

Search engine optimization is really about making your website easier for search engines to understand. You make changes, also called optimizations, that help search engines learn more about your website's content and, in turn, surface that content to users who type queries into their search boxes.

Some Best Practices To Apply

One of the ways you can help Google find your content is by submitting a sitemap. A sitemap notifies search engines about new or updated pages on your website. Make sure your website has a sitemap file.
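As a rough sketch, a sitemap is simply an XML file in the sitemaps.org format listing the URLs you want crawled; the example.com addresses and dates below are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page; <lastmod> tells crawlers when it last changed -->
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2024-01-15</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/products/widgets.html</loc>
        <lastmod>2024-01-10</lastmod>
      </url>
    </urlset>

Upload the file (commonly as /sitemap.xml in your site's root directory) and submit it through Google Search Console so Google knows where to find it.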

You can also tell Google which pages on your website should not be crawled. This is done with a "robots.txt" file, which is placed in the root directory of your website and tells search engines which pages are blocked from crawling.
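A minimal robots.txt might look like the following; the directory names are placeholders to adapt to your own site:

    # Applies to all crawlers
    User-agent: *
    # Ask crawlers not to fetch these directories
    Disallow: /admin/
    Disallow: /tmp/

    # Point crawlers at your sitemap
    Sitemap: https://www.example.com/sitemap.xml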

Google Search Console has a user-friendly robots.txt generator that helps you create this type of file. Take note that if your site has subdomains, each subdomain needs its own robots.txt file if you wish it to be blocked from crawlers.

You should prevent your internal search result pages from being crawled by Google. Users are annoyed when they click on a search engine result only to land on another search results page on your website.
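For example, if your site's search results live under a /search path (the path and query parameter here are assumptions; adjust them to match your site), a rule like this keeps crawlers out:

    User-agent: *
    # Ask crawlers to skip internal search result pages, e.g. /search?q=widgets
    Disallow: /search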

For confidential information, use more secure methods, such as password protection. Robots.txt is not an effective way of blocking sensitive material from being accessed; it only tells crawlers whether a page should be crawled. There are rogue crawlers out there that do not honor the robots exclusion standard, so be careful with your confidential information and secure it with the right methods.
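As one illustration of what "the right methods" can mean, on an Apache server you could password-protect a directory with HTTP Basic authentication; this is a sketch, and the file paths below are placeholders:

    # .htaccess placed in the directory you want to protect (Apache)
    AuthType Basic
    AuthName "Restricted area"
    # Path to a password file created with the htpasswd utility
    AuthUserFile /home/example/.htpasswd
    Require valid-user

Unlike a robots.txt rule, this actually refuses the request unless the visitor supplies valid credentials.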

To help Googlebot crawl your pages better, allow access to the JavaScript, CSS, and image files used on your website. Do not block those files with robots.txt: Google's algorithms are designed to render and index them, and blocking them can result in lower rankings due to errors in reading your site. Use the Fetch as Google tool to see how Googlebot sees and renders your content; this tool will also help you fix indexing issues on your site.
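If you do block a directory that happens to contain rendering resources, one pattern (the directory names are assumed for illustration) is to carve out explicit exceptions; for Googlebot, the more specific, longer-path Allow rule takes precedence over a shorter Disallow:

    User-agent: *
    # Block the general assets area...
    Disallow: /assets/
    # ...but still let crawlers fetch the stylesheets, scripts, and images
    # that pages need in order to render correctly
    Allow: /assets/css/
    Allow: /assets/js/
    Allow: /assets/images/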

When your page appears in a search results page, the contents of its title tag may appear in the first line of the result. This is why creating unique and accurate titles is essential. A title may include the name of your website or brand and a short description, such as your main product offerings.

Don’t use titles that are vague and have no relation to the content of the page. Default titles like “page 1” or “untitled” should be avoided.

Use unique titles for each page so that Google can see that each page is distinct from the others on your website. Don't use a single title across all of your site's pages, especially if you have a mobile version of your website. Be concise and informative, and never stuff keywords into your title tags.
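A sketch of a descriptive, page-specific title (the brand and product names are made up for illustration):

    <head>
      <!-- Unique, concise title: page topic first, brand name after -->
      <title>Handmade Wooden Widgets | Example Widget Co.</title>
    </head>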

Take advantage of the page description meta tag and include a short paragraph about what your page covers. The summary should be neither too short nor too long, and it should not be repeated across many pages. Google may use description meta tags as the snippets for your pages.
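Continuing the hypothetical example above, the description meta tag sits in the page's head alongside the title:

    <head>
      <title>Handmade Wooden Widgets | Example Widget Co.</title>
      <!-- One or two accurate sentences, unique to this page -->
      <meta name="description" content="Browse our catalog of handmade wooden widgets, crafted from oak, maple, and walnut.">
    </head>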
