Optimizing your website for Google search: It's not voodoo

I’m amazed at the amount of almost voodoo-like superstition I encounter when discussing search engine optimization with clients and what it takes to improve natural search rankings in Google.

It’s as if a sacrificial offering must first be made to obtain the blessings of the search gods, or some other mystical ritual performed, in order to succeed.

What’s worse is the number of almost predatory marketing practices that take advantage of those beliefs.

The result is a large population of website owners who become jaded and cynical about SEO in general, with many feeling they have been duped into paying too much with too little to show for it in the end. In their experience, SEO did make someone money… the person they paid to help them.

That isn’t good for anyone in the long run and makes the marketplace a less friendly place to live and work.

In reality, there is no great secret, no special knowledge that one must possess above another, no sacred ritual or secret handshake. Nor does it require a degree in computer science or engineering, or decades of experience in technology design.

Nope, it’s not rocket science. In fact, it’s all right here (https://support.google.com), on Google’s website, for anyone to read and understand… everything you need to know and exactly what you need to do to improve your website’s natural search result rankings on Google’s search pages.

Shocking, right? Yes, the secret is out. I’ve just given you access to all you need to know – the long knives are out and I will be a marked man from this day forward.

I’ll even take it a step further: I’ll highlight everything right here, so you don’t even need to go find it on Google…

Help visitors use your pages…

  • Try to use text instead of images to display important names, content, or links. If you must use images for textual content, use the alt attribute to include a few words of descriptive text.
  • Ensure that all links go to live web pages. Use valid HTML.
  • Optimize your page loading times. Fast sites make users happy and improve the overall quality of the web (especially for those users with slow Internet connections). Google recommends that you use tools like PageSpeed Insights and Webpagetest.org to test the performance of your page.
  • Design your site for all device types and sizes, including desktops, tablets, and smartphones. Use the mobile friendly testing tool to test how well your pages work on mobile devices, and get feedback on what needs to be fixed.
  • Ensure that your site appears correctly in different browsers.
  • If possible, secure your site’s connections with HTTPS. Encrypting interactions between the user and your website is a good practice for communication on the web.
  • Ensure that your pages are useful for readers with visual impairments, for example, by testing usability with a screen-reader.
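Much of the first bullet above (use text, not images, and give every image descriptive alt text) can be audited mechanically. Here is a minimal sketch using Python’s standard-library HTML parser; the sample markup and filenames are hypothetical:

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collects the src of every <img> tag missing a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):  # absent or empty alt text
                self.missing.append(attrs.get("src", "(no src)"))

# Hypothetical page fragment: one image with alt text, one without.
page = '<img src="logo.png" alt="Company logo"><img src="banner.jpg">'
checker = AltChecker()
checker.feed(page)
print(checker.missing)  # → ['banner.jpg']
```

Running something like this over your pages gives you a concrete to-do list of images that need alt text, no specialist tooling required.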

Help Google find your pages…

  • Ensure that all pages on the site can be reached by a link from another findable page. The referring link should include either text or, for images, an alt attribute, that is relevant to the target page.
  • Provide a sitemap file with links that point to the important pages on your site. Also provide a page with a human-readable list of links to these pages (sometimes called a site index or site map page).
  • Limit the number of links on a page to a reasonable number (a few thousand at most).
  • Make sure that your web server correctly supports the If-Modified-Since HTTP header. This feature allows your web server to tell Google whether your content has changed since it last crawled your site. Supporting this feature saves you bandwidth and overhead.
  • Use the robots.txt file on your web server to manage your crawling budget by preventing crawling of infinite spaces such as search result pages. Keep your robots.txt file up to date. Learn how to manage crawling with the robots.txt file. Test the coverage and syntax of your robots.txt file using the robots.txt testing tool.
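The If-Modified-Since point is easy to verify yourself: send a conditional request and see whether the server answers 304 Not Modified. A rough sketch using only the Python standard library; the URL and timestamp are placeholders you would swap for your own site and your last-crawl time:

```python
from email.utils import formatdate
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def supports_if_modified_since(url: str, since_epoch: float) -> bool:
    """Return True if the server answers 304 Not Modified for a page
    unchanged since `since_epoch`. Assumes the URL is reachable."""
    stamp = formatdate(since_epoch, usegmt=True)  # HTTP-date format
    req = Request(url, headers={"If-Modified-Since": stamp})
    try:
        urlopen(req, timeout=10)
    except HTTPError as err:
        return err.code == 304  # 304: unchanged, nothing re-sent
    return False  # 200: the server re-sent the page (or ignores the header)

# The header value is a standard HTTP-date, e.g. for 2024-01-01 00:00:00 UTC:
print(formatdate(1704067200, usegmt=True))  # → Mon, 01 Jan 2024 00:00:00 GMT
```

A 304 response confirms the server supports conditional requests and Google’s crawler isn’t re-downloading unchanged pages.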


Help Google understand your pages…

  • Create a useful, information-rich site, and write pages that clearly and accurately describe your content.
  • Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.
  • Ensure that your <title> elements and alt attributes are descriptive, specific, and accurate.
  • Design your site to have a clear conceptual page hierarchy.
  • Follow our recommended best practices for images, video, and structured data.
  • When using a content management system (for example, Wix or WordPress), make sure that it creates pages and links that search engines can crawl.
  • To help Google fully understand your site’s contents, allow all site assets that would significantly affect page rendering to be crawled: for example, CSS and JavaScript files that affect the understanding of the pages. The Google indexing system renders a web page as the user would see it, including images, CSS, and JavaScript files. To see which page assets Googlebot cannot crawl, or to debug directives in your robots.txt file, use the blocked resources report in Search Console and the Fetch as Google and robots.txt Tester tools.
  • Allow search bots to crawl your site without session IDs or URL parameters that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.
  • Make your site’s important content visible by default. Google is able to crawl HTML content hidden inside navigational elements such as tabs or expanding sections, however we consider this content less accessible to users, and believe that you should make your most important information visible in the default page view.
  • Make a reasonable effort to ensure that advertisement links on your pages do not affect search engine rankings. For example, use robots.txt or rel="nofollow" to prevent advertisement links from being followed by a crawler.
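The robots.txt advice that appears in both lists above can be tried out locally before you deploy anything: Python’s standard-library urllib.robotparser applies the same prefix rules a well-behaved crawler would. A small sketch with a hypothetical rules file (note that Googlebot also understands wildcard patterns, which this simple parser treats only as literal text):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: block crawling of infinite search-result
# spaces while advertising the sitemap, as suggested above.
rules = """\
User-agent: *
Disallow: /search
Sitemap: https://example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Search-result URLs are blocked; normal pages remain crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/search?q=shoes"))  # → False
print(rp.can_fetch("Googlebot", "https://example.com/about"))           # → True
```

Checking your rules this way is a cheap sanity test before relying on the robots.txt Tester in Search Console.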

That’s it in a nutshell. So what now?

True, it’s technical work, but the question and answer are really simple and come down to this: “Do you want to be the one responsible for doing, or managing, the things Google has explained, or do you want to hire someone else to do the work for you?”

Knowing that you CAN do it yourself and HOW to do it yourself empowers you to make a more informed decision and puts you back in CONTROL of that decision and the work being done.

I can change the oil in my car, I can rotate my tires, I can even do some major repairs with the right knowledge and tools. But I don’t, because I’ve made an informed choice and decided my time (and knuckles) are worth more to me! So I hire a mechanic to do the work for me, and the mechanic and I can both stay busy doing what we enjoy most and do best.

Eric Ramos is a lifelong artist, graphic designer, programmer, website developer and entrepreneur who enjoys working with businesses to create, maintain and improve their “home on the web,” with an understanding that successful websites require a strategy of ongoing assessment, adjustment and refinement, not just a one-time launch hoping for results. Passionate about technology, design and business development, Eric’s goal is to not just build better websites, but better businesses, growing and improving with them together, side-by-side.