If you’re only just getting started with SEO, then it’s quite possible that you don’t even know what the initialism stands for. For those in the dark, it’s Search Engine Optimization and, as the name suggests, the goal of SEO is to improve your website’s visibility when anyone uses a search engine to look for the products or services you offer.
Google Rules The Roost
Sure, a tiny fraction of potential customers use search engines other than Google, but the odds are that yours don’t.
If you’ve watched an average smartphone user recently, then you’ll know that they (and most likely, you) don’t even type URLs any more – even when the address is known. They’ll simply type a keyword into Google and reach the target site via the search results listing. This kind of behavior means three things:
- Businesses need to make sure that they’re ideally the first hit
- Google’s market share is huge: As of July 2019, statcounter.com put it at 92.19% (with Bing, Yahoo!, Baidu, Yandex and DuckDuckGo sharing the remaining 7.81%)
- Your business’ website needs to appeal to Googlebot
Quality Content Over Quantity of Content
With SEO, it’s important to consider the content of your website. Quality content, such as articles, essays and videos, not only increases other sites’ willingness to link to your pages but also improves their visibility.
Many people mistakenly believe that Google penalizes a site for duplicating content, but that’s not true. The reason duplicate content hurts performance in search engine results is that duplicate hits are filtered out at the listing level. Try adding &filter=0 to the end of the Google search URL the next time you search – you’ll most likely find a lot of sites popping up twice.
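As a concrete illustration (the query shown is just a placeholder), the parameter is simply appended to the standard Google search URL:

```
https://www.google.com/search?q=example+search&filter=0
```

With the filter disabled, results that Google would normally collapse as duplicates are shown in full.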
Having said that, it’s still important to make sure that your content (and your site as a whole) is as lean as it can be. It used to be the case that a webmaster could generate tens or hundreds of pages stuffed with keywords and artificial backlinks, but Googlebot is getting ever wiser and better at telling high-quality pages from filler. Therefore, avoid index bloat by removing as many low-quality and non-essential pages as possible.
What is the Googlebot?
A bot (also known as a robot or spider) is a piece of software that “crawls” web pages across the internet, analyzing their content; Googlebot is the bot that builds the searchable index for Google’s search engine.
Googlebot visits a web page, analyzes its content and, when it finds a link, wanders off to crawl the linked page too. It’s therefore important that Googlebot can find your page(s), which you can ensure by submitting individual URLs to Google or by getting a page that Google already knows about to link to yours, so that Googlebot follows the link and crawls it.
There are some situations where you might not want a page to be crawled at all. For example, there may be some pages that you don’t want Google users to land on (in the interest of good customer experience), such as:
- Policy pages (relating to shipping and returns, for example)
- Internal site pages
- Landing pages that relate to specific pay-per-click campaigns
To keep those pages out of Google’s search results, add a noindex meta tag to the <head> section of the page’s HTML markup. Googlebot will still crawl a page marked this way, but it will drop it from the index. Linkbuilder.io also highlights that the noindex meta tag is useful for keeping Googlebot focused on your most important pages, rather than distracted by the non-essential.
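As a minimal sketch of what that looks like in practice (the page title is just a placeholder), the tag sits alongside the rest of the page’s metadata:

```html
<!DOCTYPE html>
<html>
<head>
  <title>Shipping &amp; Returns Policy</title>
  <!-- Tells Googlebot (and other compliant crawlers) to keep this page out of the index -->
  <meta name="robots" content="noindex">
</head>
<body>
  <!-- Page content -->
</body>
</html>
```

Using name="robots" applies the rule to all compliant crawlers; to target Google specifically, name="googlebot" can be used instead.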
Ultimately, SEO is a vital component of business marketing. With more and more people using Google to find what they need, you’ll leave customers (and money) on the table if you’re not using this strategy.