What is SEO?

SEO stands for "Search Engine Optimization."

Search engine optimization is the process a webmaster carries out to improve the visibility of a web page, or an entire website, in a search engine's organic results.

By applying recommended SEO practices, a webmaster aims to bring a website or page as close as possible to the top of the SERP (search engine results page) for one or more specific keyword phrases.

The higher a page appears in the organic search results for a given query, the more clicks it will receive.

According to a study conducted by Chitika, the first three results receive more than 60% of all the clicks.

SERP Average Traffic Share

  • First result: 32.5%
  • Second result: 17.6%
  • Third result: 11.4%

The lower your site's content appears in the search results, the fewer clicks it will receive.

If you don't count YouTube, then Google and Bing are the two most popular search engines.

Besides these two, there are hundreds of other small search engines, but none can bring you as much traffic as Google and Bing (especially Google).

Each search engine has its own search algorithms and can display different results for the same search query.

Although Bing has greatly expanded in recent years, Google remains the largest and the most complex search engine.

Google changes its search algorithms regularly in order to improve the quality of search results and to get rid of the spam sites and websites that "try to cheat the system".

While the Google search results can now display results based on LSI keywords and synonyms, I would say that Bing shows more exact-match results (which is not always a bad thing). Besides that, these two search engines have comparable algorithms for displaying their search results, so I wouldn't say you can optimize your website particularly for Google or Bing.

How Does SEO Work?

If you want your website's content to rank well in the results of the search engines, it's crucial to optimize it for that purpose.

By applying the best SEO practices, you will help the search engines index and understand your site's content more quickly.

Some SEO tactics will also improve the user experience, the readability of your articles, and the overall appearance of your website (things like proper use of HTML headings to structure the content, using title attributes correctly, writing long-form content, etc.).
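As a sketch of that heading advice, a simple article page (the titles and text here are invented purely for illustration) might be structured like this:

```html
<!-- One <h1> for the main title, <h2>/<h3> for sections and
     subsections, and a descriptive title attribute on the link. -->
<article>
  <h1>What is SEO?</h1>
  <p>Introductory paragraph...</p>

  <h2>How SEO Works</h2>
  <p>Section content...</p>

  <h2>SEO Types</h2>
  <h3>White-Hat SEO</h3>
  <p>Subsection content...</p>

  <a href="/seo-guide" title="A beginner's guide to SEO">Read the full guide</a>
</article>
```

Keeping the heading levels in a logical order like this helps both crawlers and readers understand how the content is organized.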

When you optimize your site's content for the search engines, your chances of receiving free traffic from their organic search results increase drastically.

SEO is a complex field that is constantly changing, and today's "best practices" might not be considered best practices next year.

However, if you don't want to pay for traffic and you are not willing to post on social networks all day, SEO is one of the best ways of driving free traffic to your website.

How do search engines discover my content?

Search engines use automated bots, called “crawlers” or “spiders,” to discover and index the billions of existing websites.

If these web spiders cannot access your pages, or cannot fetch or understand their content, your site will not be added to the index of the search engines.

There are also other methods for getting your website's content found and indexed faster by the search engines, and I'll cover these topics in upcoming posts.

How to prevent search engines from displaying specific content?

In some cases, you might want to prevent certain pages of your website from being indexed (such as the admin login page, the privacy policy page, the terms of service, etc.).

There are several ways to achieve that, but the most common is adding a robots.txt file to the root directory of your website and specifying a set of rules for the crawlers inside it.

The crawlers of the major search engines will read that file before accessing your site and, according to the rules found inside, will or will not access particular pages or paths.

For example, the following rules, added inside the robots.txt file, tell all robots not to access any page in the admin directory.

User-agent: *
Disallow: /admin/
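As a slightly fuller sketch (the paths and sitemap URL below are made up for illustration), a robots.txt file can combine several kinds of directives: rules for all bots, exceptions via Allow, rules for a specific crawler, and a pointer to your sitemap:

```
# Block all bots from the admin area, but allow one public file inside it
User-agent: *
Disallow: /admin/
Allow: /admin/login.css

# Give a specific crawler its own rules
User-agent: Bingbot
Disallow: /drafts/

# Tell crawlers where the sitemap lives (hypothetical URL)
Sitemap: https://www.example.com/sitemap.xml
```

The major search engines support the Allow and Sitemap directives, but as noted below, not every bot honors these rules.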

The robots.txt file restricts crawler access to a page, but there is also a robots meta HTML tag that you can add to your pages in order to instruct the crawlers to index or not index the page, and to follow or not follow the links on it.

For example, the HTML meta tag below, once added to the <head> section of a page, tells the search engine crawlers not to add the page URL to the search index and not to follow the links on the page.

<meta name="robots" content="noindex, nofollow">
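The two directives are independent, so they can be combined as needed. For example, if you want a page kept out of the index while the crawlers still follow the links it contains, you could use:

```html
<!-- Keep this page out of the index, but still follow its links -->
<meta name="robots" content="noindex, follow">
```

Note that "follow" is the default behavior, so it only needs to be stated for clarity.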

Even so, do not rely entirely on the robots.txt file or the robots meta tag to keep pages that contain sensitive data out of the search engines' indexes.

Besides the search engine crawlers, there are many other "evil bots" crawling the Web (e.g., spam bots, or bots that try to find vulnerabilities in your website's source code). These bots ignore such rules and will access any page and file they can find.

Also, many small search engines might completely ignore the directives inside the robots.txt file.

I'll cover the robots.txt file in more depth in an upcoming post.

SEO Types

If you are learning SEO, you have probably encountered terms like “White Hat SEO,” “Grey Hat SEO,” and “Black Hat SEO.”

White-Hat SEO

White Hat SEO refers to the use of “clean” optimization strategies, techniques, and tactics that fully comply with the rules and policies of the search engines.

This approach is typically used by people who care about their websites and expect long-term SEO benefits from their efforts.

A clean SEO strategy eliminates the risk of having your site penalized by the search engines.

Grey-Hat SEO

Grey Hat SEO sits somewhere between clean White Hat SEO and Black Hat SEO.

It refers to questionable practices that may or may not attract an SEO penalty for your website.

This approach is riskier than White Hat SEO, but not as dangerous as Black Hat SEO.

Black-Hat SEO

Black Hat SEO is the opposite of White Hat SEO; it refers to the use of aggressive SEO strategies, techniques, and tactics that clearly violate the guidelines of the search engines. Below are some examples of Black Hat SEO techniques:

  • Keyword stuffing (cramming a list of keywords into a page with the intention of having the content rank for them).
  • Adding invisible text (for example, white text on a white background).
  • Building backlinks using PBNs (private blog networks).
  • Using 301 redirects to point a domain with a high number of backlinks at another domain so that the second domain inherits the backlinks of the first.
  • Building a high number of backlinks in a short time using automated software.

Black Hat SEO techniques are often used by people looking for quick financial gains from their websites rather than making a long-term investment.


Having a solid SEO foundation is essential if you want to build a successful website that relies on search engine traffic. In addition, if you have an excellent understanding of search engine optimization and how the search engines' algorithms work, you can even build a career out of it.

The search engine optimization market is huge, and many companies and freelancers provide paid SEO services. Every day, more companies make the transition from offline to online, and the demand for SEO specialists keeps growing.

I hope you now have a better understanding of what SEO is and how it works. In upcoming posts, I will cover more aspects of SEO and teach you how to start optimizing the various parts of your website.
