History and Evolution of Search Engine Optimization
History of Search Engine Optimization
Search Engine Optimization (SEO) grew out of Google's effort to improve the quality and credibility of search results without losing the confidence of its users. This article discusses the history and evolution of Search Engine Optimization. Although other giants such as Yahoo existed when Google was introduced, Google went on to become the tech giant it is today through constant changes to its optimization techniques and algorithms. One unfortunate incident that drove this development was the World Trade Center terrorist attack on September 11, 2001. The general public could not retrieve relevant information about the incident from Google, even though it was the day's biggest news. The problem identified by the search engineers on the team was that the websites of the time were not crawlable by Google.
The search engine mainly uses three processes to find and return the best results for a searched keyword: crawling, caching, and indexing.
Crawling
Crawling is the process by which Google scans a website, including its URL, text, and images, using a program known by various names such as bots, Googlebots, spiders, or robots; these names all refer to essentially the same program. To make crawling, caching, and indexing easier for Google, On-Site SEO (also called On-Page SEO) can be used. This involves making changes to the title, contents, and images on the website.
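To give a concrete feel for what such a bot does, here is a minimal breadth-first crawler sketch in Python using only the standard library. The starting URL and page limit are illustrative placeholders, and Googlebot itself is of course vastly more sophisticated.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags, mimicking how a bot discovers URLs."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=10):
    """Breadth-first crawl: fetch a page, record it, queue the links it contains."""
    seen, queue = set(), [start_url]
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url).read().decode("utf-8", errors="ignore")
        except Exception:
            continue  # skip unreachable pages and move on
        parser = LinkExtractor()
        parser.feed(html)
        # resolve relative links against the current page's URL
        queue.extend(urljoin(url, link) for link in parser.links)
    return seen

# Example (hypothetical URL):
# print(crawl("https://example.com"))
```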
Caching
After a website has been crawled, Google takes a snapshot of it and stores it; categorizing the stored copy based on its contents is known as indexing. Caching can happen more than once a day. A website may stop showing up in the results for a given keyword if the way its content is presented changes between caches, even when the content itself is meant to stay the same.
After identifying the problem, the team was tasked with working out how websites could be made crawlable by Google. One proposed solution was to share the optimization practices used by Google with webmasters and make them public.
The idea was not welcomed at first by Google's management, who saw it as exposing the company's inner workings and practices. Eventually, however, it was decided to share the details, and the result is the SEO Starter Guide, which is still in use today.
Webmasters
Webmasters are the people responsible for maintaining and running their respective websites. Any changes made to a website are carried out by its webmaster.
Evolution of Search Engine Optimization
Like any other digital entity, SEO has evolved through many stages, each facing its own obstacles and shortcomings. The most important asset of all was reliability, which Google could not afford to lose. The company adopted many tactics and made major changes to its algorithms and code with the sole purpose of earning users' trust while maintaining both the quality and quantity of results.
Niche-Specific
At first, the search engine was niche-specific, i.e. content-specific: a website's ranking was determined by the number of keywords present on the page, so the more often a keyword appeared in the content, the higher the page ranked. This concept was widely abused through keyword stuffing.
Keyword Stuffing
Keyword stuffing is a black-hat SEO technique based on the overuse of keywords. Keywords are loaded into a page's meta tags, visible content, or anchor text to gain an unfair advantage. A page could be wildly off-topic, yet with enough keywords this technique could push it to the top of the rankings. It is an unethical practice; its opposite, white-hat SEO, is the ethical way to optimize. On September 21, 2009, Google announced that it no longer considers the keywords meta tag in web ranking, as clearly stated on its webmaster blog at webmasters.googleblog.com.
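As a rough illustration of why stuffing is easy to detect, here is a small Python sketch that measures keyword density. The example text is invented, and any cutoff one might apply to flag a page is purely illustrative.

```python
import re

def keyword_density(text, keyword):
    """Fraction of words in the text that match the keyword (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A stuffed page repeats the keyword far beyond natural usage:
stuffed = "cheap shoes cheap shoes buy cheap shoes cheap shoes online"
print(f"{keyword_density(stuffed, 'cheap'):.0%}")  # 40%: far beyond natural usage
```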
Link Specific Ranking
Over time, niche-specific ranking was abandoned, and the next step was the introduction of link-specific ranking. This system ranked a webpage based on the number of links it received from other webpages: if a page related to the searched keywords received a decent number of inbound links, it could show up at the top of the list. Some webpages took advantage of this by selling links for a fee, so the highest bidders started appearing at the top of the results. This was not acceptable to Google, and another change was made to its ranking algorithm.
Quality Link Specific
In quality-link-specific ranking, a website's position depends on the quality of the webpages its inbound links come from. If one website receives 5 links of which 3 come from trustworthy pages, it will be ranked above a website that also receives 5 links but only 2 of them from trustworthy pages. Here, too, money eventually entered the picture, so to avoid the same situation again, further changes were made. A small sketch of the idea follows.
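This minimal Python sketch reproduces the 3-of-5 versus 2-of-5 example above. The trust weights are invented for illustration, since Google's actual weighting has never been public.

```python
# Hypothetical weights: links from trusted pages count more than unknown ones.
TRUSTED_WEIGHT, DEFAULT_WEIGHT = 1.0, 0.2

def quality_score(inbound_links, trusted_pages):
    """Sum of weights over the linking pages, rewarding trustworthy sources."""
    return sum(TRUSTED_WEIGHT if page in trusted_pages else DEFAULT_WEIGHT
               for page in inbound_links)

trusted = {"news.example", "gov.example", "edu.example"}
site_a = ["news.example", "gov.example", "edu.example", "blogX", "blogY"]  # 3 trusted of 5
site_b = ["news.example", "gov.example", "blog1", "blog2", "blog3"]        # 2 trusted of 5
print(quality_score(site_a, trusted) > quality_score(site_b, trusted))     # True: A outranks B
```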
Link Juice
The current ranking system in SEO is known as passing juice, or link juice. The term refers to the value passed from one webpage to another: a site's ranking is based on the value, or equity, it sends and receives. If a webpage only sends links to other websites and receives none in return, its own value decreases. A way to avoid this loss is to give outbound links the attribute rel="nofollow", so that a link can point to another site without passing on the page's own value or equity.
On the other hand, if the receiving site is of good quality, its content and the time users spend on the page will work in its favor in the rankings even if it receives no value from other sites.
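The value-passing idea is closely related to the original PageRank algorithm. Below is a minimal Python sketch under that assumption, using a tiny hand-made link graph and treating nofollow links simply as edges omitted from the graph; real ranking involves many more signals than this.

```python
def pass_link_juice(graph, iterations=20, damping=0.85):
    """Iteratively pass value along links, PageRank-style.
    graph maps each page to the pages it links to; nofollow links are
    simply left out of the graph, so they pass no equity."""
    pages = list(graph)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in graph.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)  # split value among outlinks
            for target in outlinks:
                if target in new_rank:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical graph: C receives links from A and B but passes none back.
graph = {"A": ["C"], "B": ["C"], "C": []}
print(pass_link_juice(graph))  # C accumulates the most value
```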
Google Revenue Streams
Google generates revenue from a variety of fields, including AI, gadgets, and cloud and platform businesses such as the Play Store, Chromecast, and Google Cloud Platform. But its main revenue source is advertising, which it exploits very effectively through two programs: Google AdWords and Google AdSense.
Google AdWords
Google AdWords, now known as Google Ads, is a prepaid advertising program. It is highly customizable and user-friendly: an ad can be configured for when it should be shown, which customers to target, a specific area, whether pay-per-impression should be enabled, and so on. A user only needs a Google account to sign in to the program and pay for its services, and can create ad groups, ad campaigns, and more.
Google AdWords operates on a pay-per-click (PPC) model. A cost-per-click (CPC) value is predetermined for the selected ad, and since placement is a bidding process, the CPC value may change. Each time an interested user clicks the ad, the CPC value is deducted from the prepaid amount allotted to it. If the advertiser chooses to stop the ads, the remaining amount is refunded to the account the money was paid from, after deducting the CPC rate for any clicks that occurred. CPC rates can be set manually or automatically, and they adjust according to the selected settings. Pay-per-impression ads instead charge a set amount to show the ad to a certain number of people. Ads are ranked by an ad score, determined from the copy score and the bid rate, also known as the CPC rate.
The copy score depends on the quality of the ad, that is, whether the ad gives an honest description of the product.
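As a rough sketch of these mechanics in Python: the numbers and the simple copy-score-times-bid formula are illustrative, and Google's real ad ranking calculation is more involved.

```python
def ad_rank(copy_score, cpc_bid):
    """Illustrative ad score: quality of the copy times the bid."""
    return copy_score * cpc_bid

def charge_click(prepaid_balance, cpc):
    """Deduct the CPC from the prepaid budget on each click."""
    if prepaid_balance < cpc:
        raise ValueError("budget exhausted; ad stops serving")
    return prepaid_balance - cpc

ads = {"ad_a": ad_rank(copy_score=8, cpc_bid=0.50),
       "ad_b": ad_rank(copy_score=5, cpc_bid=0.90)}
print(max(ads, key=ads.get))       # ad_b wins this auction (4.5 beats 4.0)
print(charge_click(100.00, 0.90))  # 99.10 remaining after one click
```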
Google AdSense
Google AdSense is a resource website owners can use to earn money. An individual with a website that has a decent amount of traffic can apply for AdSense; Google analyzes the webpage and its traffic flow, and if it meets Google's standards, the owner is approved. The ads shown on the pages come from Google AdWords, and a percentage of the CPC value is given to the page owner whenever a visitor clicks an ad. Google also analyzes the source of each click to block suspicious or fraudulent activity. There is a payment threshold of $100; only after reaching it does the owner receive payments for the ads. CTR, or click-through rate, measures how the ads are performing on a site and can be found using Google Analytics.
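CTR itself is just the ratio of clicks to impressions; a quick illustrative sketch in Python, with invented numbers:

```python
def click_through_rate(clicks, impressions):
    """CTR: the share of ad views that resulted in a click."""
    return clicks / impressions if impressions else 0.0

print(f"{click_through_rate(clicks=40, impressions=2000):.1%}")  # 2.0%
```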
Google Search Results Updates
Personalized Search Result Update – 2011
Over the years, Google brought out new updates to improve the reliability of its search engine, and the Personalized Search Result Update was among the first. If the user is logged into a Google account, the browser collects data and cookies and sends them to Google's data centers, which analyze them to provide search results based on the user's previous searches. The update was made to tailor the experience more closely to an individual's search habits.
User Interaction Update - 2011
In the same year, the User Interaction Update was introduced. It depended on how appealing a website is to users and how much time they spend there, which in turn feeds into the page's ranking. It determined a page's trust value from its bounce rate, the percentage of visitors who exit a page after spending only a very short time on it. The higher the bounce rate, the lower the trust value and thus the web ranking, and vice versa. The exit rate is another factor analyzed, showing from which page of a site visitors leave most often.
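One common way to approximate bounce rate counts single-page sessions, which matches the "left quickly" behavior described above. A small illustrative sketch in Python, with made-up traffic data:

```python
def bounce_rate(sessions):
    """Percentage of sessions that viewed only a single page.
    Each session is the list of pages visited, e.g. ['/home', '/pricing']."""
    if not sessions:
        return 0.0
    bounces = sum(1 for pages in sessions if len(pages) == 1)
    return 100.0 * bounces / len(sessions)

# Hypothetical traffic: 2 of 4 visitors left from the first page they saw.
sessions = [["/home"], ["/home", "/blog"], ["/pricing"], ["/home", "/pricing", "/signup"]]
print(bounce_rate(sessions))  # 50.0
```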
Social Media Signals - 2010
In 2010, social media was booming, and Google began considering interactions with links shared on social media platforms as a ranking factor. The new update, known as Social Media Signals, took into account the overall likes, shares, and comments a link received on social media platforms.