SEO (Search Engine Optimization) - AndroTechHacks


Search engine optimization (SEO) is the process of affecting the online visibility of a website or a web page in a web search engine's unpaid results, often referred to as "natural", "organic", or "earned" results. In general, the earlier (or higher ranked on a search results page) and more frequently a website appears in the search results list, the more visitors it will receive from the search engine's users; these visitors can then be converted into customers. SEO may target different kinds of search, including image search, video search, academic search, news search, and industry-specific vertical search engines. SEO differs from local search engine optimization in that the latter is focused on optimizing a business's online presence so that its web pages will be displayed by search engines when a user enters a local search for its products or services.

As an internet marketing strategy, SEO considers how search engines work, the computer-programmed algorithms that dictate search engine behavior, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by their targeted audience. Optimizing a website may involve editing its content, adding content, and modifying the HTML and associated coding to both increase its relevance to specific keywords and remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic. By May 2015, mobile search had surpassed desktop search. In 2015, it was reported that Google was developing and promoting mobile search as a key feature within future products. In response, many brands began to take a different approach to their internet marketing strategies.

Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early web. Initially, all webmasters needed only to submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed. The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.

Website owners recognized the value of a high ranking and visibility in search results, creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. Sullivan credits Bruce Clay as one of the first people to popularize the term. On May 2, 2007, Jason Gambert attempted to trademark the term SEO by arguing that SEO is a "process" involving manipulation of keywords and not a "marketing service".

Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using meta tags to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete data in meta tags could and did cause pages to rank for irrelevant searches. Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines. By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords.

In 2007, Google announced a campaign against paid links that transfer PageRank. On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization and have shared their personal opinions.
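The nofollow mechanism mentioned above is an attribute placed on individual links; a minimal illustration (the URL is hypothetical):

```html
<!-- A link carrying rel="nofollow": compliant search engines
     do not pass PageRank through this link -->
<a href="https://example.com/sponsored-page" rel="nofollow">Sponsored link</a>
```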

Methods

Getting indexed
Search engines use complex mathematical algorithms to guess which websites a user seeks. In this diagram, each bubble represents a website, and programs sometimes called spiders examine which sites link to which other sites, with arrows representing these links. Websites getting more inbound links, or stronger links, are presumed to be more important and more likely to be what the user is searching for. In this example, since website B is the recipient of numerous inbound links, it ranks more highly in a web search. And the links "carry through", such that website C, even though it only has one inbound link, has an inbound link from a highly popular site (B), while site E does not. Note: percentages are rounded.
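The link-weighting idea described above can be sketched with a simplified PageRank-style iteration. The graph below is a hypothetical stand-in for the diagram (sites A-E, with B receiving several inbound links and linking on to C); it illustrates the general technique only, not Google's actual ranking algorithm.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Simplified PageRank: links maps each page to the pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small base score, plus shares from inbound links.
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue  # dangling page: its score is not redistributed here
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical link graph: A, C, D all link to B; B links to C; E has no links.
graph = {"A": ["B"], "C": ["B"], "D": ["B"], "B": ["C"], "E": []}
ranks = pagerank(graph)
# B (many inbound links) outranks C (one inbound link, but from popular B),
# which outranks E (no inbound links) - the "carry through" effect.
assert ranks["B"] > ranks["C"] > ranks["E"]
```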

The leading search engines, such as Google, Bing, and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search-engine-indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories, closed in 2014 and 2017 respectively. Google offers Google Search Console, for which an XML sitemap feed can be created and submitted for free to ensure that all pages are found.
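A minimal XML sitemap of the kind submitted through Google Search Console looks like the following sketch (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-01-01</lastmod>
  </url>
</urlset>
```

Each `<url>` entry lists one page; the file is usually placed at the site root and referenced from Search Console so crawlers can find every listed page.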

Search engine crawlers may look at a number of different factors when crawling a site. Not every page is indexed by the search engines. The distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled.

Preventing Crawling

To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.
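As a sketch, a robots.txt like the following (the paths are hypothetical) tells all compliant crawlers to skip a shopping-cart area and internal search results:

```
User-agent: *
Disallow: /cart/
Disallow: /search/
```

And the robots meta tag mentioned above excludes an individual page from the index, placed in that page's <head>:

```html
<meta name="robots" content="noindex, nofollow">
```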

Increasing Prominence

A variety of methods can increase the prominence of a webpage within the search results. Cross-linking between pages of the same website to provide more links to important pages may improve its visibility. Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic. Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to increase the relevancy of a site's search listings, thus increasing traffic.
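The title tag and meta description mentioned above live in a page's <head>; a minimal sketch (the site name and wording are placeholders):

```html
<head>
  <!-- Title tag: shown as the clickable headline in search listings -->
  <title>Search Engine Optimization Basics | Example Site</title>
  <!-- Meta description: often shown as the snippet under the headline -->
  <meta name="description"
        content="A short, keyword-relevant summary of the page's content.">
</head>
```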

White hat versus black hat techniques



SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat") and those techniques of which search engines do not approve ("black hat"). Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO or black hat SEO.

An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines, but about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and making that content easily accessible to the online "spider" algorithms, rather than attempting to trick the algorithms away from their intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility, although the two are not identical.

Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines or involve deception. One black hat technique uses text that is hidden, either as text colored similarly to the background, placed in an invisible div, or positioned off-screen. Another method gives a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking. A third category, sometimes called grey hat SEO, is in between the white hat and black hat approaches, where the methods employed avoid the site being penalized but do not go as far as producing the best content for users.
