
Search engine optimization 2013

Written By Unknown on Monday 15 April 2013 | 23:23

From Wikipedia, the free encyclopedia
Search engine optimization (SEO) is the process of improving the volume or quality of traffic to a web site from search engines via "natural" ("organic" or "algorithmic") search results, as opposed to search engine marketing (SEM), which deals with paid inclusion. Usually, the earlier (or higher) a site appears in the search results list, the more visitors it will receive from the search engine. SEO may target different kinds of search, including image search, local search, video search, and industry-specific vertical search engines. This gives a web site web presence.
As an Internet marketing strategy, SEO considers how search engines work and what people search for. Optimizing a website primarily involves editing its content, HTML, and associated coding to both increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines.
The acronym "SEO" can also refer to "search engine optimizers", a term adopted by an industry of consultants who carry out optimization projects on behalf of clients, and by employees who perform SEO services in-house. Search engine optimizers may offer SEO as a stand-alone service or as part of a broader marketing campaign. Because effective SEO may require changes to a site's HTML source code, SEO tactics may be incorporated into website development and design. The term "search engine friendly" may be used to describe web site designs, menus, content management systems, images, videos, shopping carts, and other elements that have been optimized for the purpose of search engine exposure.
Another class of techniques, known as black hat SEO or spamdexing, uses methods such as link farms, keyword stuffing, and article spinning that degrade both the relevance of search results and the user experience of search engines. Search engines look for sites that employ these techniques in order to remove them from their indices.

History

Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all a webmaster needed to do was submit the address of a page, or URL, to the various engines, which would send a spider to crawl that page, extract links to other pages from it, and return information found on the page to be indexed.[1] The process involves a search engine spider downloading a page and storing it on the search engine's own server, where a second program, known as an indexer, extracts information about the page, such as the words it contains and where they are located, as well as any weight for specific words, plus all the links the page contains, which are then placed into a scheduler for crawling at a later date.
Site owners started to recognize the value of having their websites rank high in search engine results, creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997.[2]
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using meta data to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[3] Web content providers also manipulated a number of attributes within the HTML source of a page in an attempt to rank well in search engines.[4]
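To make the keyword meta tag concrete, here is a minimal sketch (not from the original article) of how an early indexer might have read it, using only Python's standard library; the sample markup and class name are invented for illustration:

```python
from html.parser import HTMLParser

class MetaKeywordParser(HTMLParser):
    """Collects the content of <meta name="keywords"> tags."""

    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "keywords":
            # Split the comma-separated keyword list, as early indexers did,
            # trusting whatever terms the webmaster declared
            self.keywords += [k.strip() for k in a.get("content", "").split(",") if k.strip()]

page = '<html><head><meta name="keywords" content="seo, search, ranking"></head></html>'
parser = MetaKeywordParser()
parser.feed(page)
print(parser.keywords)  # ['seo', 'search', 'ranking']
```

Because the indexer simply trusts the declared keywords, nothing stops a webmaster from listing terms that never appear in the visible page, which is exactly the unreliability described above.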
By relying so heavily on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. Since the success and popularity of a search engine is determined by its ability to produce the most relevant results for any given search, allowing those results to be false would turn users toward other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.
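Keyword density, mentioned above, is simply the share of a page's words matching a given keyword. A small illustrative sketch (the sample text is invented, and real engines used more elaborate tokenization):

```python
import re
from collections import Counter

def keyword_density(text, keyword):
    """Fraction of words in `text` equal to `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

# A keyword-stuffed page scores absurdly high, which is what made
# density such an easy signal to manipulate
page_text = "cheap flights cheap hotels cheap cheap cheap deals"
print(round(keyword_density(page_text, "cheap"), 2))  # 0.62
```

A density of over 60% for one word is an obvious sign of stuffing; because the signal is computed entirely from on-page text, a webmaster could push it arbitrarily high.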
Graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[5] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the Web, following links from one page to another. In effect, this means that some links are stronger than others, as a page with higher PageRank is more likely to be reached by the random surfer.
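The random-surfer idea above can be sketched as a short power iteration (this is an illustrative toy, not Google's implementation; the three-page `web` graph and the 0.85 damping factor are assumptions, though 0.85 is the value commonly cited for PageRank):

```python
def pagerank(links, damping=0.85, iters=50):
    """Power-iteration PageRank for a dict {page: [outbound links]}."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start from a uniform distribution
    for _ in range(iters):
        # With probability (1 - damping) the surfer jumps to a random page
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                # Otherwise the page shares its rank equally among its links
                share = rank[p] / len(outs)
                for q in outs:
                    new[q] += damping * share
            else:
                # Dangling page: spread its rank evenly over all pages
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(web)
# "C" is linked from both "A" and "B", so the random surfer lands on it most often
```

Note how "C" ends up with the highest rank even though every page has the same amount of content: the score depends only on the link graph, which is what made it harder (though, as the next paragraph shows, not impossible) to game.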
Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[6] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links, and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[7]
By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. Google says it ranks sites using more than 200 different signals.[8] The three leading search engines, Google, Yahoo, and Microsoft's Bing, do not disclose the algorithms they use to rank pages. Notable SEO practitioners, such as Rand Fishkin, Barry Schwartz, Aaron Wall, and Jill Whalen, have studied different approaches to search engine optimization and have published their opinions in online forums and blogs.[9][10] SEO practitioners may also study patents held by various search engines to gain insight into the algorithms.[11]
In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged-in users.[12] In 2008, Bruce Clay said that "ranking is dead" because of personalized search: it would become meaningless to discuss how a website ranked, because its rank would potentially be different for each user and each search.[13]
In 2007, Google announced a campaign against paid links that transfer PageRank.[14] In 2009, Google disclosed that it had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links.[15]
Relationship with search engines

By 1997, search engines recognized that webmasters were making efforts to rank well in their results, and that some webmasters were even manipulating their rankings by stuffing pages with excessive or irrelevant keywords.
