Graduate students at [[Stanford University]], [[Larry Page]] and [[Sergey Brin]], developed "Backrub," a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, [[PageRank]], is a function of the quantity and strength of [[inbound link]]s.<ref name="lgscalehyptxt">{{cite web|author=Brin, Sergey and Page, Larry|url=http://www-db.stanford.edu/~backrub/google.html|title=The Anatomy of a Large-Scale Hypertextual Web Search Engine|publisher=Proceedings of the seventh international conference on World Wide Web|year=1998|pages=107–117|accessdate=2007-05-08}}</ref> PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, following links from one page to another. In effect, this means that some links are stronger than others, as a page with a higher PageRank is more likely to be reached by the random surfer.
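The random-surfer model described above can be sketched as a simple power iteration. This is an illustrative approximation, not Google's implementation; the damping factor of 0.85 matches the value commonly cited for the original paper, while the three-page example graph and the function name <code>pagerank</code> are assumptions for demonstration only.

```python
def pagerank(links, d=0.85, iterations=50):
    """Approximate PageRank by power iteration.

    links: dict mapping each page to the list of pages it links to.
    d: damping factor -- the probability the surfer follows a link
       rather than jumping to a random page.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with a uniform distribution
    for _ in range(iterations):
        # Every page gets the (1 - d) "random jump" share.
        new = {p: (1.0 - d) / n for p in pages}
        for p, outs in links.items():
            if outs:
                # A page divides its rank evenly among its outbound links.
                share = rank[p] / len(outs)
                for q in outs:
                    new[q] += d * share
            else:
                # Dangling page: the surfer jumps to a random page.
                for q in pages:
                    new[q] += d * rank[p] / n
        rank = new
    return rank

# Hypothetical three-page web: A links to B and C, B to C, C back to A.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
```

In this toy graph, page C ends up with the highest rank because it receives links from both A and B, illustrating how inbound links from well-ranked pages raise a page's own score.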
Page and Brin founded [[Google]] in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design.<ref name="bbc-1">{{cite news|author=Thompson, Bill|url=http://news.bbc.co.uk/1/hi/technology/3334531.stm|title=Is Google good for you?|publisher=[[BBC News]]|date=December 19, 2003|accessdate=2007-05-16}}</ref> Off-page factors (such as [[PageRank]] and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, [[meta tags]], headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the [[Inktomi]] search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or [[link farm]]s, involved the creation of thousands of sites for the sole purpose of [[spamdexing|link spamming]].<ref>{{cite web|author=Zoltan Gyongyi and Hector Garcia-Molina|url=http://infolab.stanford.edu/~zoltan/publications/gyongyi2005link.pdf| format = PDF | title=Link Spam Alliances|publisher=Proceedings of the 31st VLDB Conference, Trondheim, Norway|year=2005|accessdate=2007-05-09}}</ref>
By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. Google says it ranks sites using more than 200 different signals.<ref name="nyt0607">{{cite news|publisher=[[New York Times]]|accessdate=2007-06-06|url=http://www.nytimes.com/2007/06/03/business/yourmoney/03google.html|title=Google Keeps Tweaking Its Search Engine|date=June 3, 2007 | first=Saul | last=Hansell}}</ref> The leading search engines, [[Google]], [[Bing]], and [[Yahoo]], do not disclose the algorithms they use to rank pages. Notable SEO service providers, such as Rand Fishkin, have studied different approaches to search engine optimization and published their opinions in online forums and blogs.
In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged-in users.<ref>{{cite web |url=http://searchenginewatch.com/3563036 |title=Google Personalized Search Leaves Google Labs - Search Engine Watch (SEW) |publisher=searchenginewatch.com |accessdate=2009-09-05 }}</ref> In 2008, [[Bruce Clay]] said that "ranking is dead" because of [[personalized search]]: it would become meaningless to discuss how a website ranked, because its rank would potentially be different for each user and each search.<ref>{{cite web |url=http://www.webpronews.com/topnews/2008/11/17/seo-about-to-get-turned-on-its-ear |title=Will Personal Search Turn SEO On Its Ear? |publisher=www.webpronews.com |accessdate=2009-09-05 }}</ref>
== Relationship with search engines ==
By 1997, search engines recognized that [[webmaster]]s were making efforts to rank well in their search engines, and that some webmasters were even [[spamdexing|manipulating their rankings]] in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as [[Altavista]] and [[Infoseek]], adjusted their algorithms in an effort to prevent webmasters from manipulating rankings.<ref name="infoseeknyt">{{cite news|url=http://query.nytimes.com/gst/fullpage.html?res=940DE0DF123BF932A25752C1A960958260 |title=Desperately Seeking Surfers|author=Laurie J. Flynn|date=November 11, 1996|publisher=[[New York Times]]|accessdate=2007-05-09}}</ref>
Due to the high marketing value of targeted search results, there is potential for an adversarial relationship between search engines and SEO service providers. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web),<ref name="airweb">{{cite web|url=http://airweb.cse.lehigh.edu/|title=AIRWeb|publisher=Adversarial Information Retrieval on the Web, annual conference|accessdate=2007-05-09}}</ref> was created to discuss and minimize the damaging effects of aggressive web content providers.