Search Definitions

Learn what search terms mean and how to use them

Spider
Term for a computer program, operated by a search engine company, that visits pages on the internet, records information about the words and keywords on each page, and stores the data in an archive or index.

Search Engine
A website that organizes webpages by keyword and relevancy in order to deliver choices to a web user in response to his or her individual search query terms.

Query or Search
This process consists of entering keywords related to one's current interest into a given search engine's search box and retrieving the results. Every time one enters a new keyword, a new query or search occurs. One does not have to click on a result for it to count as a query or search.

Proximity Search
Term for a specialized search in which one looks for words found together, such as the specific phrase "cars for sale." A proximity search delivers pages where the words appear together while ignoring pages where the individual keywords are found far away from each other.
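As a rough sketch of the idea, the function below (a simplified illustration, not any real engine's algorithm) checks whether the query words appear within a few words of each other on a page; the page text and word gap are made-up examples:

```python
import re

def proximity_match(text, words, max_gap=3):
    """Return True if all words appear within max_gap words of each other.
    For simplicity this only considers the first occurrence of each word."""
    tokens = re.findall(r"[a-z']+", text.lower())
    positions = []
    for word in words:
        if word not in tokens:
            return False
        positions.append(tokens.index(word))
    return max(positions) - min(positions) <= max_gap

page = "Used cars for sale in your area"
print(proximity_match(page, ["cars", "sale"]))  # words are adjacent enough -> True
print(proximity_match(page, ["cars", "area"]))  # words are far apart -> False
```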

Relevancy
Subjective term referring to the relative quality of search engine results in response to a given user's query. A search engine that results in more searchers finding what they want for a given search has a greater relevancy.

Stemming
Term for a practice by search engines in which a root word and its variant endings return the same results. For example, due to stemming, a search for car or cars in many engines would return the same results. Likewise, for terms such as walk, walking, and walked, some search engines would return the same search results via stemming.
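A very crude sketch of how stemming can work is to strip common suffixes before indexing; real engines use more careful algorithms (such as Porter stemming), so treat the suffix list and length check below as illustrative assumptions only:

```python
def crude_stem(word):
    """Very crude stemmer: strip a few common English suffixes.
    A sketch for illustration, not a production stemming algorithm."""
    for suffix in ("ing", "ed", "s"):
        # Only strip when enough of the word remains to be a plausible root.
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

# "car" and "cars" reduce to the same stem, so a stemming engine
# would treat them as the same index entry.
print(crude_stem("cars"), crude_stem("car"))                            # car car
print(crude_stem("walking"), crude_stem("walked"), crude_stem("walk"))  # walk walk walk
```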

Case Sensitive
Term for a practice deployed by some search engines in the past that results in capitalized and lower case words yielding different results. For example a case sensitive search engine would return different results for CAR and car.

Stop Words
Common words like of, and, the, and but that are found so often in documents that search engines do not use them as keywords for ranking purposes. In short, the words are ignored.
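A minimal sketch of stop-word filtering might look like the following; the stop-word list here is a tiny made-up sample, as real engines maintain much longer lists:

```python
# A tiny illustrative stop-word list; real engines use far larger ones.
STOP_WORDS = {"of", "and", "the", "but", "a", "an", "in"}

def keywords(query):
    """Drop stop words, keeping only the terms an engine would rank on."""
    return [word for word in query.lower().split() if word not in STOP_WORDS]

print(keywords("the history of the automobile"))  # ['history', 'automobile']
```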

SERPS
Acronym for Search Engine Results Pages, frequently used by search engine optimizers to refer to the actual search engine results. For example, saying a given engine has updated SERPS means the engine now delivers different results than it did previously, whether due to a larger number of pages in its index, changes in relevancy ranking, or some other reason.

Googling
Term for the practice whereby someone types another person's name into the Google search engine to see just what pops up. Googling often involves a person searching for their own name, or for someone else's name, to see if anything "interesting" pops up.

Google Dance
Term frequently used by members of the Webmasterworld message board to refer to the period during which Google is in the middle of its monthly update. During this time, the same search query may return different results from time to time, and Google's three test datacenters, Google1, Google2, and Google3, can be used to see how the update may be affecting rankings.

Google Bombing
This term refers to the use of linking campaigns designed to make a given site appear under a given keyword, sometimes a negative one. If lots of sites link to a given site with text surrounding the link containing a negative keyword, the site will appear higher in Google's results for that negative term through no action of its own. For example, if lots of sites link to a given company and describe it as "Evil Company," the company could become associated with evil simply by its prominent placement in Google's search results.

Cloaking
Term referring to a practice used by search engine optimizers in which a different, keyword-rich page is delivered to search engine spiders while the regular page is delivered to actual human visitors. With cloaking, the IP address of a visitor to a site determines which version of a page is served: IPs belonging to search engines get mathematically optimized keyword pages, while surfers get regular pages. Using IP addresses to determine the content delivered is known as IP-based delivery, and it is used during cloaking. Sites found to be cloaking are often dropped from search engines entirely, as the practice is considered dishonest.
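The IP-based delivery behind cloaking can be sketched as a simple lookup; the spider IPs and page names below are hypothetical placeholders, and this illustrates the (dishonest) practice rather than endorsing it:

```python
# Hypothetical spider IP addresses for illustration only;
# real search engines crawl from many changing IP ranges.
KNOWN_SPIDER_IPS = {"192.0.2.10", "192.0.2.11"}

def page_for_visitor(visitor_ip):
    """IP-based delivery: spiders get the optimized keyword page,
    human visitors get the regular page."""
    if visitor_ip in KNOWN_SPIDER_IPS:
        return "keyword-stuffed-page.html"
    return "regular-page.html"

print(page_for_visitor("192.0.2.10"))   # spider IP -> keyword-stuffed-page.html
print(page_for_visitor("203.0.113.5"))  # human IP  -> regular-page.html
```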

Invisible Text
This refers to the use of text in a color that cannot be seen against the background of a site, or text too small for a human to read. Invisible text is used by some in an attempt to rank for additional keywords on search engines. Search engines generally lower the rankings of sites they find doing this.

Metasearch
Term for a search engine that gets its results from multiple search engines and combines them in some format that, in theory, takes the best results from each engine to present the best overall results.
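One way a metasearch engine could combine ranked lists is to score each result by its rank in every source engine; the scoring scheme and site names below are illustrative assumptions, not how any particular metasearch engine actually works:

```python
def merge_results(*result_lists):
    """Combine ranked result lists from several engines:
    each URL scores the sum of 1/rank across the engines that list it,
    so results ranked highly by multiple engines rise to the top."""
    scores = {}
    for results in result_lists:
        for rank, url in enumerate(results, start=1):
            scores[url] = scores.get(url, 0.0) + 1.0 / rank
    return sorted(scores, key=scores.get, reverse=True)

engine_a = ["site1.com", "site2.com", "site3.com"]
engine_b = ["site2.com", "site4.com", "site1.com"]
print(merge_results(engine_a, engine_b))
# -> ['site2.com', 'site1.com', 'site4.com', 'site3.com']
```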

Web Directory
Term for a category-based organizational scheme of the web, usually maintained by humans. Popular web directories include Yahoo, Zeal/Looksmart, and the ODP.

ODP
Acronym for the Open Directory Project, which is owned by AOL/Netscape and today consists of a volunteer-run directory of the web with a few AOL/Netscape staff overseeing the project. ODP data can be used freely with copyright credit given, and it is in wide use on search engines such as Google, AOL, and Netscape, which gives the directory its importance.

Zeal
Term for the noncommercial portions of the Looksmart directory, which are maintained by volunteer editors known as Zealots along with paid Looksmart editors. Zeal is the only way for noncommercial sites to get into Looksmart and consequently gain prominence on MSN Search, which uses Looksmart data.

Robots.txt
Term for a file placed on one's website or webserver that, if set up appropriately, will block almost all search engines from visiting the pages of the site or server, or can block selected search engines or rogue spiders. A plain text file named robots.txt, specifying the robots or search engine spiders to be excluded in the format explained at http://www.robotstxt.org/wc/exclusion-admin.html, is enough to limit search engines from further visiting a site. Robots.txt files are mainly used to limit how many pages search engines view when search engine visits, rather than real visitors, are taking the majority of a site's resources. If you want traffic from search engines, you should not upload a robots.txt file that excludes their spiders.
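As a sketch, a robots.txt file that blocks one rogue spider while leaving the rest of the site open to all other robots might look like the following; the spider name here is a made-up example, not a real robot:

```
# Block a single (hypothetical) rogue spider by its User-agent name
User-agent: BadBot
Disallow: /

# All other robots may crawl the whole site
User-agent: *
Disallow:
```

The file must be served from the root of the site for spiders to find it.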

Anchor Text
Term for the linked text description of a hyperlink to another website. For example, a link whose clickable text reads "a really bad site" has "a really bad site" as its anchor text. The anchor text heavily influences the keywords a site will rank well for on search engines like Google. If lots of sites link to a site with the same anchor text, the site might appear very prominently for the phrase "a really bad site."
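In HTML, the anchor text is simply the text between the opening and closing link tags; the URL below is a placeholder, not a real site:

```
<!-- The anchor text is "a really bad site"; example.com is a placeholder URL -->
<a href="http://www.example.com/">a really bad site</a>
```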

