Posted by Deepak kumar satapathy on 05:32
"A program that searches for and identifies items in a database that correspond to keywords or characters specified by the user, used especially for finding particular sites on the World Wide Web."
Basically, a search engine is a program or application that searches the documents in its database for the keywords you are looking for and takes you to the Search Engine Result Page (SERP), where it shows a list of documents in which those keywords were found. We can also define it as a web-based application or tool that enables you to extract information from the World Wide Web. Some of the popular search engines today are Google, Yahoo, MSN, and Bing. Every search engine has its own automated program or algorithm (called a robot, bot, or spider) that gathers data from different sites by visiting them.
As mentioned above, each spider/bot has its own algorithm, based on different and complex mathematical formulas, for generating search results. The search engine uses this algorithm to find the information relevant to a user's query and display it on the search engine result page, so the accuracy of a result page for your query depends on the engine's algorithm. While the spider looks through the data on every page, the search engine treats the page title, content, and keyword density as the key elements of a web page and uses them to decide where each result should be placed on the SERP. Because every search algorithm is different, you should not expect the results on one search engine to be the same as those on another engine's result page. To complicate matters further, and to keep the results accurate, search engine algorithms are closely guarded and constantly undergo changes and modification.
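To make the idea of ranking by page elements concrete, here is a toy sketch in Python. The weights and the scoring formula are invented purely for illustration; real search engine algorithms are proprietary and vastly more complex.

```python
# Toy illustration of weighting page elements (title match and keyword
# density) into a ranking score. The weights below are made up for
# demonstration only -- no real engine works exactly like this.

def keyword_density(text, keyword):
    """Fraction of words in the text that match the keyword."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

def score_page(title, content, keyword):
    """Combine a title match and keyword density into one score."""
    score = 0.0
    if keyword.lower() in title.lower():
        score += 10.0                       # title matches weigh heavily
    score += 100.0 * keyword_density(content, keyword)
    return score

pages = [
    ("Search Engine Basics", "a search engine crawls and indexes pages"),
    ("Gardening Tips", "plant seeds in spring and water them daily"),
]
# Sort pages by score for the query "search", best match first.
ranked = sorted(pages, key=lambda p: score_page(p[0], p[1], "search"),
                reverse=True)
```

Here the page whose title and body mention "search" naturally ends up first, which mirrors the intuition that title and keyword density influence a page's position on the SERP.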
For any search engine there are three processes: crawling, indexing, and SERP sorting. In the first process, crawling, the spider moves through web pages across the World Wide Web, visits the content on each page, and checks whether the page contains links to other pages it should visit; in this way it completes its crawl of the pages available on the web. Once the spider finishes crawling, indexing begins. Indexing means taking the crawled pages and sorting them into categories for storage in the search engine's database. While indexing, the spider gives preference to the key elements of a web page, such as the page title, meta tags, meta description, and keywords, to categorize it, as these play a vital role in SERP rankings. After indexing is finished, the spider stores the collected data in the search engine's database. Finally comes SERP sorting: when a user types a keyword or phrase into the search box, the engine matches the searched keywords against the data available in its database and shows a number of results for the query on the Search Engine Result Page. All of these complicated processes take only a fraction of a second to complete.
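The three stages above can be sketched in miniature. In this hypothetical example the "web" is just an in-memory dictionary of pages, each holding its text and outgoing links; a real crawler would fetch pages over HTTP and the index would live in a large database.

```python
# Minimal sketch of crawling, indexing, and result lookup.
# WEB is a stand-in for the real web: url -> (page text, links).
WEB = {
    "home.html":  ("welcome to my search engine tutorial", ["about.html"]),
    "about.html": ("this page explains how a crawler works", ["home.html"]),
}

def crawl(start):
    """Follow links from the start page, visiting each page once."""
    visited, queue = set(), [start]
    while queue:
        url = queue.pop(0)
        if url in visited or url not in WEB:
            continue
        visited.add(url)
        queue.extend(WEB[url][1])      # follow links found on the page
    return visited

def build_index(urls):
    """Index the crawled pages: map each word to the pages containing it."""
    index = {}
    for url in urls:
        for word in WEB[url][0].split():
            index.setdefault(word, set()).add(url)
    return index

def search(index, keyword):
    """Look the keyword up in the index and return matching pages."""
    return sorted(index.get(keyword.lower(), set()))

index = build_index(crawl("home.html"))
```

Calling `search(index, "crawler")` now returns the page whose text contains that word, which is the same match-against-the-database step the SERP sorting stage performs, minus any ranking.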
There are two views of every web page: the user view and the search engine view. This means that the way a web page appears to us is not the same as how it appears to a search engine algorithm. The version of a web page we see is designed with large photos, Flash animation, and other design elements that are not visible to a search engine. A search engine reads the plain-text form of a web page, that is, its simple, clean HTML, so it is better to keep your website simple and clean.
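The "search engine view" can be approximated with a few lines of Python using the standard library's `html.parser`. This is only a rough sketch of text extraction; real crawlers handle far more cases, but it shows how the visual layer falls away and only the text remains.

```python
# Extract the plain text a spider might "see" from an HTML page,
# skipping script and style blocks that carry no visible content.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, ignoring <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.parts.append(data.strip())

html_page = """<html><head><title>My Page</title>
<style>body { color: red; }</style></head>
<body><h1>Welcome</h1><p>Plain text is what the spider reads.</p>
<script>alert('hidden');</script></body></html>"""

parser = TextExtractor()
parser.feed(html_page)
text = " ".join(parser.parts)
```

After feeding in the page, `text` contains only the title, heading, and paragraph text, with none of the styling or scripting, much like the stripped-down version of the page a search engine indexes.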