Web databases, encyclopaedias

A web database is a broad term for managing data online. It is essentially a database application designed to be managed and accessed through the Internet. Website operators can manage this collection of data and present analytical results based on it in the web database application. Web databases first appeared in the 1990s and have been an asset for businesses, allowing them to collect large amounts of data from large numbers of customers.

Web databases store information in record and index structures. The record structure is visible to users, while the index structure is typically not available for users to browse. A number of web databases use automated link-maintenance software to handle updates to links: when a source of data moves to a new location on the Internet, the software changes the hyperlink's address to match the new destination. Other web databases rely on link-checking programs that must be run manually by a web database administrator.
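As a minimal sketch of such a manually run link checker, the program below requests each stored URL and flags records whose links no longer resolve, so an administrator can fix them. The database file, table, and column names are assumptions made for illustration, not taken from any particular product.

```python
# Minimal link-checker sketch: the database file, table, and column names
# are illustrative assumptions, not from any particular web database.
import sqlite3
import urllib.error
import urllib.request

def check_links(db_path: str) -> None:
    """Visit every stored hyperlink and flag records whose link is broken."""
    conn = sqlite3.connect(db_path)
    cur = conn.cursor()
    cur.execute("SELECT id, url FROM records")
    for record_id, url in cur.fetchall():
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                ok = response.status < 400
        except (urllib.error.URLError, ValueError):
            ok = False
        # Mark the record so an administrator can update the address later.
        cur.execute("UPDATE records SET link_ok = ? WHERE id = ?", (ok, record_id))
    conn.commit()
    conn.close()

if __name__ == "__main__":
    check_links("webdb.sqlite")  # hypothetical database file
```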

The most common web databases are MySQL, Oracle, Microsoft SQL Server, PostgreSQL, IBM DB2 and HSQLDB. They run on platforms such as Windows, Linux, Unix, and Solaris. The PHP scripting language (PHP: Hypertext Preprocessor) is commonly used to build web database applications; PHP runs on the server, not in the browser.
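Whatever the server-side language, the general pattern is the same: a script running on the server opens a connection to the database, runs a query, and renders the result as HTML for the browser. The sketch below uses Python with the built-in sqlite3 module purely as a stand-in for that pattern; the table and column names are assumptions.

```python
# Server-side sketch of the query-and-render pattern described above.
# Python's built-in sqlite3 stands in for any web database; the table
# and column names are illustrative assumptions.
import sqlite3

def render_customer_page(db_path: str) -> str:
    """Query the database on the server and return HTML for the browser."""
    conn = sqlite3.connect(db_path)
    cur = conn.cursor()
    cur.execute("SELECT name, email FROM customers ORDER BY name")
    rows = cur.fetchall()
    conn.close()
    items = "".join(f"<li>{name} &lt;{email}&gt;</li>" for name, email in rows)
    return f"<ul>{items}</ul>"

if __name__ == "__main__":
    print(render_customer_page("shop.sqlite"))  # hypothetical database file
```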



__ Online encyclopaedias __ Search engines and online encyclopaedias are the key to finding specific information on the vast expanse of the World Wide Web. Without sophisticated search engines, it would be virtually impossible to locate anything on the Web without knowing a specific URL.

__ How They Work __ Crawler-based search engines use automated software agents (called crawlers) that visit a Web site, read the information on the site itself, read the site's meta tags, and follow the links that the site connects to, indexing all linked Web sites as well. The crawler returns all that information to a central repository, where the data is indexed. The crawler periodically returns to the sites to check for any information that has changed; how often this happens is determined by the administrators of the search engine. Human-powered search engines rely on humans to submit information that is subsequently indexed and catalogued. Only information that is submitted is put into the index.
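A toy crawler illustrating that loop is sketched below: fetch a page, record its meta tags, queue the links it points to, and hand the collected data to an index. The seed URL and the simple in-memory index are assumptions made only for the sketch.

```python
# Toy crawler sketch of the fetch -> parse -> follow-links -> index loop
# described above. The seed URL and in-memory "index" are illustrative only.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class PageParser(HTMLParser):
    """Collects hyperlinks and meta tags from one HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self.links.append(attrs["href"])
        elif tag == "meta" and "name" in attrs and "content" in attrs:
            self.meta[attrs["name"]] = attrs["content"]

def crawl(seed_url: str, max_pages: int = 10) -> dict:
    """Return a simple {url: meta tags} index built by following links."""
    index, queue, seen = {}, [seed_url], set()
    while queue and len(index) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue
        parser = PageParser()
        parser.feed(html)
        index[url] = parser.meta                      # store page data in the index
        queue.extend(urljoin(url, link) for link in parser.links)
    return index

if __name__ == "__main__":
    print(crawl("https://example.com"))               # placeholder seed URL
```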

In both cases, when you query a search engine to locate information, you are actually searching through the index that the search engine has created; you are not searching the Web itself. These indices are giant databases of information that is collected, stored, and subsequently searched.
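A common way such an index is organised is as an inverted index that maps each word to the documents containing it, so a query becomes a fast lookup rather than a scan of the Web. The tiny in-memory version below is a sketch of that idea under those assumptions, not how any particular engine stores its data.

```python
# Sketch of an inverted index: each word maps to the set of documents that
# contain it, so answering a query is a lookup, not a crawl of the Web.
from collections import defaultdict

def build_index(documents: dict) -> dict:
    """documents: {url: page text} -> {word: set of urls containing it}."""
    index = defaultdict(set)
    for url, text in documents.items():
        for word in text.lower().split():
            index[word].add(url)
    return index

def search(index: dict, query: str) -> set:
    """Return the URLs containing every word of the query."""
    words = query.lower().split()
    if not words:
        return set()
    results = set(index.get(words[0], set()))
    for word in words[1:]:
        results &= index.get(word, set())
    return results

if __name__ == "__main__":
    docs = {  # toy documents standing in for crawled pages
        "https://example.com/db": "web database stores records and indexes",
        "https://example.com/php": "php runs on the server not the browser",
    }
    idx = build_index(docs)
    print(search(idx, "web database"))
```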

Online encyclopaedias such as Wikipedia employ an open, "wiki" editing model. Except for particularly vandalism-prone pages, every article may be edited anonymously or with a user account. Different language editions modify this policy: only registered users may create a new article in the English edition. No article is owned by its creator or any other editor, nor is it vetted by any recognized authority; rather, articles are agreed on by consensus. By default, any edit to an article becomes available immediately, prior to any review. This means that an article may contain errors, misguided contributions, advocacy, or even patent nonsense until another editor corrects the problem. Different language editions, each under separate administrative control, are free to modify this policy. For example, the German Wikipedia maintains a system of "stable versions" of articles, to allow a reader to see versions of articles that have passed certain reviews.
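A rough sketch of that editing model follows (an illustration only, not Wikipedia's actual implementation): each edit is appended to a revision history and published immediately, while a separate flag marks revisions that have passed a review, in the spirit of the "stable versions" system.

```python
# Rough sketch of the wiki editing model described above: every edit is
# published immediately as a new revision, and a flag marks revisions that
# have passed a review. Illustrative only, not Wikipedia's implementation.
from dataclasses import dataclass, field

@dataclass
class Revision:
    text: str
    editor: str            # may be an account name or an anonymous IP address
    reviewed: bool = False

@dataclass
class Article:
    title: str
    revisions: list = field(default_factory=list)

    def edit(self, text: str, editor: str) -> None:
        """Any editor's change becomes the visible version immediately."""
        self.revisions.append(Revision(text, editor))

    def current(self) -> Revision:
        return self.revisions[-1]

    def stable(self):
        """Latest revision that has passed a review, if any."""
        for rev in reversed(self.revisions):
            if rev.reviewed:
                return rev
        return None

if __name__ == "__main__":
    article = Article("Web database")
    article.edit("A web database is accessed over the Internet.", "203.0.113.7")
    article.edit("A web database is a database accessed over the Internet.", "editor42")
    article.revisions[-1].reviewed = True   # a reviewer marks this revision stable
    print(article.current().text)
    print(article.stable().text)
```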
