Encyclopedia Britannica 2012 Ultimate Edition Free Download For Pc


Microsoft Encarta was a digital multimedia encyclopedia published by Microsoft from 1993 to 2009. Originally available for sale on two to four CD-ROMs or a DVD, it was later also available on the World Wide Web via an annual subscription, although many articles could eventually be viewed free online with advertisements. By 2008, the complete English version, Encarta Premium, consisted of more than 62,000 articles, numerous photos and illustrations, music clips, videos, interactive content, timelines, maps, atlases, and homework tools. Microsoft published similar encyclopedias under the Encarta trademark in several other languages.

Localized versions contained content licensed from national sources, with more or less content than the full English version. For example, the Dutch version included content from a Dutch encyclopedia.


In March 2009, Microsoft announced it was discontinuing both the Encarta disc and online versions. The Encarta site was closed on October 31, 2009 in all countries except Japan, where it was closed on December 31, 2009. Microsoft continued to operate the Encarta online dictionary at dictionary.msn.com until 2011. History. After the successes of its earlier CD-ROM reference titles (1989 and 1992), Microsoft initiated Encarta, under the internal codename 'Gandalf', by purchasing non-exclusive rights to the Funk & Wagnalls Encyclopedia and incorporating it into its first edition in 1993. (Funk & Wagnalls continued to publish revised editions for several years independently of Encarta, but then ceased printing in the late 1990s.) Launch. The name Encarta was created for Microsoft by an advertising agency, and the product launched in 1993 at $395, although the price soon dropped to $99 and the software was often bundled into the price of a new computer purchase.

In the late 1990s, Microsoft purchased Collier's Encyclopedia and New Merit Scholar's Encyclopedia and added their content to Encarta. Thus the final Microsoft Encarta can be considered the successor of the Funk & Wagnalls, Collier's, and New Merit Scholar encyclopedias. None of these formerly successful encyclopedias remained in print for long after being merged into Encarta.

Microsoft introduced several regional versions of Encarta translated into languages other than English. One localized version, for example, was introduced in 1999 and suspended in 2002; it was somewhat smaller than the English edition, at 42,000 articles. Move to the web. In 2000, the full Encarta content became available on the World Wide Web to subscribers, with a subset available for free to anyone. Demise. In July 2006, Websters Multimedia, a subsidiary of London-based Websters International Publishers, took over maintenance of Encarta from Microsoft.

The last version was Encarta Premium 2009, released in August 2008. Microsoft announced in April 2009 that it would cease to sell all editions of the Encarta Premium software worldwide by June 2009, citing changes in the way people seek information and in the traditional encyclopedia and reference-material market as the key reasons behind the termination.

Updates for Encarta were offered until October 2009. Additionally, the MSN Encarta web sites were discontinued around October 31, 2009, with the exception of Encarta Japan, which was discontinued on December 31, 2009. Existing MSN Encarta Premium (part of MSN Premium) subscribers were refunded. Encarta's demise was widely attributed to competition from the free, user-generated Wikipedia, which, from small beginnings in 2001, grew to be larger than Encarta, aided by popular web search services. By the time of the Encarta closure announcement (April 2009), Encarta had about 62,000 articles, most behind a paywall, while the English Wikipedia had over 2.8 million articles.

By the time of Encarta's closure (December 2009), the English Wikipedia had over 3.1 million articles. Contents and features. Encarta's standard edition included approximately 50,000 articles, with additional images, videos, and sounds. The premium editions contained over 62,000 articles and other content, such as 25,000 pictures and illustrations, over 300 videos and animations, and an interactive atlas with 1.8 million locations. Its articles were integrated with multimedia content and could include editorially selected external links. Encarta's articles were generally shorter and more summarized than those in a traditional printed encyclopedia or its online counterpart. Like most multimedia encyclopedias, Encarta's articles tended to provide an overview of the subject rather than exhaustive coverage, and they could only be viewed one at a time. A sidebar could display alternative views, journals, or original materials relevant to the topic.

For example, one article featured annals of the computer industry since 1967. Encarta also supported closed captioning for the hearing impaired. A separate program called Encarta Research Organizer was included in early versions for gathering and organizing information and constructing a Word document-based report. Later versions included Encarta Researcher, a browser plugin for organizing information from Encarta articles and web pages into research projects. Content copied from Encarta was appended with a boilerplate message after the selection. The user interface allowed for viewing content with only images, videos, sounds, animations, 360-degree views, virtual tours, charts and tables, or only interactivities.
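
The copy-append behavior described above can be illustrated with a minimal sketch in Python. The five-word threshold comes from this article; the function name and the exact boilerplate string are hypothetical placeholders, not Encarta's actual wording.

def append_attribution(copied_text: str, article_title: str) -> str:
    """Append a boilerplate attribution when copied text exceeds five words.

    A rough sketch of the behavior described in the article; the message
    wording below is a placeholder, not Encarta's actual boilerplate.
    """
    words = copied_text.split()
    if len(words) > 5:
        return f"{copied_text}\n\n[Source: Microsoft Encarta, \"{article_title}\"]"
    return copied_text


# Example usage with a hypothetical article title.
print(append_attribution("The Internet is a global network of computers.", "Internet"))
print(append_attribution("Short phrase here", "Internet"))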

Encarta 2000 and later had 'Map Treks', which were tours of geographic features and concepts. Microsoft had a separate product known as Encarta Africana, an encyclopedia of black history and culture. It was integrated into the standard Encarta Reference suite starting with the 2001 version. Encarta 2002 and onward featured 3D virtual tours of ancient structures; 2D panoramic images of world wonders and major cities; and a virtual flight feature which allowed users to fly a virtual aircraft over a coarsely generated artificial landscape.

Version 2002 also introduced the ability to install the entire encyclopedia locally to the hard disk drive to prevent frequent swapping of discs. Encarta 2003 incorporated literature guides and book summaries, foreign-language translation dictionaries, a Homework Center, and a Chart Maker.

Encarta's Visual Browser, available since the 2004 version, presented a user with a list of related topics, making them more discoverable. A collection of 32 videos was also added later. Encarta 2005 introduced another program, Encarta Kids, aimed at making learning fun for children. Encarta also included a game called 'MindMaze' (accessible through Ctrl+Z) in which the player explores a castle by answering questions whose answers can be found in the encyclopedia's articles. There was also a 'Geography Quiz' and several other games and quizzes, some of them also in Encarta Kids. For years, Encarta came in three primary software editions, listed in ascending order of price and features: Standard, Premium, and Reference Library. Beginning with Encarta 2006, however, when Websters Multimedia took over its maintenance, Encarta became a feature of Microsoft Student.

Although it was possible to purchase the Encarta encyclopedia separately, Microsoft Student bundled Encarta Premium with a companion program and Learning Essentials, which provided document templates. In addition, the Reference Library edition was discontinued, absorbed into a newer, more comprehensive Premium package. Encarta's user interface was shared with Microsoft Student and was streamlined to reduce clutter, with only a search box that returned relevant results; however, it was no longer possible to simply browse all the encyclopedia articles alphabetically. World Atlas. The dynamic maps were generated with the same engine that powered Microsoft's mapping software.

The map was a virtual globe that one could freely rotate and magnify to any location, down to major streets for big cities. The globe had multiple surfaces displaying political boundaries, physical landmarks, historical maps, and statistical information. One could selectively display statistical values on the globe's surface or in tabular form, as well as different sized cities and various geological or man-made features.

The maps contained hyperlinks to related articles ('Map Trek') and also supported a 'Dynamic Sensor' that provided coordinates and other data for any point on the globe. Encarta also generated a visible-light moon atlas with the names of major features and hyperlinks, but coverage of other celestial bodies was limited to a small interactive map. In addition to database-generated maps, many other illustrative maps in Encarta ('Historical Maps') were drawn by artists. Some maps were more advanced: for example, the large African map for Africana could display information such as political boundaries or demographic distributions.

Encarta Dictionary. When Encarta was released as part of the 'Reference Suite' from 1998 to 2000, Microsoft bundled Microsoft Bookshelf with the other programs (Encarta Encyclopedia 98 Deluxe Edition, Encarta Desk Atlas, Encarta Virtual Globe 98, Encarta World English Dictionary, and Encarta Research Organizer). Bookshelf was discontinued in 2000, and in later editions (Encarta Suite 2000 and onward) it was replaced with a dedicated Encarta Dictionary, a superset of the printed Encarta World English Dictionary. There was some controversy over the decision, since the dictionary lacked the other reference books provided in Bookshelf that many had found useful, such as the Columbia Dictionary of Quotations (replaced with a quotations section in Encarta that linked to relevant articles and people) and an Internet directory (although many of the sites listed in offline directories no longer exist). Print versions of Encarta dictionaries have also been published, including:
• Encarta World English Dictionary (St. Martin's Press)
• Bloomsbury English Dictionary, second edition (Bloomsbury Publishing PLC)
• Microsoft Encarta Dictionary: The First Dictionary for the Internet Age (St. Martin's Paperbacks)
• Microsoft Encarta College Dictionary: The First Dictionary for the Internet Age (St. Martin's Press)
• Encarta Webster's College Dictionary of the English Language, second edition (Bloomsbury Publishing PLC)
• Encarta Webster's College Dictionary, second edition (Bloomsbury Publishing PLC)
Regional versions. While serving as editor-in-chief of a rival print encyclopedia, one critic faulted Encarta for differences in factual content between its national versions, accusing Microsoft of 'pandering to local prejudices' instead of presenting subjects objectively. A later article addressed the nature of writing encyclopedias for different regions. Technology. (Screenshot: Microsoft Student with Encarta Premium 2007.)

Before the emergence of the World Wide Web for information browsing, Microsoft recognized the importance of having an engine that supported hypertext, full-text search, and extensibility using software objects. The display and search software was created by a team of Microsoft developers in the late 1980s who designed it as a generalized engine for uses as ambitious as a multimedia encyclopedia. Encarta was able to use various Microsoft technologies because it was extensible with add-in components for displaying particular types of multimedia information; for example, a snap-in map engine was adapted from Microsoft's mapping software.
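
The article does not describe the internals of Encarta's engine, but the full-text search it mentions is commonly built on an inverted index. The following is a minimal, generic sketch of that idea in Python; it is illustrative only and not a reconstruction of Encarta's actual software.

from collections import defaultdict

def build_index(articles: dict[str, str]) -> dict[str, set[str]]:
    """Build a simple inverted index mapping each word to the articles containing it."""
    index = defaultdict(set)
    for title, text in articles.items():
        for word in text.lower().split():
            index[word.strip(".,;:()")].add(title)
    return index

def search(index: dict[str, set[str]], query: str) -> set[str]:
    """Return titles of articles containing every word in the query."""
    words = [w.lower() for w in query.split()]
    results = [index.get(w, set()) for w in words]
    return set.intersection(*results) if results else set()

# Example usage with two toy articles.
articles = {
    "Internet": "a global network of interconnected computers",
    "Encarta": "a multimedia encyclopedia published on CD-ROM and DVD",
}
index = build_index(articles)
print(search(index, "network computers"))  # {'Internet'}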

Encarta used database technologies to generate much of its multimedia content; for example, it generated each zoomable map on demand from a global database. When a user copied more than five words from Encarta using the Windows copy function, Encarta automatically appended a message after the pasted text. User editing. Early in 2005, Gary Alt announced that the online Encarta had started to allow users to suggest changes to existing articles. Encarta's content was also accessible through a conversational interface, the MSN instant-messaging bot 'Encarta Instant Answers'. The bot could answer many encyclopedia-related questions directly in the IM window.

It used short sentences from the Encarta website, and sometimes displayed full articles in a browser pane on the right. It could also solve simple mathematical and advanced algebra problems. This service was also available in German, Spanish, French, and Japanese.

Updates. Each year, Microsoft published a new version of Encarta. However, despite the inclusion of news-related and some supplementary articles, Encarta's contents were not changed substantially in its later years. Besides the yearly update, the installed offline copy could be updated over the Internet free of charge for a certain period, depending on the edition. Some articles (usually about 2,000) were updated to reflect important changes or events. When the update period expired, an advertisement prompting the user to upgrade to the new version was displayed occasionally.

Reception. The editors of PC Gamer nominated Microsoft Encarta '95 for their 1994 'Best Educational Product' award, although it lost to another CD-ROM title.


INTERNET A worldwide telecommunications network of business, government, and personal computers. The Internet is a network of computers linking the United States with the rest of the world. Originally developed as a way for U.S. research scientists to communicate with each other, by the mid-1990s the Internet had become a popular form of telecommunication for personal computer users. The dramatic growth in the number of persons using the network heralded the most important change in telecommunications since the introduction of television in the late 1940s. However, the sudden popularity of a new, unregulated communications technology raised many issues for U.S. law.

The Internet, popularly called the Net, was created in 1969 for the U.S. Defense Department. Funding from the Advanced Research Projects Agency (ARPA) allowed researchers to experiment with methods for computers to communicate with each other. Their creation, the Advanced Research Projects Agency Network (ARPANET), originally linked only four separate computer sites at U.S.

universities and research institutes, where it was used primarily by scientists. In the early 1970s, other countries began to join ARPANET, and within a decade it was widely accessible to researchers, administrators, and students throughout the world. The National Science Foundation (NSF) assumed responsibility for linking these users of ARPANET, which was dismantled in 1990.

The NSF Network (NSFNET) now serves as the technical backbone for all Internet communications in the United States. The Internet grew at a fast pace in the 1990s as the general population discovered the power of the new medium. A significant portion of the Net's content is written text, in the form of both electronic mail (e-mail) and articles posted in an electronic discussion forum known as the Usenet news groups.

In the mid-1990s the appearance of the World Wide Web made the Internet even more popular. The World Wide Web is a multimedia interface that allows for the transmission of text, pictures, audio, and video together in documents known as web pages, which commonly resemble pages in a magazine. Together, these various elements have made the Internet a medium for communication and for the retrieval of information on virtually any topic. The sudden growth of the Internet caught the legal system unprepared.

Before 1996, Congress had passed little legislation on this form of telecommunication. In 1986, Congress passed the Electronic Communications Privacy Act (ECPA) (18 U.S.C.A.

§ 2701 et seq. [1996]), which made it illegal to read private e-mail. The ECPA extended most of the protection already granted to conventional mail to electronic mail. Just as the post office may not read private letters, neither may the providers of private bulletin boards, on-line services, or Internet access. However, law enforcement agencies can subpoena e-mail in a criminal investigation.

The ECPA also permits employers to read their workers' e-mail. This provision was intended to protect companies against industrial spying, but it has generated lawsuits from employees who objected to the invasion of their privacy. Federal courts, however, have allowed employers to secretly monitor an employee's e-mail on a company-owned computer system, concluding that employees have no reasonable expectation of privacy when they use company e-mail. Should the Internet Be Policed? Few observers could have predicted the fuss that the Internet began to generate in political and legal circles in the mid-1990s. After all, the global computer network linking 160 countries was hyped relentlessly in the media in the early 1990s. It spawned a multimillion-dollar industry in Internet services and a publishing empire devoted to the online experience—not to mention Hollywood movies, newspaper columns, and new jargon.

But the honeymoon did not last. Like other communications media before it, the Internet provoked controversy about what was actually sent across it. Federal and state lawmakers proposed crackdowns on its content. Prosecutors took aim at its users. Civil liberties groups fought back.

As the various factions engaged in a tug-of-war over the future of this sprawling medium, the debate became a question of freedom or control: should the Internet be left alone as a marketplace of ideas, or should it be regulated, policed, and ultimately 'cleaned up'? Although this question became heated during the early to mid-1990s, it has remained a debated issue into the early 2000s. More than three decades after Defense Department contractors put it up, the network remains free from official control. This system has no central governing authority for a very good reason: the general public was never intended to use it. Its designers in the late 1960s were scientists. Several years later, academics and students around the world got access to it. In the 1990s, millions of people in U.S.

businesses and homes signed on. Before the public signed on, its predecessors had long since developed a kind of Internet culture: essentially, a freewheeling, anything-goes setting. The opening of the Internet to everyone from citizens to corporations necessarily ruptured this formerly closed society, and conflicts appeared. Speech rights quickly became a hot topic of debate.

The Internet is a communications medium, and people have raised objections to speech online just as they have to speech in the real world. The Internet allows for a variety of media—text, pictures, movies, and sound—and pornography is abundantly accessible online in all these forms. It is commonly 'posted' as coded information to a part of the Internet called Usenet, a public issues forum that is used primarily for discussions. With over 10,000 topic areas, called news groups, Usenet literally caters to the world's panoply of interests and tastes. Certain news groups are devoted entirely to pornography.

As the speed of the Internet increased dramatically with the development of broadband access in the late 1990s and early 2000s, not only did more of this type of material become available, but users were also able to access it in greater quantity.


Criminal activity on the Internet generally falls into the category of computer crime. It includes so-called hacking, or breaking into computer systems, stealing account passwords and credit-card numbers, and illegally copying intellectual property. Because personal computers can easily copy information, including everything from software to photographs and books, and the information can be sent anywhere in the world quickly, it has become much more difficult for copyright owners to protect their property.

Public and legislative attention, especially in the mid-to-late 1990s, focused on Internet content, specifically sexually explicit material. The distribution of pornography became a major concern in the 1990s, as private individuals and businesses found an unregulated means of giving away or selling pornographic images. As hard-core and child pornography proliferated, Congress sought to impose restrictions on obscene and indecent content on the Internet. In 1996, Congress responded to concerns that indecent and obscene materials were freely distributed on the Internet by passing the Communications Decency Act (CDA) as part of the Telecommunications Act of 1996, Pub. L. No. 104-104, 110 Stat. 56. This law forbade the knowing dissemination of obscene and indecent material to persons under the age of 18 through computer networks or other telecommunications media.

The act included penalties for violations of up to five years' imprisonment and fines of up to $250,000. The American Civil Liberties Union (ACLU) and online Internet services immediately challenged the CDA as an unconstitutional restriction on freedom of speech.

A special three-judge federal panel in Pennsylvania agreed with these groups, concluding that the law was overbroad because it could limit the speech of adults in its attempt to protect children (American Civil Liberties Union v. Reno). The government appealed to the U.S.

Supreme Court, but the Court affirmed the three-judge panel on a 7-2 vote, finding that the act violated the First Amendment. Reno v. American Civil Liberties Union, 521 U.S. 844 (1997). Though the Court recognized the 'legitimacy and importance of the congressional goal of protecting children from the harmful materials' on the Internet, it ruled that the CDA abridged freedom of speech and therefore was unconstitutional. Justice John Paul Stevens, writing for the majority, acknowledged that the sexually explicit materials on the Internet range from the 'modestly titillating to the hardest core.' He concluded, however, that although this material is widely available, 'users seldom encounter such content accidentally.' In his view, a child would have to have 'some sophistication and some ability to read to retrieve material and thereby to use the Internet unattended.'

He also pointed out that systems for personal computers have been developed to help parents limit access to objectionable material on the Internet and that many commercial web sites have age-verification systems in place. Turning to the CDA, Stevens found that previous decisions of the Court that limited free speech out of concern for the protection of children were inapplicable. The CDA differed from the laws and orders upheld in the previous cases in significant ways. The CDA did not allow parents to consent to their children's use of restricted materials, and it was not limited to commercial transactions.

In addition, the CDA failed to provide a definition of 'indecent,' and its broad prohibitions were not limited to particular times of the day. Finally, the act's restrictions could not be analyzed as forms of time, place, and manner regulations because the act was a content-based blanket restriction on speech. Accordingly, it could not survive the First Amendment challenge. In 1998, Congress responded to the decision by enacting the Child Online Protection Act (COPA), Pub.

L. No. 105-277, 112 Stat. This act was narrower in its application than the CDA, applying only to commercial transactions and limited to content deemed to be 'harmful to minors.' The new statute was the subject of immediate litigation. A federal district court placed a preliminary injunction on the application of the statute, and this decision was affirmed by the U.S.

Court of Appeals for the Third Circuit. American Civil Liberties Union v. Reno, 217 F.3d 162 (3d Cir. 2000). Although the U.S. Supreme Court vacated that decision, it did so on procedural grounds rather than on the merits of the challenge.

Ashcroft v. American Civil Liberties Union, 535 U.S. 564 (2002). On remand, the Third Circuit again affirmed the injunction, holding that the statute likely violated the First Amendment. American Civil Liberties Union v. Ashcroft, 322 F.3d 240 (3d Cir. 2003). The questions raised in Reno and subsequent decisions have also been raised in the debate over the use of Internet filters.

Many schools and libraries, both public and private, have installed filters that prevent users from viewing vulgar, obscene, pornographic, or other types of materials deemed unsuitable by the institution installing the software. The ACLU, library associations, and other organizations that promote greater access to information have objected to the use of these filters, especially in public libraries. The first reported case involving libraries and Internet filters was Mainstream Loudoun v. Board of Trustees of the Loudoun County Library, 24 F. Supp. 2d 552 (E.D. Va. 1998). The Virginia federal court judge in that case ruled that the use of screening software by a library was unconstitutional, as it restricted adults to materials that the software found suitable for children. Courts have generally been split on this issue, and several have found that the use of these filters in public schools is allowed under the First Amendment.

Pornography is not the only concern of lawmakers and courts regarding potential crime on the Internet. The Internet has produced forms of terrorism that threaten the security of business, government, and private computers. Computer 'hackers' have defeated computer network 'firewalls' and have vandalized or stolen electronic data.

Another form of terrorism is the propagation and distribution over the Internet of computer viruses that can corrupt computer software, hardware, and data files. Many companies now produce virus-checking software that seeks to screen and disable viruses when they arrive in the form of an e-mail or e-mail file attachment.

However, computer hackers are constantly inventing new viruses, giving each virus a window of time in which to wreak havoc before the virus checkers are updated. Moreover, the fear of viruses has led to hoaxes and panics. One of the most infamous viruses, dubbed the Melissa virus, was created in 1999 by David Smith of New Jersey. It was sent through a Usenet newsgroup as an attachment to a message that purported to provide passwords for sex-related web sites.

When the attachment was opened, it infected the user's computer. The program found the user's address book and sent a mass message with attachments containing the virus. Within a few days, it had infected computers across the globe and forced the shutdown of more than 300 computer networks from the heavy loads of e-mail that Melissa generated. The Melissa virus represented one of the first instances where law enforcement personnel were able to take advantage of new technologies to track the creator of the virus.

On April 1, 1999, about a week after the virus first appeared on the Usenet newsgroups, police arrested Smith. He pled guilty to one count of computer fraud and abuse. He was sentenced to 20 months in prison and was fined $5,000. Another area of legal concern is the issue of libel. In tort law, libel and slander occur when the communication of false information about a person injures the person's good name or reputation. Where the traditional media are concerned, it is well settled that libel suits provide both a means of redress for injury and a punitive corrective against sloppiness and malice.

Regarding communication on the Internet, however, there is little case law, especially on the key issue of liability. In suits against newspapers, courts traditionally held publishers liable, along with their reporters, because publishers were presumed to have reviewed the libelous material prior to publication. Because of this legal standard, publishers and editors are generally careful to review anything that they publish.

However, the Internet is not a body of material that is carefully reviewed by a publisher, but an unrestricted flood of information. If a libelous or defamatory statement is posted on the Internet, which is owned by no one, the law is uncertain as to whether anyone other than the author can be held liable. Some courts have held that online service providers, companies that connect their subscribers to the Internet, should be held liable if they allow their users to post libelous statements on their sites. An online provider is thus viewed like a traditional publisher. Other courts have rejected the publisher analogy and instead have compared Internet service providers to bookstores.

Like bookstores, providers are distributors of information and cannot reasonably be expected to review everything that they sell. Libel law gives greater protection to bookstores because of this theory (Smith v. California, 361 U.S. 147 [1959]), and some courts have applied it to online service providers.

Trademark infringement on the Internet has also led to controversy and legal disputes. One of the biggest concerns for registered trademark and service mark holders is protection of the mark on the Internet.

As Internet participants establish sites on the Web, they must create domain names, which are names that designate the location of the web site. Besides providing a name to associate with the person or business that created the site, a domain name makes it easy for Internet users to find a particular home page or web site.

As individuals and businesses devised domain names in this medium, especially during the mid-to-late 1990s, they found that the names they created were similar to, or replicas of, registered trademarks and service marks. Several courts have considered complaints that use of a domain name violated the rights of a trademark or service mark holder, and early decisions did not favor the mark holders' rights. In 1999, Congress enacted the Anticybersquatting Consumer Protection Act, Pub. L. No. 106-113, 113 Stat.

The act strengthened the rights of trademark holders by giving these owners a cause of action against so-called 'cybersquatters' or 'cyberpirates,' individuals who register a third-party's trademark as a domain name for the purpose of selling it back to the owner for a profit. Prior to the enactment of this law, an individual could register a domain name using the trademark or service mark of a company, and the company would have to use a different domain name or pay the creator a sum of money for the right to use the name.

Thus, for example, an individual could register the name www.ibm.com, which most web users would have associated with International Business Machines (IBM), the universally recognized business. Because another individual used this domain name, IBM could not create a Web site using www.ibm.com without paying the cyber-squatter a fee for its use. The 1999 legislation eradicated this problem. During the 1990s, a number of companies were formed that operated completely on the Internet.

Due to the overwhelming success of these companies, the media dubbed this phenomenon the 'dot-com bubble.' The success of these companies was relatively short-lived, as the 'bubble' burst in early 2000. Many of these Internet companies went out of business, while those that remained had to reconsider new business strategies. Notwithstanding these setbacks, the Internet itself has continued to develop and evolve. During the 1990s, the vast majority of Internet users relied upon telephone systems to log on to the Internet. This trend has changed drastically in recent years, as many users have subscribed to services that provide broadband access through such means as cable lines, satellite feeds, and other types of high-speed networks. These new methods for connecting to the Internet allow users to retrieve information at a much faster rate of speed.

They will likely continue to change the types of content that are available through this means of telecommunications.


The Internet is the world's largest computer network. It is a global information infrastructure comprising millions of computers organized into hundreds of thousands of smaller, local networks. The term “information superhighway” is sometimes used to describe the function that the Internet provides: an international, high-speed telecommunications network that offers open access to the general public. The Internet provides a variety of services, including electronic mail (e-mail), the World Wide Web (WWW), intranets, File Transfer Protocol (FTP), Telnet (for remote login to host computers), and various file-location services. HISTORY OF THE INTERNET The idea for the Internet began in the early 1960s as a military network developed by the U.S. Department of Defense's Advanced Research Projects Agency (DARPA).

At first, it was a small network called ARPANET, which promoted the sharing of supercomputers amongst military researchers in the United States. A few years later, DARPA began to sponsor research into a cooperative network of academic time-sharing computers. By 1969, the first ARPANET hosts were constructed at the Stanford Research Institute, the University of California, Los Angeles (UCLA), the University of California, Santa Barbara, and the University of Utah.

A second factor in growth was the National Science Foundation's NSFNET, built in 1986 for the purpose of connecting university computer science departments. NSFNET combined with ARPANET to form a huge backbone of network hosts. This backbone became what we now think of as the Internet (although the term “Internet” was used as early as 1982).

The explosive growth of the Internet came with major problems, particularly related to privacy and security in the digital world. Computer crime and malicious destruction became a paramount concern. One dramatic incident occurred in 1988 when a program called the “Morris worm” temporarily disabled approximately 10 percent of all Internet hosts across the country.

The Computer Emergency Response Team (CERT) was formed in 1988 to address such security concerns. In 1990, as the number of hosts approached 300,000, the ARPANET was decommissioned, leaving the Internet with NSFNET as its sole backbone. The 1990s saw the commercialization of the Internet, made possible when the NSF lifted its restriction on commercial use and cleared the way for the age of electronic commerce. Electronic commerce was further enhanced by new applications being introduced to the Internet. For example, programmers at the University of Minnesota developed the first point-and-click method of navigating Internet files in 1991.

This program, which was freely distributed on the Internet, was called Gopher, and it gave rise to similar applications such as Archie and Veronica. An even more influential development, also begun in the early 1990s, was Tim Berners-Lee's work on the World Wide Web, in which hypertext-formatted pages of words, pictures, and sounds promised to become an advertiser's dream come true. At the same time, Marc Andreessen and colleagues at the National Center for Supercomputing Applications (NCSA), located on the campus of the University of Illinois at Urbana-Champaign, were developing a graphical browser for the World Wide Web called Mosaic (released in 1993), which would eventually evolve into Netscape.

By 1995, the Internet had become so commercialized that most access was handled through Internet service providers (ISPs), such as America Online and Netcom. At that time, NSF relinquished control of the Internet, which was by then dominated by Web traffic. Partly motivated by the increased commercial interest in the Internet, Sun Microsystems released an Internet programming language called Java, which promised to radically alter the way applications and information could be retrieved, displayed, and used over the Internet. By 1996, the Internet's twenty-fifth anniversary, there were 40 million Internet users; by 2002, that number had increased to 531 million, and by 2006 the number of Web users was roughly 750 million. Internet-based electronic commerce has reached major proportions as well, totaling roughly $140 million in revenue in the United States alone in 2007.

This number continues to rise steadily throughout the 2000s. BANDWIDTH Bandwidth is the capacity of a particular pathway to transmit information for online purposes. It is bandwidth that controls how fast Web sites download. In analog settings (such as dial-up), bandwidth is measured by frequency, the difference between the highest and lowest frequencies, expressed in Hertz. Digital lines measure bandwidth in bits/bytes per second (the amount of information transferred every second).

Companies often determine and set the amount of bandwidth allowed for certain activities, a practice called bandwidth allocation. INTERNET CONNECTIONS There are many types of Internet connections, which have changed in sophistication and speed throughout the Internet's history.

The first kind is the analog connection, or dial-up, one of the cheapest and slowest ways to connect. The computer dials a phone number to access the network, and the modem converts data between digital and analog formats as required. This analog format is the slowest connection, and the one most subject to quality issues. ISDN, or Integrated Services Digital Network, is the international standard for digital connections over ordinary phone lines.

B-ISDN (broadband ISDN) is a more recent format for other phone connections, such as fiber optics. DSL is an always-on connection that does not tie up the phone line the way an analog connection does. There are two main types of DSL: ADSL, which is used most commonly in America, and SDSL, which can transmit a larger amount of information. Others receive Internet access through cable, a broadband connection that operates through TV lines. Certain TV channels are used to send and receive Internet information, and since these coaxial cable connections can handle a much higher rate of data than phone lines, cable Internet service tends to be faster.

Wireless Internet is also becoming popular, connecting computers to the Internet through radio-wave transmissions. This requires a wireless hub or router that translates information into radio waves, and the connection can be accessed from anywhere within the radius of the broadcast. E-MAIL Electronic mail, or e-mail, is the most widely used function on the Internet today.

Millions of messages are passed via Internet lines every day throughout the world. Compared to postal service, overnight delivery companies, and telephone conversations, e-mail via the Internet is extremely cost-effective and fast.

E-mail facilities include sending and receiving messages, the ability to broadcast messages to several recipients at once, storing and organizing messages, forwarding messages to other interested parties, maintaining address books of e-mail partners, and even transmitting files (called “attachments”) along with messages. Internet e-mail messages are sent to an e-mail address. The structure of an e-mail address is as follows: PersonalID@DomainName. The personal identifier could be a person's name or some other way to uniquely identify an individual.

The domain is an indicator of the location of that individual, and appears to the right of the “at” (@) sign. A domain name is the unique name of a collection of computers that are connected to the Internet, usually owned by or operated on behalf of a single organization (company, school, or agency) that owns the domain name. The domain name consists of two or more sections, each separated by a period.

From right to left, the portions of the domain name go from more general to more specific in terms of location. In the United States, the rightmost portion of a domain is typically one of the following:
• com —indicating a commercial enterprise
• edu —indicating an educational institution
• gov —indicating a governmental body
• mil —indicating a military installation
• net —indicating a network resource
• org —indicating a nonprofit organization
In non-U.S. countries, the rightmost portion of a domain name is an indicator of the geographic origin of the domain. For example, Canadian e-mail addresses end with the abbreviation “ca.”
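
As a rough illustration of the PersonalID@DomainName structure described above, the following Python sketch splits an address into its personal identifier and domain, then reads the domain's sections from right to left. The address and the category table are illustrative examples, not data from the article.

# Categories for common U.S. top-level domains, as listed above.
TLD_CATEGORIES = {
    "com": "commercial enterprise",
    "edu": "educational institution",
    "gov": "governmental body",
    "mil": "military installation",
    "net": "network resource",
    "org": "nonprofit organization",
}

def describe_address(address: str) -> None:
    """Split an e-mail address into personal ID and domain, then inspect the domain."""
    personal_id, domain = address.split("@", 1)
    sections = domain.split(".")
    rightmost = sections[-1]
    print(f"personal identifier: {personal_id}")
    print(f"domain name:         {domain}")
    print(f"domain sections:     {sections}")
    print(f"rightmost portion:   {rightmost} "
          f"({TLD_CATEGORIES.get(rightmost, 'geographic or other designation')})")

# Hypothetical example address.
describe_address("jane.doe@law.example.edu")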

The World Wide Web (WWW) is a system and a set of standards for providing a graphic user interface (GUI) to Internet communications. The Web is the single most important factor in the popularity of the Internet, because it makes the technology easy to use and gives attractive and entertaining presentation to users. Graphics, text, audio, animation, and video can be combined on Web pages to create dynamic and highly interactive access to information. In addition, Web pages can be connected to each other via hyperlinks.

These hyperlinks are visible to the user as highlighted text, underlined text, or images that the user can click to access another Web page. Web pages are available to users via Web browsers, such as Mozilla/Firefox, Apple's Safari, Opera, or Microsoft's Internet Explorer. Browsers are programs that run on the user's computer and provide the interface that displays the graphics, text, and hyperlinks to the user. Browsers recognize and interpret the programming language called Hypertext Markup Language (HTML).

HTML includes the ability to format and display text; size and position graphics images for display; invoke and present animation or video clips; and run small programs, called applets, for more complex interactive operations. Browsers also implement the hyperlinks and allow users to connect to any Web page they want.
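
To make the hyperlink mechanism concrete, here is a minimal sketch using Python's standard html.parser module to pull link targets out of a small HTML fragment; the page content is an invented example, and the snippet is only a toy illustration of the kind of parsing a browser performs.

from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href targets of <a> (hyperlink) tags as a page is parsed."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# A tiny invented Web page: formatted text, an image, and two hyperlinks.
page = """
<html><body>
  <h1>Internet</h1>
  <p>A <b>global</b> network of computers.</p>
  <img src="map.gif" alt="network map">
  <a href="http://www.example.com/history.html">History</a>
  <a href="http://www.example.org/glossary.html">Glossary</a>
</body></html>
"""

extractor = LinkExtractor()
extractor.feed(page)
print(extractor.links)  # ['http://www.example.com/history.html', 'http://www.example.org/glossary.html']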

Search Engines. Sometimes a user knows what information she needs, but does not know the precise Web page that she wants to view.

A subject-oriented search can be accomplished with the aid of search engines, which are tools that can locate Web pages based on a search criterion established by the user. By far, Google is the most commonly used search engine. The ease with which users can publish their own information using the World Wide Web has created an opportunity for everyone to be a publisher. One outcome is that every topic, hobby, niche, and fetish now has a thriving community of like-minded people. Publishing information on the Web became even easier with the advent of Web logs, or “blogs,” online diaries that opened the floodgates to an even greater level of individual participation in information sharing and community. UNIFORM RESOURCE LOCATORS (URL) A Uniform Resource Locator (URL) is a networked extension of the standard filename concept.

It allows the user to point to a file in a directory on any machine on the Internet. In addition to files, URLs can point to queries, documents stored deep within databases, and many other entities. Primarily, however, URLs are used to identify and locate Web pages. A URL is composed of three parts: Protocol. This is the first part of the address. In a Web address, the letters “http” stand for Hypertext Transfer Protocol, signifying how this request should be dealt with.

The protocol information is followed by a colon. URL protocols usually take one of the following types:
• http —for accessing a Web page
• ftp —for transferring a file via FTP
• file —for locating a file on the client's own machine
• gopher —for locating a Gopher server
• mailto —for submitting e-mail across the Internet
• news —for locating a Usenet newsgroup
Resource Name. This is the name of the server/machine to which the query should be directed.

For an “http” request, the colon is followed by two forward slashes, and this indicates that the request should be sent to a machine. Path and File Name.

The rest of a URL specifies the particular computer name, any directory-tree information, and a file name, with the latter two pieces of information being optional for Web pages. The computer name is the domain name or a variation on it (on the Web, the domain is most commonly preceded by the machine prefix “www” to identify the computer that is functioning as the organization's Web server, as opposed to its e-mail server, etc.). If a particular file isn't located at the top level of the directory structure (as organized and defined by whoever sets up the Web site), there may be one or more strings of text separated by slashes, representing the directory hierarchy. Finally, the last string of text to the right of the rightmost slash is the individual file name; on the Web, this often ends with the extension “htm” or “html” to signify that it is an HTML document. When no directory path or file name is specified (e.g., a URL consisting of just a protocol and domain name), the browser is typically pointed automatically to an unnamed (at least from the user's perspective) default or index page, which often constitutes an organization's home or start page. Thus, a full URL combines the protocol, the machine and domain name, a directory path, and a file name. Lastly, a Web URL might also contain, somewhere to the right of the domain name, a long string of characters that does not correspond to a traditional directory path or file name, but rather is a set of commands or instructions to a server program or database application.
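
The anatomy just described (protocol, machine and domain name, directory path, file name, and an optional string of instructions) can be illustrated with Python's standard urllib.parse module; the URL below is an invented example built on a reserved example domain, not one taken from the article.

from urllib.parse import urlparse, parse_qs

# An invented URL showing protocol, www prefix, domain, directory path,
# file name, and a query string of instructions to a server program.
url = "http://www.example.com/reference/articles/internet.html?topic=history&page=2"

parts = urlparse(url)
print("protocol:       ", parts.scheme)                   # http
print("machine/domain: ", parts.netloc)                   # www.example.com
print("path:           ", parts.path)                     # /reference/articles/internet.html
print("file name:      ", parts.path.rsplit("/", 1)[-1])  # internet.html
print("query string:   ", parse_qs(parts.query))          # {'topic': ['history'], 'page': ['2']}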

The syntax of these URLs depends on the underlying software program being used. Sometimes these can function as reusable URLs (e.g., they can be bookmarked and retrieved repeatedly), but other times they must be generated by the site's server at the time of use, and thus cannot be retrieved directly from a bookmark or by typing them in manually. Commercial abuse of e-mail continues to be problematic as companies attempt to e-mail millions of online users in bulk. This technique is called “spam” (so named after a skit by the comedy troupe Monty Python that involved the continuous repetition of the word). Online users are deluged with a massive amount of unwanted e-mail selling a wide array of products and services. Spam has become a network-wide problem as it affects information transfer time and overall network load. Several organizations and governments are attempting to solve the spam problem through legislation or regulation.

Computer viruses spread by e-mail have also grown as the Internet has grown. The widespread use of e-mail and the growing number of new, uninformed computer users have made it very easy to spread malicious viruses across the network.

Security issues for both personal computers and for network servers will continue to be a crucial aspect of the ongoing development of the Internet and World Wide Web. INTRANET Intranets are private systems, contained within servers owned by companies. They are based on the same principles that govern the Internet but are not publicly accessible; instead, they are used only for communicating and transferring company information between employees. Companies utilize intranets to protect valuable information from outside access, creating them with layers of protection in place.

Because intranet systems are private, they do not suffer from some of the problems the Internet faces, such as speed-related performance issues from too many users trying to access the same sites. Companies can place multimedia presentations on their systems more easily, showing presentations and running training programs for employees. Company uses for intranet systems are varied, including procedural manuals, employee benefit resources, orientation programs, software and hardware instructions, and even company social networks or e-zine postings. Intranets can also be constructed for a company's specific needs, tailored in functions and appearance. They can include simple files of information, such as spreadsheets or word documents. They can also incorporate search engines that employees can use to find particular components or analyze sets of data. Many also provide links to the Internet and relevant Web sites.

VOIP VoIP, or Voice over Internet Protocol, is a developing technology allowing users to carry audio communication over their Internet connections. The Internet line sends voice transmissions in the form of data packets, like other types of information sent over the network, which are then converted back into audio on the receiving phone system.

Users of VoIP benefit by not having to pay for separate phone and Internet services. Beyond the software and hardware required to set up VoIP, companies usually do not need to pay for more than their normal Internet service. The most important factors in VoIP service are audio quality and accessibility. VoIP can be provided by many different companies, including CoolTalk, Vonage, and Phone Power, but companies should always be sure to conduct tests of the audio quality to ensure it is as good as normal phone service. Also, some companies may prefer to have a back-up system installed in case of emergencies, such as Internet shut-downs or power outages. SOCIAL NETWORKING Social networks have become increasingly popular in the past few years with the rise of such Web sites as MySpace and Facebook, where Internet users can create their own profiles and structure personal Web sites in online communities.

Thanks to the ease of Internet communication, participants can form friendships and spread information at a high speed across a vast area. Businesses can make use of these social networks in several ways. Many social networks employ widgets, or embedded advertisements, often in the form of rich media. These interactive advertisements can be posted along the edges of the Web sites and can serve as both marketing and analyzing tools.

By making an animated advertisement that can be clicked on or interacted with, a business can judge how attractive the advertisement is through programs designed to collect widget data. Because social networks spread information so quickly, businesses can also use them as platforms to propagate their messages and brand. Some companies have their own MySpace sites to use for marketing purposes, trying a more personal form of promotion that many social network users find honest. Other organizations are beginning to view social networks as an effective way to recruit new employees. SMARTPHONES AND PDAS Mobile, handheld computer devices are very common in today's business world. PDAs, which offer online interaction and note-taking abilities, are being increasingly replaced by smartphones, which are phones configured to offer the same services, including connection to the Internet, e-mail, and document programs. While many companies are eager to offer these mobile devices to their employees as a communication tool, only some are currently taking advantage of handhelds as a marketing tool.

Websites can be configured for the mini-browsers that smartphones rely on, giving those using handheld devices easier access to online information and advertisements. The primary problem cited with smartphones and PDAs is security, since they are not covered by companies' intranet or Internet protections. E-commerce can take many different forms. Some companies use a “click and mortar” system where they operate stores or factories in physical locations while also offering their products in an online store where orders can be made.

Other companies have a central, physical hub and warehouses from which they conduct a large amount of business over the Internet without other bases, such as Amazon.com. Some companies exist by offering purely online services with only a central office, such as eBay. A company's online store can be constructed to help customers personally, by keeping track of what they view, what they order, and offering similar products that they may be interested in. This is called personalization, and the ability to offer each customer their own experience every time they access the company Web site is a powerful marketing tool. It is also important for companies to consistently update their online stores to reflect their changing services or merchandise, including deals and discounts.

The interface companies use is also important: how the Web site looks and reacts to customers, especially in response to searches and guided navigation. WEB CAMS The current quality of web cams allows companies to transfer video images in real time, letting them use the Internet to video-conference. Some companies are beginning to use video-messaging, a service that often accompanies instant messaging. This technology works for one-on-one meetings as well as conferences involving multiple attendees. SEE ALSO Computer Networks; Computer Security; Electronic Commerce; Electronic Data Interchange and Electronic Funds Transfer

Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web.: HarperBusiness, 2000. “The Red Queen of E-commerce. ” Ecommerce Times, 2008. “The Difference Between VoIP and PSTN Systems.

Jupiter Media Corporation, 2008. Grauer, Robert, and Gretchen Marx. Essentials of the Internet. Upper Saddle River, NJ: Prentice Hall, 1997.

Hafner, Katie. Where Wizards Stay Up Late: The Origin of the Internet. New York: Simon & Schuster, 1998. “Intranet Corner. ” Intranet Journal, 2008. Available from:. Kalakota, Ravi, and Andrew B.

Electronic Commerce: A Manager's Guide. Reading, MA: Addison-Wesley, 1996. “Types of Internet Connections. Jupiter Media Corporation, 2008.

The Internet is a vast global system of interconnected technical networks made up of heterogeneous information and communication technologies. It is also a social and economic assemblage that allows diverse forms of communication, creativity, and cultural exchange at a scope and scale unknown before the late twentieth century. The terms Internet and net are often used when discussing the social implications of new information technologies, such as the creation of new communal bonds across great distances or new forms of wealth and inequality. Such usage is imprecise: the Internet is distinct from the applications and technologies that are built upon it, such as e-mail, the World Wide Web, online gaming, file-sharing networks, and e-commerce and e-governance initiatives.

There are also many networks that are or were once distinct from the Internet, such as mobile telephone networks and electronic financial networks. Stated more precisely, the Internet is an infrastructural substrate that possesses innovative social, cultural, and economic features allowing creativity (or innovation) based on openness and a particular standardization process. It is a necessary, but not a sufficient, condition for many of the social and cultural implications often attributed to it. Understanding the particularity of the Internet can be key to differentiating its implications and potential impact on society from the impacts of “information technology” and computers more generally.

The Internet developed through military, university, corporate, and amateur user innovations occurring more or less constantly beginning in the late 1960s. Despite its complexity, it is unlike familiar complex technical objects (for example, a jumbo jetliner) that are designed, tested, and refined by a strict hierarchy of experts who attempt to possess a complete overview of the object and its final state. By contrast, the Internet has been subject to innovation, experimentation, and refinement by a much less well-defined collective of diverse users with wide-ranging goals and interests. In 1968 the Internet was known as the ARPAnet, named for its principal funding agency, the U.S. Department of Defense Advanced Research Projects Agency (ARPA). It was a small but extensive research project organized by the Information Processing Techniques Office at ARPA, which focused on advanced concepts in computing, specifically graphics, time-sharing, and networking.

The primary goal of the network was to allow separate administratively bounded resources (computers and software at particular geographical sites) to be shared across those boundaries, without forcing standardization across all of them. The participants were primarily university researchers in computer and engineering departments. Separate experiments in networking, both corporate and academic, were also under way during this period, such as the creation of “Ethernet” by Robert Metcalfe at Xerox PARC and the X.25 network protocols standardized by the International Telecommunication Union. By 1978 the ARPAnet had grown to encompass dozens of universities and military research sites in the United States. At this point the project leaders at ARPA recognized a need for a specific kind of standardization to keep the network feasible, namely a common operating system and networking software that could run on all of the diverse hardware connected to the network. Based on its widespread adoption in the 1970s, the UNIX operating system was chosen by ARPA as one official platform for the Internet.

UNIX was known for its portability (ability to be installed on different kinds of hardware) and extensibility (ease with which new components could be added to the core system). Bill Joy (who later cofounded Sun Microsystems) is credited with the first widespread implementation of the Internet Protocol (IP) software in a UNIX operating system, a version known as the Berkeley Software Distribution (BSD). The Internet officially began (in name and in practice) in 1983, the date set by an ad hoc group of engineers known as the Network Working Group (NWG) as the deadline for all connected computers to begin using the Transmission Control Protocol and Internet Protocol (TCP/IP). These protocols were originally designed in 1973 and consistently improved over the ensuing ten years, but only in 1983 did they become the protocols that would define the Internet. At roughly the same time, ARPA and the Department of Defense split the existing ARPAnet in two, keeping “Milnet” for sensitive military use and leaving ARPAnet for research purposes and for civilian uses. From 1983 to 1993, in addition to being a research network, the Internet became an underground, subcultural phenomenon, familiar to amateur computer enthusiasts, university students and faculty, and “hackers.”

The Internet’s glamour was largely associated with the arcane nature of the interaction it demanded: largely text-based, and requiring access to and knowledge of the UNIX operating system. Thus, owners of the more widespread personal computers made by IBM and Apple were largely excluded from the Internet (though a number of other similar networks such as Bulletin Board Services, BITNET, and FidoNet existed for PC users). A very large number of amateur computer enthusiasts discovered the Internet during this period, either through university courses or through friends, and many user-initiated innovations date to this period, ranging from games (e.g., MUDs, or Multi-User Dungeons) to programming and scripting languages (e.g., Perl, created by Larry Wall) to precursors of the World Wide Web (e.g., WAIS, Archie, and Gopher). During this period, the network was overseen and funded by the National Science Foundation, which invested heavily in improving the basic infrastructure of fiber-optic “backbones” in the United States in 1988. The oversight and management of the Internet was commercialized in 1995, with the backing of the presidential administration of Bill Clinton. In 1993 the World Wide Web (originally designed by Tim Berners-Lee at CERN in Switzerland) and the graphical Mosaic Web browser (created by the National Center for Supercomputing Applications at the University of Illinois) brought the Internet to a much larger audience.

Between 1993 and 2000 the “dot-com” boom drove the transformation of the Internet from an underground research phenomenon to a nearly ubiquitous and essential technology with far-reaching effects. Commercial investment in infrastructure and in “web presence” saw explosive growth; new modes of interaction and communication (e.g., e-mail, Internet messaging, and mailing lists) proliferated; Uniform Resource Locators (URLs) became a common (and highly valued) feature of advertisements and corporate identity; and artists, scientists, citizens, and others took up the challenge of both using and understanding the new medium. The core technical components of the Internet are standardized protocols, not hardware or software, strictly speaking, though obviously it would not have spread so extensively without the innovations in microelectronics, the continual enhancement of telecommunications infrastructures around the globe, and the growth in ownership and use of personal computers over the last twenty years.

Protocols make the “inter” in the Internet possible by allowing a huge number of nonoverlapping and incompatible networks to become compatible and to route data across all of them. The key protocols, known as TCP/IP, were designed in 1973 by Vint Cerf and Robert Kahn. Other key protocols, such as the Domain Name System (DNS) and User Datagram Protocol (UDP), came later. These protocols have to be implemented in software (such as in the UNIX operating system described above) to allow computers to interconnect.
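
As a simple illustration (not part of the original text), the short Python sketch below relies on the operating system's TCP/IP implementation, exposed through the standard socket library, to open a TCP connection and exchange a few bytes with a web server; the host name example.com is only an example.

# Illustrative sketch: the socket module hands the connection to the
# operating system's TCP/IP implementation described in the text.
import socket

with socket.create_connection(("example.com", 80), timeout=5) as conn:
    # Send a minimal HTTP request over the TCP byte stream.
    conn.sendall(b"HEAD / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    reply = conn.recv(1024)

# Print the first line of the server's reply, e.g. "HTTP/1.1 200 OK".
print(reply.decode("ascii", errors="replace").splitlines()[0])

The application code never deals with routing or retransmission; those details are handled by the protocol implementation beneath it.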

They are essentially standards with which hardware and software implementations must comply in order for any type of hardware or software to connect to the Internet and communicate with any other hardware and software that does the same. They can best be understood as a kind of technical Esperanto. The Internet protocols differ from traditional standards because of the unconventional social process by which they are developed, validated, and improved. The Internet protocols are elaborated in a set of openly available documents known as Requests for Comments (RFCs), which are maintained by a loose federation of engineers called the Internet Engineering Task Force (IETF, the successor to the Network Working Group). The IETF is an organization open to individuals (unlike large standards organizations that typically accept only national or corporate representatives) that distributes RFCs free of charge and encourages members to implement protocols and to improve them based on their experiences and users’ responses. The improved protocol then may be released for further implementation. This “positive feedback loop” differs from most “consensus-oriented” standardization processes (e.g., those of international organizations such as ISO, the International Organization for Standardization) that seek to achieve a final and complete state before encouraging implementations.

The relative ease with which one piece of software can be replaced with another is a key reason for this difference. During the 1970s and 1980s this system served the Internet well, allowing it to develop quickly, according to the needs of its users. By the 1990s, however, the scale of the Internet made innovation a slower and more difficult procedure, a fact that is most clearly demonstrated by the comparatively glacial speed with which the next generation of the Internet protocol (known as IP Version 6) has been implemented. Ultimately, the IETF style of standardization process has become a common cultural reference point of engineers and expert users of the Internet, and has been applied not only to the Internet, but also to the production of applications and tools that rely on the Internet. The result is a starkly different mode of innovation and sharing that is best exemplified by the growth and success of so-called “free software” or “open-source software.” Many of the core applications that are widely used on the Internet are developed in this fashion (famous examples include the Linux operating system kernel and the Apache Web server).

As a result of the unusual development process and the nature of the protocols, it has been relatively easy for the Internet to advance around the globe and to connect heterogeneous equipment in diverse settings, wherever there are willing and enthusiastic users with sufficient technical know-how. The major impediment to doing so is the reliability (or mere existence) of preexisting infrastructural components such as working energy and telecommunications infrastructures. Between 1968 and 1993 this expansion was not conducted at a national or state level, but by individuals and organizations who saw local benefit in expanding access to the global network. If a university computer science department could afford to devote some resources to computers dedicated to routing traffic and connections, then all the researchers in a department could join the network without needing permission from any centralized state authority.

It was not until the late 1990s that Internet governance became an issue that concerned governments and citizens around the world. In particular, the creation of the Internet Corporation for Assigned Names and Numbers (ICANN) has been the locus of fractious dispute, especially in international arenas. ICANN’s narrow role is to assign IP numbers (e.g., 192.168.0.1) and the names they map to (e.g., www.wikipedia.org), but it has been perceived, rightly or wrongly, as an instrument of U.S. control over the Internet. With each expansion of the Internet, issues of privacy, security, and organizational (or national) authority have become more pressing.
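
The name-to-number mapping that ICANN coordinates can be observed with an ordinary DNS lookup. Below is a minimal sketch, assuming Python's standard library and reusing the www.wikipedia.org example above; the address returned will vary by time and location.

# Illustrative sketch: resolve a domain name to an IP address using the
# system's DNS resolver via Python's standard library.
import socket

name = "www.wikipedia.org"
address = socket.gethostbyname(name)   # returns one IPv4 address for the name
print(name, "->", address)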

From the outset, the Internet protocols sought to prioritize control within administrative boundaries, leaving rules governing use to the local network owners. Such a scheme obviated the need for a central authority that determined global rules about access, public/private boundaries, and priority of use. With the advent of widespread commercial access, however, such local control has been severely diluted, and the possibility for individual mischief (e.g., identity theft, spam, and other privacy violations) has increased with increasing accessibility. On the one hand, increased commercial access means a decline in local organized authority over parts of the Internet in favor of control of large segments by Internet Service Providers (ISPs) and telecommunications/cable corporations. On the other hand, as the basic infrastructure of the Internet has spread, so have the practices and norms that were developed in concert with the technology, including everything from the proper way to configure a router to norms of proper etiquette on mailing lists and for e-mail.

Applications built on top of the Internet have often adopted such norms and modes of use, and promoted a culture of innovation, of “hacking” (creating new software through modifications that exploit or extend existing code or resources, with good or bad connotations depending on the context), and of communal sharing of software, protocols, and tools. It is thus important to realize that although most users do not experience the Internet directly, the particular forms of innovation and openness that characterize the Internet also characterize the more familiar applications built on top of it, owing to the propagation of these norms and modes of engineering.

There is often, therefore, a significant difference between innovations that owe their genesis to the Internet and those developed in the personal computer industry, the so-called “proprietary” software industry, and in distinct commercial network infrastructures (e.g., the SABRE system for airline reservations, or the MOST network for credit card transactions). The particularity of the Internet leads to different implications and potential impact on society than the impacts of “information technology” or computers more generally. One of the most widely discussed and experienced implications of the Internet is the effect on the culture industries, especially music and film. As with previous media (e.g., video and audio cassette recorders), it is the intersection of technology and intellectual property that is responsible for the controversy. Largely due to its “openness,” the Internet creates the possibility for low-cost and extremely broad and fast distribution of cultural materials, from online books to digital music and film. At the same time, it also creates the possibility for broad and fast violation of intellectual property rights, rights that have been strengthened considerably by the Copyright Act of 1976 and the Digital Millennium Copyright Act (1998). The result is a cultural battle over the meaning of “sharing” music and movies, and the degree to which such sharing is criminal.

The debates have been polarized between a “war on piracy” on the one hand (with widely varying figures concerning the economic losses) and “consumer freedom” on the other (rights to copy, share, and trade purchased music). The cultural implication of this war is a tension among the entertainment industry, the artists and musicians, and the consumers of music and film. Because the openness of the Internet makes it easier than ever for artists to distribute their work, many see a potential for direct remuneration, and cheaper and more immediate access for consumers. The entertainment industry, by contrast, argues that it provides more services and quality, not to mention more funding and capital, and that it creates jobs and contributes to a growing economy. In both cases, the investments are protected primarily by the mechanism of intellectual property law, and are easily diluted by illicit copying and distribution.

And yet, it is unclear where to draw a line between legitimate sharing (which might also be a form of marketing) and illegitimate sharing (“piracy,” according to the industry). A key question about the Internet is that of social equity and access. The term digital divide has been used primarily to indicate the differential in individual access to the Internet, or in computer literacy, between rich and poor, or between developed and developing nations.

A great deal of research has gone into understanding inequality of access to the Internet, and estimates of both differential access and the rate at which access is spreading have varied extremely widely, depending on methodology. It is, however, clear from the statistics that between 1996 and 2005 the annual rate of growth in usage exceeded 100 percent in almost all regions of the globe at some point (sustained 100 percent annual growth means usage doubles every year), and in some places it reached annual growth rates of 500 percent or more. Aside from the conclusion that the growth in access to the Internet has been fantastically rapid, there are few sure facts about differential access. There are, however, a number of more refined questions that researchers have begun investigating: Is the quantity or rate of growth in access to the Internet larger or smaller than in the case of other media (e.g., television, print, and radio)?

Are there significant differences within groups with access (e.g., class, race, or national differences in quality of access)? Does access actually enhance or change a person’s life chances or opportunities? The implication of a digital divide (whether between nations and regions, or within them) primarily concerns the quality of information and the ability of individuals to use it to better their life chances. In local terms, this can affect development issues broadly (e.g., access to markets and government, democratic deliberation and participation, and access to education and employment opportunities); in global terms, differential access can affect the subjective understandings of issues ranging from religious intolerance to global warming and environmental issues to global geopolitics.

Digital divides might also differ based on the political situation, as in the case of the Chinese government’s attempt to censor access to politicized information, which in turn can affect the fate of cross-border investment and trade.

SEE ALSO Information, Economics of; Media; Microelectronics Industry; Property Rights, Intellectual

BIBLIOGRAPHY

Abbate, Janet. Inventing the Internet. Cambridge, MA: MIT Press.
Castells, Manuel. The Internet Galaxy: Reflections on the Internet, Business, and Society. Oxford: Oxford University Press.
DiMaggio, Paul, Eszter Hargittai, Coral Celeste, and Steven Shafer. Digital Inequality: From Unequal Access to Differentiated Use. In Social Inequality, ed. Kathryn Neckerman, 355–400. New York: Russell Sage Foundation.
International Telecommunication Union. ICT Indicators.
Mueller, Milton. Ruling the Root: Internet Governance and the Taming of Cyberspace. Cambridge, MA: MIT Press.
National Telecommunications and Information Administration. A Nation Online: Entering the Broadband Era.
Norberg, Arthur L., and Judy E. O'Neill. Transforming Computer Technology: Information Processing for the Pentagon, 1962–1986. Baltimore: Johns Hopkins University Press.
Pew Internet and American Life Project.
Schmidt, Susanne K., and Raymund Werle. Coordinating Technology: Studies in the International Standardization of Telecommunications. Cambridge, MA: MIT Press.
Waldrop, M. Mitchell. The Dream Machine: J. C. R. Licklider and the Revolution That Made Computing Personal. New York: Viking Penguin.
Weber, Steven. The Success of Open Source. Cambridge, MA: Harvard University Press.

Christopher M. Kelty

The Internet is a vast network that connects many independent networks and links computers at different locations.

It enables computer users throughout the world to communicate and to share information in a variety of ways. Its evolution into the World Wide Web made it easy to use for those with no prior computer training.

History

The Internet could not exist until the modern computer came to be. The first electronic computers were developed during the 1940s, and these early machines were so large, mainly because of all the bulky vacuum tubes they needed to perform calculations, that they often took up an entire room by themselves. They were also very expensive, and only a few corporations and government agencies could afford to own one. The decade of the 1950s proved to be one of silent conflict and tension between the United States and the Soviet Union, a period called the 'cold war', and computers naturally came to play a.