PC

Posted by bender 03/06/2009 @ 19:07

Tags : pc, hardware, technology

News headlines
The Next Steps in the PC vs. Mac Ballet - Softpedia
A price measuring contest of Windows PCs vs. Macs, a comparison in which the machines produced by Apple are at a consistent disadvantage. The latest Laptop Hunters ad from Microsoft features mother-daughter couple Lauren and Sue....
Is Your PC Security Up to Date? - New York Times
If you have a Windows PC, you should immediately download the fixes from Microsoft and Adobe. Better yet, enable automatic software updates so you don't have to worry about this stuff in the future. Setting up automatic updates in Windows differs for...
HP Tries to Revive PC Sales With Touch Screens - Wall Street Journal
But the touch-screen PCs, which can cost twice as much as typical machines, have been slow to catch on. HP only sold about 400,000 of its TouchSmart desktops last year, compared with 54 million traditional desktops and laptops, estimates research firm...
Some PC fixes don't require special skills - Atlanta Journal Constitution
I am a mass murderer of PCs. It's true: The main cause of computer problems is faulty fixing. To avoid joining me on murderer's row, you need to know what jobs you can safely handle and what ought to be left to the folks at the computer shop....
Best Responses to 'Five Controversial Ways to Speed Your PC' - New York Times
By Paul Boutin My post on Five Controversial Ways to Speed Your PC drew lots of comments. Too bad most of them boiled down to “Get a Mac” or “Install Linux.” I could make myself a cult hero in Silicon Valley by repeatedly advocating that every reader...
"Tmax Window", a new PC OS is coming from Korea - Cnet Asia
Local software company called "Tmaxsoft" announced that a new PC OS is under development and will be demonstrated by July 7. Tmaxsoft is known as the No. 1 middleware vendor in Korea, competing with global companies like IBM, Oracle and BEA Systems. Tmaxsoft provides a full...
PC students make fireworks possible - Port Clinton News Herald
BY CATHARINE HADLEY • Staff writer • May 16, 2009 PORT CLINTON -- Port Clinton High School students have come to the rescue for this year's Independence Day fireworks event. "It's a great day, an absolutely wonderful day. It started out spectacular,"...
Possible PC firming seen in Hewlett-Packard 2Q - The Associated Press
Intel's CEO, Paul Otellini, says PC sales have "bottomed out," and market research firm IDC reported that global PC shipments fell less than expected in the first quarter, and could turn around by the end of the year. That's significant for HP because the...
Fix your slow-pc instantly - Culture11
You can fix a slow PC today and do some housecleaning for your PC by simply taking out the trash. Fragmented files and folders are a huge reason why we grow old waiting for a single program to load. It is the cause of stress, high blood pressure and...

IBM PC compatible

The original IBM PC (Model 5150) motivated the production of clones in the early 1980s.

IBM PC compatible computers are those generally similar to the original IBM PC, XT, and AT. Such computers used to be referred to as PC clones, or IBM clones since they almost exactly duplicated all the significant features of the PC architecture, facilitated by various manufacturers' ability to legally reverse engineer the BIOS through clean room design. Columbia Data Products built the first clone of an IBM personal computer through a clean room implementation of its BIOS. Many early IBM PC compatibles used the same computer bus as the original PC and AT models. The IBM AT compatible bus was later named the ISA bus by manufacturers of compatible computers.

The term "IBM PC compatible" became relegated to historical use with the rise of Windows and IBM's loss of dominance in the personal computer market.

Descendants of the IBM PC compatibles make up the majority of microcomputers on the market today, although interoperability with the bus structure and peripherals of the original PC architecture may be limited or non-existent.

The origins of this platform came with the decision by IBM in 1980 to market a low-cost single-user computer as quickly as possible in response to Apple Computer's success in the burgeoning market. On 12 August 1981, the first IBM PC went on sale. There were three operating systems (OS) available for it, but the most popular and least expensive was PC DOS, a version of MS-DOS licensed from Microsoft. In a crucial concession, IBM's agreement allowed Microsoft to sell its own version, MS-DOS, for non-IBM platforms. The only proprietary component of the PC was the BIOS (Basic Input/Output System).

A number of computers of the time based on the 8086 and 8088 processors were manufactured during this period, but with architectures different from the PC's, running their own versions of DOS and CP/M-86. However, software which addressed the hardware directly instead of making standard calls to MS-DOS was faster. This was particularly relevant to games. The IBM PC was the only machine sold in high enough volumes to justify writing software specifically for it, and this encouraged other manufacturers to produce machines which could use the same programs, expansion cards and peripherals as the PC. The 808x computer marketplace rapidly excluded all machines which were not functionally very similar to the PC. The 640 KB limit on "conventional" system memory available to MS-DOS is a legacy of that period; non-clone machines did not share this limit.
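The arithmetic behind the 640 KB limit can be sketched directly. A minimal illustration, using only the well-documented facts of the original PC memory map (a 20-bit address bus and a reserved region starting at physical address 0xA0000):

```python
# Sketch of the original IBM PC memory map that produced the 640 KB limit.
# The 8088 has a 20-bit address bus: 2**20 bytes = 1 MiB total.
ADDRESS_BITS = 20
TOTAL = 2 ** ADDRESS_BITS                  # 1,048,576 bytes (1 MiB)

# IBM reserved the top of the address space for video memory, adapter ROMs
# and the BIOS, starting at physical address 0xA0000 (segment 0xA000).
RESERVED_BASE = 0xA0000

conventional_kib = RESERVED_BASE // 1024          # memory left for MS-DOS programs
reserved_kib = (TOTAL - RESERVED_BASE) // 1024    # reserved upper region

print(conventional_kib)  # → 640
print(reserved_kib)      # → 384
```

Everything below 0xA0000 is the "conventional" memory MS-DOS programs could use; the 384 KiB above it was fixed by the PC's design, which is why clones inherited the same ceiling.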

The original "clones" of the IBM Personal Computer were created without IBM's participation or approval. Columbia closely modeled the IBM PC and produced the first "compatible" PC (i.e., more or less compatible to the IBM PC standard) in June 1982 closely followed by Eagle Computer. Compaq Computer Corp. announced its first IBM PC compatible a few months later in November 1982—the Compaq Portable. The Compaq was the first sewing machine-sized portable computer that was essentially 100% PC-compatible. The company could not directly copy the BIOS as a result of the court decision in Apple v. Franklin, but it could reverse-engineer the IBM BIOS and then write its own BIOS using clean room design.

At the same time, many manufacturers such as Xerox, HP, Digital, Sanyo, Texas Instruments, Tulip, Wang and Olivetti introduced personal computers that were MS-DOS compatible, but not completely software- or hardware-compatible with the IBM PC.

Microsoft's intention, and that of the industry from 1981 to as late as the mid-1980s, was that application writers would write to the APIs in MS-DOS or the firmware BIOS, and that this would form what would now be called a hardware abstraction layer. Each computer would have its own OEM version of MS-DOS, customized to its hardware. Any software written for MS-DOS would run on any MS-DOS computer, despite variations in hardware design. A similar trend was with the MSX home computer series.

This expectation seemed reasonable in the computer marketplace of the time. Until then Microsoft was primarily focused on computer languages such as BASIC. The established small-system operating system, CP/M from Digital Research, was in use both at the hobbyist level and at the more professional end of microcomputer use. To achieve such widespread use, and thus make the product economically viable, the OS had to operate across a range of machines from different vendors that had widely varying hardware. Those customers who needed other applications beyond the starter pack could reasonably expect publishers to offer their products for a variety of computers, on suitable media for each.

Microsoft's competing OS was initially targeted to run on a similarly varied spectrum of hardware, although all based on the 8086 processor. Thus, MS-DOS was for many years sold only as an OEM product. There was no Microsoft-branded MS-DOS: MS-DOS could not be purchased directly from Microsoft, and each OEM release was packaged with the trade dress of the given PC vendor. Each version was, in general, incompatible with hardware other than that for which it was built. Bugs were to be reported to the OEM, not to Microsoft. However, as clones became widespread, it soon became clear that the OEM versions of MS-DOS were virtually identical, except perhaps for the provision of a few utility programs.

At first, few "compatibles" other than Compaq's offered compatibility beyond the DOS/BIOS level. Reviewers and users developed suites of programs to test compatibility; the ability to run Lotus 1-2-3 or Microsoft Flight Simulator became one of the most significant "stress tests". Vendors gradually learned not only how to emulate the IBM BIOS but also where to use identical hardware chips to perform key functions within the system. Eventually, the Phoenix BIOS and similar commercially-available products permitted computer makers to build essentially 100%-compatible clones without having to reverse-engineer the IBM PC BIOS themselves.

Over time, IBM damaged its own market by failing to appreciate the importance of "IBM compatibility", introducing products such as the IBM Portable (which underperformed and sold less than the earlier Compaq Portable) and the PCjr, which had significant incompatibilities with the original PC and was quickly discontinued. By the mid to late 1980s buyers began to regard PCs as commodity items, and doubted that the security blanket of the IBM brand warranted the price difference. Meanwhile, MS-DOS-compatible (but not hardware-compatible) systems did not succeed in the marketplace. Being unable to run off-the-shelf high-performance software packages that the IBM PC and true compatibles could made for poor sales and the eventual extinction of this category of systems. Because of hardware incompatibility with the IBM PC design, the 80186 processor, released only a year after the IBM PC, was never popular in general-purpose personal computers.

However, as the market evolved, and despite the failure of its Micro Channel Architecture (MCA), IBM derived a considerable income stream from companies that licensed the IBM patents embodied in the PC design, to the extent that IBM's focus shifted from discouraging PC clones to maximizing its revenue from license sales. IBM finally relinquished its role as a PC manufacturer in April 2005, when it sold its PC division to Lenovo for $1.75 billion.

As of October 2007, Hewlett-Packard and Dell hold the largest shares of the PC market in North America. They are also successful overseas, with Acer, Lenovo, and Toshiba also notable. Worldwide, a huge number of PCs are "white box" systems assembled by a myriad of local systems builders. Despite advances in computer technology, all current IBM PC compatibles remain very much compatible with the original IBM PC computers, although most of the components implement the compatibility in special backward compatibility modes used only during a system boot.

One of the strengths of the PC compatible platform is its modular hardware design. End-users could readily upgrade peripherals and, to some degree, processor and memory without modifying the computer's motherboard or replacing the whole computer, as was the case with many of the microcomputers of the time. However, as processor speed and memory width increased, the limits of the original XT/AT bus design were soon reached, particularly when driving graphics video cards. IBM did introduce an upgraded bus in the IBM PS/2 computer that overcame many of the technical limits of the XT/AT bus, but this was rarely used as the basis for IBM compatible computers, since it required licence payments to IBM both for the PS/2 bus and for any prior AT-bus designs produced by the company seeking a licence. This was unpopular with hardware manufacturers, and several competing bus standards with more agreeable licence terms were developed by consortiums. Various attempts to standardize the interfaces were made, but in practice many of these attempts were either flawed or ignored. Even so, there were many expansion options, and the PC compatible platform advanced much faster than other competing platforms of the time, even if only because of its market dominance.

In the 1990s, IBM's influence on PC architecture became increasingly irrelevant. An IBM-brand PC became the exception, not the rule. Instead of focusing on staying compatible with the IBM PC, vendors began to focus on compatibility with the evolution of Microsoft Windows. In 1993, a version of Windows NT was released that could run on processors other than x86. (It did require that applications be recompiled, a step most developers didn't take.) Still, its hardware independence was exploited by SGI's x86 workstations: thanks to NT's HAL, they could run NT (and its vast application library). No mass-market personal computer hardware vendor dares to be incompatible with the latest version of Windows, and Microsoft's annual WinHEC conferences provide a setting in which Microsoft can lobby for, and in some cases dictate, the pace and direction of the hardware side of the PC industry. Microsoft and Windows have become so important to the ongoing development of PC hardware that industry writers have taken to using the term "Wintel architecture" ("Wintel" being a portmanteau of "Windows" and "Intel") for the combined hardware-software platform. This terminology itself is becoming a misnomer, as Intel has lost absolute control over the direction of the hardware platform's development, and non-Windows operating systems running on this hardware have established and maintained a notable presence.

Although the IBM PC was designed for expandability, the designers could not anticipate the hardware developments of the 1980s. To make things worse, IBM's choice of the Intel 8088 as the CPU introduced several limitations which became hurdles for developing software for the PC compatible platform. For example, the 8088 processor had only a 20-bit address space. To expand PCs beyond one megabyte, Lotus, Intel, and Microsoft jointly created expanded memory (EMS), a bank-switching scheme that allowed more memory, provided by add-in hardware, to be seen through a set of four 16 KB "windows" inside the 20-bit address space. Later Intel CPUs had larger address spaces and could directly address 16 MB (80286) or more, leading Microsoft to develop extended memory (XMS), which did not require additional hardware.
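The bank-switching arithmetic behind EMS can be sketched as follows. This is a hedged illustration, not driver code: the page-frame segment 0xD000 is a typical but here hypothetical choice (real machines placed the frame at various upper-memory segments), while the 16 KB page size and four-window frame come straight from the EMS design described above.

```python
# Sketch of LIM EMS bank-switching arithmetic.
PAGE_SIZE = 16 * 1024            # each EMS page/window is 16 KB
FRAME_SEGMENT = 0xD000           # hypothetical segment of the 64 KB page frame
FRAME_BASE = FRAME_SEGMENT << 4  # real-mode physical address = segment * 16

def window_address(window):
    """Physical address of one of the four 16 KB windows in the page frame."""
    assert 0 <= window < 4
    return FRAME_BASE + window * PAGE_SIZE

# Mapping logical page N of expanded memory into window W moves no data;
# the EMS board simply redirects accesses at window_address(W) to page N,
# so a program reaches megabytes of memory through a fixed 64 KB aperture.
print(hex(window_address(0)))  # → 0xd0000
print(hex(window_address(3)))  # → 0xdc000
```

The key point is that the CPU's 20-bit view never changes; only the board's idea of which expanded-memory page sits behind each window does.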

Expanded and extended memory have incompatible interfaces, so anyone writing software that used more than one megabyte had to support both systems for the greatest compatibility, until MS-DOS began including EMM386, which simulated EMS memory using XMS memory. A protected-mode OS could also be written for the 80286, but DOS application compatibility proved harder than expected, not only because most DOS applications accessed the hardware directly, but also because most BIOS requests were made via interrupts, hindering multitasking and making performance hard to predict.

Video cards suffered from their own incompatibilities. Once video cards advanced to SVGA, the standard for accessing them was no longer clear. At the time, PC programming used a memory model with 64 KB memory segments. The most common VGA graphics mode's screen memory fitted into a single memory segment. SVGA modes required more memory, so accessing the full screen memory was tricky. Each manufacturer developed its own way of accessing the screen memory, even going so far as not to number the modes consistently. An attempt at creating a standard, called VBE, was made, but not all manufacturers adhered to it.
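Why SVGA access was tricky can be shown with a little arithmetic. A minimal sketch, assuming a banked 640x480 mode at 8 bits per pixel (a common SVGA configuration) with 64 KB banks; it only computes which bank a pixel lands in, which is the part every manufacturer then exposed differently:

```python
# Sketch of banked SVGA framebuffer addressing: in real mode a program can
# see only one 64 KB segment of video memory at a time, so the card must
# switch "banks" to expose different slices of the framebuffer.
BANK_SIZE = 64 * 1024   # one real-mode segment
WIDTH = 640             # assumed mode: 640x480, 8 bits per pixel

def pixel_location(x, y):
    """Return (bank, offset_within_bank) for a pixel in this banked mode."""
    linear = y * WIDTH + x              # byte offset into the framebuffer
    return linear // BANK_SIZE, linear % BANK_SIZE

print(pixel_location(0, 0))    # → (0, 0): top of screen fits in bank 0
print(pixel_location(0, 240))  # → (2, 22528): mid-screen already needs bank 2
```

The whole 640x480x8 framebuffer is 300 KB, nearly five banks, so drawing across the screen meant repeated bank switches; the chipset-specific switching call was exactly what VBE later tried to standardize.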

Because of the wide number of third-party adapters and no standard for them, programming the PC could be difficult. Professional developers would run a large test-suite of various hardware combinations. Even the PC itself had no clear application interface to the flat memory model the 386 and higher could provide in protected mode.

When the 386 arrived, a protected-mode OS could again be written for it. This time, DOS compatibility was much easier because of virtual 8086 mode. Unfortunately, programs could not switch directly between the two modes, so eventually new memory-model APIs were developed, VCPI and DPMI, the latter becoming the most popular.

Meanwhile, consumers were overwhelmed by the many different combinations of hardware on offer. To give them some idea of what sort of PC they would need to run their software, the Multimedia PC (MPC) standard was set in 1990. A PC that met the minimum MPC standard could be considered, and marketed as, an MPC. Software that could run on the most minimal MPC-compliant PC would be guaranteed to run on any MPC. The MPC level 2 and MPC level 3 standards were later set, but the term "MPC compliant" never caught on. After MPC level 3 in 1996, no further MPC standards were established.

The success of Microsoft Windows had driven nearly all other rival commercial operating systems into near-extinction, and had ensured that the "IBM PC compatible" computer was the dominant computing platform. This meant that a manufacturer who made software only for the Wintel platform could still reach the vast majority of computer users. By the 1990s, the only major competitor to Windows with more than a few percentage points of market share was Apple Inc.'s Macintosh. The Mac started out billed as "the computer for the rest of us", but the DOS/Windows/Intel onslaught quickly drove the Macintosh into an education and desktop publishing niche, from which it has only recently begun to emerge. By the mid-1990s Mac market share had dwindled to around 5%, and introducing a new rival operating system had become too risky a commercial venture. Experience had shown that even a technically superior operating system could fail in the marketplace (BeOS and OS/2, for example). In 1989 Steve Jobs said of his new NeXT platform, "It will either be the last new hardware platform to succeed, or the first to fail." In 1993 NeXT announced it was ending production of the NeXTcube and porting NeXTSTEP to Intel processors. In 1997, NeXT was acquired by Apple, which then introduced the iMac in 1998; the Mac has been regaining market share ever since.

On the hardware front, Intel initially licensed its technology so that other manufacturers could make x86 CPUs. As the "Wintel" platform gained dominance, Intel abandoned this practice. Companies such as AMD and Cyrix developed alternative CPUs that were functionally compatible with Intel's. Towards the end of the 1990s, AMD was taking an increasing share of the CPU market for PCs. AMD even ended up playing a significant role in directing the evolution of the x86 platform when its Athlon line of processors continued to develop the classic x86 architecture as Intel deviated with its NetBurst architecture for the Pentium 4 CPUs and the IA-64 architecture for the Itanium line of server CPUs. AMD developed the first 64-bit extension of the x86 architecture, which forced Intel to implement a compatible version in all its latest CPUs. In 2006 Intel began abandoning NetBurst with the release of its "Core" line of processors, which represented an evolution of the earlier Pentium III.

The term 'IBM PC compatible' is not commonly used for current computers because all mainstream computers are now PC compatibles. Most competing platforms have either died off or been relegated to niche, enthusiast markets like the Amiga. One notable exception was Apple's Macintosh computers, which ran on the PowerPC architecture until 2006, when Apple switched to Intel processors and adopted the x86 architecture. The processor speed and memory capacity of modern PCs are many orders of magnitude greater than those of the original IBM PC, and yet backwards compatibility has been largely maintained - a 32-bit operating system published in the 2000s can still run many of the simpler programs written for the OS of the early 1980s without needing an emulator.




Personal computer game


A personal computer game (also known as a computer game or simply PC game) is a game played on a personal computer, rather than on a video game console or arcade machine. Computer games have evolved from the simple graphics and gameplay of early titles like Spacewar!, to a wide range of more visually advanced titles.

PC games are created by one or more game developers, often in conjunction with other specialists (such as game artists) and either published independently or through a third party publisher. They may then be distributed on physical media such as DVDs and CDs, as Internet-downloadable shareware, or through online delivery services such as Direct2Drive and Steam. PC games often require specialized hardware in the user's computer in order to play, such as a specific generation of graphics processing unit or an Internet connection for online play, although these system requirements vary from game to game.

Although personal computers only became popular with the development of the microprocessor, computer gaming on mainframes and minicomputers has existed since at least the 1960s. One of the first computer games was developed in 1961, when MIT students Martin Graetz and Alan Kotok, with MIT employee Steve Russell, developed Spacewar! on a PDP-1 computer used for statistical calculations.

The first generation of PC games were often text adventures or interactive fiction, in which the player communicated with the computer by entering commands through a keyboard. The first text adventure, Adventure, was developed for the PDP-11 by Will Crowther in 1976, and expanded by Don Woods in 1977. By the 1980s, personal computers had become powerful enough to run games like Adventure, but by this time, graphics were beginning to become an important factor in games. Later games combined textual commands with basic graphics, as seen in the SSI Gold Box games such as Pool of Radiance or Bard's Tale.

By the mid-1970s, games were developed and distributed through hobbyist groups and gaming magazines, such as Creative Computing and later Computer Gaming World. These publications provided game code that could be typed into a computer and played, encouraging readers to submit their own software to competitions.

Microchess was one of the first games for microcomputers which was sold to the public. First sold in 1977, Microchess eventually sold over 50,000 copies on cassette tape.

As the video game market became flooded with poor-quality games created by numerous companies attempting to enter the market, and over-produced high-profile releases such as the Atari 2600 adaptations of E.T. and Pac-Man grossly underperformed, the popularity of personal computers for education rose dramatically. In 1983, consumer interest in video games dwindled to historical lows, as interest shifted to computer games and the MTV-fueled music industry.

The effects of the crash were largely limited to the console market, as established companies such as Atari posted record losses over subsequent years. Conversely, the home computer market boomed, as sales of low-cost color computers such as the Commodore 64 rose to record highs and developers such as Electronic Arts benefited from increasing interest in the platform.

The console market experienced a resurgence in the United States with the release of the Nintendo Entertainment System. In Europe, computer gaming continued to boom for many years after.

Increasing adoption of the computer mouse, driven partially by the success of games such as the King's Quest series, and of high-resolution bitmap displays allowed the industry to include increasingly high-quality graphical interfaces in new releases. Meanwhile, the Commodore Amiga computer achieved great success in the market from its release in 1985, contributing to the rapid adoption of these new interface technologies.

Further improvements to game artwork were made possible with the introduction of the first sound cards, such as AdLib's Music Synthesizer Card, in 1987. These cards allowed IBM PC compatible computers to produce complex sounds using FM synthesis, where they had previously been limited to simple tones and beeps. However, the rise of the Creative Labs Sound Blaster card, which featured much higher sound quality due to the inclusion of a PCM channel and digital signal processor, led AdLib to file for bankruptcy in 1992.

The year before, id Software had produced one of the first first-person shooter games, Hovertank 3D, the first in the company's line of highly influential games in the genre. The same team went on to develop Wolfenstein 3D in 1992, which helped popularize a genre that would become one of the highest-selling in modern times. The game was originally distributed through the shareware model, allowing players to try a limited part of the game for free but requiring payment to play the rest, and represented one of the first uses of texture mapping graphics in a popular game, along with Ultima Underworld.

While leading Sega and Nintendo console systems kept their CPU speed at 3-7 MHz, the 486 PC processor ran much faster, allowing it to perform many more calculations per second. The 1993 release of Doom on the PC was a breakthrough in 3D graphics, and was soon ported to various game consoles in a general shift toward greater realism. In the same time frame, games such as Myst took advantage of the new CD-ROM delivery format to include many more assets (sound, images, video) for a richer game experience.

Many early PC games included extras such as the peril-sensitive sunglasses that shipped with The Hitchhiker's Guide to the Galaxy. These extras gradually became less common, but many games were still sold in the traditional over-sized boxes that used to hold the extra "feelies". Today, such extras are usually found only in Special Edition versions of games, such as Battlechests from Blizzard.

By 1996, the rise of Microsoft Windows and the success of 3D console titles such as Super Mario 64 sparked great interest in hardware-accelerated 3D graphics on the PC, and soon resulted in attempts to produce affordable solutions with the ATI Rage, Matrox Mystique and S3 ViRGE. Tomb Raider, released in 1996, was one of the first third-person shooter games and was praised for its revolutionary graphics. As 3D graphics libraries such as DirectX and OpenGL matured and knocked proprietary interfaces out of the market, these platforms gained greater acceptance, particularly with their demonstrated benefits in games such as Unreal. However, major changes to the Microsoft Windows operating system, by then the market leader, made many older MS-DOS-based games unplayable on Windows NT, and later, Windows XP (without using an emulator, such as DOSBox).

Faster graphics accelerators and improving CPU technology resulted in increasing levels of realism in computer games. During this time, improvements introduced with products such as ATI's Radeon R300 and NVIDIA's GeForce 6 Series allowed developers to increase the complexity of modern game engines. PC gaming currently tends strongly toward improvements in 3D graphics.

Unlike the generally accepted push for improved graphical performance, the use of physics engines in computer games has been a matter of debate since the announcement and 2005 release of the Ageia PhysX PPU, ostensibly competing with middleware such as the Havok physics engine. Issues such as the difficulty of ensuring consistent experiences for all players, and the uncertain benefit of first-generation PhysX cards in games such as Tom Clancy's Ghost Recon Advanced Warfighter and City of Villains, prompted arguments over the value of such technology.

Similarly, many game publishers began to experiment with new forms of marketing. Chief among these alternative strategies is episodic gaming, an adaptation of the older concept of expansion packs, in which game content is provided in smaller quantities but for a proportionally lower price. Titles such as Half-Life 2: Episode One took advantage of the idea, with mixed results arising from concerns about the amount of content provided for the price.

Game development, as with console games, is generally undertaken by one or more game developers using either standardised or proprietary tools. While games could previously be developed by very small groups of people, as in the early example of Wolfenstein 3D, many popular computer games today require large development teams and budgets running into the millions of dollars.

PC games are usually built around a central piece of software, known as a game engine, that simplifies the development process and enables developers to easily port their projects between platforms. Unlike most consoles, which generally only run major engines such as Unreal Engine 3 and RenderWare due to restrictions on homebrew software, personal computers may run games developed using a larger range of software. As such, a number of alternatives to expensive engines have become available, including open source solutions such as Crystal Space, OGRE and DarkPlaces.
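The portability an engine provides comes from keeping game code behind a platform-neutral interface. A deliberately tiny, hypothetical sketch of that idea (these class and method names are illustrative, not any real engine's API):

```python
# Illustrative sketch of engine-style platform abstraction: game code talks
# only to an interface, and each target platform supplies an implementation.
class Renderer:
    """Platform-facing interface; one subclass per target platform."""
    def draw_text(self, text):
        raise NotImplementedError

class PCRenderer(Renderer):
    def draw_text(self, text):
        return f"[pc] {text}"       # stand-in for a real graphics API call

class ConsoleRenderer(Renderer):
    def draw_text(self, text):
        return f"[console] {text}"  # stand-in for a console SDK call

def game_frame(renderer):
    # The game itself never names a platform, so porting means supplying a
    # new Renderer subclass, not rewriting the game.
    return renderer.draw_text("Hello, world")

print(game_frame(PCRenderer()))       # → [pc] Hello, world
print(game_frame(ConsoleRenderer()))  # → [console] Hello, world
```

Real engines abstract far more than rendering (input, audio, file I/O), but the structural idea is the same.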

The multi-purpose nature of personal computers often allows users to modify the content of installed games with relative ease. Since console games are generally difficult to modify without a proprietary software development kit, and are often protected by legal and physical barriers against tampering and homebrew software, it is generally easier to modify the personal computer version of games using common, easy-to-obtain software. Users can then distribute their customised version of the game (commonly known as a mod) by any means they choose.

The inclusion of map editors such as UnrealEd with the retail versions of many games, and others that have been made available online such as GtkRadiant, allow users to create modifications for games easily, using tools that are maintained by the games' original developers. In addition, companies such as id Software have released the source code to older game engines, enabling the creation of entirely new games and major changes to existing ones.

Modding has allowed much of the community to produce game elements that would not normally be provided by the developer of the game, expanding or modifying normal gameplay to varying degrees. One notable example is the Hot Coffee mod for the PC port of Grand Theft Auto: San Andreas, which enables access to an abandoned sex minigame by simply modifying the game's data files.

Computer games are typically sold on standard storage media, such as compact discs, DVDs, and floppy disks. These were originally passed on to customers through mail order services, although retail distribution has since replaced mail order as the main distribution channel for video games due to higher sales. Various formats of floppy disks were the staple storage media of the 1980s and early 1990s, but they have fallen out of practical use as the increasing sophistication of computer games raised the overall size of a game's data and program files.

The introduction of complex graphics engines in recent times has resulted in additional storage requirements for modern games, and thus an increasing interest in CDs and DVDs as the next compact storage media for personal computer games. The rising popularity of DVD drives in modern PCs, and the larger capacity of the new media (a single-layer DVD can hold up to 4.7 gigabytes of data, more than five times as much as a single CD), have resulted in their adoption as a format for computer game distribution. To date, CD versions are still offered for most games, while some games offer both the CD and the DVD versions.

Shareware marketing, whereby a limited or demonstration version of the full game is released to prospective buyers without charge, has been used to distribute computer games since the early years of the gaming industry, and was used in the early days of titles such as Tanarus, among many others. Shareware games generally offer only a small part of the gameplay of the retail product, and may be distributed free of charge with gaming magazines, in retail stores or on developers' websites.

In the early 1990s, shareware distribution was common among fledgling game companies such as Apogee Software, Epic Megagames and id Software, and it remains a popular distribution method among smaller game developers. However, shareware has largely fallen out of favor among established game companies in favor of traditional retail marketing, with notable exceptions such as Big Fish Games and PopCap Games continuing to use the model today.

With the increased popularity of the Internet, online distribution of game content has become more common. Retail services such as Direct2Drive and Download.com allow users to purchase and download large games that would otherwise be distributed only on physical media such as DVDs, as well as providing cheap distribution of shareware and demonstration games. Other services offer a subscription-based distribution model in which users pay a monthly fee to download and play as many games as they wish.

The Steam system, developed by Valve Corporation, provides an alternative to traditional online services. Instead of allowing the player to download a game and play it immediately, games are made available for "pre-load" in an encrypted form days or weeks before their actual release date; on the official release date, a relatively small component is made available to unlock the game. Steam also ensures that, once bought, a game remains accessible to a customer indefinitely, while traditional media such as floppy disks and CD-ROMs are susceptible to unrecoverable damage and misplacement. The user does, however, depend on the Steam servers being online to download purchased games. According to Steam's terms of service, Valve has no obligation to keep the servers running; if Valve Corporation shut down, so would the servers.

The real-time strategy genre, which accounts for more than a quarter of all PC games sold, has found very little success on video game consoles, with releases such as Starcraft 64 failing in the marketplace. Strategy games tend to suffer from the design of console controllers, which do not allow fast, accurate movement.

Conversely, action games have found considerable popularity on video game consoles, making up nearly a third of all console video games sold in 2004, compared to just four percent on the computer. Sports games have also found greater support on game consoles compared to personal computers.

Modern computer games place great demand on the computer's hardware, often requiring a fast central processing unit (CPU) to function properly. CPU manufacturers historically relied mainly on increasing clock rates to improve the performance of their processors, but had begun to move steadily towards multi-core CPUs by 2005. These processors allow the computer to simultaneously process multiple tasks, called threads, allowing the use of more complex graphics, artificial intelligence and in-game physics.
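
The idea of splitting a game's work into threads can be sketched as follows (an illustrative example, not drawn from any particular engine; the subsystem names are hypothetical):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-frame subsystems that are independent of each other,
# and so can be scheduled on separate CPU cores.
def update_physics(dt):
    return f"physics advanced by {dt}s"

def update_ai(dt):
    return f"ai advanced by {dt}s"

def run_frame(dt=0.016):
    # Submit each subsystem as its own task; on a multi-core CPU the
    # operating system can execute these threads simultaneously.
    with ThreadPoolExecutor(max_workers=2) as pool:
        physics = pool.submit(update_physics, dt)
        ai = pool.submit(update_ai, dt)
        return physics.result(), ai.result()

print(run_frame())
```

In a real engine the subsystems share state and must synchronize, which is what makes multi-threaded game code considerably harder than this sketch suggests.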

Similarly, 3D games often rely on a powerful graphics processing unit (GPU), which accelerates the process of drawing complex scenes in real time. A GPU may be integrated into the computer's motherboard, the most common solution in laptops, or packaged on a discrete graphics card with a supply of dedicated video RAM, connected to the motherboard through an AGP or PCI Express slot. It is also possible to use multiple GPUs in a single computer, using technologies such as Nvidia's Scalable Link Interface and ATI's CrossFire.

Sound cards are also available to provide improved audio in computer games. These cards provide improved 3D audio and other enhancements generally not available with integrated alternatives, at the cost of marginally lower overall performance. The Creative Labs SoundBlaster line was for many years the de facto standard for sound cards, although its popularity dwindled as PC audio became a commodity feature of modern motherboards.

Physics processing units (PPUs), such as the Nvidia PhysX (formerly AGEIA PhysX) card, are also available to accelerate physics simulations in modern computer games. PPUs allow the computer to process more complex interactions among objects than is achievable using only the CPU, potentially allowing players a much greater degree of control over the world in games designed to use the card.

Virtually all personal computers use a keyboard and mouse for user input. Other common gaming peripherals are a headset for faster communication in online games, joysticks for flight simulators, steering wheels for driving games and gamepads for console-style games.

Computer games also rely on third-party software such as an operating system (OS), device drivers and libraries to run. Today, the vast majority of computer games are designed to run on the Microsoft Windows OS. Whereas earlier games written for MS-DOS would include code to communicate directly with hardware, today application programming interfaces (APIs) provide an interface between the game and the OS, simplifying game design. Microsoft's DirectX is an API that is widely used by today's computer games to communicate with sound and graphics hardware; OpenGL, a cross-platform API for graphics rendering, is also used. The version of the graphics card's driver installed can often affect game performance and gameplay. It is not unusual for a game company to use a third-party game engine, or third-party libraries for a game's AI or physics.

Multiplayer gaming was largely limited to local area networks (LANs) before cost-effective broadband Internet access became available, because LANs offered higher bandwidth and lower latency than the dial-up services of the time. These advantages allowed more players to join any given computer game, and they persist today because of the higher latency of most Internet connections and the costs associated with broadband Internet.

LAN gaming typically requires two or more personal computers, a router and sufficient networking cables to connect every computer on the network. Additionally, each computer must have a network card installed or integrated onto its motherboard in order to communicate with other computers on the network. Optionally, any LAN may include an external connection to the Internet.

Online multiplayer games have achieved popularity largely as a result of increasing broadband adoption among consumers. Affordable high-bandwidth Internet connections allow large numbers of players to play together, and have thus found particular use in massively multiplayer online RPGs, in games such as Tanarus, and in persistent online games such as World War II Online.

Although it is possible to participate in online computer games using dial-up modems, broadband Internet connections are generally considered necessary to reduce the latency between players (commonly known as "lag"). Such connections require a broadband-compatible modem connected to the personal computer through a network interface card (generally integrated onto the computer's motherboard), optionally separated by a router. Online games also require a shared virtual environment, generally called a "game server." These servers interconnect players, allowing real-time and often fast-paced action. To meet this need, game server providers (GSPs) have become increasingly popular over the last half decade. While not required by all players, these hosted servers provide a customizable "home" (with additional modifications, settings and so on), giving players the experience they desire. Today there are over 500,000 game servers hosted in North America alone.

Emulation software, used to run software without the original hardware, is popular for its ability to play legacy video games without the consoles or operating systems for which they were designed. Console emulators such as NESticle and MAME are relatively commonplace, although the complexity of modern consoles such as the Xbox or PlayStation makes them far more difficult to emulate, even for the original manufacturers.

Most emulation software mimics a particular hardware architecture, often to an extremely high degree of accuracy. This is particularly the case with classic home computers such as the Commodore 64, whose software often depends on highly sophisticated low-level programming tricks invented by game programmers and the demoscene.

PC games have long been a source of controversy, particularly related to the violence that has become commonly associated with video gaming in general. The debate surrounds the influence of objectionable content on the social development of minors, with organisations such as the American Psychological Association concluding that video game violence increases children's aggression, a concern that prompted a further investigation by the Centers for Disease Control and Prevention in September 2006. Industry groups have responded by noting the responsibility of parents in governing their children's activities, while attempts in the United States to control the sale of objectionable games have generally been found unconstitutional.

Video game addiction is another cultural aspect of gaming to draw criticism, as it can have a negative influence on health and on social relations. The problem of addiction and its health risks seems to have grown with the rise of massively multiplayer online role-playing games (MMORPGs).




IBM Personal Computer


The IBM Personal Computer, commonly known as the IBM PC, is the original version and progenitor of the IBM PC compatible hardware platform. It is IBM model number 5150, and was introduced on August 12, 1981. It was created by a team of engineers and designers under the direction of Don Estridge of the IBM Entry Systems Division in Boca Raton, Florida.

Alongside "microcomputer" and "home computer", the term "personal computer" was already in use before 1981. It was used as early as 1972 to characterize Xerox PARC's Alto. However, because of the success of the IBM Personal Computer, the term came to mean more specifically a microcomputer compatible with IBM's PC products.

The original line of PCs was part of an IBM strategy to get into the small computer market, then dominated by the Commodore PET, the Atari 8-bit family, the Apple II, Tandy Corporation's TRS-80s, and various CP/M machines. IBM's first desktop microcomputer was the IBM 5100, introduced in 1975. It was a complete system - with a built-in monitor, keyboard, and data storage. It was also very expensive - up to US$20,000. It was specifically designed for professional and scientific problem-solvers, not business users or hobbyists. When the PC was introduced in 1981, it was originally designated the IBM 5150, putting it in the "5100" series, though its architecture was not directly descended from the IBM 5100.

Rather than going through the usual IBM design process, a special team was assembled with authorization to bypass normal company restrictions and get something to market rapidly. This project was given the code name Project Chess at the IBM Entry Systems Division in Boca Raton, Florida. The team consisted of twelve people directed by Don Estridge, with Chief Scientist Larry Potter and Chief Systems Architect Lewis Eggebrecht. They developed the PC in about a year. To achieve this, they first decided to build the machine with "off-the-shelf" parts from a variety of original equipment manufacturers (OEMs) and countries; previously, IBM had always developed its own components. Secondly, for scheduling and cost reasons, rather than developing unique IBM PC monitor and printer designs, project management decided to utilize an existing "off-the-shelf" IBM monitor developed earlier by IBM Japan as well as an existing Epson printer model. Consequently, the unique IBM PC industrial design elements were relegated to the system unit and keyboard. They also decided on an open architecture, so that other manufacturers could produce and sell peripheral components and compatible software without purchasing licenses. IBM also sold an IBM PC Technical Reference Manual, which included a listing of the ROM BIOS source code.

At the time, Don Estridge and his team considered using the IBM 801 processor and its operating system, which had been developed at the Thomas J. Watson Research Center in Yorktown Heights, New York. (The 801 was an early RISC microprocessor designed by John Cocke and his team at Yorktown Heights.) The 801 was at least an order of magnitude more powerful than the Intel 8088, and its operating system was years more advanced than the DOS operating system from Microsoft that was finally selected. Ruling out an in-house solution made the team's job much easier and may have avoided a delay in the schedule, but the ultimate consequences of this decision for IBM were far-reaching. IBM had recently developed the Datamaster business microcomputer, which used an Intel processor and peripheral ICs; familiarity with these chips and the availability of the Intel 8088 processor were deciding factors in the choice of processor for the new product. Even the 62-pin expansion bus slots were designed to be similar to the Datamaster slots. Delays due to in-house development of the Datamaster software also pushed the design team toward a fast-track development process for the PC, with publicly available technical information to encourage third-party developers.

Other manufacturers soon reverse engineered the BIOS to produce their own non-infringing functional copies. Columbia Data Products introduced the first IBM-PC compatible computer in June 1982. In November 1982, Compaq Computer Corporation announced the Compaq Portable, the first portable IBM PC compatible. The first models were shipped in March 1983.

Once the IBM PC became a commercial success, the product came back under the more usual tight IBM management control. IBM's tradition of "rationalizing" their product lines, deliberately restricting the performance of lower-priced models in order to prevent them from "cannibalizing" profits from higher-priced models, worked against them.

ComputerLand and Sears Roebuck partnered with IBM from the beginning of development. IBM's head of sales and marketing, H.L. ('Sparky') Sparks, relied on these retail partners for important knowledge of the marketplace.

As a natural progression, ComputerLand and Sears became the main outlets for the new product. More than 190 ComputerLand stores already existed, while Sears was in the process of creating a handful of in-store computer centers for sale of the new product. This guaranteed IBM widespread distribution across the United States.

Sears Roebuck targeted the new PC at the home market, but sales failed to live up to expectations. This unfavorable outcome revealed that the original strategy - targeting the office market - was the key to higher sales.

In general, all IBM personal computers are software-compatible with each other, but not every program will work on every machine. Some programs are timing-sensitive to a particular speed class, and older programs cannot take advantage of newer, higher-resolution display standards.

The original PC had a version of Microsoft BASIC — IBM Cassette BASIC — in ROM. The CGA (Color Graphics Adapter) video card could use a standard television set or an RGBI monitor for display; IBM's RGBI monitor was their display model 5153. The other option that was offered by IBM was an MDA (Monochrome Display Adapter) and their monochrome display model 5151. It was possible to install both an MDA and a CGA card and use both monitors concurrently, if supported by the application program. For example, AutoCAD allowed use of a CGA card for graphics and a separate monochrome board for text menus. Some model 5150 PCs with CGA monitors and a printer port also included the MDA adapter by default, because IBM provided the MDA port and printer port on the same adapter card; it was in fact an MDA/printer port combo card.

Although the TV-compatible video board, cassette port and FCC Class B certification were all aimed at making it a home computer, the original PC proved too expensive for the home market. At introduction, a PC with 64 kB of RAM, a single 5¼-inch floppy drive and a monitor sold for US$3,005, while the cheapest configuration ($1,565), which had no floppy drives, only 16 kB of RAM and no monitor (again, the expectation was that users would connect their existing TV sets and cassette recorders), proved too unattractive and low-spec, even for its time. While the 5150 did not become a top-selling home computer, its floppy-based configuration became an unexpectedly large success with businesses.

The "IBM Personal Computer XT", IBM's model 5160, was an enhanced machine that was designed for business use. It had 8 expansion slots and a 10 megabyte hard disk (later versions 20MB). Unlike the model 5150 PC, the model 5160 XT no longer had a cassette jack. The XT could take 256 kB of memory on the main board (using 64 kbit DRAM); later models were expandable to 640 kB. (The 384 kB of BIOS ROM, video RAM, and adapter ROM space filled the rest of the one megabyte address space of the 8088 CPU.) It was usually sold with a Monochrome Display Adapter (MDA) video card. The processor was a 4.77 MHz Intel 8088 and the expansion bus 8-bit Industry Standard Architecture (ISA) with XT bus architecture. The XT's expansion slots were placed closer together than with the original PC; this rendered the XT's case and mainboard incompatible with the model 5150's case and mainboard. The slots themselves and the peripheral cards however were compatible. The XT's expansion slot spacing was identical to the one that is still used as of 2008, albeit with different actual slots and bus standards.

The "IBM Personal Computer/AT", announced August 1984, uses an Intel 80286 processor, originally at 6 MHz. It has a 16-bit ISA bus and 20 MB (20 million bytes) hard drive. A faster model, running at 8 MHz, was introduced in 1986. IBM made some attempt at marketing it as a multi-user machine, but it sold mainly as a faster PC for power users. Early PC/ATs were plagued with reliability problems, in part because of some software and hardware incompatibilities, but mostly related to the internal 20 MB hard disk. While some people blamed IBM's hard disk controller card and others blamed the hard disk manufacturer Computer Memories Inc. (CMI), the IBM controller card worked fine with other drives, including CMI's 33-megabyte model. The problems introduced doubt about the computer and, for a while, even about the 286 architecture in general, but after IBM replaced the 20 MB CMI drives, the PC/AT proved reliable and became a lasting industry standard.

The main circuit board in an IBM PC is called the motherboard (IBM terminology calls it a planar). This carries the CPU and memory, and has a bus with slots for expansion cards.

The bus used in the original PC became very popular, and was subsequently named ISA. It remains in service to this day in computers for industrial applications. Later, requirements for higher speed and more capacity forced the development of new versions. IBM introduced the MCA bus with the PS/2 line. The VESA Local Bus allowed for up to three much faster 32-bit cards, and the EISA architecture was developed as a backward-compatible standard including 32-bit card slots, but it only sold well in high-end server systems. The lower-cost and more general PCI bus was introduced in 1994 and has now become ubiquitous.

The motherboard is connected by cables to internal storage devices such as hard disks, floppy disks and CD-ROM drives. These tend to be made in standard sizes, such as 3.5" (90 mm) and 5.25" (133.4 mm) widths, with standard fixing holes. The case also contains a standard power supply unit (PSU) which is either an AT or ATX standard size.

Intel 8086 and 8088-based PCs require expanded memory (EMS) boards to work with more than one megabyte of memory. The original IBM PC AT used an Intel 80286 processor which can access up to 16 megabytes of memory (though standard DOS applications cannot use more than one megabyte without using additional APIs.) Intel 80286-based computers running under OS/2 can work with the maximum memory.
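
The address-space limits above follow directly from the width of each processor's address bus: the 8088's 20 address lines give 2^20 bytes (one megabyte), while the 80286's 24 lines give 2^24 bytes (16 megabytes). As a quick sanity check:

```python
# Address-space arithmetic for the processors discussed above.
ADDRESS_LINES = {"8088": 20, "80286": 24}

def address_space_bytes(cpu):
    # A CPU with n address lines can address 2**n distinct byte locations.
    return 2 ** ADDRESS_LINES[cpu]

print(address_space_bytes("8088") // 2**20)   # 1  megabyte
print(address_space_bytes("80286") // 2**20)  # 16 megabytes
```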

The original 1981 IBM PC's keyboard was an extremely reliable, high-quality electronic keyboard originally developed in North Carolina for the Datamaster system. Each key was rated for over 100 million keystrokes. For the IBM PC, a separate keyboard housing was designed with a novel usability feature that allowed users to adjust the keyboard angle for personal comfort. Compared with the keyboards of other small computers at the time, the IBM PC keyboard was far superior and played a significant role in establishing a high-quality impression. For example, the industrial design of the keyboard, together with the system unit, was recognized with a major design award. Byte magazine in the fall of 1981 went so far as to state that the keyboard was 50 percent of the reason to buy an IBM PC. The importance of the keyboard was definitively established when the 1983 IBM PCjr flopped, in very large part because its much different and mediocre Chiclet keyboard made a poor impression on customers. Oddly enough, the same thing almost happened to the original IBM PC when, in early 1981, management seriously considered substituting a cheaper but lower-quality keyboard. This mistake was narrowly avoided on the advice of one of the original development engineers.

However, the original 1981 IBM PC's keyboard was severely criticized by typists for its non-standard placement of the Return and left Shift keys. In 1984, IBM corrected this on its AT keyboard, but shortened the 'backspace' key, making it harder to reach. In 1987, it introduced the enhanced keyboard, which relocated all the function keys and the Ctrl keys. The Esc key was also relocated to the opposite side of the keyboard.

Another criticism of the original keyboard was the relatively loud "clack" each key made when pressed. Typewriter users were accustomed to keeping their eyes on the hardcopy they were typing from, and had come to rely on the mechanical sound made as each character struck the paper to confirm that they had pressed the key hard enough (and only once); the PC keyboard's electronic "clack" was intended to provide the same reassurance. However, it proved very noisy and annoying, especially when many PCs were in use in the same room, and later keyboards were significantly quieter.

An "IBM PC compatible" may have a keyboard that does not recognize every key combination a true IBM PC does, such as shifted cursor keys. In addition, the "compatible" vendors sometimes used proprietary keyboard interfaces, preventing the keyboard from being replaced.

Although the PC/XT and AT used the same style of keyboard connector, the low-level protocol for reading the keyboard was different between these two series. An AT keyboard could not be used in an XT, nor the reverse. Third-party keyboard manufacturers provided a switch to select either AT-style or XT-style protocol for the keyboard.

The serial port is an 8250 or a derivative (such as the 16450 or 16550), mapped to eight consecutive IO addresses and one interrupt request line.

Only the COM1: and COM2: addresses were defined by the original PC. Attempts to share IRQ 3 and IRQ 4 to use additional ports require special measures in hardware and software, since shared IRQs were not defined in the original PC design.
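
As a concrete illustration, the conventional PC assignments (well-known values, not stated in the text above) put COM1 at I/O base 0x3F8 with IRQ 4 and COM2 at base 0x2F8 with IRQ 3, each spanning the eight consecutive I/O addresses used by the 8250's registers:

```python
# Conventional IBM PC serial port assignments.
COM_PORTS = {
    "COM1": {"base": 0x3F8, "irq": 4},
    "COM2": {"base": 0x2F8, "irq": 3},
}

def register_addresses(port):
    # The 8250 UART exposes its registers at eight consecutive I/O addresses.
    base = COM_PORTS[port]["base"]
    return [base + offset for offset in range(8)]

print([hex(a) for a in register_addresses("COM1")])
```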

The original IBM PC used the 7-bit ASCII alphabet as its basis, but extended it to 8 bits with nonstandard character codes. This character set was not suitable for some international applications, and soon a veritable cottage industry emerged providing national variants of the original character set. In IBM tradition, these variants were called code pages. These encodings are now obsolete, having been replaced by more systematic and standardized forms of character coding, such as ISO 8859-1, Windows-1251 and Unicode. The original character set is known as code page 437.
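
Python still ships a codec for this character set, which makes the extension easy to demonstrate (an illustrative sketch; the byte values are taken from the published code page 437 table):

```python
# Code page 437 maps the upper 128 byte values to accented letters,
# Greek letters and box-drawing characters.
text = "é"                                # not representable in 7-bit ASCII
assert text.encode("cp437") == b"\x82"    # CP437 assigns 'é' to byte 0x82

# Box-drawing characters used by DOS-era text interfaces:
print(bytes([0xC9, 0xCD, 0xBB]).decode("cp437"))
```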

As mentioned above, IBM equipped the model 5150 with a cassette port for connecting a cassette drive, and originally intended compact cassettes to become the 5150's most common storage medium. However, adoption of the floppy- and monitor-less configuration was low; few (if any) IBM PCs left the factory without a floppy disk drive installed. Also, DOS was not available on cassette tape, only on floppy disks (hence "Disk Operating System"). 5150s with just external cassette recorders for storage could only use the built-in ROM BASIC as their operating system. As DOS saw increasing adoption, the incompatibility of DOS programs with PCs that used only cassettes for storage made this configuration even less attractive.

Most or all 5150 PCs had one or two 5¼-inch floppy disk drives. These were either single-sided double-density drives (SS/DD, aka SSDD) or double-sided double-density drives (DS/DD, aka DSDD); the IBM PC never used single-density floppy drives. The drives and disks were commonly referred to by capacity, e.g. "160KB floppy disk" or "360KB floppy drive", but because capacity alone is ambiguous, they are referred to here using the less common but more precise SSDD and DSDD terminology. DSDD drives were backwards compatible and could read and write SSDD floppies. The same type of physical diskette could be used in both drives; however, to convert a 5¼" SSDD disk to a DSDD disk, it had to be reformatted, at which point SSDD drives could no longer read it.

The disks were Modified Frequency Modulation (MFM) coded in 512-byte sectors, and were soft-sectored. They contained 40 tracks per side at the 48 track per inch (TPI) density, and initially were formatted to contain 8 sectors per track. This meant that SSDD disks initially had a formatted capacity of 160 KB, while DSDD disks had a capacity of 320 KB. However, the DOS operating system was later updated to allow formatting the disks with 9 sectors per track. This yielded a formatted capacity of 180 KB with SSDD disks/drives, and 360 KB with DSDD disks/drives. The unformatted capacity of the floppy disks was advertised as 250KB (SSDD) and 500KB (DSDD), however these "raw" 250/500KB were not the same thing as the usable formatted capacity; under DOS, the maximum capacity for SSDD and DSDD disks was 180KB and 360KB, respectively. Regardless of type, the file system of all floppy disks was FAT12.
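
The formatted capacities quoted above follow directly from the disk geometry (40 tracks per side, 512-byte sectors, one or two sides, 8 or 9 sectors per track):

```python
# Formatted floppy capacity from the geometry given above.
BYTES_PER_SECTOR = 512
TRACKS_PER_SIDE = 40

def capacity_kb(sides, sectors_per_track):
    total = sides * TRACKS_PER_SIDE * sectors_per_track * BYTES_PER_SECTOR
    return total // 1024

print(capacity_kb(1, 8))  # 160 KB: SSDD, original 8-sector format
print(capacity_kb(2, 8))  # 320 KB: DSDD, original 8-sector format
print(capacity_kb(1, 9))  # 180 KB: SSDD, later 9-sector format
print(capacity_kb(2, 9))  # 360 KB: DSDD, later 9-sector format
```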

While the SSDD drives initially were the only floppy drives available for the model 5150 PC, IBM later switched to DSDD drives, and the majority of 5150 PCs sold eventually shipped with one or two DSDD drives. The 5150's successor, the model 5160 IBM XT, never shipped with SSDD drives; it generally had one double-sided 360 kB drive (next to its internal hard disk). While it was technically possible to retrofit more advanced floppy drives such as the high-density drive (released in 1984) into the original IBM PC, this was not an option offered by IBM for the 5150 model, and the move to high-density 5.25" floppies in particular was notoriously fraught with compatibility problems.

IBM's original floppy disk controller card also included an external 37-pin D-shell connector. This allowed users to connect additional external floppy drives by third party vendors. IBM themselves did not offer external floppy drives.

The 5150 could not itself power hard drives without retrofitting a stronger power supply, but IBM later offered the 5161 Expansion Unit, which not only provided more expansion slots, but also included a 10MB (later 20MB) hard drive powered by the 5161's own separate 130-watt power supply.

The first IBM PC that shipped with an internal, fixed, non-removable hard disk was IBM's model 5160, the XT. However, as other IBM-compatible PCs started to appear, hard disks with larger storage capacities than the 5160's and 5161's initial 10MB (later 20MB) also became available, and could — space permitting — be installed into either the IBM PC's Expansion Unit or into PSU-upgraded model 5150 IBM PCs (or into XTs). Adding a third-party hard disk sometimes required plugging in a new controller board, because some of these hard drives were not compatible with the existing disk controller. Some third-party hard disks for IBM PCs were even sold as kits including a controller card and a replacement power supply. Finally, some hard disks were integrated with their controller on a single expansion board, commonly called a "Hard Card".

The IBM PC's ROM BASIC supported cassette tape storage, but DOS itself did not. PC-DOS version 1.00 supported only 160KB SSDD floppies; version 1.1, released nine months after the PC's introduction, added support for 320KB DSDD floppies. Support for the slightly larger 9-sector-per-track 180KB and 360KB formats arrived ten months after that, in March 1983.

All IBM PCs include a relatively small piece of software stored in ROM. The original IBM PC 40 KB ROM included 8 KB for power-on self-test (POST) and basic input/output system (BIOS) functions plus 32 KB BASIC in ROM (Cassette BASIC). The ROM BASIC interpreter was the default user interface if no DOS boot disk was present. BASICA was distributed on floppy disk and provided a way to run the ROM BASIC under PC-DOS control.

In addition to PC-DOS, buyers could choose either CP/M-86 or UCSD p-System as operating systems. Due to their higher prices, they never became very popular and PC-DOS or MS-DOS came to be the dominant operating system.

While IBM PC technology is largely obsolete by today's standards, many machines are still in service. As of June 2006, IBM PC and XT models were still in use at the majority of U.S. National Weather Service upper-air observing sites, processing data returned by ascending radiosondes attached to weather balloons. They are being phased out over a period of several years, to be replaced by the Radiosonde Replacement System.




Personal computer

An exploded view of a modern personal computer and peripherals: scanner; CPU (microprocessor); primary storage (RAM); expansion cards (graphics cards, etc.); power supply; optical disc drive; secondary storage (hard disk); motherboard; speakers; monitor; system software; application software; keyboard; mouse; external hard disk; printer.

A personal computer (PC) is any general-purpose computer whose original sales price, size, and capabilities make it useful for individuals, and which is intended to be operated directly by an end user, with no intervening computer operator.

Today a PC may be a desktop computer, a laptop computer or a tablet computer. The most common operating systems are Microsoft Windows, Mac OS X and Linux, while the most common microprocessors are x86-compatible CPUs, ARM architecture CPUs and PowerPC CPUs. Software applications for personal computers include word processing, spreadsheets, databases, games, and a myriad of personal productivity and special-purpose software. Modern personal computers often have high-speed or dial-up connections to the Internet, allowing access to the World Wide Web and a wide range of other resources.

A PC may be a home computer, or may be found in an office, often connected to a local area network. Its distinguishing characteristic is that it is used interactively by one person at a time. This is in contrast to the batch-processing or time-sharing models, which allowed large, expensive systems to be used by many people, usually at the same time, and to large data-processing systems, which required a full-time staff to operate efficiently.

While early PC owners usually had to write their own programs to do anything useful with the machines, today's users have access to a wide range of commercial and non-commercial software which is easily installed.

The capabilities of the PC have changed greatly since the introduction of electronic computers. By the early 1970s, people in academic or research institutions had the opportunity for single-person use of a computer system in interactive mode for extended durations, although these systems would still have been too expensive to be owned by a single person. The introduction of the microprocessor, a single chip with all the circuitry that formerly occupied large cabinets, led to the proliferation of personal computers after about 1975. Early personal computers - generally called microcomputers - were often sold in electronic kit form and in limited volumes, and were of interest mostly to hobbyists and technicians. Minimal programming was done by toggle switches, and output was provided by front panel indicators. Practical use required peripherals such as keyboards, computer terminals, disk drives, and printers. By 1977, mass-market pre-assembled computers allowed a wider range of people to use computers, focusing more on software applications and less on development of the processor hardware.

Throughout the late 1970s and into the 1980s, computers were developed for household use, offering personal productivity, programming and games. Somewhat larger and more expensive systems (although still low-cost compared with minicomputers and mainframes) were aimed at office and small business use. Workstations are characterized by high-performance processors and graphics displays, large local disk storage, networking capability, and a multitasking operating system. Workstations are still used for tasks such as computer-aided design, drafting and modelling, computation-intensive scientific and engineering calculations, image processing, architectural modelling, and computer graphics for animation and motion picture visual effects.

Eventually the market segments lost any technical distinction; business computers acquired color graphics and sound, while users of home computers and game systems used the same processors and operating systems as office workers. Mass-market computers had graphics capabilities and memory comparable to dedicated workstations of a few years before. Even local area networking, originally a way to allow business computers to share expensive mass storage and peripherals, became a standard feature of personal computers used at home.

In 2001, 125 million personal computers were shipped, compared with 48 thousand in 1977. More than 500 million PCs were in use in 2002, and one billion personal computers had been sold worldwide from the mid-1970s to that point. Of that figure, 75 percent were professional or work related, while the rest were sold for personal or home use. About 81.5 percent of PCs shipped had been desktop computers, 16.4 percent laptops and 2.1 percent servers. The United States had received 38.8 percent (394 million) of the computers shipped, Europe 25 percent, and 11.7 percent had gone to the Asia-Pacific region, the fastest-growing market as of 2002. The second billion was expected to be sold by 2008. Almost half of all households in Western Europe had a personal computer, and a computer could be found in 40 percent of homes in the United Kingdom, compared with only 13 percent in 1985.

Global PC shipments were 264 million units in 2007, according to iSuppli, up 11.2 percent from 239 million in 2006. In 2004, global shipments were 183 million units, an 11.6 percent increase over 2003. In 2003, 152.6 million PCs were shipped, at an estimated value of $175 billion. In 2002, 136.7 million PCs were shipped, at an estimated value of $175 billion. In 2000, 140.2 million PCs were shipped, at an estimated value of $226 billion. Worldwide shipments of PCs surpassed the 100-million mark in 1999, growing to 113.5 million units from 93.3 million units in 1998. In 1999, Asia had 14.1 million units shipped.

As of June 2008, the number of personal computers in use worldwide hit one billion, while another billion is expected to be reached by 2014. Mature markets like the United States, Western Europe and Japan accounted for 58 percent of the worldwide installed PCs. The emerging markets were expected to double their installed PCs by 2013 and to take 70 percent of the second billion PCs. About 180 million PCs (16 percent of the existing installed base) were expected to be replaced and 35 million to be dumped into landfill in 2008. The whole installed base grew 12 percent annually.

In the developed world, vendors traditionally kept adding functions to maintain the high prices of personal computers. However, since the introduction of the One Laptop per Child foundation and its low-cost XO-1 laptop, the computing industry has started to compete on price as well. Although the category had been introduced only one year earlier, 14 million netbooks were sold in 2008. Besides the regular computer manufacturers, companies making especially rugged versions of computers have sprung up, offering alternatives for people operating their machines in extreme weather or environments.

The emergence of a new market segment of small, energy-efficient and low-cost devices (netbooks and nettops) could threaten established companies like Microsoft, Intel, HP or Dell, analysts said in July 2008. Market research firm International Data Corporation predicted that the category could grow from fewer than 500,000 units in 2007 to 9 million in 2012 as the market for low-cost and secondhand computers expands in developed economies. Also, after Microsoft ceased selling Windows XP for ordinary machines, it made an exception and continued to offer the operating system to netbook and nettop makers.

Prior to the widespread use of PCs, a computer that could fit on a desk was considered remarkably small. Today the phrase usually indicates a particular style of computer case. Desktop computers come in a variety of styles, ranging from large vertical tower cases to small form factor models that can be tucked behind an LCD monitor. In this sense, the term 'desktop' refers specifically to a horizontally oriented case, usually intended to have the display screen placed on top to save space on the desk. Most modern desktop computers have separate screens and keyboards.

A subtype of desktops, called nettops, was introduced by Intel in February 2008 to describe low-cost, lean-function, desktop computers. A similar subtype of laptops (or notebooks) are the netbooks (see below).

A laptop computer or simply laptop, also called a notebook computer or sometimes a notebook, is a small personal computer designed for mobility. Usually all of the interface hardware needed to operate the laptop, such as parallel and serial ports, graphics card, sound channel, etc., is built into a single unit. Most laptops contain batteries to facilitate operation without a readily available electrical outlet. In the interest of saving power, weight and space, they usually share RAM with the video channel, slowing their performance compared to an equivalent desktop machine.

One main drawback of the laptop is that, due to the size and configuration of components, relatively little can be done to upgrade the overall computer from its original design. Some devices can be attached externally through ports (including via USB); however, internal upgrades are not recommended or in some cases impossible, making the desktop PC more modular.

A subtype of notebooks, called subnotebooks, are computers with most of the features of a standard laptop computer but smaller. They are larger than hand-held computers, and usually run full versions of desktop/laptop operating systems. Ultra-Mobile PCs (UMPC) are usually considered subnotebooks, or more specifically, subnotebook Tablet PCs (see below). Netbooks are sometimes considered in this category, though they are sometimes separated in a category of their own (see below).

Desktop replacements, meanwhile, are large laptops meant to replace a desktop computer while keeping the mobility of a laptop.

Netbook PCs are small portable computers in a "clamshell" design, built specifically for wireless communication and access to the Internet. They are generally much lighter and cheaper than subnotebooks, and have a smaller display, between 7" and 9", with a screen resolution between 800x600 and 1024x768. The operating systems and applications on them are usually specially modified so they can be comfortably used with a smaller screen, and the OS is often based on Linux, although some netbooks run Windows XP. Some netbooks use their built-in high-speed wireless connectivity to offload some of their application software to Internet servers, following the principle of cloud computing, as most have small solid-state storage systems instead of hard disks. Storage capacities are usually in the 4 to 16 GB range. One of the first examples of such a system was the original Eee PC.

A tablet PC is a notebook- or slate-shaped mobile computer, first introduced by Pen Computing in the early 1990s with their PenGo Tablet Computer and popularized by Microsoft. Its touchscreen or graphics tablet/screen hybrid technology allows the user to operate the computer with a stylus or digital pen, or a fingertip, instead of a keyboard or mouse. The form factor offers a more mobile way to interact with a computer. Tablet PCs are often used where normal notebooks are impractical or unwieldy, or do not provide the needed functionality.

The ultra-mobile PC (UMPC) is a specification for a small form factor tablet PC. It was developed as a joint development exercise by Microsoft, Intel, and Samsung, among others. Current UMPCs typically feature the Windows XP Tablet PC Edition 2005, Windows Vista Home Premium Edition, or Linux operating system and low-voltage Intel Pentium or VIA C7-M processors in the 1 GHz range.

A home theater PC (HTPC) is a convergence device that combines the functions of a personal computer and a digital video recorder. It is connected to a television or a television-sized computer display and is often used as a digital photo, music and video player, TV receiver and digital video recorder. Home theater PCs are also referred to as media center systems or media servers. The general goal of an HTPC is usually to combine many or all components of a home theater setup into one box. They can be purchased pre-configured with the hardware and software needed to add television programming to the PC, or can be assembled from discrete components, as is commonly done with Windows Media Center, GB-PVR, SageTV, Famulent or LinuxMCE.

A pocket PC is a hardware specification for a handheld-sized computer (personal digital assistant) that runs the Microsoft Windows Mobile operating system. It may have the capability to run an alternative operating system like NetBSD or Linux. It has many of the capabilities of modern desktop PCs.

Currently there are tens of thousands of applications for handhelds adhering to the Microsoft Pocket PC specification, many of which are freeware. Some of these devices also include mobile phone features. Microsoft compliant Pocket PCs can also be used with many other add-ons like GPS receivers, barcode readers, RFID readers, and cameras. In 2007, with the release of Windows Mobile 6, Microsoft dropped the name Pocket PC in favor of a new naming scheme. Devices without an integrated phone are called Windows Mobile Classic instead of Pocket PC. Devices with an integrated phone and a touch screen are called Windows Mobile Professional.

These components can usually be put together with little knowledge to build a computer. The motherboard is a main part of a computer that connects all devices together. The memory card(s), graphics card and processor are mounted directly onto the motherboard (the processor in a socket and the memory and graphics cards in expansion slots). The mass storage is connected to it with cables and can be installed in the computer case or in a separate case. This is the same for the keyboard and mouse, except that they are external and connect to the I/O panel on the back of the computer. The monitor is also connected to the I/O panel, either through an onboard port on the motherboard, or a port on the graphics card.

Several functions (implemented by chipsets) can be integrated into the motherboard, typically USB and network, but also graphics and sound. Even if these are present, a separate card can be added if what is available isn't sufficient. The graphics and sound card can have a break out box to keep the analog parts away from the electromagnetic radiation inside the computer case. For really large amounts of data, a tape drive can be used or (extra) hard disks can be put together in an external case.

The hardware capabilities of personal computers can sometimes be extended by the addition of expansion cards connected via an expansion bus. Some standard peripheral buses often used for adding expansion cards in personal computers as of 2005 are PCI, AGP (a high-speed PCI bus dedicated to graphics adapters), and PCI Express. Most personal computers as of 2005 have multiple physical PCI expansion slots. Many also include an AGP bus and expansion slot or a PCI Express bus and one or more expansion slots, but few PCs contain both buses.

A computer case is the enclosure that contains the main components of a computer. Cases are usually constructed from steel, aluminium, or plastic, although other materials such as wood and plexiglass have also been used in case designs. Cases come in many different sizes, or form factors. The size and shape of a computer case is usually determined by the form factor of the motherboard it is designed to accommodate, since this is the largest and most central component of most computers. Consequently, personal computer form factors typically specify only the internal dimensions and layout of the case. Form factors for rack-mounted and blade servers may include precise external dimensions as well, since these cases must themselves fit in specific enclosures.

Currently, the most popular form factor for desktop computers is ATX, although microATX and small form factors have become very popular for a variety of uses. Companies like Shuttle Inc. and AOpen have popularized small cases, for which FlexATX is the most common motherboard size.

The central processing unit, or CPU, is the part of a computer that executes software program instructions. In older computers this circuitry was spread across several printed circuit boards, but in PCs it is a single integrated circuit. Nearly all PCs contain a type of CPU known as a microprocessor. The microprocessor often plugs into the motherboard using one of many different types of sockets. IBM PC compatible computers use an x86-compatible processor, usually made by Intel, AMD, VIA Technologies or Transmeta. Apple Macintosh computers were initially built with the Motorola 680x0 family of processors, then switched to the PowerPC series (a RISC architecture jointly developed by Apple Computer, IBM and Motorola); as of 2006, Apple switched again, this time to x86-compatible processors from Intel. Modern CPUs are equipped with a cooling fan attached via a heat sink.

The motherboard, also referred to as the system board or mainboard, is the primary circuit board within a personal computer. Many other components connect directly or indirectly to the motherboard. Motherboards usually contain one or more CPUs; supporting circuitry, usually integrated circuits (ICs), providing the interface between the CPU, memory, and input/output peripheral circuits; main memory; and facilities for initial setup of the computer immediately after power-on (often called boot firmware or, in IBM PC compatible computers, a BIOS). In many portable and embedded personal computers, the motherboard houses nearly all of the PC's core components. Often a motherboard will also contain one or more peripheral buses and physical connectors for expansion purposes. Sometimes a secondary daughterboard is connected to the motherboard to provide further expandability or to satisfy space constraints.

A PC's main memory is fast storage that is directly accessible by the CPU, and is used to store the currently executing program and immediately needed data. PCs use semiconductor random access memory (RAM) of various kinds such as DRAM or SRAM as their primary storage. Which exact kind depends on cost/performance issues at any particular time. Main memory is much faster than mass storage devices like hard disks or optical discs, but is usually volatile, meaning it does not retain its contents (instructions or data) in the absence of power, and is much more expensive for a given capacity than is most mass storage. Main memory is generally not suitable for long-term or archival data storage.

Mass storage devices store programs and data even when the power is off; they do require power to perform read and write functions during usage. Although semiconductor flash memory has dropped in cost, the prevailing form of mass storage in personal computers is still the electromechanical hard disk.

The disk drives use a sealed head/disk assembly (HDA) which was first introduced by IBM's "Winchester" disk system. The use of a sealed assembly allowed the use of positive air pressure to drive out particles from the surface of the disk, which improves reliability.

If the mass storage controller provides for expandability, a PC may also be upgraded by the addition of extra hard disk or optical disc drives. For example, DVD-ROMs, CD-ROMs, and various optical disc recorders may all be added by the user to certain PCs. Standard internal storage device interfaces are ATA, Serial ATA, SCSI, and CF+ type II in 2005.

The video card - otherwise called a graphics card, graphics adapter or video adapter - processes and renders the graphics output from the computer to the computer display, also called the visual display unit (VDU), and is an essential part of the modern computer. On older models, and today on budget models, graphics circuitry tended to be integrated with the motherboard but, for modern flexible machines, they are supplied in PCI, AGP, or PCI Express format.

When the IBM PC was introduced, most existing business-oriented personal computers used text-only display adapters and had no graphics capability. Home computers at that time had graphics compatible with television signals, but with low resolution by modern standards, owing to the limited memory available to the eight-bit processors of the era.

A visual display unit (also called monitor) is a piece of electrical equipment, usually separate from the computer case, which displays viewable images generated by a computer without producing a permanent record. The word "monitor" is used in other contexts; in particular in television broadcasting, where a television picture is displayed to a high standard. A computer display device is usually either a cathode ray tube or some form of flat panel such as a TFT LCD. The monitor comprises the display device, circuitry to generate a picture from electronic signals sent by the computer, and an enclosure or case. Within the computer, either as an integral part or a plugged-in interface, there is circuitry to convert internal data to a format compatible with a monitor. The images from monitors originally contained only text, but as Graphical user interfaces emerged and became common, they began to display more images and multimedia content.

In computing, a keyboard is an arrangement of buttons that each correspond to a function, letter, or number. Keyboards are the primary devices for inputting text. In most cases, they contain an array of keys specifically organized with the corresponding letters, numbers, and functions printed or engraved on the buttons. They are generally designed around an operator's language, and many different versions for different languages exist. In English, the most common layout is the QWERTY layout, which was originally used in typewriters. Keyboards have evolved over time, and have been modified for use in computers with the addition of function keys, number keys, arrow keys, and OS-specific keys. Often, specific functions can be achieved by pressing multiple keys at once or in succession, such as inputting characters with accents or opening a task manager. Programs use keyboard shortcuts differently, assigning them to program-specific operations such as refreshing a web page in a web browser or selecting all text in a word processor.
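One common way programs implement this dispatch is a lookup table mapping key combinations to handler functions. The sketch below is purely illustrative; the shortcut names, handlers and return values are invented for this example and are not tied to any real application:

```python
# Hypothetical sketch of keyboard-shortcut dispatch: a table maps
# key combinations to handler functions. Shortcuts and actions here
# are invented for illustration.

def select_all(document):
    # "Select" the entire document by returning its full extent.
    return {"selection": (0, len(document))}

def refresh(document):
    # Stand-in for a browser-style page reload.
    return {"reloaded": True}

SHORTCUTS = {
    ("ctrl", "a"): select_all,  # common in word processors
    ("f5",): refresh,           # common in web browsers
}

def handle_keys(keys, document):
    # Look up the pressed combination; unbound keys do nothing.
    handler = SHORTCUTS.get(tuple(keys))
    return handler(document) if handler else None

print(handle_keys(("ctrl", "a"), "hello"))  # {'selection': (0, 5)}
```

Real applications hook such a table into the operating system's keyboard events, but the table-lookup structure is the same.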

A mouse is a small device that users slide around to point at, click, and sometimes drag objects on screen in a graphical user interface. Almost all personal computers have mice. A mouse may be plugged into a computer's rear mouse socket, connected as a USB device, or, more recently, connected wirelessly via a USB or Bluetooth receiver. In the past, mice had a single button that users could press to "click" on whatever the pointer on the screen was hovering over. Today, many mice have two or three buttons: a "right click" button, which performs a secondary action on a selected object, and a scroll wheel, which users rotate with their fingers to scroll up or down. The scroll wheel can also be pressed down, and so serve as a third button. Different programs make use of these functions differently; they may, for example, scroll horizontally by default with the scroll wheel, or open different menus with different buttons.

Mice traditionally detected movement and communicated with the computer via an internal "mouse ball", using optical encoders to detect rotation of the ball and tell the computer where the mouse had moved. However, these systems were subject to low durability and accuracy. Modern mice use optical technology to directly trace movement of the surface under the mouse and are much more accurate and durable. They work on a wider variety of surfaces and can even operate on walls, ceilings or other non-horizontal surfaces.

All computers require either fixed or removable storage for their operating system, programs and user-generated material. Formerly the 5 1/4 inch and 3 1/2 inch floppy drives were the principal forms of removable storage, used for backup of user files and distribution of software.

As memory sizes increased, the capacity of the floppy did not keep pace; the Zip drive and other higher-capacity removable media were introduced but never became as prevalent as the floppy drive.

By the late 1990s the optical drive, in CD and later DVD and Blu-ray Disc, became the main method for software distribution, and writeable media provided backup and file interchange. Floppy drives have become uncommon in desktop personal computers since about 2000, and were dropped from many laptop systems even earlier.

Early home computers used compact audio cassettes for file storage; these were at the time a very low-cost storage solution, but were displaced by floppy disk drives when manufacturing costs dropped, by the mid-1980s.

A second generation of tape recorders was provided when Videocassette recorders were pressed into service as backup media for larger disk drives. All these systems were less reliable and slower than purpose-built magnetic tape drives. Such tape drives were uncommon in consumer-type personal computers but were a necessity in business or industrial use.

Interchange of data such as photographs from digital cameras is greatly expedited by installation of a card reader, which often is compatible with several forms of flash memory. It is usually faster and more convenient to move large amounts of data by removing the card from the mobile device, instead of communicating with the mobile device through a USB interface.

A USB flash drive today performs much of the data transfer and backup functions formerly done with floppy drives, Zip disks and other devices. Mainstream current operating systems for personal computers provide standard support for flash drives, allowing interchange even between computers using different processors and operating systems. The compact size and lack of moving parts or dirt-sensitive media, combined with low cost for high capacity, have made flash drives a popular and useful accessory for any personal computer user.

The operating system (e.g.: Microsoft Windows, Mac OS, Linux or many others) can be located on any removable storage, but typically it is on one of the hard disks. A Live CD is also possible, but it is very slow and is usually used for installation of the OS, demonstrations, or problem solving. Flash-based memory is currently expensive (as of mid-2008) but is starting to appear in laptop computers because of its low weight and low energy consumption, compared to hard disk storage.

Computer software is a general term used to describe a collection of computer programs, procedures and documentation that perform some tasks on a computer system. The term includes application software such as word processors which perform productive tasks for users, system software such as operating systems, which interface with hardware to provide the necessary services for application software, and middleware which controls and co-ordinates distributed systems.

Software applications for word processing, Internet browsing, Internet faxing, e-mail and other digital messaging, multimedia playback, computer game play and computer programming are common. The user of a modern personal computer may have significant knowledge of the operating environment and application programs, but is not necessarily interested in programming, nor even able to write programs for the computer. Therefore, most software written primarily for personal computers tends to be designed with simplicity of use, or "user-friendliness", in mind. However, the software industry continuously provides a wide range of new products for use in personal computers, targeted at both the expert and the non-expert user.

An operating system (OS) manages computer resources and provides programmers with an interface used to access those resources. An operating system processes system data and user input, and responds by allocating and managing tasks and internal system resources as a service to users and programs of the system. An operating system performs basic tasks such as controlling and allocating memory, prioritizing system requests, controlling input and output devices, facilitating computer networking and managing files.
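For instance, an application performs the file-management tasks listed above by requesting services from the operating system rather than by touching hardware directly. The sketch below uses Python's standard library, which wraps the underlying system calls; the file and directory names are invented for illustration:

```python
import os
import tempfile

# Sketch: a program asks the operating system for services instead of
# accessing the disk hardware directly. Each call below is handled by
# the OS on the program's behalf.
workdir = tempfile.mkdtemp()              # OS allocates a directory
path = os.path.join(workdir, "note.txt")  # hypothetical file name

with open(path, "w") as f:                # OS opens a file handle
    f.write("hello")                      # OS buffers and schedules the write

print(os.listdir(workdir))                # OS enumerates the directory
os.remove(path)                           # OS frees the storage
os.rmdir(workdir)                         # OS deallocates the directory
```

The same pattern of mediated access applies to the other resources mentioned above: memory, input/output devices, and the network.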

Common contemporary desktop OSes are Microsoft Windows (~91% market share), Mac OS X (~8%), Linux (0.7%), Solaris and PC-BSD. Windows, Mac, and Linux all have server and personal variants. With the exception of Microsoft Windows, the designs of each of the aforementioned OSs were inspired by, or directly inherited from, the Unix operating system. Unix was developed at Bell Labs beginning in the late 1960s and spawned the development of numerous free and proprietary operating systems.

Microsoft Windows is the name of several families of software operating systems by Microsoft. Microsoft first introduced an operating environment named Windows in November 1985 as an add-on to MS-DOS in response to the growing interest in graphical user interfaces (GUIs). The most recent client version of Windows is Vista SP1. The current server version of Windows is Windows Server 2008.

Linux is a family of Unix-like computer operating systems. Linux is one of the most prominent examples of free software and open source development: typically all underlying source code can be freely modified, used, and redistributed by anyone. The name "Linux" comes from the Linux kernel, started in 1991 by Linus Torvalds. The system's utilities and libraries usually come from the GNU operating system, announced in 1983 by Richard Stallman. The GNU contribution is the basis for the alternative name GNU/Linux.

Predominantly known for its use in servers, Linux is supported by corporations such as Dell, Hewlett-Packard, IBM, Novell, Oracle Corporation, Red Hat, Canonical Ltd. and Sun Microsystems. It is used as an operating system for a wide variety of computer hardware, including desktop computers, supercomputers, video game systems, such as the PlayStation 3, several arcade games, and embedded devices such as mobile phones, routers, and stage lighting systems.

Mac OS X is a line of graphical operating systems developed, marketed, and sold by Apple Inc.. Mac OS X is the successor to the original Mac OS, which had been Apple's primary operating system since 1984. Unlike its predecessors, Mac OS X is a Unix-based operating system.

Application software employs the capabilities of a computer directly and thoroughly to a task that the user wishes to perform. This should be contrasted with system software which is involved in integrating a computer's various capabilities, but typically does not directly apply them in the performance of tasks that benefit the user. In this context the term application refers to both the application software and its implementation. A simple, if imperfect analogy in the world of hardware would be the relationship of an electric light bulb (an application) to an electric power generation plant (a system). The power plant merely generates electricity, not itself of any real use until harnessed to an application like the electric light that performs a service that benefits the user.

Typical examples of software applications are word processors, spreadsheets, and media players. Multiple applications bundled together as a package are sometimes referred to as an application suite. Microsoft Office and OpenOffice.org, which bundle together a word processor, a spreadsheet, and several other discrete applications, are typical examples. The separate applications in a suite usually have a user interface that has some commonality making it easier for the user to learn and use each application. And often they may have some capability to interact with each other in ways beneficial to the user. For example, a spreadsheet might be able to be embedded in a word processor document even though it had been created in the separate spreadsheet application.

End-user development tailors systems to meet the user's specific needs. User-written software includes spreadsheet templates, word processor macros, scientific simulations, and graphics and animation scripts. Even e-mail filters are a kind of user software. Users create this software themselves and often overlook how important it is.
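An e-mail filter of the kind mentioned above can be sketched in a few lines. The rules, folder names and message fields below are invented for illustration; real filters hook into a mail client's rule engine:

```python
# Hypothetical user-written e-mail filter: route messages to folders
# by simple rules. Rules, folders, and field names are made up.

RULES = [
    (lambda msg: "invoice" in msg["subject"].lower(), "Finance"),
    (lambda msg: msg["from"].endswith("@example.org"), "Colleagues"),
]

def route(msg):
    # The first matching rule wins; everything else stays in the inbox.
    for matches, folder in RULES:
        if matches(msg):
            return folder
    return "Inbox"

print(route({"subject": "Invoice #42", "from": "shop@example.com"}))  # Finance
```

Even a tiny script like this captures why end-user development matters: the rules encode knowledge only that user has.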

Most personal computers are standardized to the point that purchased software is expected to run with little or no customization for the particular computer. Many PCs are also user-upgradeable, especially desktop and workstation class computers. Devices such as main memory, mass storage, even the motherboard and central processing unit may be easily replaced by an end user. This upgradeability is, however, not indefinite due to rapid changes in the personal computer industry. A PC that was considered top-of-the-line five or six years prior may be impractical to upgrade due to changes in industry standards. Such a computer usually must be totally replaced once it is no longer suitable for its purpose. This upgrade and replacement cycle is partially related to new releases of the primary mass-market operating system, which tends to drive the acquisition of new hardware and render obsolete previously serviceable hardware (planned obsolescence).

The processing environment may also render an older computer obsolete even if it is still in good working order. As the memory (RAM) and processing speed of the average computer increases, websites are built or rebuilt based on the expectation of this increased computing power. This spurs the development of still faster processors and higher RAM capacities, as the cycle continues.

Due to the extremely short life-span of the average PC, approximately 130,000 personal computers are thrown out per day in the U.S. alone. This statistic underscores the growing importance of electronics recycling.




Source : Wikipedia