Tuesday, June 17, 2008

All about computers

The word computer once meant a person who did computations, but now it almost always refers to automated electronic devices. Computers can do much more than calculate, however. They are now used in all sorts of ways to better control or automate products and processes. For example, computers are used in airplanes and automobiles to control the way that fuel is injected into the engine, and they are used to monitor every part of the production process in most modern factories. Computers help people write reports, draw pictures, and keep track of information. Since the invention of the Internet, computers have also been used to gather information from digital libraries located all over the world, to send and receive electronic messages (e-mail), and to work, shop, and bank from home.

Computers come in many sizes and shapes. They range from small devices that perform one specific function, such as those in cameras that control the shutter speed, to supercomputers. Supercomputers are specially engineered to be able to perform trillions of operations per second. Because they are so powerful, and therefore so expensive, they are generally used only by government agencies and large research centers.

The computers that most individuals and businesses use to perform such tasks as word processing, managing databases, generating mailing lists, tracking inventory, and calculating accounting information are common desktop personal computers (PCs). A smaller version of the PC is the laptop or notebook computer. These smaller computers have all the same components as a desktop version and can be just as powerful, but they can fit into a briefcase and are easily portable. Today's PCs are powerful machines that can perform millions or even billions of operations per second.

Often all of the PCs at a business or university are connected to a central computer called a server. Servers are fast computers that have greater data-processing capabilities than most PCs and can be used by many people at once. Together, these connections form a local area network (LAN), which allows users to share information.

Parts of a computer system

A computer system requires both hardware and software. Hardware includes all of the physical parts of a computer. Software consists of the instructions and data that the hardware uses to perform its tasks.

Hardware

All computers, no matter how large or small, have basically the same types of hardware. These include a central processing unit (CPU), memory, storage (secondary memory), input/output (I/O) devices, and some type of telecommunication device.

The CPU is the computer's “brain,” where all computations are performed. The computer carries out its computations one step at a time, with each step occurring on each “beat” of its built-in clock. The fastest computer clocks now run at more than 3 GHz (gigahertz), or more than three billion beats per second.
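To get a feel for what a gigahertz means, here is a small back-of-the-envelope calculation (not from the article itself): at 3 GHz, each “beat” of the clock lasts only about a third of a nanosecond.

```python
# At 3 GHz the clock beats three billion times per second,
# so each beat lasts one three-billionth of a second.
clock_speed_hz = 3_000_000_000          # 3 GHz
seconds_per_beat = 1 / clock_speed_hz
print(f"{seconds_per_beat * 1e9:.3f} nanoseconds per beat")
# 0.333 nanoseconds per beat
```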

Memory is where instructions and data are held while being worked on. Read-only memory (ROM) is built into the computer and cannot be changed. ROM contains instructions that the computer needs to start up. Random-access memory (RAM), or one of its variants, is typically used for the main computer memory because of its speed. Information is stored temporarily in RAM as a computer processes data and instructions.

Secondary memory is where instructions and data are saved for long-term storage. Most computers use a magnetic device called a hard drive for storage. A hard drive accesses data very quickly. Slower devices, such as tape drives and optical drives, are often used to store files on magnetic tape or on optical discs such as compact discs (CDs) and digital video discs (DVDs).

I/O devices enable communication between a computer and the person using it. Input devices allow the user to enter data or commands for processing by the CPU. They include the keyboard, mouse, joystick, scanner, and digital tablet. Output devices let the user see or hear the results of the computer's data processing. They include the monitor, printer, and speakers.

Telecommunication devices enable computers to send data through telephone lines or other channels. In this way computer users can exchange information with one another. These devices include regular telephone modems, digital subscriber line (DSL) telephone modems, cable modems, and various wireless modems.

Software

Computer software includes all of the instructions that tell a computer what to do. These instructions are called programs. Software is typically stored on storage devices such as a hard drive or CD-ROM.

Software falls into two basic categories: system software and application software. System software, or the operating system, is what controls the computer's use of its resources (CPU, memory, storage, I/O devices, modem). The most popular operating systems for PCs are Microsoft Corporation's Windows OS, Apple Computer's Mac OS, and Linux. These operating systems use a graphical user interface (GUI). GUIs use icons (symbols) that can be clicked with a mouse to start programs and open files. Application software is used to perform specific tasks, such as designing a building, writing a paper, or playing a video game. The most common applications are discussed below.

How computers work

Computers represent information using the binary (or base-two) number system. The binary number system uses only two digits, 0 and 1, instead of the ten digits, 0–9, that are generally used in everyday life. In computer terms, binary digits are called bits. The binary system is used because the digits 0 and 1 can easily represent the only two states of a computer's electric circuits—“on” or “off.” In this form the numeral 5, for example, which in binary is 101, means “on-off-on.” The codes computers actually use are more complicated than this, but this is the general idea. (See also numbers and counting systems.)
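The on-off reading of bits described above can be sketched in a few lines of code. This is only an illustration of the idea, not how a computer's circuits are actually programmed:

```python
# Write the number 5 in binary, then read each bit as an
# "on" or "off" circuit state.
n = 5
bits = bin(n)[2:]                     # '101'
states = ["on" if b == "1" else "off" for b in bits]
print(bits, "->", "-".join(states))   # 101 -> on-off-on
```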

To handle more than just numbers, a special code is used in which bits are grouped into units called bytes. A byte consists of eight bits. Each byte stands for a different letter, punctuation mark, or other character. The most common code is known as ASCII (American Standard Code for Information Interchange) and is understood by all PCs.
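A short sketch can make the byte-per-character coding concrete. Python's built-in `encode("ascii")` uses the same ASCII code the article describes, so each character of a small piece of text comes out as one byte, shown here with its eight bits:

```python
# Encode a short text as ASCII and show each character's byte
# both as a decimal number and as its eight binary digits.
text = "Hi!"
for byte in text.encode("ascii"):
    print(f"{chr(byte)!r} -> byte {byte:3d} -> bits {byte:08b}")
# 'H' -> byte  72 -> bits 01001000
# 'i' -> byte 105 -> bits 01101001
# '!' -> byte  33 -> bits 00100001
```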

Although computers have no trouble working with binary numbers, humans find them much harder to use. For this reason people have created special computer languages, such as the programming languages BASIC and Java and the markup language HTML, for writing computer instructions in a human-readable form. Before these instructions can be understood by the computer, however, they must be converted into binary numbers. Similarly, before a person looks at what a computer has done, its binary numbers are changed into an image, words, or ordinary numbers.
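One way to glimpse this translation step is with Python's standard `dis` module, which shows the lower-level instructions that a human-readable statement is converted into. (Python compiles to bytecode rather than directly to binary machine code, but the principle is the same; the statement below is an invented example.)

```python
# Compile a readable statement, then disassemble it to see the
# machine-oriented instructions it becomes.
import dis

dis.dis(compile("total = price * quantity", "<example>", "exec"))
```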

Common computer applications

From the start, computers have been used to solve scientific and engineering problems, such as designing weapons and modeling the nuclear reactions of the first hydrogen bomb. Scientific applications typically require the most processing power. Efforts to create models of weather conditions and design nuclear weapons have led to the most advanced supercomputers.

Another important computer application is databases. A database is an organized collection of data together with a special index. The index allows a user to pull out different information from the database based on different questions. Common database examples include government social security records, insurance company health records, airline reservations, and school grades. For example, if a person calls an airline to reserve a flight to Orlando, Florida, an airline employee checks a reservation database to see which airplanes have seats available.
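The flight-reservation lookup described above can be sketched with Python's built-in sqlite3 module. The table layout, flight numbers, and seat counts here are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE flights (flight TEXT, destination TEXT, seats_free INTEGER)")
conn.execute("CREATE INDEX idx_dest ON flights (destination)")  # the 'special index'
conn.executemany(
    "INSERT INTO flights VALUES (?, ?, ?)",
    [("FL101", "Orlando", 12), ("FL202", "Orlando", 0), ("FL303", "Boston", 5)],
)

# Ask the database a question: which flights to Orlando still have seats?
for row in conn.execute(
    "SELECT flight, seats_free FROM flights WHERE destination = ? AND seats_free > 0",
    ("Orlando",),
):
    print(row)  # ('FL101', 12)
```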

Spreadsheets are used for bookkeeping and to create financial reports. A spreadsheet is a particularly useful tool for testing what effect some change will have on financial results. For example, a business manager might use a spreadsheet to see how using a new ingredient would affect the cost of manufacturing a product.
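The ingredient example above amounts to a “what if” recalculation, which a few lines of code can mimic. The ingredients and prices below are made up:

```python
# Cost per unit of each ingredient, and units used per batch.
prices = {"flour": 0.40, "sugar": 0.25, "butter": 0.90}
units_per_batch = {"flour": 3, "sugar": 2, "butter": 1}

def batch_cost(price_table):
    return sum(price_table[item] * units_per_batch[item] for item in price_table)

print(f"Current cost per batch: ${batch_cost(prices):.2f}")   # $2.60

# What if we swap butter for a cheaper substitute?
what_if = dict(prices, butter=0.60)
print(f"With substitute:        ${batch_cost(what_if):.2f}")  # $2.30
```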

Computer-aided design (CAD) software is used to produce two- and three-dimensional blueprints for buildings and machinery. Such software has evolved to include programs for creating illustrations, animations, and special effects used in motion pictures.

Probably the most common application for computers is word processing. Basic word processing involves combining text and images to produce a document that can be printed. The invention of PCs and laser printers in the 1970s led to “desktop publishing.” Instead of relying on expensive professional print shops, small businesses could produce their own flyers and pamphlets. Over the years word processors have developed to include database, spreadsheet, and graphical capabilities. These features enable users to produce more sophisticated documents.

The Internet

The Internet was created in 1969 so that scientists at U.S. government offices and universities could share information. Just as many users within an office or a university are connected via a local area network, the Internet connects thousands of computers all over the world through a wide area network, or a network of networks. Almost from its birth, the Internet has been used most commonly for the exchange of electronic mail, or e-mail. In fact, it was the desire to extend e-mail services beyond the original government and university research centers that first began to open the Internet to commercial providers. By the beginning of the 21st century, the volume of e-mail easily exceeded that of ordinary mail.

The Internet application called the World Wide Web—or the Web for short—was invented by the British computer programmer Tim Berners-Lee in the early 1990s. Berners-Lee designed the Web as a way of sharing text documents by displaying them in a program called a browser. In 1993 the creation of a graphical browser, known as Mosaic, by U.S. programmer Marc Andreessen made the Web easier to use. Suddenly everyone seemed to be creating Web documents, called pages, to display pictures of their pets, share their favorite recipes, or even to start a business. Regular businesses took notice and soon all kinds of services and products, from banking and books to social clubs (“chat rooms”) and virtual game worlds, were available over the Web.

Each Web page has a different Internet address, or uniform resource locator (URL). An example of a URL is http://www.britannica.com. When this address is entered into a browser, the browser sends a request to the computer at the domain name “www.britannica.com.” This computer, called a server, then sends the user's browser the various components (text, graphics, and sounds) that make up the Web page. The “com” in the server's address indicates that it belongs to a commercial enterprise. Other designations include “gov” for government, “edu” for education, and “org” for nonprofit organizations.
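The parts of a URL described above can be pulled apart with Python's standard urllib.parse module, using the article's own example address:

```python
from urllib.parse import urlparse

url = "http://www.britannica.com"   # the example URL from the article
parts = urlparse(url)
print(parts.scheme)    # 'http' -- how the browser talks to the server
print(parts.hostname)  # 'www.britannica.com' -- the server's domain name
print(parts.hostname.rsplit(".", 1)[-1])  # 'com' -- a commercial enterprise
```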

The Web is only one part of the Internet. Before the Web was invented there was Usenet, or newsgroups. Newsgroups enable people with common interests to share messages and files through special Internet servers. For example, the newsgroup alt.rec.chess enables players to share their chess games and comments with other computer users. Another older Internet standard is FTP (file transfer protocol), used mainly to distribute larger files and programs.

The early days of the Internet have been compared to the American Wild West of frontier times. The Internet itself is a frontier of sorts, a vast new space with few rules to control people's actions. Some people have taken advantage of the relative “lawlessness” of the Internet to perform criminal acts. Computer users known as hackers have illegally broken into government and corporate computers to steal credit card numbers and other private information. Others have spread harmful computer programs called viruses through e-mail. A virus can change or destroy another computer's programs or data. The Internet has also enabled computer users to share computer software, music, and motion pictures without permission.

Awareness of such abuses has led to improved computer security and greater attention from law enforcement. The largest remaining annoyance for most people is the skyrocketing growth in unwanted e-mail, or “spam.” Some proposals have been made to reduce spam. Like “junk mail,” however, it may never entirely go away.

History

The dream of computing is almost as old as the idea of numbers. The first device for doing arithmetic, the abacus, was invented thousands of years ago. The Italian genius Leonardo da Vinci designed a mechanical calculator in about 1500, but, as with so many of his inventions, the tools to build it did not yet exist. The credit for actually building the first calculator goes to the German astronomer Wilhelm Schickard, for his Calculating Clock of 1623. The first calculator to be produced in quantity was the Pascaline, designed and built by the French mathematician Blaise Pascal in 1642.

First computer

The invention that turned these calculators into computers came from an idea borrowed from the Jacquard loom. This loom used special punched cards to make the weaving of patterns automatic. In the 1830s the British inventor Charles Babbage adapted this idea to design an automatic computation machine he called the Analytical Engine. The machine was designed to have input devices, a store (memory), a mill (computing unit), a control unit, and output devices—the essential components of every computer today. Babbage ran into trouble completing his Analytical Engine because the tools of his day were not good enough to build all of the small mechanical gears and switches that he needed. Nevertheless, his design is considered to be the first true computer.

Among those who were fascinated with Babbage's invention was a young woman named Augusta Ada King, the countess of Lovelace. She was the daughter of the famous English poet Lord Byron. Lady Lovelace learned enough about the workings of the Analytical Engine, and how to control it, to write explanations that helped others understand what the machine could do. She is often credited as the first computer programmer.

Early general-purpose computers

For many years people thought that a different computer had to be built to solve each new problem. It was not until the 1930s that the British mathematician Alan Turing first realized that a “universal computer” was possible. Turing figured out what it would take to make a general-purpose computer, thus eliminating the need to build machines for each new task. Turing also believed that a computer could be built that could “think.” Because of this idea, Turing is known as the father of artificial intelligence.

Various technological needs connected to World War II encouraged computer research in the 1940s. Researchers in the United States, Great Britain, and Germany began to investigate how to replace the mechanical switches of the first computers with electronic ones. The U.S. engineers John Mauchly and J. Presper Eckert, Jr., built the first general-purpose, all-electronic computer, the Electronic Numerical Integrator and Computer (ENIAC). Completed in 1946, ENIAC was first put to work on computations that led to the construction of a hydrogen bomb.

ENIAC, along with other electronic computers built in the late 1930s and 1940s, marks the beginning of what are known as first-generation computers. These computers cost millions of dollars and filled entire rooms. The vacuum tubes and air conditioning they required to operate used enough electricity to power a small town.

Transistors and integrated circuits

The invention of the transistor in 1947 by the U.S. physicists William B. Shockley, Walter H. Brattain, and John Bardeen revolutionized electronics, including computers. Like the vacuum tubes that had been used in electronics up to that time, transistors generate and control the electrical signals that operate a computer, but they were smaller, more reliable, and more efficient. By the late 1950s and early 1960s vacuum tubes were no longer used in computers. Computers with transistors are known as second-generation computers. Although these computers were still very expensive, transistors soon led to the creation of smaller (refrigerator-sized), more affordable computers known as minicomputers. With prices starting at about 100,000 dollars, minicomputers spread computing beyond the government and the handful of large universities and corporations that could afford the bigger machines.

The next great breakthrough was the invention of the integrated circuit (IC), achieved independently by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor within weeks of one another in 1958–59. Whereas before, transistors and other electronic components had to be wired together one by one, Kilby and Noyce realized that all of these components, along with the wiring, could be built together out of one piece of semiconductor material. (Semiconductors, such as germanium or silicon, are materials with special electrical properties.) This opened the way to building components and wires far too small to be connected by hand. The invention of the IC marks the beginning of the third generation of computers. What once required a room full of tubes and circuits soon fit on a device—called a microprocessor—not much bigger than a postage stamp.

By the early 1970s microcomputers based on microprocessors had appeared. Computing then became affordable for small businesses and private individuals. As microcomputer prices fell from their initial level of 10,000 dollars, more and more people could afford them. They began to be called personal computers (PCs).

Computers today

The third generation continues today, with computers finding their way into everything from automobiles to toasters. The distinction between computers and other devices with embedded microprocessors becomes increasingly blurry as one looks from portable “notebook” computers to personal digital assistants (PDAs), portable electronic games, cellular telephones, and global positioning system (GPS) devices.

Meanwhile, supercomputers have come under increasing competition from ever more powerful microcomputers. Rather than spending tens of millions of dollars to develop special processors, many supercomputer makers now just combine thousands of the processors found in ordinary microcomputers.

By Britannica Encyclopedia
