
Computer facts for kids

A drawing of a modern desktop computer.
Woman typing on a laptop
Kids using a tablet computer

A computer is a machine that accepts data as input, processes that data using programs, and outputs the processed data as information. Many computers can store and retrieve information using hard drives. Computers can be connected together to form networks, allowing connected computers to communicate with each other.

Characteristics

Traffic lights can be controlled by computers

Computers can respond to a specific set of instructions in a well-defined manner, and they can execute a prerecorded list of instructions called a program.

There are four main processing steps in a computer: input, storage, processing and output.
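As a simple sketch of these four steps, here is a short example program written in the C programming language (one of the languages mentioned later in this article). It is only an illustration, not part of any particular computer: the program inputs two numbers, stores them in memory, processes them by adding them together, and outputs the answer.

    #include <stdio.h>

    int main(void) {
        int a, b;                      /* storage: two numbers kept in memory */

        /* input: read two whole numbers typed by the user */
        if (scanf("%d %d", &a, &b) != 2) {
            return 1;                  /* stop if two numbers were not given */
        }

        int sum = a + b;               /* processing: the computer adds them */

        printf("%d + %d = %d\n", a, b, sum);   /* output: show the result */
        return 0;
    }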

Modern computers can do billions of calculations in a second. Being able to calculate many times per second allows modern computers to multi-task, which means they can do many different tasks at the same time. Computers do many different jobs where automation is useful. Some examples are controlling traffic lights, vehicles, security systems, washing machines and digital televisions.

Computers can be designed to do almost anything with information.

Computers are used to control large and small machines which in the past were controlled by humans. They are used for things such as calculation, listening to music, reading articles and writing.

Hardware

The hardware of modern computers is electronic. Computers do mathematical arithmetic very quickly, but they do not really "think": they only follow the instructions in their software programs. The software uses the hardware when the user gives it instructions, and produces useful output.

Controls

A backlit keyboard

Humans control computers with user interfaces. Input devices include keyboards, computer mice, buttons, and touch screens. Some computers can also be controlled with voice commands, hand gestures or even brain signals through electrodes implanted in the brain or along nerves.

Programs

Computer programs are designed or written by computer programmers. A few programmers write programs in the computer's own language, called machine code. Most programs are written using a programming language like C, C++ or Java. These programming languages are closer to the language people speak and write every day. A compiler translates the programmer's instructions into binary code (machine code) that the computer can understand and carry out.
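To give a hedged idea of what such a program looks like before it is translated, here is a very small C program. A C compiler (for example the widely used gcc) turns this human-readable text into machine code; the exact commands depend on which compiler is installed.

    #include <stdio.h>

    /* A tiny C program. A compiler translates this text into machine code. */
    int main(void) {
        printf("Hello from a compiled program!\n");
        return 0;
    }

On many systems this could be compiled and run with commands like "gcc hello.c -o hello" and then "./hello", though the exact steps depend on the computer and the compiler.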

Bugs

The actual first computer bug, a moth found trapped on a relay of the Harvard Mark II computer

Errors in computer programs are called "bugs". They may be benign and not affect the usefulness of the program, or have only subtle effects. However, in some cases they may cause the program or the entire system to "hang", becoming unresponsive to input such as mouse clicks or keystrokes, to completely fail, or to crash. Otherwise benign bugs may sometimes be harnessed for malicious intent by an unscrupulous user writing an exploit, code designed to take advantage of a bug and disrupt a computer's proper execution. Bugs are usually not the fault of the computer. Since computers merely execute the instructions they are given, bugs are nearly always the result of programmer error or an oversight made in the program's design. Admiral Grace Hopper, an American computer scientist and developer of the first compiler, is often credited with the first use of the term "bug" in computing, after a dead moth was found shorting a relay in the Harvard Mark II computer in September 1947.
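As a small made-up illustration (not a real historical bug), the C snippet below shows one very common kind of programmer error: the loop is meant to add the numbers 1 through 5, but it stops one step too early, so the program quietly gives the wrong answer.

    #include <stdio.h>

    int main(void) {
        int total = 0;

        /* Bug: "i < 5" stops the loop at 4, so 5 is never added. */
        /* The programmer meant to write "i <= 5".                */
        for (int i = 1; i < 5; i++) {
            total = total + i;
        }

        printf("total = %d\n", total);   /* prints 10 instead of the intended 15 */
        return 0;
    }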

History of computers

The Jacquard loom was one of the first programmable devices.

Automation

Most humans have a problem with math. Try multiplying 584 × 3,220 in your head. It is hard to remember all the steps! People made tools to help them remember where they were in a math problem.
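A computer, by contrast, does this multiplication instantly. The short C sketch below is only an illustration: it prints the answer, 1,880,480, in a tiny fraction of a second.

    #include <stdio.h>

    int main(void) {
        long answer = 584L * 3220L;            /* the computer multiplies instantly */
        printf("584 x 3220 = %ld\n", answer);  /* prints 1880480 */
        return 0;
    }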

The other problem people have is that they have to do the same problem over and over and over again. A cashier had to work out what the change was every day in her head or with a piece of paper. That took a lot of time and led to mistakes. So, people made calculators that did those same things over and over. This part of computer history is called the "history of automated calculation", which is a fancy phrase for "the history of machines that make it easy for us to do the same math problem over and over without making mistakes."

The abacus, the slide rule, the astrolabe and the Antikythera mechanism (which dates from about 150-100 BC) are early examples of tools and machines built to help with calculation.

Programming

People did not want a machine that would only do the same thing over and over again. For example, a music box is a machine that plays the same music over and over again. Some people wanted to be able to tell their machine to do different things. For example, they wanted to tell the music box to play different music every time. They wanted to be able to program the music box: to order the music box to play different music. This part of computer history is called the "history of programmable machines", which is a fancy phrase for "the history of machines that I can order to do different things if I know how to speak their language."

One of the first examples of this was built by Hero of Alexandria (c. 10–70 AD). He built a mechanical theater which performed a play lasting 10 minutes and was operated by a complex system of ropes and drums. These ropes and drums were the language of the machine: they told the machine what to do and when. Some people argue that this is the first programmable machine.

Historians disagree on which early machines are "computers". Many say the "castle clock", an astronomical clock invented by Al-Jazari in 1206, is the first known programmable analog computer. Its display could be adjusted every day to account for the changing lengths of day and night throughout the year. Some count this daily adjustment as computer programming.

Others say the first computer was made by Charles Babbage. Ada Lovelace is considered to be the first programmer.

The computing era

At the end of the Middle Ages, people started thinking math and engineering were more important. In 1623, Wilhelm Schickard made a mechanical calculator. Other Europeans made more calculators after him. They were not modern computers because they could only add, subtract, and multiply; you could not change what they did to make them do something else, like play Tetris. Because of this, we say they were not programmable. Today, engineers use computers to design and plan.

IBM Port-A-Punch

In 1801, Joseph Marie Jacquard used punched paper cards to tell his textile loom what kind of pattern to weave. He could use punch cards to tell the loom what to do, and he could change the punch cards, which means he could program the loom to weave the pattern he wanted. This means the loom was programmable.

Charles Babbage wanted to make a similar machine that could calculate. He called it "The Analytical Engine". Because Babbage did not have enough money and always changed his design when he had a better idea, he never built his Analytical Engine.

As time went on, computers were used more. People get bored easily doing the same thing over and over. Imagine spending your life writing things down on index cards, storing them, and then having to go find them again. The U.S. Census Bureau in 1890 had hundreds of people doing just that. It was expensive, and reports took a long time. Then an engineer worked out how to make machines do a lot of the work. Herman Hollerith invented a tabulating machine that would automatically add up information that the Census bureau collected. The Computing Tabulating Recording Corporation (which later became IBM) made his machines. They leased the machines instead of selling them. Makers of machines had long helped their users understand and repair them, and CTR's tech support was especially good.

Because of machines like this, new ways of talking to machines were invented, new types of machines followed, and eventually the computer as we know it was born.

Analog and digital computers

In the first half of the 20th century, scientists started using computers, mostly because scientists had a lot of math to figure out and wanted to spend more of their time thinking about science questions instead of spending hours adding numbers together. For example, if they had to launch a rocket ship, they needed to do a lot of math to make sure the rocket worked right. So they put together computers.

These analog computers used analog circuits, which made them very hard to program. In the 1930s, engineers invented digital computers, which soon became easier to program.

High-scale computers

Scientists figured out how to make and use digital computers in the 1930s to 1940s. Scientists made a lot of digital computers, and as they did, they figured out how to ask them the right sorts of questions to get the most out of them.

EDSAC was one of the first computers to store its programs in the same memory as its data. This design is called the stored-program, or von Neumann, architecture.
  • Konrad Zuse's electromechanical "Z machines". The Z3 (1941) was the first working machine that used binary arithmetic. Binary arithmetic means representing numbers using only two symbols, such as "yes" and "no" (or 1 and 0). You could also program it. In 1998 the Z3 was proved to be Turing complete, which means that it is possible to tell this particular computer anything that it is mathematically possible to tell a computer. Some consider it the world's first modern computer.
  • The non-programmable Atanasoff–Berry Computer (1941), which used vacuum tubes to store "yes" and "no" answers, and regenerative capacitor memory.
  • The Harvard Mark I (1944), a big computer that could be programmed in a limited way.
  • The U.S. Army's Ballistics Research Laboratory ENIAC (1946), which could add numbers the way people do (using the numbers 0 through 9) and is sometimes called the first general purpose electronic computer.

Nearly all modern computers use the stored-program architecture. It has become the main concept which defines a modern computer. The technologies used to build computers have changed since the 1940s, but many current computers still use the von Neumann architecture.

Microprocessors are miniaturized devices that often implement stored program CPUs.

In the 1950s, computers were built mostly out of vacuum tubes.

Transistors replaced vacuum tubes in the 1960s because they were smaller and cheaper. They also needed less power and did not break down as often as vacuum tubes.

In the 1970s, computers were built using integrated circuits. Microprocessors, such as the Intel 4004, made computers smaller, cheaper, faster and more reliable.

By the 1980s, microcontrollers became small and cheap enough to replace mechanical controls in things like washing machines. The 1980s also saw home computers and personal computers. With the evolution of the Internet, personal computers are becoming as common as the television and the telephone in the household.

In 2005 Nokia started to call some of its mobile phones (the N-series) "multimedia computers" and after the launch of the Apple iPhone in 2007, many started adding the smartphone to the category of "real" computers.

Kinds of computers

Kids on old computers

There are many types of computers. Some include:

  1. personal computer
  2. workstation
  3. mainframe
  4. server
  5. minicomputer
  6. supercomputer
  7. embedded system
  8. tablet computer

A "desktop computer" is a small machine that has a screen (which is not part of the computer). Most people keep them on top of a desk, which is why they are called "desktop computers." "Laptop computers" are computers small enough to fit on your lap. This makes them easy to carry around. Both laptops and desktops are called personal computers, because one person at a time uses them for things like playing music, surfing the web, or playing video games.

There are bigger computers that many people at a time can use. These are called "Mainframes," and these computers do all the things that make things like the internet work. You can think of a personal computer like this: the personal computer is like your skin: you can see it, other people can see it, and through your skin you feel wind, water, air, and the rest of the world. A mainframe is more like your internal organs: you never see them, and you barely even think about them, but if they suddenly went missing, you would have some very big problems.

An embedded computer, also called an embedded system, is a computer that does one thing and one thing only, and usually does it very well. For example, an alarm clock is an embedded computer: it tells the time. Unlike your personal computer, you cannot use your clock to play Tetris. Because of this, we say that embedded computers cannot be programmed by their users, because you cannot install more programs on your clock. Some mobile phones, automatic teller machines, microwave ovens, CD players and cars are operated by embedded computers.

All-in-one PC

All-in-one computers are desktop computers that have all of the computer's inner mechanisms in the same case as the monitor. Apple has made several popular examples of all-in-one computers, such as the original Macintosh of the mid-1980s and the iMac of the late 1990s and 2000s.

Common uses of home computers

Kid playing on an iMac


Working methods

This counter shows how to count in binary, from zero through thirty-one.

Computers store data and instructions as numbers, because computers can do things with numbers very quickly. These data are stored as binary symbols (1s and 0s). A 1 or a 0 symbol stored by a computer is called a bit, which comes from the words binary digit. Computers can use many bits together to represent instructions and the data that these instructions use. A list of instructions is called a program and is stored on the computer's hard disk. Computers work through the program by using a central processing unit, and they use fast memory called RAM (Random Access Memory) as a space to store the instructions and data while they are doing this. When the computer wants to store the results of the program for later, it uses the hard disk, because things stored on a hard disk can still be remembered after the computer is turned off.
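The following C sketch is only an illustration of how whole numbers can be written as bits. It prints the numbers 0 to 31 in binary, the same counting pattern shown by the counter pictured above.

    #include <stdio.h>

    /* Print the numbers 0 to 31 using five binary digits (bits) each. */
    int main(void) {
        for (int n = 0; n <= 31; n++) {
            printf("%2d = ", n);
            for (int bit = 4; bit >= 0; bit--) {
                /* Look at one bit at a time, from the biggest to the smallest. */
                printf("%d", (n >> bit) & 1);
            }
            printf("\n");
        }
        return 0;
    }

For example, the program shows that 5 is written as 00101 and 31 as 11111.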

Ubuntu GNU+Linux, a free operating system

An operating system tells the computer how to understand what jobs it has to do, how to do these jobs, and how to tell people the results. Millions of computers may be using the same operating system, while each computer can have its own application programs to do what its user needs. Using the same operating system makes it easy to learn how to use computers for new things. A user who needs to use a computer for something different can learn how to use a new application program. Some operating systems have simple command lines, while others have a fully user-friendly GUI (graphical user interface).
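As a small sketch (not the interface of any particular operating system), the C program below is an application that asks the operating system, through the standard C library, to create a file and save a line of text in it. The file name "notes.txt" is made up for the example.

    #include <stdio.h>

    int main(void) {
        /* Ask the operating system to open (or create) a file for writing. */
        FILE *f = fopen("notes.txt", "w");
        if (f == NULL) {
            printf("The operating system could not open the file.\n");
            return 1;
        }

        fprintf(f, "Remember to feed the cat.\n");  /* write one line of text */
        fclose(f);                                  /* tell the operating system we are done */

        printf("Saved a note in notes.txt\n");
        return 0;
    }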

Networking and the Internet

Visualization of a portion of the routes on the Internet

Computers have been used to coordinate information between multiple locations since the 1950s. The U.S. military's SAGE system was the first large-scale example of such a system, which led to a number of special-purpose commercial systems such as Sabre. In the 1970s, computer engineers at research institutions throughout the United States began to link their computers together using telecommunications technology. The effort was funded by ARPA (now DARPA), and the computer network that resulted was called the ARPANET. The technologies that made the ARPANET possible spread and evolved.

In time, the network spread beyond academic and military institutions and became known as the Internet. The emergence of networking involved a redefinition of the nature and boundaries of the computer. Computer operating systems and applications were modified to include the ability to define and access the resources of other computers on the network, such as peripheral devices and stored information, as extensions of the resources of an individual computer. Initially these facilities were available primarily to people working in high-tech environments, but in the 1990s the spread of applications like e-mail and the World Wide Web, combined with the development of cheap, fast networking technologies like Ethernet and ADSL, saw computer networking become almost ubiquitous. In fact, the number of computers that are networked is growing phenomenally. A very large proportion of personal computers regularly connect to the Internet to communicate and receive information. "Wireless" networking, often using mobile phone networks, has meant networking is becoming increasingly ubiquitous even in mobile computing environments.

Computers and waste

Old computers

A computer is now almost always an electronic device. It usually contains materials that will become electronic waste when discarded. When a new computer is bought in some places, laws require that the cost of its waste management must also be paid for. This is called product stewardship.

Computers can become obsolete quickly, depending on what programs the user runs. Very often, they are thrown away within two or three years, because some newer programs require a more powerful computer. This makes the waste problem worse, so computer recycling is common. Many projects try to send working computers to developing nations so they can be re-used and will not become waste as quickly, as most people do not need to run new programs. Some computer parts, such as hard drives, can break easily. When these parts end up in the landfill, they can put poisonous chemicals like lead into the groundwater. Hard drives can also contain secret information like credit card numbers. If the hard drive is not erased before being thrown away, an identity thief can get the information from the hard drive, even if the drive doesn't work, and use it to steal money from the previous owner's bank account.

Main hardware

Computers come in different forms, but most of them have a common design.

  • All computers have a CPU.
  • All computers have some kind of data bus which lets them receive input from, and send output to, their environment.
  • All computers have some form of memory. These are usually chips (integrated circuits) which can hold information.
  • Many computers have some kind of sensor, which lets them get input from their environment.
  • Many computers have some kind of display device, which lets them show output. They may also have other peripheral devices connected.

A computer has several main parts. When comparing a computer to a human body, the CPU is like a brain. It does most of the thinking and tells the rest of the computer how to work. The CPU is on the motherboard, which is like the skeleton. It provides the basis for where the other parts go, and carries the nerves that connect them to each other and to the CPU. The motherboard is connected to a power supply, which provides electricity to the entire computer. The various drives (CD drive, floppy drive, and on many newer computers, USB flash drive) act like eyes, ears, and fingers, and allow the computer to read different types of storage, in the same way that a human can read different types of books. The hard drive is like a human's memory, and keeps track of all the data stored on the computer. Most computers have a sound card or another method of making sound, which is like vocal cords, or a voice box. Connected to the sound card are speakers, which are like a mouth, and are where the sound comes out. Computers might also have a graphics card, which helps the computer to create visual effects such as 3D environments or more realistic colors; more powerful graphics cards can make more realistic or more advanced images, in the same way a well-trained artist can.

Largest computer companies

Company                 Country          Sales (US$ billion)
Apple                   United States    220.00
Samsung                 South Korea      212.68
Foxconn                 Taiwan           132.07
HP (Hewlett-Packard)    United States    112.30
IBM                     United States     99.75
Hitachi                 Japan             87.51
Microsoft               United States     86.83
Amazon                  United States     74.45
Sony                    Japan             72.34
Panasonic               Japan             70.83
Google                  United States     59.82
Dell                    United States     56.94
Toshiba                 Japan             56.20
LG                      South Korea       54.75
Intel                   United States     52.70

Future

There is active research to make non-classical computers out of many promising new types of technology, such as optical computers, DNA computers, neural computers, and quantum computers. Most computers are universal and are able to calculate any computable function; they are limited only by their memory capacity and operating speed. However, different designs of computers can give very different performance for particular problems; for example, quantum computers could potentially break some modern encryption algorithms (by quantum factoring) very quickly.

Artificial intelligence

Computer programs that learn and adapt are part of the emerging field of artificial intelligence and machine learning.

For this 2018 project by the artist Joseph Ayerle, the AI had to learn the typical patterns in the colors and brushstrokes of the Renaissance painter Raphael. The portrait shows the face of the actress Ornella Muti, "painted" by AI in the style of Raphael.

AI and machine learning technology is used in most of the essential applications of the 2020s, including: search engines (such as Google Search), targeted online advertising (AdSense, Facebook), recommendation systems (offered by Netflix, YouTube or Amazon), driving internet traffic, virtual assistants (such as Siri or Alexa), autonomous vehicles (including drones, ADAS and self-driving cars), automatic language translation (Microsoft Translator, Google Translate), facial recognition (Apple's Face ID, Microsoft's DeepFace and Google's FaceNet) and image labeling (used by Facebook, Apple's iPhoto and TikTok).

There are also thousands of successful AI applications used to solve specific problems for specific industries or institutions. In a 2017 survey, one in five companies reported they had incorporated "AI" in some offerings or processes. A few examples are energy storage, medical diagnosis, military logistics, applications that predict the result of judicial decisions, foreign policy, or supply chain management.

Game-playing programs have been used since the 1950s to demonstrate and test AI's most advanced techniques. Deep Blue became the first computer chess-playing system to beat a reigning world chess champion, Garry Kasparov, on 11 May 1997. In 2011, in a Jeopardy! quiz show exhibition match, IBM's question answering system, Watson, defeated the two greatest Jeopardy! champions, Brad Rutter and Ken Jennings, by a significant margin. In March 2016, AlphaGo won 4 out of 5 games of Go in a match with Go champion Lee Sedol, becoming the first computer Go-playing system to beat a professional Go player without handicaps. In 2017 it defeated Ke Jie, who at the time had held the world No. 1 ranking for two years. Other programs, such as Pluribus and Cepheus, handle imperfect-information games like poker at a superhuman level. In the 2010s, DeepMind developed a "generalized artificial intelligence" that could learn many different Atari games on its own.

In the early 2020s, generative AI gained widespread prominence. ChatGPT (based on GPT-3) and other large language models were tried by 14% of American adults. The increasing realism and ease of use of AI-based text-to-image generators such as Midjourney, DALL-E, and Stable Diffusion sparked a trend of viral AI-generated photos.

AlphaFold 2 (2020) demonstrated the ability to approximate, in hours rather than months, the 3D structure of a protein.


See also

In Spanish: Computadora para niños
