What is Computer – History of Computer
What is a Computer – The modern era is the era of the computer. The word computer comes from the word compute, which means to calculate or count; for this reason, computers were originally known as computing devices. A computer is an automatic, general-purpose electronic device that, once programmed, can carry out many kinds of tasks.
A computer is an electronic device that follows, instantly and literally, the instructions given to it by a user. Computers have their own logical capability and memory. Human memory fades over time, but computer memory remains unchanged. The biggest difference between computers and humans is that computers have no intelligence or discretion of their own. It is program control that allows the computer to interpret and follow the user's instructions so that the user gets the desired result.
For this, the computer needs a program, which is stored in its memory. Besides programs, the results of computations can also be saved in this memory. Computer programs are written in a language the computer can understand.
Computer and Human
History of Computer – It has been a human dream from the beginning to build machines and devices that work much as humans do. To realize this dream, humans invented the computer. Just as humans see, hear, and touch to convey information to the brain, information is conveyed to a computer through input devices such as the keyboard, mouse, and microphone. And just as the human brain analyzes the information it receives, the computer's CPU analyzes the information it is given.
Computers present the results of this analysis through monitors, printers, and speakers, just as humans express their conclusions through speech, gestures, and so on. One difference between humans and computers is that the human brain can think beyond the facts and the information it receives, and its conclusions can be shaped by emotions, whereas a computer can only process the facts it is given.
That said, in the emerging world of ubiquitous computing, machines are expected to become intelligent, able to reason beyond the facts, and able to do their work without human intervention.
History of Computer Development
What is Computer | History of Computers – Humans have long been curious, and it is this curiosity that sets humanity apart from other animals. The need for calculation was felt as early as the Stone Age. People needed to keep records of the exchange of goods but lacked the knowledge to do so. At that time, they had only one way to do the job: small pieces of stone. They kept one small stone for each item. This can be called the beginning of calculation.
After this, humans counted on their fingers, but finger counting works only for very small quantities. During this period, hunters wanted to keep track of how many animals and birds they had hunted, so they began drawing pictures of animals and birds on clay walls. Later, around 650 BC, the inhabitants of Egypt drew figures of animals and birds in caves using a special kind of symbol, and they counted with these signs in their own particular way.
After this, humans tried many other ways of counting, but most proved ineffective. Around 600 BC, the first real attempt at mechanical calculation was made in China with the invention of the counting device known as the abacus. The abacus is a mechanical counter that, after China, came into use across the world, in Japan, India, Russia, and elsewhere. It is considered the first model of the computer and is still used today to teach children how to count. An abacus is made by threading beads onto rods fixed in a rectangular wooden frame.
This rectangular wooden frame is divided into two parts, one small and one large; the small part is called heaven and the large part is called earth. Several parallel rods run through the frame, crossing a central beam from one edge to the other, and five or more beads are threaded onto each rod. In the Chinese abacus, the small part has two beads per rod, each worth 5, while the large part has five beads per rod, each worth 1. A number is formed by sliding the appropriate beads toward the central beam.
Each rod, in turn, represents units, tens, hundreds, thousands, ten thousands, lakhs (hundred thousands), and so on. Addition, subtraction, multiplication, and division were performed by sliding these beads. The abacus is still used in China today; many students find abacus calculation easier than using a computer, and skilled users can produce results very quickly in speed competitions.
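The place-value scheme described above is easy to express in code. Here is a minimal sketch (my own illustration, not from the original text) of how the beads pushed toward the beam on a Chinese-style abacus combine into a decimal number, assuming each rod contributes heaven beads worth 5 and earth beads worth 1:

```python
def abacus_value(rods):
    """Compute the number shown on a Chinese-style abacus.

    rods: list of (heaven_beads, earth_beads) counts pushed toward the
    central beam, given from the most significant rod to the least.
    """
    total = 0
    for heaven, earth in rods:
        digit = heaven * 5 + earth * 1  # each heaven bead is worth 5
        total = total * 10 + digit      # shift one decimal place per rod
    return total

# The number 1937: 1 = one earth bead; 9 = one heaven + four earth;
# 3 = three earth; 7 = one heaven + two earth.
print(abacus_value([(0, 1), (1, 4), (0, 3), (1, 2)]))  # -> 1937
```

Note how multiplying by 10 per rod mirrors the units, tens, hundreds progression of the rods themselves.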
The Japanese abacus differed from the Chinese one: on each rod there was one bead on one side of the central beam and five beads on the other. Some time later, the Russian abacus appeared, which differed from both: its rectangular frame had no central beam, and each row held ten beads. In 1617, the Scottish mathematician John Napier invented a counting device known as Napier's bones.
This device consisted of rods made of bone, placed side by side, and it reduced multiplication and division to simple addition and subtraction. Around 1620, the English mathematician William Oughtred invented the slide rule, which operates on the principle of logarithms.
What is Computer | History of Computers – The first mechanical calculator was invented in 1642 by the 19-year-old Blaise Pascal of France. This calculator was named the Pascaline. It was built from gears, wheels, and discs, with each wheel carrying the digits 0 to 9. When one wheel completed a full revolution, the next wheel advanced by one place; in other words, when the units wheel turned ten times, the tens wheel turned once. This machine could only perform addition.
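The carry mechanism just described, where a wheel that passes 9 resets to 0 and nudges the next wheel forward, can be sketched as follows. This is a toy illustration of the ripple-carry idea, not a model of Pascal's actual gearing:

```python
def pascaline_add_one(wheels):
    """Advance the units wheel by one, rippling carries upward.

    wheels: list of digits 0-9, units wheel first (like the Pascaline's
    rightmost dial). Modifies the list in place and returns it.
    """
    i = 0
    while i < len(wheels):
        wheels[i] += 1
        if wheels[i] < 10:
            break           # no carry needed; stop here
        wheels[i] = 0       # the wheel completes a revolution...
        i += 1              # ...and pushes the next wheel one step

    return wheels

# 099 + 1 = 100: both lower wheels roll over and carry into the hundreds.
print(pascaline_add_one([9, 9, 0]))  # -> [0, 0, 1]
```

Repeating this increment n times adds n, which is essentially how the Pascaline accumulated sums one tooth at a time.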
In 1671, Gottfried Wilhelm von Leibniz of Germany replaced the Pascaline's discs with a stepped drum, along with other changes that made multiplication and division easier to perform. Later, the German mathematician Kummer developed a machine that could add or subtract numbers in any order. This machine was manufactured and sold on a very large scale and became popular.
Charles Babbage, a British mathematician and inventor who was a professor at the University of Cambridge, was the first to conceive of a complete computer, and he spent his life and his money trying to realize this vision. In 1822 he succeeded in building a working model of a machine that could compute the values of algebraic equations and tables accurately to three decimal places. This machine was named the Difference Engine.
After this, he began working on a larger and more powerful version, one whose calculations could give results accurate to 20 decimal places. In 1842 he introduced to the world a new concept, the Analytical Engine. It had all the essential features of a modern computer: input, a processing unit, a control unit, memory, and output. Babbage was never able to complete a working version of the Analytical Engine; he died before it was finished. For this unique contribution to the field of computing, Charles Babbage is called the father of the modern digital computer.
Charles Babbage's unfinished work was carried on by his colleague Ada Augusta. She is known as the world's first programmer because she was the first to store instructions for the Analytical Engine and to describe how the engine would operate according to those instructions. Ada also developed a binary system with Babbage's help. In honor of her work, the US Department of Defense named the special programming language used for its computers Ada.
Approximately 50 years after Charles Babbage conceived the Analytical Engine, the American scientist Herman Hollerith, who worked for the U.S. Census Bureau, developed an electric tabulating machine.
Data was stored on this machine with the help of punched cards. The cards were fed one by one into the tabulating machine, whose needles read the data from them: where a needle passed through a hole in a card, it touched mercury beneath the card and completed an electrical circuit. With the help of Hollerith's machine, census work that had been expected to take about five years was completed in just two.
In 1896, Hollerith founded the Tabulating Machine Company to sell these machines. In 1911 it merged with several other companies, and the combined group was named the Computing-Tabulating-Recording Company. In 1924, the company was given a new name: International Business Machine Corporation (IBM). By the end of the 1930s, IBM controlled 80% of the global punched-card equipment market, and thanks to IBM the previously popular mechanical equipment was transformed into electromechanical equipment.
During the 1930s and 1940s, countries such as Germany, Great Britain, and the United States competed to build different kinds of computers, and over those decades computer technology made extraordinary progress.
In 1944, Howard A. Aiken, a scientist at Harvard University, worked with IBM to develop the first electromechanical computer, the Automatic Sequence Controlled Calculator, which was given the prestigious name Mark-I. This huge computer, 18 meters long and 3 meters high, contained some 800 kilometers of wire, thousands of electromagnetic relays, hundreds of electron tubes, and many other components. It could multiply two 23-digit numbers and produce the result in about four and a half seconds.
This computer made a loud noise and became very hot while working. In 1946, the Electronic Numerical Integrator and Calculator (ENIAC) was developed at the Moore School of Electrical Engineering at the University of Pennsylvania in the United States. It was faster than the Mark-I, able to perform 5,000 additions and 350 multiplications per second. Because it had no memory storage system, some computer scientists do not consider it a true computer.
It was followed by many large computers that ran on the binary system and were also equipped with memory storage, among them the EDVAC, the EDSAC, and the UNIVAC-I.