WHAT IS A COMPUTER
A computer is a general-purpose device that can be programmed to carry out a set of
arithmetic or logical operations. Since a sequence of operations can be readily changed, the
computer can solve more than one kind of problem.
Conventionally, a computer consists of at least one processing element, typically a central
processing unit (CPU), and some form of memory. The processing element carries out
arithmetic and logic operations, and a sequencing and control unit can change the
order of operations based on stored information. Peripheral devices allow information to be
retrieved from an external source, and the results of operations to be saved and retrieved.
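The idea above can be made concrete with a minimal sketch (not from the original text): a toy machine whose control unit changes the order of operations based on stored information, namely the program held in memory. The instruction names `ADD` and `JNZ` are illustrative inventions, not any real instruction set.

```python
def run(program, x=0):
    """Execute a list of (opcode, operand) pairs; return the accumulator."""
    acc, pc = x, 0            # accumulator and program counter
    while pc < len(program):
        op, arg = program[pc]
        if op == "ADD":       # arithmetic operation
            acc += arg
        elif op == "JNZ":     # control: jump if accumulator is non-zero
            if acc != 0:
                pc = arg      # change the order of operations
                continue
        pc += 1               # default sequencing: next instruction
    return acc

# The same machine solves different problems with different programs:
countdown = [("ADD", -1), ("JNZ", 0)]   # loop until the accumulator reaches 0
print(run(countdown, x=5))              # 0
print(run([("ADD", 2), ("ADD", 3)]))    # 5
```

Because only the stored program changes between the two runs, this illustrates why a readily changed sequence of operations lets one machine solve more than one kind of problem.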
The first electronic digital computers were developed between 1940 and 1945. Originally
they were the size of a large room, consuming as much power as several hundred modern
personal computers (PCs). In this era mechanical analog computers were used for military
applications.
Modern computers based on integrated circuits are millions to billions of times more
capable than the early machines, and occupy a fraction of the space.[2] Simple computers
are small enough to fit into mobile devices, and mobile computers can be powered by
small batteries. Personal computers in their various forms are icons of the Information Age
and are what most people think of as “computers.” However, the embedded computers
found in many devices from MP3 players to fighter aircraft and from toys to industrial
robots are the most numerous.
History of computing
http://ecampus.maseno.ac.ke
The Jacquard loom, on display at the Museum of Science and Industry in Manchester,
England, was one of the first programmable devices.
Main article: History of computing hardware
The first use of the word “computer” was recorded in 1613 in a book called “The yong
mans gleanings” by English writer Richard Braithwait: “I haue read the truest computer of
Times, and the best Arithmetician that euer breathed, and he reduceth thy dayes into a
short number.” It referred to a person who carried out calculations, or computations, and the
word continued with the same meaning until the middle of the 20th century. From the end
of the 19th century the word began to take on its more familiar meaning, a machine that
carries out computations.
Limited-function early computers
The history of the modern computer begins with two separate technologies: automated
calculation and programmability. No single device can be identified as the earliest
computer, partly because of the inconsistent application of that term, but a few devices are
worth mentioning. Some mechanical aids to computing were very successful and survived
for centuries until the advent of the electronic calculator. The Sumerian abacus, designed
around 2500 BC,[4] is one example; a descendant of it won a speed competition against a
contemporary desk calculating machine in Japan in 1946. The slide rule, invented in the
1620s, was carried on five Apollo space missions, including missions to the moon.[6]
Arguably, the astrolabe and the Antikythera mechanism, an ancient astronomical analog
computer built by the Greeks around 80 BC, also belong on this list. The Greek
mathematician Hero of Alexandria (c. 10–70 AD) built a mechanical theater that
performed a play lasting 10 minutes, operated by a complex system of ropes and
drums that might be considered a means of deciding which parts of the mechanism
performed which actions and when. This is the essence of programmability.
Blaise Pascal invented the mechanical calculator in 1642. Known as Pascal's calculator, it
was the first machine to better human performance of arithmetical computations[10] and
would turn out to be the only functional mechanical calculator in the 17th century.[11] Two
hundred years later, in 1851, Thomas de Colmar released, after thirty years of
development, his simplified arithmometer; it became the first machine to be
commercialized because it was strong enough and reliable enough to be used daily in an
office environment. The mechanical calculator was at the root of the development of
computers in two separate ways. First, it was in trying to develop more powerful and
more flexible calculators[12] that the computer was first theorized by Charles Babbage and
then developed. Second, the development of a low-cost electronic calculator, successor to
the mechanical calculator, resulted in the development by Intel[16] of the first commercially
available microprocessor integrated circuit.
First general-purpose computers
In 1801, Joseph Marie Jacquard made an improvement to the textile loom by introducing a
series of punched paper cards as a template which allowed his loom to weave intricate
patterns automatically. The resulting Jacquard loom was an important step in the
development of computers because the use of punched cards to define woven patterns can
be viewed as an early, albeit limited, form of programmability.
The Most Famous Image in the Early History of Computing
This portrait of Jacquard was woven in silk on a Jacquard loom and required 24,000
punched cards to create (1839). It was only produced to order. Charles Babbage owned one
of these portraits; it inspired him to use perforated cards in his analytical engine.[18]
The Zuse Z3, 1941, considered the world's first working programmable, fully automatic
computing machine.
It was the fusion of automatic calculation with programmability that produced the first
recognizable computers. In 1837, Charles Babbage was the first to conceptualize and
design a fully programmable mechanical computer, his analytical engine. Limited finances
and Babbage's inability to resist tinkering with the design meant that the device was never
completed; nevertheless, his son, Henry Babbage, completed a simplified version of the