A
RESEARCH PAPER
ON
NEW
MODELS OF COMPUTATION
WRITTEN BY
NAME: EGBO, EGBO THANKGOD
DEPARTMENT
OF COMPUTER SCIENCE
FACULTY
OF SCIENCE.
SUBMITTED TO
DR. OYO. E. OYO
CENTER OF GENERAL STUDIES
OCTOBER, 2016.
CONTENTS
1.0 INTRODUCTION
1.1 DEFINITION OF TERMS
1.1.1 MODELS
1.1.2 COMPUTATION
1.1.3 EVOLUTION OF COMPUTATION
2.0 BRIEF HISTORY OF COMPUTERS
3.0 NEW MODELS OF COMPUTATION
3.1 CLOUD COMPUTING
3.2 QUANTUM COMPUTING
4.0 WHAT DO WE SEE COMPUTERS DOING IN YEARS TO COME?
5.0 SUMMARY, CONCLUSION AND RECOMMENDATIONS
5.1 SUMMARY
5.2 CONCLUSION
5.3 RECOMMENDATIONS
REFERENCES
THESIS STATEMENT
COMPUTERS HAVE A GREAT EFFECT ON THE WORLD TODAY, BUT CLOSER EXAMINATION SHOWS THAT THERE ARE NEW MODELS OF COMPUTATION THAT ARE ABOUT TO CHANGE THE WORLD AS WE KNOW IT.
1.0 INTRODUCTION
In everyday terms, a computer is any electronic device that makes work easier and faster. More precisely, a computer can be defined as any electronic device that receives data (raw facts), processes the data, and gives out information (processed data).
The history of computers dates back to ancient times, when the abacus was used for simple calculations. Over the years, computers have evolved from one machine to another, with great changes in their abilities. It is undeniable that we live in a computer age; much of what we do today can hardly be done without the aid of computers. Today computers do much of our work, in industries, in education, in business, and beyond. The evolution of computers is my major interest and the purpose of this study.
The aim of this study is to bring readers up to date on new models of computation and to enlighten people about the usefulness of computers in today's world. Among the problems facing the world today is a reluctance to accept new discoveries, whether for religious or other reasons.
1.1 DEFINITION OF TERMS
This section defines certain technical terms used in the research study. These include:
COMPUTATION: The act or process of computing (Samuel, 2003).
MODELS: A model is a simplified representation used to explain the workings of a real-world system or event (John, 2006).
EVOLUTION: Gradual directional change, especially one leading to a more advanced or complex form; growth and development (Marcel, 2002).
QUANTUM: A very small quantity of electromagnetic energy (Oxford, 2016).
CLOUD COMPUTING: A computing paradigm in which a large pool of computer systems is connected in private or public networks to provide dynamically scalable infrastructure for application, data and file storage (Wikipedia, 2016).
EVOLUTION OF COMPUTATION: Gradual directional change in the advancement of computers.
2.0 BRIEF HISTORY OF COMPUTERS
Computers, the wonders of science, have an integral role in our present-day lives, and the majority of us use these machines daily. Yet although computers have become commonplace in the modern world, the origins of computing are relatively unknown to the average user. This chapter presents a brief discussion of the evolutionary developments in computing.
ABACUS
A primary example of computing is counting, which humans learn at an early age using our fingers. Counting was automated with the invention of the abacus. The Chinese abacus took its familiar form around the 13th century and assisted in the mathematical manipulation of numbers. The Japanese form of the abacus, the soroban, works in much the same manner. The abacus was a handy little invention, but there had to be a better way of doing calculations.
Traditionally, the Chinese abacus has 2 beads in the top section, above the horizontal bar, and 5 beads in the lower section, for each "column". Each upper bead can be thought of as one hand (worth five), and each lower bead as a single finger (worth one).
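This bead arithmetic can be sketched in a few lines of Python. The sketch below assumes the usual suanpan convention, where each upper bead counts as five and each lower bead as one; the function name is my own:

```python
def suanpan_digits(n):
    """Represent each decimal digit of n as (upper, lower) bead counts.

    On a traditional Chinese abacus each upper bead counts as 5 and
    each lower bead counts as 1, so the digit 7 is shown as one upper
    bead and two lower beads.
    """
    return [(int(d) // 5, int(d) % 5) for d in str(n)]

print(suanpan_digits(1742))  # [(0, 1), (1, 2), (0, 4), (0, 2)]
```

So the number 1742 needs an upper bead only in the hundreds column (for the 7), and otherwise just lower beads.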
CALCULATING MACHINES
Wilhelm Schickard (1592 - 1635) had this very thought, so he composed drawings of a calculator. His drawings may have been revolutionary, but the calculator he designed was not built. Rather, it was Blaise Pascal (1623 - 1662) who built the first "calculator". Pascal's calculating machine was completed in 1642 and was called the Pascaline. Over fifty years later, in 1694, Gottfried Leibniz (1646 - 1716) designed the Leibniz wheel. This wheel was important for mechanical calculators, for machines containing it could add, subtract, multiply, and divide. Pascal's calculator was thereby surpassed.
Around 1820, Charles Xavier Thomas created the first successful, mass-produced mechanical calculator, the Thomas Arithmometer, which could add, subtract, multiply, and divide. It was based mainly on Leibniz's work.
THE FATHER OF THE COMPUTER
The title "father of the computer" has been given to Charles Babbage (1792 - 1871) for the role he played in the early conceptualization of computers. Babbage was a British mathematician working in London when he came up with the idea of his "Difference Engine" in 1821. He desired to calculate tables of mathematical functions with greater accuracy. In 1822, he wrote a paper titled "Observations of mathematical tables", which detailed his plans for a calculation machine. Ada Lovelace, Lord Byron's daughter, later translated and added notes to the "Sketch of the Analytical Engine" by Luigi Federico Menabrea.
COLOSSUS
Over seventy years later, a British man named Alan Turing (1912 - 1954) was working on a top-secret project. By the end of 1943, "Colossus", a fully operational electronic computer, had been completed at Bletchley Park, where Turing worked. Colossus was used to crack German codes during World War II, and by doing so helped defeat the Nazis. Turing is credited with laying down the fundamental principles of computing, and the Colossus machine is considered one of the first real computers, along with the ENIAC.
3.0 NEW MODELS OF COMPUTATION
Following the evolution of computers over the years, many new models of computation have emerged, but for the sake of this study we focus on just two: cloud computing and quantum computing.
3.1 CLOUD COMPUTING
As defined earlier in the course of this study, cloud computing is a computing paradigm in which a large pool of computer systems is connected in private or public networks to provide dynamically scalable infrastructure for application, data and file storage. With the advent of this technology, the cost of computation, application hosting (a type of hosting that uses an application service provider (ASP) model, sometimes referred to as "on-demand software"), content storage and delivery is significantly reduced.
Cloud computing is a practical approach to realizing direct cost benefits, and it has the potential to transform a data centre from a capital-intensive setup into a variable-priced environment.
The idea of cloud computing is based on a very fundamental principle: the reusability of IT capabilities. The difference that cloud computing brings, compared to traditional concepts such as "grid computing", "distributed computing", "utility computing" or "autonomic computing", is that it broadens horizons across organizational boundaries.
Forrester, an American independent technology and market research company, defines cloud computing as "a pool of abstracted, highly scalable, and managed infrastructure capable of hosting end-customer applications and billed by consumption".
CLOUD COMPUTING MODELS
Cloud computing providers offer services that can be grouped into three categories:
1. SOFTWARE as a SERVICE (SaaS): In this model, a complete application is offered to the customer as a service on demand. A single instance of the service runs on the cloud and multiple end users are serviced. On the customer's side there is no need for upfront investment in servers or software licences, while for the provider the costs are lowered, since only a single application needs to be hosted and maintained. Today SaaS is offered by companies such as Google, Salesforce, Microsoft, Zoho, etc.
2. PLATFORM as a SERVICE (PaaS): Here, a layer of software or a development environment is encapsulated and offered as a service, upon which higher levels of service can be built. The customer has the freedom to build his own applications, which run on the provider's infrastructure. To meet the manageability and scalability requirements of the applications, PaaS providers offer a predefined combination of OS and application servers, such as the LAMP platform (Linux, Apache, MySQL and PHP), restricted J2EE, Ruby, etc. Google's App Engine, Force.com, etc. are some of the popular PaaS examples.
3. INFRASTRUCTURE as a SERVICE (IaaS): IaaS provides basic storage and computing capabilities as standardized services over the network. Servers, storage systems, networking equipment, data centre space, etc. are pooled and made available to handle workloads. The customer would typically deploy his own software on the infrastructure. Some common examples are Amazon, GoGrid, 3Tera, etc.
BENEFITS OF CLOUD COMPUTING
Enterprises would need to align their applications so as to exploit the architecture models that cloud computing offers. Some of the typical benefits are as follows:
1. REDUCED COSTS: There are a number of reasons why cloud computing technology lowers costs. The billing model is pay-per-usage, and the infrastructure is not purchased, which lowers maintenance. Both initial and recurring expenses are much lower than in traditional computing.
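The pay-per-usage billing model can be made concrete with a small sketch. All figures and function names below are hypothetical, purely for illustration of the cost structure:

```python
def cloud_cost_cents(hours_used, rate_cents_per_hour):
    """Pay-per-usage: the bill tracks actual consumption, no upfront purchase."""
    return hours_used * rate_cents_per_hour

def on_premise_cost_cents(server_price_cents, maintenance_cents_per_month, months):
    """Traditional model: capital expense up front plus recurring maintenance."""
    return server_price_cents + maintenance_cents_per_month * months

# Hypothetical figures: 200 hours at 10 cents/hour, versus a $2,000 server
# with $50/month maintenance kept for a year.
print(cloud_cost_cents(200, 10))                  # 2000 cents ($20)
print(on_premise_cost_cents(200_000, 5_000, 12))  # 260000 cents ($2,600)
```

The point is structural, not numerical: in the cloud model the cost scales with usage, while in the traditional model the capital expense is paid whether the server is busy or idle.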
2. INCREASED STORAGE: With the massive infrastructure offered by cloud providers today, storage and maintenance of large volumes of data is a reality. Sudden workload spikes are also managed effectively and efficiently, since the cloud can scale dynamically.
3. FLEXIBILITY: This is an extremely important characteristic. With enterprises having to adapt ever more rapidly to changing business conditions, speed of delivery is critical. Cloud computing stresses getting applications to market very quickly, using the most appropriate building blocks necessary for deployment.
CLOUD COMPUTING CHALLENGES
Despite its growing influence, concerns regarding cloud computing remain. In our opinion the benefits outweigh the drawbacks, and the model is worth exploring. Some common challenges are:
1. DATA PROTECTION: Data security is a crucial element that warrants scrutiny. Enterprises are reluctant to buy an assurance of business data security from vendors; they fear losing data to competitors and compromising the confidentiality of their customers' data. In many instances the actual storage location is not disclosed, adding to the security concerns of enterprises. In the traditional model, data centres owned by the enterprises protect this sensitive information, whereas in the cloud model service providers are responsible for maintaining data security and enterprises have to rely on them.
2. DATA RECOVERY AND AVAILABILITY: All business applications have service level agreements that are stringently followed. Operational teams play a key role in the management of service level agreements and the runtime governance of applications. In production environments, operational teams support:
- Appropriate clustering and failover.
- Data replication.
- System monitoring (transaction monitoring, log monitoring and others).
- Maintenance (runtime governance).
- Disaster recovery.
- Capacity and performance management.
If any of the above-mentioned services is under-served by a cloud provider, the damage and impact could be severe.
3. MANAGEMENT CAPABILITIES: Despite there being multiple cloud providers, the management of platform and infrastructure is still in its infancy. Features like "auto-scaling", for example, are a crucial requirement for many enterprises. There is huge potential to improve on the scalability and load-balancing features provided today.
4. REGULATORY AND COMPLIANCE RESTRICTIONS: In some European countries, governments do not allow customers' personal information and other sensitive information to be physically located outside the state or country. To meet such requirements, cloud providers need to set up a data centre or a storage site exclusively within the country to comply with regulations. Having such an infrastructure may not always be feasible and is a big challenge for cloud providers.
Generally, with cloud computing the action moves to the interface: that is, to the interface between service suppliers and multiple groups of service consumers. Cloud services will demand expertise in distributed-services procurement, risk assessment and service negotiation, areas that many enterprises are only modestly equipped to handle.
3.2 QUANTUM COMPUTING
Quantum computing studies theoretical computation systems (quantum computers) that make use of quantum-mechanical phenomena, such as superposition and entanglement, to perform operations on data. Quantum computers are different from digital electronic computers based on transistors: whereas common digital computing requires that data be encoded in binary digits (bits), each of which is always in one of two definite states (0 or 1), quantum computation uses quantum bits, which can be in superpositions of states. A quantum Turing machine is a theoretical model of such a computer. Quantum computers also share similarities with non-deterministic and probabilistic computers. The field of quantum computing was initiated by the work of Paul Benioff and Yuri Manin in 1980.
A COMPARISON OF CLASSICAL AND QUANTUM COMPUTING
Classical computing relies, at its ultimate level, on principles expressed by Boolean algebra, usually operating with a seven-gate logic principle, though it is possible to make do with only three gates (such as AND, NOT and COPY). Data must be expressed in an exclusive binary state at any point in time: either 0 (off/false) or 1 (on/true). These values are binary digits, or bits. The millions of transistors and capacitors at the heart of computers can only be in one state at any point. While the time each transistor or capacitor needs to hold its 0 or 1 state before switching is now measurable in billionths of a second, there is still a limit to how quickly these devices can be made to switch states.
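The classical gates referred to above act on definite bits. As a minimal sketch, the usual seven Boolean gates can be written in Python (the naming and one-line style here are mine, for illustration only):

```python
# The seven common Boolean gates, acting on definite bits (0 or 1).
AND  = lambda a, b: a & b
OR   = lambda a, b: a | b
NOT  = lambda a: 1 - a
NAND = lambda a, b: NOT(AND(a, b))   # NOT of AND
NOR  = lambda a, b: NOT(OR(a, b))    # NOT of OR
XOR  = lambda a, b: a ^ b            # exclusive OR
XNOR = lambda a, b: NOT(XOR(a, b))   # equality test on bits

print(XOR(0, 1), NAND(1, 1), XNOR(0, 0))  # 1 0 1
```

Every input and output here is exactly 0 or 1; there is no intermediate state, which is precisely the constraint that quantum computing relaxes.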
The quantum computer, by contrast, can work with a two-mode logic gate: XOR and a mode called QO1 (the ability to change 0 into a superposition of 0 and 1), a logic gate which cannot exist in classical computing. In quantum computers, a number of elementary particles such as electrons or photons can be used (in practice, success has also been achieved with ions), with either their charge or polarization acting as a representation of 0 and/or 1.
Each of these particles is known as a quantum bit, or qubit; the nature and behaviour of these particles form the basis of quantum computing.
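A qubit can be pictured as a pair of complex amplitudes over the states 0 and 1. The following is a minimal sketch of that picture in plain Python (the function name is my own, and this models only measurement probabilities, not full quantum dynamics):

```python
import math

# A classical bit is definitely 0 or 1; a qubit holds complex amplitudes
# (a, b) over the states |0> and |1>, with |a|^2 + |b|^2 = 1.
def measure_probabilities(a, b):
    """Probabilities of reading 0 or 1 when the qubit is measured."""
    return abs(a) ** 2, abs(b) ** 2

# Equal superposition: both outcomes equally likely.
amp = 1 / math.sqrt(2)
p0, p1 = measure_probabilities(amp, amp)
print(p0, p1)  # roughly 0.5 and 0.5
```

The superposition exists until measurement; on readout only one of the two classical values appears, with the probabilities computed above.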
QUANTUM PROGRAMMING
Perhaps even more intriguing than the sheer power of quantum computing is the ability it offers to write programs in a completely new way. For example, a quantum computer could incorporate a programming sequence along the lines of "take all the superpositions of the prior computations", something which is meaningless on a classical computer, and which would permit extremely fast ways of solving certain mathematical problems, such as the factorization of large numbers, an example of which we discuss below. There have been two notable successes thus far with quantum programming. The first occurred in 1994, when Peter Shor developed a quantum algorithm that could efficiently factorize large numbers. It centres on a system that uses number theory to estimate the periodicity of a large number sequence. The other breakthrough came from Lov Grover of Bell Labs in 1996, with a very fast algorithm that is proven to be the fastest possible for searching through unstructured databases. The algorithm is so efficient that it requires only about √N steps on average (where N is the total number of elements), as opposed to a classical search, which needs N/2 steps on average.
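The speed-up claimed above is easy to tabulate. The sketch below uses the standard (π/4)·√N estimate for the number of Grover iterations; the helper names are mine, and the figures are illustrative only:

```python
import math

def classical_avg_lookups(n):
    """Average probes for an unstructured classical search: N/2."""
    return n / 2

def grover_iterations(n):
    """Approximate Grover iteration count, about (pi/4) * sqrt(N)."""
    return math.floor(math.pi / 4 * math.sqrt(n))

n = 1_000_000
print(classical_avg_lookups(n))  # 500000.0
print(grover_iterations(n))      # 785
```

For a million unsorted items, a classical search probes half a million entries on average, while Grover's algorithm needs only a few hundred iterations; the gap widens as N grows.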
PREHISTORY OF QUANTUM COMPUTING
Since 1945 we have been witnessing a rapid growth of the raw performance of computers with respect to their speed and memory size. An important step in this development was the invention of transistors, which already use some quantum effects in their operation. However, it is clear that if such an increase in the performance of computers continues, then after 50 years our chips will have to contain 10^16 gates and operate at a 10^14 Hz clock rate (thus delivering 10^30 logic operations per second). It seems that the only way to achieve that is to learn to build computers directly out of the laws of quantum physics.
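The projection above is simple arithmetic: the gate count multiplied by the clock rate gives the number of logic operations per second.

```python
gates = 10 ** 16       # projected number of gates per chip
clock_hz = 10 ** 14    # projected clock rate in Hz
ops_per_second = gates * clock_hz
print(f"{ops_per_second:.0e}")  # 1e+30 logic operations per second
```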
In order to take the idea of quantum information processing seriously, and to develop it so far and so fast, it has been necessary to overcome several intellectual barriers.
The most basic one concerned an important feature of quantum physics: reversibility. None of the known models of universal computers was reversible. This barrier was overcome first by Bennett (1973), who showed the existence of universal reversible Turing machines, and then by Toffoli (1980, 1981) and Fredkin and Toffoli (1982), who showed the existence of universal classical reversible gates.
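A well-known universal classical reversible gate of the kind Toffoli described is the controlled-controlled-NOT (Toffoli) gate. A minimal Python sketch, illustrative only and not drawn from the original papers:

```python
def toffoli(a, b, c):
    """Toffoli (controlled-controlled-NOT): flip c iff a and b are both 1."""
    return a, b, c ^ (a & b)

# Reversibility: applying the gate twice restores the original bits,
# so no information is ever destroyed.
for bits in [(0, 0, 1), (1, 1, 0), (1, 0, 1)]:
    assert toffoli(*toffoli(*bits)) == bits

print(toffoli(1, 1, 0))  # (1, 1, 1)
```

Because the gate is its own inverse and can express AND and NOT, circuits built from it compute anything a classical computer can, without the irreversible erasure that ordinary gates perform.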
The second intellectual barrier was overcome by Benioff (1980, 1982, 1982a)
who showed that quantum mechanical computational processes can be at least as
powerful as classical computational processes. He did that by showing how a
quantum system can simulate actions of the classical reversible Turing machines.
However, his “quantum computer” was not fully quantum yet and could not
outperform classical ones.
4.0 WHAT DO WE SEE COMPUTERS DOING IN YEARS TO COME?
It has been predicted that if the aviation industry
had experienced the same rate of growth as the computer industry, we would be
able to cross the Atlantic Ocean supersonically for a small fare. The industry
shows no evidence of slowing. We have been bitten by the “technology bug” and
will not stop making computers smaller and faster. The development of
technology is really due to the human desire to control nature and to overcome
the constraints nature places on us. Through the use of computers, we can
attempt to control the world around us. By the means of input and output, we
can attempt to computerize our lives. There is still a lot more to learn
though. The computer industry has proven this repeatedly. As Isaac Newton once
noted, “what we know is a drop of what we don’t know in an ocean”.
Although computers are now able to do "intellectual" things thanks to advances in artificial intelligence (AI), they are still not able to reason like human beings, and some believe that they never will. Christopher Evans, in his 1979 book The Mighty Micro, calculated that the mental power of the world's most advanced computer was equal to that of an earwig, which has a lower intelligence than a fish. There have since been considerable advances in AI, as demonstrated by IBM's "Deep Blue". This machine was extremely advanced in the game of chess and, as such, was able to beat the best human chess player in the world.
The computer has become increasingly important for businesses, which can "hire" robots to do work previously done by expensive humans. This trend is still alive today: automotive plants utilize the efficiency and accuracy that a computer-controlled robot provides. "Smart vehicles" are being produced that can detect certain conditions. For example, many cars can detect that a door is ajar and notify the driver of the problem by "speaking" to him or her. These robots can perform physical functions that a human can, but they are in no way even remotely humanoid. That notion is brought into society by science-fiction books, comics and films. George Lucas is renowned for his film "Star Wars", where humanistic robots help Luke Skywalker save thousands of lives. Although the movies are exciting, they are unrealistic by modern standards. Movie makers like Lucas do, however, use computer technology to make these robots realistic enough for moviegoers to believe in them.
5.0 SUMMARY, CONCLUSION AND RECOMMENDATIONS
5.1 SUMMARY
This study was carried out to inform the general public about how computers have evolved over the years. To give the work good footing, theoretical and empirical literature relevant to this study was reviewed. There has, of course, been significant evolution of computers over the years, and the world should still expect more and more technologies. Finally, the study brings readers up to date with cloud and quantum computing.
5.2 CONCLUSION
The study examined new models of computation, in particular cloud and quantum computing. It also reveals how the evolution of computers has affected humans and the world at large.
New models of computation have affected the world both positively and negatively. In the not-so-distant future we may see computers performing most of our human activities, ranging from education to industry to entertainment and more.
5.3 RECOMMENDATIONS
This study is recommended to the general public, to bring people up to speed with the fast-growing pace of computing. Students preparing to study computer-related disciplines in tertiary institutions are also encouraged to read this research.
REFERENCES
Gruska, J. (2015). Quantum computing. Cambridge: Cambridge University Press.
Julius, N. (2009). Computer science beginner's handbook for undergraduates. Cross River: Uncial Printing Press.
Wenger, P. and Eugene, E. (2003). New models of computation. The Computer Journal, 47(1), pp. 1-9. Toronto: First Printers Press.
Wichard, M. L. and Stephen, G. S. (2011). Models for computer based testing. New York: New Day Press.
Yatchi, E. (2014). Application of modern technology in the world today. Nigerian Library and Information Science Review, 2(1 & 2), 3. Ibadan: Kelvin Frank.
Zara, P. H. (2004). The science of computing: shaping a discipline. Lagos: Taylor and Francis Group, CRC Press.