Seeing the advantage of such a
process, the Census Bureau put Hollerith's machine to good use in the
1890 census. The new device improved the tallying procedure, allowing
the Bureau to tabulate the census much faster than it
had in the 1880 round, even though the country's population had grown.
Having proved its value in speeding up calculations and reducing the cost
of the census (the savings were almost ten times greater than the Bureau had
expected), the punch-card tabulator attracted the attention of the owners
of big businesses such as railroads, insurance agencies, banks, and
mass-market manufacturers and retailers.
Seeing the commercial potential of his invention, Hollerith established
the Tabulating Machine Company, whose main aim was to sell tabulators to
businesses. As demand for such services grew, Hollerith's
firm merged into the Computing-Tabulating-Recording Company, which supplied
even larger business machines. Thirteen years later, a
talented young manager named Thomas J. Watson was brought in to run the
business. Once he had taken over the reins of the company, he changed its
name to the more impressive-sounding International Business
Machines Corporation (IBM). This marked the dawn of the information
technology industry.

In retrospect it would seem inevitable that people at
that time would have expected computers to become the backbone of
modern business. Strangely, however, the reverse was true: people at that
time had much scepticism about the machine's usefulness. So much so that
Howard Aiken, a distinguished Harvard mathematician, a member of the US
government's National Research Council and the creator of the Harvard Mark
I computer, commented to Edward Cannon, of the U.S. National Bureau of
Standards, in 1948 that the idea that there would be a big market for
computers was "foolishness"1. Furthermore, he is documented as stating
that "there never would be enough work for more than two of these
computers"2.
As technology evolved, the advent of the tiny transistor saw the
big, bulky vacuum tubes replaced, and with them came the birth of what we now
know as the desktop computer. It is ironic that, echoing those
earlier views, the dominant computer companies of the day, from IBM to
Digital, paid little attention to these quirky new machines; the PC
was seen as too weak to be of any use. It took the brilliance of a college
dropout named Bill Gates to see the potential of these personal
computers in business. In 1975, Gates, together with his high-school friend
Paul Allen, founded a little company named Micro-Soft to write software
for the newly invented PC. Gates envisaged that these machines would not
only find a place inside business but that, because of their versatility and
low cost, they would supplant the bulky mainframe as the centre of corporate
computing.3
Nevertheless, such advances still did not allow the
computer to meet its full potential: a single workstation
could not compete with minicomputers and mainframes on raw
power. This was solved by the advent of networks of machines,
in which the collective power of the PCs proved greater than
the sum of the parts. In the 1990s a slew of "application service providers"
emerged, with considerable venture-capital backing, in hopes of supplying
businesses with software programs over the Internet3.
However, this good intention met, at the time, a significant
"barrier to entry": a wide chasm existed between
communication speeds and computer processing speeds. Two laws were
coined to explain this. One is Moore's law: Gordon E. Moore observed in 1965 that,
over the history of computing hardware, the number of transistors on
integrated circuits doubles approximately every two years4. The other is
Grove's law: Andrew Grove stated that, while chip density doubles every
eighteen months (Moore's law), telecommunications bandwidth doubles every
100 years1.
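The gap implied by these two laws can be quantified by counting doublings over a fixed horizon. The sketch below is a hypothetical illustration (the 20-year horizon and the helper function are assumptions, not part of either law's original statement):

```python
# Growth factor after `years`, given a doubling period (in years).
def growth_factor(years: float, doubling_period: float) -> float:
    return 2 ** (years / doubling_period)

# Over a 20-year horizon:
chip_growth = growth_factor(20, 2)         # Moore's law: 2**10 = 1024x
bandwidth_growth = growth_factor(20, 100)  # Grove's law: 2**0.2, about 1.15x

print(chip_growth, bandwidth_growth)
```

At those rates, processing power outruns bandwidth by roughly three orders of magnitude in a single generation, which is exactly the chasm the article describes.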
In the recent past, however, Grove's law has progressively been
negated. With the genesis and subsequent improvement
of communication services, the next stage in the
evolution of computing services has arrived: providing computing as a
utility. Now that data can be transferred quickly and cheaply, "the
full power of computers can finally be delivered to users from afar. It
doesn't matter much whether the server computer running your program is in
the data center down the hall or in somebody else's data center on the
other side of the country. All the machines are now connected and shared
-- they're one machine."3
What is utility computing? As stated in Wikipedia, utility computing is the
packaging of computing resources, such as computation, storage and
services, as a metered service5.
Imagine a day on which a person could simply plug a laptop, palmtop
or tablet PC into a wall socket and obtain services pre-negotiated
between the service provider and his/her respective business, in order to
carry out his/her daily work.
The service provider may supply the company with a comprehensive package.
The package may come pre-bundled with:
• Computer hardware, including standard servers, CPUs, monitors, input
devices and network cables.6
• Internet access, including Web servers and browsing software.6
• Software applications that run the entire gamut of computer programs.
They could include word-processing programs, e-mail clients,
project-specific applications and everything in between. Industry experts
call this particular kind of business "Software as a Service" (SaaS).6
• Access to the processing power of a supercomputer. Some corporations
have hefty computational requirements. For example, a financial company
might need to process rapidly changing data gathered from the stock
market. While a normal computer might take hours to process complicated
data, a supercomputer could complete the same task much more quickly.6
• The use of a grid computing system. A grid computing system is a
network of computers running special software called middleware. The
middleware detects idle CPU processing power and allows an application
running on another computer to take advantage of it. It is useful for large
computational problems that can be divided into smaller chunks.6
• Off-site data storage, also called cloud storage. There are
many reasons a company might want to store data off-site. If the company
processes a lot of data, it might not have the physical space to hold the
data servers it needs. An off-site backup is also a good way to protect
information in case of a catastrophe. For example, if the company's
building were destroyed in a fire, its data would still exist in another
location.6
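The grid computing idea above, dividing a large problem into chunks that run on otherwise idle processors, can be sketched in miniature with a worker pool. This is a hypothetical illustration only: the problem, the chunking scheme, and the pool stand in for real middleware, which would distribute the chunks across separate machines rather than local threads.

```python
from concurrent.futures import ThreadPoolExecutor

# A divisible problem: summing the squares of a large range of numbers.
def sum_squares(chunk: range) -> int:
    return sum(n * n for n in chunk)

def grid_sum_squares(limit: int, workers: int = 4) -> int:
    # Split the range into one chunk per worker, the way grid
    # middleware splits a job across idle CPUs, then combine results.
    step = limit // workers
    chunks = [range(i * step, limit if i == workers - 1 else (i + 1) * step)
              for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(sum_squares, chunks))
```

The key property is that the combined answer is identical to the single-machine answer; only the elapsed time changes as more idle capacity joins the grid.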
At the end of the month, the business in question would then receive an
invoice whose charges are based on usage rather than a fixed flat fee.
This is comparable to a modern electricity tariff,
where usage of a certain number of units is billed at a predefined
rate.
The second part of this article
will appear in next month's issue.
Author:
Kumar is the Secretary of the ISACA Sri Lanka Chapter and, as an
information systems audit and assurance professional, is currently working
as an Information Systems Auditor at SJMS Associates, an esteemed firm of
Chartered Accountants and independent correspondent firm of Deloitte Touche
Tohmatsu.