How computer science and technology go hand in hand

Although the term computer science was first coined in 1961, the origins of the discipline can be traced back thousands of years. What came to be known as computer science has always gone hand in hand with the development of technology. In some cases, theories of computing have had to wait for technology to catch up before they could be put into practice. In many other cases, computer science has birthed new technology, providing the inspiration and the means for paradigm-changing devices to be created.

Changing the world

In the last hundred years alone, computer science has helped to win a world war, put a man on the moon, and revolutionize society through the internet, personal computers, and smartphones. Computer science now underpins the fabric of our technologically driven, digitally mediated lives. Without it, so much of what we take for granted in modern civilization would not be possible.

Purdue University opened the first computer science department in 1962, a year after George Forsythe first named the subject. Forsythe would go on to create the computer science department at Stanford University, one of the world’s leading research universities. Today you can take an online master’s in computer science at your own pace, using the power of the world wide web to study the foundational principles that make it possible. Computer science is so far-reaching that a postgraduate qualification in the subject can be useful in almost any sector.

Mathematical calculation

From the beginning, computer science has been bound up not only with the history of technology, but also with the development of mathematical theory and new discoveries in physics and philosophy. The earliest ancestor of the modern computer is the humble abacus, and the principle of using a representational analog device to assist in mathematical calculation goes back to ancient Sumer. Lines of pebbles in the sand were used as long ago as 2700 BC, before the abacus evolved into the framework of moveable beads on separate lines that we are familiar with today.

Medieval computer technology

A device used to calculate the positions of the stars was discovered off the Greek island of Antikythera in 1901. Believed to date back to around 100 BC, the Antikythera Mechanism is sometimes considered to be one of the earliest mechanical analog computers.

Mechanical, self-playing musical instruments, including a hydro-powered organ and a programmable automatic flute player, were invented by the Banu Musa brothers in Baghdad and described in their ‘Book of Ingenious Devices’. In 1206, Ismail al-Jazari described his invention of the first programmable drum machine, played by automated ‘robot’ musicians.

Babbage and Lovelace

The story of modern computing really begins with the pioneering work of Charles Babbage and Ada Lovelace in the mid-nineteenth century. Babbage was a professor of mathematics at Cambridge University; Lovelace, the daughter of the Romantic poet Lord Byron and a brilliant scholar in her own right, worked as his assistant.

In his lifetime, Babbage proposed two hypothetical digital computing machines. The first of these was the Difference Engine, intended to calculate mathematical tables using mechanical components such as ten-toothed metal wheels mounted in columns. In 1822 he built a small working model, which represented a fraction of the proposed whole, and used it to carry out relatively complex calculations.

A modified version of Babbage’s Difference Engine was later built by the Swedish father-and-son team Georg and Edvard Scheutz, but Babbage’s original design was never completed in his lifetime.
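
The engine’s name comes from the method of finite differences, which lets a machine tabulate a polynomial using nothing but repeated addition, exactly the kind of work its columns of toothed wheels were meant to mechanize. The article does not spell the method out, so the short Python sketch below is our own illustration; the function names and the quadratic are arbitrary choices made for the example.

    # A toy version of the method of finite differences: once the machine is
    # "seeded" with the first table value and its differences, every further
    # entry needs only addition, never multiplication.

    def seed_differences(values):
        """Reduce the first few table values to a column of initial differences."""
        rows = [list(values)]
        while len(rows[-1]) > 1:
            prev = rows[-1]
            rows.append([prev[i + 1] - prev[i] for i in range(len(prev) - 1)])
        return [row[0] for row in rows]

    def tabulate(diffs, steps):
        """Extend the table by repeated addition, as the Difference Engine would."""
        diffs = list(diffs)
        table = [diffs[0]]
        for _ in range(steps):
            for i in range(len(diffs) - 1):
                diffs[i] += diffs[i + 1]
            table.append(diffs[0])
        return table

    f = lambda x: x * x + x + 41          # example polynomial, chosen arbitrarily
    print(tabulate(seed_differences([f(0), f(1), f(2)]), 5))
    # -> [41, 43, 47, 53, 61, 71], matching f(0) .. f(5)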

The Analytical Engine

Babbage’s second proposal, the Analytical Engine, was more sophisticated and was intended for wider computational use. It included an expandable memory store, which could have held around 675 bytes, playing much the same role as RAM does in a modern computer. There was also a central processing unit (CPU), which Babbage referred to as ‘the mill’, and the Analytical Engine was capable of conditional branching, selecting from alternative actions based on previous outcomes. It was programmed with punched cards strung together with ribbons, an idea borrowed from the Jacquard weaving loom, an example of the mechanized technology that drove Britain’s Industrial Revolution.

Babbage died in 1871, partway through the construction of a large model of the Analytical Engine, and as with the Difference Engine, a full working example was never constructed. However, the work of Babbage and Lovelace would inspire and lay the groundwork for future generations of computer scientists. Ada Lovelace is credited with creating the first computer algorithm, a program for calculating Bernoulli numbers on the Analytical Engine, and the programming language Ada is named in her honor.
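
As a modern illustration of the kind of calculation Lovelace described, the Python sketch below computes the first few Bernoulli numbers. It uses the standard recurrence for the numbers rather than Lovelace’s original tabular method, and the names are our own, so treat it as a present-day analogue of her program, not a transcription of it.

    # Bernoulli numbers via the recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0.
    # A present-day analogue of the calculation in Lovelace's notes, not a
    # transcription of her original method.
    from fractions import Fraction
    from math import comb

    def bernoulli(n):
        """Return the Bernoulli numbers B_0 .. B_n as exact fractions."""
        B = [Fraction(0)] * (n + 1)
        B[0] = Fraction(1)
        for m in range(1, n + 1):
            acc = sum(comb(m + 1, k) * B[k] for k in range(m))
            B[m] = -acc / (m + 1)
        return B

    for i, b in enumerate(bernoulli(8)):
        print(f"B_{i} = {b}")   # B_0 = 1, B_1 = -1/2, B_2 = 1/6, ...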

Computing machines

We can see that theories of computing developed alongside technological advances such as those of the Industrial Revolution, and that increased mechanization began to make the construction of actual computers a possibility. Nevertheless, in the early 20th century, ‘computers’ still referred to human clerks, usually women, who worked under the direction of a physicist, performing calculations for commercial, government, and research purposes.

The first widely available computing machines began to be used in the 1920s, but these were not commonly referred to as computers until the post-war period. The development of computing during this period was driven largely by the work of Alan Turing, who did much to define modern notions of computation and artificial intelligence.

The modern age

The Church-Turing thesis, formulated independently by Alonzo Church and Alan Turing in 1936, holds that a mathematical method is effective if it can be set out as a list of instructions that a human clerk could follow for as long as necessary, without any need for additional ingenuity or insight, and that anything calculable by such a method can be computed by a machine. While these foundations were being laid, the first electronic digital computer, the Atanasoff-Berry Computer, was built on the Iowa State campus between 1939 and 1942 by physics and mathematics professor John V. Atanasoff and engineering graduate student Clifford Berry.
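
Euclid’s algorithm for the greatest common divisor is the textbook example of an effective method in this sense (our example, not one the article cites): a clerk could carry it out indefinitely with pencil and paper and no insight. A minimal Python sketch:

    def gcd(a, b):
        """Euclid's algorithm: an 'effective method' in the Church-Turing
        sense, a finite list of rote steps requiring no ingenuity."""
        while b != 0:
            a, b = b, a % b
        return a

    print(gcd(1071, 462))   # -> 21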

Personal computers began to emerge in the 1970s. Bill Gates and Paul Allen founded Microsoft in 1975, and Steve Jobs and Steve Wozniak set up Apple in 1976. The modern age was nearly upon us, and in the 1990s the world wide web became widely accessible.

Technology and computer science have come a long way in the last few decades, but the principles on which our contemporary digital existence is founded were laid down over several centuries. Technology has enabled these theories to be put into practice, and they have, in turn, made new forms of technology possible.

The future will doubtless see more innovations that will take us further than we can currently imagine.
