What does the future of computer technology hold in store?

by Dr. Ashutosh Kumar Singh and Akilan Thangarajah

The history of computers can be traced to the abacus, invented by the Babylonians around 2400 BC. The abacus was an arithmetic calculator, a computing device built from simple mechanical components.

Since then, computing machines have gone through many evolutions as scientists, mathematicians and researchers worked towards perfecting them, ultimately as electronic devices.

Mechanical computers were widely used in research laboratories well into the 20th Century. With the two world wars, computer technology was dramatically transformed to meet strategic military needs. During the Second World War and the subsequent Cold War, computers were developed for military applications such as controlling weapons, gathering intelligence and launching missiles.

Between 1943 and 1946, the United States Army funded the University of Pennsylvania to develop ENIAC, a computer system intended to calculate artillery firing tables. However, the Army did not get to use the system during the war because it was not completed in time.

At this point, the size, processing time, and power consumption of computers became major concerns as mobility of the devices became paramount in military operations. This led to a new paradigm of computer technology that continues to this day.

Today’s computer architecture is based on concepts introduced by the prominent mathematicians and computer scientists John von Neumann and Alan Mathison Turing. Von Neumann pioneered the fundamental components of modern computers – input/output devices, memory, the arithmetic/logic unit (ALU) and the control unit.
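
To make the roles of these components concrete, here is a minimal toy sketch in Python (the instruction set, register names and program below are invented purely for illustration and do not correspond to any real machine): the control unit repeatedly fetches an instruction from memory, the ALU carries out the arithmetic, and the result is sent to an output device.

    # A toy illustration of the Von Neumann cycle. The control unit fetches
    # instructions from memory one by one, the ALU performs the arithmetic,
    # and results are delivered to an output device (here, the screen).
    memory = {
        0: ("LOAD", "A", 7),      # put the value 7 into register A
        1: ("LOAD", "B", 5),      # put the value 5 into register B
        2: ("ADD", "A", "B"),     # ALU: A = A + B
        3: ("PRINT", "A", None),  # send the contents of register A to output
        4: ("HALT", None, None),  # stop the machine
    }
    registers = {"A": 0, "B": 0}
    program_counter = 0

    while True:
        opcode, dst, src = memory[program_counter]  # fetch
        program_counter += 1                        # the control unit advances
        if opcode == "LOAD":                        # decode and execute
            registers[dst] = src
        elif opcode == "ADD":
            registers[dst] += registers[src]
        elif opcode == "PRINT":
            print(registers[dst])                   # output device prints 12
        elif opcode == "HALT":
            break

Crucially, the program and the data it works on live in the same memory, which is the defining feature of the Von Neumann design.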

Computer manufacturers, led by IBM and Apple and later joined by Intel and others, built on these concepts and gradually improved the technology. Thus, we have already gone through two generations of computers. The computers we use today are considered the third generation, and there will inevitably be a fourth generation in the future. The generations are differentiated by the introduction of transistor technology, silicon chips and the improvements that followed them.

Third generation computers contribute tremendously to all spheres of life, including astronomy and aerospace, the military, politics, education, medicine and healthcare, manufacturing, business, communications, transportation, farming and plantations, entertainment, cooking, cleaning, and security and surveillance.

Meanwhile, computers have shrunk dramatically, to the size of a person’s palm or even smaller, compared to the early days when room-sized mainframe computers were commonplace.

Although today’s computers technically comprise the same basic components as Von Neumann’s design, those components are no longer individually distinguishable as they were in his original machines. Thanks to the tremendous advances in technology since his day, the latest designs have merged or embedded them.

For example, touch screen technology has removed the need for external input devices such as a keyboard or mouse by virtualising their functionality on screen. Similarly, local storage devices are increasingly being replaced by ‘cloud’ storage.

The advances in hardware and software technology are quite incredible and have brought immense benefits to mankind. Just pay a visit to a nearby mall or surf the Internet in your leisure time and you will be amazed by the myriad of gadgets that have not only become much smaller and handier, but also offer more and more capabilities and functions.

The future of computers will always be tied to Intel co-founder Gordon E. Moore’s law, which observes that the number of transistors on a chip, and with it the available processing power, doubles roughly every two years. However, a question begs to be asked: “Is it necessary to keep increasing processing speed exponentially just to keep up with Moore’s law?”
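
As a rough, back-of-the-envelope illustration of how quickly such a doubling compounds (the starting figure below is a round number chosen for illustration, not exact historical data), a few lines of Python suffice:

    # If transistor counts double roughly every two years, a chip that starts
    # with about one million transistors would, on this trend alone, reach
    # roughly a billion transistors within about twenty years.
    def transistors_after(start_count, years, doubling_period=2):
        return start_count * 2 ** (years / doubling_period)

    print(f"{transistors_after(1_000_000, 20):,.0f}")  # prints 1,024,000,000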

Indeed, there is a bottleneck in the access speeds of storage and memory devices. No matter how quickly processors work, the computer’s random access memory (RAM) and storage devices such as hard disks are not quick enough to keep up with them, even with bridging caches (pronounced ‘cash’): small pools of special temporary memory that are considerably quicker than RAM but limited in size and comparatively expensive.
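
The following toy Python model, which uses invented latency figures rather than measurements of any real hardware, illustrates why this matters: a program that keeps reusing a small working set mostly hits the fast cache, while one that scatters its accesses keeps paying the full cost of the slow main memory.

    # A toy model of the memory hierarchy: a small, fast cache in front of a
    # large, slow main memory. Latency figures are arbitrary units chosen for
    # illustration only.
    from collections import OrderedDict

    CACHE_SIZE = 4        # a cache holds far fewer items than main memory
    CACHE_LATENCY = 1
    MEMORY_LATENCY = 100

    main_memory = {addr: addr * 10 for addr in range(64)}
    cache = OrderedDict()  # ordered, so the least recently used entry can be evicted
    total_time = 0

    def read(addr):
        """Return the value at addr, charging cache or memory latency."""
        global total_time
        if addr in cache:                  # cache hit: fast
            total_time += CACHE_LATENCY
            cache.move_to_end(addr)
            return cache[addr]
        total_time += MEMORY_LATENCY       # cache miss: fetch from slow memory
        value = main_memory[addr]
        cache[addr] = value
        if len(cache) > CACHE_SIZE:
            cache.popitem(last=False)      # evict the least recently used entry
        return value

    for _ in range(10):                    # reuse a small working set: mostly hits
        for addr in (0, 1, 2, 3):
            read(addr)
    print("time with good locality:", total_time)   # 436 units

    total_time = 0
    cache.clear()
    for addr in range(40):                 # scattered accesses: every read misses
        read(addr)
    print("time with poor locality:", total_time)   # 4000 units

No matter how fast the processor at the top of this hierarchy becomes, the second access pattern is limited by the speed of the memory, which is precisely the bottleneck described above.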

This bottleneck may be eased in the near future as new memory technologies are developed. Recently, IBM revealed a technology called Phase Change Memory (PCM) that it expects to yield memory devices up to 100 times faster than current flash memory.

On the other hand, there are concerns about the power dissipation of modern technology. To address this, researchers have proposed the use of ‘reversible logic’, which could theoretically approach zero power dissipation. At the same time, quantum computers, which promise to transform the computing experience of the future, are likely to be introduced.

Overall, we have seen computer technology improve drastically in the 21st Century, such that the benefits and usefulness of computers are now enjoyed by people in all corners of the world. The future of computer technology promises even more tremendous breakthroughs that will inevitably benefit mankind, just as the replacement of vacuum valves by transistors transformed mainframe computers into personal computers.

However, another question begs to be asked: “Will everyone in the world indeed get to benefit from these new advancements? Or will the current eco-social imbalance in the world persist, denying millions in Africa, Asia and Latin America the enjoyment of the new technology?”

Dr. Ashutosh Kumar Singh is an Associate Professor and Head of the Department of Electrical and Computer Engineering (ECE) at Curtin Sarawak. His research interests include the verification, synthesis, design and testing of digital circuits, and he has published around 60 research papers on these subjects in various conferences and research journals. He has co-authored two books, ‘Digital Systems Fundamentals’ and ‘Computer System Organisation & Architecture’, and has delivered talks on computer engineering in several countries including Australia, the United Kingdom and the United States. Dr. Ashutosh can be contacted by e-mail at ashutosh.s@curtin.edu.my.

Akilan Thangarajah is a fourth-year Computer System Engineering student of Curtin Sarawak’s School of Engineering and Science.