Vlad Cristian Marin

From Bits to QUbits: A New Era of Computing (Part 1)

Around 2500 BC, Babylonians were using the abacus for simple arithmetic tasks. Ca. 150-100 BC, the Antikythera mechanism was used for astronomical and calendrical purposes. In the 11th century, during the Song dynasty, Su Song built his astronomical clock tower. In 1642, Blaise Pascal (aged 19) pioneered the field of computing, creating one of the first mechanical calculators. Two centuries later, Charles Babbage designed the first programmable mechanical computer, which in today’s terms could be labelled the first “Turing-complete” machine. February 1946 saw the birth of ENIAC (Electronic Numerical Integrator and Computer) and marked a unique breakthrough in history – the “renaissance” of computing. The 75 years since have probably seen the fastest developments of all time, from every point of view. They are, in absolute terms, a small fraction of the world’s history pages, but in relative terms, the reinvention of the wheel – technologically speaking. From month to month and from year to year, the power of computational machines has kept growing exponentially – right up to today. Let’s take a look at how computers have changed, as well as how they’ve changed our lives – forever.


(1940s): “Computers in the future might weigh no more than 1.5 tons.” – Popular Mechanics

Thousands of years of trials, of new discoveries, of an unimaginable number of contributors from all over the world – Babylonians, Egyptians, Romans, Byzantines, Arabs, Persians – all sorts of nations, cultures and civilizations, culminated in the 20th century. Its early years saw a tremendous amount of innovation in computing, all facilitated by adjacent technological advancements of the time. Communication quickly became a complex system of integrated parts, and computers made their fair contribution.

What is a computer?

A computer is an electronic device that manipulates information, or data. It has the ability to store, retrieve, and process that data. Modern times have given us far more applications of computers than were initially predicted (if predicted at all), and so far there seems to be no limit to the domains where they can be of use. For the regular person, today’s computers (and, thank God, the creation of the internet) make it possible to enjoy news, to keep in contact with loved ones, to play video games, to read and write, to compile handy information… The list of possibilities goes on.
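To make that definition a little more concrete, here is a minimal sketch in Python. It is purely illustrative (my own toy example, not tied to any real system): the program stores some data, retrieves a value, and processes the data into new information – the three core abilities mentioned above.

```python
# Toy illustration of a computer's three core abilities:
# storing, retrieving, and processing data.

# Store: keep raw data in memory (hypothetical monthly visitor counts).
stored_data = {"january": 1200, "february": 1350, "march": 1500}

# Retrieve: look up a previously stored value.
march_visitors = stored_data["march"]

# Process: turn stored data into new information (a monthly average).
average_visitors = sum(stored_data.values()) / len(stored_data)

print(f"March visitors: {march_visitors}")
print(f"Average per month: {average_visitors:.0f}")
```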

“One of the most feared expressions in modern times is ‘The computer is down.’” – Norman R. Augustine

I personally get very excited whenever I think about all of these possibilities. Computers have quickly become a defining part of our lives. As time has passed, the definition of a “computer” has broadened extensively, and computers have spread everywhere. As you can imagine, it’s hard to keep track of their number, but some estimates put personal computers alone at over 2.5 billion units in use at the moment. Even with hundreds of thousands of discarded electronics filling the junkyards every week, the numbers keep growing all the time. In the same light, the world’s internet penetration (the number of internet users) is growing at a fast pace every second. Compared to 1995, when less than 1% of the world’s population was connected, today over 40% of Earth’s individuals have access to the internet. According to a recent study by the McKinsey Global Institute, over 46% of the world’s businesses are automated. On a macro scale, these figures may seem modest for a world that has only just entered 2016, but in fact, humanity has already passed milestones that once seemed out of reach.

Cognitive Computing

Remember all the possibilities I was telling you about just now? Some people did not stop at fantasizing about revolutionizing the world through computing; these people made it reality. For practical reasons, many (of the already so many) small contributions and intermediary steps in the computer’s evolution will be ignored, and we will jump to the next stop: cognitive computing.

Cognitive computing is a form of machine learning in which decisions are based on past experiences, cases, and models. Whilst constantly learning, a cognitive computer takes into account all known past occurrences of a certain event and refines its rationale with every new data set. Dozens of servers act as a neural network (millions of little nodes, much like neurons), computing the best possible outcome for any given task. This is, in fact, a very important aspect of cognitive computational machines: they do not offer the “right” answer, but the best one. A CC synthesizes all available information about a concept X and, on top of that, provides the user with insights and makes context computable. Pretty sweet, right? What makes cognitive computing beautiful is the dynamic nature of information. The status quo changes constantly, and so does data. The world is a complex, information-rich mechanism, and CC has opened up new ways of solving both highly sensitive problems and as-old-as-time dilemmas. Its inputs are complex, ambiguous, and uncertain, and it routinely deals with conflicting scenarios. Just like a human brain, a cognitive computer learns by experience or instruction, adapting to any situation and even influencing behaviour.
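To make the “learning from every new data set” idea concrete, here is a minimal sketch in Python. It is a toy of my own, not how any real cognitive system is built: a model keeps counts of past occurrences of each outcome in a given context and, when asked, returns the best-supported answer rather than a guaranteed “right” one, refining its estimate as new cases arrive.

```python
from collections import defaultdict

class CaseMemory:
    """Toy 'cognitive' learner: accumulates past occurrences and
    recommends the best-supported outcome for a given context."""

    def __init__(self):
        # counts[context][outcome] = how often this outcome followed this context
        self.counts = defaultdict(lambda: defaultdict(int))

    def learn(self, context, outcome):
        """Absorb one new observation (a past occurrence of an event)."""
        self.counts[context][outcome] += 1

    def best_outcome(self, context):
        """Return the outcome with the strongest evidence so far, plus a
        confidence score – the 'best' answer, not a guaranteed 'right' one."""
        outcomes = self.counts.get(context)
        if not outcomes:
            return None, 0.0
        total = sum(outcomes.values())
        outcome, count = max(outcomes.items(), key=lambda kv: kv[1])
        return outcome, count / total

# The recommendation sharpens with every new data set the model sees.
memory = CaseMemory()
for observation in [("fever+cough", "flu"), ("fever+cough", "flu"), ("fever+cough", "cold")]:
    memory.learn(*observation)

print(memory.best_outcome("fever+cough"))  # ('flu', 0.666...)
```

The only point of the sketch is the behaviour described above: the answer is the best-supported one given the data seen so far, and it changes as the evidence changes.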

2015 saw some major breakthroughs in terms of artificial intelligence. I recently wrote on this same topic that, last year, more and more machines came close to passing the Turing test, in a period I would characterise as ‘revolutionary’. The Turing test, first proposed in 1950 by the British scientist Alan Turing, is a test of a machine’s ability to exhibit intelligent behaviour equivalent to – or indistinguishable from – that of a human. Just think about it! There are hundreds (if not thousands) of applications that even I, an outsider to the technological world, can think of. From medicine to defence, and from transportation to building, cognitive computing introduces… the future.
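For illustration only, here is a tiny Python sketch of how the outcome of an imitation-game style evaluation might be scored. The function name and data are my own assumptions, and the 30% threshold follows Turing’s informal prediction (fooling the average interrogator roughly 30% of the time after a few minutes of conversation) rather than any formal rule: each judge converses with the hidden machine and records whether they believed it was human.

```python
def passes_turing_style_test(judge_verdicts, threshold=0.30):
    """judge_verdicts: list of booleans, True if that judge mistook the
    hidden machine for a human. Returns True if the machine fooled at
    least `threshold` of the judges (a convention, not a formal rule)."""
    if not judge_verdicts:
        return False
    fooled_fraction = sum(judge_verdicts) / len(judge_verdicts)
    return fooled_fraction >= threshold

# Hypothetical example: 10 judges, 4 of whom believed the machine was human.
verdicts = [True, False, True, False, False, True, False, False, True, False]
print(passes_turing_style_test(verdicts))  # True (40% fooled)
```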

Completely autonomous artificial intelligence is still far from the present, but recent years have seen incredible developments. And the future is made of sweet promises – Apple’s Siri, Microsoft’s Cortana, IPSoft’s Amelia (recently upgraded to its 2.0 version), Google’s deep learning programs, innovations in quantum and cognitive computing – all have been, and will continue, greatly influencing our day-to-day lives. “Give a man a fish, and you feed him for a day. Teach him how to fish, and you feed him for life.” This old adage has never been more relevant, because that is, in fact, how artificial intelligence grows. In that sense, we need to teach them – the AI machines – well, and in the best fashion: to act for the greater good.



IBM Revolutionizes Cognitive Computing

IBM, the IT giant, has pioneered the CC industry. Its WATSON and CELIA supercomputers have passed more technological milestones in the past few decades than computing managed in the couple of thousand years before them. The systems answer fairly complex questions asked in natural language, powered by a staggering architecture: 90 IBM Power 750 servers with 16 terabytes of random-access memory (RAM). WATSON can process 500 GB of information per second, the equivalent of a million books. There are dozens of applications already in use, but I will come back to them in the upcoming article.

However, I would love to leave you with one example that touched me. All of them are noteworthy, but one resonates with me in particular: medicine. IBM Watson Health is probably the program with the biggest potential, and it has just announced the supercomputer’s newest breakthrough: its ability to see. After partnering with top-tier companies in the medical care sector, IBM’s Watson can now offer superior medical advice based on records of thousands of different occurrences of a particular case. Think of cancer and how much easier it would be, maybe not to cure it, but to slow it down and prevent its advancement. Based on context, insights, and a lot of other unstructured data, WATSON is able to tailor medical advice for basically any type of patient, making it the handiest tool a doctor could have. All the different individual characteristics and personal medical records are blended into the best possible outcome for the patient. It is simply amazing!
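As a purely illustrative sketch (my own toy Python example, not IBM’s architecture or API), this is the general flavour of the case-based tailoring described above: score past case records for similarity to a new patient’s context and surface the treatments associated with the most similar cases, as decision support for a doctor rather than a replacement for one. The feature names and treatment labels are hypothetical.

```python
from collections import Counter

# Hypothetical past case records: observed features and the treatment chosen.
past_cases = [
    ({"age_60_plus", "smoker", "tumor_stage_2"}, "chemo_plan_a"),
    ({"age_60_plus", "tumor_stage_2"}, "chemo_plan_a"),
    ({"age_40_59", "tumor_stage_1"}, "surgery_first"),
    ({"age_40_59", "smoker", "tumor_stage_1"}, "surgery_first"),
    ({"age_60_plus", "tumor_stage_3"}, "palliative_care"),
]

def jaccard(a, b):
    """Simple set-overlap similarity between two feature sets."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

def suggest_treatments(patient_features, cases, top_k=3):
    """Rank past cases by similarity to the new patient and tally
    which treatments the most similar cases received."""
    ranked = sorted(cases, key=lambda c: jaccard(patient_features, c[0]), reverse=True)
    return Counter(treatment for _, treatment in ranked[:top_k]).most_common()

new_patient = {"age_60_plus", "smoker", "tumor_stage_2"}
print(suggest_treatments(new_patient, past_cases))
# [('chemo_plan_a', 2), ('palliative_care', 1)]
```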

Hopefully I have offered you some food for thought. Hopefully more and more people will get engaged in the field and contribute, even if just a little, towards bettering the world. Hopefully you will help solve some of humanity’s most fundamental problems! The future is here.

Until next time, do not stop dreaming and always seek out innovation!

Part 2 of the article will follow on the 31st of January and will deal with the other superconcept of today’s computing: QUANTUM COMPUTING. Stay tuned!

