In the early 1990s, the late Mark Weiser of Xerox PARC presented the bold vision of “Ubiquitous Computing” in an article titled “The Computer for the 21st Century”. He wrote: “The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.” At the time, personal computers had only just started to become popular. The Internet had also begun to become accessible, but it was still slow and cumbersome. Back then, you and I might have thought that the future had just become slightly more interesting with technology…
Fast forward to today, and computing technology has undoubtedly become ubiquitous. Many of us own more than one computing device – a notebook, a smartphone, perhaps also a tablet and a smartwatch. Moreover, we have become increasingly dependent on our mobile computing devices whenever and wherever we are – at work, at home, on holiday, during the commute.
The above is just a glimpse of what Weiser envisioned in his article. He anticipated an environment where users are not required to consciously and explicitly interact with a computer. The interaction between humans and computers would become intuitive and, eventually, natural. Ultimately, computers would become invisible to their users.
Such scenarios may still sound like sci-fi, but we should be familiar with them. Over the years, many sci-fi TV series and movies have depicted how future computers might look. We have seen talking computers such as K.I.T.T. in Knight Rider, HAL 9000 in 2001: A Space Odyssey and Jarvis in Marvel’s movies.
My personal favourite is still the world Steven Spielberg presented in the movie Minority Report – as John Anderton (played by Tom Cruise) walks through the mall, the surrounding displays talk to him, recognising him and making personalised offers. Back in the early 2000s, we might not have known how to develop such technology. Yet within the span of a decade, these were no longer just scenes from a movie.
Today, personalised advertisements can be found everywhere online. Big tech companies offer search, social media and videos for your daily consumption. At the same time, they collect massive amounts of data from their users for different purposes. Indeed, without all that user data, search results would not be as precise, and poorly targeted advertisements would drive you away from your online strolls during lunch breaks.
We also notice that search results seem to know us pretty well. After a search for a hotel at a particular location, advertisements for similar and nearby hotels appear in your browser for a while. After booking your hotel, Google Maps shows your exact travel dates at that hotel. Though these functions may not match the ones shown in the movies, they are real examples that work today.
The enabling technologies for ubiquitous computing include recent developments in cloud computing, artificial intelligence (AI), machine learning and sensor technologies. Another factor that has accelerated this rapid development is the falling cost of computation. What was expensive and resource-demanding a decade ago costs only a fraction of that today. Our smartphones are more powerful than a personal computer from the 90s. Hence, there is no longer any hindrance to processing massive amounts of data and producing useful information within a short period. As long as one can gather a large amount of relevant data and make good sense of it, one can produce value from that data.
In recent years, industries and businesses have begun paying attention to the latest developments such as big data analytics, the Internet of Things (IoT), autonomous vehicles and Industry 4.0. Beyond being buzzwords and hype, these are timely technologies that will revolutionise how we work and live.
People often ask – will AI and robots take over the world? Without a crystal ball, it is difficult to predict the future. Nevertheless, new and useful technology will continue to grow and stay. Only through technology and innovation can we achieve higher productivity, fewer human errors in the workplace, fewer mundane and repetitive tasks and, hopefully, a better quality of life for everyone. I genuinely believe this – we create technology not to replace humans but to help people work more efficiently. In return, we gain more time to spend on the things that matter more in life – our loved ones, our health, and our well-being. Businesses in Malaysia and this part of the world, especially SMEs, need to begin embracing these developments. In particular, technology related to AI, IoT and big data analytics can bring automation, real-time updates, and insights.
More importantly, business owners will be able to make better decisions when more relevant data can be captured, processed and interpreted. They will need computer scientists to develop AI algorithms that suit their business needs, software engineers to design optimised software that improves processes and productivity, data and business analysts to understand trends and insights for timely, crucial business decisions, as well as IT engineers to ensure all computing infrastructure runs smoothly 24/7.
This is also why we want our computing students at Sunway University – whether in Computer Science, Information Systems, Networking and Security, Information Technology, Mobile Computing, Business Analytics or Software Engineering – to be trained as dedicated professionals with strong ethics and a sense of responsibility. They have three years to complete an internationally recognised degree and to develop solid skills that empower them to contribute to society. Only then can we have full confidence and faith in them to fulfil the needs of businesses in Malaysia and beyond.
The technology we witness and experience today is just the beginning – what lies ahead will only be more exciting and promising. The future is one worth looking forward to. Let’s embrace it with open arms and enjoy the benefits it may bring us.