Most of us are getting used to the new normal: Artificial Intelligence (AI) amongst us. A few decades ago, it was the dot-com bubble, followed by social media frenzies across various platforms; now we have AI systems poised to be part of everything we do in the so-called “Smart Cities”.
AI, as opposed to the natural intelligence exhibited by animals, including humans, is intelligence demonstrated by machines. It involves teaching machines to learn from big data, with the goal of performing tasks that humans can do. However, whether AI can completely replace our jobs remains an open question; some AI experts believe humans will still need to provide some level of control and guidance to AI systems.
It is noteworthy that AI research dates back to the 1950s. However, it nearly came to a halt in the late 1980s, partly because computers lacked the processing power needed to handle big data quickly. This period, widely regarded as an AI winter, saw reduced funding and interest in AI research from governments and corporations. Then came cloud computing in the early 2000s, with its promise of greater computing power for training machines. This helped reinvigorate interest in AI development, especially in its subfield known as machine learning. Meanwhile, improving the accuracy of AI systems so that they deliver greater efficiency than humans continues to generate interest across research communities.
The future of AI is promising despite the setbacks its enthusiasts have encountered. We have seen self-driving cars; Sophia, the world’s first humanoid robot able to converse with humans; humanoid waiters that move between tables serving meals in restaurants; and robots that assist with goods handling, among many other examples. However, AI may well lead to job losses in sectors that depend on active human labour even as it creates more IT job opportunities. As the proverbial saying goes, “to be forewarned is to be forearmed”; acquiring IT skills and experience sufficient to survive the next wave of AI evolution would be worthwhile for all and sundry, including but not limited to the next generation.