Words: Al Woods
Image credit: Pixabay
We’re increasingly told that technology is taking over our lives: we’re becoming over-reliant on our devices, they say, allowing our apps and our smart assistants to dictate and shape our existences. That’s typically seen as a bad thing (our social interactions and our attention spans are said to be suffering, for example) but there’s no denying that technological innovations can have a profound and positive impact on our lives.
But the everyday technologies we now take for granted are just the tip of an ever-expanding iceberg: they’re continually evolving their capabilities and reshaping possibilities, with newer, more advanced innovations always on the horizon. That said, not all new technologies will become the next life-altering must-haves; some (remember Google Glass?) will be consigned to the technological graveyard, while others will define the decades to come.
So, which emergent technology trends will really shape our futures?
AI and Machine Learning
The concept of artificial intelligence has existed for decades, but its influence is increasingly being felt. Our smartphones now come with AI-powered smart assistants (such as Apple’s Siri) as standard, while streaming platforms like Netflix use AI algorithms to predict what we want to watch and when. Machine learning (a subset of AI) takes this one step further, enabling a machine to develop its intelligence without explicit programming from a human.
Microsoft (through its cloud computing platform Azure) is already making use of machine learning to build, train, and deploy AI models and streamline its processes, but in the future machine learning will increasingly be leveraged by medical professionals (to predict, detect and prevent disease quicker than ever before) and environmental scientists (to help uncover tangible solutions to the ongoing climate crisis).
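To make the idea of "developing intelligence without explicit programming" concrete, here is a minimal, illustrative sketch (not Azure's actual machinery): a model that learns the rule y = 2x + 1 purely from example data, via gradient descent. No one tells the program the rule; it infers the slope and intercept from the examples.

```python
# Minimal sketch of machine learning: fit y = w*x + b by gradient descent.
# The program is never told the rule; it learns w and b from examples alone.

def train(points, lr=0.01, epochs=2000):
    """Learn slope w and intercept b from (x, y) example pairs."""
    w, b = 0.0, 0.0
    n = len(points)
    for _ in range(epochs):
        # Gradients of mean squared error with respect to w and b.
        grad_w = sum(2 * (w * x + b - y) * x for x, y in points) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in points) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Training examples generated by the hidden rule y = 2x + 1.
data = [(x, 2 * x + 1) for x in range(10)]
w, b = train(data)
print(round(w, 2), round(b, 2))  # close to 2.0 and 1.0
```

The same principle, scaled up to millions of parameters and examples, is what underlies the disease-prediction and climate-modeling applications mentioned above.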
Biometrics
Data security is an increasing concern, particularly in an age of interconnected networks and devices, and biometrics is one way of addressing this. Biometrics is what allows a machine to recognize a user through biological “markers” such as their facial features, their voice or their fingerprint. Since these features are biologically unique to each of us, it’s much more difficult for an unauthorized user to gain access to our devices or applications.
In the near future, biometrics might even lead to the extinction of passwords (a cue for celebration, surely) since it theoretically provides swifter (and markedly more secure) access to devices and systems; this will be particularly welcome news to those of us (at least two thirds of us, in fact) who use the same password across multiple accounts, or the 20 million+ who reportedly use the password 123456.
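For contrast, here is a sketch of the mechanism biometrics would replace: how password checks typically work today. Systems store a salted hash of the password (never the password itself) and compare hashes at login. This is a simplified illustration using Python's standard library, not any particular vendor's implementation.

```python
import hashlib
import hmac
import os

# Sketch of conventional password verification: store a salted hash,
# never the plaintext, and compare in constant time at login.

def hash_password(password, salt=None):
    """Return (salt, digest) for storage; a fresh random salt per user."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(candidate, digest)

# The infamously common password from above, hashed and checked.
salt, digest = hash_password("123456")
print(verify("123456", salt, digest))   # True
print(verify("hunter2", salt, digest))  # False
```

Even done properly, this scheme is only as strong as the password the user chooses, which is precisely the weakness biometric authentication sidesteps.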
Cloud Computing
While (for now) no computing model can truly be called serverless, “cloud” computing allows resources to be accessed via a network of interconnected, geographically diverse servers, enabling more effective resource management, enhancing security and ultimately reducing costs. More than 9 in 10 businesses already run at least some of their resources through the cloud, but in the coming decade that number will creep closer to 100%.
Internet giants like Amazon (AWS) and Google (GCP) offer their own cloud-based infrastructures (AWS currently holds a 33% share of the cloud infrastructure market), while websites and applications are typically hosted through a managed service provider like Cloudways, allowing businesses to scale their resources quickly and flexibly. By 2025, the cloud will hold almost half of the world’s data (that’s 100 billion terabytes).
The Internet of Things (IoT)
Nowadays, it’s not just computers (but an increasing number of “things”) that can connect to the internet — and to each other. The “things” we’re referring to here are physical objects (such as “smart” home appliances) equipped with sensors, software and processing abilities that enable them to connect and share data with other devices across the internet in real time.
Often overlapping with AI, cloud computing and natural language processing, the IoT is what enables you to control your heating from your smartphone or tell Alexa to switch on the lights. In the not-so-distant future, driverless cars (which could be seen on our roads by the mid-2020s) will rely on the IoT to gather and share information about user behavior, road conditions and traffic, allowing the vehicle to safely navigate the roads all by itself.
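The data-sharing described above usually follows a publish/subscribe pattern: devices publish readings to named topics, and any interested device or app receives them. This is a toy in-memory sketch of that pattern (real IoT deployments use a networked broker protocol such as MQTT; the topic names here are hypothetical).

```python
from collections import defaultdict

# Toy in-memory publish/subscribe broker, illustrating how IoT devices
# share data: publishers and subscribers never talk to each other directly.

class Broker:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, payload):
        for callback in self.subscribers[topic]:
            callback(payload)

broker = Broker()
received = []

# A "smart thermostat" app subscribes to readings from a living-room sensor.
broker.subscribe("home/livingroom/temperature", received.append)

# The sensor "thing" publishes a reading; every subscriber gets it.
broker.publish("home/livingroom/temperature", {"celsius": 21.5})
print(received)  # [{'celsius': 21.5}]
```

Decoupling senders from receivers this way is what lets thousands of sensors, phones and vehicles exchange data without each needing to know about the others.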
Virtual and Augmented Reality
You’ve likely heard of the “metaverse”, but you may have dismissed it as simply a Gen Z-era buzzword or the preserve of science fiction screenwriters. And while drawing either of those conclusions would be understandable, the metaverse is increasingly becoming a (version of) reality, with VR and AR technologies enabling us to access a network of 3D worlds (often seen as a “physical” manifestation of the internet) and interact with others therein.
Early examples of the metaverse included wildly popular games like Fortnite and Second Life, where players connect with others in virtual worlds using avatars, but the next step comes in the form of platforms like Microsoft Mesh, which will enable businesses to collaborate using “mixed reality applications” that render as 3D “worlds”, using personalized avatars and immersive, interactive spaces to replicate the feeling of being in the same room.
Natural Language Processing (NLP)
While we’ve already covered artificial intelligence, natural language processing (which is a subfield of AI) is worthy of its own mention. In simple terms, NLP is what enables a machine to understand human language (in both spoken and written form). Its most basic application is seen in spell-checkers and autocomplete functions, but NLP has the power to facilitate seamless interactions between humans and computers.
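The spell-checker mentioned above is a nice concrete case. One classic (if simplified) approach is to suggest the dictionary word with the smallest edit distance (the number of insertions, deletions and substitutions) from what the user typed. The tiny dictionary below is purely illustrative.

```python
# Toy spell-checker: suggest the dictionary word closest to the input,
# measured by Levenshtein edit distance (computed by dynamic programming).

def edit_distance(a, b):
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,          # delete from a
                            curr[j - 1] + 1,      # insert into a
                            prev[j - 1] + (ca != cb)))  # substitute
        prev = curr
    return prev[-1]

def correct(word, dictionary):
    return min(dictionary, key=lambda w: edit_distance(word, w))

words = ["language", "machine", "processing", "natural"]
print(correct("langage", words))  # language
```

Modern NLP systems replace this hand-built distance measure with statistical language models, but the goal is the same: mapping imperfect human input onto intended meaning.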
AI-powered assistants like ChatBot use NLP to interpret, understand and respond to customer queries on many mainstream websites, mimicking a human-to-human interaction and enabling 24/7 support without the need for a human customer service agent. As we continue to enhance the language processing power of computers, it will become increasingly difficult to tell whether you’re interacting with a human or an AI-powered bot.
Hyperautomation
In business, there’s a continual drive to streamline processes, make more efficient use of resources and do away with time- and effort-intensive tasks; and that’s where hyperautomation comes in. Hyperautomation (utilizing aspects of AI and machine learning) enables businesses to identify, vet and automate otherwise time-consuming workflows, allowing them to steer their focus toward the tasks which really need their attention.
In a recent example of hyperautomation in practice, the UK’s Heathrow Airport introduced a “low-code/no-code” policy after pandemic-related downsizing of its IT department, allowing non-technical staff to develop applications with little to no coding expertise. The research firm Gartner predicts that by 2025, up to 70% of the applications developed by companies will use a low-code/no-code approach.
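The identify-vet-automate loop described above can be sketched in a few lines. The task names and vetting thresholds below are hypothetical, chosen only to show the shape of the decision: automate what is frequent and mechanical, leave what requires human judgment.

```python
# Hedged sketch of the hyperautomation loop: identify candidate tasks,
# vet them against simple criteria, and flag only those worth automating.
# Task data and thresholds are illustrative, not from any real system.

tasks = [
    {"name": "invoice data entry",   "runs_per_week": 200, "needs_judgment": False},
    {"name": "contract negotiation", "runs_per_week": 3,   "needs_judgment": True},
    {"name": "report generation",    "runs_per_week": 50,  "needs_judgment": False},
]

def vet(task, min_runs=25):
    """A task qualifies if it is frequent and purely mechanical."""
    return task["runs_per_week"] >= min_runs and not task["needs_judgment"]

to_automate = [t["name"] for t in tasks if vet(t)]
print(to_automate)  # ['invoice data entry', 'report generation']
```

In real deployments, the "identify" and "vet" steps themselves are increasingly driven by process-mining and machine-learning tools rather than hand-written rules like these.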