
Artificial intelligence, or “AI”, can be a chilling concept for some people. Visions of fiery, red-eyed cyborgs marching across desolate wastelands, for some reason speaking English with an Austrian accent, offer a frightening possibility of the future. There are also those who are skeptical about the possibilities of AI, arguing that a giant computer able to defeat the world’s best chess player has little application in the real world. Unfortunately for both the fearful and the skeptics, AI is here to stay and is increasingly being adopted by organisations around the world to improve their products and services.

One of the companies at the forefront of the AI evolution is Nvidia. Some may be familiar with the company and its state-of-the-art graphics cards, which are highly sought after by PC-gaming enthusiasts and can cost as much as fifty thousand rand. Graphics cards, or graphics processing units (“GPUs”), are incredibly sophisticated circuit boards with microchips built into them. They were initially created to handle the heavy workload of the increasingly complex calculations required to render images on a computer screen. By now you may be wondering what GPUs have to do with AI. It just so happens that, apart from rendering life-like images on a computer screen, GPUs are also incredibly adept at performing the calculations associated with AI, and specifically with a subset of AI known as machine learning.
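For the technically curious, the short sketch below hints at why this is. It uses PyTorch, an open-source machine-learning library that runs on Nvidia GPUs, to multiply two large grids of numbers, the kind of arithmetic that sits at the heart of machine learning. The sizes and variable names are arbitrary and purely illustrative; the point is simply that the same calculation can be spread across the thousands of cores on a GPU rather than the handful on an ordinary processor.

```python
# Illustrative sketch only. The matrices below stand in for the weights
# and inputs of a machine-learning model; real models are far larger.
import torch

a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

# On a CPU this multiplication runs on a handful of cores...
cpu_result = a @ b

# ...while an Nvidia GPU performs the same arithmetic across thousands
# of cores at once, which is why training models there is so much faster.
if torch.cuda.is_available():
    gpu_result = (a.cuda() @ b.cuda()).cpu()
```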

Gaming still makes up the biggest use case for Nvidia’s GPUs, but it won’t be long before it is surpassed by those using GPUs to train their “machines”. The world’s biggest technology companies all make use of Nvidia’s GPUs, whether it is Netflix training its system to constantly improve recommendations for viewers, Amazon training Alexa to better understand our voices, or Google improving its image-recognition frameworks to better identify pictures of cats.

Big Tech companies are not the only ones making use of Nvidia’s GPUs. As we move closer to a world of 5G and high-speed data transfer, computing will become decentralised and move towards the “edge”. Another way of explaining this concept is that computing will start to take place closer to the source of the data rather than being centralised in a cloud data centre. Imagine a factory floor operating thousands of autonomous machines, be it robot arms, trolleys, forklifts or even cameras. Sensors embedded in these devices would previously have sent data back to a cloud server for processing. Now, with 5G, latency will be minimised and data can be processed in real time on the factory floor using Nvidia’s “edge servers”, instantly alerting workers to anomalies or malfunctions.

Speaking of factories, imagine you needed to deploy a machine, but it first had to be trained on the layout of the factory floor: what to do, where to move and, importantly, how to adjust for obstacles. Ordinarily the machine would need to perform this training once deployed on the actual floor, but not anymore. Nvidia has created what it calls an “omniverse”, a simulated world that obeys the laws of physics, be it fluids, waves, cables or springs. A machine can be trained in this virtual world before it is even deployed, so that when it is dropped onto the factory floor it has already learned the skills required – a world where robots can learn to be robots. Am I the only one reminded of the dojo scene in The Matrix, when Neo became a kung-fu expert in one download before facing off against Morpheus?
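For readers curious what “processing at the edge” might look like in practice, here is a deliberately simplified sketch. Everything in it is invented for illustration (the vibration readings, the alert threshold, the robot arm being monitored); it is not Nvidia’s software, merely the general idea of a machine on the factory floor flagging a fault locally instead of waiting on a round trip to a distant cloud data centre.

```python
# Illustrative sketch only: flag a sensor reading that sits far outside
# the machine's normal range, right where the data is produced.
from statistics import mean, stdev

def looks_faulty(readings, threshold=3.0):
    """Return True if the latest reading is far outside the normal range."""
    baseline, spread = mean(readings[:-1]), stdev(readings[:-1])
    return abs(readings[-1] - baseline) > threshold * spread

vibration = [0.91, 0.94, 0.89, 0.93, 0.92, 2.70]  # the last value looks wrong
if looks_faulty(vibration):
    print("Alert: possible malfunction on robot arm 7")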

Sticking with factories, imagine you wanted to design one but were uncertain how efficient it would be once operational. BMW faced exactly this challenge, but with the help of Nvidia and its AI-enabled GPUs it was able to create a “digital twin” and simulate the factory operating in real time. Designers could “worm-hole” into the digital factory and get a real feel for how things moved and operated. Perhaps a shelf was too high, a bench not wide enough, or a pathway obstructed. All of these issues could be identified in real time, before the real thing was even built. Think of the cost savings and efficiency gains to be had.

These are but a few of the applications that make use of Nvidia’s GPUs. In the next article we will explore what has been accomplished in the medical field using Nvidia’s technologies, as well as how Nvidia is bringing us closer to a world of driverless vehicles.

About the Author

Jonathan Wernick
Equity Analyst, Sasfin Wealth
