LONDON: Chip giant Intel was rocked last week when the discovery of two CPU security bugs, one of which predominantly affects Intel chips and could slow performance, became public. During his keynote speech at the Consumer Electronics Show (CES), Intel CEO Brian Krzanich promised quick updates to fix the problem and reiterated that the average computer user would not be drastically affected.
Krzanich also looked to the future in his keynote address. Two chips, a neuromorphic processor that mimics the human brain and a 49-qubit processor for quantum computing, represent Intel’s effort to get out ahead of the next computing revolution. Intel has dominated the era of personal computers and conventional processors. It’s aiming to do the same with whatever comes next.
Taking AI to the next level
Intel missed the boat when it comes to accelerating artificial-intelligence workloads. Standard CPUs, even ultra-fast many-core Xeon server chips, are no match for architectures better suited for the task. Graphics processors, like those from market leader NVIDIA, provide a massive performance boost, both for training AI systems and using those systems in real-world applications.
Intel has tried to play catch-up, launching various iterations of its Xeon Phi chips, which feature a large number of standard x86 cores. The company is also building its own discrete graphics business, no doubt aiming to get its eventual GPUs into data centers to run AI workloads. These efforts haven’t yet made a dent in NVIDIA’s dominance.
Loihi, Intel’s neuromorphic chip unveiled at CES, could be a game-changer if the company can successfully commercialize the technology. While AI accelerators like NVIDIA’s GPUs run neural network software that roughly simulates how a human brain works, neuromorphic chips like Loihi are designed at the hardware level to mimic the human brain. Intel claims that Loihi uses about 0.1% as much power as a conventional processor.
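The building block of chips like Loihi is the spiking neuron, which accumulates input, slowly leaks charge, and fires only when a threshold is crossed; that event-driven behavior is where the power savings come from. The sketch below is a generic leaky integrate-and-fire model for illustration only; the parameters are arbitrary and do not reflect Loihi's actual design.

```python
# A minimal leaky integrate-and-fire neuron, the kind of spiking model that
# neuromorphic hardware implements in silicon. Illustrative sketch only;
# the leak and threshold values are arbitrary, not Loihi's real parameters.
def lif_simulate(inputs, leak=0.9, threshold=1.0):
    """Return a list of 0/1 spikes for a sequence of input currents."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # decay, then integrate input
        if potential >= threshold:              # fire and reset at threshold
            spikes.append(1)
            potential = 0.0
        else:
            spikes.append(0)
    return spikes

print(lif_simulate([0.3, 0.3, 0.3, 0.3, 0.0, 1.2]))  # → [0, 0, 0, 1, 0, 1]
```

Because the neuron only "computes" when a spike occurs, most of the chip sits idle most of the time, unlike a GPU that clocks every multiply on every cycle.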
Intel isn’t alone in exploring neuromorphic chips. International Business Machines unveiled its TrueNorth brain-like chip back in 2014; the chip uses just 70 milliwatts of power to simulate 1 million neurons.
Meaningful commercialization of this technology may still be years away. But the future of AI processing seems likely to be dominated by specialized chips designed specifically for the task. Neuromorphic chips, or another type of chip designed specifically for AI, could potentially displace GPUs down the road.
Looking even further into the future, Intel unveiled a quantum computing chip with 49 qubits, the qubit being the quantum equivalent of a bit in traditional computing. Quantum computing takes advantage of the bizarre and unintuitive properties of quantum mechanical systems. While a bit is always in exactly one of two states at any given time, a qubit can exist in a combination of both states at once. This fuzziness creates the potential for certain workloads to be dramatically sped up, but it also introduces errors that make current quantum computers mostly useless for real-world applications.
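The "combination of two states" can be pictured as a pair of complex amplitudes: measuring the qubit yields 0 or 1 with probabilities given by the squared magnitudes of those amplitudes (the Born rule). The toy sketch below is a hypothetical illustration of that idea, not a model of Intel's hardware.

```python
import math

# Toy single-qubit state a|0> + b|1>, stored as two complex amplitudes.
# Hypothetical illustration only; real quantum hardware is far more involved.
def measure_probabilities(alpha, beta):
    """Born rule: chance of reading 0 or 1 when the qubit is measured."""
    norm = abs(alpha) ** 2 + abs(beta) ** 2   # normalize defensively
    return abs(alpha) ** 2 / norm, abs(beta) ** 2 / norm

# An equal superposition: the qubit holds both states until measured,
# when it collapses to 0 or 1 with probability 1/2 each.
amp = 1 / math.sqrt(2)
p0, p1 = measure_probabilities(amp, amp)
print(p0, p1)  # → roughly 0.5 0.5
```

The payoff scales exponentially: while 49 classical bits hold one 49-bit value, a 49-qubit register's state is described by 2^49 amplitudes at once, which is the source of the potential speedups mentioned above.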
Intel’s quantum chip, code-named Tangle Lake, puts the company in the race to build a commercial quantum computer. IBM revealed its own prototype 50-qubit chip late last year, and Alphabet’s Google is working on one as well. IBM has taken the first step toward commercialization, partnering with a variety of companies to explore real-world applications for its quantum systems. But it may still be many years before quantum computers are successfully commercialized.
The quantum computing revolution is coming, and Intel’s Tangle Lake puts the company in position to be a major player in this new market.