IBM Is Using Light, Instead Of Electricity, To Create Ultra-fast Computing
To quench algorithms' seemingly limitless thirst for processing power, IBM researchers have unveiled a new approach that could mean big changes for deep-learning applications: processors that perform computations entirely with light, rather than electricity.
The researchers have created a photonic tensor core that exploits the properties of light to process data at unprecedented speeds, delivering AI applications with ultra-low latency.
Although the device has only been tested at a small scale, the report suggests that as the processor develops, it could achieve one thousand trillion multiply-accumulate (MAC) operations per second per square millimeter – more than twice as many, according to the scientists, as "state-of-the-art AI processors" that rely on electrical signals.
IBM has been working on novel approaches to processing units for a number of years now. Part of the company's research has focused on developing in-memory computing technologies, in which memory and processing co-exist in some form. This avoids transferring data between the processor and a separate RAM unit, saving energy and reducing latency.
Last year, the company's researchers revealed that they had successfully developed an all-optical approach to in-memory processing: they integrated in-memory computing on a photonic chip that used light to carry out computational tasks. As part of the experiment, the team demonstrated that a basic scalar multiplication could effectively be carried out using the technology.
In a new blog post, IBM Research staff member Abu Sebastian shared a new milestone achieved with light-based in-memory processors. Taking the technology to the next stage, the team has built a photonic tensor core, a type of processing core that performs the sophisticated matrix math particularly suited to deep-learning applications. The light-based tensor core was used to carry out an operation called convolution, which is useful for processing visual data such as images.
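The post does not detail IBM's implementation, but the reason a tensor core suits convolution is easy to see in software: every output pixel of a convolution is one dot product, in other words a run of multiply-accumulate (MAC) operations. The sketch below is purely illustrative, written in plain NumPy and carrying no IBM-specific detail.

```python
# Illustrative only (not IBM's implementation): a 2D convolution is a grid of
# multiply-accumulate (MAC) operations, which is the workload a tensor core,
# photonic or electronic, is built to churn through.
import numpy as np

def conv2d(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Valid (no-padding) 2D convolution expressed as explicit MACs."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Each output pixel is one dot product: kh*kw multiplies, then a sum.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Example: a 3x3 edge-detection kernel applied to a random 8x8 "image".
rng = np.random.default_rng(0)
image = rng.random((8, 8))
kernel = np.array([[-1, -1, -1],
                   [-1,  8, -1],
                   [-1, -1, -1]], dtype=float)
print(conv2d(image, kernel).shape)  # (6, 6)
```

A deep neural network repeats this pattern millions of times per image, which is why the throughput of the underlying MAC hardware dominates inference latency.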
"Our experiments in 2019 were mostly about showing the potential of the technology. A scalar multiplication is very far from any real-life application," Abu Sebastian, research staff member at IBM Research, tells ZDNet. "But now, we have an entire convolution processor, which you could maybe use as part of a deep neural network. That convolution is a killer application for optical processing. In that sense, it's quite a big step."
The most significant advantage that light-based circuits have over their electronic counterparts is sheer speed. Leveraging optical physics, the technology developed by IBM can run complex operations in parallel in a single core, using a different optical wavelength for each calculation. By combining this with in-memory computing, IBM's scientists achieved ultra-low latency that electrical circuits have yet to match. For applications that require very low latency, the speed of photonic processing could therefore make a big difference.
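The parallelism described here comes from wavelength-division multiplexing: each calculation rides on its own colour of light, so many MAC operations complete in a single pass through the core. The snippet below is only a conceptual analogy in NumPy, with an array axis standing in for wavelength channels; it does not model the actual photonic hardware, only the contrast between evaluating dot products one at a time and evaluating them all at once.

```python
# Conceptual analogy only (assumed, not IBM's design): with wavelength-division
# multiplexing, each calculation travels on its own wavelength, so many dot
# products can be evaluated in the same pass. Here the first array axis stands
# in for wavelengths, and one matrix product replaces a sequential loop.
import numpy as np

rng = np.random.default_rng(1)
n_wavelengths = 8      # number of parallel optical channels (illustrative)
vector_len = 16

weights = rng.random((n_wavelengths, vector_len))  # one weight row per channel
inputs = rng.random(vector_len)                    # shared input signal

# Sequential view: one dot product per wavelength, one after another.
sequential = np.array([w @ inputs for w in weights])

# "Parallel" view: all channels evaluated in a single operation, as the
# optical core would do in one pass of light.
parallel = weights @ inputs

assert np.allclose(sequential, parallel)
print(parallel)
```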
Sebastian puts forward the example of self-driving cars, where speed of detection could have life-saving implications. "If you're driving on the highway at 100 miles-per-hour, and you need to detect something within a certain distance – there are some cases where the existing technology doesn't allow you to do that. But the kind of speed that you get with photonic-based systems is several orders of magnitude better than electrical approaches."
With its ability to perform several operations simultaneously, the light-based processor developed by IBM also delivers much higher compute density. According to Sebastian, this could be another key differentiator: there will come a point, says the scientist, where loading car trunks with rows of conventional GPUs to support ever-more sophisticated AI systems won't cut it anymore.
With most large car companies now opening their own AI research centers, Sebastian sees autonomous vehicles as a key application for light-based processors. "There is a real need for low latency inference in the domain of autonomous driving, and no technology that can meet it as of now. That is a unique opportunity."
IBM's team, although it has successfully designed and tested a powerful core, still needs to extend its trials to show that the technology can be integrated at a system level and deliver end-to-end performance. "We need to do much more there," says Sebastian; but according to the scientist, work is already underway, and as research continues, more applications are likely to emerge. Trading electricity for light, in the field of computing, certainly makes this a space to watch.