IBM’s complicated chip story: It’s designing brainiac chips and trying to shed its chip-making biz

You could be forgiven for thinking that IBM is of two minds when it comes to chips.

On the one hand, it’s obvious that Big Blue is trying to offload its chip-making business. It may even have offered to pay GlobalFoundries $1 billion to take that operation off its hands, according to Bloomberg, which cited an unidentified source. That seems crazy until you realize the unit is a money-loser: by some estimates, the chip business costs IBM $1.5 billion a year, and CEO Ginni Rometty is trying to shed unprofitable businesses. The company’s sale of its x86 server business to Lenovo is still pending. (I’ve reached out to IBM and GlobalFoundries for comment, but expect there will be none.)

On the other hand, check out all the headlines today about the brainiac chip IBM researchers have designed. As the researchers summarize in the journal Science, which published a paper on the breakthrough, they “applied our present knowledge of the structure and function of the brain to design a new computer chip that uses the same wiring rules and architecture. The flexible, scalable chip operated efficiently in real time, while using very little power.”

IBM promises a chip that thinks like a brain

By mimicking the way that neurons, synapses and other parts of the brain work together to solve problems, IBM said, this “SyNAPSE” silicon can recognize patterns and classify objects in a very power-efficient way. The chip, which IBM claims is the world’s first “neurosynaptic” computer chip, packs a million programmable neurons and 256 million programmable synapses, and delivers 46 billion synaptic operations per second per watt, according to a press release, which added that a postage-stamp-sized computer running this chip could operate on the equivalent of a hearing-aid battery.
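For a rough sense of what those figures imply, the ratios work out to about 256 synapses per neuron and roughly 22 picojoules per synaptic operation. Here’s the back-of-the-envelope arithmetic (a sketch using only the press-release numbers; nothing below is an additional IBM spec):

```python
# Back-of-the-envelope figures derived from the press-release numbers.
# Illustrative arithmetic only; the three inputs are the only official specs here.

neurons = 1_000_000        # programmable neurons per chip
synapses = 256_000_000     # programmable synapses per chip
sops_per_watt = 46e9       # synaptic operations per second per watt

fan_in = synapses / neurons         # average synapses per neuron
joules_per_sop = 1 / sops_per_watt  # energy per synaptic operation

print(f"Synapses per neuron: {fan_in:.0f}")                       # 256
print(f"Energy per synaptic op: {joules_per_sop * 1e12:.0f} pJ")  # ~22 pJ
```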

Putting an array of these chips together in a neural network could solve all sorts of complex problems.

As the New York Times’ John Markoff wrote:

The chip’s electronic “neurons” are able to signal others when a type of data — light, for example — passes a certain threshold. Working in parallel, the neurons begin to organize the data into patterns suggesting the light is growing brighter, or changing color or shape.

The processor may thus be able to recognize that a woman in a video is picking up a purse, or control a robot that is reaching into a pocket and pulling out a quarter. Humans are able to recognize these acts without conscious thought, yet today’s computers and robots struggle to interpret them.
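The behavior Markoff describes, a neuron that signals its downstream peers only when accumulated input crosses a threshold, is essentially the classic “integrate-and-fire” model from computational neuroscience. Here’s a minimal Python sketch of that idea (a generic textbook model for illustration, not IBM’s actual SyNAPSE circuitry):

```python
# A toy "integrate-and-fire" neuron: it accumulates weighted inputs and
# fires (signals downstream neurons) only when the running total crosses
# a threshold. Generic textbook model, not IBM's actual circuit design.

class ThresholdNeuron:
    def __init__(self, threshold=1.0, leak=0.1):
        self.threshold = threshold
        self.leak = leak        # charge that drains away each time step
        self.potential = 0.0

    def step(self, weighted_inputs):
        """Integrate one time step of inputs; return True if the neuron fires."""
        self.potential += sum(weighted_inputs)
        self.potential = max(0.0, self.potential - self.leak)
        if self.potential >= self.threshold:
            self.potential = 0.0   # reset after a spike
            return True
        return False

neuron = ThresholdNeuron()
for level in [0.2, 0.3, 0.4, 0.6]:   # e.g., a light growing brighter
    if neuron.step([level]):
        print("spike: threshold crossed, pattern detected")
```

On the chip itself, of course, a million such neurons operate in parallel in hardware rather than one at a time in software.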

IBM chip plan: Stick with R&D, dump manufacturing?

So that’s some smart silicon, but how does it square with talk about IBM getting out of the chip-making business? It may not be that much of a contradiction. IBM has said it will continue to invest in chip R&D — last month it announced a $3 billion investment to create chip technologies for next-gen computing, big data and cognitive systems.

There’s a distinction between designing the chips of the future and manufacturing those designs at scale once the design work is done. Why not offload that capital-intensive fab work to companies that specialize in it?

The bigger worry about IBM is that the R&D the company is so famous for has been de-emphasized and cut back. I suspect that is the case, although breakthroughs like this one allay some of those concerns.

Image copyright Shutterstock / agsandrew.
