Long ago, the Perceptron showed great promise as a way to have machines learn. It had real limitations, but unfortunately those were prematurely extrapolated into the conclusion that the whole approach was a dead end, and the technology languished for years before being reborn as neural networks, and eventually deep neural networks, or "deep nets".
The cool thing about perceptrons was that they were analog: each unit cell could adjust its own parameters using a simple, local, and provably effective update rule.
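To make that concrete, here is a minimal sketch (my own illustration, not IBM's implementation) of the classic perceptron learning rule. Each weight is nudged by its input times the prediction error, a purely local update that analog hardware can realize directly, with no central processor in the loop:

```python
def train_perceptron(samples, labels, lr=0.1, epochs=20):
    """Train a single perceptron with the classic error-driven rule."""
    n = len(samples[0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            # Threshold activation: fire if weighted sum exceeds zero.
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred  # -1, 0, or +1: each cell only needs this signal
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Example: learn logical AND, which is linearly separable.
w, b = train_perceptron([(0, 0), (0, 1), (1, 0), (1, 1)], [0, 0, 0, 1])
```

Note that each weight update depends only on quantities available at that cell (its input, and the shared error signal), which is exactly what makes the rule hardware-friendly.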
IBM has just figured out how to combine Non-Volatile Memory (NVM) and CMOS technology to build analog perceptrons on a chip. That makes analog deep nets possible, and it saves enormous amounts of compute time by keeping the heavy arithmetic in the analog domain instead of shuttling digital values through a processor.
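The reason the analog domain is such a win is physics: store each weight as a conductance, apply each input as a voltage, and every column wire sums its currents by Kirchhoff's law, computing a whole matrix-vector product in one step. A hypothetical digital simulation of that crossbar behavior (my own sketch, not IBM's design) looks like this:

```python
def crossbar_mac(conductances, voltages):
    """Simulate one analog crossbar read: I_col = sum_row G[row][col] * V[row].

    conductances: rows x cols matrix of NVM cell conductances (siemens)
    voltages: one drive voltage per row (volts)
    Returns the column currents (amps), i.e. the matrix-vector product
    that the physical array produces in a single parallel step.
    """
    rows = len(voltages)
    cols = len(conductances[0])
    return [sum(conductances[r][c] * voltages[r] for r in range(rows))
            for c in range(cols)]

# Two inputs driving two output columns:
currents = crossbar_mac([[1.0, 2.0], [3.0, 4.0]], [1.0, 1.0])
```

In software this is an O(rows × cols) loop; in the crossbar it happens all at once, which is where the compute-time savings come from.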
I expect this to make a big difference in the long run.