Cognitive computer

A cognitive computer combines artificial intelligence and machine-learning algorithms in an approach that attempts to reproduce the behaviour of the human brain.[1]

An example of a neural-network implementation of cognitive computing and deep learning is IBM's Watson machine. A subsequent IBM development is the TrueNorth microchip architecture, which is designed to be closer in structure to the human brain than the von Neumann architecture used in conventional computers.[1] In 2017 Intel announced its own cognitive chip, Loihi, which it made available to university and research labs in 2018.

Intel Loihi chip

Intel's self-learning neuromorphic chip Loihi, perhaps named after the Hawaiian seamount Loihi, is modelled after the human brain and offers substantial power efficiency. Intel claims Loihi is about 1,000 times more energy efficient than the general-purpose computing hardware needed to train neural networks that rival Loihi's performance. In theory, this would support both machine-learning training and inference on the same silicon, independently of a cloud connection, and more efficiently than convolutional neural networks (CNNs) or deep-learning networks. Intel points to a system that monitors a person's heartbeat, taking readings after events such as exercise or eating, and uses the chip to normalize the data and work out the 'normal' heartbeat. It can then spot abnormalities while still adapting to new events or conditions.
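
The heartbeat example amounts to learning a running baseline per context and flagging readings that deviate from it. A minimal sketch of that idea in conventional Python follows; this is not Loihi code, and every name in it is illustrative:

    # Sketch of the heartbeat idea: learn a per-context baseline and flag
    # readings that deviate too far from it. Illustrative only, not Loihi code.
    class HeartbeatBaseline:
        def __init__(self, alpha=0.05, tolerance=0.25):
            self.alpha = alpha          # how quickly the baseline adapts
            self.tolerance = tolerance  # relative deviation treated as abnormal
            self.baselines = {}         # learned 'normal' heart rate per context

        def update(self, context, bpm):
            """Fold a new reading into the baseline for this context."""
            prev = self.baselines.get(context, bpm)
            self.baselines[context] = (1 - self.alpha) * prev + self.alpha * bpm

        def is_abnormal(self, context, bpm):
            """Flag readings far from the learned 'normal' for this context."""
            baseline = self.baselines.get(context)
            if baseline is None:
                return False  # a new context is learned rather than flagged
            return abs(bpm - baseline) / baseline > self.tolerance

    monitor = HeartbeatBaseline()
    for context, bpm in [("resting", 62), ("exercise", 140), ("resting", 64)]:
        monitor.update(context, bpm)
    print(monitor.is_abnormal("resting", 95))  # True: well above resting baseline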

The first iteration of the Loihi chip was made using Intel's 14 nm fabrication process and houses 128 clusters of 1,024 artificial neurons each, for a total of 131,072 simulated neurons.[2] This offers around 130 million synapses, still a long way from the human brain's roughly 800 trillion synapses, and behind IBM's TrueNorth, whose 64-chip array of 4,096-core chips offers around 16 billion synapses.[3]
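
The quoted figures follow from simple arithmetic, which can be checked directly using only the numbers given above:

    # Checking the counts quoted above.
    loihi_neurons = 128 * 1024      # 128 clusters x 1,024 neurons = 131,072
    truenorth_cores = 64 * 4096     # 64 chips x 4,096 cores = 262,144 cores
    ratio = 130e6 / 800e12          # Loihi synapses vs. the brain: ~1.6e-7
    print(loihi_neurons, truenorth_cores, f"{ratio:.1e}")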

IBM TrueNorth chip

IBM's cognitive computers implement learning using Hebbian theory. Instead of being programmable in the traditional sense, in machine language or a higher-level programming language, such a device learns by being presented with instances through an input device; these are aggregated within a computational convolution or neural-network architecture consisting of weights held in a parallel memory system. An early instantiation of such a device was developed in 2012 under the DARPA SyNAPSE program at IBM, directed by Dharmendra Modha.
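
Hebbian learning, often summarized as "cells that fire together wire together", strengthens a weight in proportion to the joint activity of the neurons it connects. The following short Python sketch shows the rule in its simplest form; it illustrates Hebbian theory generally, not IBM's TrueNorth programming model:

    import numpy as np

    # Minimal Hebbian update: dw = learning_rate * (post activity) * (pre activity).
    # Illustrative only; not IBM's TrueNorth programming model.
    rng = np.random.default_rng(0)
    n_inputs, n_outputs = 8, 4
    weights = rng.normal(0.0, 0.1, size=(n_outputs, n_inputs))  # parallel weight memory

    def hebbian_step(weights, x, learning_rate=0.01):
        y = weights @ x                                      # post-synaptic activations
        weights = weights + learning_rate * np.outer(y, x)   # Hebb's rule
        # Normalize each neuron's weights so they stay bounded as learning repeats.
        return weights / np.linalg.norm(weights, axis=1, keepdims=True)

    # "Inputting instances": each presented pattern nudges the weights.
    for _ in range(100):
        weights = hebbian_step(weights, rng.random(n_inputs))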

This IBM 64-chip array, announced in 2017, contains the processing equivalent of 64 million neurons and 16 billion synapses, yet each processor consumes just 10 watts of electricity. Like other neural networks, the system is intended for pattern-recognition and sensory-processing roles. The US Air Force wants to combine TrueNorth's ability to convert multiple data feeds, whether audio, video or text, into machine-readable symbols with a conventional supercomputer's ability to crunch data. This is not the first time IBM's neural chips have been integrated into cutting-edge technology: in August 2017, Samsung installed the chips in its Dynamic Vision Sensors, enabling cameras to capture images at up to 2,000 frames per second while using just 300 milliwatts of power.

Criticism

There are many approaches to, and definitions of, the cognitive computer,[4] and some approaches may prove more fruitful than others.[5]

Specifically, some critics argue that a room-sized computer, as in the case of Watson, is not a viable alternative to a three-pound human brain.[6] Others cite the difficulty for a single system to bring together so many elements, such as disparate sources of information and computing resources.[7] At the 2018 World Economic Forum, experts claimed that cognitive systems can adopt the biases of their developers, as demonstrated when a Google image-recognition (computer vision) algorithm misidentified African Americans in an unfavorable way.[8]

References

  1. Dharmendra Modha (interview), "A computer that thinks", New Scientist, 8 November 2014, pp. 28–29.
  2. "Why Intel built a neuromorphic chip". September 29, 2017. www.ZDNet.com
  3. "Intel unveils Loihi neuromorphic chip, chases IBM in artificial brains". October 17, 2017. AITrends.com
  4. Schank, Roger C.; Childers, Peter G. (1984). The cognitive computer: on language, learning, and artificial intelligence. Addison-Wesley Pub. Co. ISBN 9780201064438.
  5. Wilson, Stephen (1988). "The Cognitive Computer: On Language, Learning, and Artificial Intelligence by Roger C. Schank, Peter Childers (review)". Leonardo. 21 (2): 210. ISSN 1530-9282. Retrieved 13 January 2017.
  6. Neumeier, Marty (2012). Metaskills: Five Talents for the Robotic Age. Indianapolis, IN: New Riders. ISBN 9780133359329.
  7. Hurwitz, Judith; Kaufman, Marcia; Bowles, Adrian (2015). Cognitive Computing and Big Data Analytics. Indianapolis, IN: John Wiley & Sons. p. 110. ISBN 9781118896624.
  8. Choudhury, Saheli Roy (2018-09-18). "A.I. has a bias problem that needs to be fixed: World Economic Forum". CNBC. Retrieved 2018-10-12.

