Title:
Circuit complexity and neural networks
Personal Author:
Parberry, Ian
Publication Information:
Cambridge, Mass. : MIT Press, 1994
ISBN:
9780262161480

Available:

Item Barcode: 30000003919192
Call Number: QA76.87 P36 1994
Material Type: Book
Item Category: Open Access Book

On Order

Summary

Neural networks usually work adequately on small problems but can run into trouble when they are scaled up to problems involving large amounts of input data. Circuit Complexity and Neural Networks addresses the important question of how well neural networks scale - that is, how fast the computation time and number of neurons grow as the problem size increases. It surveys recent research in circuit complexity (a robust branch of theoretical computer science) and applies this work to a theoretical understanding of the problem of scalability. Most research in neural networks focuses on learning, yet it is important to understand the physical limitations of the network before the resources needed to solve a certain problem can be calculated. One of the aims of this book is to compare the complexity of neural networks and the complexity of conventional computers, looking at the computational ability and resources (neurons and time) that are a necessary part of the foundations of neural network learning. Circuit Complexity and Neural Networks contains a significant amount of background material on conventional complexity theory that will enable neural network scientists to learn about how complexity theory applies to their discipline, and allow complexity theorists to see how their discipline applies to neural networks.


Reviews (1)

Choice Review

Parberry's artificial neural network book is sure to be a seminal work for years to come. It is a timely and well-prepared addition to the "Foundations of Computing" series from MIT Press. Theoretical computer science is an extremely important and growing area of research, and Parberry provides a solid foundation for analyzing issues of artificial neural networks from the theoretical computer science perspective. Special consideration is given to the network as a computational model and, in particular, to the size, complexity, and scale-up problem of the network. As a processing element, a neuron presents many interesting properties. Parberry highlights many theorems and lemmas, and their proofs, related to the network as an algorithmic model. The coverage of the material emphasizes the theoretical nature of the problem and presents concepts in clear and concise terminology, in contrast to other works that emphasize the engineering aspects of neural networks. Upper-division undergraduate; graduate; professional. J. Y. Cheung, University of Oklahoma

