Neural Networks And Pattern Recognition

Hardcover | October 20, 1997

by Omid Omidvar and Judith Dayhoff (Editors)

This book is one of the most up-to-date and cutting-edge texts available on the rapidly growing application area of neural networks. Neural Networks and Pattern Recognition focuses on the use of neural networks in pattern recognition, a very important application area for neural networks technology. The contributors are widely known and highly respected researchers and practitioners in the field.

Key Features

* Features neural network architectures on the cutting edge of neural network research
* Brings together highly innovative ideas on dynamical neural networks
* Includes articles written by authors prominent in the neural networks research community
* Provides an authoritative, technically correct presentation of each specific technical area

Pricing and Purchase Info

$140.85 online
$151.95 list price (save 7%)


Omid Omidvar is a professor of Computer Science at the University of the District of Columbia, Washington, D.C. He is also technical director of the SPPARC center, a supercomputing facility funded by the NSF. He received his Ph.D. from the University of Oklahoma in 1967 and has done extensive work in app...

other books by Omid Omidvar

Neural Systems for Control

Kobo ebook|Feb 24 1997

$105.49 online
$137.00 list price (save 23%)
Neural Systems For Robotics

Hardcover|Apr 10 1997

$148.84 online
$175.95 list price (save 15%)
Format: Hardcover
Dimensions: 351 pages, 9 × 6 × 0.98 in
Published: October 20, 1997
Language: English

The following ISBNs are associated with this title:

ISBN-10: 0125264208

ISBN-13: 9780125264204



Table of Contents

Preface. Contributors.

* J.L. Johnson, H. Ranganath, G. Kuntimad, and H.J. Caulfield, Pulse-Coupled Neural Networks: Introduction. Basic Model. Multiple Pulses. Multiple Receptive Field Inputs. Time Evolution of Two Cells. Space to Time. Linking Waves and Time Scales. Groups. Invariances. Segmentation. Adaptation. Time to Space. Implementations. Integration into Systems. Concluding Remarks. References.
* H. Li and J. Wang, A Neural Network Model for Optical Flow Computation: Introduction. Theoretical Background. Discussion on the Reformulation. Choosing Regularization Parameters. A Recurrent Neural Network Model. Experiments. Comparison to Other Work. Summary and Discussion. References.
* F. Unal and N. Tepedelenlioglu, Temporal Pattern Matching Using an Artificial Neural Network: Introduction. Solving Optimization Problems Using the Hopfield Network. Dynamic Time Warping Using the Hopfield Network. Computer Simulation Results. Conclusions. References.
* J. Dayhoff, P. Palmadesso, F. Richards, and D.-T. Lin, Patterns of Dynamic Activity and Timing in Neural Network Processing: Introduction. Dynamic Networks. Chaotic Attractors and Attractor Locking. Developing Multiple Attractors. Attractor Basins and Dynamic Binary Networks. Time Delay Mechanisms and Attractor Training. Timing of Action Potentials in Impulse Trains. Discussion. Acknowledgments. References.
* J. Ghosh, H.-J. Chang, and K. Liano, A Macroscopic Model of Oscillation in Ensembles of Inhibitory and Excitatory Neurons: Introduction. A Macroscopic Model for Cell Assemblies. Interactions Between Two Neural Groups. Stability of Equilibrium States. Oscillation Frequency Estimation. Experimental Validation. Conclusion. Appendix. References.
* P. Tito, B. Horne, C.L. Giles, and P. Collingwood, Finite State Machines and Recurrent Neural Networks--Automata and Dynamical Systems Approaches: Introduction. State Machines. Dynamical Systems. Recurrent Neural Network. RNN as a State Machine. RNN as a Collection of Dynamical Systems. RNN with Two State Neurons. Experiments--Learning Loops of FSM. Discussion. References.
* R. Anderson, Biased Random-Walk Learning: A Neurobiological Correlate to Trial-and-Error: Introduction. Hebb's Rule. Theoretical Learning Rules. Biological Evidence. Conclusions. Acknowledgments. References and Bibliography.
* A. Nigrin, Using SONNET 1 to Segment Continuous Sequences of Items: Introduction. Learning Isolated and Embedded Spatial Patterns. Storing Items with Decreasing Activity. The LTM Invariance Principle. Using Rehearsal to Process Arbitrarily Long Lists. Implementing the LTM Invariance Principle with an On-Center Off-Surround Circuit. Resetting Items Once They Can Be Classified. Properties of a Classifying System. Simulations. Discussion.
* K. Venkatesh, A. Pandya, and S. Hsu, On the Use of High Level Petri Nets in the Modeling of Biological Neural Networks: Introduction. Fundamentals of PNs. Modeling of Biological Neural Systems with High Level PNs. New/Modified Elements Added to HPNs to Model BNNs. Example of a BNN: The Olfactory Bulb. Conclusions. References.
* J. Principe, S. Celebi, B. de Vries, and J. Harris, Locally Recurrent Networks: The Gamma Operator, Properties, and Extensions: Introduction. Linear Finite Dimensional Memory Structures. The Gamma Neural Network. Applications of the Gamma Memory. Interpretations of the Gamma Memory. Laguerre and Gamma II Memories. Analog VLSI Implementations of the Gamma Filter. Conclusions. References.

Editorial Reviews

"Contributors incorporate landmark results on how neural network models have evolved from simple feedforward systems into advanced neural architectures with self-sustained activity patterns, simple and complicated oscillations, specialized time elements, and new capabilities for analysis and processing of time-varying signals. Coverage includes the architecture and capabilities of pulse-coupled networks; the relationship between automata and recurrent neural networks; and a putative neurobiological model that correlates with trial-and-error learning."--REFERENCE & RESEARCH BOOK NEWS