The Informational Complexity of Learning: Perspectives on Neural Networks and Generative Grammar

Our Price:   $290.37


Overview

The Informational Complexity of Learning: Perspectives on Neural Networks and Generative Grammar brings together two important but very different learning problems within the same analytical framework. The first is the problem of learning functional mappings using neural networks; the second is the problem of learning natural language grammars in the principles and parameters tradition of Chomsky. The two problems appear to have little in common: neural networks are real-valued, infinite-dimensional, continuous mappings, while grammars are boolean-valued, finite-dimensional, discrete (symbolic) mappings, and the research communities that work in the two areas almost never overlap. The book's objective is to bridge this gap. It uses the formal techniques developed in statistical learning theory and theoretical computer science over the last decade to analyze both kinds of learning problems, and by asking the same question of each - how much information does it take to learn? - it highlights their similarities and differences. Specific results include model selection in neural networks, active learning, language learning, and evolutionary models of language change. The result is a thoroughly interdisciplinary work: anyone interested in the interaction of computer science and cognitive science should enjoy the book, and researchers in artificial intelligence, neural networks, linguistics, theoretical computer science, and statistics will find it particularly relevant.

Full Product Details

Author:   Partha Niyogi
Publisher:   Springer-Verlag New York Inc.
Imprint:   Springer-Verlag New York Inc.
Edition:   Softcover reprint of the original 1st ed. 1998
Dimensions:   Width: 15.50cm, Height: 1.30cm, Length: 23.50cm
Weight:   0.391kg
ISBN 13:   9781461374930
ISBN 10:   1461374936
Pages:   224
Publication Date:   16 October 2012
Audience:   Professional and scholarly, Professional & Vocational
Format:   Paperback
Publisher's Status:   Active
Availability:   Manufactured on demand
We will order this item for you from a manufacture-on-demand supplier.

Table of Contents

1. Introduction
   1.1 The Components of a Learning Paradigm
   1.2 Parametric Hypothesis Spaces
   1.3 Technical Contents and Major Contributions
2. Generalization Error for Neural Nets
   2.1 Introduction
   2.2 Definitions and Statement of the Problem
   2.3 Stating the Problem for Radial Basis Functions
   2.4 Main Result
   2.5 Remarks
   2.6 Implications of the Theorem in Practice: Putting In the Numbers
   2.7 Conclusion
   2-A Notations
   2-B A Useful Decomposition of the Expected Risk
   2-C A Useful Inequality
   2-D Proof of the Main Theorem
3. Active Learning
   3.1 A General Framework for Active Approximation
   3.2 Example 1: A Class of Monotonically Increasing Bounded Functions
   3.3 Example 2: A Class of Functions with Bounded First Derivative
   3.4 Conclusions, Extensions, and Open Problems
   3.5 A Simple Example
   3.6 Generalizations
4. Language Learning
   4.1 Language Learning and the Poverty of Stimulus
   4.2 Constrained Grammars: Principles and Parameters
   4.3 Learning in the Principles and Parameters Framework
   4.4 Formal Analysis of the Triggering Learning Algorithm
   4.5 Characterizing Convergence Times for the Markov Chain Model
   4.6 Exploring Other Points
   4.7 Batch Learning Upper and Lower Bounds: An Aside
   4.8 Conclusions, Open Questions, and Future Directions
   4-A Unembedded Sentences for Parametric Grammars
   4-B Memoryless Algorithms and Markov Chains
   4-C Proof of Learnability Theorem
   4-D Formal Proof
5. Language Change
   5.1 Introduction
   5.2 Language Change in Parametric Systems
   5.3 Example 1: A Three Parameter System
   5.4 Example 2: The Case of Modern French
   5.5 Conclusions
6. Conclusions
   6.1 Emergent Themes
   6.2 Extensions
   6.3 A Concluding Note
References


Countries Available:   All regions