
Biologically Inspired Algorithms for Knowledge Representation, Memory, Language Processing and Learning

(Funded in part by a grant from the National Science Foundation)
 

Artificial neural networks, because of their potential for massive parallelism and their tolerance of faults and noise, offer an attractive approach to the design of associative memories, language processors, and trainable pattern classifiers. Constructive learning algorithms, which build up the arbitrarily complex decision boundaries needed for pattern classification (and in some ways foreshadowed the recent development of support vector machines), were motivated by two goals: to overcome the limitations of learning through parameter modification within an a priori fixed network topology, and to avoid the guesswork involved in choosing a suitable network architecture for each pattern classification problem by instead growing the network dynamically to match the complexity of the underlying task. Evolutionary algorithms offer a powerful means of exploring large search spaces for solutions that optimize multiple objectives, e.g., feature subsets that maximize the predictive performance of the classifiers that use them while minimizing their complexity. Against this background, we explored several closely related topics in biologically inspired (neural and evolutionary) algorithms and architectures for knowledge representation, language processing (e.g., parsing), and learning. This work has led to:

  • Generalization (with convergence guarantees) of a large family of constructive neural network learning algorithms, originally designed for two-class binary pattern classification, to classification problems involving real-valued patterns and an arbitrary number of classes (Parekh et al., 2000).
  • Development of a simple, inter-pattern-distance-based, provably convergent, polynomial-time constructive neural network learning algorithm that compares very favorably, in terms of generalization accuracy, with computationally far more expensive algorithms (Yang et al., 1999); a toy sketch of the constructive-learning idea appears after this list.
  • Development of algorithms for construction of robust, noise-tolerant neural memories for pattern storage and associative, content-based retrieval (Chen et al., 1995) and query processing (1996).
  • Development of algorithms for construction of highly parallel neural architectures for syntax analysis (parsing of regular, context-free, and context-sensitive languages) (1999).
  • Development of a biologically inspired neural architecture and an extended Kalman filter algorithm for place learning and localization in a priori unknown environments, which accounts for a large body of behavioral and neurobiological data from animal experiments and offers several testable predictions (Balakrishnan et al., 1998, 2000); a generic sketch of the filtering component appears after this list.
  • Development of evolutionary algorithms for feature subset selection in classification problems (Yang et al., 1998) and for the design of sensors and controllers for adaptive robots (Balakrishnan and Honavar, 1996; 1998; 2001); a minimal genetic-algorithm sketch of feature subset selection appears after this list.
  • Development of incremental neural network learning algorithms with applications in sensing and nondestructive evaluation (Polikar et al., 2001a, 2001b, 2004).
  • Development of constructive neural network algorithms that take advantage of prior knowledge in the form of classification rules (Parekh et al., 1999).
  • Development of hybrid neural-symbolic architectures for information processing (Honavar and Uhr, 1994; 1995).
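
To make the constructive-learning idea concrete, here is a minimal sketch in the spirit of Gallant's tower construction with a pocket-trained perceptron: each new unit sees the raw inputs plus the previous unit's output, and the network grows until training accuracy stops improving. This is a toy illustration of the general approach, not the generalized multi-class algorithm of Parekh et al. (2000) or the distance-based algorithm of Yang et al. (1999); the function names and parameter values are illustrative.

    import numpy as np

    def pocket_perceptron(X, y, epochs=100):
        # Pocket algorithm: plain perceptron updates, but remember the best
        # weight vector seen so far (checked once per epoch, for brevity).
        Xb = np.hstack([X, np.ones((len(X), 1))])        # absorb the bias term
        w = np.zeros(Xb.shape[1])
        best_w, best_acc = w.copy(), 0.0
        for _ in range(epochs):
            for xi, yi in zip(Xb, y):
                if yi * (xi @ w) <= 0:                   # misclassified: update
                    w += yi * xi
            acc = np.mean(np.sign(Xb @ w) == y)
            if acc > best_acc:
                best_w, best_acc = w.copy(), acc
        return best_w, best_acc

    def tower(X, y, max_units=10, epochs=100):
        # Tower-style construction: each new unit sees the raw inputs plus
        # the previous unit's output; the network grows until training
        # accuracy stops improving or the data are fit exactly.
        feats, units, prev_acc = X, [], 0.0
        for _ in range(max_units):
            w, acc = pocket_perceptron(feats, y, epochs)
            if acc <= prev_acc:
                break                                    # no improvement: stop growing
            units.append(w)
            prev_acc = acc
            if acc == 1.0:
                break
            out = np.sign(np.hstack([feats, np.ones((len(feats), 1))]) @ w)
            feats = np.hstack([X, out.reshape(-1, 1)])
        return units                                     # last unit's sign is the prediction

    # Toy usage: an XOR-like problem (labels in {-1, +1}) that no single
    # perceptron can solve, so the tower must grow.
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(200, 2))
    y = np.where(X[:, 0] * X[:, 1] > 0, 1, -1)
    units = tower(X, y)
    print("units added:", len(units))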
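The place-learning model of Balakrishnan et al. couples a place-cell representation with an extended Kalman filter; that model is not reproduced here. For readers unfamiliar with the filtering component, the following is a generic EKF localization sketch for a hypothetical 2D setting with noisy motion commands and range measurements to known landmarks; the motion model, landmark layout, and noise parameters are all illustrative assumptions, not those of the published model.

    import numpy as np

    def ekf_step(p, P, u, z, landmarks, Q, R):
        # Predict: dead-reckoning motion model p <- p + u (identity Jacobian),
        # with process-noise covariance Q inflating the state covariance P.
        p = p + u
        P = P + Q
        # Update: one range measurement per known landmark (nonlinear),
        # linearized via the Jacobian of the range function.
        for m, zi in zip(landmarks, z):
            d = p - m
            r = np.linalg.norm(d)
            H = (d / r).reshape(1, 2)                    # d(range)/d(position)
            S = H @ P @ H.T + R                          # innovation covariance (1x1)
            K = P @ H.T / S                              # Kalman gain
            p = p + (K * (zi - r)).ravel()
            P = (np.eye(2) - K @ H) @ P
        return p, P

    # Toy usage: track a point robot from a vague prior over 20 steps.
    rng = np.random.default_rng(1)
    landmarks = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 8.0]])
    Q, R = np.eye(2) * 0.01, 0.1
    true_p, p, P = np.array([2.0, 3.0]), np.zeros(2), np.eye(2) * 10.0
    for _ in range(20):
        u = np.array([0.3, 0.1])                         # commanded motion
        true_p = true_p + u + rng.normal(0, 0.1, 2)      # actual (noisy) motion
        z = np.linalg.norm(landmarks - true_p, axis=1) + rng.normal(0, R ** 0.5, 3)
        p, P = ekf_step(p, P, u, z, landmarks, Q, R)
    print("estimate:", p, "truth:", true_p)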
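Finally, to illustrate the evolutionary feature-subset-selection idea, here is a minimal genetic-algorithm sketch: individuals are feature bitmasks, and a scalarized fitness trades cross-validated accuracy against subset size, standing in for the two objectives described above. It uses scikit-learn only for a stock classifier and cross-validation; the operators, penalty weight, and parameter settings are illustrative, not those of Yang et al. (1998).

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    def fitness(mask, X, y, penalty=0.05):
        # Reward predictive accuracy, penalize subset size: a scalarized
        # stand-in for the two objectives (performance vs. complexity).
        if not mask.any():
            return 0.0
        acc = cross_val_score(KNeighborsClassifier(), X[:, mask], y, cv=3).mean()
        return acc - penalty * mask.sum() / mask.size

    def ga_select(X, y, pop_size=20, gens=15, p_mut=0.05, seed=0):
        rng = np.random.default_rng(seed)
        n = X.shape[1]
        pop = rng.random((pop_size, n)) < 0.5            # random initial bitmasks
        for _ in range(gens):
            scores = np.array([fitness(ind, X, y) for ind in pop])
            def pick():                                  # binary tournament selection
                i, j = rng.integers(pop_size, size=2)
                return pop[i] if scores[i] >= scores[j] else pop[j]
            children = []
            for _ in range(pop_size):
                a, b = pick(), pick()
                cut = rng.integers(1, n)                 # one-point crossover
                child = np.concatenate([a[:cut], b[cut:]])
                child ^= rng.random(n) < p_mut           # bit-flip mutation
                children.append(child)
            pop = np.array(children)
        scores = np.array([fitness(ind, X, y) for ind in pop])
        return pop[scores.argmax()]

    # Toy usage on synthetic data with 5 informative features out of 20.
    X, y = make_classification(n_samples=300, n_features=20,
                               n_informative=5, random_state=0)
    best = ga_select(X, y)
    print("selected features:", np.flatnonzero(best))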