A neural architecture for a class of abduction problems
Document Type
Article
Publication Date
12-1-1996
Abstract
The general task of abduction is to infer a hypothesis that best explains a set of data. A typical subtask of this is to synthesize a composite hypothesis that best explains the entire set of data from elementary hypotheses that can explain portions of it. The synthesis subtask of abduction is computationally expensive, even more so in the presence of certain types of interactions between the elementary hypotheses. In this paper, we first formulate the abduction task as a nonmonotonic constrained-optimization problem. We then consider a special version of the general abduction task that is linear and monotonic. Next, we describe a neural network based on the Hopfield model of computation for this special version of the abduction task. The connections in this network are symmetric, the energy function contains product forms, and the minimization of this function requires a network of order greater than two. We then discuss another neural architecture, composed of functional modules that reflect the structure of the abduction task. The connections in this second-order network are asymmetric. We conclude with a discussion of how the second architecture may be extended to address the general abduction task. © 1996 IEEE.
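To make the energy-minimization idea sketched in the abstract concrete, the following is a minimal illustrative sketch (not the paper's actual architecture or notation): binary Hopfield-style units select elementary hypotheses, and the energy combines a coverage term with product forms (hence higher-order interactions) and a parsimony term. The names covers, ALPHA, BETA, and the toy instance are assumptions introduced only for illustration.

```python
import numpy as np

# Illustrative sketch of Hopfield-style energy minimization for hypothesis
# synthesis (assumed formulation, not the authors' exact one):
#   s[i] = 1 means elementary hypothesis i is included in the composite.
#   covers[i, j] = 1 if hypothesis i can explain datum j.
ALPHA = 5.0  # weight on explanatory coverage (assumed value)
BETA = 1.0   # weight on parsimony (assumed value)

def energy(s, covers):
    # Indicator that datum j remains unexplained:
    #   prod_i (1 - s_i * covers[i, j])  -- a product form, so the energy
    # involves interactions of order greater than two when expanded.
    unexplained = np.prod(1.0 - s[:, None] * covers, axis=0)
    return ALPHA * unexplained.sum() + BETA * s.sum()

def hopfield_minimize(covers, n_sweeps=50, seed=None):
    # Asynchronous single-unit updates that never increase the energy,
    # settling into a local minimum of the energy landscape.
    rng = np.random.default_rng(seed)
    n_hyp = covers.shape[0]
    s = rng.integers(0, 2, size=n_hyp).astype(float)
    for _ in range(n_sweeps):
        changed = False
        for i in rng.permutation(n_hyp):
            trial = s.copy()
            trial[i] = 1.0 - s[i]            # flip unit i
            if energy(trial, covers) < energy(s, covers):
                s, changed = trial, True
        if not changed:                      # converged to a local minimum
            break
    return s

if __name__ == "__main__":
    # Toy instance (assumed): 4 elementary hypotheses, 5 data items.
    covers = np.array([[1, 1, 0, 0, 0],
                       [0, 0, 1, 1, 0],
                       [0, 0, 0, 0, 1],
                       [1, 1, 1, 1, 1]], dtype=float)
    s = hopfield_minimize(covers, seed=0)
    print("selected hypotheses:", np.flatnonzero(s))
```

On this toy instance the single all-covering hypothesis yields the lowest energy, illustrating how the coverage and parsimony terms trade off; the paper's second, modular architecture replaces this symmetric, higher-order network with asymmetric second-order connections.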
Publication Source (Journal or Book title)
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
First Page
854
Last Page
860
Recommended Citation
Goel, A., & Ramanujam, J. (1996). A neural architecture for a class of abduction problems. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 26 (6), 854-860. https://doi.org/10.1109/3477.544299