Open Access. Powered by Scholars. Published by Universities.®


Singapore Management University

OS and Networks

Supervised learning


Articles 1 - 4 of 4

Full-Text Articles in Physical Sciences and Mathematics

Breaking Neural Reasoning Architectures With Metamorphic Relation-Based Adversarial Examples, Alvin Chan, Lei Ma, Felix Juefei-Xu, Yew-Soon Ong, Xiaofei Xie, Minhui Xue, Yang Liu Apr 2021


Research Collection School Of Computing and Information Systems

The ability to read, reason, and infer lies at the heart of neural reasoning architectures. After all, the ability to perform logical reasoning over language remains a coveted goal of Artificial Intelligence. To this end, models such as the Turing-complete differentiable neural computer (DNC) boast of real logical reasoning capabilities, along with the ability to reason beyond simple surface-level matching. In this brief, we propose the first probe into DNC's logical reasoning capabilities with a focus on text-based question answering (QA). More concretely, we propose a conceptually simple but effective adversarial attack based on metamorphic relations. Our proposed adversarial attack …


Self-Organizing Neural Models Integrating Rules And Reinforcement Learning, Teck-Hou Teng, Zhong-Ming Tan, Ah-Hwee Tan Jun 2008


Research Collection School Of Computing and Information Systems

Traditional approaches to integrating knowledge into neural networks are concerned mainly with supervised learning. This paper presents how a family of self-organizing neural models known as fusion architecture for learning, cognition and navigation (FALCON) can incorporate a priori knowledge and perform knowledge refinement and expansion through reinforcement learning. Symbolic rules are formulated based on pre-existing know-how and inserted into FALCON as a priori knowledge. The availability of knowledge enables FALCON to start performing earlier in the initial learning trials. Through a temporal-difference (TD) learning method, the inserted rules can be refined and expanded according to the evaluative feedback signals received …


Inductive Neural Logic Network And The Scm Algorithm, Ah-Hwee Tan, Loo-Nin Teow Feb 1997


Research Collection School Of Computing and Information Systems

Neural Logic Network (NLN) is a class of neural network models that perform both pattern processing and logical inferencing. This article presents a procedure for NLN to learn multi-dimensional mapping of both binary and analog data. The procedure, known as the Supervised Clustering and Matching (SCM) algorithm, provides a means of inferring inductive knowledge from databases. In contrast to gradient descent error correction methods, pattern mapping is learned by an inductive NLN using fast and incremental clustering of input and output patterns. In addition, learning/encoding only takes place when both the input and output match criteria are satisfied in a …


Adaptive Resonance Associative Map, Ah-Hwee Tan Jan 1995


Research Collection School Of Computing and Information Systems

This article introduces a neural architecture termed Adaptive Resonance Associative Map (ARAM) that extends unsupervised Adaptive Resonance Theory (ART) systems for rapid, yet stable, heteroassociative learning. ARAM can be visualized as two overlapping ART networks sharing a single category field. Although ARAM is simpler in architecture than another class of supervised ART models known as ARTMAP, it produces classification performance equivalent to that of ARTMAP. As ARAM network structure and operations are symmetrical, associative recall can be performed in both directions. With maximal vigilance settings, ARAM encodes pattern pairs explicitly as cognitive chunks and thus guarantees perfect storage and recall …