Open Access. Powered by Scholars. Published by Universities.®

Digital Commons Network
Physical Sciences and Mathematics

Central Washington University

2020

Deep learning

Articles 1 - 3 of 3

Full-Text Articles in Entire DC Network

Semiotic Aggregation In Deep Learning, Bogdan Muşat, Răzvan Andonie Dec 2020

All Faculty Scholarship for the College of the Sciences

Convolutional neural networks utilize a hierarchy of neural network layers. The statistical aspects of information concentration in successive layers can provide insight into the feature abstraction process. We analyze the saliency maps of these layers from the perspective of semiotics, the study of signs and sign-using behavior. In computational semiotics, this aggregation operation (known as superization) is accompanied by a decrease of spatial entropy: signs are aggregated into supersigns. Using spatial entropy, we compute the information content of the saliency maps and study the superization processes which take place between successive layers of the network. In …
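The entropy computation the abstract alludes to can be sketched as follows. This is a minimal illustration of measuring the information content of a saliency map via the Shannon entropy of its intensity histogram; the paper's exact spatial-entropy definition may differ (e.g. it may account for pixel positions), and the bin count here is an arbitrary choice.

```python
import numpy as np

def spatial_entropy(saliency, bins=16):
    """Shannon entropy (bits) of a saliency map's intensity histogram.

    Illustrative sketch only; not necessarily the paper's definition.
    """
    hist, _ = np.histogram(saliency, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                     # drop empty bins to avoid log(0)
    return float(-np.sum(p * np.log2(p)))

# Superization would show up as decreasing entropy in deeper layers:
rng = np.random.default_rng(0)
early = rng.random((28, 28))                          # diffuse saliency
late = np.zeros((28, 28)); late[10:14, 10:14] = 1.0   # concentrated saliency
assert spatial_entropy(early) > spatial_entropy(late)
```

A diffuse map spreads mass across many histogram bins (high entropy), while a map concentrated on a few supersigns occupies few bins (low entropy), matching the claimed entropy decrease across layers.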


Modeling Multi-Targets Sentiment Classification Via Graph Convolutional Networks And Auxiliary Relation, Ao Feng, Zhengjie Gao, Xinyu Song, Ke Ke, Tianhao Xu, Xuelei Zhang Jun 2020

Modeling Multi-Targets Sentiment Classification Via Graph Convolutional Networks And Auxiliary Relation, Ao Feng, Zhengjie Gao, Xinyu Song, Ke Ke, Tianhao Xu, Xuelei Zhang

All Faculty Scholarship for the College of the Sciences

Existing solutions do not work well when multiple targets coexist in a sentence, because they usually separate the targets and process each one independently. If the original sentence has N targets, it is repeated N times, with only one target processed each time. To some extent, this approach degenerates the fine-grained sentiment classification task into a sentence-level sentiment classification task, and processing each target separately ignores the internal relations and interactions between the targets. Based on the above considerations, we propose to use Graph Convolutional …
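The joint treatment of targets can be illustrated with a single graph-convolution layer over per-target nodes. Everything here is a toy assumption (the node features, the adjacency linking targets, and the weights are illustrative), not the paper's actual model; it only shows how a GCN classifies all N targets in one pass instead of repeating the sentence N times.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer: H' = ReLU(D^-1/2 (A+I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))    # symmetric normalization
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

# Three target nodes from one sentence, linked by a (hypothetical)
# auxiliary-relation graph, classified jointly in a single pass:
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
H = np.random.default_rng(2).random((3, 4))   # per-target features (toy)
W = np.random.default_rng(3).random((4, 2))   # 2 sentiment classes (toy)
logits = gcn_layer(A, H, W)
assert logits.shape == (3, 2)                 # one prediction per target
```

Because the adjacency couples neighboring targets, each target's representation mixes in its neighbors' features, which is exactly the inter-target interaction the per-target approach discards.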


Learning In Feedforward Neural Networks Accelerated By Transfer Entropy, Adrian Moldovan, Angel Caţaron, Rǎzvan Andonie Jan 2020

Learning In Feedforward Neural Networks Accelerated By Transfer Entropy, Adrian Moldovan, Angel Caţaron, Rǎzvan Andonie

All Faculty Scholarship for the College of the Sciences

Current neural network architectures are increasingly hard to train because of the growing size and complexity of the datasets used. Our objective is to design more efficient training algorithms utilizing causal relationships inferred from neural networks. Transfer entropy (TE) was initially introduced as an information transfer measure used to quantify the statistical coherence between events (time series). Later, it was related to causality, even though the two are not the same. Only a few papers report applications of causality or TE in neural networks. Our contribution is an information-theoretical method for analyzing information transfer between the nodes of …
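The transfer-entropy measure mentioned above, for discrete series with history length 1, is TE(Y→X) = Σ p(x_{t+1}, x_t, y_t) log₂[ p(x_{t+1}|x_t, y_t) / p(x_{t+1}|x_t) ]. Below is a minimal plug-in estimator for that formula; how TE between network nodes is fed back into training is the paper's contribution and is not shown here.

```python
import numpy as np
from collections import Counter

def transfer_entropy(y, x, lag=1):
    """Plug-in TE(Y->X) for discrete series, history length 1 (bits)."""
    n = len(x) - lag
    triples = Counter(zip(x[lag:], x[:-lag], y[:-lag]))  # (x_{t+1}, x_t, y_t)
    pairs_xx = Counter(zip(x[lag:], x[:-lag]))           # (x_{t+1}, x_t)
    pairs_xy = Counter(zip(x[:-lag], y[:-lag]))          # (x_t, y_t)
    singles_x = Counter(x[:-lag])                        # x_t
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(x0, y0)]               # p(x_{t+1}|x_t, y_t)
        p_cond_x = pairs_xx[(x1, x0)] / singles_x[x0]    # p(x_{t+1}|x_t)
        te += p_joint * np.log2(p_cond_xy / p_cond_x)
    return te

# Y drives X with a one-step lag, so TE(Y->X) should exceed TE(X->Y):
rng = np.random.default_rng(1)
y = rng.integers(0, 2, 2000)
x = np.roll(y, 1); x[0] = 0                  # x copies y with lag 1
assert transfer_entropy(y.tolist(), x.tolist()) > transfer_entropy(x.tolist(), y.tolist())
```

The asymmetry in the example is the point: TE is directional, which is what lets it stand in for a (non-identical) notion of causal influence between nodes.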