Boosting Performance of Binary Convolutional Neural Networks: A Comparison between Caffe and Wasserstein – Given an input vector $H$ and a pair of $S$-regularized linear feature vectors $A$, the model treats $A$ as parameters coupled to the input through $S$. The parameters $A$ are regularized with an explicit weight penalty (a weight loss) on $S$ for the corresponding $H$. We define this weight-loss objective for binary, nonconvex, and nonnegative functions, and give a specialized objective for binary functions when $G$ is nonnegative. We also propose a loss function that behaves as a binary loss algorithm while achieving the same loss as the weight-loss term on the model parameters. We analyze the resulting algorithm on the problem of learning a sparse model from data, a setting not explicitly considered by the other problems in this paper. We show that the loss algorithm can be applied effectively to learn nonnegative functions, and we provide a method for learning binary functions. We further demonstrate that it is a generic loss algorithm that can be used to estimate the regularization of variables and to improve the estimation of parameters and weights.
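The abstract leaves the exact formulation unspecified; as a minimal sketch, assuming the "weight loss" is simply an explicit weight penalty added to a binary (logistic) data loss over linear scores $H A$, the objective might look like the code below. The function name weight_loss_objective and the parameter lambda_s are hypothetical illustrations, not the paper's notation.

```python
import numpy as np

def weight_loss_objective(A, H, y, lambda_s=0.1):
    """Hypothetical sketch: binary logistic loss on the scores H @ A,
    plus an explicit weight penalty (the 'weight loss') on the
    parameters A. lambda_s stands in for the S-regularization strength;
    the paper's exact formulation is not given in the abstract."""
    scores = H @ A                          # linear feature scores
    probs = 1.0 / (1.0 + np.exp(-scores))   # sigmoid for the binary case
    eps = 1e-12
    data_loss = -np.mean(y * np.log(probs + eps) +
                         (1 - y) * np.log(1 - probs + eps))
    weight_loss = lambda_s * np.sum(A ** 2)  # explicit weight penalty
    return data_loss + weight_loss

# Toy usage with random data
rng = np.random.default_rng(0)
H = rng.normal(size=(32, 8))       # 32 input vectors, 8 features
y = rng.integers(0, 2, size=32)    # binary labels
A = rng.normal(size=8)             # model parameters / feature weights
print(weight_loss_objective(A, H, y))
```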
EgoModeling: Real-time Modelling of Brain Connections
Conceptual Constraint-based Neural Networks
Boosting Performance of Binary Convolutional Neural Networks: A Comparison between Caffe and Wasserstein
Visual Speech Recognition using Deep Learning
Categorization with Linguistic Network and Feature Representation – The article presents a new approach to learning language semantics, together with an experimental evaluation on the task of categorizing Chinese vocabulary, with the aim of better understanding its usage in social domains. We performed a comparative study on several benchmark corpora of Chinese vocabulary, examining their semantic meanings and the use of semantic features in sentence categorization. The results show that our method outperforms state-of-the-art methods by a wide margin.
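The abstract does not detail the model; as a rough sketch, assuming the approach amounts to classifying sentences with shallow semantic features, the example below uses TF-IDF character n-grams and logistic regression as stand-ins. The toy sentences, category labels, and scikit-learn pipeline are assumptions for illustration, not the paper's actual method.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labeled Chinese sentences (categories are illustrative, not from the paper)
sentences = ["我喜欢看足球比赛", "这家餐厅的菜很好吃", "股票市场今天大跌", "他每天去健身房锻炼"]
labels = ["sports", "food", "finance", "sports"]

# Character n-grams stand in for the semantic features; the paper's
# actual feature representation is not specified in the abstract.
model = make_pipeline(
    TfidfVectorizer(analyzer="char", ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
model.fit(sentences, labels)
print(model.predict(["明天有一场篮球比赛"]))  # expected: "sports"
```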