Feature Selection in Learning Common Sense Associations Using Matrix Factorization



Publisher
Springer Berlin Heidelberg
Copyright
Copyright © 2016 by Taiwan Fuzzy Systems Association and Springer-Verlag Berlin Heidelberg
Subject
Engineering; Computational Intelligence; Artificial Intelligence (incl. Robotics); Operations Research, Management Science
ISSN
1562-2479
eISSN
2199-3211
DOI
10.1007/s40815-016-0235-4

Abstract

We propose a computational model that learns the common sense association between a pair of concept classes using a bipartite network and matrix factorization methods. We view the concept-pair association as a bipartite network so that the autoassociation mappings become similarity constraints. We impose additional similarity and regularity constraints on the optimization objective so that the matrix factorization finds a mapping matrix that best fits the observed data. We extract 139 locations, 436 activities, and 667 location–activity pairs from ConceptNet ( http://conceptnet5.media.mit.edu/ ). We evaluate performance in terms of F-score, precision, and recall on a common sense association problem between locations and activities, comparing four feature selection strategies in the matrix factorization optimization. Comparing performance with and without human judgment reveals that the matrix factorization method tends to generalize well even under very little observational evidence. Among the four feature selection methods, the maximal entropy method performs better in F-score and recall when more than 30% of the features are used, while the SVD method performs better in F-score and recall when fewer than 30% are used. Random selection can achieve higher precision given “enough” features, but it tends to be the worst performer in recall and F-score.
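The abstract only summarizes the approach. As a rough illustration of the general idea (not the paper's actual constrained optimization), the sketch below factorizes a toy binary location–activity matrix with a rank-k truncated SVD, thresholds the reconstruction to predict associations, and scores the predictions with precision, recall, and F-score. The matrix, rank, and threshold are all hypothetical choices for demonstration.

```python
import numpy as np

# Hypothetical toy data: a binary location-activity observation matrix R,
# where R[i, j] = 1 if location i is linked to activity j.
rng = np.random.default_rng(0)
n_loc, n_act, k = 8, 12, 3
R = (rng.random((n_loc, n_act)) < 0.3).astype(float)

# Rank-k truncated SVD as a simple stand-in for the constrained
# matrix factorization described in the abstract.
U, s, Vt = np.linalg.svd(R, full_matrices=False)
R_hat = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Predict an association wherever the reconstruction exceeds a threshold.
pred = (R_hat > 0.5).astype(float)

# Precision, recall, and F-score against the observed pairs.
tp = np.sum((pred == 1) & (R == 1))
fp = np.sum((pred == 1) & (R == 0))
fn = np.sum((pred == 0) & (R == 1))
precision = tp / (tp + fp) if tp + fp else 0.0
recall = tp / (tp + fn) if tp + fn else 0.0
f_score = (2 * precision * recall / (precision + recall)
           if precision + recall else 0.0)
print(f"precision={precision:.2f} recall={recall:.2f} F={f_score:.2f}")
```

At full rank the reconstruction is exact and all three scores reach 1.0; lowering k trades reconstruction fidelity for the kind of generalization across unobserved pairs that the paper evaluates.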

Journal

International Journal of Fuzzy Systems (Springer Journals)

Published: Sep 2, 2016


