Neural and MTS Algorithms for Feature Selection



Asian Journal on Quality, Volume 3 (2): 19 – Aug 21, 2002

Publisher: Emerald Publishing
Copyright: © 2002 MCB UP Ltd. All rights reserved.
ISSN: 1598-2688
DOI: 10.1108/15982688200200023

Abstract

The relationships among multi‐dimensional data (such as medical examination data) subject to ambiguity and variation are difficult to explore. The traditional approach to building a data classification system requires formulating rules by which the input data can be analyzed, and formulating such rules becomes very difficult with large sets of input data. This paper first describes two classification approaches, a back‐propagation (BP) neural network and a Mahalanobis distance (MD) classifier, and then proposes two approaches for multi‐dimensional feature selection. The first is a feature selection procedure derived from the trained BP neural network: the products of the weights between the input and hidden layers and between the hidden and output layers are compared, and, to simplify the network structure, only the weight products with large absolute values are retained. The second is the Mahalanobis‐Taguchi system (MTS) originally suggested by Dr. Taguchi, which applies Taguchi's fractional factorial design with the Mahalanobis distance as the performance metric. We combine automatic thresholding with the MD classifier so that it can handle a reduced model, which is the focus of this paper. Two case studies are used as examples to compare and discuss the complete and reduced models built with the BP neural network and the MD classifier. The implementation results show that the proposed approaches are effective and powerful for classification.
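The two building blocks named in the abstract, the Mahalanobis distance and the weight-product feature ranking taken from a trained BP network, can be sketched in a few lines. This is a minimal illustration under assumptions, not the paper's implementation: the function names, the toy data, and the choice of retaining the two highest-scoring features are all hypothetical, introduced only to show the shape of the computation.

```python
import numpy as np

def mahalanobis_distance(x, mean, cov_inv):
    """Squared Mahalanobis distance of sample x from a reference group."""
    d = x - mean
    return float(d @ cov_inv @ d)

def weight_product_importance(W_ih, W_ho):
    """Score each input feature by the absolute products of its
    input-to-hidden weights with the hidden-to-output weights.
    W_ih: (n_inputs, n_hidden); W_ho: (n_hidden, n_outputs)."""
    products = W_ih @ W_ho              # (n_inputs, n_outputs)
    return np.abs(products).sum(axis=1)  # one score per input feature

# Toy example (random data, for demonstration only)
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))            # hypothetical "reference group"
mean = X.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
md = mahalanobis_distance(rng.normal(size=4), mean, cov_inv)

W_ih = rng.normal(size=(4, 3))          # pretend these came from a trained BP net
W_ho = rng.normal(size=(3, 1))
scores = weight_product_importance(W_ih, W_ho)
keep = np.argsort(scores)[::-1][:2]     # keep the 2 features with largest scores
```

In the same spirit as the paper, a sample whose distance `md` exceeds some threshold would be classified as abnormal, and a reduced model would recompute the distance using only the `keep` features; how the threshold and the retained feature set are chosen (automatic thresholding, Taguchi's fractional factorial design) is exactly what the paper studies.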

Journal

Asian Journal on Quality (Emerald Publishing)

Published: Aug 21, 2002

Keywords: Feature selection; Artificial neural networks; Backpropagation; Mahalanobis distance; Automatic thresholding; Mahalanobis‐Taguchi system
