Abstract
The performance of a neural network classifier depends significantly on its architecture and its ability to generalize. The proper architecture is usually found by trial and error, which is time consuming and may not yield the optimal network. For this reason, we apply genetic algorithms to the automatic generation of neural networks. Many researchers have shown that combining multiple classifiers improves generalization. One of the most effective combining methods is bagging. In bagging, training sets are selected by resampling from the original training set, and classifiers trained on these sets are combined by voting. We incorporate the bagging technique into the training of evolving neural network classifiers to improve generalization. A minimal sketch of this bagging scheme appears below.
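As a rough illustration of the bagging scheme described in the abstract, the following Python sketch trains each member classifier on a bootstrap resample of the training set and combines predictions by plurality vote. The base learner (`MLPClassifier`), ensemble size, and all hyperparameters are illustrative assumptions, not the paper's evolved-network setup.

```python
# A minimal sketch of bagging (bootstrap aggregating) with majority voting.
# Assumptions: class labels are non-negative integers; the base learner is a
# stand-in MLP, not the genetically evolved networks used in the paper.
import numpy as np
from sklearn.neural_network import MLPClassifier

def bagging_fit(X, y, n_classifiers=10, seed=None):
    """Train n_classifiers networks, each on a bootstrap resample of (X, y)."""
    rng = np.random.default_rng(seed)
    n = len(X)
    models = []
    for _ in range(n_classifiers):
        idx = rng.integers(0, n, size=n)  # sample n indices with replacement
        clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=500)
        clf.fit(X[idx], y[idx])
        models.append(clf)
    return models

def bagging_predict(models, X):
    """Combine member predictions by plurality (majority) vote."""
    votes = np.stack([m.predict(X) for m in models])  # (n_models, n_samples)
    # For each sample, return the most frequently predicted label.
    return np.apply_along_axis(
        lambda col: np.bincount(col).argmax(), 0, votes.astype(int))
```

Because each bootstrap resample omits roughly a third of the original examples, the member networks see different training sets and make partially independent errors, which the vote then averages out.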
| Original language | English (US) |
| --- | --- |
| Pages | 3218-3222 |
| Number of pages | 5 |
| State | Published - Sep 25 2003 |
| Event | International Joint Conference on Neural Networks 2003 - Portland, OR, United States. Duration: Jul 20 2003 → Jul 24 2003 |
Other

| Other | International Joint Conference on Neural Networks 2003 |
| --- | --- |
| Country/Territory | United States |
| City | Portland, OR |
| Period | 7/20/03 → 7/24/03 |
ASJC Scopus subject areas
- Software
- Artificial Intelligence