Chapter IV - Feature Selection in Data Mining
Data Mining: Opportunities and Challenges
by John Wang (ed.)
Idea Group Publishing, 2003

Agrawal, R., Gehrke, J., Gunopulos, D., & Raghavan, P. (1998). Automatic subspace clustering of high dimensional data for data mining applications. In Proceedings of the ACM SIGMOD International Conference on Management of Data, pp. 94-105, Seattle, WA.

Bauer, E. & Kohavi, R. (1999). An empirical comparison of voting classification algorithms: Bagging, boosting, and variants. Machine Learning, 36: 105-139.

Bradley, P. S., Fayyad, U. M., & Reina, C. (1998). Scaling EM (expectation-maximization) clustering to large databases. Technical Report MSR-TR-98-35, Microsoft, Redmond, WA.

Breiman, L. (1996a). Bagging predictors. Machine Learning, 24(2): 123-140.

Breiman, L. (1996b). Bias, variance, and arcing classifiers. Technical Report 460, University of California, Department of Statistics, Berkeley, CA.

Breiman, L. (1999). Random forests - random features. Technical Report 567, University of California, Department of Statistics, Berkeley, CA.

Buhmann, J. (1995). Data clustering and learning. In Arbib, M. (ed.), Handbook of Brain Theory and Neural Networks. Cambridge, MA: Bradford Books/MIT Press.

Cheeseman, P. & Stutz, J. (1996). Bayesian classification system (AutoClass): Theory and results. In U. Fayyad, G. Piatetsky-Shapiro, P. Smyth, & R. Uthurusamy (eds.), Advances in Knowledge Discovery and Data Mining, pp. 153-180, San Francisco: AAAI/MIT Press.

Chen, S., Guerra-Salcedo, C., & Smith, S. (1999). Non-standard crossover for a standard representation - Commonality-based feature subset selection. In Proceedings of the Genetic and Evolutionary Computation Conference, pp. 129-134. San Francisco: Morgan Kaufmann.

Cunningham, P. & Carney, J. (2000). Diversity versus quality in classification ensembles based on feature selection. Technical Report TCD-CS-2000-02, Trinity College, Department of Computer Science, Dublin, Ireland.

Dempster, A. P., Laird, N. M., & Rubin, D. B. (1977). Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society, Series B, 39(1): 1-38.

Devaney, M. & Ram, A. (1997). Efficient feature selection in conceptual clustering. In Proceedings of the 14th International Conference on Machine Learning, pp. 92-97. San Francisco: Morgan Kaufmann.

Dy, J. G. & Brodley, C. E. (2000a). Feature subset selection and order identification for unsupervised learning. In Proceedings of the 17th International Conference on Machine Learning, pp. 247-254. San Francisco: Morgan Kaufmann.

Dy, J. G. & Brodley, C. E. (2000b). Visualization and interactive feature selection for unsupervised data. In Proceedings of the 6th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining (KDD-00), pp. 360-364, ACM Press.

Freund, Y. & Schapire, R. (1996). Experiments with a new boosting algorithm. In Proceedings of the 13th International Conference on Machine Learning, pp. 148-156, Bari, Italy, Morgan Kaufmann.

Goldberg, D. E. & Richardson, J. (1987). Genetic algorithms with sharing for multimodal function optimization. In Proceedings of the 2nd International Conference on Genetic Algorithms, pp. 41-49. Hillsdale, NJ: Lawrence Erlbaum.

Guerra-Salcedo, C. & Whitley, D. (1999). Genetic approach to feature selection for ensemble creation. In GECCO-99: Proceedings of the Genetic and Evolutionary Computation Conference, pp. 236-243. San Francisco: Morgan Kaufmann.

Ho, T. K. (1998a). C4.5 decision forests. In Proceedings of the 14th International Conference on Pattern Recognition, IEEE Computer Society, pp. 545-549.

Ho, T. K. (1998b). The random subspace method for constructing decision forests. IEEE Transactions on Pattern Analysis and Machine Intelligence, 20(8): 832-844.

Horn, J. (1997). Multi-criteria decision making and evolutionary computation. In T. Bäck, D. B. Fogel, & Z. Michalewicz (eds.), Handbook of Evolutionary Computation. London: Institute of Physics Publishing.

Kim, Y. & Street, W. N. (2000). CoIL challenge 2000: Choosing and explaining likely caravan insurance customers. Technical Report 2000-09, Sentient Machine Research and Leiden Institute of Advanced Computer Science.

Mangasarian, O. L., Street, W. N., & Wolberg, W. H. (1995). Breast cancer diagnosis and prognosis via linear programming. Operations Research, 43(4): 570-577.

Meila, M. & Heckerman, D. (1998). An experimental comparison of several clustering methods. Technical Report MSR-TR-98-06, Microsoft, Redmond, WA.

Menczer, F. & Belew, R. K. (1996). From complex environments to complex behaviors. Adaptive Behavior, 4: 317-363.

Menczer, F., Degeratu, M., & Street, W. N. (2000). Efficient and scalable Pareto optimization by evolutionary local selection algorithms. Evolutionary Computation, 8(2): 223-247.

Menczer, F., Street, W. N., & Degeratu, M. (2000). Evolving heterogeneous neural agents by local selection. In V. Honavar, M. Patel, & K. Balakrishnan (eds.), Advances in the Evolutionary Synthesis of Intelligent Agents. Cambridge, MA: MIT Press.

Opitz, D. (1999). Feature selection for ensembles. In Proceedings of the 16th National Conference on Artificial Intelligence (AAAI), pp. 379-384, Orlando, FL, AAAI Press.

Street, W. N. & Kim, Y. (2001). A streaming ensemble algorithm (SEA) for large-scale classification. In Proceedings of the 7th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining (KDD-01), pp. 377-382, ACM Press.

Street, W. N., Mangasarian, O. L., & Wolberg, W. H. (1995). An inductive learning approach to prognostic prediction. In A. Prieditis & S. Russell (eds.), Proceedings of the 12th International Conference on Machine Learning, pp. 522-530, San Francisco: Morgan Kaufmann.

Van Veldhuizen, D. A. (1999). Multiobjective evolutionary algorithms: Classifications, analyses, and new innovations. PhD thesis, Air Force Institute of Technology.


ISBN: 1591400511