(1998). Automatic subspace clustering of high dimensional data for data mining applications. In Proceedings of the ACM SIGMOD Int'l Conference on Management of Data, pp. 94-105, Seattle, WA.
(1999). An empirical comparison of voting classification algorithms: Bagging, boosting, and variants. Machine Learning, 105-139.
(1998). Scaling EM (expectation-maximization) clustering to large databases. Technical Report MSR-TR-98-35, Microsoft, Redmond, WA.
(1996a). Bagging predictors. Machine Learning, (2): 123-140.
(1996b). Bias, variance, and arcing classifiers. Technical Report 460, University of California, Department of Statistics, Berkeley, CA.
(1999). Random forests - random features. Technical Report 567, University of California, Department of Statistics, Berkeley, CA.
(1995). Data clustering and learning. In Arbib, M. (ed.), Handbook of Brain Theory and Neural Networks. Cambridge, MA: Bradford Books/MIT Press.
(1996). Bayesian classification system (AutoClass): Theory and results. In U. Fayyad, G. Piatetsky-Shapiro, P. Smyth & R. Uthurusamy (eds.), Advances in Knowledge Discovery and Data Mining, pp. 153-180. San Francisco: AAAI/MIT Press.
(1999). Non-standard crossover for a standard representation - commonality-based feature subset selection. In Proceedings of the Genetic and Evolutionary Computation Conference, pp. 129-134. San Francisco: Morgan Kaufmann.
(2000). Diversity versus quality in classification ensembles based on feature selection. Technical Report TCD-CS-2000-02, Department of Computer Science, Trinity College, Dublin, Ireland.
(1977). Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society, Series B, (1): 1-38.
(1997). Efficient feature selection in conceptual clustering. In Proceedings of the 14th Int'l Conference on Machine Learning, pp. 92-97. San Francisco: Morgan Kaufmann.
(2000a). Feature subset selection and order identification for unsupervised learning. In Proceedings of the 17th Int'l Conference on Machine Learning, pp. 247-254. San Francisco: Morgan Kaufmann.
(2000b). Visualization and interactive feature selection for unsupervised data. In Proceedings of the 6th ACM SIGKDD Int'l Conference on Knowledge Discovery & Data Mining (KDD-00), pp. 360-364. ACM Press.
(1996). Experiments with a new boosting algorithm. In Proceedings of the 13th Int'l Conference on Machine Learning, pp. 148-156, Bari, Italy. Morgan Kaufmann.
(1987). Genetic algorithms with sharing for multimodal function optimization. In Proceedings of the 2nd International Conference on Genetic Algorithms, pp. 41-49. Hillsdale, NJ: Lawrence Erlbaum.
(1999). Genetic approach to feature selection for ensemble creation. In GECCO-99: Proceedings of the Genetic and Evolutionary Computation Conference, pp. 236-243. San Francisco: Morgan Kaufmann.
(1998a). C4.5 decision forests. In Proceedings of the 14th International Conference on Pattern Recognition, pp. 545-549. IEEE Computer Society.
(1998b). The random subspace method for constructing decision forests. IEEE Transactions on Pattern Analysis and Machine Intelligence, (8): 832-844.
(1997). Multi-criteria decision making and evolutionary computation. In T. Bäck, D. B. Fogel & Z. Michalewicz (eds.), Handbook of Evolutionary Computation. London: Institute of Physics Publishing.
(2000). CoIL challenge 2000: Choosing and explaining likely caravan insurance customers. Technical Report 2000-09, Sentient Machine Research and Leiden Institute of Advanced Computer Science. http://www.wi.leidenuniv.nl/~putten/library/cc2000/.
(1995). Breast cancer diagnosis and prognosis via linear programming. Operations Research, (4): 570-577.
(1998). An experimental comparison of several clustering methods. Technical Report MSR-TR-98-06, Microsoft, Redmond, WA.
(1996). From complex environments to complex behaviors. Adaptive Behavior, 317-363.
(2000). Efficient and scalable Pareto optimization by evolutionary local selection algorithms. Evolutionary Computation, (2): 223-247.
(2000). Evolving heterogeneous neural agents by local selection. In V. Honavar, M. Patel & K. Balakrishnan (eds.), Advances in the Evolutionary Synthesis of Intelligent Agents. Cambridge, MA: MIT Press.
(1999). Feature selection for ensembles. In Proceedings of the 16th National Conference on Artificial Intelligence (AAAI), pp. 379-384, Orlando, FL. AAAI Press.
(2001). A streaming ensemble algorithm (SEA) for large-scale classification. In Proceedings of the 7th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining (KDD-01), pp. 377-382. ACM Press.
(1995). An inductive learning approach to prognostic prediction. In A. Prieditis & S. Russell (eds.), Proceedings of the 12th International Conference on Machine Learning, pp. 522-530. San Francisco: Morgan Kaufmann.
(1999). Multiobjective evolutionary algorithms: Classifications, analyses, and new innovations. PhD thesis, Air Force Institute of Technology.