(Ed.) (1990). Change of representation and inductive bias. Norwell, MA: Kluwer Academic Publishers.
(1990). Introductory combinatorics, 2nd ed. Orlando, FL: Harcourt.
(1989). Classifier systems and genetic algorithms. Artificial Intelligence, 235-282.
(1994). Time series analysis, forecasting, and control (3rd ed.). San Francisco, CA: Holden-Day.
(1996). Bagging predictors. Machine Learning, 123-140.
(1995). The mythical man-month, anniversary edition: Essays on software engineering. Reading, MA: Addison-Wesley.
(1999). Designing efficient and accurate parallel genetic algorithms. Ph.D. thesis, University of Illinois at Urbana-Champaign. Technical report, Illinois Genetic Algorithms Laboratory (IlliGAL).
(2001). Introduction to algorithms, 2nd ed. Cambridge, MA: MIT Press.
(1991). Elements of information theory. New York: John Wiley & Sons.
(1993). Using genetic algorithms for concept learning. Machine Learning, 161-188.
(1996). Knowledge-guided constructive induction. Ph.D. thesis, Department of Computer Science, University of Illinois at Urbana-Champaign.
(2000). Pattern classification, 2nd ed. New York: John Wiley & Sons.
(1998). Proceedings of the 1998 Joint AAAI-ICML Workshop on the Methodology of Applying Machine Learning (Technical Report WS-98-16). Menlo Park, CA: AAAI Press.
(1996). Experiments with a new boosting algorithm. In Proceedings of the 13th International Conference on Machine Learning, pp. 148-156. San Mateo, CA: Morgan Kaufmann.
(1985). Learning intermediate concepts in constructing a hierarchical knowledge base. In Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI-85), pp. 659-666, Los Angeles, CA.
(1994). Crisis management in anesthesiology. New York: Churchill Livingstone.
(1992). Neural networks and the bias/variance dilemma. Neural Computation, 1-58.
(Eds.) (1994). The future of time series: Learning and understanding. In Time series prediction: Forecasting the future and understanding the past (Santa Fe Institute Studies in the Sciences of Complexity XV). Reading, MA: Addison-Wesley.
(1989). Genetic algorithms in search, optimization, and machine learning. Reading, MA: Addison-Wesley.
(1998). The race, the hurdle, and the sweet spot: Lessons from genetic algorithms for the automation of design innovation and creativity. Technical report, Illinois Genetic Algorithms Laboratory (IlliGAL).
(1998). Bayesian network models for generation of crisis management training scenarios. In Proceedings of IAAI-98, pp. 1113-1120. Menlo Park, CA: AAAI Press.
(1997). A parameter-less genetic algorithm. Technical report, Illinois Genetic Algorithms Laboratory (IlliGAL).
(1995). Fundamentals of artificial neural networks. Cambridge, MA: MIT Press.
(1996). Guardian Project home page. URL: http://www-ksl.stanford.edu/projects/guardian/.
(1999). Neural networks: A comprehensive foundation, 2nd ed. Englewood Cliffs, NJ: Prentice Hall.
(1991). Probabilistic similarity networks. Cambridge, MA: MIT Press.
(1996). A tutorial on learning with Bayesian networks. Microsoft Research Technical Report 95-06, revised June 1996.
(1994). Computer intensive statistical methods: Validation, model selection and bootstrap. London: Chapman and Hall.
(1997). The nature of niching: Genetic algorithms and the evolution of optimal, cooperative populations. Ph.D. thesis, University of Illinois at Urbana-Champaign. Technical report, Illinois Genetic Algorithms Laboratory (IlliGAL).
(1995). Display of information for time-critical decision making. In Proceedings of the 11th International Conference on Uncertainty in Artificial Intelligence (UAI-95), pp. 296-305. San Mateo, CA: Morgan Kaufmann.
(1998). Time series learning with probabilistic network composites. Ph.D. thesis, University of Illinois at Urbana-Champaign. Technical Report UIUC-DCS-R2063. URL: http://www.kddresearch.org/Publications/Theses/PhD/Hsu.
(1998). A new approach to multi-strategy learning from heterogeneous time series. In Proceedings of the International Workshop on Multi-strategy Learning, Milan, Italy, June.
(2000). A multi-strategy approach to classifier learning from time series. Machine Learning, 213-236.
(2002). Constructive induction wrappers in high-performance commercial data mining and decision support systems. Data Mining and Knowledge Discovery, (4), 361-391, October.
(1994). Hierarchical mixtures of experts and the EM algorithm. Neural Computation, 181-214.
(1997). Wrappers for feature subset selection. Artificial Intelligence, Special Issue on Relevance, (1-2), 273-324.
(1996). Data mining using MLC++: A machine learning library in C++. In Tools with Artificial Intelligence, pp. 234-245. Rockville, MD: IEEE Computer Society Press. URL: http://www.sgi.com/Technology/mlc.
(1990). The self-organizing map. Proceedings of the IEEE, 1464-1480.
(1992). Genetic programming. Cambridge, MA: MIT Press.
(1990). A time-delay neural network architecture for isolated word recognition. Neural Networks, 23-43.
(1989). Automatic speech recognition: The development of the SPHINX system. Norwell, MA: Kluwer Academic Publishers.
(1993). Hierarchical classification and vector quantization with neural trees. Neurocomputing, 119-139.
(1995). Radial basis function networks. In M. A. Arbib (Ed.), The handbook of brain theory and neural networks, pp. 779-782. Cambridge, MA: MIT Press.
(1997). Elements of artificial neural networks. Cambridge, MA: MIT Press.
(1993). A theory and methodology of inductive learning. Artificial Intelligence, (2), 111-161. Reprinted in B. G. Buchanan & D. C. Wilkins (Eds.), Readings in knowledge acquisition and learning. San Mateo, CA: Morgan Kaufmann.
(1983). Learning from observation: Conceptual clustering. In R. S. Michalski, J. G. Carbonell, & T. M. Mitchell (Eds.), Machine learning: An artificial intelligence approach. San Mateo, CA: Morgan Kaufmann.
(1997). Machine learning. New York: McGraw-Hill.
(1994). Neural net architectures for temporal sequence processing. In A. S. Weigend & N. A. Gershenfeld (Eds.), Time series prediction: Forecasting the future and understanding the past (Santa Fe Institute Studies in the Sciences of Complexity XV). Reading, MA: Addison-Wesley.
(1996). Bayesian learning for neural networks. New York: Springer-Verlag.
(1965). Meteorological drought. Research Paper Number 45, Office of Climatology, United States Weather Bureau.
(1992). The Gamma model: A new neural net model for temporal processing. Neural Networks, 565-576.
(2001). NeuroSolutions v4.0. Gainesville, FL: NeuroDimension. URL: http://www.nd.com/.
(1998). Self-organized-expert modular network for classification of spatiotemporal sequences. Journal of Intelligent Data Analysis, (4).
(1997). Recommender systems. Communications of the ACM, (3), 56-58.
(1995). Artificial intelligence: A modern approach. Englewood Cliffs, NJ: Prentice Hall.
(Ed.) (2002). Neural network FAQ. Periodic posting to the Usenet newsgroup comp.ai.neural-nets. URL: ftp://ftp.sas.com/pub/neural/FAQ.html.
(1997). A new metric-based approach to model selection. In Proceedings of the Fourteenth National Conference on Artificial Intelligence (AAAI-97), Providence, RI, pp. 552-558. Menlo Park, CA: AAAI Press.
(1993). The merging of the senses. Cambridge, MA: MIT Press.
(1997). An asymptotic equivalence of choice of models by cross-validation and Akaike's criterion. Journal of the Royal Statistical Society, Series B, 44-47.
(1985). Pattern recognition: Human and mechanical. New York: John Wiley and Sons.
(2000). Data mining: Practical machine learning tools and techniques with Java implementations. San Mateo, CA: Morgan Kaufmann.
(1992). Stacked generalization. Neural Networks, 241-259.