[1] A. Agapitos, M. O'Neill, and A. Brabazon. Adaptive distance metrics for nearest neighbour classification based on genetic programming. In Proceedings of the European Conference on Genetic Programming, pages 1–12, 2013.
[2] T. Bäck. Evolutionary Algorithms in Theory and Practice: Evolution Strategies, Evolutionary Programming, Genetic Algorithms. Oxford University Press, 1996.
[3] T. Cover and P. Hart. Nearest neighbor pattern classification. IEEE Transactions on Information Theory, 13(1):21–27, 1967.
[4] C. Domeniconi, J. Peng, and D. Gunopulos. Locally adaptive metric nearest-neighbor classification. IEEE Transactions on Pattern Analysis and Machine Intelligence, 24(9):1281–1285, 2002.
[5] C. Domeniconi and B. Yan. Nearest neighbor ensemble. In Proceedings of the International Conference on Pattern Recognition, pages 228–231, 2004.
[6] S. Dudani. The distance-weighted k-nearest-neighbor rule. IEEE Transactions on Systems, Man, and Cybernetics, 6(4):325–327, 1976.
[7] J. Friedman, J. Bentley, and R. Finkel. An algorithm for finding best matches in logarithmic expected time. ACM Transactions on Mathematical Software, 3(3):209–226, 1977.
[8] A. Ghosh. On optimum choice of k in nearest neighbor classification. Computational Statistics and Data Analysis, 50(11):3113–3123, 2006.
[9] D. Goldberg. Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley Longman, 1989.
[10] N. Hansen and A. Ostermeier. Adapting arbitrary normal mutation distributions in evolution strategies: The covariance matrix adaptation. In Proceedings of the IEEE International Conference on Evolutionary Computation, pages 312–317, 1996.
[11] J. Holland. Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence. University of Michigan Press, 1975.
[12] J. Holland. Genetic algorithms. Scientific American, 267(1):66–73, 1992.
[13] C. Holmes and N. Adams. A probabilistic nearest neighbour method for statistical pattern recognition. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 64(2):295–306, 2002.
[14] F. Hussein, N. Kharma, and R. Ward. Genetic algorithms for feature selection and weighting, a review and study. In Proceedings of the Sixth International Conference on Document Analysis and Recognition, pages 1240–1244, 2001.
[15] M. Jabbar, B. Deekshatulu, and P. Chandra. Classification of heart disease using k-nearest neighbor and genetic algorithm. Procedia Technology, 10:85–94, 2013.
[16] J. Kelly and L. Davis. A hybrid genetic algorithm for classification. In Proceedings of the International Joint Conference on Artificial Intelligence, pages 645–650, 1991.
[17] J. Kennedy. Swarm Intelligence. Springer, 2006.
[18] L. Wang, X. Wang, and Q. Chen. GA-based feature subset clustering for combination of multiple nearest neighbors classifiers. In Proceedings of the International Conference on Machine Learning and Cybernetics, pages 2982–2987, 2005.
[19] D. Loftsgaarden and C. Quesenberry. A non-parametric estimate of a multivariate density function. Annals of Mathematical Statistics, 36(3):1049–1051, 1965.
[20] R. McRoberts, E. Næsset, and T. Gobakken. Optimizing the k-nearest neighbors technique for estimating forest aboveground biomass using airborne laser scanning data. Remote Sensing of Environment, 163:13–22, 2015.
[21] R. Paredes and E. Vidal. Learning weighted metrics to minimize nearest-neighbor classification error. IEEE Transactions on Pattern Analysis and Machine Intelligence, 28(7):1100–1110, 2006.
[22] W. Punch III, E. Goodman, M. Pei, C. Lai, P. D. Hovland, and R. Enbody. Further research on feature selection and classification using genetic algorithms. In Proceedings of the International Conference on Genetic Algorithms, pages 557–564, 1993.
[23] I. Rechenberg. Evolutionsstrategien. Springer, 1978.
[24] H. Schwefel. Numerical Optimization of Computer Models. John Wiley & Sons, 1981.
[25] W. Siedlecki and J. Sklansky. A note on genetic algorithms for large-scale feature selection. Pattern Recognition Letters, 10(5):335–347, 1989.
[26] M. Tahir, A. Bouridane, and F. Kurugollu. Simultaneous feature selection and feature weighting using hybrid tabu search/k-nearest neighbor classifier. Pattern Recognition Letters, 28(4):438–446, 2007.
[27] S. Tan. Neighbor-weighted k-nearest neighbor for unbalanced text corpus. Expert Systems with Applications, 28(4):667–671, 2005.
[28] K. Weinberger and L. Saul. Distance metric learning for large margin nearest neighbor classification. Journal of Machine Learning Research, 10:207–244, 2009.
[29] D. Wettschereck, D. Aha, and T. Mohri. A review and empirical evaluation of feature weighting methods for a class of lazy learning algorithms. Artificial Intelligence Review, 11(1):273–314, 1997.
[30] X. Wu, V. Kumar, J. Quinlan, J. Ghosh, Q. Yang, H. Motoda, G. McLachlan, A. Ng, B. Liu, P. Yu, Z. Zhou, M. Steinbach, D. Hand, and D. Steinberg. Top 10 algorithms in data mining. Knowledge and Information Systems, 14(1):1–37, 2008.
[31] Z. Yu, H. Chen, J. Liu, J. You, H. Leung, and G. Han. Hybrid k-nearest neighbor classifier. IEEE Transactions on Cybernetics, 46(6):1263–1275, 2016.