The Fundamental Learning Problem that Genetic Algorithms with Uniform Crossover Solve Efficiently and Repeatedly As Evolution Proceeds
Abstract
This paper establishes theoretical bona fides for implicit concurrent multivariate effect evaluation (implicit concurrency, for short), a broad and versatile computational learning efficiency thought to underlie general-purpose, non-local, noise-tolerant optimization in genetic algorithms with uniform crossover (UGAs). We demonstrate that implicit concurrency is indeed a form of efficient learning by showing that it can be used to obtain close-to-optimal bounds on the time and queries required to approximately correctly solve a constrained version (k=7, η=1/5) of a recognizable computational learning problem: learning parities with noisy membership queries. We argue that a UGA that treats the noisy membership query oracle as a fitness function can be straightforwardly used to approximately correctly learn the essential attributes in O(log^1.585 n) queries and O(n log^1.585 n) time, where n is the total number of attributes. Our proof relies on an accessible symmetry argument and the use of statistical hypothesis testing to reject a global null hypothesis at the 10^-100 level of significance. It is, to the best of our knowledge, the first relatively rigorous identification of efficient computational learning in an evolutionary algorithm on a non-trivial learning problem.
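The two ingredients named in the abstract can be sketched concretely: a noisy membership-query oracle for a parity function (answers flipped with probability η), used directly as a fitness function, and the uniform crossover operator that gives UGAs their name. This is an illustrative sketch only; the function names, the choice of a hidden index set standing in for the k=7 essential attributes, and the population handling are assumptions, not the paper's construction.

```python
import random

def make_noisy_parity_oracle(essential, eta, rng):
    """Noisy membership-query oracle for the parity of the bits at the
    hidden `essential` indices. With probability `eta` the answer is
    flipped. (Illustrative: `essential` and `eta` play the roles of the
    paper's k=7 essential attributes and noise rate eta=1/5.)"""
    def oracle(x):  # x: sequence of 0/1 values of length n
        true_answer = sum(x[i] for i in essential) % 2
        noise = 1 if rng.random() < eta else 0
        return true_answer ^ noise
    return oracle

def uniform_crossover(parent_a, parent_b, rng):
    """Uniform crossover: each child bit is drawn independently from
    one of the two parents with probability 1/2."""
    return [a if rng.random() < 0.5 else b
            for a, b in zip(parent_a, parent_b)]

# A UGA would simply treat oracle(chromosome) as that chromosome's
# (stochastic) fitness, e.g.:
rng = random.Random(42)
n, k, eta = 32, 7, 0.2
essential = rng.sample(range(n), k)       # hidden essential attributes
oracle = make_noisy_parity_oracle(essential, eta, rng)
population = [[rng.randint(0, 1) for _ in range(n)] for _ in range(20)]
fitnesses = [oracle(chrom) for chrom in population]
```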
 Publication:
 arXiv e-prints
 Pub Date:
 July 2013
 arXiv:
 arXiv:1307.3824
 Bibcode:
 2013arXiv1307.3824B
 Keywords:

 Computer Science - Neural and Evolutionary Computing;
 Computer Science - Artificial Intelligence;
 Computer Science - Computational Complexity;
 Computer Science - Discrete Mathematics;
 Computer Science - Machine Learning;
 I.2.8;
 I.2.6;
 F.2
 E-Print:
 For an easy introduction to implicit concurrency (with animations), visit http://blog.hackingevolution.net/2013/03/24/implicitconcurrencyingeneticalgorithms/