12 Search Results for "Witt, Carsten"


Document
Estimation-of-Distribution Algorithms: Theory and Applications (Dagstuhl Seminar 22182)

Authors: Josu Ceberio Uribe, Benjamin Doerr, Carsten Witt, and Vicente P. Soloviev

Published in: Dagstuhl Reports, Volume 12, Issue 5 (2022)


Abstract
The Dagstuhl Seminar 22182 "Estimation-of-Distribution Algorithms: Theory and Applications", held on May 2-6, 2022, brought together 19 international experts in estimation-of-distribution algorithms (EDAs). Their research ranged from a theoretical perspective, e.g., runtime analysis on synthetic problems, to an applied perspective, e.g., solutions to industrial optimization problems with EDAs. This report documents the program and the outcomes of the seminar.

Cite as

Josu Ceberio Uribe, Benjamin Doerr, Carsten Witt, and Vicente P. Soloviev. Estimation-of-Distribution Algorithms: Theory and Applications (Dagstuhl Seminar 22182). In Dagstuhl Reports, Volume 12, Issue 5, pp. 17-36, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@Article{uribe_et_al:DagRep.12.5.17,
  author =	{Uribe, Josu Ceberio and Doerr, Benjamin and Witt, Carsten and Soloviev, Vicente P.},
  title =	{{Estimation-of-Distribution Algorithms: Theory and Applications (Dagstuhl Seminar 22182)}},
  pages =	{17--36},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2022},
  volume =	{12},
  number =	{5},
  editor =	{Uribe, Josu Ceberio and Doerr, Benjamin and Witt, Carsten and Soloviev, Vicente P.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.12.5.17},
  URN =		{urn:nbn:de:0030-drops-174421},
  doi =		{10.4230/DagRep.12.5.17},
  annote =	{Keywords: estimation-of-distribution algorithms, heuristic search and optimization, machine learning, probabilistic model building}
}
Document
Optimizing Linear Functions with Randomized Search Heuristics - The Robustness of Mutation

Authors: Carsten Witt

Published in: LIPIcs, Volume 14, 29th International Symposium on Theoretical Aspects of Computer Science (STACS 2012)


Abstract
The analysis of randomized search heuristics on classes of functions is fundamental for the understanding of the underlying stochastic process and the development of suitable proof techniques. Recently, remarkable progress has been made in bounding the expected optimization time of the simple (1+1) EA on the class of linear functions. We improve the best known bound in this setting from (1.39+o(1))(en ln n) to (en ln n)+O(n) in expectation and with high probability, which is tight up to lower-order terms. Moreover, upper and lower bounds for arbitrary mutation probabilities p are derived, which imply expected polynomial optimization time as long as p=O((ln n)/n) and which are tight if p=c/n for a constant c. As a consequence, the standard mutation probability p=1/n is optimal for all linear functions, and the (1+1) EA is found to be an optimal mutation-based algorithm. Furthermore, the algorithm turns out to be surprisingly robust since the large neighborhood explored by the mutation operator does not disrupt the search.
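To make the setting concrete, here is a minimal Python sketch of the (1+1) EA on a linear function with standard bit mutation; the weights, problem size and iteration budget are illustrative choices, not taken from the paper.

```python
import random

def one_plus_one_ea(weights, max_iters=100_000, p=None):
    """(1+1) EA maximizing the linear function f(x) = sum(w_i * x_i)
    with standard bit mutation (each bit flips independently with prob. p)."""
    n = len(weights)
    p = p if p is not None else 1.0 / n           # standard choice p = 1/n
    x = [random.randint(0, 1) for _ in range(n)]  # uniform random start
    fx = sum(w * b for w, b in zip(weights, x))
    optimum = sum(weights)                        # all-ones is optimal for positive weights
    for t in range(1, max_iters + 1):
        y = [b ^ 1 if random.random() < p else b for b in x]  # mutate
        fy = sum(w * b for w, b in zip(weights, y))
        if fy >= fx:                              # accept if not worse
            x, fx = y, fy
        if fx == optimum:
            return t                              # iterations until the optimum was found
    return None

# Illustrative example: BinVal-like weights on n = 50 bits
random.seed(0)
print(one_plus_one_ea([2 ** i for i in range(50)]))
```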

Cite as

Carsten Witt. Optimizing Linear Functions with Randomized Search Heuristics - The Robustness of Mutation. In 29th International Symposium on Theoretical Aspects of Computer Science (STACS 2012). Leibniz International Proceedings in Informatics (LIPIcs), Volume 14, pp. 420-431, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2012)


BibTeX

@InProceedings{witt:LIPIcs.STACS.2012.420,
  author =	{Witt, Carsten},
  title =	{{Optimizing Linear Functions with Randomized Search Heuristics - The Robustness of Mutation}},
  booktitle =	{29th International Symposium on Theoretical Aspects of Computer Science (STACS 2012)},
  pages =	{420--431},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-35-4},
  ISSN =	{1868-8969},
  year =	{2012},
  volume =	{14},
  editor =	{D\"{u}rr, Christoph and Wilke, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2012.420},
  URN =		{urn:nbn:de:0030-drops-33920},
  doi =		{10.4230/LIPIcs.STACS.2012.420},
  annote =	{Keywords: Randomized Search Heuristics, Evolutionary Algorithms, Linear Functions, Running Time Analysis}
}
Document
10361 Abstracts Collection and Executive Summary – Theory of Evolutionary Algorithms

Authors: Anne Auger, Jonathan L. Shapiro, L. Darrell Whitley, and Carsten Witt

Published in: Dagstuhl Seminar Proceedings, Volume 10361, Theory of Evolutionary Algorithms (2010)


Abstract
From September 5 to 10, 2010, the Dagstuhl Seminar 10361 "Theory of Evolutionary Algorithms" was held at Schloss Dagstuhl – Leibniz Center for Informatics. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section describes the seminar topics and goals in general.

Cite as

Anne Auger, Jonathan L. Shapiro, L. Darrell Whitley, and Carsten Witt. 10361 Abstracts Collection and Executive Summary – Theory of Evolutionary Algorithms. In Theory of Evolutionary Algorithms. Dagstuhl Seminar Proceedings, Volume 10361, pp. 1-19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{auger_et_al:DagSemProc.10361.1,
  author =	{Auger, Anne and Shapiro, Jonathan L. and Whitley, L. Darrell and Witt, Carsten},
  title =	{{10361 Abstracts Collection and Executive Summary – Theory of Evolutionary Algorithms}},
  booktitle =	{Theory of Evolutionary Algorithms},
  pages =	{1--19},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2010},
  volume =	{10361},
  editor =	{Anne Auger and Jonathan L. Shapiro and L. Darrell Whitley and Carsten Witt},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.10361.1},
  URN =		{urn:nbn:de:0030-drops-28180},
  doi =		{10.4230/DagSemProc.10361.1},
  annote =	{Keywords: Evolutionary algorithms, bio-inspired search heuristics, theoretical analysis, optimization time}
}
Document
2-bit Flip Mutation Elementary Fitness Landscapes

Authors: William Langdon

Published in: Dagstuhl Seminar Proceedings, Volume 10361, Theory of Evolutionary Algorithms (2010)


Abstract
Genetic Programming parity is not elementary. GP parity cannot be represented as the sum of a small number of elementary landscapes. Statistics, including fitness distance correlation, of Parity's fitness landscape are calculated. Using Walsh analysis, the eigenvalues and eigenvectors of the Laplacian of the two-bit-flip fitness landscape are given, and a ruggedness measure for elementary landscapes is proposed. An elementary needle-in-a-haystack (NIH) landscape is given.
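As a rough numerical companion to this abstract (not the paper's Walsh-analysis derivation), the following Python sketch builds the Laplacian of the two-bit-flip neighbourhood graph on {0,1}^n for a small, illustrative n and checks that the Walsh (parity) functions are its eigenvectors.

```python
import itertools
import numpy as np

def two_flip_laplacian(n):
    """Graph Laplacian L = D - A of the two-bit-flip neighbourhood on {0,1}^n:
    two strings are adjacent iff they differ in exactly two bit positions."""
    N = 2 ** n
    A = np.zeros((N, N))
    for x in range(N):
        for i, j in itertools.combinations(range(n), 2):
            y = x ^ (1 << i) ^ (1 << j)   # flip bits i and j
            A[x, y] = 1
    D = np.diag(A.sum(axis=1))            # every vertex has degree n*(n-1)/2
    return D - A

def walsh(S, n):
    """Walsh (parity) function chi_S(x) = (-1)^{|S & x|} as a vector over {0,1}^n."""
    return np.array([(-1) ** bin(S & x).count("1") for x in range(2 ** n)])

n = 4                                      # small illustrative size
L = two_flip_laplacian(n)
print(np.unique(np.round(np.linalg.eigvalsh(L), 6)))   # distinct Laplacian eigenvalues
# Each Walsh function is an eigenvector (Cayley-graph property of the hypercube):
for S in range(2 ** n):
    v = walsh(S, n)
    lam = (L @ v)[0] / v[0]                # the corresponding eigenvalue
    assert np.allclose(L @ v, lam * v)
```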

Cite as

William Langdon. 2-bit Flip Mutation Elementary Fitness Landscapes. In Theory of Evolutionary Algorithms. Dagstuhl Seminar Proceedings, Volume 10361, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{langdon:DagSemProc.10361.2,
  author =	{Langdon, William},
  title =	{{2-bit Flip Mutation Elementary Fitness Landscapes}},
  booktitle =	{Theory of Evolutionary Algorithms},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2010},
  volume =	{10361},
  editor =	{Anne Auger and Jonathan L. Shapiro and L. Darrell Whitley and Carsten Witt},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.10361.2},
  URN =		{urn:nbn:de:0030-drops-28146},
  doi =		{10.4230/DagSemProc.10361.2},
  annote =	{Keywords: Genetic Algorithms, Genetic Programming, search, optimisation, graph theory, Laplacian, Hamming cube}
}
Document
Exploring the common concepts of adaptive MCMC and Covariance Matrix Adaptation schemes

Authors: Christian Lorenz Mueller

Published in: Dagstuhl Seminar Proceedings, Volume 10361, Theory of Evolutionary Algorithms (2010)


Abstract
In the field of scientific modeling, one is often confronted with the task of drawing samples from a probability distribution that is only known up to a normalizing constant and for which no direct analytical method for sample generation is available. Over the past decade, adaptive Markov Chain Monte Carlo (MCMC) methods have gained considerable attention in the statistics community in order to tackle this black-box (or indirect) sampling scenario. Common application domains are Bayesian statistics and statistical physics. Adaptive MCMC methods try to learn an optimal proposal distribution from previously accepted samples in order to efficiently explore the target distribution. Variable metric approaches in black-box optimization, such as the Evolution Strategy with covariance matrix adaptation (CMA-ES) and Gaussian Adaptation (GaA), use almost identical ideas to locate putative global optima. This extended abstract summarizes the common concepts in adaptive MCMC and covariance matrix adaptation schemes. We also present how both types of methods can be unified within the Gaussian Adaptation framework and propose a unification of both fields as a "grand challenge" for future research.
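The "learn the proposal from previously accepted samples" idea can be sketched in a few lines. The adaptive-Metropolis-style sampler below (with an illustrative 2-D Gaussian target and an ad-hoc adaptation schedule) is only meant to show the covariance-adaptation mechanism shared with CMA-ES and GaA, not any specific method from the abstract.

```python
import numpy as np

def adaptive_metropolis(log_target, x0, iters=5000, eps=1e-6, seed=0):
    """Minimal adaptive-Metropolis-style sampler: the Gaussian proposal
    covariance is periodically re-estimated from the chain's history,
    illustrating the 'learn the proposal from the samples' idea."""
    rng = np.random.default_rng(seed)
    d = len(x0)
    x, lx = np.asarray(x0, float), log_target(x0)
    samples = [x.copy()]
    cov = np.eye(d)                                  # initial proposal covariance
    scale = 2.38 ** 2 / d                            # classical scaling factor
    for t in range(iters):
        if t > 2 * d and t % 50 == 0:                # periodically adapt the covariance
            cov = np.cov(np.array(samples).T) + eps * np.eye(d)
        y = rng.multivariate_normal(x, scale * cov)  # propose
        ly = log_target(y)
        if np.log(rng.random()) < ly - lx:           # Metropolis accept/reject
            x, lx = y, ly
        samples.append(x.copy())
    return np.array(samples)

# Toy target: a correlated 2-D Gaussian, known only up to a constant (illustrative).
P = np.linalg.inv(np.array([[1.0, 0.9], [0.9, 1.0]]))
chain = adaptive_metropolis(lambda z: -0.5 * np.asarray(z) @ P @ np.asarray(z), [3.0, -3.0])
print(chain.mean(axis=0), np.cov(chain.T))
```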

Cite as

Christian Lorenz Mueller. Exploring the common concepts of adaptive MCMC and Covariance Matrix Adaptation schemes. In Theory of Evolutionary Algorithms. Dagstuhl Seminar Proceedings, Volume 10361, pp. 1-10, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{mueller:DagSemProc.10361.3,
  author =	{Mueller, Christian Lorenz},
  title =	{{Exploring the common concepts of adaptive MCMC and Covariance Matrix Adaptation schemes}},
  booktitle =	{Theory of Evolutionary Algorithms},
  pages =	{1--10},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2010},
  volume =	{10361},
  editor =	{Anne Auger and Jonathan L. Shapiro and L. Darrell Whitley and Carsten Witt},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.10361.3},
  URN =		{urn:nbn:de:0030-drops-28135},
  doi =		{10.4230/DagSemProc.10361.3},
  annote =	{Keywords: Adaptive MCMC, Gaussian Adaptation, CMA-ES, covariance matrix adaptation}
}
Document
08051 Abstracts Collection – Theory of Evolutionary Algorithms

Authors: Dirk V. Arnold, Anne Auger, Carsten Witt, and Jonathan E. Rowe

Published in: Dagstuhl Seminar Proceedings, Volume 8051, Theory of Evolutionary Algorithms (2008)


Abstract
From Jan. 27, 2008 to Feb. 1, 2008, the Dagstuhl Seminar 08051 "Theory of Evolutionary Algorithms" was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided, if available.

Cite as

Dirk V. Arnold, Anne Auger, Carsten Witt, and Jonathan E. Rowe. 08051 Abstracts Collection – Theory of Evolutionary Algorithms. In Theory of Evolutionary Algorithms. Dagstuhl Seminar Proceedings, Volume 8051, pp. 1-15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{arnold_et_al:DagSemProc.08051.1,
  author =	{Arnold, Dirk V. and Auger, Anne and Witt, Carsten and Rowe, Jonathan E.},
  title =	{{08051 Abstracts Collection – Theory of Evolutionary Algorithms}},
  booktitle =	{Theory of Evolutionary Algorithms},
  pages =	{1--15},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{8051},
  editor =	{Dirk V. Arnold and Anne Auger and Jonathan E. Rowe and Carsten Witt},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08051.1},
  URN =		{urn:nbn:de:0030-drops-15242},
  doi =		{10.4230/DagSemProc.08051.1},
  annote =	{Keywords: Evolutionary Computation, Theory of Evolutionary Algorithms}
}
Document
08051 Executive Summary – Theory of Evolutionary Algorithms

Authors: Dirk V. Arnold, Anne Auger, Jonathan E. Rowe, and Carsten Witt

Published in: Dagstuhl Seminar Proceedings, Volume 8051, Theory of Evolutionary Algorithms (2008)


Abstract
The 2008 Dagstuhl Seminar "Theory of Evolutionary Algorithms" was the fifth in a firmly established series of biennial events. In the week from Jan. 27, 2008 to Feb. 1, 2008, 47 researchers from nine countries discussed their recent work and trends in evolutionary computation.

Cite as

Dirk V. Arnold, Anne Auger, Jonathan E. Rowe, and Carsten Witt. 08051 Executive Summary – Theory of Evolutionary Algorithms. In Theory of Evolutionary Algorithms. Dagstuhl Seminar Proceedings, Volume 8051, pp. 1-5, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{arnold_et_al:DagSemProc.08051.2,
  author =	{Arnold, Dirk V. and Auger, Anne and Rowe, Jonathan E. and Witt, Carsten},
  title =	{{08051 Executive Summary – Theory of Evolutionary Algorithms}},
  booktitle =	{Theory of Evolutionary Algorithms},
  pages =	{1--5},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{8051},
  editor =	{Dirk V. Arnold and Anne Auger and Jonathan E. Rowe and Carsten Witt},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08051.2},
  URN =		{urn:nbn:de:0030-drops-14812},
  doi =		{10.4230/DagSemProc.08051.2},
  annote =	{Keywords: Evolutionary Algorithms, Theory of Evolutionary Algorithms}
}
Document
A Comparison of GAs Penalizing Infeasible Solutions and Repairing Infeasible Solutions on the 0-1 Knapsack Problem

Authors: Jun He, Yuren Zhou, and Xin Yao

Published in: Dagstuhl Seminar Proceedings, Volume 8051, Theory of Evolutionary Algorithms (2008)


Abstract
Constraints exist in almost every optimization problem. Different constraint handling techniques have been incorporated into genetic algorithms (GAs); however, most current studies are based on computer experiments. An example is Michalewicz's comparison among GAs using different constraint handling techniques on the 0-1 knapsack problem. The following phenomena are observed in experiments: 1) the penalty method needs more generations to find a feasible solution to the restrictive capacity knapsack than the repair method; 2) the penalty method can find better solutions to the average capacity knapsack. Such observations need a theoretical explanation. This paper aims at providing a theoretical analysis of Michalewicz's experiments. The main result of the paper is that GAs using the repair method are more efficient than GAs using the penalty method on both restrictive capacity and average capacity knapsack problems. This result for the average capacity knapsack differs slightly from Michalewicz's experimental results, so a supplemental experiment is implemented to support the theoretical claim. The results confirm the general principle pointed out by Coello: a better constraint-handling approach should tend to exploit specific domain knowledge.
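A minimal sketch of the two constraint-handling techniques being compared, evaluated on an illustrative random knapsack instance; the penalty coefficient and the greedy repair rule are assumptions for illustration, not Michalewicz's exact setup.

```python
import random

def knapsack_fitness_penalty(x, values, weights, capacity, penalty=10.0):
    """Penalty method: infeasible solutions stay in the population but their
    fitness is reduced in proportion to the capacity violation."""
    value = sum(v for v, b in zip(values, x) if b)
    weight = sum(w for w, b in zip(weights, x) if b)
    return value - penalty * max(0, weight - capacity)

def knapsack_fitness_repair(x, values, weights, capacity):
    """Repair method: greedily drop packed items (lowest value/weight ratio
    first) until the solution is feasible, then score the repaired solution."""
    x = list(x)
    order = sorted((i for i, b in enumerate(x) if b),
                   key=lambda i: values[i] / weights[i])
    weight = sum(w for w, b in zip(weights, x) if b)
    for i in order:
        if weight <= capacity:
            break
        x[i], weight = 0, weight - weights[i]
    return sum(v for v, b in zip(values, x) if b), x

# Tiny illustrative instance (not from the paper)
random.seed(1)
values  = [random.randint(1, 20) for _ in range(10)]
weights = [random.randint(1, 20) for _ in range(10)]
x = [random.randint(0, 1) for _ in range(10)]
print(knapsack_fitness_penalty(x, values, weights, capacity=40))
print(knapsack_fitness_repair(x, values, weights, capacity=40))
```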

Cite as

Jun He, Yuren Zhou, and Xin Yao. A Comparison of GAs Penalizing Infeasible Solutions and Repairing Infeasible Solutions on the 0-1 Knapsack Problem. In Theory of Evolutionary Algorithms. Dagstuhl Seminar Proceedings, Volume 8051, pp. 1-39, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{he_et_al:DagSemProc.08051.3,
  author =	{He, Jun and Zhou, Yuren and Yao, Xin},
  title =	{{A Comparison of GAs Penalizing Infeasible Solutions and Repairing Infeasible Solutions on the 0-1 Knapsack Problem}},
  booktitle =	{Theory of Evolutionary Algorithms},
  pages =	{1--39},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{8051},
  editor =	{Dirk V. Arnold and Anne Auger and Jonathan E. Rowe and Carsten Witt},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08051.3},
  URN =		{urn:nbn:de:0030-drops-14822},
  doi =		{10.4230/DagSemProc.08051.3},
  annote =	{Keywords: Genetic Algorithms, Constrained Optimization, Knapsack Problem, Computation Time, Performance Analysis}
}
Document
Evaluating Stationary Distribution of the Binary GA Markov Chain in Special Cases

Authors: Boris S. Mitavskiy and Chris Cannings

Published in: Dagstuhl Seminar Proceedings, Volume 8051, Theory of Evolutionary Algorithms (2008)


Abstract
The stochastic process underlying an evolutionary algorithm is well known to be Markovian, and such Markov chains have been under investigation in much of the theoretical research on evolutionary computing. When the mutation rate is positive, the Markov chain modeling an evolutionary algorithm is irreducible and, therefore, has a unique stationary distribution; yet rather little is known about this stationary distribution. On the other hand, knowing the stationary distribution may provide some information about the expected times to hit the optimum, allow an assessment of the biases due to recombination, and is of importance in population genetics for assessing what is called a "genetic load" (see the introduction for more details). In this talk I will show how the quotient construction method can be exploited to derive rather explicit bounds on the ratios of the stationary distribution values of various subsets of the state space. In fact, some of the bounds obtained in the current work are expressed in terms of the parameters involved in all three main stages of an evolutionary algorithm, namely selection, recombination and mutation. I will also discuss the newest developments, which may allow for further improvements of the bounds.
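As a toy illustration of the stated fact that an irreducible EA Markov chain has a unique stationary distribution, the sketch below builds the transition matrix of a small non-elitist (1+1)-style EA on OneMax and computes its stationary distribution numerically; the selection probabilities and problem size are illustrative, and this is not the quotient construction discussed in the talk.

```python
import itertools
import numpy as np

def toy_ea_transition_matrix(n, p_mut=None):
    """Transition matrix of a toy non-elitist (1+1)-style EA on OneMax:
    standard bit mutation, then the offspring survives with probability
    0.9 / 0.5 / 0.1 depending on whether it is better / equal / worse.
    With p_mut > 0 every transition has positive probability, so the chain
    is irreducible and has a unique stationary distribution."""
    p_mut = p_mut if p_mut is not None else 1.0 / n
    states = list(itertools.product([0, 1], repeat=n))
    fit = [sum(s) for s in states]                       # OneMax fitness
    N = len(states)
    P = np.zeros((N, N))
    for a in range(N):
        for b in range(N):
            d = sum(x != y for x, y in zip(states[a], states[b]))
            q = p_mut ** d * (1 - p_mut) ** (n - d)      # P[mutation turns a into b]
            s = 0.9 if fit[b] > fit[a] else (0.1 if fit[b] < fit[a] else 0.5)
            P[a, b] += q * s                             # offspring b survives
            P[a, a] += q * (1 - s)                       # parent a survives
    return P, states

def stationary_distribution(P):
    """Left eigenvector of P for eigenvalue 1, normalised to sum to 1."""
    evals, evecs = np.linalg.eig(P.T)
    v = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
    return np.abs(v) / np.abs(v).sum()

P, states = toy_ea_transition_matrix(n=4)
pi = stationary_distribution(P)
print(states[int(np.argmax(pi))], pi.max())   # most probable state in the long run
```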

Cite as

Boris S. Mitavskiy and Chris Cannings. Evaluating Stationary Distribution of the Binary GA Markov Chain in Special Cases. In Theory of Evolutionary Algorithms. Dagstuhl Seminar Proceedings, Volume 8051, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{mitavskiy_et_al:DagSemProc.08051.4,
  author =	{Mitavskiy, Boris S. and Cannings, Chris},
  title =	{{Evaluating Stationary Distribution of the Binary GA Markov Chain in Special Cases}},
  booktitle =	{Theory of Evolutionary Algorithms},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{8051},
  editor =	{Dirk V. Arnold and Anne Auger and Jonathan E. Rowe and Carsten Witt},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08051.4},
  URN =		{urn:nbn:de:0030-drops-14845},
  doi =		{10.4230/DagSemProc.08051.4},
  annote =	{Keywords: Genetic algorithms, Markov chains, stationary distribution, lumping quotient}
}
Document
N-gram GP: Early results and half-baked ideas

Authors: Nicholas Freitag McPhee and Riccardo Poli

Published in: Dagstuhl Seminar Proceedings, Volume 8051, Theory of Evolutionary Algorithms (2008)


Abstract
In this talk I present N-gram GP, a system for evolving linear GP programs using an EDA-style approach to update the probabilities of different 3-grams (triplets) of instructions. I then pick apart some of the evolved programs in an effort to better understand the properties of this approach and identify ways that it might be extended. Doing so reveals that there are frequently cases where the system needs two triples of the form ABC and ABD to solve the problem, but can only choose between them probabilistically in the EDA phase. I present the entirely untested idea of creating a new pseudo-instruction that is a duplicate of a key instruction. This could potentially allow the system to learn, for example, that AB is always followed by C, while AB' is always followed by D.
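A minimal sketch of the trigram-model mechanics described here: sample linear programs from learned 3-gram probabilities and shift the model towards the elite of a generation. The instruction set, scoring function and learning rate are placeholders for illustration, not the system from the talk.

```python
import random
from collections import defaultdict
from itertools import product

INSTRUCTIONS = ["A", "B", "C", "D"]          # illustrative instruction set

def sample_program(trigram_probs, length=10):
    """Sample a linear program instruction-by-instruction from the learned
    3-gram probabilities, falling back to uniform for unseen contexts."""
    prog = [random.choice(INSTRUCTIONS), random.choice(INSTRUCTIONS)]
    while len(prog) < length:
        context = (prog[-2], prog[-1])
        weights = [trigram_probs.get((*context, ins), 1.0) for ins in INSTRUCTIONS]
        prog.append(random.choices(INSTRUCTIONS, weights=weights)[0])
    return prog

def update_trigram_probs(trigram_probs, elite_programs, learning_rate=0.1):
    """EDA-style update: shift the trigram model towards the frequencies
    observed in the elite (best-scoring) programs of the current generation."""
    counts = defaultdict(float)
    for prog in elite_programs:
        for a, b, c in zip(prog, prog[1:], prog[2:]):
            counts[(a, b, c)] += 1.0
    for context in product(INSTRUCTIONS, repeat=2):
        total = sum(counts[(*context, ins)] for ins in INSTRUCTIONS)
        if total == 0:
            continue
        for ins in INSTRUCTIONS:
            old = trigram_probs.get((*context, ins), 1.0 / len(INSTRUCTIONS))
            target = counts[(*context, ins)] / total
            trigram_probs[(*context, ins)] = (1 - learning_rate) * old + learning_rate * target
    return trigram_probs

# One illustrative generation: favour programs containing many "C" instructions.
random.seed(0)
model = {}
population = [sample_program(model) for _ in range(100)]
elite = sorted(population, key=lambda p: p.count("C"), reverse=True)[:10]
model = update_trigram_probs(model, elite)
print(sample_program(model))
```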

Cite as

Nicholas Freitag McPhee and Riccardo Poli. N-gram GP: Early results and half-baked ideas. In Theory of Evolutionary Algorithms. Dagstuhl Seminar Proceedings, Volume 8051, pp. 1-3, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{mcphee_et_al:DagSemProc.08051.5,
  author =	{McPhee, Nicholas Freitag and Poli, Riccardo},
  title =	{{N-gram GP: Early results and half-baked ideas}},
  booktitle =	{Theory of Evolutionary Algorithms},
  pages =	{1--3},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{8051},
  editor =	{Dirk V. Arnold and Anne Auger and Jonathan E. Rowe and Carsten Witt},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08051.5},
  URN =		{urn:nbn:de:0030-drops-14838},
  doi =		{10.4230/DagSemProc.08051.5},
  annote =	{Keywords: Genetic programming, estimation of distribution algorithms, linear GP, machine learning}
}
Document
Runtime Analysis of Binary PSO

Authors: Dirk Sudholt and Carsten Witt

Published in: Dagstuhl Seminar Proceedings, Volume 8051, Theory of Evolutionary Algorithms (2008)


Abstract
We investigate the runtime of the Binary Particle Swarm Optimization (PSO) algorithm introduced by Kennedy and Eberhart (1997). The Binary PSO maintains a global best solution and a swarm of particles. Each particle consists of a current position, an own best position and a velocity vector used in a probabilistic process to update the particle's position. We present lower bounds for a broad class of implementations with swarms of polynomial size. To prove upper bounds, we transfer a fitness-level argument well established for evolutionary algorithms (EAs) to PSO. This method is then applied to estimate the expected runtime on the class of unimodal functions. A simple variant of the Binary PSO is considered in more detail. The 1-PSO only maintains one particle; hence, own best and global best solutions coincide. Despite its simplicity, the 1-PSO is surprisingly efficient. A detailed analysis for the function OneMax shows that the 1-PSO is competitive with EAs.
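For illustration, here is a simplified single-particle Binary PSO (own best and global best coincide) maximizing OneMax, roughly following the Kennedy and Eberhart velocity-plus-sigmoid update; the parameter values and velocity clamp are common illustrative choices, not the exact algorithm analysed in the paper.

```python
import math
import random

def one_pso_onemax(n, iters=20_000, c=2.0, v_max=4.0, seed=0):
    """A single-particle Binary PSO (own best = global best) maximizing OneMax.
    Velocities are pulled towards the best-so-far position and mapped through a
    sigmoid to per-bit probabilities of sampling a one."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]    # current position
    best = list(x)                               # own best == global best for one particle
    best_f = sum(best)                           # OneMax fitness
    v = [0.0] * n                                # velocity vector
    for t in range(1, iters + 1):
        for i in range(n):
            v[i] += c * rng.random() * (best[i] - x[i])      # attract towards best
            v[i] = max(-v_max, min(v_max, v[i]))             # clamp velocity
            prob_one = 1.0 / (1.0 + math.exp(-v[i]))         # sigmoid
            x[i] = 1 if rng.random() < prob_one else 0
        fx = sum(x)
        if fx >= best_f:
            best, best_f = list(x), fx
        if best_f == n:
            return t      # iterations until OneMax was optimised
    return None

print(one_pso_onemax(n=30))
```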

Cite as

Dirk Sudholt and Carsten Witt. Runtime Analysis of Binary PSO. In Theory of Evolutionary Algorithms. Dagstuhl Seminar Proceedings, Volume 8051, pp. 1-22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{sudholt_et_al:DagSemProc.08051.6,
  author =	{Sudholt, Dirk and Witt, Carsten},
  title =	{{Runtime Analysis of Binary PSO}},
  booktitle =	{Theory of Evolutionary Algorithms},
  pages =	{1--22},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{8051},
  editor =	{Dirk V. Arnold and Anne Auger and Jonathan E. Rowe and Carsten Witt},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08051.6},
  URN =		{urn:nbn:de:0030-drops-14800},
  doi =		{10.4230/DagSemProc.08051.6},
  annote =	{Keywords: Particle swarm optimization, runtime analysis}
}
Document
Runtime Analysis of a Simple Ant Colony Optimization Algorithm

Authors: Frank Neumann and Carsten Witt

Published in: Dagstuhl Seminar Proceedings, Volume 6061, Theory of Evolutionary Algorithms (2006)


Abstract
Ant Colony Optimization (ACO) has become quite popular in recent years. In contrast to its many successful applications, the theoretical foundation of this randomized search heuristic is rather weak. Building up such a theory is needed to understand how these heuristics work as well as to come up with better algorithms for certain problems. Up to now, only convergence results have been achieved, showing that optimal solutions can be obtained in a finite amount of time. We present the first runtime analysis of a simple ACO algorithm, transferring many rigorous results on the expected runtime of a simple evolutionary algorithm to our algorithm. In addition, we examine the choice of the evaporation factor, which is a crucial parameter in such an algorithm, in greater detail and analyze its effect with respect to the runtime.
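A short MMAS-flavoured sketch showing the role of the evaporation factor: pheromone values, bounded away from 0 and 1, are pulled towards the best-so-far solution at rate rho. This is in the spirit of the algorithms analysed here, not necessarily the exact algorithm of the paper; OneMax and the parameter values are illustrative.

```python
import random

def simple_aco_onemax(n, rho=0.2, iters=50_000, seed=0):
    """Simple ACO on OneMax: one pheromone value per bit, solutions sampled
    from the pheromones, and an evaporation-factor update (rate rho) towards
    the best-so-far solution, with pheromones clamped to [1/n, 1-1/n]."""
    rng = random.Random(seed)
    tau_min, tau_max = 1.0 / n, 1.0 - 1.0 / n       # pheromone borders
    tau = [0.5] * n                                  # initial pheromones
    best, best_f = None, -1
    for t in range(1, iters + 1):
        x = [1 if rng.random() < tau[i] else 0 for i in range(n)]   # construct solution
        fx = sum(x)                                                  # OneMax fitness
        if fx >= best_f:
            best, best_f = x, fx
        # evaporation: pheromones move towards the best-so-far solution at rate rho
        tau = [min(tau_max, max(tau_min, (1 - rho) * tau[i] + rho * best[i]))
               for i in range(n)]
        if best_f == n:
            return t
    return None

print(simple_aco_onemax(n=50))
```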

Cite as

Frank Neumann and Carsten Witt. Runtime Analysis of a Simple Ant Colony Optimization Algorithm. In Theory of Evolutionary Algorithms. Dagstuhl Seminar Proceedings, Volume 6061, pp. 1-17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


BibTeX

@InProceedings{neumann_et_al:DagSemProc.06061.8,
  author =	{Neumann, Frank and Witt, Carsten},
  title =	{{Runtime Analysis of a Simple Ant Colony Optimization Algorithm}},
  booktitle =	{Theory of Evolutionary Algorithms},
  pages =	{1--17},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{6061},
  editor =	{Dirk V. Arnold and Thomas Jansen and Michael D. Vose and Jonathan E. Rowe},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.06061.8},
  URN =		{urn:nbn:de:0030-drops-5928},
  doi =		{10.4230/DagSemProc.06061.8},
  annote =	{Keywords: Randomized Search Heuristics, Ant Colony Optimization, Runtime Analysis}
}
  • Refine by Author
  • 7 Witt, Carsten
  • 3 Auger, Anne
  • 2 Arnold, Dirk V.
  • 2 Rowe, Jonathan E.
  • 1 Cannings, Chris
  • Show More...

  • Refine by Classification
  • 1 Computing methodologies → Search methodologies
  • 1 Theory of computation → Design and analysis of algorithms

  • Refine by Keyword
  • 2 Evolutionary Algorithms
  • 2 Genetic Algorithms
  • 2 Randomized Search Heuristics
  • 2 Theory of Evolutionary Algorithms
  • 2 machine learning
  • Show More...

  • Refine by Type
  • 12 document

  • Refine by Publication Year
  • 6 2008
  • 3 2010
  • 1 2006
  • 1 2012
  • 1 2022
