11 Search Results for "Auger, Anne"


Document
Challenges in Benchmarking Optimization Heuristics (Dagstuhl Seminar 23251)

Authors: Anne Auger, Peter A. N. Bosman, Pascal Kerschke, Darrell Whitley, and Lennart Schäpermeier

Published in: Dagstuhl Reports, Volume 13, Issue 6 (2024)


Abstract
This report documents the program and outcomes of Dagstuhl Seminar 23251 "Challenges in Benchmarking Optimization Heuristics". In the domain of optimization heuristics, a stable basis for fairly evaluating the performance of optimization algorithms and other solution approaches, commonly referred to as "benchmarking", is fundamental to ensuring steady scientific progress. Although many pitfalls are well known in the community, the development of sound benchmarking protocols is slow, and the adoption of community-wide recognized and implementable standards requires lasting, joint efforts among research groups. This seminar brought together people from diverse backgrounds and fostered discussions among different optimization communities, focusing on how to cope with "horse-racing papers", on landscape analysis techniques for understanding problem instances, and on the overarching goals of benchmarking.

Cite as

Anne Auger, Peter A. N. Bosman, Pascal Kerschke, Darrell Whitley, and Lennart Schäpermeier. Challenges in Benchmarking Optimization Heuristics (Dagstuhl Seminar 23251). In Dagstuhl Reports, Volume 13, Issue 6, pp. 55-80, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@Article{auger_et_al:DagRep.13.6.55,
  author =	{Auger, Anne and Bosman, Peter A. N. and Kerschke, Pascal and Whitley, Darrell and Sch\"{a}permeier, Lennart},
  title =	{{Challenges in Benchmarking Optimization Heuristics (Dagstuhl Seminar 23251)}},
  pages =	{55--80},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2024},
  volume =	{13},
  number =	{6},
  editor =	{Auger, Anne and Bosman, Peter A. N. and Kerschke, Pascal and Whitley, Darrell and Sch\"{a}permeier, Lennart},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.13.6.55},
  URN =		{urn:nbn:de:0030-drops-196383},
  doi =		{10.4230/DagRep.13.6.55},
  annote =	{Keywords: benchmarking, design of search heuristics, optimization, real-world applications, understanding problem complexity}
}
Document
Theory of Randomized Optimization Heuristics (Dagstuhl Seminar 22081)

Authors: Anne Auger, Carlos M. Fonseca, Tobias Friedrich, Johannes Lengler, and Armand Gissler

Published in: Dagstuhl Reports, Volume 12, Issue 2 (2022)


Abstract
This report documents the program and the outcomes of Dagstuhl Seminar 22081 "Theory of Randomized Optimization Heuristics", part of a biennial seminar series. This year, we focused on connections between classical topics of the community, such as Evolutionary Algorithms and Strategies (EA, ES), Estimation-of-Distribution Algorithms (EDA), and Evolutionary Multi-Objective Optimization (EMO), and related fields like Stochastic Gradient Descent (SGD) and Bayesian Optimization (BO). The mixture proved extremely successful: already the first talk turned into a two-hour-long, vivid, and productive plenary discussion. The seminar was smaller than previous editions (due to COVID-19 regulations), but its intensity more than made up for the smaller size.

Cite as

Anne Auger, Carlos M. Fonseca, Tobias Friedrich, Johannes Lengler, and Armand Gissler. Theory of Randomized Optimization Heuristics (Dagstuhl Seminar 22081). In Dagstuhl Reports, Volume 12, Issue 2, pp. 87-102, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@Article{auger_et_al:DagRep.12.2.87,
  author =	{Auger, Anne and Fonseca, Carlos M. and Friedrich, Tobias and Lengler, Johannes and Gissler, Armand},
  title =	{{Theory of Randomized Optimization Heuristics (Dagstuhl Seminar 22081)}},
  pages =	{87--102},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2022},
  volume =	{12},
  number =	{2},
  editor =	{Auger, Anne and Fonseca, Carlos M. and Friedrich, Tobias and Lengler, Johannes and Gissler, Armand},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.12.2.87},
  URN =		{urn:nbn:de:0030-drops-169325},
  doi =		{10.4230/DagRep.12.2.87},
  annote =	{Keywords: black-box optimization, derivative-free optimization, evolutionary and genetic algorithms, randomized search algorithms, stochastic gradient descent, theoretical computer science}
}
Document
10361 Abstracts Collection and Executive Summary – Theory of Evolutionary Algorithms

Authors: Anne Auger, Jonathan L. Shapiro, L. Darrell Whitley, and Carsten Witt

Published in: Dagstuhl Seminar Proceedings, Volume 10361, Theory of Evolutionary Algorithms (2010)


Abstract
From September 5 to 10, 2010, Dagstuhl Seminar 10361 "Theory of Evolutionary Algorithms" was held at Schloss Dagstuhl – Leibniz Center for Informatics. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar, as well as abstracts of seminar results and ideas, are put together in this paper. The first section describes the seminar topics and goals in general.

Cite as

Anne Auger, Jonathan L. Shapiro, L. Darrell Whitley, and Carsten Witt. 10361 Abstracts Collection and Executive Summary – Theory of Evolutionary Algorithms. In Theory of Evolutionary Algorithms. Dagstuhl Seminar Proceedings, Volume 10361, pp. 1-19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{auger_et_al:DagSemProc.10361.1,
  author =	{Auger, Anne and Shapiro, Jonathan L. and Whitley, L. Darrell and Witt, Carsten},
  title =	{{10361 Abstracts Collection and Executive Summary – Theory of Evolutionary Algorithms}},
  booktitle =	{Theory of Evolutionary Algorithms},
  pages =	{1--19},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2010},
  volume =	{10361},
  editor =	{Anne Auger and Jonathan L. Shapiro and L. Darrell Whitley and Carsten Witt},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.10361.1},
  URN =		{urn:nbn:de:0030-drops-28180},
  doi =		{10.4230/DagSemProc.10361.1},
  annote =	{Keywords: Evolutionary algorithms, bio-inspired search heuristics, theoretical analysis, optimization time}
}
Document
2-bit Flip Mutation Elementary Fitness Landscapes

Authors: William Langdon

Published in: Dagstuhl Seminar Proceedings, Volume 10361, Theory of Evolutionary Algorithms (2010)


Abstract
Genetic Programming parity is not elementary: GP parity cannot be represented as the sum of a small number of elementary landscapes. Statistics of Parity's fitness landscape, including fitness distance correlation, are calculated. Using Walsh analysis, the eigenvalues and eigenvectors of the Laplacian of the two-bit-flip fitness landscape are given, and a ruggedness measure for elementary landscapes is proposed. An elementary needle-in-a-haystack (NIH) landscape is given.
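
To make the elementary-landscape notion used above concrete, here is a minimal Python sketch (my own illustration, not code from the paper; the helper names two_bit_flip_laplacian and is_elementary are invented). It builds the Laplacian of the two-bit-flip neighbourhood graph on n-bit strings and checks whether a fitness vector, OneMax in this example, is an eigenvector of that Laplacian, i.e. an elementary landscape under this move operator.

from itertools import product, combinations
import numpy as np

def two_bit_flip_laplacian(n):
    # Neighbourhood graph on {0,1}^n whose edges connect strings at Hamming distance 2.
    states = list(product([0, 1], repeat=n))
    index = {s: i for i, s in enumerate(states)}
    A = np.zeros((len(states), len(states)))
    for s in states:
        for i, j in combinations(range(n), 2):       # flip exactly two bits
            t = list(s); t[i] ^= 1; t[j] ^= 1
            A[index[s], index[tuple(t)]] = 1.0
    D = np.diag(A.sum(axis=1))
    return D - A, states

def is_elementary(f_values, L, tol=1e-9):
    # f is elementary iff f - mean(f) is an eigenvector of the Laplacian L.
    g = f_values - f_values.mean()
    Lg = L @ g
    nz = np.abs(g) > tol
    if not nz.any():
        return True, 0.0
    if not np.allclose(Lg[~nz], 0.0, atol=1e-6):
        return False, None
    lam = Lg[nz] / g[nz]
    return bool(np.allclose(lam, lam[0], atol=1e-6)), float(lam[0])

n = 4
L, states = two_bit_flip_laplacian(n)
onemax = np.array([sum(s) for s in states], dtype=float)
print(is_elementary(onemax, L))   # OneMax passes this toy check, with eigenvalue 2*(n-1)

The paper's Walsh analysis carries out the analogous computation analytically for Parity rather than by brute force.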

Cite as

William Langdon. 2-bit Flip Mutation Elementary Fitness Landscapes. In Theory of Evolutionary Algorithms. Dagstuhl Seminar Proceedings, Volume 10361, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{langdon:DagSemProc.10361.2,
  author =	{Langdon, William},
  title =	{{2-bit Flip Mutation Elementary Fitness Landscapes}},
  booktitle =	{Theory of Evolutionary Algorithms},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2010},
  volume =	{10361},
  editor =	{Anne Auger and Jonathan L. Shapiro and L. Darrell Whitley and Carsten Witt},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.10361.2},
  URN =		{urn:nbn:de:0030-drops-28146},
  doi =		{10.4230/DagSemProc.10361.2},
  annote =	{Keywords: Genetic Algorithms, Genetic Programming, search, optimisation, graph theory, Laplacian, Hamming cube}
}
Document
Exploring the common concepts of adaptive MCMC and Covariance Matrix Adaptation schemes

Authors: Christian Lorenz Mueller

Published in: Dagstuhl Seminar Proceedings, Volume 10361, Theory of Evolutionary Algorithms (2010)


Abstract
In the field of scientific modeling, one is often confronted with the task of drawing samples from a probability distribution that is only known up to a normalizing constant and for which no direct analytical method for sample generation is available. Over the past decade, adaptive Markov Chain Monte Carlo (MCMC) methods have gained considerable attention in the statistics community as a way to tackle this black-box (or indirect) sampling scenario. Common application domains are Bayesian statistics and statistical physics. Adaptive MCMC methods try to learn an optimal proposal distribution from previously accepted samples in order to efficiently explore the target distribution. Variable-metric approaches in black-box optimization, such as the Evolution Strategy with covariance matrix adaptation (CMA-ES) and Gaussian Adaptation (GaA), use almost identical ideas to locate putative global optima. This extended abstract summarizes the common concepts in adaptive MCMC and covariance matrix adaptation schemes. We also present how both types of methods can be unified within the Gaussian Adaptation framework and propose a unification of both fields as a “grand challenge” for future research.
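
As a concrete illustration of the shared mechanism described above, the following sketch (my own, assuming a Haario-style adaptive Metropolis scheme rather than any specific method from the abstract) learns the Gaussian proposal covariance from the chain's own history, which is the MCMC counterpart of the covariance adaptation performed by CMA-ES and GaA.

import numpy as np

def adaptive_metropolis(log_target, x0, n_steps=5000, adapt_after=500, rng=None):
    # Metropolis sampler whose Gaussian proposal covariance is re-estimated from the
    # samples collected so far (Haario-style adaptation).
    rng = rng or np.random.default_rng(0)
    d = len(x0)
    x, lp = np.array(x0, dtype=float), log_target(x0)
    samples = [x.copy()]
    cov = np.eye(d)                                   # initial proposal covariance
    for t in range(n_steps):
        if t >= adapt_after:                          # adapt from the history so far
            cov = np.cov(np.array(samples).T) + 1e-6 * np.eye(d)
        prop = rng.multivariate_normal(x, (2.38**2 / d) * cov)
        lp_prop = log_target(prop)
        if np.log(rng.random()) < lp_prop - lp:       # Metropolis accept/reject
            x, lp = prop, lp_prop
        samples.append(x.copy())
    return np.array(samples)

# Example target: a correlated 2-D Gaussian, known only up to a normalizing constant.
Sigma_inv = np.linalg.inv(np.array([[1.0, 0.9], [0.9, 1.0]]))
log_target = lambda z: -0.5 * np.asarray(z, dtype=float) @ Sigma_inv @ np.asarray(z, dtype=float)
chain = adaptive_metropolis(log_target, x0=[3.0, -3.0])
print(chain.mean(axis=0))
print(np.cov(chain.T))                                # should approach the target covariance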

Cite as

Christian Lorenz Mueller. Exploring the common concepts of adaptive MCMC and Covariance Matrix Adaptation schemes. In Theory of Evolutionary Algorithms. Dagstuhl Seminar Proceedings, Volume 10361, pp. 1-10, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{mueller:DagSemProc.10361.3,
  author =	{Mueller, Christian Lorenz},
  title =	{{Exploring the common concepts of adaptive MCMC and Covariance Matrix Adaptation schemes}},
  booktitle =	{Theory of Evolutionary Algorithms},
  pages =	{1--10},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2010},
  volume =	{10361},
  editor =	{Anne Auger and Jonathan L. Shapiro and L. Darrell Whitley and Carsten Witt},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.10361.3},
  URN =		{urn:nbn:de:0030-drops-28135},
  doi =		{10.4230/DagSemProc.10361.3},
  annote =	{Keywords: Adaptive MCMC, Gaussian Adaptation, CMA-ES, covariance matrix adaptation}
}
Document
08051 Abstracts Collection – Theory of Evolutionary Algorithms

Authors: Dirk V. Arnold, Anne Auger, Carsten Witt, and Jonathan E. Rowe

Published in: Dagstuhl Seminar Proceedings, Volume 8051, Theory of Evolutionary Algorithms (2008)


Abstract
From Jan. 27, 2008 to Feb. 1, 2008, the Dagstuhl Seminar 08051 "Theory of Evolutionary Algorithms" was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided, if available.

Cite as

Dirk V. Arnold, Anne Auger, Carsten Witt, and Jonathan E. Rowe. 08051 Abstracts Collection – Theory of Evolutionary Algorithms. In Theory of Evolutionary Algorithms. Dagstuhl Seminar Proceedings, Volume 8051, pp. 1-15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{arnold_et_al:DagSemProc.08051.1,
  author =	{Arnold, Dirk V. and Auger, Anne and Witt, Carsten and Rowe, Jonathan E.},
  title =	{{08051 Abstracts Collection – Theory of Evolutionary Algorithms}},
  booktitle =	{Theory of Evolutionary Algorithms},
  pages =	{1--15},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{8051},
  editor =	{Dirk V. Arnold and Anne Auger and Jonathan E. Rowe and Carsten Witt},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08051.1},
  URN =		{urn:nbn:de:0030-drops-15242},
  doi =		{10.4230/DagSemProc.08051.1},
  annote =	{Keywords: Evolutionary Computation, Theory of Evolutionary Algorithms}
}
Document
08051 Executive Summary – Theory of Evolutionary Algorithms

Authors: Dirk V. Arnold, Anne Auger, Jonathan E. Rowe, and Carsten Witt

Published in: Dagstuhl Seminar Proceedings, Volume 8051, Theory of Evolutionary Algorithms (2008)


Abstract
The 2008 Dagstuhl Seminar "Theory of Evolutionary Algorithms" was the fifth in a firmly established series of biennial events. In the week from Jan. 27, 2008 to Feb. 1, 2008, 47 researchers from nine countries discussed their recent work and trends in evolutionary computation.

Cite as

Dirk V. Arnold, Anne Auger, Jonathan E. Rowe, and Carsten Witt. 08051 Executive Summary – Theory of Evolutionary Algorithms. In Theory of Evolutionary Algorithms. Dagstuhl Seminar Proceedings, Volume 8051, pp. 1-5, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{arnold_et_al:DagSemProc.08051.2,
  author =	{Arnold, Dirk V. and Auger, Anne and Rowe, Jonathan E. and Witt, Carsten},
  title =	{{08051 Executive Summary – Theory of Evolutionary Algorithms}},
  booktitle =	{Theory of Evolutionary Algorithms},
  pages =	{1--5},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{8051},
  editor =	{Dirk V. Arnold and Anne Auger and Jonathan E. Rowe and Carsten Witt},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08051.2},
  URN =		{urn:nbn:de:0030-drops-14812},
  doi =		{10.4230/DagSemProc.08051.2},
  annote =	{Keywords: Evolutionary Algorithms, Theory of Evolutionary Algorithms}
}
Document
A Comparison of GAs Penalizing Infeasible Solutions and Repairing Infeasible Solutions on the 0-1 Knapsack Problem

Authors: Jun He, Yuren Zhou, and Xin Yao

Published in: Dagstuhl Seminar Proceedings, Volume 8051, Theory of Evolutionary Algorithms (2008)


Abstract
Constraints exist in almost every optimization problem. Different constraint handling techniques have been incorporated with genetic algorithms (GAs); however, most current studies are based on computer experiments. An example is Michalewicz's comparison among GAs using different constraint handling techniques on the 0-1 knapsack problem. The following phenomena are observed in experiments: 1) the penalty method needs more generations than the repair method to find a feasible solution to the restrictive capacity knapsack; 2) the penalty method can find better solutions to the average capacity knapsack. Such observations need a theoretical explanation. This paper aims at providing a theoretical analysis of Michalewicz's experiments. The main result of the paper is that GAs using the repair method are more efficient than GAs using the penalty method on both restrictive capacity and average capacity knapsack problems. This result for the average capacity case differs slightly from Michalewicz's experimental results, so a supplementary experiment is implemented to support the theoretical claim. The results confirm the general principle pointed out by Coello: a better constraint-handling approach should tend to exploit specific domain knowledge.
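
For readers unfamiliar with the two constraint-handling schemes being compared, the sketch below (my own toy illustration, not the paper's experimental setup) shows a penalty fitness and a greedy repair operator for the 0-1 knapsack with profits p, weights w, and capacity C.

import numpy as np

def penalty_fitness(x, p, w, C, rho):
    # Penalty method: excess weight is subtracted from the profit, scaled by rho.
    return p @ x - rho * max(0, w @ x - C)

def repair(x, p, w, C):
    # Repair method: greedily drop the item with the worst profit/weight ratio
    # until the knapsack constraint is satisfied.
    x = x.copy()
    while w @ x > C:
        chosen = np.flatnonzero(x)
        worst = chosen[np.argmin(p[chosen] / w[chosen])]
        x[worst] = 0
    return x

rng = np.random.default_rng(1)
p, w, C = rng.integers(1, 100, 20), rng.integers(1, 100, 20), 300
x = rng.integers(0, 2, 20)                    # a random, possibly infeasible bit string
print(penalty_fitness(x, p, w, C, rho=10.0))  # fitness as seen by the penalty GA
print(p @ repair(x, p, w, C))                 # profit of the repaired (feasible) solution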

Cite as

Jun He, Yuren Zhou, and Xin Yao. A Comparison of GAs Penalizing Infeasible Solutions and Repairing Infeasible Solutions on the 0-1 Knapsack Problem. In Theory of Evolutionary Algorithms. Dagstuhl Seminar Proceedings, Volume 8051, pp. 1-39, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{he_et_al:DagSemProc.08051.3,
  author =	{He, Jun and Zhou, Yuren and Yao, Xin},
  title =	{{A Comparison of GAs Penalizing Infeasible Solutions and Repairing Infeasible Solutions on the 0-1 Knapsack Problem}},
  booktitle =	{Theory of Evolutionary Algorithms},
  pages =	{1--39},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{8051},
  editor =	{Dirk V. Arnold and Anne Auger and Jonathan E. Rowe and Carsten Witt},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08051.3},
  URN =		{urn:nbn:de:0030-drops-14822},
  doi =		{10.4230/DagSemProc.08051.3},
  annote =	{Keywords: Genetic Algorithms, Constrained Optimization, Knapsack Problem, Computation Time, Performance Analysis}
}
Document
Evaluating Stationary Distribution of the Binary GA Markov Chain in Special Cases

Authors: Boris S. Mitavskiy and Chris Cannings

Published in: Dagstuhl Seminar Proceedings, Volume 8051, Theory of Evolutionary Algorithms (2008)


Abstract
The stochastic process underlying an evolutionary algorithm is well known to be Markovian, and such Markov chains have been investigated in much of the theoretical evolutionary computing research. When the mutation rate is positive, the Markov chain modeling an evolutionary algorithm is irreducible and therefore has a unique stationary distribution, yet rather little is known about this stationary distribution. On the other hand, knowing the stationary distribution may provide information about the expected times to hit the optimum, allow an assessment of the biases due to recombination, and is of importance in population genetics for assessing what is called a "genetic load" (see the introduction for more details). In this talk I will show how the quotient construction method can be exploited to derive rather explicit bounds on the ratios of the stationary distribution values of various subsets of the state space. In fact, some of the bounds obtained in the current work are expressed in terms of the parameters involved in all three main stages of an evolutionary algorithm, namely selection, recombination, and mutation. I will also discuss the newest developments, which may allow for further improvements of the bounds.
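
The following sketch (my own toy model, not the quotient construction from the talk) illustrates the basic fact the abstract starts from: for a small non-elitist EA with a positive mutation rate, the exact transition matrix can be written down, the chain is irreducible, and its unique stationary distribution can be computed numerically.

from itertools import product
import numpy as np

n, pm = 3, 0.2                                 # string length and per-bit mutation rate
states = list(product([0, 1], repeat=n))
fit = lambda s: sum(s) + 1.0                   # OneMax, shifted so all fitness values are positive

def mut_prob(x, y):
    # Probability that independent bitwise mutation turns x into y.
    d = sum(a != b for a, b in zip(x, y))
    return pm**d * (1 - pm)**(n - d)

# Transition matrix: mutate the current individual, then keep parent or offspring
# with probability proportional to fitness (a simple non-elitist selection step).
P = np.zeros((len(states), len(states)))
for i, x in enumerate(states):
    for j, y in enumerate(states):
        if i != j:
            P[i, j] = mut_prob(x, y) * fit(y) / (fit(x) + fit(y))
    P[i, i] = 1.0 - P[i].sum()                 # offspring identical or rejected

# Unique stationary distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
pi /= pi.sum()
print(dict(zip(states, np.round(pi, 4))))      # mass concentrates on high-fitness states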

Cite as

Boris S. Mitavskiy and Chris Cannings. Evaluating Stationary Distribution of the Binary GA Markov Chain in Special Cases. In Theory of Evolutionary Algorithms. Dagstuhl Seminar Proceedings, Volume 8051, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{mitavskiy_et_al:DagSemProc.08051.4,
  author =	{Mitavskiy, Boris S. and Cannings, Chris},
  title =	{{Evaluating Stationary Distribution of the Binary GA Markov Chain in Special Cases}},
  booktitle =	{Theory of Evolutionary Algorithms},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{8051},
  editor =	{Dirk V. Arnold and Anne Auger and Jonathan E. Rowe and Carsten Witt},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08051.4},
  URN =		{urn:nbn:de:0030-drops-14845},
  doi =		{10.4230/DagSemProc.08051.4},
  annote =	{Keywords: Genetic algorithms, Markov chains, stationary distribution, lumping quotient}
}
Document
N-gram GP: Early results and half-baked ideas

Authors: Nicholas Freitag McPhee and Riccardo Poli

Published in: Dagstuhl Seminar Proceedings, Volume 8051, Theory of Evolutionary Algorithms (2008)


Abstract
In this talk I present N-gram GP, a system for evolving linear GP programs using an EDA-style system to update the probabilities of different 3-grams (triplets) of instructions. I then pick apart some of the evolved programs in an effort to better understand the properties of this approach and identify ways in which it might be extended. Doing so reveals that there are frequently cases where the system needs two triplets of the form ABC and ABD to solve the problem, but can only choose between them probabilistically in the EDA phase. I present the entirely untested idea of creating a new pseudo-instruction that is a duplicate of a key instruction. This could potentially allow the system to learn, for example, that AB is always followed by C, while AB' is always followed by D.
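
A minimal sketch of the underlying idea (my own reading, not the authors' system; the instruction set and helper names are invented) is shown below: count instruction triplets in a set of selected programs, then sample new linear programs from the learned conditional probabilities P(c | a, b).

import random
from collections import defaultdict

INSTRUCTIONS = list("ABCD")                      # hypothetical instruction set

def learn_trigrams(programs):
    # Count instruction triplets (a, b, c) occurring in the selected programs.
    counts = defaultdict(lambda: defaultdict(int))
    for prog in programs:
        for a, b, c in zip(prog, prog[1:], prog[2:]):
            counts[(a, b)][c] += 1
    return counts

def sample_program(counts, length=10, rng=random.Random(0)):
    # Grow a linear program by sampling each instruction from P(c | previous two).
    prog = [rng.choice(INSTRUCTIONS), rng.choice(INSTRUCTIONS)]
    while len(prog) < length:
        nxt = counts.get(tuple(prog[-2:]))
        if nxt:                                   # sample proportionally to the counts
            prog.append(rng.choices(list(nxt), weights=list(nxt.values()))[0])
        else:                                     # unseen context: fall back to uniform
            prog.append(rng.choice(INSTRUCTIONS))
    return "".join(prog)

elite = ["ABCABC", "ABCDAB", "DABCAB"]            # stand-ins for the better programs
model = learn_trigrams(elite)
print(sample_program(model))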

Cite as

Nicholas Freitag McPhee and Riccardo Poli. N-gram GP: Early results and half-baked ideas. In Theory of Evolutionary Algorithms. Dagstuhl Seminar Proceedings, Volume 8051, pp. 1-3, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{mcphee_et_al:DagSemProc.08051.5,
  author =	{McPhee, Nicholas Freitag and Poli, Riccardo},
  title =	{{N-gram GP: Early results and half-baked ideas}},
  booktitle =	{Theory of Evolutionary Algorithms},
  pages =	{1--3},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{8051},
  editor =	{Dirk V. Arnold and Anne Auger and Jonathan E. Rowe and Carsten Witt},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08051.5},
  URN =		{urn:nbn:de:0030-drops-14838},
  doi =		{10.4230/DagSemProc.08051.5},
  annote =	{Keywords: Genetic programming, estimation of distribution algorithms, linear GP, machine learning}
}
Document
Runtime Analysis of Binary PSO

Authors: Dirk Sudholt and Carsten Witt

Published in: Dagstuhl Seminar Proceedings, Volume 8051, Theory of Evolutionary Algorithms (2008)


Abstract
We investigate the runtime of the Binary Particle Swarm Optimization (PSO) algorithm introduced by Kennedy and Eberhart (1997). The Binary PSO maintains a global best solution and a swarm of particles. Each particle consists of a current position, an own best position and a velocity vector used in a probabilistic process to update the particle's position. We present lower bounds for a broad class of implementations with swarms of polynomial size. To prove upper bounds, we transfer a fitness-level argument well-established for evolutionary algorithms (EAs) to PSO. This method is then applied to estimate the expected runtime on the class of unimodal functions. A simple variant of the Binary PSO is considered in more detail. The 1-PSO only maintains one particle, hence own best and global best solutions coincide. Despite its simplicity, the 1-PSO is surprisingly efficient. A detailed analysis for the function OneMax shows that the 1-PSO is competitive with EAs.
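
The sketch below (my own illustration of the Kennedy-Eberhart-style update as summarised in the abstract, not the paper's exact parametrisation) implements the 1-PSO on OneMax: a single particle whose own best and global best coincide, with a velocity vector that drives probabilistic bit updates through a sigmoid.

import numpy as np

def one_pso_onemax(n=30, steps=2000, c=2.0, vmax=4.0, rng=None):
    # 1-PSO: a single particle, so its own best and the global best coincide.
    rng = rng or np.random.default_rng(0)
    sigmoid = lambda v: 1.0 / (1.0 + np.exp(-v))
    v = np.zeros(n)                                   # velocity vector
    x = rng.integers(0, 2, n)                         # current position (bit string)
    best, best_f = x.copy(), x.sum()
    for _ in range(steps):
        v = np.clip(v + c * rng.random(n) * (best - x), -vmax, vmax)
        x = (rng.random(n) < sigmoid(v)).astype(int)  # probabilistic bit update
        if x.sum() > best_f:
            best, best_f = x.copy(), x.sum()
    return best_f

print(one_pso_onemax())   # number of one-bits in the best position found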

Cite as

Dirk Sudholt and Carsten Witt. Runtime Analysis of Binary PSO. In Theory of Evolutionary Algorithms. Dagstuhl Seminar Proceedings, Volume 8051, pp. 1-22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{sudholt_et_al:DagSemProc.08051.6,
  author =	{Sudholt, Dirk and Witt, Carsten},
  title =	{{Runtime Analysis of Binary PSO}},
  booktitle =	{Theory of Evolutionary Algorithms},
  pages =	{1--22},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{8051},
  editor =	{Dirk V. Arnold and Anne Auger and Jonathan E. Rowe and Carsten Witt},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08051.6},
  URN =		{urn:nbn:de:0030-drops-14800},
  doi =		{10.4230/DagSemProc.08051.6},
  annote =	{Keywords: Particle swarm optimization, runtime analysis}
}
  • Refine by Author
  • 5 Auger, Anne
  • 4 Witt, Carsten
  • 2 Arnold, Dirk V.
  • 2 Rowe, Jonathan E.
  • 1 Bosman, Peter A. N.

  • Refine by Classification
  • 1 Computing methodologies → Search methodologies
  • 1 General and reference → Empirical studies
  • 1 Theory of computation → Bio-inspired optimization
  • 1 Theory of computation → Evolutionary algorithms
  • 1 Theory of computation → Theory of randomized search heuristics

  • Refine by Keyword
  • 2 Genetic Algorithms
  • 2 Theory of Evolutionary Algorithms
  • 1 Adaptive MCMC
  • 1 CMA-ES
  • 1 Computation Time

  • Refine by Type
  • 11 document

  • Refine by Publication Year
  • 6 2008
  • 3 2010
  • 1 2022
  • 1 2024
