We determine that the continuous-state branching processes for which the genealogy, suitably time-changed, can be described by an autonomous Markov process are precisely those arising from $\alpha$-stable branching mechanisms. The random ancestral partition is then a time-changed $\Lambda$-coalescent, where $\Lambda$ is the Beta distribution with parameters $2-\alpha$ and $\alpha$, and the time change is given by $Z^{1-\alpha}$, where $Z$ is the total population size. For $\alpha = 2$ (Feller's branching diffusion) and $\Lambda = \delta_0$ (Kingman's coalescent), this is in the spirit of (a non-spatial version of) Perkins' Disintegration Theorem. For $\alpha = 1$ and $\Lambda$ the uniform distribution on $[0,1]$, this is the duality discovered by Bertoin & Le Gall (2000) between the genealogy of Neveu's continuous-state branching process and the Bolthausen-Sznitman coalescent.
We present two approaches: one, exploiting the `modified lookdown construction', draws heavily on Donnelly & Kurtz (1999); the other is based on direct calculations with generators.
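As a schematic summary of this correspondence (our notation; the constant in front of the time change is suppressed): for $\alpha \in (0,2)$,
\[
\Lambda(dx) \;=\; \frac{x^{1-\alpha}(1-x)^{\alpha-1}}{\Gamma(2-\alpha)\,\Gamma(\alpha)}\,dx
\qquad\text{and}\qquad
R_t \;=\; \int_0^t Z_s^{\,1-\alpha}\,ds,
\]
so that, up to a constant depending on $\alpha$, the ancestral partition read on the clock $R$ evolves as a $\Lambda$-coalescent; for $\alpha=1$ the clock is trivial ($Z^{0}\equiv 1$), matching the Bolthausen-Sznitman case, while $\alpha=2$ corresponds to the degenerate limit $\Lambda=\delta_0$ mentioned above.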
From Brownian motion with a local time drift to Feller's branching diffusion with logistic growth
(2011)
We give a new proof of a Ray-Knight representation of Feller's branching diffusion with logistic growth in terms of the local times of a reflected Brownian motion H with a drift that is affine linear in the local time accumulated by H at its current level. In Le et al. (2011) such a representation was obtained by an approximation through Harris paths that code the genealogies of particle systems. The present proof is purely in terms of stochastic analysis, and is inspired by previous work of Norris, Rogers and Williams (1988).
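To fix ideas, here is one schematic form of the two objects involved (the constants $\theta, \gamma, \sigma, c_1, c_2$ are placeholders, not taken from the abstract): the logistic Feller diffusion solves
\[
dZ_t \;=\; (\theta Z_t - \gamma Z_t^{2})\,dt + \sigma\sqrt{Z_t}\,dW_t,
\]
while $H$ is a Brownian motion reflected at $0$ whose drift is affine linear in the local time $L_s^{H_s}(H)$ accumulated at its current level,
\[
dH_s \;=\; \bigl(c_1 - c_2\,L_s^{H_s}(H)\bigr)\,ds + dB_s,
\]
and the Ray-Knight representation identifies $Z$ with the profile of local times $x \mapsto L_{T}^{x}(H)$ for a suitable stopping time $T$.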
The objective of this paper is the study of the equilibrium behavior of a population on the hierarchical group $\Omega_N$, consisting of families of individuals undergoing critical branching random walk, where in addition these families themselves develop according to a critical branching process. Strong transience of the random walk guarantees existence of an equilibrium for this two-level branching system. In the limit $N\to\infty$ (called the hierarchical mean-field limit), the equilibrium aggregated populations in a nested sequence of balls $B^{(N)}_\ell$ of hierarchical radius $\ell$ converge to a backward Markov chain on $\mathbb{R}_+$. This limiting Markov chain can be explicitly represented in terms of a cascade of subordinators, which in turn makes possible a description of the genealogy of the population.
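For orientation, the hierarchical group is usually defined as follows (standard notation, not spelled out in the abstract):
\[
\Omega_N \;=\; \bigl\{\xi=(\xi_1,\xi_2,\dots)\,:\,\xi_i\in\{0,1,\dots,N-1\},\ \xi_i\neq 0 \text{ for only finitely many } i\bigr\},
\]
with coordinatewise addition modulo $N$ and hierarchical distance $d(\xi,\eta)=\max\{i:\xi_i\neq\eta_i\}$ (and $d(\xi,\xi)=0$); the ball $B^{(N)}_\ell$ around a point then consists of the $N^{\ell}$ points within distance $\ell$.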
For a class of Cannings models we prove Haldane's formula, $\pi(s_N) \sim 2s_N/\rho^2$, for the fixation probability of a single beneficial mutant in the limit of large population size $N$ and in the regime of moderately strong selection, i.e. for $s_N \sim N^{-b}$ and $0 < b < 1/2$. Here, $s_N$ is the selective advantage of an individual carrying the beneficial type, and $\rho^2$ is the (asymptotic) offspring variance. Our assumptions on the reproduction mechanism allow for a coupling of the beneficial allele's frequency process with slightly supercritical Galton–Watson processes in the early phase of fixation.
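The classical branching-process asymptotics behind such a coupling (a standard fact, stated here for orientation rather than as the paper's argument): for a Galton–Watson process with offspring mean $1+s_N$ and offspring variance tending to $\rho^2$, under suitable moment conditions the survival probability satisfies
\[
1-q_N \;\sim\; \frac{2 s_N}{\rho^2} \qquad (N\to\infty),
\]
where $q_N$ is the extinction probability; Haldane's formula identifies the fixation probability with this survival probability to leading order.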
We consider a catalytic branching random walk (the reactant) whose state space is a countable Abelian group. The branching is critical binary, and the local branching rate is given by a catalytic medium. Here the medium is itself an autonomous (ordinary) branching random walk (the catalyst), possibly with a different motion law. For a persistent catalyst (transient motion), the reactant shows the usual dichotomy of persistence versus extinction, depending on transience or recurrence of its motion. If the catalyst suffers local extinction, it turns out that the long-time behaviour of the reactant ranges (depending on its motion) from local extinction to a free random walk with either deterministic or random global intensity of particles.
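In the usual formulation of such models (our notation; the abstract fixes no symbols), a reactant particle at site $x$ at time $t$ branches at a rate proportional to the local catalyst mass,
\[
\text{branching rate at } (x,t) \;=\; \gamma\,\eta_t(x),
\]
where $\eta_t(x)$ is the number of catalyst particles at $x$ and $\gamma>0$ is a constant.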
It is possible to represent each of a number of Markov chains as an evolving sequence of connected subsets of a directed acyclic graph that grow in the following way: initially, all vertices of the graph are unoccupied, particles are fed in one-by-one at a distinguished source vertex, successive particles proceed along directed edges according to an appropriate stochastic mechanism, and each particle comes to rest once it encounters an unoccupied vertex. Examples include the binary and digital search tree processes, the random recursive tree process and generalizations of it arising from nested instances of Pitman's two-parameter Chinese restaurant process, tree-growth models associated with Mallows' ϕ model of random permutations and with Schützenberger's non-commutative q-binomial theorem, and a construction due to Luczak and Winkler that grows uniform random binary trees in a Markovian manner. We introduce a framework that encompasses such Markov chains, and we characterize their asymptotic behavior by analyzing in detail their Doob-Martin compactifications, Poisson boundaries and tail σ-fields.
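As a concrete illustration of this growth mechanism, here is a minimal Python sketch of one instance, the digital search tree process, in which each particle follows independent fair coin flips down the complete binary tree until it reaches an unoccupied vertex; the function name and the encoding of vertices as bit tuples are ours, chosen only for this sketch.

import random

def grow_digital_search_tree(n_particles, seed=None):
    """Grow a binary tree by the particle mechanism described above.

    Each particle enters at the distinguished source (the root, encoded as
    the empty tuple) and, while the current vertex is occupied, moves to the
    left or right child according to a fair coin flip; it comes to rest at
    the first unoccupied vertex, which it then occupies.
    """
    rng = random.Random(seed)
    occupied = set()
    for _ in range(n_particles):
        vertex = ()                                  # start at the source vertex
        while vertex in occupied:                    # keep moving while occupied
            vertex = vertex + (rng.randint(0, 1),)   # coin flip picks the child
        occupied.add(vertex)                         # particle comes to rest here
    return occupied

# Example: the occupied set after feeding in ten particles.
print(sorted(grow_digital_search_tree(10, seed=1), key=lambda v: (len(v), v)))

Replacing the fair coin flips by a comparison-driven routing rule yields the binary search tree process, and other routing mechanisms give further examples of the kind listed above.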