University Publications
In this article we use techniques from tropical and logarithmic geometry to construct a non-Archimedean analogue of Teichmüller space T̄g whose points are pairs consisting of a stable projective curve over a non-Archimedean field and a Teichmüller marking of the topological fundamental group of its Berkovich analytification. This construction is closely related to and inspired by the classical construction of a non-Archimedean Schottky space for Mumford curves by Gerritzen and Herrlich. We argue that the skeleton of non-Archimedean Teichmüller space is precisely the tropical Teichmüller space introduced by Chan–Melo–Viviani as a simplicial completion of Culler–Vogtmann Outer space. As a consequence, Outer space turns out to be a strong deformation retract of the locus of smooth Mumford curves in T̄g.
Poster presentation from the Twentieth Annual Computational Neuroscience Meeting: CNS*2011, Stockholm, Sweden, 23–28 July 2011. In statistical spike train analysis, stochastic point process models usually assume stationarity, in particular that the underlying spike train shows a constant firing rate (e.g. [1]). However, such models can lead to misinterpretation of the associated tests if the assumption of rate stationarity is not met (e.g. [2]). The analysis of nonstationary data therefore requires that rate changes be located as precisely as possible. Present statistical methods, however, focus on rejecting the null hypothesis of stationarity without explicitly locating the change point(s) (e.g. [3]). We propose a test for stationarity of a given spike train that can also be used to estimate the change points in the firing rate. Assuming a Poisson process with piecewise constant firing rate, we propose a Step-Filter-Test (SFT) which can work simultaneously on different time scales, accounting for the high variety of firing patterns in experimental spike trains. Formally, we compare the numbers N1 = N1(t,h) and N2 = N2(t,h) of spikes in the time intervals (t−h, t] and (t, t+h]. By varying t within a fine time lattice and simultaneously varying the interval length h, we obtain a multivariate statistic D(h,t) := (N1 − N2)/√(N1 + N2), for which we prove asymptotic multivariate normality under homogeneity. From this a practical, graphical device to spot changes in the firing rate is constructed. Our graphical representation of D(h,t) (Figure 1A) visualizes the changes in the firing rate. For the statistical test, a threshold K is chosen such that under homogeneity, |D(h,t)| < K holds for all investigated h and t with probability 0.95. This threshold can indicate potential change points in order to estimate the inhomogeneous rate profile (Figure 1B). The SFT is applied to a sample data set of spontaneous single-unit activity recorded from the substantia nigra of anesthetized mice.
In this data set, multiple rate changes are identified which agree closely with visual inspection. In contrast to approaches that choose one fixed kernel width [4], our method benefits from the flexibility of varying h.
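As an illustrative sketch (not the authors' implementation), the statistic D(h, t) can be computed over a grid of times and window widths with NumPy; the function name `step_filter_stat` and the toy spike train below are hypothetical:

```python
import numpy as np

def step_filter_stat(spikes, t, h):
    """Step-Filter statistic D(h, t) = (N1 - N2) / sqrt(N1 + N2),
    where N1 counts spikes in (t - h, t] and N2 counts spikes in
    (t, t + h].  `spikes` must be a sorted 1-D array of spike times."""
    spikes = np.asarray(spikes)
    n1 = np.searchsorted(spikes, t, side="right") - np.searchsorted(spikes, t - h, side="right")
    n2 = np.searchsorted(spikes, t + h, side="right") - np.searchsorted(spikes, t, side="right")
    total = n1 + n2
    return (n1 - n2) / np.sqrt(total) if total > 0 else 0.0

# Toy spike train: high rate (20 spikes/unit) before t = 5, low rate after.
dense = np.arange(1, 101) * 0.05    # 100 spikes in (0, 5]
sparse = np.arange(11, 21) * 0.5    # 10 spikes in (5, 10]
spikes = np.concatenate([dense, sparse])

# Near the true change point the statistic is large:
# D(2, 5) = (40 - 4)/sqrt(44) ~ 5.4, far above a 1.96-style threshold,
# while D stays near 0 inside the homogeneous segments.
d_change = step_filter_stat(spikes, 5.0, 2.0)
d_flat = step_filter_stat(spikes, 2.5, 1.0)
```

In practice D(h, t) would be evaluated on a fine lattice of t for several widths h simultaneously, with the threshold K calibrated to the 0.95 simultaneous level described above.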
We consider versions of the FIND algorithm where the pivot element used is the median of a subset chosen uniformly at random from the data. For the median selection we assume that subsamples of size asymptotic to c·n^α are chosen, where 0 < α ≤ 1/2, c > 0 and n is the size of the data set to be split. We consider the complexity of FIND as a process in the rank to be selected and measured by the number of key comparisons required. After normalization we show weak convergence of the complexity to a centered Gaussian process as n → ∞, which depends on α. The proof relies on a contraction argument for probability distributions on càdlàg functions. We also identify the covariance function of the Gaussian limit process and discuss path and tail properties.
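A minimal sketch of the algorithm variant under study, assuming distinct keys; the function name `find` and the comparison-counting convention (each non-pivot key is compared once to the pivot per round) are illustrative choices, not taken from the paper:

```python
import random

def find(data, k, c=1.0, alpha=0.5):
    """Return the k-th smallest element (0-indexed) of `data`, choosing
    each pivot as the median of a random subsample of size about
    c * n**alpha.  Assumes distinct keys.  Also returns the number of
    key comparisons made."""
    data = list(data)
    comparisons = 0
    while len(data) > 1:
        n = len(data)
        s = max(1, min(n, round(c * n ** alpha)))
        if s % 2 == 0:                 # odd sample size -> unique median
            s -= 1
        sample = random.sample(data, s)
        pivot = sorted(sample)[s // 2]
        smaller = [x for x in data if x < pivot]
        larger = [x for x in data if x > pivot]
        comparisons += n - 1           # each non-pivot key vs. the pivot
        if k < len(smaller):
            data = smaller
        elif k == len(smaller):
            return pivot, comparisons
        else:
            k -= len(smaller) + 1
            data = larger
    return data[0], comparisons
```

The comparison count returned by such a run, viewed as a function of the rank k and normalized, is the process whose Gaussian limit the abstract describes.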
Bipartite graphs occur in many parts of mathematics, and their embeddings into orientable compact surfaces are an old subject. A new interest comes from the fact that these embeddings give dessins d’enfants providing the surface with a unique structure as a Riemann surface and algebraic curve. In this paper, we study the (surprisingly many different) dessins coming from the graphs of finite cyclic projective planes. It turns out that all reasonable questions about these dessins — uniformity, regularity, automorphism groups, cartographic groups, defining equations of the algebraic curves, their fields of definition, Galois actions — depend on cyclic orderings of difference sets for the projective planes. We explain the interplay between number-theoretic problems concerning these cyclically ordered difference sets and topological properties of the dessin, such as the Wada property that every vertex lies on the border of every cell.
In this paper, a translation of the visual description technique HyCharts into Hybrid Data-Flow Graphs (HDFGs) is given. While HyCharts combine a data-flow-oriented and a control-flow-oriented formalism for specifying the architecture and the behavior of hybrid systems, HDFGs allow an efficient and homogeneous internal representation of hybrid systems in computers and their automatic manipulation. HDFGs represent a system as a data-flow network built from a set of fundamental functions.
The translation makes it possible to combine the advantages of the two description techniques: using HyCharts for specification supports the abstract, formal, and interactive specification of hybrid systems, while HDFGs permit the tool-based optimization of hybrid systems and the synthesis of mixed-signal prototypes.
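The HDFG formalism itself is defined in the paper; purely as a generic illustration of the underlying idea — a system represented as a data-flow network of fundamental function nodes — here is a hypothetical sketch (all names invented, not the actual HDFG structure):

```python
# Hypothetical, minimal data-flow network: nodes are fundamental
# functions wired together by named signals; evaluation propagates
# values through the network in topological order.

class Node:
    def __init__(self, func, inputs, output):
        self.func, self.inputs, self.output = func, inputs, output

def evaluate(nodes, env):
    """Evaluate a list of nodes given input signal values in `env`.
    Nodes are assumed to be listed in topological order."""
    for node in nodes:
        env[node.output] = node.func(*(env[name] for name in node.inputs))
    return env

# A tiny network computing y = (a + b) * gain
net = [
    Node(lambda a, b: a + b, ["a", "b"], "sum"),
    Node(lambda s, g: s * g, ["sum", "gain"], "y"),
]
result = evaluate(net, {"a": 2.0, "b": 3.0, "gain": 10.0})
```

Such a homogeneous node-and-signal representation is what makes automatic manipulation — optimization passes, synthesis back-ends — straightforward to implement on top of it.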
In 1957, Craig Mooney published a set of human face stimuli to study perceptual closure: the formation of a coherent percept on the basis of minimal visual information. Images of this type, now known as “Mooney faces”, are widely used in cognitive psychology and neuroscience because they offer a means of inducing variable perception with constant visuo-spatial characteristics (they are often not perceived as faces if viewed upside down). Mooney’s original set of 40 stimuli has been employed in several studies. However, it is often necessary to use a much larger stimulus set. We created a new set of over 500 Mooney faces and tested them on a cohort of human observers. We present the results of our tests here, and make the stimuli freely available via the internet. Our test results can be used to select subsets of the stimuli that are most suited for a given experimental purpose.
Can variances of latent variables be scaled in such a way that they correspond to eigenvalues?
(2017)
The paper reports an investigation of whether sums of squared factor loadings obtained in confirmatory factor analysis correspond to the eigenvalues of exploratory factor analysis. The sum of squared factor loadings reflects the variance of the corresponding latent variable if the variance parameter of the confirmatory factor model is set equal to one; computing the sum thus implies a specific type of scaling of the variance. While the investigation of the theoretical foundations suggested the expected correspondence between sums of squared factor loadings and eigenvalues, procedural specifications required in applications, such as the estimation method, revealed external influences on the outcome. A simulation study demonstrated that exact correspondence is possible if the same estimation method is applied. In the majority of the realized specifications, however, the estimates were of similar size but did not correspond exactly.
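The theoretical correspondence can be checked numerically in the principal-component case, where loadings are eigenvectors scaled by the square roots of the eigenvalues; the following sketch is illustrative and not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 6))
X[:, 3] += X[:, 0]                    # induce some correlation
R = np.corrcoef(X, rowvar=False)      # 6 x 6 sample correlation matrix

eigvals, eigvecs = np.linalg.eigh(R)  # R = V diag(lambda) V^T
loadings = eigvecs * np.sqrt(eigvals) # scale columns: L = V sqrt(lambda)

# Sum of squared loadings per factor reproduces the eigenvalue:
# sum_i L_ij^2 = lambda_j * sum_i V_ij^2 = lambda_j
ssl = (loadings ** 2).sum(axis=0)
```

With this extraction method the match is exact up to floating point; as the abstract notes, mixing estimation methods (e.g. maximum-likelihood confirmatory estimates against principal-axis eigenvalues) breaks the exact correspondence.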
Statistical analysis of various stocks reveals long-range dependence in the stock prices that is not consistent with the classical Black and Scholes model. This memory, or nondeterministic trend, behavior is often seen as a reflection of market sentiment and causes the historical volatility estimator to become unreliable in practice. We propose an extension of the Black and Scholes model that adds to the original Wiener term a term involving a smoother process which accounts for these effects. The problem of arbitrage is discussed. Using a generalized stochastic integration theory [8], we show that it is possible to construct a self-financing replicating portfolio for a European option without any further knowledge of the extension and that, as a consequence, the classical concept of volatility needs to be reinterpreted.
AMS subject classifications: 60H05, 60H10, 90A09.
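The abstract does not specify the smoother process; a common concrete choice for modeling long-range dependence is fractional Brownian motion with Hurst index H > 1/2. The sketch below is an illustrative assumption, not the authors' construction: it simulates such a driver by Cholesky factorization of the exact fBm covariance and adds it to a standard Wiener path.

```python
import numpy as np

def fbm_path(n, T, H, rng):
    """Sample a fractional Brownian motion path on n grid points in
    (0, T] (plus B_0 = 0) via Cholesky factorization of the exact
    covariance Cov(B_s, B_t) = (s^{2H} + t^{2H} - |t - s|^{2H}) / 2."""
    t = np.linspace(T / n, T, n)
    cov = 0.5 * (t[:, None] ** (2 * H) + t[None, :] ** (2 * H)
                 - np.abs(t[:, None] - t[None, :]) ** (2 * H))
    chol = np.linalg.cholesky(cov)
    return np.concatenate([[0.0], chol @ rng.standard_normal(n)])

rng = np.random.default_rng(42)
n, T, H = 250, 1.0, 0.7
dt = T / n
w = np.concatenate([[0.0], np.cumsum(rng.standard_normal(n)) * np.sqrt(dt)])
z = w + fbm_path(n, T, H, rng)   # mixed driver: Wiener + smoother term

# Price path of the extended model, S_t = S_0 * exp(sigma * Z_t)
# (drift correction omitted for brevity)
s = 100.0 * np.exp(0.2 * z)
```

For H > 1/2 the fBm term is smoother than the Wiener term (Hölder exponent close to H), which is what makes the replication argument insensitive to the details of the extension.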
Thought structures of modelling task solutions and their connection to the level of difficulty
(2015)
Although efforts have been made to integrate the concept of mathematical modelling into school curricula, PISA and TIMSS, among others, revealed weaknesses in mathematical modelling, and not only among German students. There may be various reasons for this, ranging from educational policy via curricular issues to practical instructional concerns. Studies show that mathematical modelling has not yet arrived in everyday mathematics classes (Blum & Borromeo Ferri, 2009, p. 47), and the proportion of mathematical modelling in everyday lessons is accordingly low (Jordan et al., 2006). From the teachers' point of view, there are difficulties that may contribute to the avoidance of modelling tasks in class: developing reasonable modelling tasks, estimating the task space, evaluating the task difficulty, and assessing student solutions are difficulties which arise to a greater degree than with ordinary mathematics tasks. The project MokiMaS (transl.: modelling competency in mathematics classes of secondary education) aims at providing inter-year modelling tasks, whose task space and level of difficulty are known, together with an evaluation scheme. In particular, a theory-based method has been developed to determine the level of difficulty of modelling tasks on the basis of thought structures representing the cognitive load of solution approaches. The current question is whether this method leads to a realistic rating. To pursue this question, an evaluation scheme guided by the daily assessment work of teachers has been developed to investigate the relation between task difficulty and student performance.