We presented a proof of the classical stable limit laws using the contraction method in combination with the Zolotarev metric. Furthermore, a stable limit law was proved for scaled sums over sequences of growing length. This limit law was alternatively formulated for sequences of random variables defined by a simple degenerate recursion.
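For context, the classical stable limit law referred to here takes the following standard textbook form; the norming constants are the usual ones and are stated as an assumption, not quoted from the thesis:

```latex
% Classical stable limit law: if $X_1, X_2, \dots$ are i.i.d. and lie in the
% domain of attraction of an $\alpha$-stable law $S_\alpha$, $0 < \alpha \le 2$,
% then for suitable norming constants $a_n > 0$ and $b_n \in \mathbb{R}$
\[
  \frac{X_1 + \cdots + X_n - b_n}{a_n} \xrightarrow{\; d \;} S_\alpha .
\]
```

The contraction method recasts such a convergence as the approach to the fixed point of a distributional recursion, with the Zolotarev metric supplying the contraction estimates.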
The Benchmark Dose (BMD) approach, first suggested by K. Crump in 1984 [CRUMP (1984)], is a widely used instrument in the risk assessment of substances in the environment and in food. In this context, the BMD approach determines a reference point (RfP) on the statistically estimated dose-response curve for which the risk can be determined with adequate certainty and confidence. In the next step of risk characterization, a threshold is calculated based on this RfP and toxicological considerations. The BMD approach is based on fitting a dose-response model to the data; this fit assumes a stochastic distribution of the response endpoint. Ultimately, the BMD reflects the dose at which a pre-specified increase in an adverse health effect (the benchmark response) can be expected. So far, the BMD approach has been specified only for quantal and continuous endpoints. In the risk assessment of carcinogens, however, so-called time-to-event data are of particular interest, since they contain more information on tumor development than quantal incidence data. The goal of this diploma thesis was to extend the BMD approach to such time-to-event data.
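To make the definition concrete, the following minimal sketch fits a simple log-logistic dose-response model to quantal data and solves for the dose at a 10% extra risk, a common benchmark response. The model choice, the example data, and all names are illustrative assumptions, not the procedure developed in the thesis:

```python
import numpy as np
from scipy.optimize import minimize, brentq

# Hypothetical quantal data: dose, number of animals, number of responders.
dose = np.array([0.0, 1.0, 3.0, 10.0, 30.0])
n    = np.array([50,  50,  50,  50,   50])
r    = np.array([2,   4,   9,   20,   38])

def prob(d, c, a, b):
    """Log-logistic dose-response: background c plus a dose-dependent term."""
    d = np.maximum(d, 1e-12)              # avoid log(0) at the control dose
    return c + (1 - c) / (1 + np.exp(-(a + b * np.log(d))))

def negloglik(theta):
    c, a, b = theta
    p = np.clip(prob(dose, c, a, b), 1e-9, 1 - 1e-9)
    return -np.sum(r * np.log(p) + (n - r) * np.log(1 - p))

fit = minimize(negloglik, x0=[0.05, -2.0, 1.0],
               bounds=[(0, 0.5), (-10, 10), (0.01, 10)])
c, a, b = fit.x

# BMD for a benchmark response of 10% extra risk over background.
bmr = 0.10
target = c + bmr * (1 - c)
bmd = brentq(lambda d: prob(d, c, a, b) - target, 1e-6, dose.max())
print(f"BMD(10% extra risk) = {bmd:.2f}")
```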
A new experimental system has been set up with the ability to investigate catalytic processes and charge transfer of acrylonitrile on copper. For this purpose, a new Time-of-Flight Mass Spectrometer (TOF-MS) for measuring both the reaction outcome and electron energy distributions has been designed and tested. First experiments have been carried out in which the width of the two-photon photoelectron energy distribution is varied by changing the wavelength of the incident laser beam. This method allows high-precision measurements of the work function and will be useful in studies with physi- or chemisorbed adsorbates. In first adsorption measurements, the excitation of vibrational modes of acrylonitrile was found to be consistent with earlier gas-phase experiments. Electron energy spectra taken with the electron analyzer at high resolution showed a clear deficit in the electron yield at energies around the energy of one vibrational mode, indicating the possibility of resonant vibrational excitation by electron impact. Further indications of this process were found in first electron spectra from the new TOF-MS, since a threshold for the capture probability appears at energies close to the vibrational excitation. The threshold vanishes when the exposure is increased significantly, indicating that electrons are scattered multiple times and no resonance can be observed anymore. The experiments carried out were just the starting point in understanding the mechanism of the reaction. A new femtosecond laser system, which is currently being set up, will not only give time-resolved information on the reaction pathways but also offer the possibility to create non-thermal electrons and to study intermediate states of the photoemission and the influence of the adsorbate on them. In addition, the rotation of the electron analyzer will permit angle-resolved measurements of the scattering process of the electrons and of the vibrational excitation via this pathway. With the new cooling system applied, it will also be interesting to study the excitation process at lower temperatures. Below -160 °C, different geometries of the molecule are predicted to be present at the surface. At these temperatures the thermal effects should play a major role, so that a thermal decoupling of the electrons is very desirable.
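The work-function measurement mentioned above rests on energy conservation in two-photon photoemission; the relation below is the standard one and is added for context rather than quoted from the thesis:

```latex
% Two-photon photoemission: an electron absorbing two photons of energy $h\nu$
% leaves the surface with kinetic energy at most
\[
  E_{\mathrm{kin}}^{\max} = 2h\nu - \Phi ,
\]
% so the width of the photoelectron energy distribution, measured from the
% low-energy cutoff at $E_{\mathrm{kin}} = 0$ up to $E_{\mathrm{kin}}^{\max}$,
% changes with the laser wavelength and directly yields the work function $\Phi$.
```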
This work connects the Markov chain imbedding technique (MCIT), introduced by M.V. Koutras and J.C. Fu, with distributions concerning the cycle structure of permutations. As a final result, program code is given that uses MCIT to compute proper numerical values for these distributions. The discrete distributions of interest are those of the cycle structure, of the number of cycles, of the r-th longest and shortest cycle, and finally of the length of a randomly chosen cycle. These are analyzed for equiprobable permutations as well as for biased ones. Analytical solutions and limit distributions are also considered to put the results on a solid theoretical basis.
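As an illustration of the kind of quantity involved, the sketch below computes the exact distribution of the number of cycles of a uniform random permutation by running a small Markov chain over element insertions (each new element either starts a new cycle or joins an existing one). This is a standard construction given here for flavour, not the thesis' own code:

```python
import numpy as np

def cycle_count_distribution(n):
    """P(number of cycles = k), k = 1..n, for a uniform random permutation
    of n elements: when the i-th element is inserted, it starts a new
    cycle with probability 1/i, otherwise it joins an existing cycle."""
    p = np.zeros(n + 1)
    p[1] = 1.0                        # one element, one cycle
    for i in range(2, n + 1):
        q = np.zeros(n + 1)
        q[1:] += p[1:] * (1 - 1 / i)  # element joins an existing cycle
        q[2:] += p[1:n] * (1 / i)     # element starts a new cycle
        p = q
    return p[1:]

dist = cycle_count_distribution(10)
print("E[#cycles] =", sum(k * pk for k, pk in enumerate(dist, start=1)))
# matches the harmonic number H_10 = 2.9290 (to four decimals)
```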
The synchronization of neuronal firing activity is considered an important mechanism in cortical information processing. The tendency of multiple neurons to synchronize their joint firing activity can be investigated with the 'unitary event' analysis (Grün, 1996). This method is based on the null hypothesis of independent Bernoulli processes and can therefore not tell whether coincidences observed between more than two processes can be considered "genuine" higher-order coincidences or whether they might be caused by coincidences of lower order that coincide by chance ("chance coincidences"). In order to distinguish between genuine and chance coincidences, a parametric model of independent interaction processes (MIIP) is presented. In the framework of this model, maximum-likelihood estimates are derived for the firing rates of n single processes and for the rates with which genuine higher-order correlations occur. The asymptotic normality of these estimates is used to derive their asymptotic variance and to investigate whether higher-order coincidences can be considered genuine or whether they can be explained by chance coincidences. The empirical test power of this procedure for n=2 and n=3 processes and for finite analysis windows is estimated by simulations and compared to the asymptotic values. Finally, the model is extended to allow for the analysis of correlations that are caused by jittered coincidences.
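A toy simulation may help fix the idea of chance coincidences under the independence null hypothesis; the rates, bin counts, and the simple "all channels fire in the same bin" coincidence definition are assumptions for illustration, not the unitary-event method itself:

```python
import numpy as np

rng = np.random.default_rng(0)
n_proc, n_bins, p = 3, 100_000, 0.05   # processes, time bins, firing prob/bin

# Independent Bernoulli processes: the null hypothesis of the analysis.
spikes = rng.random((n_proc, n_bins)) < p

# Count bins in which all n processes fire simultaneously.
observed = np.sum(spikes.all(axis=0))
expected = n_bins * p ** n_proc        # expectation under independence

print(f"observed triple coincidences: {observed}, expected: {expected:.1f}")
# Any significant excess over `expected` would hint at correlations, which
# the MIIP then decomposes into genuine higher-order vs. chance coincidences.
```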
This thesis presents the implementation of the online reconstruction, calibration and monitoring of the data of the Transition Radiation Detector of ALICE. The reconstruction is performed on the High Level Trigger (HLT), the third level of the ALICE trigger system, and enables online calibration and monitoring of the incoming data. Additionally, the HLT can steer the data storage such that only physically interesting events are saved. The online reconstruction, as well as the calibration, makes use of the existing offline algorithms; therefore, interfaces between the HLT and these offline algorithms were implemented. To reach rates of 2000 Hz in proton-proton collisions and 200 Hz in lead-lead collisions, the algorithms had to be accelerated. Bottlenecks were tracked down using dedicated tools, and the respective code was either reimplemented or skipped during the online reconstruction. The quality of the output data was monitored throughout the implementation to ensure that these optimizations did not degrade it too much.
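To put these rates in perspective, a quick back-of-the-envelope calculation (an illustration, not taken from the thesis) shows the per-event time budget they imply for the online chain:

```python
# Per-event processing budget implied by the target event rates.
for system, rate_hz in [("proton-proton", 2000), ("lead-lead", 200)]:
    print(f"{system}: {1e3 / rate_hz:.1f} ms per event")
# proton-proton: 0.5 ms per event
# lead-lead: 5.0 ms per event
```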
The thesis in general deals with CORBA, the Common Object Request Broker Architecture. More specifically, it takes a look at the server side, where object adapters exist to aid the developer in implementing objects and in dealing with request processing. The new Portable Object Adapter (POA) was recently added to the CORBA 2.2 standard. My task was the implementation of the POA in MICO and the examination of (a) whether the POA specification is sensible and (b) in which areas it improves over the old Basic Object Adapter. After introducing distributed platforms in general and CORBA in particular, the thesis' two main chapters are a detailed abstract examination of the POA design ("Design") and of its realization ("Implementation"), highlighting the potential trouble spots: persistence and collocation.
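To convey what an object adapter does without depending on any particular ORB, here is a deliberately simplified, hypothetical dispatcher in the spirit of the POA's object-ID-to-servant mapping; it is a conceptual toy, not MICO's or any CORBA implementation's API:

```python
class ObjectAdapter:
    """Toy analogue of a POA: maps object IDs to servants and
    dispatches incoming requests to the right servant method."""

    def __init__(self):
        self._active_object_map = {}          # object id -> servant

    def activate_object(self, oid, servant):
        self._active_object_map[oid] = servant

    def dispatch(self, oid, operation, *args):
        servant = self._active_object_map.get(oid)
        if servant is None:
            raise LookupError(f"no servant for object id {oid!r}")
        return getattr(servant, operation)(*args)

class GreeterServant:
    def greet(self, name):
        return f"Hello, {name}!"

poa = ObjectAdapter()
poa.activate_object("greeter-1", GreeterServant())
print(poa.dispatch("greeter-1", "greet", "world"))   # Hello, world!
```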
Approximating Perpetuities (2006)
A perpetuity is a real-valued random variable which is characterised by a distributional fixed-point equation of the form X = AX + b, where (A, b) is a vector of random variables independent of X, while dependencies between A and b are allowed. Conditions for the existence and uniqueness of solutions of such fixed-point equations are known, as is the tail behaviour in most cases. In this work, we look at the central region and develop an algorithm to approximate the distribution function, and where possible the density, of a large class of such perpetuities. For one specific example from the probabilistic analysis of algorithms, the algorithm is implemented and explicit error bounds for this approximation are given. Finally, we look at some examples where the densities, or at least some of their properties, are known, in order to compare the theoretical error bounds to the actual error of the approximation. The algorithm used here is based on a method which was developed for another class of fixed-point equations. While adapting it to this case, a considerable improvement was found, which can be translated back to the original method.
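As a concrete illustration of such a fixed point, the sketch below draws approximate samples of a perpetuity by iterating X_{k+1} = A_k X_k + b_k from X_0 = 0. The specific choice A ~ Uniform(0,1) and b = 1 is an assumed example, not the case treated in the thesis, and plain simulation is only a crude alternative to the approximation algorithm developed there:

```python
import numpy as np

rng = np.random.default_rng(1)

def perpetuity_samples(n_samples, n_iter=100):
    """Approximate samples of the perpetuity X = A*X + b (equality in
    distribution) with A ~ Uniform(0,1) and b = 1, via forward iteration
    X_{k+1} = A_k * X_k + 1 started at 0. Since A < 1 almost surely,
    the iteration converges geometrically fast in distribution."""
    x = np.zeros(n_samples)
    for _ in range(n_iter):
        x = rng.random(n_samples) * x + 1.0
    return x

x = perpetuity_samples(100_000)
print(f"mean = {x.mean():.3f}  (exact: E[X] = 1/(1 - E[A]) = 2)")
```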
RDF is widely used to catalogue the chaos of data across the internet. But these descriptions must be stored, evaluated, analyzed and verified. This creates the need for an environment that realizes these aspects and strengthens RDF's influence. InterSystems' post-relational database Caché offers many features that are similar to RDF and provides persistence with a semantic component. Some models for relational databases exist, but these lack features like object-oriented data structures and multidimensional variables. The aim of this thesis is to develop an RDF model for Caché that stores RDF data in an object-oriented form. Furthermore, an interface for importing RDF data is presented and implemented.
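For a feel of what such an import interface has to do, the following minimal sketch parses RDF triples and groups them by subject into an object-like structure. It uses the rdflib Python library as a stand-in and has nothing to do with Caché's actual object classes; the input data is hypothetical:

```python
from collections import defaultdict
from rdflib import Graph

# A small Turtle document as hypothetical input data.
turtle = """
@prefix ex: <http://example.org/> .
ex:thesis ex:title "An RDF model for Cache" ;
          ex:year  "2008" .
"""

g = Graph()
g.parse(data=turtle, format="turtle")

# Group triples by subject: one "object" per resource, with its predicates
# as attributes -- roughly the object-oriented view mapped onto classes.
objects = defaultdict(dict)
for s, p, o in g:
    objects[str(s)][str(p)] = str(o)

for subject, attrs in objects.items():
    print(subject, attrs)
```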