Background: We evaluated the sensitivity of the D-statistic, a parsimony-like method widely used to detect gene flow between closely related species. This method has been applied to a variety of taxa with a wide range of divergence times. However, its parameter space, and thus its applicability across a wide taxonomic range, has not been systematically studied. Divergence time, population size, time of gene flow, distance of the outgroup and number of loci were examined in a sensitivity analysis.
Result: The sensitivity study shows that the primary determinant of the D-statistic is the relative population size, i.e. the population size scaled by the number of generations since divergence. This is consistent with the fact that the main confounding factor in gene flow detection is incomplete lineage sorting, which dilutes the signal. The sensitivity of the D-statistic is also affected by the direction of gene flow and by the size and number of loci. In addition, we examined the ability of the f-statistics, f̂_G and f̂_hom, to estimate the fraction of a genome affected by gene flow; while these statistics are difficult to apply to practical questions in biology because the time of the gene flow is usually unknown, they can be used to compare datasets with identical or similar demographic backgrounds.
Conclusions: The D-statistic, as a method to detect gene flow, is robust against a wide range of genetic distances (divergence times) but is sensitive to population size. The D-statistic should only be applied with critical reservation to taxa where population sizes are large relative to branch lengths in generations.
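The core of the D-statistic is a simple count of discordant site patterns. A minimal sketch, assuming four aligned taxa ordered (P1, P2, P3, outgroup) and biallelic sites; the function and variable names are illustrative, not from the study:

```python
# ABBA-BABA (D-statistic) sketch. Site patterns are coded relative to
# the outgroup allele:
#   ABBA: P2 and P3 share the derived allele
#   BABA: P1 and P3 share the derived allele
# Under incomplete lineage sorting alone both patterns are equally
# likely, so D is expected to be zero; gene flow skews it away.

def d_statistic(sites):
    """sites: iterable of 4-tuples (p1, p2, p3, outgroup) of alleles."""
    abba = baba = 0
    for p1, p2, p3, out in sites:
        if p3 == out:              # P3 must carry the derived allele
            continue
        derived = p3
        if p2 == derived and p1 == out:
            abba += 1
        elif p1 == derived and p2 == out:
            baba += 1
    if abba + baba == 0:
        return 0.0
    return (abba - baba) / (abba + baba)
```

Gene flow between P3 and one of P1/P2 inflates one of the two counts, which is exactly the signal that incomplete lineage sorting dilutes when populations are large.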
Lattice simulation of a center symmetric three dimensional effective theory for SU(2) Yang-Mills
(2010)
We present lattice simulations of a center symmetric dimensionally reduced effective field theory for SU(2) Yang-Mills which employ thermal Wilson lines and three-dimensional magnetic fields as fundamental degrees of freedom. The action is composed of a gauge invariant kinetic term, spatial gauge fields and a potential for the Wilson line which includes a "fuzzy" bag term to generate non-perturbative fluctuations between Z(2) degenerate ground states. The model is studied both in the limit where the gauge fields are set to zero and in the full model with gauge fields. We confirm that, at moderately weak coupling, the "fuzzy" bag term leads to eigenvalue repulsion in a finite region above the deconfining phase transition which shrinks in the extreme weak-coupling limit. A non-trivial Z(N) symmetric vacuum arises in the confined phase. The effective potential for the Polyakov loop in the theory with gauge fields is extracted from the simulations including all modes of the loop as well as for cooled configurations where the hard modes have been averaged out. The former is found to exhibit a non-analytic contribution while the latter can be described by a mean-field like ansatz with quadratic and quartic terms, plus a Vandermonde potential which depends upon the location within the phase diagram. Other results include the exact location of the phase boundary in the plane spanned by the coupling parameters, correlation lengths of several operators in the magnetic and electric sectors and the spatial string tension. We also present results from simulations of the full 4D Yang-Mills theory and attempt to make a qualitative comparison to the 3D effective theory.
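The mean-field-like ansatz mentioned above (quadratic and quartic terms plus a Vandermonde potential) can be illustrated numerically. The sketch below uses toy coefficients, not the fitted values from the simulations; for SU(2) the Haar-measure (Vandermonde) contribution for a loop l in (-1, 1) is proportional to -log(1 - l^2):

```python
import math

# Toy mean-field potential for the SU(2) Polyakov loop l in (-1, 1):
# quadratic + quartic terms plus the SU(2) Haar-measure ("Vandermonde")
# term -0.5*log(1 - l^2). Coefficients are illustrative only.

def effective_potential(l, a2, a4, c_vdm=1.0):
    return a2 * l**2 + a4 * l**4 - 0.5 * c_vdm * math.log(1.0 - l**2)

def minimize(a2, a4, c_vdm=1.0, n=100000):
    """Brute-force scan for the minimum over l in (-1, 1)."""
    best_l, best_v = 0.0, float("inf")
    for i in range(1, n):
        l = -1.0 + 2.0 * i / n
        v = effective_potential(l, a2, a4, c_vdm)
        if v < best_v:
            best_l, best_v = l, v
    return best_l
```

With a2 > 0 the minimum sits at l = 0, mimicking the confined, Z(2)-symmetric vacuum; driving a2 sufficiently negative produces two degenerate minima at ±l0, mimicking the deconfined phase.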
The goal of this work is the realistic development of an interactive 3D city model tailored to public transport. Based on user input and with the help of a data source, the program automatically generates a three-dimensional visualization of the buildings and integrates the local public transport network. The public transport network of the city of Frankfurt served as the worked example. The thesis addresses the problems of collecting geoinformation and of processing such complex data. It was determined which user groups gain added value from such a 3D visualization and which new potentials for extension and use the model offers.
In particular, the reader is given an insight into the generation of interactive 3D models from raw data alone. The game engine Unity was used as the development environment; it proved to be a very capable and modern development tool for creating functional 3D visualizations. The OpenStreetMap project was used as the data source and is discussed in this thesis. Finally, for evaluation, the model was made available to various users and assessed by means of a questionnaire.
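The central generation step, turning a 2D building footprint as provided by OpenStreetMap into a 3D building mesh, can be sketched as follows. This is a hypothetical stand-alone illustration in Python; the thesis implementation in Unity is not reproduced here:

```python
# Extrude a 2D footprint into the vertex and triangle lists of a 3D
# mesh (the representation game engines such as Unity consume).
# The triangle-fan roof assumes a convex footprint.

def extrude_footprint(footprint, height):
    """footprint: ordered list of (x, z) corners; returns (vertices, triangles)."""
    n = len(footprint)
    # bottom ring then top ring of vertices
    vertices = [(x, 0.0, z) for x, z in footprint]
    vertices += [(x, height, z) for x, z in footprint]
    triangles = []
    for i in range(n):                 # two triangles per wall quad
        j = (i + 1) % n
        triangles.append((i, j, n + j))
        triangles.append((i, n + j, n + i))
    for i in range(1, n - 1):          # simple triangle-fan roof
        triangles.append((n, n + i, n + i + 1))
    return vertices, triangles
```

Real building outlines are often concave, so a production pipeline would triangulate the roof polygon properly (e.g. by ear clipping) instead of using a fan.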
The goal of this work was to investigate RNA structures as potential target structures for drug development, specifically the application of virtual screening methods to RNA-ligand prediction. The RNA structure known as the TAR motif (transactivating response element) of the mRNAs of the HI virus was chosen for this purpose. This structure was selected because, with the four PDB entries 1ANR, 1ARJ, 1LVJ and 1QD3, experimentally motivated structural models were already available at the start of the investigation. Another decisive factor was the availability of a Tat-TAR FRET assay within SFB 579, in whose framework this work was carried out. The attention the HI virus has already received in the fight against the immunodeficiency disease also meant that a data set of previously tested substances, albeit still a modest one, existed for the chosen test model and could serve as a first basis for a ligand-based approach. The further steps of this work followed from these preliminary investigations. In summary, the work can be divided into four phases, which partly ran in parallel:
Phase 1: Taking stock of known information about the target structure: experimentally determined target structures; experimentally determined ligands/non-ligands of the target structure.
Phase 2: Deriving a ligand-based approach for predicting potential binders of the target structure from compound libraries that does not rely on structural data of the target.
Phase 3: Analyzing the known conformers of the target structure for constant points of attack for a dedicated ligand design.
Phase 4: Incorporating the known structural information of the target structure to further refine the selection of new candidates for subsequent experimental determination of binding behavior.
In this work, the application of artificial neural networks in a ligand-based approach, via virtual screening of the chemical databases of several vendors, identified five new potential TAR-RNA ligands (three of them with a methyleneaminoguanidyl substructure motif); in addition, as a spin-off, applying the test substances originally intended only for the Tat-TAR FRET assay in a collaborative project (using a CFivTT assay) identified two new potentially antibacterial compounds. Dealing with the evident flexibility of the TAR RNA, and thus with a reference target structure that cannot be defined unambiguously for ligand docking, led to the creation of a software package with which flexible target structures can be examined for constant points of attack, based on the conformer data sets of MD simulations. Starting from the integration of a pocket-prediction program (PocketPicker), a series of filters was implemented that can automatically narrow the possible pocket space for future ligand design, using the structural information stored for this purpose in a MySQL database. Furthermore, this approach provides simple access to the individual conformers and the possibility of adding annotations to the conformers and to the pocket information derived from them, so that this information can be used to set up ligand-docking experiments. In addition, a new descriptor for describing pocket surfaces was introduced in this work: the molecular SIMPrint, based on the scaling-index method.
Examining the distribution of the potential binding pockets over the surface of the conformer ensemble further led to the definition of the pocket surface generation probability (PSGP) for individual atoms of a target structure, which can be used as a tendency measure for estimating whether a potentially long-lived interaction between a ligand and the target structure will form, for example to score docking poses.
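A common baseline for the ligand-based screening phase described above is substructure-fingerprint similarity. The sketch below is a generic illustration with hand-made bit sets, not a chemistry toolkit and not the neural-network approach used in the thesis:

```python
# Rank library compounds by Tanimoto similarity of binary substructure
# fingerprints (represented here as Python sets of on-bits) to a set
# of known actives.

def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient of two fingerprints given as sets of on-bits."""
    if not fp_a and not fp_b:
        return 0.0
    inter = len(fp_a & fp_b)
    return inter / (len(fp_a) + len(fp_b) - inter)

def screen(library, actives, top_k=3):
    """Return the top_k library compounds most similar to any known active."""
    scored = [
        (max(tanimoto(fp, a) for a in actives), name)
        for name, fp in library.items()
    ]
    scored.sort(reverse=True)
    return [name for score, name in scored[:top_k]]
```

In a real screen the fingerprints would come from a cheminformatics toolkit, and the similarity ranking would typically serve as a fast pre-filter before a learned model or docking step.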
Background: Recent epidemics have prompted global discussions on revamping epidemic control and prevention approaches. A general consensus is that all sources of data should be embraced to improve epidemic preparedness. As disease transmission is inherently governed by individual-level responses, pathogen dynamics within infected hosts hold high potential to inform population-level phenomena. We propose a multiscale approach showing that individual-level dynamics can reproduce population-level observations.
Methods: Using experimental data, we formulated mathematical models of pathogen infection dynamics from which we mechanistically simulated the pathogen's transmission parameters. The models were then embedded in our implementation of an age-specific contact network that expresses individual differences relevant to the transmission processes. This approach is illustrated with the example of Ebola virus (EBOV).
Results: The results showed that a within-host infection model can reproduce EBOV transmission parameters obtained from population data. At the same time, population age structure and contact distributions and patterns can be expressed using a network-generating algorithm. This framework opens a vast opportunity to investigate the individual roles of factors involved in the epidemic processes. Estimating EBOV's reproduction number revealed a heterogeneous pattern among age groups, prompting caution about estimates unadjusted for contact patterns. Assessments of mass vaccination strategies showed that vaccination conducted in a time window from five months before to one week after the start of an epidemic appeared to strongly reduce epidemic size. Notably, compared to a non-intervention scenario, a low critical vaccination coverage of 33% cannot ensure epidemic extinction but could reduce the number of cases by ten to a hundred times as well as lessen the case-fatality rate.
Conclusions: Experimental data on within-host infection can capture key transmission parameters of a pathogen upfront; applications of this approach will give us more time to prepare for potential epidemics. The population of interest in epidemic assessments can be modelled with an age-specific contact network without an exhaustive amount of data. Further assessments and adaptations for different pathogens and scenarios to explore multilevel aspects of infectious disease epidemics are underway.
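The multiscale idea, a within-host curve driving population-level transmission, can be caricatured in a few lines. This is a toy illustration with invented parameters, not the fitted EBOV models of the study:

```python
import random

# A toy within-host viral-load curve sets a per-contact transmission
# probability, which then drives a stochastic SIR-style simulation
# with homogeneous mixing. All parameters are illustrative.

def viral_load(t, growth=1.0, peak_day=7.0):
    """Toy viral-load curve: rises to a peak at peak_day, then declines."""
    return max(0.0, growth * t * (2.0 * peak_day - t) / peak_day**2)

def transmission_prob(t, scale=0.1):
    """Per-contact infection probability from the within-host load."""
    return min(1.0, scale * viral_load(t))

def simulate_epidemic(n=1000, contacts_per_day=5, infectious_days=14, seed=1):
    rng = random.Random(seed)
    susceptible = n - 1
    infected = [0]              # each entry: days since infection
    total_cases = 1
    while infected:
        new_infected = []
        for age in infected:
            p = transmission_prob(age)
            for _ in range(contacts_per_day):
                if susceptible > 0 and rng.random() < p * susceptible / n:
                    susceptible -= 1
                    total_cases += 1
                    new_infected.append(0)
            if age + 1 < infectious_days:   # recover after infectious_days
                new_infected.append(age + 1)
        infected = new_infected
    return total_cases
```

Replacing the homogeneous mixing with an age-specific contact network, as the paper does, amounts to drawing each individual's contacts from the network instead of uniformly from the population.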
Forest growth models are an ideal tool for showing the effects of changed environmental conditions on tree growth. The goal of the subproject "Forest growth reactions and system processes" within ENFORCHANGE was to analyze regional effects of site and climate changes on forest development by combining growth models with different methodological approaches, and thus to create a better basis for adapted forest management planning. Using the physiological growth model BALANCE, the influence of the predicted climate changes on tree growth was estimated. The reaction patterns obtained for different tree species and regionally typical stands could then be transferred to the management-oriented growth model SILVA. The development of representative forest stands was simulated in SILVA over a period of 30 years, examining various utilization scenarios in order to reveal the room for maneuver and possible strategic planning options for forest enterprises. The findings for practical management planning were illustrated using the municipal forest enterprise of Zittau as an example. It becomes clear how forest planning can profit from such scenario analyses. The simulations under the assumption of changed climate conditions show that, under these conditions, the stands have a reduced capacity to respond to silvicultural measures, which is particularly noticeable in the increments. Broadleaf stands, which already occupy 27% of the enterprise area, presumably have a buffering effect and mitigate the impact of climate change on the productivity of the enterprise as a whole.
A basic introduction to RFQs has been given in the first part of this thesis. The principle and the main ideas of the RFQ have been described and a short summary of different resonator concepts has been given. Two different strategies for designing RFQs have been introduced. The analytic description of the electric fields inside the quadrupole channel has been derived and the limitations of these approaches were shown. The main work of this thesis was the implementation and analysis of a multigrid Poisson solver to describe the potential and electric field of RFQs, which are needed to simulate the particle dynamics accurately. The two main ingredients of a multigrid Poisson solver are the ability of the Gauß-Seidel iteration method to smooth the error of an approximation within a few iteration steps, and the coarse-grid principle. The smoothing corresponds to a damping of the high-frequency components of the error. After the smoothing, the error term can be well approximated on a coarser grid, on which the low-frequency components of the error on the fine grid become high-frequency errors that can be damped further with the same Gauß-Seidel method. After implementation, the multigrid Poisson solver was analyzed using two different types of test problems: with and without a charge density. After illustrating the results of the multigrid Poisson solver, a comparison to the field of the old multipole expansion method was made. The multipole expansion method is an accurate representation of the field within the minimum aperture, as limited by cylindrical symmetry. Within these limitations the multigrid Poisson solver and the multipole expansion method agree well. Beyond that limitation the two methods give different fields. It was shown that particles leave the region in which the multipole expansion method gives correct fields, and that the transmission, as well as the single-particle dynamics, is affected by this.
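The two ingredients described above, Gauß-Seidel smoothing and the coarse-grid correction, can be sketched for the 1D model problem -u'' = f with zero boundary values. This is an illustrative two-grid cycle, not the thesis code:

```python
# 1D two-grid sketch for -u'' = f on [0, 1], u(0) = u(1) = 0, on n+1
# points (n even). Gauss-Seidel sweeps damp the high-frequency error;
# the remaining smooth residual is corrected on a grid of half the
# resolution and the correction is interpolated back.

def gauss_seidel(u, f, h, sweeps):
    for _ in range(sweeps):
        for i in range(1, len(u) - 1):
            u[i] = 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i])
    return u

def residual(u, f, h):
    r = [0.0] * len(u)
    for i in range(1, len(u) - 1):
        r[i] = f[i] - (2 * u[i] - u[i - 1] - u[i + 1]) / (h * h)
    return r

def two_grid(u, f, h, sweeps=3):
    """One cycle: pre-smooth, coarse-grid correction, post-smooth."""
    gauss_seidel(u, f, h, sweeps)
    r = residual(u, f, h)
    # full-weighting restriction of the residual to every other point
    rc = [0.0] + [0.25 * (r[2 * i - 1] + 2 * r[2 * i] + r[2 * i + 1])
                  for i in range(1, (len(u) - 1) // 2)] + [0.0]
    ec = [0.0] * len(rc)
    gauss_seidel(ec, rc, 2 * h, 50)    # "solve" the coarse error equation
    # linear interpolation of the coarse error back to the fine grid
    for i in range(1, len(u) - 1):
        u[i] += ec[i // 2] if i % 2 == 0 else 0.5 * (ec[i // 2] + ec[i // 2 + 1])
    return gauss_seidel(u, f, h, sweeps)
```

In practice the coarse problem is itself treated recursively (a V-cycle) rather than smoothed to convergence, and the 3D RFQ geometry enters through the boundary conditions.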
The multigrid Poisson solver also gives a more realistic description of the field at the beginning of the RFQ, because it takes the tank wall into account, and this effect is shown as well. Closing the analysis of the external field, the transmission and the fraction of accelerated particles of the set of 12 RFQs were shown for the two different methods. For RFQs with small apertures and large modulations the two methods give different values for the transmission, due to the limitation of the multipole expansion method. The internal space charge fields without image charges were analyzed at the level of single-particle dynamics and compared to the well-known SCHEFF routine from LANL, showing major differences for the analyzed particle. For comparing influences on the transmission of the set of 12 RFQs, a third space charge routine (PICNIC) was considered as well. The basic shape of the transmission curve was the same independent of the space charge routine, but the absolute values differ slightly from routine to routine, with SCHEFF about 2% lower than the other routines. The multigrid Poisson solver and PICNIC agree quite well (within less than 1%), but PICNIC has an extremely long run time. The major advantage of the multigrid Poisson solver in calculating space charge effects, compared to the other two routines used here, is that the Poisson solver can take the effect of image charges on the electrodes into account simply by changing the boundaries to the shape of the vanes while all other settings remain unchanged. It was demonstrated that the effect of image charges on the vanes on the space charge field is very large in the region close to the electrodes. Particles in that region see a stronger transversely defocusing force than without image charges. The result is that the transmission decreases by as much as 10%, which is considerably more than determined by other (inexact) routines before.
This is an important result: knowing about the large effect of image charges on the electrodes, it can be taken into account while designing the RFQ to increase the performance of the machine. It is also an important factor in resolving the traditional difference observed between the transmission of actual RFQs and the transmission predicted by earlier simulations. In the last chapter of this thesis some experimental work on the MAFF (Munich Accelerator for Fission Fragments) IH-RFQ is described. The machine was assembled in Frankfurt and a beam test stand was built. The shunt impedance of the structure was measured using different techniques, the output energy of the structure was measured, and finally its transmission was determined and compared to the beam dynamics simulations of the RFQ. Unfortunately, the transmission measurements were done without exact knowledge of the beam's emittance, so the comparison to the simulation is somewhat rough; with a reasonable guess of the emittance, however, good agreement between measurement and simulation was obtained.
Estimating power in (generalized) linear mixed models: An open introduction and tutorial in R
(2021)
Mixed-effects models are a powerful tool for modeling fixed and random effects simultaneously, but they do not offer a feasible analytic solution for estimating the probability that a test correctly rejects the null hypothesis. Being able to estimate this probability, however, is critical for sample size planning, as power is closely linked to the reliability and replicability of empirical findings. A flexible and very intuitive alternative to analytic power solutions is simulation-based power analysis. Although various tools for conducting simulation-based power analyses for mixed-effects models are available, there is a lack of guidance on how to use them appropriately. In this tutorial, we discuss how to estimate power for mixed-effects models in different use cases: first, how to use models that were fit on available (e.g. published) data to determine sample size; second, how to determine the number of stimuli required for sufficient power; and finally, how to conduct sample size planning without available data. Our examples cover both linear and generalized linear models, and we provide code and resources for performing simulation-based power analyses on openly accessible data sets. The present work therefore helps researchers to navigate sound research design when using mixed-effects models, by summarizing resources, collating available knowledge, providing solutions and tools, and applying them to real-world problems of sample size planning when sophisticated procedures like mixed-effects models are used for inference.
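The simulation-based logic is language-agnostic. As a minimal illustration (a plain two-group design rather than a mixed-effects model, and Python rather than the tutorial's R), one simulates many data sets under the assumed effect, analyzes each, and reports the proportion of significant results:

```python
import random
import statistics

# Simulation-based power estimate for a two-group comparison:
# repeatedly simulate data under the assumed effect size, test each
# data set, and return the fraction of significant results.

def simulate_once(n_per_group, effect, sd, rng):
    a = [rng.gauss(0.0, sd) for _ in range(n_per_group)]
    b = [rng.gauss(effect, sd) for _ in range(n_per_group)]
    # two-sample z-test (normal approximation to the t-test)
    se = ((statistics.variance(a) + statistics.variance(b))
          / n_per_group) ** 0.5
    z = (statistics.mean(b) - statistics.mean(a)) / se
    return abs(z) > 1.96               # two-sided alpha = .05

def estimate_power(n_per_group, effect, sd=1.0, n_sims=2000, seed=42):
    rng = random.Random(seed)
    hits = sum(simulate_once(n_per_group, effect, sd, rng)
               for _ in range(n_sims))
    return hits / n_sims
```

For a mixed-effects model the only change is inside `simulate_once`: the simulated data gain random intercepts/slopes per subject and item, and the z-test is replaced by fitting the model, which is exactly what the R packages discussed in the tutorial automate.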
Molecular dynamics (MD) simulation serves as an important and widely used computational tool to study molecular systems at atomic resolution. No experimental technique is capable of generating a complete description of the dynamical structure of biomolecules in their native solution environment. MD simulations allow us to study the dynamics and structure of the system and, moreover, help in the interpretation of experimental observations. MD simulation was first introduced and applied by Alder and Wainwright in 1957 \cite{Alder57}. However, the first MD simulation of a macromolecule of biological interest was published 28 years ago \cite{McCammon77}. The simulation was concerned with the bovine pancreatic trypsin inhibitor (BPTI) protein, which has served as the ``hydrogen molecule'' of protein dynamics because of its small size, high stability, and the relatively accurate X-ray structure available in 1977 \cite{Deisenhofer75}. This method is now widely used to tackle larger and more complex biological systems \cite{Groot01,Roux02} and has been facilitated by the development of fast and efficient methods for treating the long-range electrostatic interactions \cite{Essmann95}, the availability of faster parallel computers, and the continuous development of empirical molecular mechanical force fields \cite{Langley98,Cheatham99,Foloppe00}. It took several years until the first MD simulations of nucleic acid systems were performed \cite{Levitt83,Tidor83,Prabhakaran83,Nilsson86}. These investigations, which were also performed in vacuo, clearly demonstrated the importance of proper handling of electrostatics in a highly charged nucleic acid system, and different approaches, such as reduction of the phosphate charges and addition of hydrated counterions, have been applied to remedy this shortcoming and to maintain stable DNA structures.
A few years later, the first MD simulation of a DNA molecule including explicit water molecules and counterions was published \cite{Seibel85}. Various MD simulations of fully solvated RNA molecules with explicit inclusion of mobile ions indicated the importance of proper treatment of the environment of highly charged nucleic acids \cite{Lee95,Zichi95,Auffinger97,Auffinger99}. Given the central roles of RNA in the life of cells, it is important to understand the mechanism by which RNA forms three-dimensional structures endowed with properties such as catalysis, ligand binding, and recognition of proteins. Furthermore, the increasing awareness of the essential role of RNA in controlling viral replication and in bacterial protein synthesis emphasizes the potential of ribonucleic acids as targets for developing new antibacterial and antiviral drugs. Driven by fruitful collaborations in the Sonderforschungsbereich ``RNA-ligand interactions'', the model RNA systems in this study include various RNA tetraloops and the HIV-1 TAR RNA. For the latter system, the binding sites of heteroaromatic compounds have been studied employing automated docking calculations \cite{Goodsell90}. The results show that it is possible to use this tool to dock small rigid ligands to an RNA molecule, while large and flexible molecules are clearly problematic. The main part of this work is focused on MD simulations of RNA tetraloops.
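The propagation step at the heart of every MD run can be reduced to a few lines. The sketch below integrates a single particle in a 1D harmonic well with the velocity-Verlet scheme, the same symplectic integrator that MD packages apply, in 3D and with far richer force fields, to systems such as solvated RNA:

```python
# Velocity-Verlet integration of one particle in a 1D harmonic well.
# Positions and velocities are advanced together; the scheme is
# time-reversible and conserves energy over long runs to O(dt^2).

def force(x, k=1.0):
    return -k * x                      # harmonic restoring force

def velocity_verlet(x, v, dt, steps, m=1.0):
    f = force(x)
    traj = [x]
    for _ in range(steps):
        x += v * dt + 0.5 * (f / m) * dt * dt
        f_new = force(x)
        v += 0.5 * (f + f_new) / m * dt
        f = f_new
        traj.append(x)
    return traj, v
```

The bounded energy drift is what makes this family of integrators preferable to, say, plain Euler stepping for the nanosecond-scale trajectories discussed here.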
Serial correlation in dynamic panel data models with weakly exogenous regressor and fixed effects
(2005)
Our paper presents and compares two estimation methodologies for dynamic panel data models in the presence of serially correlated errors and weakly exogenous regressors. The first is the first-difference GMM estimator as proposed by Arellano and Bond (1991) and the second is the transformed Maximum Likelihood Estimator as proposed by Hsiao, Pesaran, and Tahmiscioglu (2002). We consider the fixed effects case and weakly exogenous regressors. The finite sample properties of both estimation methodologies are analysed within a simulation experiment. Furthermore, we present an empirical example to assess the performance of both estimators with real data. JEL Classification: C23, J64
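The intuition behind the first-difference estimator can be seen in a small simulation. The sketch below uses the simplest instrumental-variable version of the idea (the Anderson-Hsiao estimator, an ancestor of the Arellano-Bond GMM estimator); all parameters are invented for illustration:

```python
import random

# Simulate a dynamic panel y_it = rho*y_{i,t-1} + alpha_i + eps_it with
# fixed effects alpha_i, then estimate rho from the first-differenced
# equation dy_t = rho*dy_{t-1} + deps_t, instrumenting dy_{t-1} with the
# level y_{t-2} (valid because y_{t-2} predates both errors in deps_t).

def simulate_panel(n, t, rho, seed=0):
    rng = random.Random(seed)
    panel = []
    for _ in range(n):
        alpha = rng.gauss(0.0, 1.0)
        y = [alpha / (1.0 - rho)]      # start near the unit's long-run mean
        for _ in range(t):
            y.append(rho * y[-1] + alpha + rng.gauss(0.0, 1.0))
        panel.append(y)
    return panel

def anderson_hsiao(panel):
    num = den = 0.0
    for y in panel:
        for s in range(2, len(y) - 1):
            dy = y[s + 1] - y[s]       # current difference
            dy_lag = y[s] - y[s - 1]   # lagged difference (endogenous)
            z = y[s - 1]               # instrument: twice-lagged level
            num += z * dy
            den += z * dy_lag
    return num / den
```

Arellano-Bond GMM generalizes this by using all available lagged levels as instruments and weighting the resulting moment conditions optimally.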