Quantum chromodynamics (QCD) is the theory of the strong interaction between quarks and gluons. Due to confinement, quarks and gluons are bound at low energies into colorless states called hadrons. QCD is also asymptotically free, i.e. at large energies or densities it enters a deconfined state, termed the quark-gluon plasma (QGP), in which quarks and gluons are quasi-free. This transition occurs at an energy scale around 200 MeV, where QCD cannot be treated perturbatively. Instead, it can be formulated on a space-time grid. The resulting theory, lattice quantum chromodynamics (LQCD), can be simulated efficiently on high-performance parallel-computing clusters. In recent years, graphics processing units (GPUs), which outperform CPUs in parallel-computing throughput and memory bandwidth, have become very popular for LQCD computations. In this work, the QCD deconfinement transition is studied using CL2QCD, an LQCD application that runs efficiently on GPUs. Furthermore, CL2QCD is extended by a Rational Hybrid Monte Carlo algorithm for Wilson fermions to allow simulations of an odd number of quark flavors.
Due to the sign problem, LQCD simulations are restricted to zero or very small baryon densities, where, in the limit of infinite quark mass, QCD has a first-order deconfinement phase transition associated with the breaking of the global centre symmetry. Including dynamical quarks breaks this symmetry explicitly. Lowering their mass weakens the first-order transition until it terminates in a second-order Z2 point; beyond this point the transition is merely an analytic crossover. As the lattice spacing is decreased, the reduction of discretization errors causes the region of first-order transitions to expand towards lower masses. In this work, the deconfinement critical point with 2 and 3 flavors of standard Wilson fermions is studied. To this end, several kappa values are simulated on temporal lattice extents 6, 8, and 10 (4) for two flavors (three flavors) and various aspect ratios (spatial lattice extent / temporal lattice extent) so as to extrapolate to the thermodynamic limit by finite-size scaling. For two flavors it is estimated whether and when a continuum extrapolation is possible.
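Finite-size scaling analyses of this kind typically classify the order of the transition through moments of an order parameter. As a hedged illustration (not the thesis code, which runs within CL2QCD), the Binder cumulant can be computed as below, with synthetic Gaussian data standing in for Monte Carlo samples:

```python
import numpy as np

def binder_cumulant(samples):
    """Kurtosis B4 = <m^4> / <m^2>^2 of an order-parameter sample.
    In the thermodynamic limit, B4 -> 1 deep in a first-order region,
    B4 -> 3 in a crossover region, and B4 takes the universal 3D Ising
    value (~1.604) at a second-order Z2 critical point."""
    m = np.asarray(samples, dtype=float)
    return np.mean(m**4) / np.mean(m**2) ** 2

# Toy check: purely Gaussian fluctuations (crossover-like signal)
# give B4 close to 3 for a large sample.
rng = np.random.default_rng(0)
print(binder_cumulant(rng.normal(size=100_000)))
```

Locating the critical kappa then amounts to finding where B4, measured at several aspect ratios, crosses the universal value.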
The chiral and deconfinement phase transitions at zero density for light and heavy quarks, respectively, have analytic continuations to purely imaginary chemical potential, where no sign problem exists and LQCD simulations can be applied. At some critical value of the imaginary chemical potential, the transitions meet the endpoint of the Roberge-Weiss transition between adjacent Z3 sectors. For light and heavy quarks the transition lines meet in a triple point, while for intermediate masses they meet in a second-order point. At the boundary between these regimes the junction is a tricritical point, as shown in studies with two and three flavors of staggered and Wilson quarks on lattices with a temporal extent of 4. Employing finite-size scaling, the nature of this point as a function of the quark mass is studied in this work for two flavors of Wilson fermions on lattices with a temporal extent of 6. Of particular interest is how the locations of the tricritical points shift compared to the earlier study at temporal extent 4.
Based on an original dataset of 100 important pieces of legislation passed during the three presidencies of William J. Clinton, George W. Bush, and Barack H. Obama (1992-2013), this study explores two sets of questions:
(1) How do presidents influence legislators in Congress in the legislative arena, and what factors have an effect on the legislative strategies presidents choose?
(2) How successful are presidents in getting their policy positions enacted into law, and what configurations of institutional and actor-centered conditions determine presidential legislative success?
The analyses show that in a hyper-polarized environment, presidents usually have to fight an uphill battle in the legislative arena, getting more involved when they face less favorable contexts and the odds are against them.
Moreover, the analyses suggest that there is no silver-bullet approach to presidents' legislative success. Instead, multiple patterns of success exist, as presidents, depending on the institutional and public environment, can resort to different combinations of actions in order to see their preferred policy outcomes enacted.
Powerful environment perception systems are a fundamental prerequisite for the successful deployment of intelligent vehicles, from advanced driver assistance systems to self-driving cars. Arguably the most essential task of such systems is the reliable detection and localization of obstacles in order to avoid collisions. Two particularly challenging scenarios in this context are represented by small, unexpected obstacles on the road ahead, and by potentially dynamic objects observed from a large distance. Both scenarios become exceedingly critical when the ego-vehicle is traveling at high speed. As a consequence, two major requirements placed on environment perception systems are the capability of (a) high-sensitivity generic object detection and (b) high-accuracy obstacle distance estimation. The present thesis addresses both requirements by proposing novel approaches based on stereo vision for spatial perception.
First, this work presents a novel method for the detection of small, generic obstacles and objects at long range directly from stereo imagery. The detection is based on sound statistical tests using local geometric criteria which are applicable to both static and moving objects. The approach is not limited to predefined sets of semantic object classes and does not rely on restrictive assumptions on the environment, such as oversimplified global ground surface models. Free-space and obstacle hypotheses are evaluated based on a statistical model of the input image data in order to avoid a loss of sensitivity through intermediate processing steps. In addition to the detection result, the algorithm simultaneously yields refined estimates of object distances, originating from an implicit optimization of the geometric obstacle hypothesis models. The proposed detection system provides multiple flexible output representations, ranging from 3D obstacle point clouds to compact mid-level obstacle segments to bounding box representations of object instances suitable for model-based tracking. The core algorithm concept lends itself to massive parallelization and can be implemented efficiently on dedicated hardware. Real-time execution is demonstrated on a test vehicle in real-world traffic. For a thorough quantitative evaluation of the detection performance, two dedicated datasets are employed, covering small and hard-to-detect obstacles in urban environments as well as distant dynamic objects in highway driving scenarios. The proposed system is shown to significantly outperform current general purpose obstacle detection approaches in both setups, providing a considerable increase in detection range while reducing the false positive rate at the same time.
Second, this work considers the high-accuracy estimation of object distances from stereo vision, particularly at long range. Several new methods for optimizing the stereo-based distance estimates of detected objects are proposed and compared to state-of-the-art concepts. A comprehensive statistical evaluation is performed on an extensive dedicated dataset, establishing reference values for the accuracy limits actually achievable in practice. Notably, the refined distance estimates implicitly provided by the proposed obstacle detection system are shown to yield highly accurate results, on par with the top-performing dedicated stereo matching algorithms considered in the analysis.
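The difficulty of high-accuracy distance estimation at long range follows directly from stereo geometry: depth is inversely proportional to disparity, so a fixed disparity error translates into a depth error that grows quadratically with distance. A minimal sketch, using illustrative camera parameters (`f_px`, `b_m`, `dd` are assumptions, not the thesis setup):

```python
# Rectified stereo: depth Z = f*b/d (focal length f in pixels,
# baseline b in metres, disparity d in pixels). A disparity
# uncertainty dd then causes a depth error dZ ~ (Z^2 / (f*b)) * dd.
f_px = 1200.0   # focal length [px], assumed
b_m = 0.30      # baseline [m], assumed
dd = 0.25       # disparity uncertainty [px], assumed

def depth(d_px):
    """Depth from disparity (and vice versa, since Z*d = f*b)."""
    return f_px * b_m / d_px

def depth_error(z_m):
    """First-order depth uncertainty at distance z_m."""
    return z_m**2 / (f_px * b_m) * dd

for z in (20.0, 50.0, 100.0):
    print(f"Z = {z:5.1f} m  ->  dZ ~ {depth_error(z):.2f} m")
```

Quintupling the distance multiplies the depth error by twenty-five, which is why sub-pixel disparity refinement of detected objects pays off most at long range.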
The central goal of this investigation is to describe the dynamic reaction of a multicellular tumour spheroid to treatment with radiotherapy. A particular focus is the cell cycle response triggered in the spheroid and how it can be exploited in fractionated radiation schedules.
An agent-based model of cancer cells is employed which features inherent cell cycle progression and reactions to environmental conditions. Cells are represented spatially by a weighted, dynamic and kinetic Voronoi/Delaunay model, which also identifies cells in contact within the multicellular aggregate. Force-based interaction between cells leads to rearrangement in response to proliferation and can induce cell quiescence via a mechanism of pressure-induced contact inhibition. The evolution of glucose and oxygen concentrations inside the tumour spheroid is tracked by a diffusion solver with in vitro or in vivo boundary conditions and local nutrient uptake by single cells.
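A toy one-dimensional stand-in for such a nutrient solver illustrates the mechanism: diffusion from a fixed boundary concentration competes with local uptake, so the concentration dips towards the centre of the aggregate. All parameters (`d_coef`, `uptake`, `c_boundary`) are purely illustrative, not the thesis values:

```python
import numpy as np

def relax_oxygen_1d(n=50, d_coef=1.0, uptake=0.05, c_boundary=1.0,
                    dx=1.0, steps=20_000):
    """Relax dc/dt = D * d2c/dx2 - uptake * c to its steady state
    with an explicit finite-difference scheme and fixed (in vitro-style)
    boundary concentrations on both ends."""
    c = np.full(n, c_boundary)
    dt = 0.2 * dx**2 / d_coef            # below the explicit stability limit
    for _ in range(steps):
        lap = np.zeros_like(c)
        lap[1:-1] = (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
        c[1:-1] += dt * (d_coef * lap[1:-1] - uptake * c[1:-1])
        c[0] = c[-1] = c_boundary        # fixed boundary concentration
    return c

c = relax_oxygen_1d()
print(f"boundary {c[0]:.2f}, centre {c[len(c) // 2]:.3f}")
```

The resulting profile reproduces the qualitative point of the model: cells deep inside the aggregate sit at much lower nutrient levels than those at the rim, which is what drives quiescence and hypoxia in the spheroid.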
Radiation effects are implemented based on the measured single-cell survival in the linear-quadratic model. The survival probability is affected by the radiosensitivity of the current cycle phase and the local oxygen concentration. Quiescent cells receive a reduced effective dose, reflecting their increased radioresistance. The radiation model includes a fast response to fatal DNA damage through cell apoptosis and a slow response via cell loss due to misrepair during the radiation-induced G2 block.
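The linear-quadratic model itself is standard: survival after a dose d is S = exp(-a*d - b*d^2), and oxygen status can be folded in as a dose-modifying factor. A minimal sketch (the parameter values `alpha`, `beta` and the oxygen enhancement ratio `oer` are illustrative, not the measured ones used in the thesis):

```python
import math

def lq_survival(dose_gy, alpha=0.2, beta=0.05, oer=1.0):
    """Linear-quadratic cell survival S = exp(-a*d - b*d^2).
    Hypoxic or quiescent cells see a reduced effective dose d/OER,
    i.e. a higher survival probability at the same physical dose."""
    d_eff = dose_gy / oer
    return math.exp(-alpha * d_eff - beta * d_eff**2)

# A 2 Gy fraction: hypoxic cells (OER ~ 3) survive far more often.
print(lq_survival(2.0))            # well-oxygenated
print(lq_survival(2.0, oer=3.0))   # hypoxic
```

Phase-dependent radiosensitivity would enter the same way, by making `alpha` and `beta` functions of the cell's current cycle phase.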
A simplified model for drug delivery in chemotherapy is implemented.
The model can describe the growth dynamics of spheroids in accordance to experimental data, including total number of cells, histological structure and cell cycle distribution. Investigations of possible mechanisms for growth saturation reveal a critical dependence of tumour growth on the shedding rate of cells from the surface.
In response to a dose of irradiation, a synchronisation of cell cycle progression within the tumour is observed. This leads to cyclic changes in the overall radiation sensitivity of the tumour, which are quantified with an enhancement measure relative to the expected radiosensitivity of the tumour. A transient strong peak in radiosensitivity enhancement is observed after administration of irradiation. Mechanisms influencing the timing and development of this peak are systematically investigated, revealing quiescence and reactivation of cells to be a central mechanism for the enhancement.
Direct redistribution of cells due to different survival in cell cycle phases, re-activation of quiescent cells in response to radiation-induced cell death and blocking of DNA damaged cells at the G2/M checkpoint are identified as the main mechanisms which contribute to a synchronisation and determine the radiosensitivity increase. A typical time scale for the development of radiosensitivity and the relaxation of tumours to a steady-state after irradiation is identified, which is related to the typical total cell cycle time.
A range of clinical radiotherapy schedules is tested for their performance within the simulation and a systematic comparison with alternative delivery schedules is performed, in order to identify schedules which can most effectively employ the described transient enhancement effects. In response to high-dose schedules, a dissolution of the tumour spheroid into smaller aggregates can be observed which is a result of the loss of integrity in the spheroid that is associated with high cell death via apoptosis. Fractionated irradiation of spheroids with constant dose per time unit but different inter-fraction times clearly reveals optimal time-intervals for radiation, which are directly related to the enhancement response of the tumour.
In order to test the use of triggered enhancement effects in tumours, combinations of trigger- and effector doses are examined for their performance in specific treatment regimens. Furthermore, the automatic identification and triggering in response to high enhancement periods in the tumour is analysed.
While triggered schedules and automatic schedules both yield a higher treatment efficiency than conventional schedules, treatment optimisation is revealed to be a global problem which cannot be solved sufficiently by local optimisation alone.
The spatio-temporal dynamics of hypoxia in the tumour are studied in response to irradiation. Microscopic, diffusion-induced reoxygenation dynamics are demonstrated to be on a typical time-scale which is in the order of fractionation intervals. Neoadjuvant chemotherapy with hydroxyurea can yield a drastic improvement of radiosensitivity via cell cycle synchronisation and specific toxicity against radioresistant S-phase cells.
The model makes clear predictions of radiation schedules which are especially effective as a result of triggered cell cycle-based radiosensitivity enhancement. Division of radiation into trigger and effector doses is highly effective and especially suited to be combined with adjuvant chemotherapy in order to limit regrowth of cells.
The fruit fly Drosophila melanogaster is one of the most important biological model organisms, but only the comparative approach with closely related species provides insights into the evolutionary diversification of insects. Of particular interest is the live imaging of fluorophores in developing embryos, which provides data for the analysis and comparison of three-dimensional morphogenesis as a function of time. However, for all species apart from Drosophila, for example the red flour beetle Tribolium castaneum, essentially no established standard operating procedures are available and the pool of data and resources is sparse. The goal of my PhD project was to address these limitations. I was able to accomplish the following milestones:
- Development of the hemisphere and cobweb mounting methods for the non-invasive imaging of Tribolium embryos in light sheet-based fluorescence microscopes and characterization of most crucial embryogenetic events.
- Comprehensive documentation of methods as protocols that describe (i) beetle rearing in the laboratory, (ii) preparation of embryos, (iii) calibration of light sheet-based fluorescence microscopes, (iv) recording over several days, (v) embryo retrieval as a quality control as well as (vi) data processing.
- Adaption of the methods to record and analyze embryonic morphogenesis of the Mediterranean fruit fly Ceratitis capitata and the two-spotted cricket Gryllus bimaculatus as well as integration of the data into an evolutionary context.
- Further development of the hemisphere method to allow the bead-based / landmark-based registration and fusion of three-dimensional images acquired along multiple directions to compensate the shadowing effect.
- Development of the BugCube, a web-based computer program that allows image data recorded with light sheet-based fluorescence microscopy to be shared with colleagues.
- Invention and experimental proof-of-principle of the (i) AGameOfClones vector concept that creates homozygous transgenic insect lines systematically. Additionally, partial proof-of-principle of the (ii) AClashOfStrings vector concept that creates double homozygous transgenic insect lines systematically, as well as preliminary evaluation of the (iii) AStormOfRecords vector concept that creates triple homozygous transgenic insect lines systematically.
- Creation and performance screening of more than fifty transgenic Tribolium lines for the long-term imaging of embryogenesis in fluorescence microscopes, including the first Lifeact and histone subunit-based lines.
My primary results contribute significantly to the advanced fluorescence imaging approaches of insect species beyond Drosophila. The image data can be used to compare different strategies of embryonic morphogenesis and thus to interpret the respective phylogenetic context. My technological developments extend the methodological arsenal for insect model organisms considerably.
Within my perspective, I emphasize the importance of non-invasive long-term fluorescence live imaging for establishing species-specific morphogenetic standards, discuss the feasibility of a morphologic ontology on the cellular level, suggest the ‘nested linearly decreasing phylogenetic relationship’ approach for evolutionary developmental biology, propose the live imaging of species hybrids to investigate speciation and finally outline how light sheet-based fluorescence microscopy contributes to the transition from on-demand to systematic data acquisition in developmental biology.
During my PhD project, I wrote a total of ten manuscripts, six of which were already published in peer-reviewed scientific journals. Additionally, I supervised four Master and two Bachelor projects whose scientific questions were inspired by the topic of my PhD work.
Inhibition of midbrain dopamine (DA) neurons codes for negative reward prediction errors, and causally affects conditioning learning. DA neurons located in the ventral tegmental area (VTA) display two-fold longer rebound delays from hyperpolarizing inhibition in comparison to those in the substantia nigra (SN). This difference has been linked to the slow inactivation of Kv4.3-mediated A-type currents (IA). One known suppressor of Kv4.3 inactivation is a splice variant of potassium channel interacting protein 4 (KChIP4), KChIP4a, which has a unique potassium channel inactivation suppressor domain (KISD) that is coded within exon 3 of the KChIP4 gene. Previous ex vivo experiments from our lab showed that the constitutive knockout of KChIP4 (KChIP4 KO) removes the slow inactivation of IA in VTA DA neurons, with marginal effects on SN DA neurons. KChIP4 KO also increased firing pauses in response to phasic hyperpolarization in these neurons. Here I show, using extracellular recordings combined with juxtacellular labeling in anesthetized mice, that KChIP4 KO also selectively changes the number and duration of spontaneous firing pauses of VTA DA neurons in vivo. Pauses were quantified with two different statistical methods, including one developed in house. No other firing parameter was affected, including mean frequency and bursting, and the activity of SN DA neurons was unaffected, suggesting that KChIP4 gene products have a highly specific effect on VTA DA neuron responses to inhibitory input.
Following up on this result, I developed a new mouse line (KChIP4 Ex3d) in which the KISD-coding exon 3 of KChIP4 is selectively excised by cre-recombinase expressed under the dopamine transporter (DAT) promoter, therefore disrupting the expression of KChIP4a only in midbrain DA neurons. I show that these mice have a highly selective behavioral phenotype, displaying a drastic acceleration in extinction learning, but no changes in acquisition learning, in comparison to control littermates. Computational fitting of the behavioral data with a modified Rescorla-Wagner model confirmed that this phenotype is congruent with a selective increase in learning from negative prediction errors. KChIP4 Ex3d mice also had normal open field exploration, novel object preference, hole board exploration and spontaneous alternation in a plus maze, indicating that exploratory drive, responses to novelty, anxiety, locomotion and working memory were not affected by the genetic manipulation. Furthermore, semi-quantitative IHC revealed that KChIP4 Ex3d mice have increased Kv4.3 expression in TH+ neurons, suggesting that the absence of KChIP4a increases the binding of other KChIP variants, which are known to increase surface expression of Kv4 channels.
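A modified Rescorla-Wagner model with separate learning rates for positive and negative prediction errors is one common way to capture such an asymmetry. The sketch below (parameter values are illustrative, not the fitted ones) shows how a larger negative-error learning rate accelerates extinction while leaving acquisition untouched:

```python
def rescorla_wagner(rewards, alpha_pos=0.1, alpha_neg=0.1, v0=0.0):
    """Rescorla-Wagner value update V += alpha * (r - V), with
    separate learning rates for positive and negative prediction
    errors. Returns the trajectory of the associative strength V."""
    v, trace = v0, [v0]
    for r in rewards:
        delta = r - v                            # prediction error
        alpha = alpha_pos if delta >= 0 else alpha_neg
        v += alpha * delta
        trace.append(v)
    return trace

# Extinction (reward omitted after acquisition): a larger alpha_neg
# drives V toward zero faster, mimicking accelerated extinction.
acquired = rescorla_wagner([1.0] * 50)[-1]
slow = rescorla_wagner([0.0] * 20, alpha_neg=0.1, v0=acquired)[-1]
fast = rescorla_wagner([0.0] * 20, alpha_neg=0.3, v0=acquired)[-1]
print(slow, fast)
```

Because `alpha_pos` is unchanged between the two runs, acquisition curves are identical; only the decay during extinction differs, matching the selectivity of the behavioral phenotype.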
Furthermore, in the course of my experimental study I identified that the most widely used mouse line in which cre-recombinase is expressed under the DAT promoter (DAT-cre KI) shows a different behavioral phenotype during conditioning relative to WT littermate controls. These animals displayed increased responding during the initial trials of acquisition and delayed response latency extinction, consistent with an increase in motivation, which is in line with a decrease in DAT function.
I propose a working model where the disruption of KChIP4a expression in DA neurons leads to an increase in binding of other KChIP variants to Kv4.3 subunits, promoting their increased surface expression and increasing IA current density; this then increases firing pauses in response to synaptic inhibition, which in behaving animals translates to an increase in negative prediction error-based learning.
In this thesis we introduce the imaginary projection of (multivariate) polynomials as the projection of their variety onto its imaginary part, I(f) = { Im(z_1, ..., z_n) : f(z_1, ..., z_n) = 0 }. This induces a geometric viewpoint on stability, since a polynomial f is stable if and only if its imaginary projection does not intersect the positive orthant. Accordingly, the thesis is mainly motivated by the theory of stable polynomials.
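In the univariate case the imaginary projection is simply the set of imaginary parts of the roots, so the stability criterion can be checked numerically. A small sketch (using numpy as an illustration, not part of the thesis):

```python
import numpy as np

def is_stable_univariate(coeffs, tol=1e-9):
    """A univariate polynomial f is stable iff it has no root in the
    open upper half-plane, i.e. its imaginary projection
    I(f) = {Im z : f(z) = 0} misses the positive reals.
    `coeffs` are in numpy.roots order (highest degree first)."""
    roots = np.roots(coeffs)
    return bool(np.all(roots.imag <= tol))

# z^2 + 1 has roots +-i, so I(f) = {-1, 1} meets the positive reals.
print(is_stable_univariate([1, 0, 1]))      # not stable
# z^2 + 2iz - 1 = (z + i)^2 has only the double root -i.
print(is_stable_univariate([1, 2j, -1]))    # stable
```

In several variables no such root-by-root check exists, which is what makes the geometry of the imaginary projection and its complement interesting.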
Interested in the number and structure of components of the complement of imaginary projections, we show as a key result that there are only finitely many components which are all convex. This offers a connection to the theory of amoebas and coamoebas as well as to the theory of hyperbolic polynomials.
For hyperbolic polynomials, we show that the hyperbolicity cones coincide with components of the complement of the imaginary projection, which provides a strong structural relationship between these two sets. Based on this, we prove a tight upper bound on the number of hyperbolicity cones and, correspondingly, on the number of components of the complement in the case of homogeneous polynomials. Besides this, we investigate various aspects of imaginary projections and explicitly compute the imaginary projections of several classes of polynomials.
Finally, we initiate the study of a conic generalization of stability by considering polynomials none of whose roots have an imaginary part in the interior of a given real, n-dimensional, proper cone K. This appears to be very natural, since many statements known for univariate and multivariate stable polynomials carry over to the conic situation, such as the Hermite-Biehler Theorem and the Hermite-Kakeya-Obreschkoff Theorem. Taking K to be the cone of positive semidefinite matrices, we prove a criterion for the conic stability of determinantal polynomials.
The endoplasmic-reticulum-associated protein degradation pathway ensures quality control of newly synthesized soluble and membrane proteins of the secretory pathway. Proteins failing to fold into their native structure are processed in a multistep process and finally ubiquitinated and degraded by the proteasome in order to protect the cell from proteotoxic stress. My thesis covers structural as well as functional studies of various protein components that constitute the protein complexes that are responsible for this process.
One sub-project addressed the mechanism of glycan recognition by Yos9 as part of the ERAD substrate selection. NMR solution structures of the mannose-6-phosphate homology (MRH) domain of Yos9 both in a free and glycan bound conformation reveal a gripping movement of loop regions upon binding of correctly processed glycan structures.
The main projects focused on revealing the mechanism of efficient ubiquitin chain assembly by the ERAD ubiquitination machinery. This included investigating the role of the ERAD components Cue1 and Ubc7 in processive ubiquitin chain formation, how ubiquitin chain conformations change during elongation, how the conformation of a chain is influenced by interacting proteins, and finally how the activity of the ERAD E2 enzyme Ubc7 is regulated by its cognate RING E3 ligases. Nuclear magnetic resonance (NMR) analysis and fluorescence-based ubiquitination assays show that the CUE domain of Cue1 contributes to efficient ubiquitin chain formation through its proximal binding preference as well as through its position-dependent accelerating effect. This is required to efficiently drive degradation of substrates. Specific ubiquitin binding events dictate and coordinate the spatial arrangement of the E2 enzyme relative to the distal tip of a chain. This process can be further accelerated by RING E3 ligases, which promote Ubc7 activity by roughly 20-fold by inducing allosteric changes around the catalytic cysteine. My results additionally suggest a model in which Ubc7 dimerization results in proximity-induced activation of the E2. Together, these mechanisms ensure rapid diubiquitin formation, followed by a CUE domain-assisted chain elongation mechanism in which Cue1 acts in an E4-like fashion.
How ubiquitin binding events can modulate the conformations of a ubiquitin chain was investigated by pulsed electron-electron double resonance (PELDOR) spectroscopy combined with molecular modeling. This shows that K48-linked diubiquitin samples a broad conformational space which can be modulated in distinct ways. The CUE domain of Cue1 uses conformational selection of pre-populated open conformations to support ubiquitin chain elongation. In contrast, deubiquitinating enzymes shift the conformational distribution towards weakly or even non-populated conformations to allow cleavage of the isopeptide bond that connects adjacent ubiquitins. Ubiquitin chain elongation increases the sampled conformational space, suggesting that this high conformational flexibility might contribute to efficient proteasomal recognition.
International society consists of states and the rules and institutions they share. Although international society has become a mundane feature of the world and the principal research focus of International Relations, it has become meaningless. More specifically, the technical rules that determine what states are and how they relate to other features of the world are units of semantic meaning, but their rampant, unprincipled proliferation has corroded their capacity to contain existential meaning. This deterioration is to be deplored because it alienates subjects from each other, it is totalising and excludes alternatives, and it is theoretically irreversible. To connect the two kinds of meaning, the first step is to reconceptualise international society as consisting strictly of constitutive rules whose meaning depends on the context they jointly compose, which implies that these rules can in turn be represented as signs in a semiotic structure. In order to evaluate the capacity of the signs to contain existential meaning, the next step is to adapt Baudrillard’s hierarchical typology of semiotic systems, ranging from the most meaningful systems based on symbolic exchange value to the vapid terminus of hyperreality based on sign value, in which semantic meaning is without value and existential meaning is impossible. The narrative traces the history of the signs of international law from the premodern period, when Christendom was understood as an approximation of the divine kingdom and a vehicle for salvation, to the present postmodern period, in which hundreds of articles of international maritime law make the decision to go to war over isolated rocks intelligible – even rational – and international trade law catalogues potato products to six digits. 
Three cases in particular exemplify this devolution in international law: the laws determining the territorial sea, the most-favoured-nation principle of international trade law, and nationality as a normative basis for statehood.