The results of this thesis lie in the area of convex algebraic geometry, which is the intersection of real algebraic geometry, convex geometry, and optimization.
We study sums of nonnegative circuit polynomials (SONC) and their related cone, both geometrically and in application to polynomial optimization. SONC polynomials are certain sparse polynomials having a special structure in terms of their Newton polytopes and supports, and serve as a certificate of nonnegativity for real polynomials, which is independent of sums of squares.
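As a concrete illustration of a nonnegativity certificate via circuit polynomials (an example I add here, not one from the text): the Motzkin polynomial is a circuit polynomial, and its nonnegativity follows from checking the inner coefficient against the circuit number, which encodes an arithmetic-geometric mean inequality. A minimal sketch:

```python
import math
from itertools import product

# Motzkin polynomial f(x,y) = x^4 y^2 + x^2 y^4 + 1 - 3 x^2 y^2: a circuit
# polynomial whose outer exponents span a simplex containing the inner
# exponent (2,2). It is nonnegative but not a sum of squares.
outer = [(4, 2), (2, 4), (0, 0)]   # vertices of the Newton polytope
coeffs = [1.0, 1.0, 1.0]           # coefficients of the monomial squares
inner = (2, 2)                     # inner exponent; its coefficient is -3

# Barycentric coordinates of the inner exponent w.r.t. the outer ones
# (here the symmetric point of the simplex).
lam = [1/3, 1/3, 1/3]
assert all(math.isclose(sum(l * v[i] for l, v in zip(lam, outer)), inner[i])
           for i in range(2))

# Circuit number Theta_f = prod_j (c_j / lambda_j)^lambda_j; for a negative
# inner coefficient c, f is nonnegative iff |c| <= Theta_f (here |-3| <= 3).
theta = math.prod((c / l) ** l for c, l in zip(coeffs, lam))
print(math.isclose(theta, 3.0))  # True

# Numerical sanity check on a grid: f never dips below zero.
grid = [i / 25 for i in range(-50, 51)]
fmin = min(x**4 * y**2 + x**2 * y**4 + 1 - 3 * x**2 * y**2
           for x, y in product(grid, grid))
print(fmin >= -1e-9)  # True
```

Since |−3| equals the circuit number exactly, the polynomial lies on the boundary of the cone and attains zeros (at |x| = |y| = 1), matching the boundary case of the certificate.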
The first part of this thesis is dedicated to the convex geometric study of the SONC cone. As main results we show that the SONC cone is full-dimensional in the cone of nonnegative polynomials, we exactly determine the number of zeros of a nonnegative circuit polynomial, and we give a complete and explicit characterization of the number of zeros of SONC polynomials and forms. Moreover, we provide a first approach to the study of the exposed faces of the SONC cone and their dimensions.
In the second part of the thesis we use SONC polynomials to tackle constrained polynomial optimization problems (CPOPs).
As a first step, we derive a lower bound for the optimal value of CPOP based on SONC polynomials by using a single convex optimization program, which is a geometric program (GP) under certain assumptions. GPs are a special class of convex optimization problems that can be solved in polynomial time. We test the new method experimentally and provide examples comparing our new SONC/GP approach with Lasserre's relaxation, a common approach for tackling CPOPs, which approximates nonnegative polynomials via sums of squares and semidefinite programming (SDP). The new approach comes with the benefit that in practice GPs can be solved significantly faster than SDPs. Furthermore, increasing the degree of a given problem has almost no effect on the runtime of the new program, in sharp contrast to SDPs.
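To illustrate why GPs are efficiently solvable (a toy example of my own, not the authors' implementation): the substitution x = exp(t) turns a posynomial objective into a convex log-sum-exp function, so even a simple ternary search finds the global optimum; real GP solvers use interior-point methods.

```python
import math

# Toy geometric program: minimize the posynomial g(x) = x^2 + 1/x over x > 0.
# With x = exp(t) this becomes the convex log-sum-exp form
#   h(t) = log(exp(2t) + exp(-t)),
# minimized here by ternary search on t.

def h(t):
    return math.log(math.exp(2 * t) + math.exp(-t))

lo, hi = -5.0, 5.0
for _ in range(200):                      # ternary search on the convex h
    m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
    if h(m1) < h(m2):
        hi = m2
    else:
        lo = m1

x_opt = math.exp((lo + hi) / 2)
g_opt = x_opt**2 + 1 / x_opt
print(round(x_opt, 4), round(g_opt, 4))   # 0.7937 1.8899, i.e. 3 * 2**(-2/3)
```

The log transform is exactly what makes GPs convex, and hence tractable, in contrast to the nonconvex original formulation.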
As a second step, we establish a hierarchy of efficiently computable lower bounds converging to the optimal value of CPOP based on SONC polynomials. For a given degree each bound is computable by a relative entropy program. This program is also a convex optimization program, which is more general than a geometric program, but still efficiently solvable via interior point methods.
In this thesis we introduce the imaginary projection of (multivariate) polynomials as the projection of their variety onto its imaginary part, I(f) = { Im(z_1, ..., z_n) : f(z_1, ..., z_n) = 0 }. This induces a geometric viewpoint on stability, since a polynomial f is stable if and only if its imaginary projection does not intersect the positive orthant. Accordingly, the thesis is mainly motivated by the theory of stable polynomials.
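A minimal numerical illustration of these definitions in the univariate case, where the positive orthant is (0, ∞) (toy code assuming only the definition above, not a method from the thesis):

```python
import cmath

# Imaginary projection of a univariate polynomial: I(f) = { Im(z) : f(z) = 0 }.
# For n = 1, f is stable iff I(f) contains no strictly positive number.

def roots_quadratic(a, b, c):
    # closed-form roots of a z^2 + b z + c
    d = cmath.sqrt(b * b - 4 * a * c)
    return [(-b + d) / (2 * a), (-b - d) / (2 * a)]

def imaginary_projection(roots):
    return sorted(round(z.imag, 9) for z in roots)

def is_stable(proj):
    return max(proj) <= 0

# f(z) = z^2 + 1 has roots +/- i, hence I(f) = {-1, 1}: it meets (0, inf),
# so f is not stable.
print(imaginary_projection(roots_quadratic(1, 0, 1)))               # [-1.0, 1.0]

# g(z) = (z + i)(z + 2i) = z^2 + 3i z - 2 has I(g) = {-2, -1}: stable.
print(is_stable(imaginary_projection(roots_quadratic(1, 3j, -2))))  # True
```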
Interested in the number and structure of the components of the complement of imaginary projections, we show as a key result that there are only finitely many components, all of which are convex. This offers a connection to the theory of amoebas and coamoebas as well as to the theory of hyperbolic polynomials.
For hyperbolic polynomials, we show that hyperbolicity cones coincide with components of the complement of imaginary projections, which provides a strong structural relationship between these two sets. Based on this, we prove a tight upper bound for the number of hyperbolicity cones and, respectively, for the number of components of the complement in the case of homogeneous polynomials. Besides this, we investigate various aspects of imaginary projections and compute the imaginary projections of several classes of polynomials explicitly.
Finally, we initiate the study of a conic generalization of stability by considering polynomials whose roots have no imaginary part in the interior of a given real, n-dimensional, proper cone K. This appears to be very natural, since many statements known for univariate and multivariate stable polynomials can be transferred to the conic situation, like the Hermite-Biehler Theorem and the Hermite-Kakeya-Obreschkoff Theorem. When considering K to be the cone of positive semidefinite matrices, we prove a criterion for conic stability of determinantal polynomials.
Antimicrobial resistance has become a serious threat to worldwide public health in this century. A better understanding of the mechanisms by which bacteria infect host cells, and of how the host counteracts the invading pathogens, is an important subject of current research. Intracellular bacteria of the genus Salmonella have frequently been used as a model system for bacterial infections. Salmonella are ingested via contaminated food or water and cause gastroenteritis and typhoid fever in animals and humans. Once inside the gastrointestinal tract, Salmonella can invade intestinal epithelial cells. The host cell can fight intracellular pathogens by a process called xenophagy. For complex systems, such as the processes involved in the bacterial infection of cells, computational systems biology provides approaches to describe mathematically how these intertwined mechanisms function in the cell. Computational systems biology allows the analysis of biological systems at different levels of abstraction; functional dependencies as well as dynamic behavior can be studied. In this thesis, we used the Petri net formalism to gain better insight into bacterial infections and host defense mechanisms and to predict cellular behavior that can be tested experimentally. We also focused on the development of new computational methods.
In this work, the first mathematical model of the xenophagic capturing of Salmonella enterica serovar Typhimurium in epithelial cells was developed. The model, expressed in the Petri net formalism, was constructed in an iterative process of modeling and analysis. For model verification, we analyzed the Petri net, including the computational knockout experiments (in silico knockouts) established in this work. The in silico knockouts of the proposed Petri net are consistent with published experimental perturbation studies and thus support the biological credibility of the Petri net. In silico knockouts that have not yet been investigated experimentally provide hypotheses for future studies of the pathway.
To study the dynamic behavior of an epithelial cell infected with Salmonella enterica serovar Typhimurium, a stochastic Petri net was constructed. In experimental research, decisions such as "Which incubation time is needed to infect half of the epithelial cells with Salmonella?" are often based on experience or practicability. A mathematical model can help to answer such questions and improve experimental design. The stochastic Petri net models the cell at different stages of the Salmonella infection. We parameterized the model with a set of experimental data derived from different literature sources. The kinetic parameters of the stochastic Petri net determine the time evolution of the bacterial infection of a cell. The model captures the stochastic variation and heterogeneity of the intracellular Salmonella population of a single cell over time. The stochastic Petri net is a valuable tool to examine the dynamics of Salmonella infections in epithelial cells and generates valuable information for experimental design.
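Stochastic Petri nets of this kind are typically simulated with Gillespie's stochastic simulation algorithm. The following is an illustrative sketch with a single place (intracellular bacteria) and two transitions (replication, clearance); the rates are invented for illustration, not the parameters fitted in the thesis:

```python
import random

# Minimal Gillespie simulation of a birth-death process as a toy stochastic
# Petri net for an intracellular bacterial population.
random.seed(1)

def gillespie(n0, k_rep, k_clear, t_end):
    t, n = 0.0, n0
    traj = [(0.0, n0)]
    while t < t_end and n > 0:
        a_rep, a_clear = k_rep * n, k_clear * n   # transition propensities
        a_tot = a_rep + a_clear
        t += random.expovariate(a_tot)            # waiting time to next firing
        n += 1 if random.random() < a_rep / a_tot else -1
        traj.append((t, n))
    return traj

traj = gillespie(n0=5, k_rep=0.6, k_clear=0.4, t_end=10.0)
print(traj[-1])   # final (time, population) of one stochastic realization
```

Repeated runs of such a simulation reproduce exactly the cell-to-cell heterogeneity of the bacterial population that the abstract describes.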
In the last part of this thesis, a novel theoretical method to perform knockout experiments in silico was introduced. The new concept of in silico knockouts is based on the computation of signal flows at steady state and allows the determination of knockout behavior comparable to experimental perturbation behavior. In this context, we established the concept of Manatee invariants and demonstrated their suitability for in silico knockouts, as they reflect biological dependencies from signal initiation to response. As a proof of principle, we applied the proposed concept of in silico knockouts to the Petri net of the xenophagic recognition of Salmonella. To make in silico knockouts available to the scientific community, we implemented the novel method in the software isiKnock. isiKnock allows the automated execution and visualization of in silico knockouts in signaling pathways expressed in the Petri net formalism. In conclusion, the knockout analysis provides a valuable method to verify computational models of signaling pathways, to detect inconsistencies in the current knowledge of a pathway, and to predict unknown pathway behavior.
In summary, the main contributions of this thesis are the Petri net of the xenophagic capturing of Salmonella enterica serovar Typhimurium in epithelial cells to study the knockout behavior and the stochastic Petri net of an epithelial cell infected with Salmonella enterica serovar Typhimurium to analyze the infection dynamics. Moreover, we established a new method for in silico knockouts, including the concept of Manatee invariants and the software isiKnock. The results of these studies contribute to a better understanding of bacterial infections and provide valuable model analysis techniques for the field of computational systems biology.
Cells within a tissue form highly complex cellular interactions. This architecture is lost in two-dimensional cell cultures. To close the gap between two-dimensional cell cultures and in vivo tissues, three-dimensional cell cultures were developed. Three-dimensional cellular aggregates such as spheroids, organoids, or embryoid bodies have been established as an essential tool in many different areas of the life sciences, including tumour biology, drug screening and embryonic development. To fully take advantage of the third dimension, imaging techniques are essential. The emerging field of “image-based systems biology” exploits the information in images and builds a connection between experimental and theoretical investigation of biological processes at a spatio-temporal level. Such interdisciplinary approaches strongly depend on the development of protocols to establish three-dimensional cell cultures, innovations in sample preparation, well-suited imaging techniques and quantitative segmentation methods.
Although three-dimensional cell cultures and image-based systems biology offer great potential, two-dimensional methods have not yet been completely replaced by three-dimensional ones. The knowledge about many biological processes still relies on two-dimensional experiments, mainly due to methodical and technical hurdles. This thesis therefore provides a significant contribution to overcoming these hurdles and to further developing three-dimensional cell cultures. I established computational as well as experimental methods related to three-dimensional cellular aggregates and investigated fundamental cellular processes such as adhesion, growth and differentiation.
My study examined MMA training, and thereby the ‘back region’ of MMA, where the ‘everyday life’ of MMA takes place. I enquired into how MMA training corresponds with MMA’s self-description, namely the somehow self-contradicting notion that MMA fights would be dangerous combative goings-on of approximately real fighting, but that MMA fighters would be able to approach these incalculable and uncontrollable combative dangers as calculable and controllable risks. Conducting an ethnography in which I focused on the combination of participation and observation, I studied how the specific interaction organisations of the three core training practices of MMA training provide the training students with specific combative experiences and how they thereby construct the social reality that is MMA training.
The present work deals with stemmatology, i.e. primarily the reconstruction of the copying history of documents transmitted in manuscript form. The central object of stemmatology is the stemma, a visual representation of the copying history, which is usually given graph-theoretically as a tree or a directed acyclic graph, where the nodes represent witnesses (i.e. the text variants) while the edges stand for individual copying processes. At the heart of the discipline lie the question of the author's original (if a single such original ever existed) and the question of the reconstruction of its text. The stemma itself is a means to this main end (Cameron 1987). The original text, increasingly altered by the deviations characteristic of manual copying processes, is usually not transmitted directly. The aim of this work is to describe semi-automatic stemmatology comprehensively and to advance it through tools and analytical procedures. The first part of the work describes the history of computer-assisted stemmatology, including its classical precursors, and culminates in the presentation of a simple tool for the dynamic graphical display of stemmata. An excursus on the guiding philological phenomenon of the lectio difficilior discusses its possible psycholinguistic causes in the faster lexical access to high-frequency lexemes. The second part then examines the most existential of all stemmatological debates, initiated by Joseph Bédier, with mathematical arguments based on a stemmatic model proposed by Paul Maas in 1937. Furthermore, in this chapter the author simulates stemmata in order to estimate the potential influence of the distribution of copying frequencies per manuscript.
In the next part the author presents a self-compiled corpus in Persian, which is examined qualitatively alongside three of the well-known artificial corpora (Parzival, Notre Besoin, Heinrichi). Then, with the Multi Modal Distance, a method for stemma generation is applied which is based on external data of psycholinguistically determined letter-confusion probabilities. In the last part the author works with minimum spanning trees for stemma generation, conducting, evaluating and discussing a comparative study of four methods of distance-matrix generation combined with four methods of stemma generation.
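The minimum-spanning-tree approach can be sketched as follows (Prim's algorithm on an invented manuscript distance matrix; the thesis evaluates several real distance measures and tree-building methods):

```python
# Stemma sketch: connect manuscript witnesses by a minimum spanning tree
# over a pairwise distance matrix. Witnesses and distances are made up
# purely for illustration.
witnesses = ["A", "B", "C", "D"]
dist = [
    [0, 2, 7, 6],
    [2, 0, 3, 5],
    [7, 3, 0, 1],
    [6, 5, 1, 0],
]

def prim_mst(dist):
    """Prim's algorithm: grow the tree one cheapest edge at a time."""
    n = len(dist)
    in_tree, edges = {0}, []
    while len(in_tree) < n:
        w, i, j = min((dist[i][j], i, j)
                      for i in in_tree for j in range(n) if j not in in_tree)
        in_tree.add(j)
        edges.append((i, j, w))
    return edges

for i, j, w in prim_mst(dist):
    print(f"{witnesses[i]} -- {witnesses[j]} (distance {w})")
```

The resulting tree connects each witness to its closest relative, which is the unrooted skeleton from which a stemma hypothesis can then be oriented.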
Heat stress transcription factors (Hsfs) have an essential role in the heat stress response (HSR) and in thermotolerance by controlling the expression of hundreds of genes, including heat shock proteins (Hsps) with molecular chaperone functions. The Hsf family in plants shows a striking multiplicity, with more than 20 members in many species. In Solanum lycopersicum, HsfA1a was reported to act as the master regulator of the onset of HSR and to be essential for basal thermotolerance. Evidence for this was provided by the analysis of HsfA1a co-suppression (A1CS) transgenic plants, which exhibited hypersensitivity upon exposure to heat stress (HS) due to the inability of the plants to induce the expression of many HS genes, including HsfA2, HsfB1 and several Hsps. Completion of the tomato genome sequencing allowed the completion of the Hsf inventory, which consists of 27 members, including another three HsfA1 genes, namely HsfA1b, HsfA1c and HsfA1e.
Consequently, the suppression effect of the short interfering RNA in the A1CS line was re-evaluated for all HsfA1 genes. We found that expression of all HsfA1 proteins was suppressed in A1CS protoplasts. This result suggested that the model of a single master regulator needs to be re-examined.
Expression analysis revealed that HsfA1a is constitutively expressed in different tissues and in response to HS, while HsfA1c and HsfA1e are generally minimally expressed and show an induction during fruit ripening and a weak upregulation in the late HSR. In contrast, HsfA1b shows preferential expression in specific tissues and is strongly and rapidly induced in response to HS. At the protein level, HsfA1b and HsfA1e are rapidly degraded, while HsfA1a and HsfA1c show higher stability. In addition, HsfA1a and HsfA1c show a nucleocytosolic distribution, while HsfA1b and HsfA1e show a strong nuclear retention.
A major property of a master regulator in HSR is thought to be its ability to cause strong transactivation of a wide range of genes required for the initial activation of protective mechanisms. GUS reporter assays, as well as analysis of the levels of several endogenous transcripts in protoplasts transiently expressing HsfA1 proteins, revealed that HsfA1a can stimulate the transcription of many genes, while the other Hsfs have weaker activity and only on a limited set of target genes. The low activity of HsfA1c and HsfA1e can be attributed to the lower DNA-binding capacity of the two factors, as judged by a GUS reporter repressor assay.
HsfA1a has been shown to have synergistic activity with the stress-induced HsfA2 and HsfB1. The formation of such complexes is considered important for the stimulation of transcription and long-term stress adaptation. All HsfA1 members show synergistic activity with HsfA2, while only HsfA1a acts as a co-activator of HsfB1 and HsfA7. Interestingly, HsfA1b shows an exceptional synergistic activity with HsfA3, suggesting that different Hsf complexes might regulate different HS-related gene networks. Altogether, these results suggest that HsfA1a has unique characteristics within the HsfA1 subfamily. This is remarkable considering the very high sequence similarity among the HsfA1s, particularly between HsfA1a and HsfA1c.
To understand the molecular basis of this discrepancy, a series of domain-swapping mutants between HsfA1a and HsfA1c was generated. Oligomerization domain and C-terminal swaps did not affect the basal activity or co-activity of the proteins. Remarkably, an HsfA1a mutant harbouring the N-terminus of HsfA1c shows reduced activity and co-activity, while the reciprocal HsfA1c with the N-terminus of HsfA1a causes a gain of activity and enhanced DNA-binding capacity.
Sequence analysis of the DNA-binding domain (DBD) of the HsfA1 proteins revealed a divergence in the highly conserved C-terminus of the turn between the β3 and β4 sheets. Like the vast majority of HsfA1 proteins, HsfA1a carries an Arg residue (R107) at this position, while HsfA1c has a Leu and HsfA1e a Cys. An HsfA1a-R107L mutant has reduced DNA-binding capacity and consequently reduced activity. Therefore, the results presented here point to the essential function of this amino acid residue for DNA binding. Interestingly, the mutation did not affect the activity of the protein on Hsp70-1, suggesting that the functionality of the DBD, and consequently of the transcription factor, on different promoters with variable heat stress element number and architecture depends on structural peculiarities of the DBD.
In conclusion, the unique properties of HsfA1a, including its expression pattern, transcriptional activity, stability and DBD peculiarities, are likely responsible for its dominant function as the master regulator of HSR in tomato. The other HsfA1 members instead participate in HSR or developmental regulation by regulating specific sets of genes. Furthermore, HsfA1b and HsfA1e likely function as stress primers in specific tissues, while HsfA1c acts as a co-regulator in mild HSR. Thereby, tomato subclass A1 presents another example of functional diversity not only within the Hsf family but also within an Hsf subfamily of closely related members. The diversification based on DBD peculiarities is likely to occur in potato as well. This might have eliminated the functional redundancy observed in other species such as Arabidopsis thaliana, but has probably allowed a more refined regulation of Hsf networks, possibly under different stress regimes, tissues and cell types.
Photolabile protecting groups (PPGs, cages, photocages) are molecules which can block the activity of a functional group and be removed by irradiation with light of an appropriate wavelength. One of the goals of this work was to design new photolabile protecting groups based on one known from the literature. The far-UV-absorbing diethylamino benzyl (DEAMb) photocage, developed by Wang et al., was selected as the structural basis for this work. In order to trigger the uncaging reaction with longer wavelengths (≥365 nm), thus also allowing biological applications, its structure was optimized. This was done by elongating the π-orbital conjugation using biphenyl derivatives instead of a single aromatic moiety. The photocage was loaded with glutamic acid as the leaving group.
The highest bathochromic shift was shown by compounds with the smallest steric hindrance imposed on the second aromatic ring. The absorption spectrum was more red-shifted if the second aromatic ring contained an electron-withdrawing group. However, the stronger the substituent's electron-withdrawing strength, the lower the uncaging quantum yield. It was rationalized that this is due to a decreased excited-state electron density at the benzylic carbon of the DEAMb core, which is necessary to trigger bond dissociation. This was confirmed using TDDFT (time-dependent density functional theory) computations done by Jan von Cosel, Konstantin Falahati and Carsten Hamerla (from the group of Irene Burghardt). The best uncaging quantum yield was 42% for the m-phenyl-substituted DEAMb, while in the presence of a strong electron-withdrawing group (nitro group) there was no photoactivity at all.
In order to achieve better π-orbital conjugation of the non-coplanar biphenyl derivatives, a C-C bond was introduced between the benzylic carbon and the second aromatic ring. The resulting planar compounds belong to the fluorene class. The computational data predicted the photochemical meta effect to be preserved to some extent in these molecules. A set of fluorene derivatives was synthesized and photochemically characterized. The molar absorption coefficients of all prepared fluorene derivatives were higher than for any of the biphenyl derivatives. Quantum yields of the acetate release ranged between 3 and 42%, thus being as good as the best glutamic-acid-releasing biphenyl compounds. The highest uncaging cross section of the acetate release from the prepared fluorene derivatives was above 5000 M^-1 cm^-1. This value proves the high potential of the new fluorene-based photocages developed in this work. Furthermore, release of a hydroxide ion from fluorenol could be shown, along with the generation of, presumably, a fluorenyl cation. These intriguing results pave the way for further exploration of fluorene-based photocages for the release of poor leaving groups.
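The uncaging cross section quoted above is the product ε·Φ of the molar absorption coefficient and the uncaging quantum yield; a one-line sketch with an assumed ε value (the measured coefficients are in the thesis, not here):

```python
# Uncaging cross section = molar absorption coefficient (M^-1 cm^-1)
# times uncaging quantum yield (dimensionless). The epsilon below is a
# placeholder for illustration, not a measured value from this work.
def uncaging_cross_section(epsilon, phi):
    return epsilon * phi

# e.g. an assumed epsilon = 15000 M^-1 cm^-1 at the best quantum yield of 42%:
print(round(uncaging_cross_section(15_000, 0.42)))  # 6300, i.e. above 5000
```

The figure of merit explains why a higher absorption coefficient can compensate for a moderate quantum yield, which is exactly the advantage of the fluorene series over the biphenyls.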
The second part of this work describes the custom synthesis of 13C-labeled compounds for the VIPER (VIbrationally Promoted Electronic Resonance) project. In the VIPER pulse sequence, a molecule is vibrationally excited by a narrow-band IR-pump pulse. The following Vis-pump pulse promotes the vibrationally pre-excited molecules to an electronically excited state. This Vis-pump pulse is off-resonant for species that are not vibrationally pre-selected and resonant only with molecules already pre-excited by the IR-pump pulse. Since IR absorption bands are usually well resolved, a selective excitation of one molecule in an ensemble of similar ones is possible in the IR frequency range. Isotopologues and isotopomers are an extreme case of near-identical molecules, differing only in isotopic composition or position. As a result, in solution and at room temperature they have identical UV-Vis absorption spectra but different IR spectra. This allows vibrational excitation of only one isotopologue (or isotopomer).
Isotopic labels were introduced into known photocages: 7-diethylamino coumarin (DEACM) and para-hydroxy phenacyl (pHP). The position for isotopic label incorporation in these molecules was guided by computations done by Jan von Cosel and Carsten Neumann. To allow control of the photoreactions on an ultrafast timescale, an IR-active leaving group was used. The uncaging behavior of the prepared molecules at steady state was tested using chromatography (HPLC) and spectroscopy (1H NMR, FTIR and UV-Vis). The VIPER experiments were performed by Daniela Kern-Michler, Carsten Neumann, Nicole Mielke and Luuk van Wilderen (from the group of Jens Bredenbeck). A selective uncaging of only the vibrationally pre-excited molecules could be achieved.
This thesis presents the first measurement of the proton capture reaction on the isotope 124Xe performed in inverse kinematics. The experiment was carried out in June 2016 at the Experimental Storage Ring (ESR) at the GSI Helmholtz Centre for Heavy Ion Research in Darmstadt, Germany.
124Xe is one of about 35 p-nuclei, which, unlike the vast majority of heavy elements, cannot be produced via neutron-induced nucleosynthesis. Its production and destruction provide important information about the nucleosynthesis of the p-nuclei. Measuring the 124Xe(p,g)125Cs reaction also places strong constraints on its reverse reaction, 125Cs(g,p)124Xe.
Fully stripped 124Xe ions repeatedly passed an H2 gas jet target at five different energies between 5.5 MeV/u and 8 MeV/u. An electron cooler compensated for the energy loss in the target and reduced the beam momentum spread. The reaction product 125Cs55+ has a smaller magnetic rigidity than 124Xe54+. Therefore, 125Cs55+ was deflected towards smaller radii in the first dipole after the target area and thereby separated from 124Xe54+. It was detected with a position-sensitive Double-Sided Silicon Strip Detector (DSSSD). The novelty of this experiment was the installation of the DSSSD inside the ultra-high vacuum of the storage ring using a newly designed manipulator.
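The separation exploits that, at the (approximately) unchanged beam velocity, the magnetic rigidity Bρ = p/(qe) of 125Cs55+ is smaller than that of 124Xe54+. A rough back-of-the-envelope estimate, with ion masses approximated as A·u and binding-energy differences ignored (my simplification, not the experiment's analysis):

```python
import math

# Magnetic rigidity B*rho in T*m for an ion of mass number a, charge state q,
# and kinetic energy t_per_u in MeV/u (relativistic, mass approximated as A*u).
U_MEV = 931.494          # atomic mass unit in MeV/c^2

def rigidity_tm(a, q, t_per_u):
    gamma = 1.0 + t_per_u / U_MEV
    beta = math.sqrt(1.0 - 1.0 / gamma**2)
    p_mev_c = gamma * beta * a * U_MEV        # momentum in MeV/c
    return p_mev_c / (299.792458 * q)         # B*rho = p / (q e)

xe = rigidity_tm(124, 54, 7.0)   # 124Xe(54+) beam at 7 MeV/u
cs = rigidity_tm(125, 55, 7.0)   # 125Cs(55+) reaction product, same velocity
print(cs < xe)                   # True: Cs is deflected to smaller radii
```

At equal velocity the rigidity scales as A/q, and 125/55 < 124/54, which is why a single dipole after the target suffices to separate product from beam.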
Three High-Purity Germanium X-ray detectors were used to measure the X-rays following the Radiative Electron Capture (REC) events into 124Xe53+. The REC cross sections are well-known and were used to determine the luminosity.
The 124Xe(p,g)125Cs cross sections at ion beam energies between 5.5 MeV/u and 8 MeV/u were determined relative to the K-REC cross sections and finally compared to the theoretically predicted cross sections. While the theoretical predictions of the TENDL database are lower than the measured values by a factor of up to seven, the NON-SMOKER data are higher by a factor of up to two, except for the cross section at 7 MeV/u, where the NON-SMOKER data are slightly lower than the experimental value.
For the first time, a proton capture cross section could be measured in inverse kinematics close to the astrophysically relevant Gamow window. This allows the direct determination of the (p,g) cross section of isotopes with half-lives down to several minutes, which is not possible with any other technique.
Quantum chromodynamics (QCD) is the theory of the strong interaction between quarks and gluons. Due to confinement, at low energies quarks and gluons are bound into colorless states called hadrons. QCD is also asymptotically free, i.e. at large energies or densities it enters a deconfined state, termed the quark-gluon plasma (QGP), where quarks and gluons are quasi-free. This transition occurs at an energy scale around 200 MeV, where QCD cannot be treated perturbatively. Instead, it can be formulated on a space-time grid. The resulting theory, lattice quantum chromodynamics (LQCD), can be simulated efficiently on high-performance parallel-computing clusters. In recent years, graphics processing units (GPUs), which outperform CPUs in terms of parallel-computing and memory-bandwidth capabilities, have become very popular for LQCD computations. In this work, the QCD deconfinement transition is studied using CL2QCD, an LQCD application that runs efficiently on GPUs. Furthermore, CL2QCD is extended by a Rational Hybrid Monte Carlo algorithm for Wilson fermions to allow for simulations of an odd number of quark flavors.
Due to the sign problem, LQCD simulations are restricted to zero or very small baryon densities, where, in the limit of infinite quark mass, QCD has a first-order deconfinement phase transition associated with the breaking of the global centre symmetry. Including dynamical quarks breaks this symmetry explicitly. Lowering their mass weakens the first-order transition until it terminates in a second-order Z2 point. Beyond this point the transition is merely an analytic crossover. As the lattice spacing is decreased, the reduction of discretization errors causes the region of first-order transitions to expand towards lower masses. In this work the deconfinement critical point with 2 and 3 flavors of standard Wilson fermions is studied. To this end, several kappa values are simulated on temporal lattice extents 6, 8 and 10 (4) for two flavors (three flavors) and various aspect ratios (spatial lattice extent / temporal lattice extent), so as to extrapolate to the thermodynamic limit by applying finite-size scaling. For two flavors, we estimate whether and when a continuum extrapolation is possible.
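Finite-size scaling analyses of this kind commonly use the Binder cumulant of the order parameter to distinguish a first-order transition from a crossover; a toy illustration on synthetic samples (not lattice data and not the thesis' analysis code):

```python
import random

# Binder cumulant B4 = <m^4> / <m^2>^2 of centred samples m:
# B4 -> 3 for Gaussian fluctuations (crossover-like),
# B4 -> 1 for a two-peak distribution (first-order-like).
random.seed(7)

def binder(samples):
    mean = sum(samples) / len(samples)
    dev = [s - mean for s in samples]
    m2 = sum(d * d for d in dev) / len(dev)
    m4 = sum(d ** 4 for d in dev) / len(dev)
    return m4 / (m2 * m2)

gauss = [random.gauss(0.0, 1.0) for _ in range(20000)]          # crossover-like
two_peak = [random.choice((-1.0, 1.0)) for _ in range(20000)]   # first-order-like

print(round(binder(gauss), 2))     # close to 3
print(round(binder(two_peak), 2))  # close to 1
```

On the lattice, the crossing of such cumulants for increasing spatial volumes at fixed aspect ratio is what locates the Z2 critical point in the thermodynamic limit.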
The chiral and deconfinement phase transitions at zero density for light and heavy quarks, respectively, have analytic continuations to purely imaginary chemical potential, where no sign problem exists and LQCD simulations can be applied. At some critical value of the imaginary chemical potential, the transitions meet the endpoint of the Roberge-Weiss transition between adjacent Z3 sectors. For light and heavy quarks the transition lines meet in a triple point, while for intermediate masses they meet in a second-order point. At the boundary between these regimes the junction is a tricritical point, as shown in studies with two and three flavors of staggered and Wilson quarks on lattices with a temporal extent of 4. Employing finite-size scaling, the nature of this point as a function of the quark mass is studied in this work for two flavors of Wilson fermions with a temporal lattice extent of 6. Of particular interest is the change in the location of the tricritical points compared to an earlier study on lattices with a temporal extent of 4.