The extinction of conditioned fear depends on an efficient interplay between the amygdala and the medial prefrontal cortex (mPFC). In rats, high-frequency electrical mPFC stimulation has been shown to improve extinction by reducing amygdala activity. However, it is so far unclear whether stimulation of homologous regions in humans might have similar beneficial effects. Healthy volunteers received one session of either active or sham repetitive transcranial magnetic stimulation (rTMS) covering the mPFC while undergoing a 2-day fear conditioning and extinction paradigm. Repetitive TMS was applied offline after fear acquisition, in which one of two faces (CS+ but not CS−) was associated with an aversive scream (UCS). Immediate extinction learning (day 1) and extinction recall (day 2) were conducted without UCS delivery. Conditioned responses (CR) were assessed in a multimodal approach using fear-potentiated startle (FPS), skin conductance responses (SCR), functional near-infrared spectroscopy (fNIRS), and self-report scales. Consistent with the hypothesis of modulated processing of conditioned fear after high-frequency rTMS, the active group showed reduced CS+/CS− discrimination during extinction learning, as evident in FPS as well as in SCR and arousal ratings. FPS responses to CS+ further showed a linear decrement throughout both extinction sessions. This study describes the first experimental approach to influencing conditioned fear by rTMS and can thus be a basis for future studies investigating mPFC stimulation as a complement to cognitive behavioral therapy (CBT).
This paper provides a systematic analysis of individual attitudes towards ambiguity, based on laboratory experiments. The design of the analysis makes it possible to capture individual behavior across various levels of ambiguity, ranging from low to high. Attitudes towards risk and attitudes towards ambiguity are disentangled, providing pure measures of ambiguity aversion. Ambiguity aversion is captured in several ways, i.e., as a discount factor net of a risk premium, and as an estimated parameter in a generalized utility function. We find that ambiguity aversion varies across individuals and with the level of ambiguity, being most prominent for intermediate levels. Around one third of subjects show no aversion, one third show maximum aversion, and one third show intermediate levels of ambiguity aversion, while there is almost no ambiguity seeking. While most theoretical work on ambiguity builds on maxmin expected utility (MEU), our results provide evidence that MEU does not adequately capture individual attitudes towards ambiguity for the majority of individuals. Instead, our results support models that allow for intermediate levels of ambiguity aversion. Moreover, we find risk aversion to be statistically unrelated to ambiguity aversion on average. Taken together, the results support the view that ambiguity is an important and distinct argument in decision making under uncertainty.
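One class of models consistent with such findings is α-maxmin expected utility, of which MEU is the special case α = 1, while intermediate α captures intermediate ambiguity aversion. A minimal illustrative sketch; the function name and data layout are ours, not the paper's:

```python
def alpha_maxmin(priors, alpha):
    """alpha-maxmin expected utility over a set of priors.
    Each prior is a list of (probability, utility) pairs.
    alpha = 1 recovers MEU (maximal ambiguity aversion),
    alpha = 0 is maximal ambiguity seeking, and intermediate
    alpha represents intermediate ambiguity attitudes."""
    eus = [sum(p * u for p, u in prior) for prior in priors]
    return alpha * min(eus) + (1 - alpha) * max(eus)

# Ambiguous bet: the chance of winning (utility 1) is somewhere
# between 0.2 and 0.8, represented by two candidate priors.
priors = [[(0.2, 1.0), (0.8, 0.0)],
          [(0.8, 1.0), (0.2, 0.0)]]
```

Under full ambiguity aversion (α = 1) the bet is valued at its worst-case expected utility of 0.2; an intermediate α = 0.5 values it at 0.5.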
The n_TOF facility operates at CERN with the aim of meeting the demand for high-accuracy nuclear data for advanced nuclear energy systems as well as for nuclear astrophysics. Thanks to the features of the neutron beam, important results have been obtained on neutron-induced fission and capture cross sections of U, Pu and minor actinides. Recently, the construction of another beam line has started; the new line will be complementary to the first one, making it possible to further extend the experimental program foreseen for the next measurement campaigns.
We present the results of two-pion production in tagged quasi-free np collisions at a deuteron incident beam energy of 1.25 GeV/c measured with the High-Acceptance Di-Electron Spectrometer (HADES) installed at GSI. The specific acceptance of HADES made it possible for the first time to obtain high-precision data on π+π− and π−π0 production in np collisions in a region corresponding to large transverse momenta of the secondary particles. The obtained differential cross section data provide strong constraints on the production mechanisms and on the various baryon resonance contributions (∆∆, N(1440), N(1520), ∆(1600)). The invariant mass and angular distributions from the np → npπ+π− and np → ppπ−π0 reactions are compared with different theoretical model predictions.
The elements in the universe are mainly produced by charged-particle fusion reactions and neutron-capture reactions. About 35 proton-rich isotopes, the p-nuclei, cannot be produced via neutron-induced reactions. To date, nucleosynthesis simulations of possible production sites fail to reproduce the p-nuclei abundances observed in the solar system. In particular, the origin of the light p-nuclei 92Mo, 94Mo, 96Ru and 98Ru is poorly understood. The nucleosynthesis simulations rely on assumptions about the seed abundance distributions, the nuclear reaction network and the astrophysical environment. This work addressed the nuclear data input.
The key reaction 94Mo(γ,n) for the production ratio of the p-nuclei 92Mo and 94Mo was investigated via Coulomb dissociation at the LAND/R3B setup at GSI Helmholtzzentrum für Schwerionenforschung in Darmstadt, Germany. A beam of 94Mo with an energy of 500 AMeV was directed onto a lead target. The neutron-dissociation reactions following the Coulomb excitation by virtual photons of the electromagnetic field of the target nucleus were investigated. All particles in the incoming and outgoing channels of the reaction were identified and their kinematics were determined in a complex analysis. The systematic uncertainties were analyzed by calculating the cross sections for all possible combinations of the data selection criteria. The integral Coulomb dissociation cross section of the reaction 94Mo(γ,n) was determined to be (571 ± 14 (stat) ± 46 (syst)) mb. The result was compared to the data obtained in a real photon experiment carried out at the Saclay linear accelerator. The ratio of the integral cross sections was found to be 0.63 ± 0.07, which is lower than the expected value of about 0.8.
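The quoted statistical and systematic uncertainties combine in quadrature, and standard error propagation carries them into the cross-section ratio. A sketch of that arithmetic; the Saclay reference cross section and its uncertainty below are hypothetical placeholders (the abstract quotes only the ratio), chosen so that the ratio comes out at 0.63:

```python
import math

# Quadrature combination of the quoted uncertainties on 94Mo(gamma,n):
# sigma = (571 +/- 14 (stat) +/- 46 (syst)) mb
sigma, stat, syst = 571.0, 14.0, 46.0
total = math.sqrt(stat**2 + syst**2)            # combined uncertainty, ~48 mb

# Propagation to a ratio r = sigma / sigma_ref. The reference value and
# its uncertainty are PLACEHOLDERS, not data from the thesis.
sigma_ref, d_ref = sigma / 0.63, 50.0
r = sigma / sigma_ref
dr = r * math.sqrt((total / sigma)**2 + (d_ref / sigma_ref)**2)
```

For uncorrelated uncertainties, the relative errors add in quadrature for a ratio, which is what the last line implements.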
The nucleosynthesis of the light p-nuclei 92Mo, 94Mo, 96Ru and 98Ru was investigated in post-processing nucleosynthesis simulations within the NuGrid research platform. The impact of rate uncertainties of the most important production and destruction reactions was studied for a type II supernova model. It could be shown that the light p-nuclei are mainly produced via neutron-dissociation reactions on heavier nuclei in the isotopic chains, and that the final abundances of these p-nuclei are determined by their main destruction reactions. The nucleosynthesis of 92Mo and 94Mo was also studied in different environments of a type Ia supernova model. It was concluded that the maximum temperature and the duration of the high-temperature phase determine the final abundances of 92Mo and 94Mo.
A measurement of the transverse momentum spectra of jets in Pb-Pb collisions at √sNN = 2.76 TeV is reported. Jets are reconstructed from charged particles using the anti-kT jet algorithm with jet resolution parameters R of 0.2 and 0.3 in pseudo-rapidity |η|<0.5. The transverse momentum pT of charged particles is measured down to 0.15 GeV/c which gives access to the low pT fragments of the jet. Jets found in heavy-ion collisions are corrected event-by-event for average background density and on an inclusive basis (via unfolding) for residual background fluctuations and detector effects. A strong suppression of jet production in central events with respect to peripheral events is observed. The suppression is found to be similar to the suppression of charged hadrons, which suggests that substantial energy is radiated at angles larger than the jet resolution parameter R=0.3 considered in the analysis. The fragmentation bias introduced by selecting jets with a high pT leading particle, which rejects jets with a soft fragmentation pattern, has a similar effect on the jet yield for central and peripheral events. The ratio of jet spectra with R=0.2 and R=0.3 is found to be similar in Pb-Pb and simulated PYTHIA pp events, indicating no strong broadening of the radial jet structure in the reconstructed jets with R<0.3.
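For illustration, the anti-kT distance measure underlying such jet reconstruction can be sketched in a few lines. This is a toy O(N³) implementation with a simplified pt-weighted recombination, not the full four-momentum scheme used by production tools such as FastJet:

```python
import math

def anti_kt(particles, R=0.3):
    """Toy anti-kT clustering. particles: list of (pt, y, phi);
    returns final jets as (pt, y, phi) tuples. Recombination is a
    naive pt-weighted average (a simplification for illustration)."""
    pj = [tuple(p) for p in particles]
    jets = []
    while pj:
        best_d, bi, bj = None, None, None
        for i, (pti, yi, phii) in enumerate(pj):
            diB = pti ** -2                      # particle-beam distance
            if best_d is None or diB < best_d:
                best_d, bi, bj = diB, i, None
            for j in range(i + 1, len(pj)):
                ptj, yj, phij = pj[j]
                dphi = abs(phii - phij)
                dphi = min(dphi, 2 * math.pi - dphi)
                dR2 = (yi - yj) ** 2 + dphi ** 2
                dij = min(pti ** -2, ptj ** -2) * dR2 / R ** 2
                if dij < best_d:
                    best_d, bi, bj = dij, i, j
        if bj is None:                           # beam distance smallest:
            jets.append(pj.pop(bi))              # promote to final jet
        else:                                    # merge pseudojets i and j
            pti, yi, phii = pj[bi]
            ptj, yj, phij = pj[bj]
            pt = pti + ptj
            merged = (pt, (pti * yi + ptj * yj) / pt,
                      (pti * phii + ptj * phij) / pt)
            pj.pop(bj); pj.pop(bi)               # bj > bi: pop larger first
            pj.append(merged)
    return jets
```

Because the distance measure weights by the inverse squared pt, hard particles cluster first and soft fragments attach to them, which is what makes anti-kT jets nearly circular cones of radius R.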
This thesis is structured into 7 chapters:
• Chapter 2 gives an overview of the interaction of ultrashort high-intensity lasers with matter. The laser interaction with an induced plasma is described, starting from the kinematics of single-electron motion, followed by collective electron effects, the ponderomotive motion in the laser focus and the plasma transparency for the laser beam. Three different mechanisms for accelerating and propagating electrons through matter are discussed. The subsequent indirect acceleration of protons is explained by the Target Normal Sheath Acceleration (TNSA) mechanism. Finally, some possible applications of laser-accelerated protons are explained briefly.
• Chapter 3 deals with the modeling of the geometry and field mapping of the magnetic lens. Initial proton and electron distributions, fitted to measured PHELIX data, are generated; a brief description of the employed codes and simulation techniques is given; and the aberrations at the solenoid focal spot are studied.
• Chapter 4 presents a simulation study of suggested corrections to optimize the proton beam as a subsequent beam source. Two tools have been employed in these corrections: an aperture placed at the solenoid focal spot as an energy selection tool, and a scattering foil placed in the proton beam to smooth the radial energy correlation of the beam profile at the focal spot caused by chromatic aberrations. A further correction has been investigated to optimize the beam radius at the focal spot by controlling the lens geometry.
• Chapter 5 presents a simulation study of the de-neutralization problem in TNSA caused by the fringing fields of the pulsed magnetic solenoid and quadrupole. In this simulation, we followed an electrostatic model in which the evolution of both self and mutual fields through the pulsed magnetic solenoid could be found; this is not the case for the quadrupole, where only the growth of self fields could be found. The field mapping of the magnetic elements is generated by a Matlab program, while the TraceWin code is employed to study the tracking through the magnetic elements.
• Chapter 6 describes the PHELIX laser parameters at GSI with the chirped pulse amplification (CPA) technique, and Gafchromic radiochromic film (RCF) as a spatially resolving energy film detector. The results of laser proton acceleration experiments, which were performed in two experimental areas at GSI (the Z6 area and the PHELIX Laser Hall (PLH)), are presented in section 6.3.
• Chapter 7 summarizes the main results of this work, draws conclusions and gives a perspective for future experimental activities.
Mathematical modeling of Arabidopsis thaliana with focus on network decomposition and reduction
(2014)
Systems biology has become an important research field during the last decade. It focuses on understanding the systems that generate the measured data. An important part of this research field is network analysis, the investigation of biological networks. An essential point of the inspection of these network models is their validation, i.e., the successful comparison of predicted properties to measured data. Here, especially Petri nets have shown their usefulness as a modeling technique, coming with sound analysis methods and an intuitive representation of biological network data.
A very important tool for network validation is the analysis of the transition invariants (TI), which represent possible steady-state pathways, and the investigation of the liveness property. The computational complexity of determining both the TI and the liveness property often hampers their investigation.
To investigate this issue, a metabolic network model is created. It describes the core metabolism of Arabidopsis thaliana, and it is solely based on data from the literature. The model is too complex to determine the TI and the liveness property.
Several strategies are followed to enable an analysis and validation of the network. A network decomposition is utilized in two different ways: manually, motivated by the idea of preserving the integrity of biological pathways, and automatically, motivated by the idea of minimizing the number of crossing edges. As a decomposition may not preserve important properties such as coveredness, a network reduction approach is suggested, which is mathematically proven to conserve these important properties. To deal with the large amount of data coming from the TI analysis, new organizational structures are proposed. The liveness property is investigated by reducing the complexity of the calculation method and adapting it to biological networks.
The results obtained by these approaches suggest a valid network model. In conclusion, the proposed approaches and strategies can be used in combination to allow the validation and analysis of highly complex biological networks.
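As a minimal illustration of the transition-invariant concept used above: a T-invariant is a non-negative firing-count vector x with C·x = 0, where C is the net's incidence matrix, so firing each transition the given number of times reproduces the original marking. The toy net and function names below are ours, not the thesis model:

```python
def is_t_invariant(C, x):
    """Check whether firing-count vector x is a T-invariant of a Petri net
    with incidence matrix C (rows = places, columns = transitions).
    C @ x == 0 means the marking is reproduced, i.e. x describes a
    possible steady-state pathway."""
    return all(sum(row[t] * x[t] for t in range(len(x))) == 0 for row in C)

# Toy net: transition t1 converts place A into B, t2 converts B back to A.
C = [[-1,  1],   # place A: consumed by t1, produced by t2
     [ 1, -1]]   # place B: produced by t1, consumed by t2
```

Firing t1 and t2 once each (x = [1, 1]) returns the net to its initial marking, so [1, 1] is a T-invariant, whereas firing t1 alone is not.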
A recent paper on the phylogenetic relationships of species within the cephalopod family Mastigoteuthidae marked great progress in stabilizing the classification of the family. The authors, however, left the generic placement of Mastigoteuthis pyrodes unresolved. This problem is corrected here by placing this species in a new monotypic genus, Mastigotragus, based on unique structures of the photophores and the funnel/mantle locking apparatus.
More than 100 years after Henry James’s death, criticism is still working through unresolved gender issues in his fiction. This study proposes a new interdisciplinary approach to the gendered power relations in James’s novels that fills a crucial vacancy in the literature. Reading James’s intricately woven narrative form through the lens of relational sociology, specifically Pierre Bourdieu’s concept of symbolic domination, reconciles some of the most fiercely disputed positions in James studies of the past decades. With its focus on gender-related symbolic domination, this study demonstrates this approach’s potential to probe the depths of James’s fictional social worlds while developing the narratological tools to do so.
Many critics have paid attention to the relational nature of James’s social fictions as well as his talent for capturing unspoken, invisible, hidden social constraints. Blatantly missing from the literature is a systematic relational analysis of the specifically Jamesian method of narrating the socio-psychological, embodied responses to power and oppression. The present study closes this research gap. It reveals how James persistently narrates his characters as social agents whose perception, affects, and bodily practices are products of the social structures that they in turn continue to shape and reproduce. Moreover, it traces a development throughout James’s career that reflects his growing sensitivity to the stubbornness of some seemingly insurmountable social constraints. James’s fictional social worlds are relational ones through and through. This study is the first sustained effort to investigate the way in which his narratives capture this interrelatedness.
This article explores life insurance consumption in 31 European countries from 2003 to 2012 and aims to investigate the extent to which market transparency can affect life insurance demand. The cross-country evidence for the entire sample period shows that greater market transparency, which resolves asymmetric information, can generate a higher demand for life insurance. However, when considering the financial crisis period (2008-2012) separately, the results suggest a negative impact of enhanced market transparency on life insurance consumption. The mixed findings imply a trade-off between the reduction in adverse selection under greater market transparency and the possible negative effects on life insurance consumption during the crisis period due to more effective market discipline. Furthermore, this article studies the extent to which transparency can influence the reaction of life insurance demand to bad market outcomes: i.e., low solvency ratios or low profitability. The results indicate that the markets with bad outcomes generate higher life insurance demand under greater transparency compared to the markets that also experience bad outcomes but are less transparent.
Many Zanjian settlements (8th to 13th centuries AD) on Tanzania’s coast are considered to have collapsed and are not regarded as belonging to the formation of the Swahili culture (13th to 16th centuries AD). In this regard, Swahili traditions found on Tanzania’s coast are seldom linked to local Zanjian precursors but to external influence, especially from the Lamu archipelago on the Kenyan coast. Nevertheless, new archaeological evidence from Pangani Bay on the northern coast of Tanzania suggests that external influences on cultural continuity and change from the Zanjian to the Swahili period are overemphasized. This conclusion is grounded on archaeological fieldwork conducted in the surroundings of Pangani Bay in 2010 and 2012, where major Swahili sites directly overlie Zanjian sites without recognizable changes in the cultural materials. The study compares and contrasts cultural materials (in particular pottery) and the remains of economic and trade traditions (fauna and glass beads) from both the Zanjian and Swahili phases. The aim of this comparative analysis is to trace change and continuity in archaeological traditions for a better understanding of the origin of Swahili culture in Pangani Bay.
In this endeavour, the analysis of ceramics, faunal remains and glass beads from Pangani Bay suggests negligible differences in material and economic traditions from the late 1st to the 2nd millennium AD. That is, local ceramic styles used by the Swahili show only minor differences from those used by their ancestors, while faunal data suggest a similarity in subsistence economy between the Zanjian and Swahili periods. Correspondingly, glass bead data indicate that although maritime trade became highly sophisticated during the Swahili period, early involvement in long-distance oceanic trade began in the Zanjian period. This thesis brings all these issues together. It presents research objectives, fieldwork methods as well as analysis and interpretation of the results, with a main focus on the ceramic, faunal and bead data. With the support of the archaeological evidence, the current work concludes that there is more continuity than change in most of the Zanjian traditions that facilitated the origin of Swahili culture in Pangani Bay.
The predictive likelihood is of particular relevance in a Bayesian setting when the purpose is to rank models in a forecast comparison exercise. This paper discusses how the predictive likelihood can be estimated for any subset of the observable variables in linear Gaussian state-space models with Bayesian methods, and proposes to utilize a missing-observations-consistent Kalman filter in the process of achieving this objective. As an empirical application, we analyze euro area data and compare the density forecast performance of a DSGE model to DSGE-VARs and reduced-form linear Gaussian models.
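As a sketch of the mechanics involved: for a univariate linear Gaussian state-space model, the log-likelihood accumulates the Kalman filter's one-step-ahead innovation terms, and a missing-observations-consistent filter simply skips the update step when an observation is absent. The parametrization below is illustrative, not the paper's notation:

```python
import math

def kalman_loglik(y, a, q, h, r, m0, p0):
    """Log-likelihood for the model x_t = a*x_{t-1} + w_t (Var q),
    y_t = h*x_t + v_t (Var r), with prior mean m0 and variance p0.
    Entries of y equal to None are treated as missing observations:
    the filter predicts but skips the measurement update."""
    ll, m, p = 0.0, m0, p0
    for yt in y:
        m, p = a * m, a * a * p + q          # predict step
        if yt is None:                        # missing observation
            continue
        f = h * h * p + r                     # innovation variance
        v = yt - h * m                        # innovation
        ll += -0.5 * (math.log(2 * math.pi * f) + v * v / f)
        k = p * h / f                         # Kalman gain
        m, p = m + k * v, (1 - k * h) * p     # update step
    return ll
```

Because each innovation term is a predictive density evaluated at the realized observation, summing them over a hold-out sample is one way the predictive likelihood for a subset of observables can be obtained, with the missing-data handling covering the variables left out.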
Mapping is an important tool for the management of plant invasions. If landscapes are mapped in an appropriate way, results can help managers decide when and where to prioritize their efforts. We mapped vegetation with the aim of providing key information for managers on the extent, density and rates of spread of multiple invasive species across the landscape. Our case study focused on an area of Galapagos National Park that is faced with the challenge of managing multiple plant invasions. We used satellite imagery to produce a spatially explicit database of plant species densities in the canopy, finding that 92% of the humid highlands had some degree of invasion and 41% of the canopy was comprised of invasive plants. We also calculated the rate of spread of eight invasive species using known introduction dates, finding that species with the most limited dispersal ability had the slowest spread rates while those able to disperse long distances had a range of spread rates. Our results on spread rate fall at the lower end of the range of published spread rates of invasive plants. This is probably because most studies are based on the entire geographic extent, whereas our estimates took plant density into account. A spatial database of plant species densities, such as the one developed in our case study, can be used by managers to decide where to apply management actions and thereby help curtail the spread of current plant invasions. For example, it can be used to identify sites containing several invasive plant species, to find the density of a particular species across the landscape or to locate where native species make up the majority of the canopy. Similar databases could be developed elsewhere to help inform the management of multiple plant invasions over the landscape.
Telecommunications companies traditionally offer several tariffs from which their customers can choose the one that best suits their preferences. Yet, customers sometimes make choices that are not optimal for them because they do not minimize their bill for a certain usage amount. We show in this paper that companies should be very concerned about choices in which customers pick tariffs that are too small for them, because such choices lead to a significant increase in customer churn. In contrast, this is not the case if customers choose tariffs that are too big for them. The reason is that flat rates in particular provide customers with the additional benefit of guaranteeing a constant bill amount, so that consumption can be enjoyed more freely because all costs are already accounted for.
Financial service providers face serious problems if many of their customers leave quickly, because such customers have little long-term value. Still, current reporting primarily focuses on current profitability, which represents the short-term value of the customers. The long-term value typically receives little attention. Customer equity reporting presents a means to focus on the long-term value of the company's customers. It avoids the risk that short-term profits are increased at the expense of long-term value creation, and its central metric, customer equity, serves as an early warning indicator for risk management systems that focus on customer loss.
5-lipoxygenase (5-LO) is an enzyme with a substantial role in inflammatory processes. In vitro kinase assays using [32P]-ATP in combination with mutagenesis have revealed that serine residues 271, 523 and 663 can be phosphorylated by the kinases MK2, PKA and ERK2, respectively. The few available reports on the 5-LO protein sequence have covered up to 30% of the sequence by amino acid sequencing, including Ser663. In LC-MS/MS analyses of 5-LO tryptic digests from different cellular sources, various peptides have been detected; however, none of the three phosphorylations has been detected, and only Ser663 was included in the covered sequence.
As there was no comprehensive mass spectrometric analysis of 5-LO, the purpose of this study was to optimize the experimental conditions under which detection of the aforementioned phosphorylation events, as well as other possible post-translational modifications (PTMs), would be feasible. Matrix-Assisted Laser Desorption/Ionization Mass Spectrometry (MALDI-MS) was used for peptide analysis of 5-LO cleaved either by chemical reagents or by proteases. Sequence coverage of 5-LO could be enhanced to be close to completion by combination of results from digestions by trypsin, AspN and chymotrypsin. In-gel trypsin digestion followed by in-solution AspN digestion proved to be a useful sample treatment for reproducible detection of the Ser271-containing peptide.
Nevertheless, in none of the examined cleavage protocols the sequence around Ser523 was detected reproducibly or with acceptable signal intensity for subsequent peptide fragmentation. Propionic anhydride and sulfo-NHS-SS-biotin cross-linker (EZ-linkTM), were used for derivatization of lysine side chains and hindrance of lysine residue recognition by trypsin. Phosphopeptide enrichment became possible after tryptic digestion of these samples, not only due to formation of an individual Ser523-containing peptide, but also because TiO2-mediated enrichment, which is performed in acidic pH, was not impaired by positively charged free lysine side chains. Additionally, biotinylation of lysine residues was exploited for an intermediate enrichment step of the lysine containing peptides, prior to TiO2 phosphopeptide enrichment.
MALDI-MS analysis after in-vitro phosphorylation of 5-LO by the three kinases showed that Ser271 was phosphorylated in the MK2 and PKA kinase assays, while Ser523 was phosphorylated only in the PKA kinase assay. Surprisingly, no phosphopeptides were detected in the in-vitro kinase assays with ERK2, even though the unmodified counterpart of the Ser663-containing peptide was easily detected. The detection limit for each of the three phosphorylation sites was determined by the use of custom-made phosphopeptides, and an amount of 0.06 pmol of phosphopeptide in 1 μg 5-LO (representing a 0.5% phosphorylation rate) was sufficient in all cases for successful enrichment and detection by MS.
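The 0.5% figure can be sanity-checked with a short stoichiometry calculation, assuming a 5-LO molecular weight of roughly 78 kDa (our assumption; the exact value is not stated here):

```python
# Convert 1 ug of 5-LO to pmol and express 0.06 pmol phosphopeptide
# as a fraction of it. Molecular weight of ~78 kDa is an ASSUMPTION.
mw = 78_000.0                         # g/mol
protein_pmol = 1e-6 / mw * 1e12       # 1 ug of 5-LO in pmol, ~12.8
rate = 0.06 / protein_pmol * 100      # phosphorylation rate in percent
```

The result is about 0.47%, consistent with the quoted value of roughly 0.5%.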
In-vitro kinase assays with [32P]-ATP were performed for some kinases that were expected to phosphorylate 5-LO according to in-silico data. Three members of the Src tyrosine kinase family (Fgr, Hck and Yes) and the Ser/Thr specific kinase DNA-PK used 5-LO as their substrate and mainly residues at the N-terminal part of 5-LO were detected phosphorylated by MS (e.g. Y42, Y53). Additional in-vitro assays for recombinant 5-LO modification included incubation with glutathione or compound U73122, previously described as inhibitor of 5-LO.
Since in-vitro assays might have generated artifacts, a method for 5-LO purification from human cells was sought, in order to examine the modification state of the protein in the cellular context. ATP-agarose affinity purification and anti-5-LO immunoprecipitation proved inappropriate for sample purification for MALDI-MS analysis. Consequently, two human cell lines that are able to express 5-LO (Rec-1 B lymphocytes and MM6 monocytes) were transduced with a DNA cassette that contained the recombinant human 5-LO sequence with an attached N-terminal FLAG-tag. Anti-FLAG immunoprecipitation was then performed effectively in cell lysates, and the precipitated FLAG-5-LO was separated by SDS-PAGE before MALDI-MS analysis.
The examined cell stimuli were expected to result in phosphorylation of 5-LO at Ser523 by PKA in Rec-1 cells, and in phosphorylation of Ser271 and/or Ser663 in MM6 cells by activated MK2 and ERK2, respectively. Additionally, under the conditions of MM6 cell stimulation, the Fgr, Hck and Yes kinases, which phosphorylated 5-LO in vitro, were expected to be activated, and the possibility of 5-LO phosphorylation on tyrosine was investigated. Although immunoblotting results indicated that all the aforementioned phosphorylation events existed in the examined samples, MALDI-MS analysis verified only phosphorylation on Ser271 in differentiated MM6 cells, interestingly regardless of cell stimulation.
Finally, the primary amine derivatization procedure with EZ-linkTM was utilized for MS analysis of lysine-rich proteins. In the past, chemical propionylation of histones had been employed prior to trypsin digestion; however, it was easily confused in MS with combinations of other PTMs (e.g. acetylation, methylation). Moreover, propionylation is itself a PTM of histone H3, so this information was lost. Consequently, the EZ-link reagent was more useful for the analysis of histones, as unambiguous assignment of PTMs and detection of native propionylation on bovine H3 became possible.
Background: Malaria is still a priority public health problem of Nepal where about 84% of the population are at risk. The aim of this paper is to highlight the past and present malaria situation in this country and its challenges for long-term malaria elimination strategies.
Methods: Malariometric indicator data of Nepal recorded through routine surveillance of health facilities for the years between 1963 and 2012 were compiled. Trends and differences in malaria indicator data were analysed.
Results: The trend of confirmed malaria cases in Nepal between 1963 and 2012 shows fluctuation, with a peak in 1985 when the number exceeded 42,321, representing the highest malaria case-load ever recorded in Nepal. This was followed by a steep declining trend of malaria with some major outbreaks. Nepal has made significant progress in controlling malaria transmission over the past decade: total confirmed malaria cases declined by 84% (12,750 in 2002 vs 2,092 in 2012), and there was only one reported death in 2012. Based on the evaluation of the National Malaria Control Programme in 2010, Nepal recently adopted a long-term malaria elimination strategy for the years 2011–2026 with the ambitious vision of a malaria-free Nepal by 2026. However, there has been an increasing trend of Plasmodium falciparum and imported malaria proportions in the last decade. Furthermore, the analysis of malariometric indicators of 31 malaria-risk districts between 2004 and 2012 shows a statistically significant reduction in the incidence of confirmed malaria and of Plasmodium vivax, but not in the incidence of P. falciparum and clinically suspected malaria.
Conclusions: Based on the achievements the country has made over the last decade, Nepal is preparing to move towards malaria elimination by 2026. However, considerable challenges lie ahead. These especially include the need to improve access to diagnostic facilities to confirm and treat clinically suspected cases, the development of resistance in parasites and vectors, climate change, and increasing numbers of imported cases from a porous border with India. Therefore, caution is needed before the country embarks on malaria elimination.
The European Central Bank (ECB) has finalized its comprehensive assessment of the solvency of the largest banks in the euro area and on October 26 disclosed the results of this assessment. In the present paper, Acharya and Steffen compare the outcomes of the ECB's assessment to their own benchmark stress tests conducted for 39 publicly listed financial institutions that are also included in the ECB's regulatory review. The authors identify a negative correlation between their benchmark estimates for capital shortfalls and the regulatory capital shortfall, but a positive correlation between their benchmark estimates for losses under stress both in the banking book and in the trading book. They conclude that the regulatory stress test outcomes are potentially heavily affected by the discretion of national regulators in measuring what counts as capital, and especially by the use of risk-weighted assets in calculating the prudential capital requirement.
Europeana provides a common access point to digital cultural heritage objects across different cultural domains among which the libraries. The recent development of the Europeana Data Model (EDM) provide new ways for libraries to experiment with Linked Data. Indeed the model is designed as a framework reusing various wellknown standards developed in the Semantic Web Community, such as the Resource Description Framework (RDF), the OAI Object Reuse and Exchange (ORE), and Dublin Core namespaces. It provides new opportunities for libraries to provide rich and interlinked metadata to the Europeana aggregation.
However, to provide data to Europeana, libraries need to create mappings from their library standard to EDM. This step involves decisions based on domain-specific requirements and on the possibilities offered by EDM. Because the cross-domain nature of EDM limits in some cases the completeness of the mappings, extensions of the model have been proposed to accommodate library needs.
The "Digitised Manuscripts to Europeana" project (DM2E) has created an extension of EDM to optimise the mapping of library data for manuscripts. This extension takes the form of subclasses and subproperties that further specialise EDM concepts and properties. It includes spatial creation and publishing information, specific contributor and publication-type properties, and more.
Furthermore, the granularity of the mapping has been extended to allow references and annotations at the page level, as required for scholarly work. As part of this project, the metadata of the Hebrew Manuscripts as well as of the Medieval Manuscripts presented in the Digital Collections of the Frankfurt University Library has been mapped to this extension. This includes links to the Integrated Authority File (GND) of the German National Library, with further links to the Virtual International Authority File (VIAF).
Based on this development, a new comprehensive mapping from the digitisation metadata format METS/MODS to EDM has been established for all materials of the Frankfurt Judaica in "Judaica Europeana". It demonstrates what is possible today in creating Linked Data structures in Europeana from library catalogue data and structural data from the digitisation process.
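As a rough illustration of what such a mapping produces, the sketch below builds a minimal EDM description with Python's standard library. The record content and URI are invented, and a real METS/MODS-to-EDM mapping covers far more elements (aggregations, web resources, rights statements, GND/VIAF links); only the namespaces and the edm:ProvidedCHO class are taken from the EDM specification.

```python
import xml.etree.ElementTree as ET

# EDM/DC namespaces as used by Europeana; the record built below is an
# invented minimal example, not actual DM2E mapping output.
NS = {
    "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#",
    "edm": "http://www.europeana.eu/schemas/edm/",
    "dc": "http://purl.org/dc/elements/1.1/",
}
for prefix, uri in NS.items():
    ET.register_namespace(prefix, uri)

def mods_to_edm(title, creator, cho_uri):
    """Build a minimal edm:ProvidedCHO description from a (title, creator) record."""
    rdf = ET.Element("{%s}RDF" % NS["rdf"])
    cho = ET.SubElement(rdf, "{%s}ProvidedCHO" % NS["edm"],
                        {"{%s}about" % NS["rdf"]: cho_uri})
    ET.SubElement(cho, "{%s}title" % NS["dc"]).text = title
    ET.SubElement(cho, "{%s}creator" % NS["dc"]).text = creator
    return ET.tostring(rdf, encoding="unicode")
```

In a full mapping, the dc:creator value would typically be replaced by a reference to an edm:Agent carrying the GND identifier, which is what enables the onward links to VIAF.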
Cryptochrome 1a, located in the UV/violet-sensitive cones in the avian retina, is discussed as the receptor molecule for the magnetic compass of birds. Our previous immunohistochemical studies of chicken retinae with an antiserum that labelled only activated cryptochrome 1a had shown activation of cryptochrome 1a under 373 nm UV, 424 nm blue, 502 nm turquoise and 565 nm green light. Green light, however, does not allow the first step of photoreduction of oxidized cryptochromes to the semiquinone. As the chickens had been kept under ‘white’ light before, we suggested that a supply of the semiquinone was present at the beginning of the exposure to green light, which could be further reduced and then re-oxidized. To test this hypothesis, we exposed chickens to various wavelengths (1) for 30 min after being kept in daylight, (2) for 30 min after a 30 min pre-exposure to total darkness, and (3) for 1 h after being kept in daylight. In the first case, we found activated cryptochrome 1a under UV, blue, turquoise and green light; in the latter two cases we found activated cryptochrome 1a only under UV to turquoise light, where the complete redox cycle of cryptochrome can run, but not under green light. This observation is in agreement with the hypothesis that activated cryptochrome 1a is found as long as there is some of the semiquinone left, but not when the supply is depleted. It supports the idea that the crucial radical pair for magnetoreception is generated during re-oxidation.
Cryo-electron tomography provides a snapshot of the cellular proteome. With template matching, the spatial positions of various macromolecular complexes within their native cellular context can be detected. However, the growing awareness of the reference bias introduced by cross-correlation based approaches, and more importantly the lack of a reliable confidence measurement in the selection of these macromolecular complexes, has restricted the use of these applications. Here we propose a heuristic in which the reference bias is measured in real space, in a way analogous to the R-free value in X-ray crystallography. We measure the reference bias within the mask used to outline the area of the template, and do not modify the template itself. The heuristic works by splitting the mask into a working and a testing area at a volume ratio of 9:1. While only the working area is used during the calculation of the cross-correlation function, the information from both areas is used to calculate the M-free score. We show, using artificial data, that the M-free score gives a reliable measure of the reference bias. The heuristic can be applied in template matching and in sub-tomogram averaging. We further test the applicability of the heuristic in tomograms of purified macromolecules, and tomograms of whole Mycoplasma cells.
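The split-mask idea can be sketched in a few lines of Python. This is a toy illustration under assumptions of my own: flat value lists stand in for 3D volumes, the 9:1 split is a random voxel partition, and the bias measure is a simple "working-area minus test-area correlation"; the actual M-free definition in the study may differ.

```python
import random

def ncc(a, b):
    """Normalized cross-correlation between two equal-length value lists."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db) if da and db else 0.0

def split_mask(mask, test_fraction=0.1, seed=0):
    """Randomly split mask voxel indices 9:1 into a working and a testing area."""
    idx = list(range(len(mask)))
    random.Random(seed).shuffle(idx)
    cut = int(len(idx) * test_fraction)
    return idx[cut:], idx[:cut]  # working area, testing area

def m_free(template, subtomogram, mask):
    """Toy bias score; template/subtomogram are voxel values at the mask positions."""
    work, test = split_mask(mask)
    cc_work = ncc([template[i] for i in work], [subtomogram[i] for i in work])
    cc_test = ncc([template[i] for i in test], [subtomogram[i] for i in test])
    # A match driven only by the template (reference bias) scores well in the
    # working area used for the matching but not in the withheld testing area.
    return cc_work - cc_test
```

The design point is the same as for R-free: the withheld 10% of voxels never influences the fit, so agreement there cannot be manufactured by the template.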
Lysimachia mauritiana Lam. (family Primulaceae), a small short-lived herb native to India, Indian and Pacific Ocean islands, and coastal east Asia, is described as a new naturalised record from the eastern suburbs of Sydney, New South Wales, Australia. It was first recorded in 1981 near Coogee, and grows in exposed rock crevices and seepages on the seacoast, very similar to its natural habitat overseas. Lysimachia mauritiana is known to have been cultivated in the area in 1961 in a home garden, which is the likely source of this introduction; it appears to be spreading locally as a weed.
Bacteria communicate via small diffusible molecules to mediate group-coordinated behavior, a process designated as quorum sensing. The basic molecular quorum sensing system of Gram-negative bacteria consists of a LuxI-type autoinducer synthase producing acyl-homoserine lactones (AHLs) as signaling molecules, and a LuxR-type receptor detecting the AHLs to control expression of specific genes. However, many proteobacteria possess one or more unpaired LuxR-type receptors that lack a cognate LuxI-like synthase, referred to as LuxR solos. The enteric and insect pathogenic bacteria of the genus Photorhabdus harbor an extraordinarily high number of LuxR solos, more than any other known bacteria, and all lack a LuxI-like synthase. Here, we focus on the presence and the different types of LuxR solos in the three known Photorhabdus species using bioinformatics analyses. Generally, the N-terminal signal-binding domain (SBD) of LuxR-type receptors sensing AHLs has a motif of six conserved amino acids that is important for binding and specificity of the signaling molecule. However, this motif is altered in the majority of the Photorhabdus-specific LuxR solos, suggesting the use of signaling molecules other than AHLs. Furthermore, all Photorhabdus species contain at least one LuxR solo with an intact AHL-binding motif, which might allow them to sense AHLs of other bacteria. Moreover, all three species have high AHL-degrading activity caused by the presence of different AHL-lactonases and AHL-acylases, revealing a high quorum quenching activity against other bacteria. However, the majority of the other LuxR solos in Photorhabdus have an N-terminal so-called PAS4 domain instead of an AHL-binding domain, containing amino acid motifs different from those of the AHL sensors, which potentially allows the recognition of a highly variable range of signaling molecules beyond AHLs.
These PAS4-LuxR solos are proposed to be involved in host sensing, and therefore in inter-kingdom signaling. Overall, Photorhabdus species are perfect model organisms to study bacterial communication via LuxR solos and their role in a symbiotic and pathogenic lifestyle.
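The kind of motif check underlying such an analysis can be sketched as follows. The positions and residues below are invented placeholders, not the actual six-residue AHL-binding motif from the study, and a real analysis would work on aligned sequences rather than raw positions.

```python
# Hypothetical positions/residues standing in for the six conserved
# AHL-contact residues of a LuxR-type signal-binding domain (SBD).
REFERENCE_MOTIF = {10: "W", 14: "Y", 23: "D", 24: "P", 38: "W", 66: "G"}

def motif_conservation(seq, motif=REFERENCE_MOTIF):
    """Fraction of reference motif positions conserved in an aligned SBD sequence."""
    hits = sum(1 for pos, aa in motif.items()
               if pos < len(seq) and seq[pos] == aa)
    return hits / len(motif)
```

A LuxR solo scoring 6/6 would be flagged as a candidate AHL sensor, while low scores would suggest an altered binding pocket tuned to other signals.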
Noise-induced hearing loss is one of the most common auditory pathologies, resulting from overstimulation of the human cochlea, an exquisitely sensitive micromechanical device. At very low frequencies (less than 250 Hz), however, the sensitivity of human hearing, and therefore the perceived loudness, is poor. Perceived loudness is mediated by the inner hair cells of the cochlea, which are driven very inadequately at low frequencies. To assess the impact of low-frequency (LF) sound, we exploited a by-product of the active sound amplification performed by outer hair cells (OHCs): so-called spontaneous otoacoustic emissions. These are faint sounds produced by the inner ear that can be used to detect changes in cochlear physiology. We show that a short exposure to perceptually unobtrusive LF sounds significantly affects OHCs: a 90 s, 80 dB(A) LF sound induced slow, concordant and positively correlated frequency and level oscillations of spontaneous otoacoustic emissions that lasted for about 2 min after LF sound offset. LF sounds, contrary to their unobtrusive perception, thus strongly stimulate the human cochlea and affect amplification processes in the most sensitive and important frequency range of human hearing.
Low-energy effective models for two-flavor quantum chromodynamics and the universality hypothesis
(2014)
The investigation of nature on extreme length scales has always led to groundbreaking insights and innovations, in particular to our present understanding that nucleons (protons and neutrons) are composed of quarks, bound as a consequence of the strong interaction, which is mediated by gluon exchange. With the advent of the quark model, quantum chromodynamics (QCD) soon became successful in describing many measurable properties of the strong interaction. To put it in Goethe's words: modern high-energy accelerator experiments attempt to improve our understanding of what holds the world together at its core. At the Large Hadron Collider (LHC), for example, protons are accelerated and brought to collision in such a way that hitherto unreached energy densities occur, under which temperature and baryochemical potential take values comparable to those of the early universe. There are both theoretical and experimental indications that hadronic matter undergoes a phase transition with increasing temperature and/or baryochemical potential, towards an exotic state known as the quark-gluon plasma. This transition is accompanied by a so-called chiral transition. An important question is whether this chiral transition is a genuine phase transition (of first or second order) or a so-called crossover. Some results point to a crossover at vanishing baryochemical potential and a first-order phase transition at vanishing temperature, but do not yet permit a definite conclusion as to whether this actually corresponds to reality. If it does, it is natural to assume that a critical endpoint exists at which the chiral transition is of second order.
Indeed, a critical endpoint exists in several theoretical approaches to the description of the chiral phase transition, whose validity has long been the subject of lively debate. A central goal of the future CBM experiment at GSI in Darmstadt is to test its existence experimentally.
Near the QCD (phase) transition, it is the absence of any perturbative expansion parameter that forbids exact analytical calculations. The same holds for realistic effective models of QCD. Non-perturbative methods are therefore indispensable for the investigation of the QCD phase diagram. Among the most popular of these approaches are lattice QCD, resummation schemes, the Dyson-Schwinger formalism, and the functional renormalization group (FRG). All these methods complement each other and are in part also combined with one another. One of the strengths of the FRG method is that it can be successfully applied not only to effective models but also to QCD itself. For the latter ab-initio calculations, the results obtained from effective models of QCD are of great value.
The focus of the present work is the question of the order of the chiral phase transition in the case of exactly two light quark flavors. Problems such as the search for the conditions under which a second-order phase transition exists, or the determination of the universality class in that case, require knowledge from several fields.
Chapter 1 consists of a general introduction.
In Chapter 2 we first present some general aspects of phase transitions that are of particular relevance for understanding the renormalization-group approach to them. Our focus is a critical examination of the universality hypothesis. In particular, the justification of the linear sigma model as an effective theory for the chiral order parameter rests on its validity.
Chapter 3 deals with the chiral phase transition from a general point of view. We supplement well-known facts with a detailed discussion of the so-called O(4) hypothesis. Testing its validity is finally tackled in Chapters 6 and 7.
In Chapter 4 we introduce the FRG method used in this work. We also discuss the relationship between effective theories for QCD and QCD itself.
Chapter 5 treats a mathematical topic that is indispensable for all of our investigations, namely the systematic construction of polynomial invariants of a given symmetry. We present a simple yet novel algorithm for the practical construction of invariants of a given polynomial order.
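The basic idea of constructing invariants can be illustrated with the textbook Reynolds-operator (group-averaging) approach. This is the standard technique, not the novel algorithm of Chapter 5, and the dihedral symmetry of the square acting on two variables is a toy stand-in for the chiral symmetry groups considered in the thesis.

```python
from fractions import Fraction

# The symmetry group of the square as signed permutations (perm, signs):
# each element maps x_i -> signs[i] * x_{perm[i]}.
GROUP = [
    ((0, 1), (1, 1)), ((0, 1), (-1, 1)), ((0, 1), (1, -1)), ((0, 1), (-1, -1)),
    ((1, 0), (1, 1)), ((1, 0), (-1, 1)), ((1, 0), (1, -1)), ((1, 0), (-1, -1)),
]

def reynolds(monomial, group=GROUP):
    """Average the monomial x^a * y^b over the group (the Reynolds operator).

    Returns the resulting polynomial as {(exp_x, exp_y): coefficient};
    an empty dict means the monomial averages to zero (no invariant).
    """
    a, b = monomial
    out = {}
    for perm, signs in group:
        # x^a y^b -> signs[0]^a * signs[1]^b * x_{perm[0]}^a * x_{perm[1]}^b
        exps = [0, 0]
        exps[perm[0]] += a
        exps[perm[1]] += b
        coeff = Fraction(signs[0] ** a * signs[1] ** b, len(group))
        key = tuple(exps)
        out[key] = out.get(key, 0) + coeff
    return {k: v for k, v in out.items() if v != 0}
```

For example, averaging x^2 yields (x^2 + y^2)/2, the familiar quadratic invariant, while odd monomials average to zero. Running this over all monomials of a given degree and collecting the linearly independent results gives the invariants of that order.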
Chapter 6 is devoted to renormalization-group studies of a number of dimensionally reduced theories. Of central interest is the linear sigma model, in particular in the presence of the axial anomaly. It turns out that the fixed-point structure of the latter is comparatively complicated and requires a deeper understanding of the underlying method and its assumptions. This leads us to a careful analysis of the fixed-point structure of models with a wide variety of symmetries. In connection with the investigation of the influence of vector and axial-vector mesons, we encounter a new universality class.
While there is little freedom in the choice of the symmetry group of the effective theory for the chiral order parameter, the identification of the order-parameter components with the relevant mesonic degrees of freedom is highly non-trivial. This choice corresponds to the choice of a representation of the group and cannot at present be derived unambiguously from QCD. It is therefore essential to test different possibilities. A well-known choice is to assign the pion and its chiral partner, the sigma meson, to the O(4) representation of SU(2)_A x SU(2)_V, which permits a second-order phase transition. This scenario, however, is only sensible if all other mesons are correspondingly heavy near the critical temperature. In the case of exactly two light quark masses, this requires a sufficiently large anomaly strength. If, in addition to the pion and sigma meson, the eta meson and the a_0 meson are taken into account, our current explicit calculations provide no evidence for the existence of a second-order phase transition. Instead, the absence of a physical (with respect to the masses) infrared-stable fixed point argues for a fluctuation-induced first-order phase transition. This result is also to be expected (though not implied) from the mere existence of two quadratic invariants. There is, however, still a hypothetical chance of a second-order phase transition in the SU(2)_A x U(2)_V universality class. This would be the case if the corresponding unphysical infrared-stable fixed point we found were to become physical at higher truncation order. Interestingly, at finite temperature we find a second-order phase transition for certain parameters. It is unclear whether this choice of parameters falls within the range of validity of the dimensionally reduced theory.
Only recently (at the end of September 2013) was the existence of an infrared-stable U(2)_A x U(2)_V-symmetric fixed point verified by Pelissetto and Vicari (the associated anomalous dimension is given as 0.12). This result was very surprising, since for two light quark flavors and vanishing anomaly a first-order phase transition had seemed fairly certain, in particular from the epsilon expansion. Evidently, however, the latter fails in the limit D=3, i.e. for three spatial dimensions, since only fixed points that also exist near D=4 can be found. Inspired by this important finding, we carry out an FRG fixed-point study in the local potential approximation at high truncation order (up to tenth order in the fields). Unfortunately, the stability analysis is inconclusive, since the stability matrix for the Gaussian fixed point has marginal eigenvalues. We are convinced that this is no longer the case once one goes beyond the local potential approximation and allows a non-vanishing anomalous dimension. The results so far highlight the limitations of the local potential approximation and the epsilon expansion, on which our investigations of the universality hypothesis largely rest. Systematic investigations of the fixed-point structure of models with eight order-parameter components have been carried out in the literature within the epsilon expansion, and within this dissertation in the local potential approximation. Most of the predictions of the epsilon expansion could be confirmed, while some are called into question by the appearance of marginal stability-matrix eigenvalues.
Some important questions cannot be addressed within a dimensionally reduced theory, since the explicit temperature dependence has been eliminated in that case.
In particular, it is then not possible to predict the strength of a first-order phase transition, since this depends on observables (meson masses and the pion decay constant in vacuum) to which one has to fit at vanishing temperature. This circumstance leads us to FRG studies in which the temperature remains an explicit parameter.
A considerable part of the working time available for the present dissertation was spent on developing our own implementations of suitable algorithms for the numerical solution of the partial differential equations that arise. Exemplary routines (using exclusively well-known methods) are provided in an appendix. The main goal of the present work, the application to effective models of QCD, is presented in Chapter 7. Our (preliminary) FRG studies of the linear sigma model with axial anomaly at non-vanishing temperature permit several scenarios: both an extremely weak and a very pronounced first-order phase transition, depending entirely on the choice of the ultraviolet cutoff scale and the parameters mentioned above. Even a second-order phase transition seems possible for certain parameter values. To draw reliable conclusions, further investigations are necessary and already under way. In Chapter 7 we also verify previously known numerical results for the quark-meson model.
The High Acceptance DiElectron Spectrometer HADES [1] is installed at the Helmholtzzentrum für Schwerionenforschung (GSI) accelerator facility in Darmstadt. It investigates dielectron emission and strangeness production in the 1-3 AGeV regime. A recent experiment series focuses on medium modifications of light vector mesons in cold nuclear matter. In two runs, p+p and p+Nb reactions were investigated at 3.5 GeV beam energy; about 9·10^9 events have been registered. In contrast to other experiments, the high acceptance of HADES allows for a detailed analysis of electron pairs with low momenta relative to nuclear matter, where modifications of the spectral functions of vector mesons are predicted to be most prominent. Comparing these low-momentum electron pairs to the reference measurement in the elementary p+p reaction, we indeed find a strong modification of the spectral distribution in the whole vector meson region.
Background: Risk stratification, detection of minimal residual disease (MRD), and implementation of novel therapeutic agents have improved outcome in acute lymphoblastic leukemia (ALL), but survival of adult patients with T-cell acute lymphoblastic leukemia (T-ALL) remains unsatisfactory. Thus, novel molecular insights and therapeutic approaches are urgently needed.
Methods: We studied the impact of B-cell CLL/lymphoma 11b (BCL11b), a key regulator in normal T-cell development, in T-ALL patients enrolled into the German Multicenter Acute Lymphoblastic Leukemia Study Group trials (GMALL; n = 169). The mutational status (exon 4) of BCL11b was analyzed by Sanger sequencing, and mRNA expression levels were determined by quantitative real-time PCR. In addition, gene expression profiles generated on the Human Genome U133 Plus 2.0 Array (Affymetrix) were used to investigate BCL11b low- and high-expressing T-ALL patients.
Results: We demonstrate that BCL11b is aberrantly expressed in T-ALL and gene expression profiles reveal an association of low BCL11b expression with up-regulation of immature markers. T-ALL patients characterized by low BCL11b expression exhibit an adverse prognosis [5-year overall survival (OS): low 35% (n = 40) vs. high 53% (n = 129), P = 0.02]. Within the standard risk group of thymic T-ALL (n = 102), low BCL11b expression identified patients with an unexpected poor outcome compared to those with high expression (5-year OS: 20%, n = 18 versus 62%, n = 84, P < 0.01). In addition, sequencing of exon 4 revealed a high mutation rate (14%) of BCL11b.
Conclusions: In summary, our data from a large adult T-ALL patient cohort show that low BCL11b expression is associated with poor prognosis, particularly in the standard-risk group of thymic T-ALL. These findings can be utilized for improved risk prediction in a significant proportion of adult T-ALL patients who carry a high risk of standard-therapy failure despite a favorable immunophenotype.
Objectives: Low-energy shock waves have been shown to induce angiogenesis, improve left ventricular ejection fraction and decrease angina symptoms in patients suffering from chronic ischemic heart disease. Whether there is an effect in acute ischemia as well has not yet been investigated.
Methods: Hind-limb ischemia was induced in 10–12 weeks old male C57/Bl6 wild-type mice by excision of the left femoral artery. Animals were randomly divided into a treatment group (SWT, 300 shock waves at 0.1 mJ/mm2, 5 Hz) and untreated controls (CTR), n = 10 per group. The treatment group received shock wave therapy immediately after surgery.
Results: Gene expression and protein levels of the angiogenic factors VEGF-A and PlGF, as well as of their receptors Flt-1 and KDR, were elevated in the treatment group. This resulted in significantly more vessels per high-power field in SWT compared to controls. Improved blood perfusion in treated animals was confirmed by laser Doppler perfusion imaging. A receptor tyrosine kinase profiler revealed significant phosphorylation of VEGFR2 as an underlying mechanism of action. The effect of VEGF signaling was abolished upon incubation with a VEGFR2 inhibitor, indicating that the effect is indeed VEGFR2 dependent.
Conclusions: Low-energy shock wave treatment induces angiogenesis in acute ischemia via VEGFR2 stimulation and shows the same promising effects as known from chronic myocardial ischemia. It may therefore develop into an adjunct to the treatment armamentarium for acute muscle ischemia in limbs and myocardium.
Loudness in the novel
(2014)
The novel is composed entirely of voices: the most prominent among them is typically that of the narrator, which is regularly intermixed with those of the various characters. In reading through a novel, the reader "hears" these heterogeneous voices as they occur in the text. When the novel is read out loud, the voices are audibly heard. They are also heard, however, when the novel is read silently: in this latter case, the voices are not verbalized for others to hear, but acoustically created and perceived in the mind of the reader. Simply put: sound, in the context of the novel, is fundamentally a product of the novel’s voices. This conception of sound mechanics may at first seem unintuitive, since sound seems to be the product of oral reading, but it is only by starting with the voice that one can fully appreciate sound’s function in the novel. Moreover, such a conception of sound mechanics finds affirmation in the works of both Mikhail Bakhtin and Elaine Scarry: "In the novel," writes Bakhtin, "we can always hear voices (even while reading silently to ourselves)."
The mitochondrial kinase PINK1 and the ubiquitin ligase Parkin participate in quality control after CCCP- or ROS-induced mitochondrial damage, and their dysfunction is associated with the development and progression of Parkinson’s disease. Furthermore, PINK1 expression is also induced by starvation, indicating an additional role for PINK1 in the stress response. Therefore, the effects of PINK1 deficiency on the autophago-lysosomal pathway during stress were investigated. Under trophic deprivation, SH-SY5Y cells with stable PINK1 knockdown showed downregulation of key autophagic genes, including Beclin, LC3 and LAMP-2. In good agreement, protein levels of LC3-II and LAMP-2, but not of LAMP-1, were reduced in different cell model systems with PINK1 knockdown or knockout after addition of different stressors. This downregulation of autophagic factors caused increased apoptosis, which could be rescued by overexpression of LC3 or PINK1. Taken together, the PINK1-mediated reduction of key autophagic factors during stress resulted in increased cell death, thus defining an additional pathway that could contribute to the progression of Parkinson’s disease in patients with PINK1 mutations.
The deregulation of Polo-like kinase 1 is inversely linked to the prognosis of patients with diverse human tumors. Targeting Polo-like kinase 1 has been widely considered one of the most promising strategies for molecular anticancer therapy. While the preclinical results are encouraging, the clinical outcomes are rather less inspiring, showing limited anticancer activity. It is thus important to identify molecules and mechanisms responsible for the sensitivity to Polo-like kinase 1 inhibition. We have recently shown that p21Cip1/CDKN1A is involved in the regulation of mitosis and that its loss prolongs the mitotic duration, accompanied by defects in chromosome segregation and cytokinesis in various tumor cells. In the present study, we demonstrate that p21 affects the efficacy of Polo-like kinase 1 inhibitors, especially Poloxin, a specific inhibitor of the unique Polo-box domain. Intriguingly, upon treatment with Polo-like kinase 1 inhibitors, p21 is increased in the cytoplasm, associated with anti-apoptosis, DNA repair and cell survival. By contrast, deficiency of p21 renders tumor cells more susceptible to Polo-like kinase 1 inhibition, as shown by a pronounced mitotic arrest, DNA damage and apoptosis. Furthermore, long-term treatment with Plk1 inhibitors strongly induced a senescent state in tumor cells with functional p21. We suggest that the p21 status may be a useful biomarker for predicting the efficacy of Plk1 inhibition.
Background: In this study, we examined patients who had non-progressive disease for at least 2 years after diagnosis of inoperable locoregional recurrent or metastatic breast cancer under continuous trastuzumab treatment. Our primary goal was to assess the long-term outcome of patients with durable response to trastuzumab.
Methods: 268 patients with HER2-positive inoperable locally recurrent or metastatic breast cancer and non-progressive disease for at least 2 years under trastuzumab treatment were documented retrospectively or prospectively in the HER-OS registry, an online documentation tool, between December 2006 and September 2010 by 71 German oncology centers. The study end point was time to tumor progression.
Results: Overall, 47.1% of patients (95% confidence interval (CI): 39.9–54.1%) remained in remission for more than 5 years, while the median time to progression was 4.5 years (95% CI: 4.0–6.6 years). Lower age (<50 years) and good performance status (ECOG 0) at time of trastuzumab treatment initiation as well as complete remission after initial trastuzumab treatment were associated with longer time to progression. Interruption of trastuzumab therapy correlated with shorter time to progression.
Conclusions: HER2-positive patients, who initially respond to palliative treatment with trastuzumab, can achieve a long-term tumor remission of several years.
In the interest of understanding the development of a multicellular organism, subcellular events must be seen in the context of the entire three-dimensional tissue. In addition, events that occur within a short period of time can be of great importance for the relatively long developmental process of the organ. It is therefore necessary to capture subcellular events in a larger spatio-temporal context, which has until now been a technical challenge. In developmental biology, light microscopy has always been an important tool. The dilemma of light microscopy, in particular fluorescence microscopy, is that molecules receive high light intensities that might change their conformation, which can have signaling or toxic effects. In Light Sheet-based Fluorescence Microscopy (LSFM), the energy required for a single recording is reduced by several orders of magnitude compared to other fluorescence microscopy techniques. During the last ten years, LSFM has emerged as a preferred tool to capture all cells during embryogenesis of the zebrafish Danio rerio, the fruit fly Drosophila melanogaster or, more recently, the red flour beetle Tribolium castaneum over a period of several days. The motivation of this work was to gain new insights into developmental processes of plant organs. The aim was to establish a protocol for imaging plant growth over a long period of time using LSFM and to perform comprehensive analyses at the cellular level. Plants have to cope with a variety of environmental conditions; therefore, the conditions inside the microscope chamber had to be brought under control. The sample preparation methods and the standardized conditions at a physiological level allowed the study of gravity response, day-night rhythms, organ shape development as well as the intracellular dynamics of the cytoskeleton and endosomal compartments in an unprecedented manner. Several of these projects were successfully published in collaborations with Prof.
Jozef Šamaj (Palacký University Olomouc, Czech Republic), Prof. Niko Geldner (University of Lausanne, Switzerland), Prof. Malcom Bennett (University of Nottingham, UK) and Dr. Jürgen Kleine-Vehn (University of Natural Resources and Life Sciences, Austria). The main part of my work focused on the formation of lateral roots in Arabidopsis thaliana and was conducted in close collaboration with Dr. Alexis Maizel (University of Heidelberg, Germany). Previously, most experiments describing lateral root formation had been performed on a small number of cells and for short periods of time. Capturing the complete process of lateral root formation is an ambitious goal: first, the primordium of a lateral root is located deep inside the primary root, and imaging quality is impaired by scattering in the overlying tissue. Second, the process takes about 48 h, i.e. the plant has to be kept healthy for the whole period. Third, the amount of excitation light required for this spatio-temporal coverage might have phototoxic effects that lead to growth arrest, at least with conventional microscopy techniques. In Arabidopsis embryogenesis, the sequence of cell divisions is relatively invariant. However, whether lateral root organogenesis follows particular cell division patterns has been unknown. The complete process of lateral root formation was captured from the first cell division until after emergence from the main root. Images of a nuclei marker and a plasma membrane marker were recorded every 5 min for a time period of up to 64 h. The positions and cell divisions of all cells were tracked manually. In collaboration with Alexander Schmitz (Goethe University Frankfurt am Main, Germany) and Dr. Jens Fangerau (University of Heidelberg, Germany), comprehensive analyses of the data were performed. A lateral root forms from initially 8-15 founder cells, arranged in a patch of 5-8 parallel files.
The emergence of new cell layers through periclinal divisions, as well as the sequence of layer generation, was conserved and resembles the sequence suggested by Malamy and Benfey in 1997. Besides this stereotyped occurrence of periclinal divisions, radial divisions were found to appear stochastically, following no particular pattern. A large variability was also found in the contribution of founder cells and cell files to the final lateral root. In summary, the results suggest that a stereotyped pattern of cell divisions at particular developmental stages and a dynamically adapted control of cell divisions exist in parallel. Both properties allow a controlled but flexible development of the organ according to variations in cell topology and the mechanical properties of the surrounding tissue. This work shows that LSFM, together with the sample preparation methods and controlled environmental conditions, makes it possible to capture and analyse the development of plants over several days at high resolution in an unprecedented manner.
Locative inversion in Cuwabo
(2014)
This paper proposes a detailed description of locative inversion (LI) constructions in Cuwabo, in terms of morphosyntactic properties and thematic restrictions. Of particular interest are the use of disjoint verb forms in LI, and the co-existence of formal and semantic LI, which challenges the widespread belief that the two constructions cannot be found in the same language.
Late stage cancer is often associated with reduced immune recognition and a highly immunosuppressive tumor microenvironment. The presence of tumor infiltrating lymphocytes (TILs) and specific gene signatures prior to treatment are linked to good prognosis, while the opposite is true for extensive immunosuppression. The use of adenoviruses as cancer vaccines is a form of active immunotherapy to initiate a tumor-specific immune response that targets the patient's unique tumor antigen repertoire. We report a case of a 68-year-old male with asbestos-related malignant pleural mesothelioma who was treated in a Phase I study with a granulocyte-macrophage colony-stimulating factor (GM-CSF)-expressing oncolytic adenovirus, Ad5/3-D24-GMCSF (ONCOS-102). The treatment resulted in prominent infiltration of CD8+ lymphocytes into the tumor, marked induction of systemic antitumor CD8+ T-cells and induction of Th1-type polarization in the tumor. These results indicate that ONCOS-102 treatment sensitizes tumors to other immunotherapies by inducing a T-cell positive phenotype in an initially T-cell negative tumor.
Local active information storage as a tool to understand distributed neural information processing
(2014)
Every act of information processing can in principle be decomposed into the component operations of information storage, transfer, and modification. Yet, while this is easily done for today's digital computers, the application of these concepts to neural information processing has been hampered by the lack of proper mathematical definitions of these operations on information. Recently, definitions were given for the dynamics of these information processing operations on a local scale in space and time in a distributed system, and the specific concept of local active information storage (LAIS) was successfully applied to the analysis and optimization of artificial neural systems. However, no attempt to measure the space-time dynamics of LAIS in neural data has been made to date. Here we measure LAIS on a local scale in time and space in voltage-sensitive dye imaging data from area 18 of the cat. We show that storage reflects neural properties such as stimulus preferences and surprise upon unexpected stimulus change, and in area 18 reflects the abstract concept of an ongoing stimulus despite the locally random nature of this stimulus. We suggest that LAIS will be a useful quantity to test theories of cortical function, such as predictive coding.
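To make the measured quantity concrete: local active information storage is the local (pointwise) mutual information between a process's length-k past state and its next value. The sketch below is a minimal plug-in (counting) estimator for a discrete time series, intended purely as an illustration; the study itself works on continuous voltage-sensitive dye signals and uses more sophisticated estimation.

```python
from collections import Counter
from math import log2

def local_active_information_storage(series, k=2):
    """Plug-in LAIS estimate for a discrete time series.

    For each time t >= k, returns the local mutual information (in bits)
    between the length-k past state and the current symbol:
        a(t) = log2( p(past, x_t) / (p(past) * p(x_t)) )
    Probabilities are estimated by simple counting over the whole series.
    """
    pasts = [tuple(series[t - k:t]) for t in range(k, len(series))]
    nexts = [series[t] for t in range(k, len(series))]
    n = len(pasts)
    p_past, p_next = Counter(pasts), Counter(nexts)
    p_joint = Counter(zip(pasts, nexts))
    return [log2(p_joint[(p, x)] * n / (p_past[p] * p_next[x]))
            for p, x in zip(pasts, nexts)]
```

A perfectly alternating series stores roughly one bit per step (the past fully predicts the next value), while a constant series yields zero local storage under this estimator, since there is nothing left to predict.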
Banks can deal with their liquidity risk by holding liquid assets (self-insurance), by participating in interbank markets (coinsurance), or by using flexible financing instruments, such as bank capital (risk-sharing). We use a simple model to show that undiversifiable liquidity risk, i.e. the liquidity risk that banks are unable to coinsure on interbank markets, represents an important risk factor affecting their capital structures. Banks facing higher undiversifiable liquidity risk hold more capital. We posit that empirically banks that are more exposed to undiversifiable liquidity risk are less active on interbank markets. Therefore, we test for the existence of a negative relationship between bank capital and interbank market activity and find support in a large sample of U.S. commercial banks.
Dendritic morphology has been shown to have a dramatic impact on neuronal function. However, population features such as the inherent variability in dendritic morphology between cells belonging to the same neuronal type are often overlooked when studying computation in neural networks. While detailed models for morphology and electrophysiology exist for many types of single neurons, the role of detailed single cell morphology in the population has not been studied quantitatively or computationally. Here we use the structural context of the neural tissue in which dendritic trees exist to drive their generation in silico. We synthesize the entire population of dentate gyrus granule cells, the most numerous cell type in the hippocampus, by growing their dendritic trees within their characteristic dendritic fields bounded by the realistic structural context of (1) the granule cell layer that contains all somata and (2) the molecular layer that contains the dendritic forest. This process enables branching statistics to be linked to larger scale neuroanatomical features. We find large differences in dendritic total length and individual path length measures as a function of location in the dentate gyrus and of somatic depth in the granule cell layer. We also predict the number of unique granule cell dendrites invading a given volume in the molecular layer. This work enables the complete population-level study of morphological properties and provides a framework to develop complex and realistic neural network models.
Channelrhodopsin-2 (ChR2) is a cation-selective light-gated channel from Chlamydomonas reinhardtii (Nagel G, Szellas T, Huhn W, Kateriya S, Adeishvili N, Berthold P, et al. Channelrhodopsin-2, a directly light-gated cation-selective membrane channel. Proc Natl Acad Sci USA 2003;100:13940-5), which has become a powerful tool in optogenetics. Two-dimensional crystals of the slow photocycling C128T ChR2 mutant were exposed to 473 nm light and rapidly frozen to trap the open state. Projection difference maps at 6Å resolution show the location, extent and direction of light-induced conformational changes in ChR2 during the transition from the closed state to the ion-conducting open state. Difference peaks indicate that transmembrane helices (TMHs) TMH2, TMH6 and TMH7 reorient or rearrange during the photocycle. No major differences were found near TMH3 and TMH4 at the dimer interface. While conformational changes in TMH6 and TMH7 are known from other microbial-type rhodopsins, our results indicate that TMH2 has a key role in light-induced channel opening and closing in ChR2.
This paper studies the life cycle consumption-investment-insurance problem of a family. The wage earner faces the risk of a health shock that significantly increases his probability of dying. The family can buy term life insurance with realistic features. In particular, the available contracts are long term so that decisions are sticky and can only be revised at significant costs. Furthermore, a revision is only possible as long as the insured person is healthy. A second important and realistic feature of our model is that the labor income of
the wage earner is unspanned. We document that the combination of unspanned labor income and the stickiness of insurance decisions reduces the insurance demand significantly. This is because an income shock induces the need to reduce the insurance coverage, since premia become less affordable. Since such a reduction is costly and families anticipate these potential costs, they buy less protection at all ages. In particular, young families stay away from life insurance markets altogether.
This country report was prepared for the 19th World Congress of the International Academy of Comparative Law in Vienna in 2014. It is structured as a questionnaire and provides an overview of the legal framework for Free and Open Source Software (FOSS) and other alternative license models like (e.g.) Creative Commons under German law. The first set of questions addresses the applicable statutory provisions and the reported case law in this area. The second section concerns contractual issues, in particular with regard to the interpretation and validity of open content licenses. The third section deals with copyright aspects of open content models, for example regarding revocation rights and rights to equitable remuneration. The final set of questions pertains to patent, trademark and competition law issues of open content licenses.
"Library Buildings around the World" is a survey based on several years of research. The objective was to gather information on library buildings at an international level, starting with 1990.
The sections on Germany, France, the United Kingdom and the United States have been thoroughly revised, supplemented and completed for this 2nd edition. A revision of the other countries is planned for the next edition.
In the United States, on April 1, 2014, the set of rules commonly known as the "Volcker Rule", prohibiting proprietary trading activities in banks, became effective. The implementation of this rule took more than three years, as “proprietary trading” is an inherently vague concept, overlapping strongly with genuinely economically useful activities such as market-making. As a result, the final Rule is a complex and lengthy combination of prohibitions and exemptions.
In January 2014, the European Commission put forward its proposal on banking structural reform. The proposal includes a Volcker-like provision, prohibiting large, systemically relevant financial institutions from engaging in proprietary trading or hedge fund-related business. This paper offers lessons to be learned from the implementation process for the Volcker rule in the US for the European regulatory process.
This paper distils three lessons for bank regulation from the experience of the 2009-12 euro-area financial crisis. First, it highlights the key role that sovereign debt exposures of banks have played in the feedback loop between bank and fiscal distress, and inquires how the regulation of banks’ sovereign exposures in the euro area should be changed to mitigate this feedback loop in the future. Second, it explores the relationship between the forbearance of non-performing loans by European banks and the tendency of EU regulators to rescue rather than resolve distressed banks, and asks to what extent the new regulatory framework of the euro-area “banking union” can be expected to mitigate excessive forbearance and facilitate resolution of insolvent banks. Finally, the paper highlights that capital requirements based on the ratio of Tier-1 capital to banks’ risk-weighted assets were massively gamed by large banks, which engaged in various forms of regulatory arbitrage to minimize their capital charges while expanding leverage. This argues in favor of relying on a set of simpler and more robust indicators to determine banks’ capital shortfall, such as book and market leverage ratios.
This paper investigates the risk channel of monetary policy on the asset side of banks’ balance sheets. We use a factor-augmented vector autoregression (FAVAR) model to show that aggregate lending standards of U.S. banks, such as their collateral requirements for firms, are significantly loosened in response to an unexpected decrease in the Federal Funds rate. Based on this evidence, we reformulate the costly state verification (CSV) contract to allow for an active financial intermediary, embed it in a New Keynesian dynamic stochastic general equilibrium (DSGE) model, and show that – consistent with our empirical findings – an expansionary monetary policy shock implies a temporary increase in bank lending relative to borrower collateral. In the model, this is accompanied by a higher default rate of borrowers.
Concepts of legal capacity and legal subjectivity have developed gradually through intermediate stages. Accordingly, there are numerous types of legal subjects and partial legal subjects, and ever-new types can develop, at the latest once the law confronts new social and technological challenges. Today such challenges seem to be making themselves felt especially in the field of information and communication technologies. Their specific communicative conditions resulting from the technological networking of social communication have a particularly pronounced influence on legal attributions of identity and action, and hence above all on issues of liability in electronic commerce. Here in particular it is becoming increasingly difficult to distinguish concrete human actors and, for example, to identify them as authors of declarations of intent or even as individually responsible agencies of legal transgressions. The communicative processes in this area appear instead as new kinds of chains of effects whose actors seem to be more socio-technical ensembles of people and things – whereby the artificial components of these hybrid human being-thing linkages can sometimes even be represented as driving forces and independent agents.
The subatomic world is governed by the strong interactions of quarks and gluons, described by Quantum Chromodynamics (QCD). Quarks experience confinement into colourless objects, i.e. they cannot be observed as free particles. Under extreme conditions such as high temperature or high density, this constraint softens and a transition to a phase where quarks and gluons are quasi-free particles (Quark-Gluon-Plasma) can occur. This environment resembles the conditions prevailing during the early stages of the universe shortly after the Big Bang.
The phase diagram of QCD is under investigation in current and future collider experiments, for example at the Large Hadron Collider (LHC) or at the Facility for Antiproton and Ion Research (FAIR). Due to the strength of the strong interactions in the energy regime of interest, analytic methods cannot be applied rigorously. The only tool to study QCD from first principles is given by simulations of its discretised version, Lattice QCD (LQCD).
These simulations belong to the realm of high-performance computing; hence, the numerical aspects of LQCD are a vital part of this field of research. In recent years, Graphic Processing Units (GPUs) have been incorporated into these simulations, as they are today a standard tool for general-purpose calculations.
In the course of this thesis, the LQCD application cl2qcd has been developed, which allows for simulations on GPUs as well as on traditional CPUs, as it is based on OpenCL. cl2qcd constitutes the first application for Wilson-type fermions in OpenCL.
It provides excellent performance and has been applied in physics studies presented in this thesis. The investigation of the QCD phase diagram is hampered by the notorious sign-problem, which restricts current simulation algorithms to small values of the chemical potential.
Theoretically, studying unphysical parameter ranges allows for constraints on the phase diagram. Of utmost importance is the clarification of the order of the finite temperature transition in the Nf=2 chiral limit at zero chemical potential. It is not known if it is of first or second order. To this end, simulations utilising Twisted Mass Wilson fermions aiming at the chiral limit are presented in this thesis.
Another possibility is the investigation of QCD at purely imaginary chemical potential. In this region, QCD is known to possess a rich phase structure, which can be used to constrain the phase diagram of QCD at real chemical potential and to clarify the nature of the Nf=2 chiral limit. This phase structure is studied within this thesis; in particular, the nature of the Roberge-Weiss endpoint is mapped out using Wilson fermions.
Travelling waves are the physical basis of frequency discrimination in many vertebrate and invertebrate taxa, including mammals, birds, and some insects. In bushcrickets (Tettigoniidae), the crista acustica is the hearing organ that has been shown to use sound-induced travelling waves. Up to now, data on mechanical characteristics of sound-induced travelling waves were only available along the longitudinal (proximal-distal) direction. In this study, we use laser Doppler vibrometry to investigate in-vivo radial (anterior-posterior) features of travelling waves in the tropical bushcricket Mecopoda elongata. Our results demonstrate that the maximum of sound-induced travelling wave amplitude response is always shifted towards the anterior part of the crista acustica. This lateralization of the travelling wave response induces a tilt in the motion of the crista acustica, which presumably optimizes sensory transduction by exerting a shear motion on the sensory cilia in this hearing organ.
Elizabeth Stirredge's spiritual autobiography is a treasury of spiritual wisdom which portrays all that is needed to be a faithful servant of the Lord Jesus Christ, and how God in His might works upon, transforms, and supports an ordinary soul to lead a life of extraordinary faithfulness. The text highlights Stirredge's intimate conviction as well as that of the early Quakers. This translation is a welcome venture, because this is a central piece, deserving of much more attention than it has been accorded until now.
Kpewi Durorp is the third attempt at bringing Durorp into the public domain, and is a more detailed introduction to the language. It contains sixteen chapters that address important elements of grammar, some of which include mini bilingual dictionaries whose words are organised not alphabetically but thematically, with the singular aim of facilitating learning and easy acquisition of the language. Durorp is an interesting and linguistically distinct semi-Bantu or Bantoid language spoken by a minority group of people known as Bororp, or people of the Kororp ethnic group. A part of this ethnic group inhabits the southwestern part of Cameroon while the other occupies the southeastern tip of Nigeria. As a minority group, Kororp has continued to suffer not only cultural and socio-economic shrinkage but also linguistic marginalisation characterised by an obvious erosion of some key elements of the language. Like any other language, however, Durorp has borrowings from languages such as Efik, Ejagham, and even English. There is a Durorp-English Dictionary to facilitate the development of Durorp vocabulary (Langaa, 2013).
Background: Dengue fever (DF) is the most rapidly spreading mosquito-borne viral disease in the world. In this decade it has expanded to new countries and from urban to rural areas. Nepal was regarded as DF-free until 2004. Since then, dengue virus (DENV) has rapidly expanded its range, even into mountain regions of Nepal, and major outbreaks occurred in 2006 and 2010. However, no data on the local knowledge, attitude and practice (KAP) regarding DF in Nepal exist, although such information is required for prevention and control measures.
Methods: We conducted a community based cross-sectional survey in five districts of central Nepal between September 2011 and February 2012. We collected information on the socio-demographic characteristics of the participants and their knowledge, attitude and practice regarding DF using a structured questionnaire. We then statistically compared highland and lowland communities to identify possible causes of observed differences.
Principal findings: Out of 589 individuals interviewed, 77% had heard of DF. Only 12% of the sample had good knowledge of DF. Those living in the lowlands were five times more likely to possess good knowledge than highlanders (P<0.001). Despite low knowledge levels, 83% of the people had good attitude and 37% reported good practice. We found a significantly positive correlation among knowledge, attitude and practice (P<0.001). Among the socio-demographic variables, the education level of the participants was an independent predictor of practice level (P<0.05), and education level and interaction between the sex and age group of the participants were independent predictors of attitude level (P<0.05).
Conclusion: Despite the rapid expansion of DENV in Nepal, the knowledge of people about DF was very low. Therefore, massive awareness programmes are urgently required to protect the health of people from DF and to limit its further spread in this country.
KippCity
(2014)
On 28 April 2011, on the Rathausplatz of Neukölln, Christine Hentschel's puzzlement vis-à-vis Neukölln's liberation met the neighbourhood's flickering urbanity, which she seeks to capture in a project called KippCity. KippCity is an experiment in tracing urban change while it happens. If space is the 'event of place', as Doreen Massey holds, the space of KippCity is the transformation of Neukölln. This chapter explores the potentials of multistable figures (Kippbilder) for conceptualizing urban change. This potential, Hentschel argues, lies in the flip-moment itself, in the space-time of urban transformation. In Berlin-Neukölln, a neighbourhood long branded as poor and failing, multiple and partly conflicting flip-scenarios have begun to inspire and haunt the neighbourhood and its self-reflective talk. KippCity Neukölln is thus a flickering figure. But unlike an artefact Kippbild, which flickers between duck and rabbit, for example, KippCity Neukölln does not simply tip into a new pre-fabricated form, but rather wavers between different future scenarios. Neukölln's flickering urbanity is thus nervous, full of uncertainty, frustration and enthusiasm. The article shows how the neighbourhood seeks escape from the dystopia of two dominant flip scenarios of ghettoization and gentrification by digging its claws into its 'Now'.
King of the Jungle
(2014)
In King of the Jungle, the bouts of ethno-religious violence in Jos are fused with the heartbreaking story of two brothers who go through life unaware of each other's existence. Carefully crafted with local colour which evokes memories of pre-2001 Jos, Bizuum Yadok's first novel weaves humour, urban realism, tragedy and redemption.
Kids: Africa in Childhood Poetry powerfully conveys the wishful thinking, imaginations, experiences and critical reflections of children as they grow up. The volume grapples with a wide range of topics, sensations, encounters, emotions, imaginations and vistas commonplace in the psyche of many children across different geographical and cultural spaces. While the audacity of Mawere's poetry finds its basis in the poet's profound ability to uncover the multi-layered journey from childhood to adulthood, its merit lies in the character-building, psychological, axiological and pedagogical lessons it imparts to today's youths: it teaches them the values of moral rectitude, critical observation and thinking, and careful questioning and reflection. This is a collection for all parents, teachers and youths between the ages of 5 and 18 who cherish a world ruled by peace and the unconditional love of all by all.
Kempfidris : a new genus of myrmicine ants from the Neotropical region (Hymenoptera: Formicidae)
(2014)
The new genus Kempfidris gen. nov. is described based on the workers of a single species, K. inusualis comb. nov., from Brazil, Ecuador, and Venezuela. Kempfidris inusualis comb. nov. was originally described by Fernández (2007) and provisionally placed in Monomorium awaiting a better understanding of the internal relationships in Myrmicinae. Kempfidris gen. nov. has a series of distinctive morphological characters including the mandibular configuration, vestibulate propodeal spiracle, propodeal carinae, and cylindrical micro-pegs on the posteromedian portion of abdominal tergum VI and anteromedian portion of abdominal tergum VII. This last trait appears to be autapomorphic for the genus.
Na(+)/H(+) exchangers are essential for regulation of intracellular proton and sodium concentrations in all living organisms. We examined and experimentally verified a kinetic model for Na(+)/H(+) exchangers, where a single binding site is alternatively occupied by Na(+) or one or two H(+) ions. The proposed transport mechanism inherently down-regulates Na(+)/H(+) exchangers at extreme pH, preventing excessive cytoplasmic acidification or alkalinization. As an experimental test system we present the first electrophysiological investigation of an electroneutral Na(+)/H(+) exchanger, NhaP1 from Methanocaldococcus jannaschii (MjNhaP1), a close homologue of the medically important eukaryotic NHE Na(+)/H(+) exchangers. The kinetic model describes the experimentally observed substrate dependences of MjNhaP1, and the transport mechanism explains alkaline down-regulation of MjNhaP1. Because this model also accounts for acidic down-regulation of the electrogenic NhaA Na(+)/H(+) exchanger from Escherichia coli (EcNhaA, shown in a previous publication) we conclude that it applies generally to all Na(+)/H(+) exchangers, electrogenic as well as electroneutral, and elegantly explains their pH regulation. Furthermore, the electrophysiological analysis allows insight into the electrostatic structure of the translocation complex in electroneutral and electrogenic Na(+)/H(+) exchangers.
One memorable quote from Karl Marx’s conception of religion is, “religion is the opium of the masses.” By this, he critiqued religion as an analgesic that dulls the senses, thus inducing a false sense of satisfaction and preventing the oppressed from revolting against the grubby socio-economic system. As the sigh of the oppressed, religion makes them resign themselves to fate, since it offers only an unrealistic eschatological hope. Rather than conceiving of religion in this prismatic way, contemporary events have shown that religion has become an amphetamine, a catalyst for revolt, not only at the global but also at the national level. This work argues that religion is used as an amphetamine, an energizing pill, to pursue goals other than religious ones, as depicted in the activities of the Boko Haram sect, which has raised security challenges in contemporary Nigeria.
Justice, not development : Sen and the hegemonic framework for ameliorating global inequality
(2014)
Starting from the merits of Sen's "Development as freedom", the article also explores its shortcomings. It argues that they are related to an uncritical adoption of the discourse of "development", which is the hegemonic framework for ameliorating global inequality today. This discourse implies certain limitations of thought and action, and the article points out three areas where urgent questions of global justice have been largely ignored by development theory and policy as a consequence. Struggles for justice on a global scale, this is the conclusion, should not take the detour of "development".
We report on a polarization measurement of inclusive J/ψ mesons in the di-electron decay channel at mid-rapidity at 2 < pT < 6 GeV/c in p + p collisions at √s = 200 GeV. Data were taken with the STAR detector at RHIC. The J/ψ polarization measurement should help to distinguish between different models of the J/ψ production mechanism since they predict different pT dependences of the J/ψ polarization. In this analysis, J/ψ polarization is studied in the helicity frame. The polarization parameter λθ measured at RHIC becomes smaller towards high pT, indicating more longitudinal J/ψ polarization as pT increases. The result is compared with predictions of presently available models.
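For orientation, the polarization parameter λθ quoted above is conventionally extracted from the polar-angle distribution of the decay electrons in the chosen reference frame (here the helicity frame). Assuming the standard convention and neglecting azimuthal terms, the distribution takes the form:

```latex
\frac{dN}{d\cos\theta} \;\propto\; 1 + \lambda_\theta \cos^{2}\theta ,
\qquad
\lambda_\theta = +1 \;\text{(fully transverse)}, \quad
\lambda_\theta = -1 \;\text{(fully longitudinal)} .
```

In this convention, a λθ that decreases towards high pT corresponds to increasingly longitudinal polarization, as stated in the abstract.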
During the 2011 Pb-Pb run, dedicated triggers were used by the ALICE Collaboration to enrich ultra-peripheral collisions (UPC) to measure the J/ψ production cross section and its rapidity dependence at a centre of mass energy of 2.76 TeV per nucleon pair. In this article, the ongoing studies on J/ψ photoproduction in UPC events are presented.
We review recent results on J/ψ production measured by the ALICE collaboration at the LHC. For pp collisions at √s = 7 TeV, yields and spectra of inclusive and prompt J/ψ, as well as results on their polarization and the charged-particle multiplicity dependence of yields, are presented. Measurements of the nuclear modification factor RAA of inclusive J/ψ at mid- (|y| < 0.9) and forward (2.5 < y < 4) rapidities, covering the range down to pT = 0, for centrality-selected Pb-Pb collisions are discussed. Also, first results on the J/ψ v2 at forward rapidities are shown.
It Takes Two
(2014)
FunDza celebrates young writers. Between June and December 2013, five of South Africa's best authors teamed up with five talented young writers to bring you this anthology of fast-paced, exciting short stories. From romance and heartache, to mystery and crime, these stories have something thrilling for every reader.
Two different experimental approaches were combined to study the electric dipole strength in the doubly-magic nucleus 48Ca below the neutron threshold. Real-photon scattering experiments using bremsstrahlung up to 9.9 MeV and nearly mono-energetic linearly polarized photons with energies between 6.6 and 9.51 MeV provided strength distribution and parities, and an (α,α' γ) experiment at Eα = 136 MeV gave cross sections for an isoscalar probe. The unexpected difference observed in the dipole response is compared to calculations using the first-order random-phase approximation and points to an energy-dependent isospin character. A strong isoscalar state at 7.6 MeV was identified for the first time supporting a recent theoretical prediction.
Background: Highly infectious diseases (HIDs) are (i) easily transmissible from person to person; (ii) cause a life-threatening illness with no or few treatment options; and (iii) pose a threat for both personnel and the public. Hence, even suspected HID cases should be managed in specialised facilities minimizing infection risks but allowing state-of-the-art critical care. Consensus statements on the operational management of isolation facilities have been published recently. The study presented was set up to compare the operational management, resources, and technical equipment among European isolation facilities. Due to differences in geography, population density, and national response plans it was hypothesized that adherence to recommendations will vary.
Methods and Findings: Until mid-2010 the European Network for Highly Infectious Diseases conducted a cross-sectional analysis of isolation facilities in Europe, recruiting 48 isolation facilities in 16 countries. Three checklists were disseminated, assessing 44 items and 148 specific questions. The median feedback rate for specific questions was 97.9% (n = 47/48) (range: n = 7/48 (14.6%) to n = 48/48 (100%)). Although all facilities enrolled were nominated 'specialised facilities' serving countries or regions, their design, equipment and personnel management varied. Eighteen facilities fulfilled the definition of a 'High Level Isolation Unit'. In contrast, 24 facilities could not operate independently from their co-located hospital, and five could not ensure access to equipment essential for infection control. Data presented are not representative of the EU in general, as only 16/27 (59.3%) of all Member States agreed to participate. Another limitation of this study is the time elapsed between data collection and publication; e.g. in Germany one additional facility opened in the meantime.
Conclusion: There are disparities both within and between European countries regarding the design and equipment of isolation facilities. With regard to the International Health Regulations, terminology, capacities and equipment should be standardised.
While patients with chronic hepatitis C virus (HCV) infection are treated in order to prevent liver-related morbidity and mortality, we rely on sustained virological response (SVR) as a virological biomarker to evaluate treatment efficacy both in clinical practice and in drug development. However, conclusive evidence for the clinical benefit of antiviral therapy, or for the validity of SVR as a surrogate marker, as derived from trials randomizing patients to a treatment or control arm, is lacking. In fact, the Hepatitis C Antiviral Long-term Treatment Against Cirrhosis (HALT-C) trial recently showed an increased mortality rate among interferon-treated patients compared to untreated controls. Consequently, the recommendation to treat patients with chronic HCV infection was challenged.
Here, we argue that the possible harmful effect of long-term low-dose pegylated interferon monotherapy, as observed in the HALT-C trial cohort, cannot be extrapolated to potentially curative short-term treatment regimens. Furthermore, we discuss SVR as a surrogate biomarker, based on numerous studies which indicated an association between SVR and improvements in health-related quality of life, hepatic inflammation and fibrosis, and portal pressure, as well as a reduced risk of hepatocellular carcinoma (HCC), liver failure and mortality.
The recent decline in euro area inflation has triggered new calls for additional monetary stimulus by the ECB in order to counter the threat of a self-reinforcing deflation and recession spiral. This note reviews the available evidence on inflation expectations, output gaps and other factors driving current inflation through the lens of the Phillips curve. It also draws a comparison to the Japanese experience with deflation in the late 1990s and the evidence from Japan concerning the output-inflation nexus at low trend inflation. The note concludes from this evidence that the risk of a self-reinforcing deflation remains very small. Thus, the ECB had best await the impact of the long-term refinancing operations decided in June, which have the potential to induce substantial monetary accommodation once implemented for the first time in September.
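The Phillips-curve lens used in the note above can be made concrete with a generic expectations-augmented specification (a textbook sketch, not the note's exact model; all symbols are illustrative):

```latex
% \pi_t: inflation, \pi_t^{e}: expected inflation,
% y_t - \bar{y}_t: output gap, \kappa: slope, \varepsilon_t: shock
\pi_t = \pi_t^{e} + \kappa \, (y_t - \bar{y}_t) + \varepsilon_t
```

A small slope $\kappa$ at low trend inflation, the output-inflation nexus documented for Japan, weakens the link between economic slack and disinflation, which is one reason a deflation spiral need not be self-reinforcing.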
The record-breaking prices observed in the art market over the last three years raise the question of whether we are experiencing a speculative bubble. Given the difficulty to determine the fundamental value of artworks, we apply a right-tailed unit root test with forward recursive regressions (SADF test) to detect explosive behaviors directly in the time series of four different art market segments (“Impressionist and Modern”, “Post-war and Contemporary”, “American”, and “Latin American”) for the period from 1970 to 2013. We identify two historical speculative bubbles and find an explosive movement in today’s “Post-war and Contemporary” and “American” fine art market segments.
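The SADF procedure mentioned above runs a Dickey-Fuller-type regression over forward-expanding sample windows and takes the supremum of the resulting t-statistics; large positive values signal explosive behavior. A minimal sketch (illustrative only, using a plain Dickey-Fuller regression without augmentation lags; the function names and window size are our assumptions, not the paper's):

```python
import numpy as np

def adf_tstat(y):
    """Dickey-Fuller t-statistic (regression with constant, no lagged differences)."""
    dy = np.diff(y)
    X = np.column_stack([np.ones(dy.size), y[:-1]])  # [constant, y_{t-1}]
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (dy.size - X.shape[1])      # residual variance
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])  # std. error of slope
    return beta[1] / se

def sadf(y, min_window=40):
    """Supremum ADF: max DF statistic over forward-expanding windows."""
    y = np.asarray(y, dtype=float)
    return max(adf_tstat(y[:end]) for end in range(min_window, y.size + 1))
```

Critical values for the supremum statistic are obtained by Monte Carlo simulation under the unit-root null, since the maximum over windows does not follow the standard Dickey-Fuller distribution.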
The exact pathophysiology of contrast-induced nephropathy (CIN) is not fully clarified, yet the osmotic characteristics of contrast media (CM) have been a significant focus in many investigations of CIN. Osmotic effects of CM specific to the kidney include transient decreases in blood flow, filtration fraction, and glomerular filtration rate. Potentially significant secondary effects include an osmotically induced diuresis with a concomitant dehydrating effect. Clinical experiences that have compared the occurrence of CIN between the various classes of CM based on osmolality have suggested a much less than anticipated advantage, if any, with a lower osmolality. Recent animal experiments actually suggest that induction of a mild osmotic diuresis in association with iso-osmolar agents tends to offset potentially deleterious renal effects of high viscosity-mediated intratubular CM stagnation.
Iredi War: A Folkscript
(2014)
Iredi War was the winner of The Nigeria Prize for Literature 2014. The playwright introduces the notion of the 'folk script' with its special stamp. The use of the oral literature genre allows for the full exploitation of creative licence, permitting swings from the historical to the oral, the natural to the supernatural, and the real to the fantastic.
Natural products (NPs) have been a rich source for pharmaceutically used anti-infectives and other drugs. However, the application of anti-infectives inevitably causes the development of resistant and multiresistant pathogens, which have to be treated with novel anti-infectives. Industrial research into novel anti-infectives concentrated for a long time on members of the bacterial Actinomycetales. For several reasons, e.g. the rediscovery of already known NPs, pharmaceutical companies abandoned their NP research and focused on drug development based on combinatorial chemistry. However, purely synthetic compound libraries, with their limited structural diversity, have not been a fruitful source of bioactive compounds. Hence the discovery of novel bioactive NPs as a source for anti-infectives is still of economic and humanitarian interest and will remain an important branch of research in the future. One strategy to circumvent the rediscovery of bioactive NPs is the analysis of as yet unexplored bacterial taxa. Based on this assumption, this work aimed at the discovery of novel NPs from the entomopathogenic bacterial genera Xenorhabdus and Photorhabdus and other promising taxa, as well as the investigation of their biosynthesis. ...
Understanding the role of structure and social aspects regarding heat stress of people in urban areas requires an interdisciplinary scientific approach that connects methods from both natural sciences and social sciences. In this study, we combine three approaches to provide an interdisciplinary analysis of the structural and social components of heat stress in the city of Aachen, Germany. First, we assess the overall spatial structure of the urban heat island using spatially distributed measurements from mobile air temperature recordings on public transport units combined with spatially distributed geo-statistical data. The results indicate that the time of day matters: during the afternoon, areas with a relatively low building density, like the industrial area northeast of the inner city, are the warmest, while surfaces in high-building-density areas like the inner city heat up faster during the evening. Second, we combine these measurements with place-based survey data collected in 2010 from residents aged 50 to 92 regarding their individual housing conditions, medical history and social integration to examine the match between heat stress of older residents, their social conditions and elevated temperatures in their residential quarter. We identify disadvantaged areas for specific already-disadvantaged demographic groups in the city, pointing to an accumulation of inequalities, including heat stress among the most vulnerable. Third, we compare data from biometeorological measurements on urban public squares during the afternoon with results of the micrometeorological model ENVI-met to examine the spatial variability of the inner-city heat load. We complement the modelling results with on-site interviews to evaluate people’s heat perception at the same public places.
A simulation shows that additional vegetation would increase thermal comfort at these public places, with the heat load, assessed using the predicted mean vote (PMV), decreasing by approximately 60 %. Furthermore, we demonstrate the strengths and weaknesses of heat stress simulation. ENVI-met allows for an overall reasonable representation of heat load during stable atmospheric conditions. However, due to the setup and structure of ENVI-met, large-scale atmospheric changes that occur during the day cannot readily be integrated into ENVI-met simulations.
Batten disease refers to the neuronal ceroid lipofuscinoses (NCLs), inherited lysosomal storage diseases with diverse ages of onset that cause progressive neurodegeneration. The most common NCL is juvenile NCL (JNCL), which begins in early childhood and is characterized by lysosomal accumulation of subunit c of the mitochondrial ATP synthase (subunit c). JNCL is caused by mutations in the gene CLN3, which encodes the CLN3 protein, a transmembrane protein of unknown structure. The localization of CLN3 is ambiguous, and its exact cellular function is not known. It is therefore unclear what mechanisms lead to neurodegeneration in JNCL. Models of JNCL present disturbed membrane-bound organelles and cytoskeleton as well as impaired autophagy and lysosomal function. The JNCL gene defect that most patients harbor is a deletion of exons 7 and 8 of CLN3. In the Cln3Δex7/8/Δex7/8 mouse model of JNCL, this deletion has been introduced into the mouse Cln3 gene.
The actin cytoskeleton consists of filaments formed through polymerization of actin and provides a framework which defines cellular morphology and also facilitates cell motility, cytokinesis, and cell surface remodeling. Rho GTPases are signaling proteins which regulate the assembly and dynamics of the actin cytoskeleton and play an important role in neuronal morphology. Rho GTPases need to be membrane-anchored in order to become active and initiate a signaling cascade. Their membrane anchorage is achieved through their geranylgeranyl tails, which they acquire through prenylation. Protein prenylation refers to the attachment of a geranylgeranyl or farnesyl group to the C-terminus of a protein. The enzyme geranylgeranyl transferase (GGTase) catalyzes geranylgeranylation, whereas geranylgeranyl pyrophosphate (GGPP) is the donor of the geranylgeranyl group. Cells produce GGPP as well as cholesterol and other lipids through the mevalonate pathway (MVA pathway).
The aim of this study was to analyze how the JNCL gene defect affects cellular morphology, especially the actin cytoskeleton and Rho GTPases, and the MVA pathway, which is connected with Rho GTPase activation. These important cellular components play crucial roles in neurons and are implicated in other neurodegenerative diseases, but have received little attention in JNCL. The immortalized CbCln3Δex7/8/Δex7/8 cerebellar precursor cell line from Cln3Δex7/8/Δex7/8 mice was used for the experiments and provides a genetically accurate, neuronal cell model of JNCL. CbCln3Δex7/8/Δex7/8 cells present subunit c accumulation only when aged at confluency, but sub-confluent cells display other phenotypes. The experiments of this study were performed both with confluency-aged and sub-confluent cells. Filamentous actin was visualized, and protein levels as well as membrane localization of several small Rho GTPases were analyzed biochemically. The protein levels of GGTase and of the key enzymes of the mevalonate pathway were also determined.
The staining pattern of filamentous actin was disturbed in confluency-aged CbCln3Δex7/8/Δex7/8 cells. Additionally, these cells did not grow to wild-type size and exhibited an elongated peroxisomal morphology. Rho GTPases had reduced total levels and showed a tendency toward decreased membrane localization. Levels of GGTase and the MVA pathway enzymes were altered. Results for sub-confluent CbCln3Δex7/8/Δex7/8 cells were similar, with the exception of HMG-CoA reductase, the rate-limiting enzyme of the MVA pathway: while its level in confluency-aged CbCln3Δex7/8/Δex7/8 cells was increased, at sub-confluency it was reduced. Also, in contrast to the confluency-aged cells, Rho GTPases presented a tendency toward increased membrane localization.
The results of this study reveal that the genetically accurate JNCL gene defect alters cellular morphology and the activity of the MVA pathway in neuronal cells. Small cell size and disrupted architecture of the actin cytoskeleton are confirmed as neuronal JNCL phenotypes, and the peroxisome is introduced as a novel cellular component affected in JNCL. Through defects in endocytosis, autophagy, lysosomal and mitochondrial function, and the cytoskeleton, the JNCL gene defect may prevent cells from growing to wild-type size. The JNCL gene defect may attenuate the MVA pathway via mitochondrial dysfunction and/or upregulation of degradative processes. Attenuation of the MVA pathway may contribute to impaired membrane rafts, which are an established phenotype of JNCL cells. As indicated by the reduced GGTase level and supported by the downregulation of lipid production through the MVA pathway, the JNCL gene defect might also decrease protein prenylation.
The mature palm forest of the Vallée de Mai, a UNESCO World Heritage Site, on the Seychelles island of Praslin, is a unique ecosystem containing many endemic species, including the iconic coco de mer palm Lodoicea maldivica. In 2009, the invasive yellow crazy ant Anoplolepis gracilipes was recorded for the first time within the palm forest, raising concern about its potential impacts on the endemic fauna. This research aimed to: (1) assess the current distribution and spread of A. gracilipes within the palm forest; (2) identify environmental variables that are linked to A. gracilipes distribution; and (3) compare endemic species richness and abundance in A. gracilipes invaded and uninvaded areas. Anoplolepis gracilipes was confined to the north-east of the site and remained almost stationary between April 2010 and December 2012, with isolated outbreaks into the forest. Infested areas had significantly higher temperature and humidity and lower canopy cover. Abundance and species richness of the endemic arboreal fauna were lower in the A. gracilipes invaded area. Molluscs were absent from the invaded area. The current restricted distribution of A. gracilipes in this ecosystem, combined with the lower abundance of endemic fauna in the invaded area, highlights the need for further research to assess control measures and the possible role of biotic resistance to the invasion of the palm forest by A. gracilipes.
D2.1. provides further elaboration of the original research design and informs about ideas for the final Volume II of bEUcitizen. It is closely connected to task 1 of work package 2: specifying various concrete tasks for the different work packages and formulating overarching questions suitable to provide substantive cohesion and integration of the overall project. The elaboration of 10 cross-cutting topics (to become chapters in the “horizontal” book, D2.3.) is a first step towards this goal. Discussing these cross-cutting topics is supposed to feed, infuse and inspire the work done in the different work packages and to build cross-cutting connections between them. Themes 1-10 merge into a valuable overview of the multi-faceted research on (EU) citizenship. They access the main issues of EU citizenship and citizenship in general from different angles and different disciplines. Taken together, these contributions help to identify barriers to EU citizenship and ways to overcome them. Each Theme formulates questions about how it might feed, and be fed by, further information and findings in the other work packages.
D2.1. is mainly meant for internal use. Its functions are, firstly, to inform about preliminary ideas and possible contributions to the planned final results and, secondly, to set out some more or less specific guiding questions that connect the work done by the individual researchers in each work package to the project as a whole. This task implies a normative yardstick, a clear picture of what would be a "good" EU citizenship practice. Elaborating on such a normative yardstick is a meta-topic that cuts across the range of cross-cutting topics presented in this working paper.
Introduction - Issue 7
(2014)
Introduction
(2014)
Bantu languages have been at the heart of the research on the interaction between syntax, prosody and information structure. In these predominantly SVO languages, considerable attention has been devoted to postverbal phenomena. By addressing issues related to Subjects, Topics and Object-Verb word orders, the goal of the present papers is to deepen our understanding of the interaction of different grammatical components (syntax, phonology, semantics/pragmatics) both in individual languages and across the Bantu family. Each paper makes a valuable contribution to ongoing discussions on the preverbal domain.
Introduction
(2014)
Introduction
(2014)
The experience of multistable figures or so-called Kippbilder - the sudden and repeated 'kippen' of perception as the same object is seen under different aspects - is fascinating in its own right. However, what animated the year-long discussion leading to this volume was a critical exploration of the proposition that such figures may offer a helpful model for thinking through the intercultural and interdisciplinary effort of productively negotiating between conflicting positions.
In my paper I take issue with those proponents of ‘intersectionality’ who believe that a theoretical concept cannot or should not be detached from its original context of invention. Instead, I argue that the traveling of theory in a global context necessarily involves appropriations, amendments and changes with respect to the original meaning. However, I reject the idea that ‘intersectionality’ can be used as a free-floating signifier; on the contrary, it has to be embedded in the respective (historical, social, cultural) context in which it is used. I will start by mapping some of the current debates engaging with the pros and cons of the global implementation of the concept (the controversy about master categories, the dispute about the centrality of ‘race’, and the argument about the amendment of categories). I will then turn to my own use of ‘intersectionality’ as a methodological tool (elaborated in Lutz and Davis 2005). Here, we shifted attention from how structures of racism, class discrimination and sexism determine individuals’ identities and practices to how individuals continually and flexibly negotiate their multiple and converging identities in the context of everyday life. Introducing the term doing intersectionality, we explored how individuals creatively, and often in surprising ways, draw upon various aspects of their multiple identities as a resource to gain control over their lives.
In my paper I will show how ‘gender’ or ‘ethnicity’ are invariably linked to structures of domination, but can also mobilize or deconstruct disempowering discourses, even undermine and transform oppressive practices.
We show that, under in vitro conditions, the vulnerability of astroglia to hypoxia is reflected by alterations in endothelin (ET)-1 release and in the capacity of erythropoietin (EPO) to regulate ET-1 levels. Exposure of cells to 24 h hypoxia did not induce changes in ET-1 release, while 48–72 h hypoxia resulted in increased ET-1 release from astrocytes that could be abolished by EPO. The endothelin receptor type A (ETA) antagonist BQ123 increased extracellular levels of ET-1 in a human fetal astroglial cell line (SV-FHAS). The survival and proliferation of rat primary astrocytes, neural precursors, and neurons under hypoxic conditions were increased by administration of BQ123. Hypoxic injury and aging affected the interaction between the EPO and ET systems. Under hypoxia, EPO decreased ET-1 release from astrocytes, while ETA receptor blockade enhanced the expression of EPO mRNA and EPO receptor in culture-aged rat astroglia. Blockade of the ETA receptor can increase the availability of ET-1 to the ETB receptor and can potentiate the neuroprotective effects of EPO. Thus, a new therapeutic use of combined administration of EPO and ETA receptor antagonists during hypoxia-associated neurodegenerative disorders of the central nervous system (CNS) can be suggested.
We propose an effective theory of SU(3) gluonic matter where interactions between color-electric and color-magnetic gluons are constrained by the center and scale symmetries. Through matching to the dimensionally-reduced magnetic theories, the magnetic gluon condensate qualitatively changes its thermal behavior above the critical temperature. We argue its phenomenological consequences for the thermodynamics, in particular the dynamical breaking of scale invariance.
The impact of international migration, both South-South and South-North, on the economic, social and political life of the people in Eastern and Southern Africa '[was] not well documented and studied', and 'the evidence-base for policy on migration and development [was] very weak.' With this in mind, OSSREA's invitation to conduct a study on international migration in Africa had the following objectives: to analyze the nature and type of South-South migration, focusing on issues such as brain gain and/or brain drain, remittance flows, technical know-how transfers, violations of the rights of African migrants and gender dimensions of migration; to investigate the dynamics of migration from Eastern and Southern Africa to the Arab Gulf States as well as to developed countries, focusing on the skills of migrants, brain gain and/or drain, remittance flows, technical know-how transfers, violations of the rights of African migrants and gender dimensions of migration; and to assess the successes, impediments and challenges of African international migrants from Eastern and Southern Africa and to formulate policy recommendations to maximize the gains and minimize the costs associated with international migration in Africa.
The article aims to investigate, under the aspect of translation, the process of legal appropriation and reproduction of international law during the course of the 19th century. An occidental understanding of translation played an important role in the so-called process of universalization in the 19th century, as it made the complexity of global circulation of ideas invisible. Approaches proposed by scholars of Postcolonial, Cultural and Translation Studies are useful for re-reading histories of the circulation of European ideas, particularly the international law doctrines, from a different perspective. The great strides made in Translation and Cultural Studies in the last decades, as well as the discernment practiced in the scholarship of Postcolonial Studies, are important for a broader and more differentiated understanding of the processes of appropriation and reproduction of the doctrines of international law during the 19th century. The present article begins by tracing the connection between translation and universalization of concepts in 19th century international law; after a short excursus on the Western idea of translation, the attention is focused on the translation of international law textbooks. The conclusive section is dedicated to a comparison between Emer de Vattel’s Droit des gens and Andrés Bello’s Principios de Derecho de Jentes.
While distribution conflicts over natural resources were central to the debates on a New International Economic Order (NIEO), during the last decades the specific distribution conflicts surrounding natural resource exploitation have no longer been at the core of international law. In this paper I trace the developments in the relationship between international law and resource distribution conflicts. I first argue that the New International Economic Order favored the political resolution of distribution conflicts over natural resources and envisaged international distribution conflicts being addressed by the political organs of international institutions within legal procedures. Second, I show how the NIEO was surpassed by a different order that relied largely on the market as a distribution mechanism for raw materials, and how international institutions and international law played a crucial role in the establishment of this order by promoting the privatization of natural resource exploitation and protecting foreign direct investment and trade. Third, with reference to the copper industry in Zambia, I illustrate how international investment law, and more broadly international economic law, is shaping (and affecting the resolution of) distribution conflicts not only between, but also within States. I conclude with a call for a renewed focus on an international law of resource conflicts to allow for their political resolution, given the countermoves we can observe with respect to international investment law and the persistence of (violent) conflicts over natural resource exploitation within States.
The establishment of robust HCV cell culture systems and characterization of the viral life cycle provided the molecular basis for highly innovative, successful years in HCV drug development. With the identification of direct-acting antiviral agents (DAAs), such as NS3/4A protease inhibitors, NS5A replication complex inhibitors, nucleotide and non-nucleoside polymerase inhibitors, as well as host cell targeting agents, novel therapeutic strategies were established and competitively entered clinical testing. The first-in-class NS3/4A protease inhibitors telaprevir and boceprevir, approved in 2011, were recently outpaced by the pan-genotypic nucleotide polymerase inhibitor sofosbuvir that, in combination with pegylated interferon and ribavirin, further shortens therapy durations and also offers the first interferon-free HCV treatment option. In the challenging race towards the goal of interferon-free HCV therapies, however, several oral DAA regimens without nucleotide polymerase inhibitors that combine a NS3/4A protease inhibitor, a NS5A inhibitor and/or a non-nucleoside polymerase inhibitor yielded competitive results. Second-generation NS3/4A protease and NS5A inhibitors promise improved genotypic coverage and a high resistance barrier. Results of novel DAA combination therapies without the backbone of a nucleotide polymerase inhibitor, as well as treatment strategies involving host targeting agents, are reviewed herein.
Seven different instruments and measurement methods were used to examine the immersion freezing of bacterial ice nuclei from Snomax® (hereafter Snomax), a product containing ice active protein complexes from non-viable Pseudomonas syringae bacteria. The experimental conditions were kept as similar as possible for the different measurements. Of the participating instruments, some examined droplets which had been made from suspensions directly, and the others examined droplets activated on previously generated Snomax particles, with particle diameters of mostly a few hundred nanometers and up to a few micrometers in some cases. Data were obtained in the temperature range from −2 to −38 °C, and it was found that all ice active protein complexes were already activated above −12 °C. Droplets with different Snomax mass concentrations covering 10 orders of magnitude were examined. Some instruments had very short ice nucleation times down to below 1 s, while others had comparably slow cooling rates around 1 K min−1. Displaying data from the different instruments in terms of numbers of ice active protein complexes per dry mass of Snomax, nm, showed that within their uncertainty the data agree well with each other as well as with previously reported literature results. Two parameterizations were taken from the literature for a direct comparison with our results: a time-dependent approach based on a contact angle distribution (Niedermeier et al., 2014) and a modification of the parameterization presented in Hartmann et al. (2013), representing a time-independent approach. The agreement between these and the measured data was good, i.e. they agreed within a temperature range of 0.6 K or, equivalently, within a factor of 2 in nm. From the results presented herein, we propose that Snomax, at least when carefully shared and prepared, is a suitable material to test and compare different instruments for their accuracy of measuring immersion freezing.
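The per-mass quantity nm used above is conventionally derived from the frozen droplet fraction via the standard singular (Vali-type) relation for immersion freezing. A minimal sketch, with illustrative function and variable names (not the paper's own code):

```python
import numpy as np

def ice_active_sites_per_mass(frozen_fraction, mass_concentration, droplet_volume):
    """Singular-description relation: n_m(T) = -ln(1 - f_ice(T)) / (C_m * V_d).

    frozen_fraction    f_ice(T): fraction of droplets frozen at temperature T
    mass_concentration C_m: Snomax dry mass per unit suspension volume
    droplet_volume     V_d: volume of one droplet (same volume unit as C_m)
    Returns ice active sites per unit dry mass of Snomax.
    """
    f = np.asarray(frozen_fraction, dtype=float)
    return -np.log1p(-f) / (mass_concentration * droplet_volume)
```

For example, with C_m·V_d equal to one mass unit of Snomax per droplet, a frozen fraction of 1 − e⁻² (about 86.5 %) corresponds to nm = 2 sites per mass unit; the logarithm corrects for droplets that contain more than one ice active site.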
Thought to be monotypic for decades, the only species in the goosefish genus Lophiomus Gill, Lm. setigerus (Vahl), shows a wide range of morphological variation and is distributed widely in the Indo-West Pacific (IWP). In this study, datasets of two mitochondrial and two nuclear gene sequences obtained from samples of Lophiomus collected in different localities across the IWP were constructed and analyzed to explore the phylogeny and species diversity within the genus. Our integrated approach with multiple lines of evidence unveiled an unanticipated richness of at least six delimited species of Lophiomus. Herein, based on materials already available from museums and new specimens obtained primarily through the Tropical Deep-Sea Benthos program surveying IWP benthic fauna, we formally describe three new species: Lm. immaculioralis sp. nov., Lm. nigriventris sp. nov., and Lm. carusoi sp. nov. Also, we resurrect Lm. laticeps stat. rev. from the synonyms of Lm. setigerus. These species can be diagnosed from each other and from Lm. setigerus by genetics, body coloration, patterns on the floor of the mouth, peritoneum pigmentation, morphometric measurements, and meristic counts of cranial spines, dorsal-fin spines, and pectoral-fin and pelvic-fin rays. The species Lm. setigerus, as well as the genus Lophiomus, are re-described accordingly based on the new results. Amended identification keys to the four extant lophiid genera and to the species of Lophiomus are also provided.
Autophagy plays an essential role in maintaining an intricate balance between nutrient demands and energetic requirements during normal homeostasis. Autophagy recycles metabolic substrates from nonspecific bulk degradation of proteins and excess or damaged organelles. Recent work posits an active and dynamic signaling role for extracellular matrix-evoked autophagic regulation, that is, allosteric and independent of prevailing nutrient conditions. Several candidates, representing a diverse repertoire of matrix constituents (decorin, collagen VI, laminin α2, endostatin, endorepellin, and kringle V), can modulate autophagic signaling pathways. Importantly, a novel principle indicates that matrix constituents can differentially modulate autophagic induction and repression via interaction with specific receptors. Most of the matrix-derived factors described here appear to control autophagy in a canonical manner but independent of nutrient deprivation. Because the molecular composition and structure of the extracellular matrix are dynamically remodeled during various physiological and pathological conditions, we propose that matrix-regulated autophagy is key for maintaining proper tissue homeostasis and disease prevention, such as cancer progression and muscular dystrophies.
A recent trend in international development circles is "New Institutionalism". In a slogan, the idea is just that good institutions matter. The slogan itself is so innocuous as to be hardly worth comment. But the push to improve institutional quality has the potential to have a much less innocuous impact on aid efforts and other aspects of international development. This paper provides a critical introduction to some of the literature on institutional quality. It looks, in particular, at an argument for the conclusion that making aid conditional on good institutional quality will promote development by reducing poverty. This paper suggests that there is little theoretical or empirical evidence that this kind of conditionality is good for the poor.
As in many parts of the world, water resources in sub-Saharan Africa (SSA) have been pivotal for human survival, economic growth, social development, and the practice of religious and cultural ethos. However, in spite of the intrinsic values of water, its use and management in sub-Saharan Africa have not been without limitations. The demand for water resources is increasing, mainly due to rapid population growth, industrialization and urbanization, and dealing with water-related issues has been complex and challenging for sustainable growth. Whilst there are various efforts by national governments, non-government organizations and communities to effectively and efficiently utilize and manage water resources, there are few comprehensive studies in sub-Saharan Africa that show the impact of these efforts on poverty reduction. Although certain reports indicate that many SSA countries lack a clear vision of how water use can be harnessed for pro-poor growth and how poor communities can be capacitated to use water for poverty reduction, there are few exhaustive studies that clearly show familiar and innovative water use and management interventions followed by communities, national governments and other stakeholders, and demonstrate the challenges and successes of the same. Cognizant of this knowledge gap, in 2012 OSSREA launched a research project on ordinary and innovative water use and management patterns and practices in SSA, with a view to generating new knowledge on unexploited opportunities that could enhance the contribution of water resources to poverty reduction. This anthology documents various issues including water use and management in agriculture, especially in irrigation projects in Ethiopia, Kenya, Uganda and Zimbabwe; water harvesting in Kenya and Uganda; the role of local water use institutions in Ethiopia; and water source maintenance and protection in Uganda.
Background: Right heart failure is a fatal consequence of chronic pulmonary hypertension (PH). The development of PH is characterized by increased proliferation of vascular cells, in particular pulmonary artery smooth muscle cells (PASMCs) and pulmonary artery endothelial cells. In the course of PH, right ventricular (RV) afterload increases, which leads to increased perioperative morbidity and mortality. BKCa channels are ubiquitously expressed in vascular smooth muscle cells, and their opening induces cell membrane hyperpolarization followed by vasodilation. Moreover, BKCa channel activation induces anti-proliferative effects in a multitude of cell types. On this basis, we hypothesized that treatment with the nebulized BKCa channel opener NS1619 might be a therapeutic option for pulmonary hypertension, and we tested this in rats.
Methods: (1) Rats received monocrotaline injection for PH induction. Twenty-four days later, rats were anesthetized and NS1619 or the solvent was administered by inhalation. Systemic hemodynamic parameters, RV hemodynamic parameters, and blood gas analyses were measured before as well as 30 and 120 minutes after inhalation. (2) Rat PASMCs were stimulated with PDGF-BB in the presence and absence of NS1619. AKT, ERK1 and ERK2 activation were investigated by western blot analyses, and relative cell number was determined 48 hours after stimulation.
Results: Inhalation of 12 µM and 100 µM NS1619 solutions significantly reduced RV pressure without affecting systemic arterial pressure. Blood gas analyses demonstrated significantly reduced carbon dioxide levels and improved oxygenation in NS1619-treated animals, pointing towards a considerable pulmonary shunt-reducing effect. In PASMCs, NS1619 (100 µM) significantly attenuated proliferation via a pathway independent of AKT and ERK1/2 activation.
Conclusion: NS1619 inhalation reduces RV pressure and improves oxygen supply, and its application inhibits PASMC proliferation in vitro. Hence, BKCa channel opening might be a novel option for the treatment of pulmonary hypertension.
Introduction: In recent years, electronic cigarettes (ECs) have become more popular, particularly among individuals who want to give up smoking tobacco. The aim of the present study was to assess the influence of different e-smoking liquids on the viability and proliferation of human periodontal ligament fibroblasts.
Method and materials: For this study, six test solutions with components from ECs were selected: lime-, hazelnut-, and menthol-flavored liquids, nicotine, propylene glycol, and PBS as the control. The fibroblasts were incubated for up to 96 h with the different liquids, and cell viability was measured using the PrestoBlue® reagent, an ATP detection assay, and a migration assay. Fluorescence staining was carried out to visualize cell growth and morphology. Data were statistically analyzed by two-tailed one-way ANOVA.
Results: The cell viability assay showed that the proliferation rates of cells incubated with nicotine or the various flavored liquids of the e-cigarettes were reduced in comparison to the controls, though not all reductions were statistically significant. After 96 h of incubation with the menthol-flavored liquid, fibroblast numbers were significantly reduced (p < 0.001). Similar results were found for the detection of ATP in fibroblasts: incubation with the menthol-flavored liquid led to a statistically significant reduction (p < 0.001). The cell visualization tests confirmed these findings.
Conclusion: Within its limitations, the present in vitro study demonstrated that menthol additives in e-smoking liquids have a harmful effect on human periodontal ligament fibroblasts. This may indicate that menthol additives should be avoided in e-cigarettes.