University Publications
Digital transformation, coupled with the simplified availability of data, brings artificial intelligence (AI) closer to commercial use. For the data-driven financial industry, AI is of intense interest within pilot projects; still, few AI applications have been implemented so far. This study analyzes drivers and inhibitors of successful AI adoption in the financial industry based on panel data comprising 22 semi-structured interviews with experts on AI in finance, including interviewees from leading software providers such as SAP, IBM, Salesforce, and Microsoft. For applying AI successfully, the resulting guidelines identify several data conditions, AI-specific role models, and the overcoming of moral concerns as crucial before trained algorithms reach a quality level at which they can operate without human intervention.
Responding to inadequate awareness of the outstanding importance of biodiversity, the BioFrankfurt network was founded in 2004 in the State of Hesse, Germany. It is presented here as a case study and may serve as a model for other parts of the world, such as the Middle East. In 2007, only about 26% of the German population were familiar with the term “biodiversity”, and most of them had only a vague idea of its meaning. The BioFrankfurt network of institutions addressed this problem, raising public awareness and supporting research, education and conservation. A regional biodiversity education program has been developed and delivered to more than 500 schools. Since 2007, an innovative public relations campaign has combined raising awareness of regional biodiversity issues with activities to improve the public image of the Frankfurt area. Because of its geographical focus, the network’s activities gained the attention of local and regional politicians and other decision makers, culminating in the joint establishment of a new Biodiversity and Climate Research Centre by BioFrankfurt member institutions. The success of current activities attracts interesting partners, resulting in challenging cooperation initiatives. The authors are convinced that the network’s concepts and activities have a great potential to profoundly enhance the notion and acceptance of biodiversity issues elsewhere. Keywords: BioFrankfurt, biodiversity network, education, public awareness, scientific communication
COPA syndrome is a newly discovered hereditary immunodeficiency affecting the lungs, kidneys, and joints. The mutated gene encodes the α subunit of coatomer complex I, a protein transporter from the Golgi back to the endoplasmic reticulum. The impaired return of proteins leads to intracellular stress. The syndrome is an autoimmune and autoinflammatory disease that can be grouped among the interferonopathies. Knowledge about COPA syndrome and its treatment is still limited. In this paper, we describe an additional patient, a 15-year-old girl with rheumatoid factor-positive polyarthritis and rheumatoid nodules since the age of 2, who developed interstitial lung disease. The detected mutation c.698G>A was identified as disease-causing. The patient presented with symmetric polyarthritis of the wrists, fingers, and hip and ankle joints, with significant functional impairment and high disease activity. Laboratory parameters demonstrated chronic inflammation, hypergammaglobulinemia, high-titre ANA (antinuclear antibodies) and CCP (anti-citrullinated protein) antibodies, and rheumatoid factors. Therapies with various DMARDs (disease-modifying anti-rheumatic drugs) and biologicals failed. Upon baricitinib application, the clinical activity decreased dramatically, with disappearance of joint pain and morning stiffness and a significant decrease of joint swelling. Low disease activity was reached after 12 months, with complete disappearance of the rheumatoid nodules. In contrast to IL-1 (interleukin-1), IL-6, and TNF (tumor necrosis factor) inhibitors, baricitinib was very successful, probably because baricitinib acts as a JAK-1/2 (Janus kinase-1/2) inhibitor in the IFNα/β (interferon α/β) pathway. A relatively higher dose is necessary in children. COPA syndrome represents a novel disorder of intracellular transport. A review of the published literature on COPA syndrome identified 31 further cases in addition to our patient.
The new building for Faculty 09, »Sprach- und Kulturwissenschaften« (Linguistics and Cultural Studies), on Campus Westend is receiving its finishing touches. The internationally renowned artist collective Raqs Media Collective, founded in New Delhi in 1992, won the »Kunst am Bau« (art in architecture) competition organized by the State of Hesse with its design. The three-part work »All, Humans« will be ceremonially inaugurated on the evening of 2 November 2023 in the foyer of the SKW building. Students of the master's program »Curatorial Studies« have closely followed its creation over the past months and, in dialogue with the artists and experts, offer insights into and engagement with the filmic installation.
We analyze the impact of decreases in available lending resources on quantitative and qualitative dimensions of firms’ patenting activities. We thereby make use of the European Banking Authority’s capital exercise to carve out the causal effect of bank lending on firm innovation. In order to do so, we combine various datasets to derive information on firms’ financials, their patenting behavior, and their relationships with their lenders. Building on this self-generated dataset, we provide support for the “less finance, less innovation” view. At the same time, we show that lower available financial resources lead firms to improve the qualitative dimensions of their patents. Hence, we carve out a “less finance, less but better innovation” pattern.
The irradiation of tumors subject to respiratory motion poses a challenge for modern radiotherapy. This thesis first presents the physical, technical, and medical fundamentals in order to ease the reader into this complex topic. Furthermore, various techniques for irradiating target volumes subject to respiratory motion are presented. The safety margins required to compensate for errors in the treatment chain when defining the planning target volume are also discussed.
Within this work, a concept was developed that allows the safety margin for moving tumors in radiosurgery with the CyberKnife tumor-tracking system to be reduced even further, thereby further widening the so-called therapeutic window of the treatment. For this purpose, a 4D CT and a gating system were brought into clinical operation. The technique developed is based on the ten individual respiratory phases of the 4D CT and allows moving organs at risk to be taken into account already during treatment planning. This method was compared with current irradiation techniques by means of a treatment-plan comparison across ten patient cases. The treatment planning systems of Varian (Eclipse 13.5) and Accuray (Multiplan 4.6) were used to create the plans. In particular, the doses to the organs at risk and the volumes of selected isodoses were considered. A clear dependence of the burden on healthy tissue on the irradiation technique used became apparent. This permits the conclusion that a reduction of the safety margin, which depends on the planning and irradiation technique used, goes hand in hand with a widening of the therapeutic window. In addition, with a low burden on the surrounding healthy tissue, the option of further irradiation remains open.
Subsequently, based on computed test plans, measurements were performed with a measurement phantom modified for this work at the Varian Clinac DHX and at the CyberKnife VSI. The irradiation techniques from the plan comparison were employed in order to compare the computed and the actually delivered dose. The phantom simulates the patient's breathing and simultaneously allows verification of the dose distribution with EBT3 films as well as measurements with ionization chambers. It was found that for the techniques that actively account for breathing (Synchrony at the CyberKnife and gating at the Varian Clinac), good agreement between measured and computed dose distributions is obtained even in the low-dose region. When the motion of the target volume is already taken into account during treatment planning, the agreement increases further. For techniques that incorporate breathing only in the definition of the target volume (ITV concept), both the values measured with ionization chambers and the agreement between computed and measured dose distributions lie outside the tolerance range.
A further question of this work concerns the targeting accuracy of the CyberKnife tumor-tracking system (Synchrony). Measurements were performed with the XSight Lung phantom and different safety margins intended to compensate for the tumor motion. This was done both with the cube provided for the phantom, with inserts for EBT3 films, and with a film sandwich made of slab material to investigate a three-dimensional dose distribution. The analysis of the films showed that, at least for a phantom with a simple craniocaudal motion, it is not necessary to compensate for the motion of the target volume with an asymmetric safety margin in the direction of motion in order to ensure coverage of the target volume with the desired dose.
This work additionally yielded further valuable insights for clinical routine: an investigation of tumor motion in free breathing as well as at maximum inspiration and expiration showed that the tumor motion at the extreme breathing phases (3-phase CT) sometimes deviates considerably from that in free breathing. This leads to the conclusion that, for irradiation in free breathing, a 4D CT reflects the tumor motion much more realistically than a 3-phase CT, especially since the latter entails a higher dose burden for the patient.
A retrospective investigation of lung tumors also showed that, when computing treatment plans for tumors in inhomogeneous tissue, the ray-tracing algorithm sometimes strongly overestimates the dose in the target volume. To obtain a realistic dose distribution, the Monte Carlo algorithm should therefore be used, particularly for tumors in the lung.
This thesis deals with the construction and calibration of a neutron detector array for low energies (Low Energy Neutron detector Array, "LENA") at the upcoming R³B setup (Reactions with Relativistic Radioactive Beams) at FAIR (Facility for Antiproton and Ion Research) at GSI in Darmstadt. The detection of low-energy neutrons in the range of 100 keV to 1 MeV is necessary to study charge-exchange reactions, in particular (p,n) reactions in inverse kinematics. Detection in this energy range is extremely difficult, since methods for thermal as well as for high-energy (100 MeV to 1 GeV) neutrons fail. Besides the construction of the detector, the significance of the experiment for nuclear astrophysics is illustrated. The theoretical part of this work lays the foundations for understanding neutron detection, the working principle of the LENA detector, and the nuclear reactions detectable with it. Furthermore, a simulation of the detector was carried out with GEANT4 (GEometry And Tracking), a C++-based platform for simulating the interactions of particles with detector material. The results were used to evaluate measurements performed during a beam time in March 2011 at the Physikalisch-Technische Bundesanstalt (PTB) in Braunschweig. The aim of this work is to determine the efficiency of the detector.
Gridded maps of meteorological variables are needed for the evaluation of weather and climate models and for climate change monitoring. In order to produce them, values at locations where no observing stations are available need to be estimated from point-wise observations. For the interpolation of meteorological observations, deterministic and stochastic methods are often combined. Deterministic methods can account for ancillary information such as elevation, continentality or satellite observations. Stochastic methods such as kriging reproduce observed values at the station locations and also account for spatial variability. In the first two studies of this thesis, a flexible interpolation method for the gridding of locally observed daily extreme temperatures is developed that also provides an optimal estimate of the interpolation uncertainty. In the third study, an observational dataset is created using this interpolation method and then applied to evaluate a climate simulation for Africa.
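To make the deterministic baseline concrete, Inverse Distance Weighting (used below as a comparison method) can be sketched in a few lines of Python. This is a minimal illustration, not code from the thesis; the function name and the toy station values are our own:

```python
import numpy as np

def idw(xy_obs, values, xy_target, power=2.0):
    """Inverse Distance Weighting: estimate each target point as a
    distance-weighted average of the station observations."""
    est = []
    for p in xy_target:
        d = np.linalg.norm(xy_obs - p, axis=1)
        if np.any(d == 0):                  # target coincides with a station
            est.append(values[np.argmin(d)])
            continue
        w = 1.0 / d ** power
        est.append(np.sum(w * values) / np.sum(w))
    return np.array(est)

# Toy example: three "stations" with daily Tmax observations
stations = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
tmax = np.array([20.0, 24.0, 18.0])
grid = np.array([[0.5, 0.5]])
estimate = idw(stations, tmax, grid)  # equidistant from all stations, so the plain average
```

Note the property criticized in the thesis: IDW reproduces station values exactly but, unlike kriging, provides no model-based estimate of the interpolation uncertainty.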
In the first study, the Regression-Kriging-Kriging (RKK) method is tested for the interpolation of daily minimum and maximum temperatures (Tmin and Tmax) in different regions in Europe. RKK accounts for elevation, continentality index and zonal mean temperature and is applicable in regions of differing station density and climate. The accuracy of RKK is compared to Inverse Distance Weighting, a common deterministic interpolation method, and to Ordinary Kriging, a common stochastic interpolation method. The first step in RKK is to use regression kriging, in which multiple linear regression accounts for topographical effects on the temperature field and kriging minimizes the regression error, to interpolate climatological means. In the second step, daily deviations from the monthly climatology are interpolated using simple kriging. Owing to the large climatological differences across the investigation area, the interpolation is performed in homogeneous subregions defined according to the Köppen-Geiger climate classification. Cross validation demonstrates the superiority of RKK over the simpler algorithms in terms of accuracy and preservation of spatial variability. The interpolation performance however strongly varies across Europe, being considerably higher over Central Europe (highest station density) than over Greenland (few stations along the coastline). This illustrates the strong impact of the station density on the accuracy of the interpolation result.
Satellites provide comprehensive observations of climate variables such as land surface temperature (LST) and cloud cover (CC). However, LST is associated with high uncertainty (standard error ~ 1-2°C), preventing its direct application in meteorology and climatology. The second study investigates the usefulness of LST and CC as predictors for the gridding of daily Tmin and Tmax.
The RKK algorithm is compared with similar interpolation methods that apply LST and CC in addition to the predictors used with the RKK algorithm. The investigation is conducted in two regions, Central Europe and the Iberian Peninsula, which differ strongly in average cloud cover (Central Europe is approximately 30% cloud free and the Iberian Peninsula approximately 60% cloud free). RKKLST (in which monthly mean LST is used as an additional predictor) yields no clear improvement over RKK for Central Europe, yet it reduces the interpolation error over the Iberian Peninsula. This finding can be explained by the higher percentage of cloud-free pixels over that region in summer, which enables a more robust determination of monthly mean LST. Adding a regression step for daily anomalies (using the predictor CC) yields the RKRK method and improves the preservation of spatial variability over the Iberian Peninsula. Moreover, a successive reduction of the station number (from 140 to 10 stations) reveals an increasing superiority of RKKLST and RKRK over RKK in both regions.
The application of a gridded observational dataset for climate monitoring or climate model validation requires knowledge of the uncertainties associated with the dataset. The estimation of the interpolation uncertainty, for which the interquartile range is used as the uncertainty measure here, is therefore an important issue within the frame of this thesis. By means of cross validation it is shown that the largest uncertainties occur in regions of low station density (e.g. Greenland), in mountainous regions and along coastlines (in these regions model evaluation results should be interpreted carefully). The magnitude of the interpolation error mainly depends on the station density, while the complexity of the terrain has substantially less influence. On average over all regions and investigation days, the target precision of the uncertainty estimate is reached. However, on local scales and for single days it can be clearly over- or underestimated. The application of satellite-derived predictors (LST and CC) yields no noteworthy improvement of the uncertainty estimate.
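The cross-validation idea used throughout these studies can be sketched in its leave-one-out form: each station is withheld in turn, the field is re-interpolated from the remaining stations, and the error at the withheld location is recorded. The sketch below (our own illustration, not thesis code) uses a trivial nearest-neighbour interpolator as a stand-in for the kriging predictors:

```python
import numpy as np

def nearest_neighbour(xy_obs, values, xy_target):
    """Stand-in interpolator: take the value of the closest station."""
    d = np.linalg.norm(xy_obs[None, :, :] - xy_target[:, None, :], axis=2)
    return values[np.argmin(d, axis=1)]

def loocv_errors(xy, values, interpolate):
    """Leave-one-out cross validation: drop each station in turn,
    re-interpolate to its location, and record the error there."""
    n = len(values)
    errs = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        est = interpolate(xy[mask], values[mask], xy[i:i + 1])[0]
        errs[i] = est - values[i]
    return errs

# Toy network of three stations
stations = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0]])
temps = np.array([1.0, 2.0, 3.0])
errs = loocv_errors(stations, temps, nearest_neighbour)
rmse = np.sqrt(np.mean(errs ** 2))
```

Any interpolator with the same call signature (such as an RKK implementation) could be passed in place of `nearest_neighbour`, which is how the accuracy comparisons above can be organized.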
In the last study, two regional climate simulations for Africa using the ERA-Interim-driven COSMO-CLM (CCLM) model at two different horizontal resolutions (0.22° and 0.44°) are validated. It is assessed whether observed patterns and statistical properties of daily Tmin and Tmax are correctly represented in the model. The ERA-Interim reanalysis and a specially created observational dataset are used as reference. The observational dataset is generated by applying the RKRK algorithm (developed within the second study). The investigations show an occasionally large bias in Tmin and Tmax. The hemispheric summers are generally too warm and the temporal variability in temperature is too high, particularly over extratropical Africa. The diurnal temperature range is overestimated by about 2°C in the northern subtropics but underestimated by about 2°C over large parts of the African tropics. CCLM reproduces the observed frequency distribution of daily Tmin and Tmax in all African climate regions, and the extreme values in the lower percentiles (5, 10, 20%) for Tmin are well simulated. The higher percentiles (80, 90, 95%) for Tmax are however overestimated by 2-5°C. For both Tmin and Tmax the 0.22° simulation is on average 0.5°C warmer than the 0.44° simulation. Additionally, the higher percentiles are about 1°C warmer for both Tmin and Tmax in the higher resolution run, while the lower percentiles in both runs match very well. Although the temperature pattern is represented in more detail along the coastlines and in topographically complex regions, the higher resolution simulation yields no qualitative improvement.
To summarize, the choice of the appropriate algorithm mainly depends on the interpolation conditions. In cases where the station density is high across the target region and the predictor space is adequately covered by observing stations, the computationally less demanding RK algorithm should be preferred. In regions where the station density is low, the more robust RKRK algorithm should be the first choice: due to the strong physical relation of both CC and LST to Tmin and Tmax, the missing information is at least partially compensated for. The estimation of the interpolation uncertainty could be improved by applying a normal score transformation to the data prior to the kriging step. This is because the kriging assumption that the increments of the variable of interest are second-order stationary can be approximately met by a normal score transformation.
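The normal score transformation proposed above maps the data onto standard-normal scores via their empirical ranks. A minimal sketch (our own illustration, assuming SciPy is available; not code from the thesis):

```python
import numpy as np
from scipy.stats import norm, rankdata

def normal_score_transform(x):
    """Map observations to standard-normal scores via their empirical
    ranks, so the Gaussian/stationarity assumption behind kriging is
    approximately met."""
    ranks = rankdata(x)               # ranks 1..n, ties averaged
    p = ranks / (len(x) + 1)          # plotting positions strictly inside (0, 1)
    return norm.ppf(p)                # standard-normal quantiles
```

Kriging would then be performed on the transformed scores, with the results back-transformed through the inverse of the empirical mapping.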
This paper investigates the potential impact of secondary information on rainfall mapping applying Ordinary Kriging. Secondary information tested is a natural area indicator, which is a combination of topographic features and weather conditions. Cross validation shows that secondary information only marginally improves the final mapping, indicating that a one-day accumulation time is possibly too short.
This study presents a method for adjusting long-term climate data records (CDRs) for the integrated use with near-real-time data, using the example of surface incoming solar irradiance (SIS). Recently, a 23-year long (1983–2005) continuous SIS CDR has been generated based on the visible channel (0.45–1 μm) of the MVIRI radiometers onboard the geostationary Meteosat First Generation platform. The CDR is available from the EUMETSAT Satellite Application Facility on Climate Monitoring (CM SAF). Here, it is assessed whether a homogeneous extension of the SIS CDR to the present is possible with operationally generated surface radiation data provided by CM SAF using the SEVIRI and GERB instruments onboard the Meteosat Second Generation satellites. Three extended CM SAF SIS CDR versions consisting of MVIRI-derived SIS (1983–2005) and three different SIS products derived from the SEVIRI and GERB instruments onboard the MSG satellites (2006 onwards) were tested. A procedure to detect shift inhomogeneities in the extended data record (1983–present) was applied that combines the Standard Normal Homogeneity Test (SNHT) and a penalized maximal T-test with visual inspection. Shift detection was done by comparing the SIS time series with the mean of the ground stations, taking statistical significance into account. Several stations of the Baseline Surface Radiation Network (BSRN) and about 50 stations of the Global Energy Balance Archive (GEBA) over Europe were used as the ground-based reference. The analysis indicates several breaks in the data record between 1987 and 1994, probably due to artefacts in the raw data and instrument failures. After 2005, the MVIRI radiometer was replaced by the narrow-band SEVIRI and the broadband GERB radiometers, and a new retrieval algorithm was applied. This induces significant challenges for the homogenisation across the satellite generations.
Homogenisation is performed by applying a mean-shift correction depending on the shift size of any segment between two break points to the last segment (2006–present). Corrections are applied to the most significant breaks that can be related to satellite changes. This study focuses on the European region, but the methods can be generalized to other regions. To account for seasonal dependence of the mean-shifts the correction was performed independently for each calendar month. In comparison to the ground-based reference the homogenised data record shows an improvement over the original data record in terms of anomaly correlation and bias. In general the method can also be applied for the adjustment of satellite datasets addressing other variables to bridge the gap between CDRs and near-real-time data.
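The per-calendar-month mean-shift correction described above can be sketched as follows. This is an illustrative Python sketch under simplifying assumptions (function names, segment layout and data are our own, not from the CM SAF processing): for each calendar month, the mean difference between a segment and the reference segment (here the last, most recent one) is estimated and then subtracted:

```python
import numpy as np

def monthly_shifts(ref_values, ref_months, seg_values, seg_months):
    """For each calendar month, estimate the mean shift of a segment
    relative to the reference (last) segment."""
    shifts = {}
    for m in np.unique(ref_months):
        shifts[m] = (np.mean(seg_values[seg_months == m])
                     - np.mean(ref_values[ref_months == m]))
    return shifts

def adjust_segment(seg_values, seg_months, shifts):
    """Subtract the month-specific shift so the segment matches the
    reference level."""
    adjusted = np.array(seg_values, dtype=float)
    for m, s in shifts.items():
        adjusted[seg_months == m] -= s
    return adjusted

# Toy data: two calendar months, segment biased +2 in month 1, -1 in month 2
ref_v = np.array([10.0, 10.0, 20.0, 20.0]); ref_m = np.array([1, 1, 2, 2])
seg_v = np.array([12.0, 12.0, 19.0, 19.0]); seg_m = np.array([1, 1, 2, 2])
corrected = adjust_segment(seg_v, seg_m, monthly_shifts(ref_v, ref_m, seg_v, seg_m))
```

Estimating and removing the shift separately for each calendar month is what captures the seasonal dependence of the breaks mentioned above.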