Master's Thesis
When performing transfer learning in Computer Vision, a pretrained model (source model) that was trained on a specific task and a large dataset such as ImageNet is normally used. The learned representation of that source model is then transferred to a target task. Transfer learning performed in this way has had a great impact on Computer Vision because it works seamlessly, especially between related tasks. Recent research has investigated the relationship between different tasks and its impact on transfer learning by developing similarity methods. These similarity methods share a common idea: instead of actually performing transfer learning, they predict transfer learning rankings so that the best possible source model can be selected from a range of candidate source models. However, these methods have focused only on single-source transfers and have not paid attention to multi-source transfers. Multi-source transfers promise even better results than single-source transfers, as they combine information from multiple source tasks, all of which are useful to the target task. We fill this gap and propose a many-to-one task similarity method called MOTS that predicts both single-source and multi-source transfers to a specific target task. We do this by using linear regression on the source representations of the source models to predict the target representation. Using the Pascal VOC and Taskonomy benchmarks, we show that we achieve results at least on par with related state-of-the-art methods when focusing only on single-source transfers. We even outperform all of them when using single-source and multi-source transfers together (0.9 vs. 0.8) on the Taskonomy benchmark. We additionally investigate the performance of MOTS in conjunction with a multi-task learning architecture.
The task-decoder heads of a multi-task learning architecture are used in different variations to perform multi-source transfers, since this promises efficiency over multiple single-task architectures and incurs less computational cost. Results show that our proposed method accurately predicts transfer learning rankings on the NYUD dataset, and the best transfer learning results are always achieved when using more than one source task. Additionally, it is shown that even using just one task-decoder head from the multi-task learning architecture yields better transfer learning results than using a single-task architecture for the same task, which is due to the information shared between different tasks in the earlier layers of the multi-task learning architecture. Since the MOTS rankings for selecting the MTI-Net task-decoder head with the highest transfer learning performance were very accurate for the NYUD dataset but not satisfying for the Pascal VOC dataset, further experiments need to verify the generalizability of MOTS rankings for the selection of the optimal task-decoder head from a multi-task architecture.
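The core idea behind MOTS, predicting the target representation from the (stacked) source representations with linear regression, can be sketched as follows. The feature matrices, their sizes, and the use of R² as the fit score are illustrative assumptions rather than the exact procedure of the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical features: n images embedded by three source models and
# by the target model (names, sizes and values are purely illustrative)
n, d = 200, 32
source_reprs = {f"source_{i}": rng.normal(size=(n, d)) for i in range(3)}
target_repr = rng.normal(size=(n, d))

def fit_score(sources, target):
    """Least-squares fit of a linear map from the stacked source
    features (plus intercept) to the target features; the R^2 of the
    fit serves as a proxy for expected transfer performance."""
    X = np.hstack(sources + [np.ones((target.shape[0], 1))])
    coef, *_ = np.linalg.lstsq(X, target, rcond=None)
    residual = target - X @ coef
    ss_res = (residual ** 2).sum()
    ss_tot = ((target - target.mean(axis=0)) ** 2).sum()
    return 1.0 - ss_res / ss_tot

# Rank single-source transfers, then score one multi-source combination
single = {name: fit_score([X], target_repr) for name, X in source_reprs.items()}
multi = fit_score(list(source_reprs.values()), target_repr)  # all sources
```

Because the multi-source fit uses a superset of the features, its score can only match or exceed the best single-source score, mirroring the intuition that combining useful source tasks should not hurt.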
Autonomous steering of an electric bicycle based on sensor fusion using model predictive control
(2019)
In this thesis, a control and steering module for an autonomous bicycle was developed. Based on sensor fusion and model predictive control, the module is able to trace routes autonomously.
The system is developed to run on a Raspberry Pi. An ultrasonic sensor and a 2D lidar sensor are used for distance measurements. The vehicle’s position is determined using GPS signals. Additionally, a camera captures images for roadside detection. In order to recognize the road and the position of the vehicle on it, computer vision techniques are used: the captured images are denoised, Canny edge detection is performed, and a perspective transformation is applied. Thereafter, a sliding-window algorithm selects the edges belonging to the roadside and a second-order polynomial is fitted to the selected data. Based on this, the road curvature and the lateral position of the vehicle on the road are calculated. The implemented software is thus able to detect straight and curved roads as well as the vehicle’s lateral offset.
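The final fitting step described above, a second-order polynomial through the selected roadside edges from which curvature and lateral offset follow, can be sketched roughly as follows. The synthetic data points, units, and the image-center value are hypothetical, not taken from the thesis:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic roadside edge points standing in for the sliding-window
# output after perspective transform (coordinates and units made up)
y = np.linspace(0.0, 10.0, 50)
x = 0.02 * y ** 2 + 0.1 * y + 3.0 + rng.normal(0.0, 0.01, y.size)

# Fit a second-order polynomial x = a*y^2 + b*y + c to the points
a, b, c = np.polyfit(y, x, 2)

# Radius of curvature at the vehicle (y = 0): R = (1 + b^2)^(3/2) / |2a|
radius = (1.0 + b ** 2) ** 1.5 / abs(2.0 * a)

# Lateral offset of the vehicle relative to a hypothetical image center
image_center = 4.0
lateral_offset = c - image_center
```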
A route planning module was implemented to navigate the vehicle from the start to the destination coordinates. This is done by creating an abstract graph of the roads and using Dijkstra’s algorithm to determine the shortest path.
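A minimal sketch of this route planning step, assuming the abstract road graph is given as a weighted adjacency list (node names and distances are made up for illustration):

```python
import heapq

def dijkstra(graph, start, goal):
    """Shortest path on a road graph given as
    {node: [(neighbor, distance), ...]}."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    visited = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in visited:
            continue
        visited.add(u)
        if u == goal:
            break
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    # Walk the predecessor links back from the goal to recover the path
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]

# Toy road graph (intersections A..E, edge weights in metres)
roads = {
    "A": [("B", 100), ("C", 300)],
    "B": [("C", 50), ("D", 200)],
    "C": [("D", 100)],
    "D": [("E", 80)],
}
path, length = dijkstra(roads, "A", "E")
```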
Four MPC controllers were implemented to control the movements of the vehicle. They are based on state-space equations derived from the linear single-track vehicle model. This relatively simple model makes it possible to predict the vehicle’s behavior and is efficient to compute. Each controller was built with different parameters for different vehicle speeds to account for the non-linearity of the system. At each time step, the controllers simulate the future states of the system and select appropriate control signals for steering, throttle and brakes.
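A rough illustration of the MPC idea: simulate future states of a discrete linear state-space model for each candidate input and keep the cheapest one. The matrices, cost weights, and the brute-force search over constant inputs are simplifying assumptions, not the controllers actually implemented in the thesis:

```python
import numpy as np

# Illustrative discrete linear model x[k+1] = A x[k] + B u[k];
# state = [lateral offset, heading error], input u = steering angle
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([0.0, 0.1])

def mpc_step(x0, horizon=10, candidates=np.linspace(-0.3, 0.3, 13)):
    """Brute-force MPC: simulate each constant candidate steering input
    over the horizon and return the one with the lowest quadratic cost."""
    best_u, best_cost = None, float("inf")
    for u in candidates:
        x, cost = np.array(x0, dtype=float), 0.0
        for _ in range(horizon):
            x = A @ x + B * u                      # predict next state
            cost += x[0] ** 2 + 0.1 * x[1] ** 2 + 0.01 * u ** 2
        if cost < best_cost:
            best_u, best_cost = u, cost
    return best_u

steer = mpc_step([1.0, 0.0])  # positive offset: expect corrective steering
```

A real MPC solves a constrained quadratic program over a full input sequence; the exhaustive search over constant inputs here only conveys the predict-and-select structure.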
In this thesis, all the components of the steering and control module were individually validated. It was established that each individual component works as expected, and certain constraints and accuracy limits were identified. Finally, the closed-loop capabilities of the system were assessed using a test vehicle. Despite some limitations imposed by this setup, it was shown that the control module is indeed capable of autonomously navigating a vehicle and avoiding collisions.
Computational workflow optimization for magnetic fluctuation measurements of 3D nano-tetrapods
(2021)
The detailed understanding of micro- and nanoscale structures, in particular their magnetization dynamics, dominates contemporary solid-state physics studies. Most investigations have already identified an abundance of phenomena in one- and two-dimensional nanostructures. The following thesis focuses on the magnetic fingerprint of three-dimensional CoFe nano-magnets, specifically the temporal development of their hysteresis loop. These nano-magnets were grown in a tetrahedral pattern on top of a highly susceptible home-built GaAs/AlGaAs micro-Hall sensor using focused electron beam induced deposition (FEBID).
During the measurements, utmost effort was made to exemplify current best research practices. The data life cycle of the present thesis is based upon open-source data science tools and packages. Data acquisition and analysis required self-written automated algorithms to handle the extensive quantity of data. Existing instrument-control software was improved, and new Python packages were devised to analyze and visualize the gathered data. The open-source Python data analysis framework (ana) was developed to facilitate computational reproducibility. This framework transparently and automatically analyzes and visualizes the gathered data using Continuous Analysis tools based on GitLab Continuous Integration. This automation uses bespoke scripts combined with virtualization tools like Docker to facilitate reproducible and device-independent results.
The hysteresis loops reveal distinct differences between subsequently measured loops with identical initial experimental parameters, originating from the nano-magnets’ magnetic noise. This noise is amplified in regions where switching processes occur. In such noise-prone regions, time-dependent scrutiny reveals presumably thermally induced metastable magnetization states. The frequency-dependent power spectral density uncovers a characteristic 1/f² behavior in noise-prone regions with metastable magnetization states.
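The reported 1/f² behavior is characteristic of Brownian-type noise. A minimal sketch of how such a spectral signature can be checked numerically, with a synthetic random-walk signal standing in for the measured sensor data:

```python
import numpy as np

rng = np.random.default_rng(2)
signal = np.cumsum(rng.normal(size=2 ** 14))  # random walk: 1/f^2 spectrum

# One-sided periodogram of the mean-subtracted signal
spectrum = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
freqs = np.fft.rfftfreq(signal.size)

# Log-log slope of the power spectral density; 1/f^2 noise gives ~ -2
mask = freqs > 0
slope, _ = np.polyfit(np.log(freqs[mask]), np.log(spectrum[mask]), 1)
```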
The internet has often been considered a 'technology of freedom' – a nearly revolutionary tool believed to flatten social hierarchies and democratize access to media by 'giving voice' to everybody equally. Contradictory to this point of view, research has shown the existence of a 'digital divide,' the phenomenon that access to and use of the internet, as well as the outcomes derived from this use, correlate with pre-existing inequalities.
Based on ethnographic fieldwork among activists in Dakar, Senegal, this thesis analyzes how inequalities shape and are shaped by the relationships between activists and smartphones. Do smartphones indeed flatten social hierarchies, or are inequalities rather reproduced – or even reinforced – through them?
As a global city, Frankfurt is home to transcultural people with diverse linguistic biographies and migration backgrounds. As teachers exert significant influence on the language practice of their students and their awareness of self and others, it is crucial to examine the language ideologies and attitudes towards multilingualism of teachers who work in different schools in Frankfurt. An online questionnaire was selected as the data collection method to combine qualitative and quantitative analysis; teachers were asked to state their opinion on statements designed to represent the competing viewpoints of separate bilingualism and flexible bilingualism. The study builds on existing evidence that multiple factors dynamically shape teachers' attitudes towards multilingualism.
School-level support and cooperation between educational institutions seem to be necessary to establish horizontal continuity and to help students benefit from language-sensitive didactic methods such as translanguaging.
During Run 3 (2021-2023) of the Large Hadron Collider, the Time Projection Chamber (TPC) of ALICE will be operated with quadruple stacks of Gas Electron Multipliers (GEMs). This technology makes it possible to overcome the rate limitation caused by the gated operation of the Multi-Wire Proportional Chambers (MWPCs) used in Run 1 (2009-2013) and Run 2 (2015-2018).
As part of the Upgrade project, long-term irradiation tests, so-called "ageing tests", were carried out. A test setup with a detector using a quadruple stack of 10×10 cm² GEMs was built and operated in Ar-CO2 and Ne-CO2-N2 gas mixtures. Detector performance parameters such as gas gain and energy resolution were monitored continuously. In addition, outgassing tests of materials used in the assembly process of the upgraded TPC were performed. To reach the dose expected for the GEM-based TPC, the detector was operated at much higher gains than the TPC. It was found that the GEMs kept their performance within the projected lifetime of the TPC. Most of the tested materials showed no negative impact on the detector. For the tested epoxy adhesive, no definite conclusion could be drawn.
At much higher doses than expected for the upgraded TPC, a new phenomenon was observed, which changed the hole geometry of the GEMs and led to a degradation of the energy resolution. Even though its occurrence is not expected during the lifetime of the GEM-based TPC, simulations were carried out to study this effect more systematically. The simulations confirmed that a change of the hole geometry of the GEMs leads to an increase of the local gain variation, which results in a degradation of the energy resolution.
Furthermore, the effect of methane as a quenching gas on GEMs was studied, even though this gas is not foreseen for use in the TPC. From ageing tests with single-wire proportional counters it is well known that hydrocarbons are produced in the plasma of the avalanches; these cover the electrodes and lead to a degradation of the detector performance. Even though GEMs have a quite different geometry, the ageing tests showed that this technology is also prone to methane-induced ageing. A loss of gas gain as well as a degradation of the energy resolution due to deposits on the electrodes was observed. A qualitative and quantitative comparison between ageing in GEMs and in proportional counters was performed.
Software updates are a critical success factor in mobile app ecosystems. Through publishing regular updates, platform providers enhance their operating systems for the benefit of both end users and third-party developers. It is also a way of attracting new customers. However, this platform evolution poses the risk of inadvertently introducing software problems, which can severely disturb the ecosystem’s balance by compromising its foundational technologies. So far, little to no research has addressed this issue from a user-centered perspective. The thesis at hand draws on IS post-adoption literature to investigate the potential negative influences of operating system updates on mobile app users. The release of Apple’s iOS 13 update serves as the research object. Based on over half a million user reviews from the App Store, data mining techniques are applied to study the impact of the new platform version. The results show that iOS 13 caused complications with a large number of popular apps, leading to a significant decline in user ratings and an uptrend in negative sentiment. Feature requests, functional complaints, and device compatibility are identified as the three major issue categories. These issue types are compared in terms of their quantifiable negative effect on users’ continuance intention. In essence, the findings contribute to IS research on post-adoption behavior and provide guidance to ecosystem participants in dealing with update-induced platform issues.
In this thesis, I investigate the possibility that at the smallest length scale (the Planck scale) the very notion of "dimension" needs to be revisited. Due to quantum effects, spacetime might become very turbulent at these scales, and properties like those of fractals emerge, including a scale-dependent dimension. It seems that this "spontaneous dimensional reduction" and the appearance of a minimal physical length are very general effects that most approaches to quantum gravity share. The main emphasis is given to the "spectral dimension" and its calculation for strings and p-branes.
Virtual machines are for the most part not used inside high-energy physics (HEP) environments. Even though they provide a high degree of isolation, the performance overhead they introduce is too great. With the rising number of container technologies and their increasing separation capabilities, HEP environments are evaluating whether they could utilize this technology. Container images are small and self-contained, which allows them to be easily distributed throughout the global environment. They also offer near-native performance while providing an often acceptable level of isolation. Only the needed services and libraries are packed into an image and executed directly by the host kernel. This work compared the performance impact of the three container technologies Docker, rkt and Singularity. The host kernel was additionally hardened with grsecurity and PaX to strengthen its security and make exploitation from inside a container harder. The execution time of a physics simulation was used as a benchmark. The results show that the different container technologies have different impacts on performance. The performance loss on a stock kernel is small; in some cases the containers were even faster than running without a container. Docker showed the best overall performance on a stock kernel. The differences on a hardened kernel were bigger than on a stock kernel, but in favor of the container technologies. rkt performed better than all the others in almost all cases.
In this thesis, Planck-size black holes are discussed. Specifically, new families of black holes are presented. Such black holes exhibit improved short-scale behaviour and can be used to implement the gravity self-completeness paradigm. These geometries are also studied within the ADD large-extra-dimension scenario, which allows black hole remnant masses to reach the TeV scale. It is shown that the evaporation endpoint for this class of black holes is a cold stable remnant. One family of black holes considered in this thesis features a regular de Sitter core that counters gravitational collapse with a quantum outward pressure. The other family of black holes turns out to fit nicely into the holographic information bound on black holes and leads to black hole area quantization and applications in the gravitational entropic force. As a result, gravity can be derived as an emergent phenomenon from thermodynamics.
The thesis contains an overview of recent quantum gravity approaches to black holes and concludes with the derivation of nonlocal operators that modify the Einstein equations into ultraviolet-complete field equations.