Investigators in the cognitive neurosciences have turned to Big Data to address persistent replication and reliability issues by increasing sample sizes, statistical power, and the representativeness of data. While there is tremendous potential to advance science through open data sharing, these efforts raise a host of new questions about how to integrate data arising from distinct sources and instruments. We focus on memory testing, the most frequently assessed area of cognition, and demonstrate a process for reliable data harmonization across three common measures. We aggregated raw data from 53 studies worldwide that measured at least one of three distinct verbal learning tasks, totaling N = 10,505 healthy and brain-injured individuals. A mega-analysis was conducted using empirical Bayes harmonization to isolate and remove site effects, followed by linear models adjusting for common covariates. After these corrections, a continuous item response theory (IRT) model estimated each subject's latent verbal learning ability while accounting for item difficulties. Harmonization reduced inter-site variance by 37% while preserving covariate effects. The effects of age, sex, and education on scores were highly consistent across memory tests. IRT methods for equating scores across auditory verbal learning tests (AVLTs) agreed with held-out data from dually administered tests, and these tools are freely available online. This work demonstrates that large-scale data sharing and harmonization initiatives can address reproducibility and integration challenges across the behavioral sciences.
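To make the site-correction step concrete, here is a minimal, self-contained Python sketch of location-scale harmonization: it removes per-site shifts in mean and variance, which is the core idea behind ComBat-style empirical Bayes adjustment. It deliberately omits the empirical Bayes shrinkage, the covariate-preserving linear models, and the downstream IRT model of the actual pipeline; all data and names are hypothetical.

```python
# Minimal location-scale harmonization sketch (ComBat-like idea, no EB shrinkage).
# Hypothetical data; the study's actual pipeline is more involved.
import numpy as np

def harmonize(scores: np.ndarray, sites: np.ndarray) -> np.ndarray:
    """Remove per-site shifts in mean and variance from a 1-D score vector."""
    grand_mean = scores.mean()
    grand_std = scores.std(ddof=1)
    out = np.empty_like(scores, dtype=float)
    for site in np.unique(sites):
        mask = sites == site
        site_mean = scores[mask].mean()
        site_std = scores[mask].std(ddof=1)
        # standardize within site, then map back to the pooled scale
        out[mask] = (scores[mask] - site_mean) / site_std * grand_std + grand_mean
    return out

rng = np.random.default_rng(0)
sites = rng.integers(0, 5, size=500)                 # 5 hypothetical sites
true_ability = rng.normal(0.0, 1.0, size=500)
# inject site-specific offsets and scalings to mimic inter-site variance
scores = true_ability * (1 + 0.3 * sites) + 2.0 * sites
harmonized = harmonize(scores, sites)
print(np.var([scores[sites == s].mean() for s in range(5)]))      # large
print(np.var([harmonized[sites == s].mean() for s in range(5)]))  # near zero
```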
The emerging disciplines of lipidomics and metabolomics show great potential for the discovery of diagnostic biomarkers, but appropriate pre-analytical sample-handling procedures are critical because several analytes are prone to ex vivo distortions during sample collection. To test how the intermediate storage temperature and storage period of plasma samples from K3EDTA whole-blood collection tubes affect analyte concentrations, we assessed samples from non-fasting healthy volunteers (n = 9) for a broad spectrum of metabolites, including lipids and lipid mediators, using a well-established LC-MS-based platform. We used a fold change-based approach as a relative measure of analyte stability to evaluate 489 analytes, employing a combination of targeted LC-MS/MS and LC-HRMS screening. The concentrations of many analytes were found to be reliable, often justifying less strict sample handling; however, certain analytes were unstable, supporting the need for meticulous processing. We make four data-driven recommendations for sample-handling protocols with varying degrees of stringency, based on the maximum number of analytes and the feasibility of routine clinical implementation. These protocols also enable the simple evaluation of biomarker candidates based on their analyte-specific vulnerability to ex vivo distortions. In summary, pre-analytical sample handling has a major effect on the suitability of certain metabolites as biomarkers, including several lipids and lipid mediators. Our sample-handling recommendations will increase the reliability and quality of samples when such metabolites are necessary for routine clinical diagnosis.
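The fold change-based stability readout can be sketched in a few lines of Python: each analyte's concentration after delayed processing is divided by its matched baseline, and analytes whose median fold change leaves a tolerance band are flagged as unstable. The tolerance value, the simulated data, and the function name below are illustrative assumptions, not the study's validated cut-offs.

```python
# Sketch of a fold change-based analyte stability check; thresholds are hypothetical.
import numpy as np

def flag_unstable(baseline: np.ndarray, stored: np.ndarray,
                  tolerance: float = 0.2) -> np.ndarray:
    """Flag analytes whose median fold change deviates more than `tolerance` from 1.

    baseline : concentrations at immediate processing, shape (n_subjects, n_analytes)
    stored   : concentrations after delayed processing, same shape
    """
    fold_change = stored / baseline                # per subject and analyte
    median_fc = np.median(fold_change, axis=0)     # robust per-analyte summary
    return np.abs(median_fc - 1.0) > tolerance     # True = unstable

rng = np.random.default_rng(1)
base = rng.lognormal(mean=1.0, sigma=0.3, size=(9, 6))   # 9 volunteers, 6 analytes
drift = np.array([1.0, 1.05, 0.95, 1.6, 0.5, 1.0])       # ex vivo distortion per analyte
store = base * drift * rng.normal(1.0, 0.05, size=base.shape)
print(flag_unstable(base, store))   # analytes at index 3 and 4 exceed the band
```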
Small molecule biomarker discovery: Proposed workflow for LC-MS-based clinical research projects
(2023)
Mass spectrometry focusing on small endogenous molecules has become an integral part of biomarker discovery, driving an in-depth understanding of the pathophysiology of various diseases and ultimately enabling personalized medicine. While LC-MS methods allow researchers to gather vast amounts of data from hundreds or thousands of samples, successfully executing a clinical research study also requires knowledge exchange with clinicians, the involvement of data scientists, and interaction with various stakeholders.
The initial planning phase of a clinical research project involves specifying the scope and design and engaging relevant experts from different fields. Subject enrollment and trial design depend largely on the overall objective of the study and on epidemiological considerations, while proper pre-analytical sample handling has immediate implications for the quality of the analytical data. Subsequent LC-MS measurements may be conducted in a targeted, semi-targeted, or non-targeted manner, yielding datasets of varying size and accuracy. Data processing further enhances data quality and is a prerequisite for in silico analysis. Nowadays, the evaluation of such complex datasets relies on a mix of classical statistics and machine learning, combined with other tools such as pathway analysis and gene set enrichment. Finally, results must be validated before biomarkers can serve as prognostic or diagnostic decision-making tools. Throughout the study, quality control measures should be employed to enhance the reliability of the data and increase confidence in the results.
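As an illustration of the "classical statistics plus machine learning" step, the sketch below pairs a per-feature t-test with Benjamini-Hochberg FDR correction and a cross-validated classifier on a simulated feature matrix. The scipy, statsmodels, and scikit-learn calls are standard, but the specific workflow is an assumption for illustration, not the pipeline prescribed by this review.

```python
# Hypothetical evaluation step: univariate statistics plus a cross-validated classifier.
import numpy as np
from scipy import stats
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(2)
X = rng.normal(size=(60, 200))          # 60 samples x 200 metabolite features
y = rng.integers(0, 2, size=60)         # hypothetical case/control labels
X[y == 1, :5] += 1.0                    # plant a signal in the first 5 features

# classical statistics: per-feature t-test with Benjamini-Hochberg correction
_, pvals = stats.ttest_ind(X[y == 0], X[y == 1], axis=0)
rejected, _, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print("significant features:", np.flatnonzero(rejected))

# machine learning: cross-validated classifier on the full feature matrix
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```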
The aim of this graphical review is to provide an overview of the steps to be taken when conducting an LC-MS-based clinical research project to search for small molecule biomarkers.
Goethe-Universität Frankfurt is applying to the upcoming round of the federal and state governments' Excellence Strategy with four new clusters on the research themes of trust in conflict (CONTRUST), infection and inflammation (EMTHERA), the origin of the heavy elements (ELEMENTS), and cellular architectures (SCALE). The proposals combine the expertise and forward-looking ideas of Goethe-Universität with those of colleagues from the Rhein-Main-Universitäten alliance (RMU) and of further partners from the four major non-university research organizations. The Cardiopulmonary Institute (CPI) Cluster of Excellence, established in 2019, will submit a full proposal directly next year. The UniReport regularly reports on researchers from the cluster initiatives and their projects.