Psychologie und Sportwissenschaften
Background: The assessment of therapeutic adherence and competence is essential to understand mechanisms that contribute to treatment outcome. Nevertheless, their assessment is often neglected in psychotherapy research.
Aims/Objective: To develop an adherence and a treatment-specific competence rating scale for Dialectical Behaviour Therapy for Posttraumatic Stress Disorder (DBT-PTSD), and to examine their psychometric properties. Global cognitive behavioural therapeutic competence and disorder-specific therapeutic competence were assessed using already existing scales to confirm their psychometric properties in our sample of patients with PTSD and emotion regulation difficulties.
Method: Two rating scales were developed using an inductive procedure. Trained raters used these scales to rate 155 videotaped therapy sessions from a multicenter randomised controlled trial; 40 randomly chosen videotapes, involving eleven therapists and fourteen patients, were rated independently by two raters.
Results: Both the adherence scale (patient-level ICC = .98; αs = .65; αp = .75) and the treatment-specific competence scale (patient-level ICC = .98; αs = .78; αp = .82) for DBT-PTSD showed excellent interrater reliability and good reliability at the patient level. Content validity, including the relevance and appropriateness of all items, was confirmed for the new treatment-specific competence scale by experts in DBT-PTSD.
Conclusion: Our results indicate that both scales are reliable instruments. They will be useful to examine possible effects of adherence and treatment-specific competence on DBT-PTSD treatment outcome.
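The internal-consistency coefficients (α) reported above follow Cronbach's standard formula. As a minimal sketch of how such a coefficient is computed, assuming a small hypothetical subjects-by-items rating matrix (illustrative values, not the study's data):

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (subjects x items) score matrix."""
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical ratings: 6 sessions rated on 4 scale items
ratings = np.array([
    [4, 5, 4, 5],
    [2, 3, 2, 3],
    [5, 5, 4, 4],
    [3, 3, 3, 4],
    [1, 2, 2, 1],
    [4, 4, 5, 5],
])
print(round(cronbach_alpha(ratings), 2))  # → 0.95
```

With items that covary this strongly, α approaches 1; the αs/αp values above are interpreted against the usual ≥ .70 benchmark for acceptable internal consistency.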
Visual search in natural scenes is a complex task relying on peripheral vision to detect potential targets and central vision to verify them. The segregation of the visual fields has been established mainly by on-screen experiments. We conducted a gaze-contingent experiment in virtual reality to test how the perceived roles of central and peripheral vision translate to more natural settings. The use of everyday scenes in virtual reality allowed us to study visual attention with a fairly ecological protocol that cannot be implemented in the real world. Central or peripheral vision was masked during visual search, with target objects selected according to scene semantic rules. Analyzing the resulting search behavior, we found that target objects that were not spatially constrained to a probable location within the scene impacted search measures negatively. Our results diverge from on-screen studies in that search performance was only slightly affected by central vision loss. In particular, a central mask did not impact verification times when the target was grammatically constrained to an anchor object. Our findings demonstrate that the role of central vision (up to 6 degrees of eccentricity) in identifying objects in natural scenes seems to be minor, while the role of peripheral preprocessing of targets in immersive real-world searches may have been underestimated by on-screen experiments.
Children often perform worse than adults on tasks that require focused attention. While this is commonly regarded as a sign of incomplete cognitive development, a broader attentional focus could also endow children with the ability to find novel solutions to a given task. To test this idea, we investigated children’s ability to discover and use novel aspects of the environment that allowed them to improve their decision-making strategy. Participants were given a simple choice task in which the possibility of strategy improvement was neither mentioned by instructions nor encouraged by explicit error feedback. Among 47 children (8–10 years of age) who were instructed to perform the choice task across two experiments, 27.5% showed a full strategy change. This closely matched the proportion of adults who had the same insight (28.2% of n = 39). The number of erroneous choices, working memory capacity and inhibitory control, in contrast, indicated substantial disadvantages of children in task execution and cognitive control. A task difficulty manipulation did not affect the results. The stark contrast between age differences in different aspects of cognitive performance might offer a unique opportunity for educators in fostering learning in children.
Development and preliminary validation of the Emotions while Learning an Instrument Scale (ELIS)
(2021)
Learning to play a musical instrument is associated with different, partially conflicting emotions. This paper describes the development and psychometric properties of the Emotions while Learning an Instrument Scale (ELIS). In a longitudinal study with 545 German elementary school children, the factorial structure and psychometric properties were evaluated. Exploratory and confirmatory factor analyses confirmed a two-factor solution measuring Positive musical Emotions while Learning an Instrument (PELI) and Negative Emotions while Learning an Instrument (NELI). Both subscales yielded scores with adequate internal reliability (Cronbach’s α = .74, .86) and relatively stable retest reliabilities over 18 months (r = .11–.56). Preliminary evidence of congruent and divergent validity of the subscales is provided. Implications for future research on musical emotional experiences in children are discussed.
Investigation of the sympathetic regulation in delayed onset muscle soreness: results of an RCT
(2021)
Sports-related pain and injury is directly linked to tissue inflammation, thus involving the autonomic nervous system (ANS). In the present experimental study, we disabled the sympathetic part of the ANS by applying a stellate ganglion block (SGB) in an experimental model of delayed onset muscle soreness (DOMS) of the biceps muscle. We included 45 healthy participants (female 11, male 34, age 24.16 ± 6.67 years [range 18–53], BMI 23.22 ± 2.09 kg/m2) who were equally randomized to receive either (i) an SGB prior to exercise-induced DOMS (preventive), (ii) sham intervention in addition to DOMS (control/sham), or (iii) SGB after the induction of DOMS (rehabilitative). The aim of the study was to determine whether and to what extent sympathetically maintained pain (SMP) is involved in DOMS processing. Focusing on the muscular area with the greatest eccentric load (biceps distal fifth), a significant time × group interaction on the pressure pain threshold (PPT) was observed between preventive SGB and sham (p = 0.034). There was a significant effect on pain at motion (p = 0.048), with a post hoc statistical difference at 48 h (preventive SGB Δ1.09 ± 0.82 cm VAS vs. sham Δ2.05 ± 1.51 cm VAS; p = 0.04). DOMS mediated an increase in venous cfDNA (a potential molecular/inflammatory marker of DOMS) within the first 24 h after eccentric exercise (time effect p = 0.018), with a peak at 20 and 60 min. After 60 min, cfDNA levels were significantly decreased comparing preventive SGB to sham (unpaired t-test p = 0.008). At both times, 20 and 60 min, cfDNA significantly correlated with observed changes in PPT. The 20-min increase was more sensitive, as it tended toward significance at 48 h (r = 0.44; p = 0.1) and predicted the early decrease of PPT following preventive stellate blocks at 24 h (r = 0.53; p = 0.04). Our study reveals the broad impact of the ANS on DOMS and exercise-induced pain.
For the first time, we have obtained insights into the sympathetic regulation of pain and inflammation following exercise overload. As this study is of a translational pilot character, further research is encouraged to confirm and specify our observations.
Strenuous and unaccustomed exercise frequently leads to what has been coined “delayed onset muscle soreness” (DOMS). As implied by this term, it has been proposed that the associated pain and stiffness stem from micro-lesions, inflammation, or metabolite accumulation within the skeletal muscle. However, recent research points towards a strong involvement of the connective tissue. First, according to anatomical studies, the deep fascia displays an intimate structural relationship with the underlying skeletal muscle and may therefore be damaged during excessive loading. Second, histological and experimental studies suggest a rich supply of algogenic nociceptors whose stimulation evokes stronger pain responses than muscle irritation. Taken together, the findings support the hypothesis that DOMS originates in the muscle-associated connective tissue rather than in the muscle itself. Sports and fitness professionals designing exercise programs should hence consider fascia-oriented methods and techniques (e.g., foam rolling, collagen supplementation) when aiming to treat or prevent DOMS.
Correction to: Roth C, Rettenmaier L and Behringer M (2021) High-Protein Energy-Restriction: Effects on Body Composition, Contractile Properties, Mood, and Sleep in Active Young College Students. Front. Sports Act. Living 3:683327. https://doi.org/10.3389/fspor.2021.683327
Background: The promotion of healthy aging is one of the major challenges for healthcare systems in current times. The present study investigates the effects of a standardized physical activity intervention for older adults on cognitive capacity, self-reported health, fear of falls, balance, leg strength and gait under consideration of movement biography, sleep duration, and current activity behavior. Methods: This single-blinded, randomized controlled trial included 49 community-dwelling older adults (36 women; 82.9 ± 4.5 years of age (Mean [M] ± SD); intervention group = 25; control group = 24). Movement biography, sleep duration, cognitive capacity, self-reported health status, and fear of falls were assessed by means of questionnaires. Leg strength, gait, and current activity levels were captured using a pressure plate, accelerometers, and the functional reach and chair rising tests. The multicomponent intervention took place twice a week for 45 min and lasted 16 weeks. Sub-cohorts of different sleep duration were formed to distinguish between intervention effects and benefits of healthy sleep durations. Change scores were evaluated in univariate analyses of covariance (ANCOVAs) between groups and sub-cohorts of different sleep duration in both groups. Changes in cognitive capacity, self-reported health, fear of falls, balance, leg strength, and gait were investigated using the respective baseline values, movement biography, and current activity levels as covariates. Analysis was by intention-to-treat (ITT). Results: We found sub-cohort differences in cognitive capacity change scores [F(3,48) = 5.498, p = 0.003, ηp2 = 0.287]. Effects on fear of falls [F(1,48) = 12.961, p = 0.001, ηp2 = 0.240] and balance change scores [F(1,48) = 4.521, p = 0.040, ηp2 = 0.099] were modified by the level of current activity. Effects on gait cadence were modified by the movement biography [F(1,48) = 4.545; p = 0.039, ηp2 = 0.100].
Conclusions: Unlike for functional outcomes, our multicomponent intervention in combination with adequate sleep duration appears to provide combinable beneficial effects for cognitive capacity in older adults. Trainability of gait, fear of falls, and flexibility seems to be affected by movement biography and current physical activity levels. Trial registration: This study was registered at the DRKS (German Clinical Trials Register) on November 11, 2020 with the corresponding trial number: DRKS00020472.
We investigated whether dichotomous data showed the same latent structure as the interval-level data from which they originated. Given constancy of dimensionality and factor loadings reflecting the latent structure of data, the focus was on the variance of the latent variable of a confirmatory factor model. This variance was shown to summarize the information provided by the factor loadings. The results of a simulation study did not reveal exact correspondence of the variances of the latent variables derived from interval-level and dichotomous data, but shrinkage. Since shrinkage occurred systematically, methods for recovering the original variance were developed and evaluated.
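The shrinkage phenomenon can be illustrated in a simpler, analogous case (a toy demonstration, not the authors' factor-model simulation): dichotomizing bivariate normal data attenuates the observed correlation relative to the interval-level data, and for median splits the attenuation is systematic, with an expected value of (2/π)·arcsin(ρ).

```python
import numpy as np

rng = np.random.default_rng(42)

# Bivariate normal interval-level data with true correlation .6
n, rho = 100_000, 0.6
cov = [[1, rho], [rho, 1]]
x, y = rng.multivariate_normal([0, 0], cov, size=n).T

r_interval = np.corrcoef(x, y)[0, 1]
# Median-split dichotomization of both variables
r_dichotomous = np.corrcoef(x > 0, y > 0)[0, 1]
print(round(r_interval, 2), round(r_dichotomous, 2))
```

Because the shrinkage follows a known functional form here, the original correlation could be recovered by inverting it; the abstract describes the analogous recovery problem for latent-variable variances.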
The sudden impact of the COVID-19 pandemic challenged universities to provide students with online teaching and learning settings that were both immediately applicable and supportive of quality learning. This resulted in a broad variety of synchronous and asynchronous online settings of teaching and learning. While some courses balanced both kinds, others offered either predominantly synchronous or asynchronous teaching and learning. In a survey study with students (N=3,056) and teachers (N=396) from a large German university, we explored whether a predominance of synchronous or asynchronous teaching and learning settings in higher education was associated with certain student experiences and outcomes. Additionally, we examined how well these two types of teaching and learning settings support students’ basic psychological needs for autonomy, competence, and relatedness proposed by self-determination theory (SDT). Data were collected after the first online semester due to the COVID-19 pandemic. The results imply that from the students’ perspective, the teaching methods involved in the two settings of teaching and learning differ with regard to their potential to support social interaction and to support basic psychological needs as proposed by SDT. Students who studied mostly in synchronous settings reported more peer-centered activities such as feedback in comparison to students in mostly asynchronous settings. In contrast, teachers perceived fewer differences between teaching methods in synchronous and asynchronous settings, especially regarding feedback activities. Further, students in mostly synchronous settings reported greater support of their basic psychological needs for competence support and relatedness as well as a greater overall satisfaction with the online term compared to students in mostly asynchronous settings. 
Across all students, greater fulfillment of psychological needs and higher technology acceptance coincided with more favorable outcomes. Implications for the post-pandemic classroom are drawn.
Taking blood via venipuncture is part of the necessary surveillance before and after liver transplantation. The spectrum of response from children and their parents is variable, ranging from a short and limited aversion to paralyzing phobia. The aim of this retrospective, cross-sectional study was to determine the level of anxiety among children during venipuncture, to compare the anxiety reported by children and parents, and to identify the factors affecting the children’s and parents’ anxiety in order to develop therapeutic strategies. In total, 147 children (aged 0–17 years, 78 female) and their parents completed questionnaires. Statistical analysis was performed using qualitative and quantitative methods. Results showed that the majority of children reported anxiety and pain during venipuncture. Younger children had more anxiety (self-reported or assessed by parents). Children's and parental reports of anxiety were highly correlated. However, the child’s anxiety was often reported as higher by parents than by the children themselves. The child’s general anxiety as well as the parents’ perceived stress from surgical interventions (but not the number of surgical interventions) predicted the parents’ reports of child anxiety. For children, the main stressors that correlated with anxiety and pain were factors during the blood collection itself (e.g., feeling the puncture, seeing the syringe). Parental anxiety was mainly related to circumstances before the blood collection (e.g., approaching the clinic, sitting in the waiting room). The main stressors mentioned by parents were the child’s discomfort and their inability to calm the child.
Results indicate that the children’s fear of factors during the blood collection, along with the parents’ perceived stress and helplessness as well as their anticipatory anxiety are important starting points for facilitating the drawing of blood from children before and after liver transplantation, thereby supporting a better disease course in the future.
In the COVID-19 pandemic, human solidarity plays a crucial role in meeting what may be the greatest societal challenge of modern times. Public health communication aims to enhance collective compliance with protective health and safety regulations. Here, we asked whether authoritarian/controlling message framing, as compared to neutral message framing, may be more effective than moralizing/prosocial message framing, and whether recipients’ self-rated trait autonomy might lessen these effects. In a German sample (n = 708), we measured approval of seven regulations (e.g., reducing contact, wearing a mask) before and after presenting one of three Twitter messages (authoritarian, moralizing, neutral/control), each presented by either a high-authority sender (state secretary) or a low-authority sender (social worker). We found that overall, the messages successfully increased participants’ endorsement of the regulations, but only weakly so because of ceiling effects. Highly autonomous participants showed more consistent responses across the two measurements, i.e., lower response shifting, in line with the concept of reactive autonomy. Specifically, when the sender was a social worker, response shifting correlated negatively with trait autonomy. We suggest that a trusted sender encourages more variable responses to imposed societal regulations in individuals low in autonomy, and we discuss several aspects that may improve health communication.
Background: Abnormalities of heart rate (HR) and its variability are characteristic of major depressive disorder (MDD). However, circadian rhythm is rarely taken into account when statistically exploring state or trait markers for depression. Methods: A 4-day electrocardiogram was recorded for 16 treatment-resistant patients with MDD and 16 age- and sex-matched controls before, and for the patient group only, after a single treatment with the rapid-acting antidepressant ketamine or placebo (clinical trial registration available on https://www.clinicaltrialsregister.eu/ with EUDRACT number 2016-001715-21). Circadian rhythm differences of HR and the root mean square of successive differences (RMSSD) were compared between groups and were explored for classification purposes. Baseline HR/RMSSD were tested as predictors for treatment response, and physiological measures were assessed as state markers. Results: Patients showed higher HR and lower RMSSD alongside marked reductions in HR amplitude and RMSSD variation throughout the day. Excellent classification accuracy was achieved using HR during the night, particularly between 2 and 3 a.m. (90.6%). A positive association between baseline HR and treatment response (r = 0.55, p = 0.046) pointed toward better treatment outcome in patients with higher HR. Heart rate also decreased significantly following treatment but was not associated with improved mood after a single infusion of ketamine. Limitations: Our study had a limited sample size, and patients were treated with concomitant antidepressant medication. Conclusion: Patients with depression show a markedly reduced amplitude for HR and dysregulated RMSSD fluctuation. Higher HR and lower RMSSD in depression remain intact throughout a 24-h day, with the highest classification accuracy during the night. Baseline HR levels show potential for treatment response prediction but did not show potential as state markers in this study.
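The RMSSD used above is defined directly from successive RR-interval differences on the ECG. A minimal sketch with hypothetical RR values (illustrative, not the study's data):

```python
import numpy as np

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences of RR intervals (ms),
    a standard time-domain heart rate variability measure."""
    rr = np.asarray(rr_intervals_ms, dtype=float)
    diffs = np.diff(rr)               # successive RR differences
    return np.sqrt(np.mean(diffs ** 2))

# Hypothetical RR series (ms) from a short ECG segment
rr = [812, 795, 830, 801, 790, 815]
print(round(rmssd(rr), 1))  # → 24.9
```

Lower RMSSD, as reported for the patient group, reflects reduced beat-to-beat (vagally mediated) variability.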
Treatment outcomes of a CBT-based group intervention for adolescents with internet use disorders
(2021)
Background and aims: Internet use disorders (IUD), including Internet gaming disorder (IGD) and non-gaming pathological Internet use (ng-PIU), have become a growing mental health issue. Individuals suffering from IUD show a large range of symptoms, high comorbidities and impairments in different areas of life. To date, there is a lack of efficient and evidence-based treatment programs for affected adolescents. The present registered single-arm trial (ClinicalTrials.gov: NCT03582839) aimed to investigate the long-term effects of a brief manualized cognitive behavioral therapy (CBT) program for adolescents with IUD. Methods: N = 54 patients (16.7% female), aged 9–19 years (M = 13.48, SD = 1.72), received the CBT group program PROTECT+. IUD symptom severity (primary outcome variable) as well as comorbid symptoms, risk-related variables and potentially protective skills (secondary outcome variables) were assessed at pretest, posttest, and 4 and 12 months after admission. Results: Patients showed a significant reduction in IUD symptom severity at the 12-month follow-up. Effect sizes were medium to large depending on the measure. Beyond the statistical significance, the clinical significance was confirmed using the reliable change index. Secondary outcome variables showed a significant reduction in self-reported depression, social anxiety, performance anxiety and school anxiety as well as in parent-reported general psychopathology. Discussion and conclusions: The present study shows long-term effects of a manual-based CBT treatment for adolescents suffering from IUD. The results indicate that even a 4-session brief intervention can achieve a medium to large effect over 12 months. Future work is needed to confirm the efficacy within a randomized controlled trial (RCT).
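The reliable change index mentioned in the Results is commonly computed following the Jacobson and Truax approach. A sketch with hypothetical scores (the pretest SD and scale reliability below are assumed purely for illustration, not taken from the study):

```python
import math

def reliable_change_index(pre, post, sd_pre, reliability):
    """Jacobson & Truax reliable change index: |RCI| > 1.96 indicates
    change larger than expected from measurement error alone."""
    se_measurement = sd_pre * math.sqrt(1 - reliability)  # standard error of measurement
    se_diff = math.sqrt(2) * se_measurement               # SE of a difference score
    return (post - pre) / se_diff

# Hypothetical symptom scores: pretest 30, 12-month follow-up 18,
# assumed pretest SD of 8 and scale reliability of .90
rci = reliable_change_index(30, 18, 8, 0.90)
print(round(rci, 2))  # → -3.35
```

A negative RCI beyond −1.96, as here, would count a patient as reliably improved.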
Background: Physical activity and sleep quality are both major factors for improving one's health. Knowledge on the interactions of sleep quality and the amount of physical activity may be helpful for implementing multimodal health interventions in older adults. Methods: This preliminary cross-sectional study is based on 64 participants [82.1 ± 6.4 years (MD ± SD); 22 male, 42 female]. The amount of physical activity was assessed by means of an accelerometer (MyWellness Key). Self-reported sleep parameters were obtained using the Pittsburgh Sleep Quality Index. The Barthel Index was used for physical disability rating. Bivariate correlations (Spearman's Rho) were used to explore relationships between the amount of physical activity and sleep quality. To analyse differences between categorical subgroups, univariate ANOVAs were applied; in cases of significance, these were followed by Tukey-HSD post-hoc analyses. Results: No linear association between physical activity and sleep quality was found (r = 0.119; p > 0.05). In subgroup analyses (n = 41, Barthel Index ≥90 pts, free of pre-existing conditions), physical activity levels differed significantly between groups of different sleep duration (≥7 h; ≥6 to <7 h; ≥5 to <6 h; <5 h; p = 0.037). Conclusion: There is no general association between higher activity levels and better sleep quality in the investigated cohort. However, a sleep duration of ≥5 to <6 h, corresponding to 7.6 h bed rest time, was associated with a higher level of physical activity.
Internet Gaming Disorder (IGD) has been included in the DSM-5 as a diagnosis for further study, and Gaming Disorder as a new diagnosis in the ICD-11. Nonetheless, little is known about the clinical prevalence of IGD in children and adolescents. Additionally, it is unclear whether patients with IGD are already identified in routine psychotherapy using the ICD-10 diagnosis F63.8 (the recommended classification of IGD in ICD-10). This study investigated N = 358 children and adolescents (self and parental rating) of an outpatient psychotherapy centre in Germany using the Video Game Dependency Scale. According to self-report, 4.0% of the 11- to 17-year-old patients met criteria for a tentative IGD diagnosis, and 14.0% according to the parental report. Of the 5- to 10-year-old patients, 4.1% were diagnosed with tentative IGD according to parental report. Patients meeting IGD criteria were most frequently diagnosed with hyperkinetic disorders, followed by anxiety disorders, F63.8, conduct disorders, mood disorders and obsessive-compulsive disorders (descending order) as primary clinical diagnoses. Consequently, this study indicates that a significant proportion of the clinical population presents with IGD. Accordingly, appropriate diagnostics should be included in routine psychological assessment in order to avoid “hidden” cases of IGD in the future.
Finding a bottle of milk in the bathroom would probably be quite surprising to most of us. Such a surprised reaction is driven by our strong expectations, learned through experience, that a bottle of milk belongs in the kitchen. Our environment is not randomly organized but governed by regularities that allow us to predict what objects can be found in which types of scene. These scene semantics are thought to play an important role in the recognition of objects. But at what point during development are these semantic predictions established firmly enough that scene–object inconsistencies lead to semantic processing difficulties? Here we investigated how toddlers perceive their environments, and what expectations govern their attention and perception. To this aim, we used a purely visual paradigm in an ERP experiment and presented 24-month-olds with familiar scenes in which either a semantically consistent or an inconsistent object would appear. The scene-inconsistency effect has been previously studied in adults by means of the N400, a neural marker responding to semantic inconsistencies across many types of stimuli. Our results show that semantic object–scene inconsistencies indeed elicited an enhanced N400 over the left anterior brain region between 750 and 1150 ms post stimulus onset. This modulation of the N400 marker provides a first indication that by the age of two, toddlers have already established their scene semantics, allowing them to detect a purely visual, semantic object–scene inconsistency. Our data suggest the presence of specific semantic knowledge regarding what objects occur in a certain scene category.
Background: It is often advised to ensure a high-protein intake during energy-restricted diets. However, it is unclear whether a high-protein intake is able to maintain muscle mass and contractility in the absence of resistance training.
Materials and Methods: After 1 week of body mass maintenance (45 kcal/kg), 28 male college students not performing resistance training were randomized to either the energy-restricted (ER, 30 kcal/kg, n = 14) or the eucaloric control group (CG, 45 kcal/kg, n = 14) for 6 weeks. Both groups had their protein intake matched at 2.8 g/kg fat-free-mass and continued their habitual training throughout the study. Body composition was assessed weekly using multifrequency bioelectrical impedance analysis. Contractile properties of the m. rectus femoris were examined with Tensiomyography and MyotonPRO at weeks 1, 3, and 5 along with sleep (PSQI) and mood (POMS).
Results: The ER group revealed greater reductions in body mass (Δ −3.22 kg vs. Δ 1.90 kg, p < 0.001, partial η2 = 0.360), lean body mass (Δ −1.49 kg vs. Δ 0.68 kg, p < 0.001, partial η2 = 0.152), body cell mass (Δ −0.85 kg vs. Δ 0.59 kg, p < 0.001, partial η2 = 0.181), intracellular water (Δ −0.58 l vs. Δ 0.55 l, p < 0.001, partial η2 = 0.445) and body fat percentage (Δ −1.74% vs. Δ 1.22%, p < 0.001, partial η2 = 0.433) compared to the CG. Contractile properties, sleep onset, sleep duration as well as depression, fatigue and hostility did not change (p > 0.05). The PSQI score (Δ −1.43 vs. Δ −0.64, p = 0.006, partial η2 = 0.176) and vigor (Δ −2.79 vs. Δ −4.71, p = 0.040, partial η2 = 0.116) decreased significantly in the ER group and the CG, respectively.
Discussion: The present data show that a high-protein intake alone was not able to prevent lean mass loss associated with a 6-week moderate energy restriction in college students. Notably, it is unknown whether protein intake at 2.8 g/kg fat-free-mass prevented larger decreases in lean body mass. Muscle contractility was not negatively altered by this form of energy restriction. Sleep quality improved in both groups. Whether these advantages are due to the high-protein intake cannot be clarified and warrants further study. Although vigor was negatively affected in both groups, other mood parameters did not change.
Dual-task paradigms encompass a broad range of approaches to measure cognitive load in instructional settings. As a common characteristic, an additional task is implemented alongside a learning task to capture the individual’s unengaged cognitive capacities during the learning process. Measures to determine these capacities are, for instance, reaction times and interval errors on the additional task, while the performance on the learning task is to be maintained. In contrast to retrospectively applied subjective ratings, the continuous assessment within a dual-task paradigm allows simultaneous monitoring of changes in performance on previously defined tasks. According to Cognitive Load Theory, these changes in performance correspond to cognitive changes related to the establishment of permanent knowledge structures. Yet the current state of research indicates a clear lack of standardization of dual-task paradigms across study settings and task procedures. Typically, dual-task designs are adapted uniquely for each study, albeit with some similarities across different settings and task procedures. These similarities range from the type of modality to the frequency used for the additional task. This results in a lack of validity and comparability between studies due to arbitrarily chosen patterns of frequency without a sound scientific base, potentially confounding variables, or undecided adaptation potentials for future studies. In this paper, the lack of validity and comparability between dual-task settings will be presented, the current taxonomies will be compared, and future steps toward better standardization and implementation will be discussed.
Specifying accurate informative prior distributions is a question of carefully selecting studies that comprise the body of comparable background knowledge. Psychological research, however, consists of studies that are being conducted under different circumstances, with different samples and varying instruments. Thus, results of previous studies are heterogeneous, and not all available results can and should contribute equally to an informative prior distribution. This implies a necessary weighting of background information based on the similarity of the previous studies to the focal study at hand. Current approaches to account for heterogeneity by weighting informative prior distributions, such as the power prior and the meta-analytic predictive prior, are either not easily accessible or incomplete. To complicate matters further, in the context of Bayesian multiple regression models there are no methods available for quantifying the similarity of a given body of background knowledge to the focal study at hand. Consequently, the purpose of this study is threefold. We first present a novel method to combine the aforementioned sources of heterogeneity in the similarity measure ω. This method is based on a combination of a propensity-score approach to assess the similarity of samples with random- and mixed-effects meta-analytic models to quantify the heterogeneity in outcomes and study characteristics. Second, we show how to use the similarity measure ω as a weight for informative prior distributions for the substantial parameters (regression coefficients) in Bayesian multiple regression models. Third, we investigate the performance and the behavior of the similarity-weighted informative prior distribution in a comprehensive simulation study, where it is compared to the normalized power prior and the meta-analytic predictive prior.
The similarity measure ω and the similarity-weighted informative prior distribution as the primary results of this study provide applied researchers with means to specify accurate informative prior distributions.
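One simple way to see how a similarity weight ω ∈ (0, 1] down-weights an informative prior is the conjugate normal–normal case, where raising a N(μ0, τ²) prior to the power ω (power-prior style) is equivalent to inflating its variance to τ²/ω. The sketch below is an illustrative toy under these assumptions, not the authors' regression-model implementation:

```python
import numpy as np

def similarity_weighted_posterior(y, sigma2, mu0, tau2, omega):
    """Conjugate normal update with a N(mu0, tau2) prior down-weighted
    by similarity omega in (0, 1]: the prior variance is inflated to
    tau2 / omega, so low similarity means a weaker pull toward mu0."""
    tau2_w = tau2 / omega                       # weakened prior variance
    n = len(y)
    post_prec = n / sigma2 + 1 / tau2_w         # posterior precision
    post_var = 1 / post_prec
    post_mean = post_var * (np.sum(y) / sigma2 + mu0 / tau2_w)
    return post_mean, post_var

# Toy focal-study data with known error variance sigma2 = 1
y = np.array([1.0, 1.2, 0.8])
full, _ = similarity_weighted_posterior(y, 1.0, 0.0, 0.5, omega=1.0)
weak, _ = similarity_weighted_posterior(y, 1.0, 0.0, 0.5, omega=0.2)
# With low similarity, the posterior mean sits closer to the sample mean
print(round(full, 2), round(weak, 2))  # → 0.6 0.88
```

As ω → 0 the prior's influence vanishes and the posterior mean approaches the sample mean; ω = 1 recovers the fully informative prior.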