Publisher: Hogrefe and Huber Publishing Group   (Total: 32 journals)

Aviation Psychology and Applied Human Factors     Hybrid Journal   (Followers: 8)
Crisis: The J. of Crisis Intervention and Suicide Prevention     Hybrid Journal   (Followers: 16, SJR: 0.565, h-index: 30)
Diagnostica     Hybrid Journal   (Followers: 1, SJR: 0.295, h-index: 30)
European J. of Psychological Assessment     Hybrid Journal   (Followers: 2, SJR: 0.584, h-index: 34)
European Psychologist     Hybrid Journal   (Followers: 6, SJR: 0.447, h-index: 28)
Experimental Psychology     Hybrid Journal   (Followers: 17, SJR: 1.376, h-index: 34)
Forum Psychotherapeutische Praxis     Hybrid Journal   (Followers: 1)
Frühe Bildung     Hybrid Journal   (Followers: 1)
GeroPsych: The J. of Gerontopsychology and Geriatric Psychiatry     Hybrid Journal   (Followers: 3, SJR: 0.347, h-index: 9)
J. of Individual Differences     Hybrid Journal   (Followers: 11, SJR: 0.517, h-index: 16)
J. of Media Psychology     Hybrid Journal   (Followers: 10, SJR: 0.722, h-index: 10)
J. of Personnel Psychology     Hybrid Journal   (Followers: 7, SJR: 1.017, h-index: 6)
J. of Psychophysiology     Hybrid Journal   (Followers: 2, SJR: 0.512, h-index: 34)
Kindheit und Entwicklung     Hybrid Journal   (SJR: 0.856, h-index: 26)
Lernen und Lernstörungen     Hybrid Journal  
Methodology: European J. of Research Methods for the Behavioral and Social Sciences     Hybrid Journal   (Followers: 13, SJR: 0.521, h-index: 15)
Musik-, Tanz- und Kunsttherapie     Hybrid Journal  
Psychologische Rundschau     Hybrid Journal   (Followers: 3, SJR: 0.177, h-index: 14)
Rorschachiana     Hybrid Journal   (SJR: 0.158, h-index: 3)
Social Psychology     Hybrid Journal   (Followers: 8, SJR: 0.782, h-index: 15)
SUCHT - Zeitschrift für Wissenschaft und Praxis / J. of Addiction Research and Practice     Hybrid Journal   (Followers: 1, SJR: 0.296, h-index: 17)
Zeitschrift für Psychologie / J. of Psychology     Hybrid Journal   (Followers: 1)
Zeitschrift für Arbeits- und Organisationspsychologie A&O     Hybrid Journal   (Followers: 3, SJR: 0.278, h-index: 11)
Zeitschrift für Entwicklungspsychologie und Pädagogische Psychologie     Hybrid Journal   (SJR: 0.389, h-index: 14)
Zeitschrift für Gesundheitspsychologie     Hybrid Journal   (Followers: 1, SJR: 0.173, h-index: 5)
Zeitschrift für Kinder- und Jugendpsychiatrie und Psychotherapie     Hybrid Journal   (Followers: 1, SJR: 0.329, h-index: 18)
Zeitschrift für Klinische Psychologie und Psychotherapie     Hybrid Journal   (SJR: 0.353, h-index: 21)
Zeitschrift für Neuropsychologie     Hybrid Journal   (SJR: 0.187, h-index: 8)
Zeitschrift für Pädagogische Psychologie     Full-text available via subscription   (SJR: 0.98, h-index: 18)
Zeitschrift für Psychiatrie, Psychologie und Psychotherapie     Full-text available via subscription   (Followers: 1, SJR: 0.429, h-index: 18)
Zeitschrift für Psychologie     Hybrid Journal   (Followers: 2, SJR: 0.419, h-index: 12)
Zeitschrift für Sportpsychologie     Hybrid Journal   (Followers: 1, SJR: 0.121, h-index: 5)
European Journal of Psychological Assessment
  [SJR: 0.584]   [H-I: 34]   [4 followers]
   Hybrid Journal (it can contain Open Access articles)
   ISSN (Print) 1015-5759 - ISSN (Online) 2151-2426
   Published by Hogrefe and Huber Publishing Group
  • The Utrecht-Management of Identity Commitments Scale (U-MICS)
    • Abstract: The Utrecht-Management of Identity Commitments Scale (U-MICS; Crocetti, Rubini, & Meeus, 2008) is a recently developed measure of identity that has been shown to be a reliable tool for assessing identity processes in adolescents. This study examines the psychometric properties of the U-MICS, which focuses on the interplay of commitment, in-depth exploration, and reconsideration of commitment, in a large adolescent sample from seven European countries. Participants were 1,007 adolescents from Bulgaria (n = 146), the Czech Republic (n = 142), Italy (n = 144), Kosovo (n = 150), Romania (n = 142), Slovenia (n = 156), and the Netherlands (n = 127). We tested the measurement invariance of the U-MICS, estimated reliability for each language version, and compared latent identity means across groups. Results showed that the U-MICS has good internal consistency as well as configural, metric, and partial scalar invariance across groups in the sampled countries.
      Content Type Journal Article
      Category Original Article
      Pages 1-9

      DOI 10.1027/1015-5759/a000241

      Authors
      Radosveta Dimitrova, Department of Psychology, Stockholm University, Sweden
      Elisabetta Crocetti, Utrecht University, The Netherlands
      Carmen Buzea, Transylvania University of Brasov, Romania
      Venzislav Jordanov, National Sports Academy, Bulgaria
      Marianna Kosic, Scientific-Cultural Institute Mandala, Slovene Research Institute, Italy
      Ergyul Tair, Bulgarian Academy of Sciences, Bulgaria
      Jitka Taušová, Palacký University, Czech Republic
      Natasja van Cittert, Tilburg University, The Netherlands
      Fitim Uka, European Center for Vocational Education “Qeap-Heimerer”, Kosovo
      Journal European Journal of Psychological Assessment
      Online ISSN 2151-2426
      Print ISSN 1015-5759
      PubDate: Thu, 26 Feb 2015 20:26:50 GMT
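
The U-MICS study above rests on multi-group confirmatory factor analysis with configural, metric, and partial scalar invariance constraints. As a rough, much simpler illustration of the metric-invariance idea (similar loading patterns across countries), here is a minimal Python sketch that fits a one-factor model per group and compares loadings with Tucker's congruence coefficient; the item and country column names are hypothetical, and this is not the constrained multi-group CFA reported in the article.

```python
# Rough illustration of the metric-invariance idea behind the U-MICS analysis:
# fit a one-factor model separately per country and compare loading patterns.
# Column names ("item1".."item13") and the "country" column are hypothetical.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer  # pip install factor_analyzer

def tucker_congruence(a, b):
    """Tucker's congruence coefficient between two loading vectors."""
    a, b = np.asarray(a).ravel(), np.asarray(b).ravel()
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

def group_loadings(df, items, group_col):
    """One-factor loadings estimated separately within each group."""
    out = {}
    for group, sub in df.groupby(group_col):
        fa = FactorAnalyzer(n_factors=1, rotation=None)
        fa.fit(sub[items])
        out[group] = fa.loadings_.ravel()
    return out

# Usage sketch (df would hold item responses plus a country column):
# items = [f"item{i}" for i in range(1, 14)]
# loads = group_loadings(df, items, "country")
# ref = loads["Netherlands"]
# for country, vec in loads.items():
#     print(country, round(tucker_congruence(ref, vec), 3))
```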
       
  • The Beck Hopelessness Scale
    • Abstract: The aim of the present study was to examine the construct and cross-cultural validity of the Beck Hopelessness Scale (BHS; Beck, Weissman, Lester, & Trexler, 1974). Beck et al. applied exploratory Principal Components Analysis and argued that the scale measured three specific components (affective, motivational, and cognitive). Subsequent studies identified one, two, three, or more factors, highlighting a lack of clarity regarding the scale’s construct validity. In a large clinical sample, we tested the original three-factor model and explored alternative models using both confirmatory and exploratory factor analytical techniques appropriate for analyzing binary data. In doing so, we investigated whether method variance needs to be taken into account in understanding the structure of the BHS. Our findings supported a bifactor model that explicitly included method effects. We concluded that the BHS measures a single underlying construct of hopelessness, and that an incorporation of method effects consolidates previous findings where positively and negatively worded items loaded on separate factors. Our study further contributes to establishing the cross-cultural validity of this instrument by showing that BHS scores differentiate between depressed, anxious, and nonclinical groups in a Hungarian population.
      Content Type Journal Article
      Category Original Article
      Pages 1-8

      DOI 10.1027/1015-5759/a000240

      Authors
      Marianna Szabó, School of Psychology, The University of Sydney, NSW, Australia
      Veronika Mészáros, Department of Clinical Psychology, Semmelweis University, Budapest, Hungary
      Judit Sallay, Department of Clinical Psychology, Semmelweis University, Budapest, Hungary
      Gyöngyi Ajtay, Department of Clinical Psychology, Semmelweis University, Budapest, Hungary
      Viktor Boross, Department of Clinical Psychology, Semmelweis University, Budapest, Hungary
      Àgnes Udvardy-Mészáros, Department of Clinical Psychology, Semmelweis University, Budapest, Hungary
      Gabriella Vizin, Department of Clinical Psychology, Semmelweis University, Budapest, Hungary
      Dóra Perczel-Forintos, Department of Clinical Psychology, Semmelweis University, Budapest, Hungary
      Journal European Journal of Psychological Assessment
      Online ISSN 2151-2426
      Print ISSN 1015-5759
      PubDate: Thu, 26 Feb 2015 20:26:49 GMT
       
  • Measurement Invariance of the Self-Description Questionnaire II in a Chinese Sample
    • Abstract: Studies on the construct validity of the Self-Description Questionnaire II (SDQII) have not compared the factor structure between the English and Chinese versions of the SDQII. By using rigorous multiple group comparison procedures based upon confirmatory factor analysis (CFA) of measurement invariance, the present study examined the responses of Australian high school students (N = 302) and Chinese high school students (N = 322) using the English and Chinese versions of the SDQII, respectively. CFA provided strong evidence that the factor structure (factor loadings and item intercepts) is invariant across the Chinese and English versions of the SDQII, which allows researchers to confidently use both versions with Chinese and Australian samples, separately and cross-culturally.
      Content Type Journal Article
      Category Original Article
      Pages 1-12

      DOI 10.1027/1015-5759/a000242

      Authors
      Kim Chau Leung, Hong Kong Institute of Education, Hong Kong, PR China
      Herbert W. Marsh, Australian Catholic University, Sydney, NSW, Australia
      Rhonda G. Craven, Australian Catholic University, Sydney, NSW, Australia
      Adel S. Abduljabbar, King Saud University, Riyadh, Saudi Arabia
      Journal European Journal of Psychological Assessment
      Online ISSN 2151-2426
      Print ISSN 1015-5759
      PubDate: Thu, 26 Feb 2015 20:26:48 GMT
       
  • The Psychometric Properties of the Cognitive Fusion Questionnaire in Adolescents
    • Abstract: Cognitive fusion can be defined as the inability to view thoughts as just thoughts, which is hypothesized to increase the impact of those thoughts on behavior. Cognitive fusion is a core concept of Acceptance and Commitment Therapy, a therapeutic approach that is being increasingly studied as a treatment for a plethora of chronic health problems. The objective of this study was to evaluate the psychometric properties of the Cognitive Fusion Questionnaire (CFQ) in a sample of adolescents. Three hundred eight adolescents (11–20 years) completed the Catalan version of the questionnaire (CFQ-C) as well as measures assessing anxiety sensitivity and acceptance. The results supported a one-factor solution for the CFQ-C, and indicated an adequate level of internal consistency (Cronbach’s α = 0.79). The validity of the CFQ-C was supported by a significant positive association between the CFQ-C total score and the measure of anxiety sensitivity and by a significant negative association with the measure of acceptance. The findings support the psychometric properties of the CFQ to study the role that cognitive fusion may play in functioning among adolescents.
      Content Type Journal Article
      Category Original Article
      Pages 1-6

      DOI 10.1027/1015-5759/a000244

      Authors
      Ester Solé, Department of Psychology, Universitat Rovira i Virgili, Tarragona, Spain
      Mélanie Racine, Department of Psychology, Universitat Rovira i Virgili, Tarragona, Spain
      Elena Castarlenas, Department of Psychology, Universitat Rovira i Virgili, Tarragona, Spain
      Rocío de la Vega, Department of Psychology, Universitat Rovira i Virgili, Tarragona, Spain
      Catarina Tomé-Pires, Department of Psychology, Universitat Rovira i Virgili, Tarragona, Spain
      Mark Jensen, Department of Psychology, Universitat Rovira i Virgili, Tarragona, Spain
      Jordi Miró, Department of Psychology, Universitat Rovira i Virgili, Tarragona, Spain
      Journal European Journal of Psychological Assessment
      Online ISSN 2151-2426
      Print ISSN 1015-5759
      PubDate: Thu, 26 Feb 2015 20:26:48 GMT
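
The CFQ-C entry above reports internal consistency as Cronbach's α = 0.79. As a reference point, here is a minimal Python implementation of the coefficient from an item-level response table; the DataFrame and its toy values are illustrative only.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha from a respondents x items DataFrame of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the sum score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Toy example (real CFQ-C responses would be used in practice):
toy = pd.DataFrame({"i1": [1, 2, 3, 4], "i2": [2, 2, 3, 5], "i3": [1, 3, 3, 4]})
print(round(cronbach_alpha(toy), 2))
```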
       
  • Selecting the Best Items for a Short-Form of the Experiences in Close Relationships Questionnaire
    • Abstract: Five studies were conducted to develop a short form of the Experiences in Close Relationships (ECR) questionnaire with optimal psychometric properties. Study 1 involved Item Response Theory (IRT) analyses of the responses of 2,066 adults, resulting in a 12-item form of the ECR containing the most discriminating items. The psychometric properties of the ECR-12 were further demonstrated in two longitudinal studies of community samples of couples (Studies 2 and 3), in a sample of individuals in same-sex relationships (Study 4), and with couples seeking therapy (Study 5). The psychometric properties of the ECR-12 are as good as those of the original ECR and superior to those of an existing short form. The ECR-12 can confidently be used by researchers and mental health practitioners when a short measure of attachment anxiety and avoidance is required.
      Content Type Journal Article
      Category Original Article
      Pages 1-15

      DOI 10.1027/1015-5759/a000243

      Authors
      Marie-France Lafontaine, Department of Psychology, University of Ottawa, Canada
      Audrey Brassard, Université de Sherbrooke, Canada
      Yvan Lussier, Université du Québec à Trois-Rivières, Canada
      Pierre Valois, Université Laval, Canada
      Philip R. Shaver, University of California, Davis, CA, USA
      Susan M. Johnson, University of Ottawa, Ottawa Couple and Family Institute, Canada
      Journal European Journal of Psychological Assessment
      Online ISSN 2151-2426
      Print ISSN 1015-5759
      PubDate: Thu, 26 Feb 2015 20:26:47 GMT
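
The ECR-12 items above were selected with Item Response Theory discrimination parameters. The sketch below uses a far simpler proxy, corrected item-total correlations, to rank items; it only illustrates the "most discriminating items" idea, is not the IRT analysis the authors ran, and the column names are hypothetical.

```python
# Rank items by their corrected item-total correlation (item vs. scale total
# minus the item) as a crude stand-in for IRT discrimination parameters.
import pandas as pd

def corrected_item_total(items: pd.DataFrame) -> pd.Series:
    total = items.sum(axis=1)
    return pd.Series(
        {col: items[col].corr(total - items[col]) for col in items.columns}
    ).sort_values(ascending=False)

# ranked = corrected_item_total(df[anxiety_items])
# short_form = ranked.head(6).index.tolist()   # keep the strongest items
```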
       
  • Measuring Psychology Students’ Information-Seeking Skills in a Situational Judgment Test Format
    • Abstract: Three studies were conducted to develop a test for academic information-seeking skills in psychology students that measures both procedural and declarative aspects of the concept. A skill decomposition breaking down information-seeking into 10 subskills was used to create a situational judgment test with 22 items. A scoring key was developed based on expert ratings (N = 14). Subsequently, the test was administered to two samples of N = 78 and N = 81 psychology students. Within the first sample, the scale reached an internal consistency (Cronbach’s alpha) of α = .75. Scale validity was investigated with data from the second sample. High correlations between the scale and two different information search tasks (r = .42 to .64; p < .001) as well as a declarative information literacy test (r = .51; p < .001) were found. The findings are discussed with regard to their implications for research and practice.
      Content Type Journal Article
      Category Multistudy Report
      Pages 1-10

      DOI 10.1027/1015-5759/a000239

      Authors
      Tom Rosman, Leibniz Institute for Psychology Information, Trier, Germany
      Anne-Kathrin Mayer, Leibniz Institute for Psychology Information, Trier, Germany
      Günter Krampen, Leibniz Institute for Psychology Information, Trier, Germany
      Journal European Journal of Psychological Assessment
      Online ISSN 2151-2426
      Print ISSN 1015-5759
      PubDate: Thu, 26 Feb 2015 20:26:46 GMT
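
The situational judgment test above was scored against a key derived from expert ratings (N = 14). One plausible, hedged way to build and apply such a consensus key in Python is sketched below; the data layouts are hypothetical, and the published scoring procedure may have differed (e.g., by weighting expert agreement).

```python
# Build a scoring key as the modal expert response per item, then award each
# respondent one point per keyed answer. Data layouts are hypothetical.
import pandas as pd

def build_key(expert_ratings: pd.DataFrame) -> pd.Series:
    """expert_ratings: experts x items; each cell is the option an expert endorsed."""
    return expert_ratings.mode(axis=0).iloc[0]

def score_respondents(responses: pd.DataFrame, key: pd.Series) -> pd.Series:
    """responses: respondents x items; returns the number of keyed answers per respondent."""
    return responses.eq(key, axis=1).sum(axis=1)

# key = build_key(expert_df)          # e.g., 14 experts x 22 items
# totals = score_respondents(student_df, key)
```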
       
  • Measuring the Situational Eight DIAMONDS Characteristics of Situations
    • Abstract: It has been suggested that people perceive psychological characteristics of situations on eight major dimensions (Rauthmann et al., 2014): The “Situational Eight” DIAMONDS (Duty, Intellect, Adversity, Mating, pOsitivity, Negativity, Deception, Sociality). These dimensions have been captured with the 32-item RSQ-8. The current work optimizes the RSQ-8 to derive more economical yet informative and precise scales, captured in the newly developed S8*. Nomological associations of the original RSQ-8 and the S8* with situation cues (extracted from written situation descriptions) were compared. Application areas of the S8* are outlined.
      Content Type Journal Article
      Category Original Article
      Pages 1-10

      DOI 10.1027/1015-5759/a000246

      Authors
      John F. Rauthmann, Humboldt-Universität zu Berlin, Germany
      Ryne A. Sherman, Florida Atlantic University, FL, USA
      Journal European Journal of Psychological Assessment
      Online ISSN 2151-2426
      Print ISSN 1015-5759
      PubDate: Thu, 26 Feb 2015 20:26:46 GMT
       
  • Ultra-Brief Measures for the Situational Eight DIAMONDS Domains
    • Abstract: People perceive psychological situations on the “Situational Eight” DIAMONDS characteristics (Duty, Intellect, Adversity, Mating, pOsitivity, Negativity, Deception, Sociality; Rauthmann et al., 2014). To facilitate situational assessment and economically measure these dimensions, we propose four ultra-brief one-item scales (S8-I, S8-II, S8-III-A, S8-III-P) validated against the already existing 24-item S8*. Convergent/discriminant validity of the four S8-scales was examined by analyses of the multi-characteristics multi-measures matrix, and their nomological associations with external criteria were compared. Application areas of the scales are discussed.
      Content Type Journal Article
      Category Original Article
      Pages 1-10

      DOI 10.1027/1015-5759/a000245

      Authors
      John F. Rauthmann, Humboldt-Universität zu Berlin, Germany
      Ryne A. Sherman, Florida Atlantic University, FL, USA
      Journal European Journal of Psychological Assessment
      Online ISSN 2151-2426
      Print ISSN 1015-5759
      PubDate: Thu, 26 Feb 2015 20:26:46 GMT
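
Both DIAMONDS entries above evaluate convergent and discriminant validity by relating the brief scales to the longer S8* across the eight dimensions. Below is a minimal Python sketch of that kind of multi-characteristics multi-measures check, assuming two aligned respondent-by-dimension score tables; the column naming scheme is hypothetical.

```python
# Correlate every brief DIAMONDS scale with every S8* scale, then summarize
# same-dimension (convergent) vs. cross-dimension (discriminant) correlations.
import numpy as np
import pandas as pd

DIMENSIONS = ["Duty", "Intellect", "Adversity", "Mating",
              "pOsitivity", "Negativity", "Deception", "Sociality"]

def mtmm_summary(brief: pd.DataFrame, full: pd.DataFrame) -> dict:
    """brief/full: respondents x 8 columns, one per DIAMONDS dimension."""
    corr = pd.DataFrame(
        {d: [brief[d].corr(full[e]) for e in DIMENSIONS] for d in DIMENSIONS},
        index=DIMENSIONS,
    )
    convergent = np.diag(corr.values)                       # same dimension
    off_diag = corr.values[~np.eye(len(DIMENSIONS), dtype=bool)]
    return {"mean_convergent_r": float(convergent.mean()),
            "mean_discriminant_r": float(np.abs(off_diag).mean())}

# summary = mtmm_summary(brief_scores, s8_star_scores)
```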
       
  • A Reliability Generalization Study for a Multidimensional Loneliness Scale
    • Abstract: Research on the average reliability and factors that affect the reliability of loneliness scales has been restricted to unidimensional measures. A reliability generalization (RG) study was conducted for a multidimensional loneliness measure, that is, the Loneliness and Aloneness Scale for Children and Adolescents (LACA). Multilevel meta-analyses were performed on 79 studies that comprised 92 samples (for a total of 41,076 participants). Average reliability (Cronbach’s alpha) across samples was good (i.e., .80 or above) for all four subscales. Studies with higher sampling quality yielded slightly higher alphas for one of the subscales (i.e., Parent-related loneliness). For adolescents, as compared to children, alphas were somewhat lower for three of the four subscales and higher for the Affinity for aloneness subscale. Suggestions for future research are outlined. From a reliability perspective, the LACA is a good option for researchers who want to use a multidimensional loneliness measure with children and adolescents.
      Content Type Journal Article
      Category Original Article
      Pages 1-8

      DOI 10.1027/1015-5759/a000237

      Authors
      Marlies Maes, Department of School Psychology and Child and Adolescent Development, KU Leuven, Belgium
      Wim Van den Noortgate, Department of Methodology and Educational Sciences, KU Leuven, Belgium
      Luc Goossens, Department of School Psychology and Child and Adolescent Development, KU Leuven, Belgium
      Journal European Journal of Psychological Assessment
      Online ISSN 2151-2426
      Print ISSN 1015-5759
      PubDate: Thu, 26 Feb 2015 20:26:46 GMT
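
The reliability generalization study above pooled Cronbach's alphas across 92 samples with multilevel meta-analysis. As a simplified, hedged stand-in, the sketch below shows a standard DerSimonian-Laird random-effects pooling, assuming each sample's (possibly transformed) alpha and its sampling variance are already available; it is not the multilevel model the authors used.

```python
# DerSimonian-Laird random-effects pooling of per-sample estimates.
import numpy as np

def random_effects_pool(estimates, variances):
    y = np.asarray(estimates, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                                   # fixed-effect weights
    fixed_mean = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - fixed_mean) ** 2)         # Cochran's Q
    df = len(y) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-sample variance
    w_star = 1.0 / (v + tau2)                     # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

# pooled, se, tau2 = random_effects_pool(sample_alphas, sample_variances)
```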
       
  • Development of the Fear of Intimacy Components Questionnaire (FICQ)
    • Abstract: We developed a new instrument designed to measure fear of intimacy in romantic relationships. We suggest assessing fear of intimacy through two dimensions: self-revelation and dependence. The Fear of Intimacy Components Questionnaire (FICQ) was validated across three studies in which a 10-item solution systematically emerged. Consistent with a two-component perspective, a two-factor solution fitted the data best: fear of losing the self (FLS) and fear of losing the other (FLO). Qualitative analyses verified content validity. Exploratory and confirmatory factor analyses tested the factor structure. Multigroup analyses supported the structural invariance across gender, age, and relationship status. Both factors showed adequate discriminant validity and internal consistency, and good 3-week test-retest reliability. Associations between the FICQ and insecure attachment orientations demonstrated convergent validity. The association between the FICQ and relationship satisfaction, above and beyond a preexisting measure, offered evidence of criterion validity. By going beyond the traditional self-revelation-focused conception of fear of intimacy and proposing a bidimensional structure, we believe that this new measure will contribute to future research on fear of intimacy.
      Content Type Journal Article
      Category Original Article
      Pages 1-8

      DOI 10.1027/1015-5759/a000238

      Authors
      Maria Pedro Sobral, Center for Psychology at University of Porto, Portugal
      Maria Emília Costa, Center for Psychology at University of Porto, Portugal
      Journal European Journal of Psychological Assessment
      Online ISSN 2151-2426
      Print ISSN 1015-5759
      PubDate: Thu, 26 Feb 2015 20:26:46 GMT
       
  • The Issue of Fuzzy Concepts in Test Construction and Possible Remedies
    • Abstract: The Issue of Fuzzy Concepts in Test Construction and Possible Remedies
      Content Type Journal Article
      Category Editorial
      Pages 1-4

      DOI 10.1027/1015-5759/a000255

      Authors
      Matthias Ziegler, Humboldt-Universität zu Berlin, Germany
      Christoph J. Kemper, Institute for Medical and Pharmaceutical Proficiency Assessment, Mainz, Germany
      Timo Lenzner, GESIS – Leibniz-Institute for the Social Sciences, Mannheim, Germany
      Journal European Journal of Psychological Assessment
      Online ISSN 2151-2426
      Print ISSN 1015-5759
      Journal Volume Volume 31
      Journal Issue Volume 31, Number 1 / 2015
      PubDate: Wed, 04 Feb 2015 13:26:26 GMT
       
  • How Test Takers See Test Examiners
    • Abstract: We addressed potential test takers’ preferences for women or men as examiners as well as how examiners were perceived depending on their gender. We employed an online design with 375 students who provided preferences for and ratings of examiners based on short video clips. The clips showed four out of 15 psychologists who differed in age (young vs. middle-aged) and gender giving an introduction to a fictional intelligence test session. Employing multivariate multilevel analyses, we found female examiners to be perceived as more socially competent and middle-aged examiners to be perceived as more competent. Data analyses revealed a significant preference for choosing women as examiners. Results are discussed with reference to test performance and fairness.
      Content Type Journal Article
      Category Original Article
      Pages 1-9

      DOI 10.1027/1015-5759/a000232

      Authors
      Isabella Vormittag, Department of Education and Psychology, Division for Psychological Assessment, Free University Berlin, Germany
      Tuulia M. Ortner, Department of Psychology, Division for Psychological Assessment, University of Salzburg, Austria
      Tobias Koch, Department of Education and Psychology, Division for Methods and Evaluation, Free University Berlin, Germany
      Journal European Journal of Psychological Assessment
      Online ISSN 2151-2426
      Print ISSN 1015-5759
      PubDate: Wed, 10 Dec 2014 16:21:15 GMT
       
  • The Dutch Symptom Checklist-90-Revised
    • Abstract: The Symptom Checklist-90-Revised (SCL-90-R; Derogatis, 1977, 1994) was constructed to measure both general psychological distress and specific primary symptoms of distress. In this study, we evaluated to what extent the scale scores of the Dutch SCL-90-R reflect general and/or specific aspects of psychological distress in a psychiatric outpatient sample (N = 1,842), using a hierarchical factor model. The results revealed that the total scale score measures general psychological distress, with high reliability. The subscale scores Sleep Difficulties, Agoraphobia, Hostility, and Somatization reflect the specific primary symptoms reasonably well, with high reliability. The Depression subscale score, however, hardly measures specific symptoms of distress; instead, it largely reflects the same general construct as the total scale of the SCL-90-R. The use of the Depression subscale score beyond the total scale score of the SCL-90-R therefore appears to be of limited value in clinical practice.
      Content Type Journal Article
      Category Original Article
      Pages 1-9

      DOI 10.1027/1015-5759/a000233

      Authors
      Iris A. M. Smits, University of Groningen, Faculty of Behavioural and Social Sciences, The Netherlands
      Marieke E. Timmerman, University of Groningen, Faculty of Behavioural and Social Sciences, The Netherlands
      Dick P. H. Barelds, University of Groningen, Faculty of Behavioural and Social Sciences, The Netherlands
      Rob R. Meijer, University of Groningen, Faculty of Behavioural and Social Sciences, The Netherlands
      Journal European Journal of Psychological Assessment
      Online ISSN 2151-2426
      Print ISSN 1015-5759
      PubDate: Tue, 09 Dec 2014 13:26:42 GMT
       
  • Detection of Differential Item Functioning in the Cornell Critical Thinking Test Between Turkish and United States Students
    • Abstract: Critical thinking is a broad term that includes core elements such as reasoning, evaluating, and metacognition that should be transferred to students in educational systems. The integration of such skills into models of student success is increasing on an international scale. The Cornell Critical Thinking Test (CCTT) is an internationally used tool to assess critical thinking skills. However, limited validity evidence exists for the translated versions of the instrument to support inferences based on CCTT scores. This study examined the Turkish version of the CCTT. Specifically, translated items were examined for measurement equivalence by determining whether items function differently across students from the United States and Turkey. Differential Item Functioning (DIF) analysis via logistic regression was employed. Results demonstrated that each subtest contained DIF items, and 10% of the items in the instrument were identified as exhibiting DIF. Mean differences between students in each country were not influenced by these items. A critical content review of the translated items gave insight as to why items may be functioning differently.
      Content Type Journal Article
      Category Original Article
      Pages 1-9

      DOI 10.1027/1015-5759/a000230

      Authors
      Hafize Sahin, Washington State University, Pullman, WA, USA
      Brian F. French, Washington State University, Pullman, WA, USA
      Brian Hand, University of Iowa, Iowa City, IA, USA
      Murat Gunel, TED University, Ankara, Turkey
      Journal European Journal of Psychological Assessment
      Online ISSN 2151-2426
      Print ISSN 1015-5759
      PubDate: Tue, 09 Dec 2014 13:26:42 GMT
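
The DIF analysis above uses logistic regression: for each item, a model predicting the item from the total score alone is compared against a model that adds group membership and a score-by-group interaction. A minimal Python sketch of that likelihood-ratio comparison for one dichotomous item follows; variable names are hypothetical.

```python
# Logistic-regression DIF screening for one dichotomous item via a
# likelihood-ratio test of the group and score-by-group terms.
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

def dif_lr_test(item, total, group):
    """item: 0/1 responses; total: matching/total scores; group: 0/1 country code."""
    item, total, group = map(np.asarray, (item, total, group))
    base = sm.Logit(item, sm.add_constant(np.column_stack([total]))).fit(disp=0)
    aug = sm.Logit(
        item,
        sm.add_constant(np.column_stack([total, group, total * group])),
    ).fit(disp=0)
    lr = 2 * (aug.llf - base.llf)      # likelihood-ratio statistic
    p = chi2.sf(lr, df=2)              # two extra parameters
    return lr, p

# lr, p = dif_lr_test(responses["item7"], responses["total"], responses["country"])
```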
       
  • Brief Form of the Interpersonal Competence Questionnaire (ICQ-15)
    • Abstract: The Interpersonal Competence Questionnaire (ICQ) developed by Buhrmester and colleagues (1988) in the US assesses the multidimensional construct of social competence via five distinct but related subscales. Two versions comprising 40 and 30 items, respectively, are available in German. The purpose of the current study is to develop and validate a brief version of the ICQ among a large adult sample that is representative of the German general population. Data were collected from 2,009 participants. Three confirmatory factor analyses (CFAs) were conducted in order to develop and validate the ICQ-15, and Cronbach’s alpha coefficients were computed for the ICQ-15. An initial CFA with the ICQ-30 formed the basis for the selection of the items to be included in the ICQ-15. Two subsequent CFAs with the ICQ-15 revealed an excellent fit of the hypothesized five-factor model to the observed data. Internal consistency coefficients were in the adequate range. This preliminary evaluation shows that the ICQ-15 is a structurally valid measure of interpersonal competence recommended for research contexts with limited assessment time and for psychotherapy progress tracking in clinical settings.
      Content Type Journal Article
      Category Original Article
      Pages 1-8

      DOI 10.1027/1015-5759/a000234

      Authors
      Adina Coroiu, Department of Educational and Counselling Psychology, McGill University, Montreal, Quebec, Canada
      Alexandra Meyer, Department of Psychosomatic Medicine and Psychotherapy, University Medical Center, Mainz, Germany
      Carlos A. Gomez-Garibello, Department of Educational and Counselling Psychology, McGill University, Montreal, Quebec, Canada
      Elmar Brähler, Department of Psychosomatic Medicine and Psychotherapy, University Medical Center, Mainz, Germany
      Aike Hessel, Pension Insurance Oldenburg-Bremen, Coordination Management – Social Medicine, Bremen, Germany
      Annett Körner, Department of Educational and Counselling Psychology, McGill University, Montreal, Quebec, Canada
      Journal European Journal of Psychological Assessment
      Online ISSN 2151-2426
      Print ISSN 1015-5759
      PubDate: Tue, 09 Dec 2014 13:26:42 GMT
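
The ICQ-15 evaluation above relies on confirmatory factor analysis of a five-factor model. A hedged Python sketch of such a CFA with the semopy package is shown below; the item identifiers and factor labels are placeholders for the 15 ICQ items, and fit would be judged from indices such as CFI and RMSEA.

```python
# Five-factor CFA sketch in semopy. Item names (ne1..co3) and factor labels are
# hypothetical placeholders; df is a respondents x items DataFrame of responses.
import semopy  # pip install semopy

MODEL_DESC = """
Initiation     =~ ne1 + ne2 + ne3
Assertion      =~ na1 + na2 + na3
Disclosure     =~ di1 + di2 + di3
EmotionalSupp  =~ es1 + es2 + es3
ConflictMgmt   =~ co1 + co2 + co3
"""

def fit_icq15_cfa(df):
    model = semopy.Model(MODEL_DESC)
    model.fit(df)
    return semopy.calc_stats(model)   # fit indices such as CFI and RMSEA

# stats = fit_icq15_cfa(item_responses)
# print(stats[["CFI", "RMSEA"]])
```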
       
  • The Answer Is Blowing in the Wind
    • Abstract: This study examined the effects of weather on personality self-ratings. Single-assessment data were derived from the German General Social Survey conducted in 2008. For a subset of the participants (N = 478), official weather station data for the day a personality inventory was completed could be determined. Among these respondents, 140 (29%) completed the personality inventory on an unambiguously sunny day, 59 (12%) completed the measure on an unambiguously rainy day, and 279 (59%) completed the questionnaire on a day characterized by mixed weather conditions. Results revealed that self-ratings for some personality domains differed depending on the weather conditions on the day the inventory was completed. When compared with corresponding self-ratings collected under mixed weather conditions, ratings for the Big Five dimension of Openness to Experience were significantly lower on rainy days and ratings for Conscientiousness were significantly lower on sunny days. These results are suggestive of some limitations on the assumed situational independence of trait ratings.
      Content Type Journal Article
      Category Original Article
      Pages 1-7

      DOI 10.1027/1015-5759/a000236

      Authors
      Beatrice Rammstedt, GESIS – Leibniz Institute for the Social Sciences, Mannheim, Germany
      Michael Mutz, Georg-August-University Göttingen, Germany
      Richard F. Farmer, Oregon Research Institute, Eugene, OR, USA
      Journal European Journal of Psychological Assessment
      Online ISSN 2151-2426
      Print ISSN 1015-5759
      PubDate: Tue, 09 Dec 2014 13:26:41 GMT
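
The weather study above contrasts personality self-ratings across days classified as sunny, rainy, or mixed. As a loose illustration of that kind of group contrast (not the exact analysis reported), a Welch t-test between two weather groups could be run as follows; column names are hypothetical.

```python
# Compare a trait's self-ratings between two weather conditions with Welch's t-test.
import pandas as pd
from scipy import stats

def weather_contrast(df: pd.DataFrame, trait: str, a: str, b: str):
    """df needs a 'weather' column (e.g., 'sunny'/'rainy'/'mixed') and a trait column."""
    g1 = df.loc[df["weather"] == a, trait].dropna()
    g2 = df.loc[df["weather"] == b, trait].dropna()
    t, p = stats.ttest_ind(g1, g2, equal_var=False)   # Welch's t-test
    return t, p, g1.mean() - g2.mean()

# t, p, diff = weather_contrast(survey, "openness", "rainy", "mixed")
```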
       
  • Experience and Diagnostic Anchors in Referral Letters
    • Abstract: The present study investigated whether diagnostic anchors, that is, diagnoses suggested in referral letters, influence judgments made by clinical psychologists with different levels of experience. Moderately experienced clinicians (n = 98) and very experienced clinicians (n = 126) were randomly assigned to reading a referral letter suggesting either depression or anxiety, or no referral letter. They then read a psychiatric report about a depressed patient and gave a preliminary and a final diagnosis. Results showed that the correctness of the diagnoses by very experienced clinicians was unaffected by the referral diagnosis. Moderately experienced clinicians did use the suggested diagnosis as an anchor: when they had read a referral letter suggesting depressive complaints, they were more inclined to classify the patient with a depressive disorder. In conclusion, the diagnosis in a referral letter influences the diagnostic decision made by moderately experienced clinicians.
      Content Type Journal Article
      Category Original Article
      Pages 1-7

      DOI 10.1027/1015-5759/a000235

      Authors
      Nanon L. Spaanjaars, Diagnostic Decision Making, Behavioural Science Institute, Radboud University Nijmegen, The Netherlands
      Marleen Groenier, Instructional Technology, University of Twente, Enschede, The Netherlands
      Monique O. M. van de Ven, Diagnostic Decision Making, Behavioural Science Institute, Radboud University Nijmegen, The Netherlands
      Cilia L. M. Witteman, Diagnostic Decision Making, Behavioural Science Institute, Radboud University Nijmegen, The Netherlands
      Journal European Journal of Psychological Assessment
      Online ISSN 2151-2426
      Print ISSN 1015-5759
      PubDate: Tue, 09 Dec 2014 13:26:41 GMT
       
  • Factor Structure of the Ruminative Responses Scale
    • Abstract: The 10-item Ruminative Responses Scale is used to measure two facets of rumination: brooding and reflection. These subscales are used to seek differential correlations with other variables of interest (e.g., depression). The validity of these facets, however, is questionable because brooding and reflection were distinguished based on factor analyses, but subsequent analyses have been inconsistent. We investigated these facets using factor analyses in a large community-based sample (N = 625). Other measures of rumination and depression were used as criteria for validity analyses. Only the brooding items formed a robust scale. A consistent reflection factor did not emerge. Brooding showed convergent validity with other measures of rumination as well as depression, all rs > .4. Brooding was also higher among participants with a history of depression compared with never-depressed participants. Implications for the interpretation of past research and for conducting future research are discussed.
      Content Type Journal Article
      Category Original Article
      Pages 1-7

      DOI 10.1027/1015-5759/a000231

      Authors
      James W. Griffith, Department of Medical Social Sciences, Northwestern University, Chicago, IL, USA
      Filip Raes, Centre for Learning and Experimental Psychopathology, KU Leuven, Belgium
      Journal European Journal of Psychological Assessment
      Online ISSN 2151-2426
      Print ISSN 1015-5759
      PubDate: Tue, 09 Dec 2014 13:26:41 GMT
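
The Ruminative Responses Scale study above rests on exploratory factor analysis of the ten items to probe the brooding and reflection facets. A hedged Python sketch of a two-factor EFA with oblique rotation, plus a sampling-adequacy check, is given below; the item column names are hypothetical and the published analyses went further.

```python
# Two-factor exploratory factor analysis of ten rumination items with an
# oblique (oblimin) rotation, preceded by a KMO sampling-adequacy check.
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo

def rrs_efa(items: pd.DataFrame) -> pd.DataFrame:
    _, kmo_overall = calculate_kmo(items)
    print(f"KMO measure of sampling adequacy: {kmo_overall:.2f}")
    fa = FactorAnalyzer(n_factors=2, rotation="oblimin")
    fa.fit(items)
    return pd.DataFrame(fa.loadings_, index=items.columns,
                        columns=["Factor1", "Factor2"])

# loadings = rrs_efa(df[[f"rrs{i}" for i in range(1, 11)]])
```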
       
  • Stop and State Your Intentions!
    • Abstract: Stop and State Your Intentions!
      Content Type Journal Article
      Category Editorial
      Pages 239-242

      DOI 10.1027/1015-5759/a000228

      Authors
      Matthias Ziegler, Humboldt-Universität zu Berlin, Germany
      Journal European Journal of Psychological Assessment
      Online ISSN 2151-2426
      Print ISSN 1015-5759
      Journal Volume Volume 30
      Journal Issue Volume 30, Number 4 / 2014
      PubDate: Fri, 07 Nov 2014 13:58:12 GMT
       
  • Ad Hoc Reviewers 2014
    • Abstract: Ad Hoc Reviewers 2014
      Content Type Journal Article
      Category Volume Information
      Pages 315-316

      DOI 10.1027/1015-5759/a000229
      Journal European Journal of Psychological Assessment
      Online ISSN 2151-2426
      Print ISSN 1015-5759
      Journal Volume Volume 30
      Journal Issue Volume 30, Number 4 / 2014
      PubDate: Fri, 07 Nov 2014 13:58:12 GMT
       
 
 