Editorial

Planning a Career in Psychological Assessment

Lessons Learned From the EAPA Winter School 2021

Published Online: https://doi.org/10.1027/1015-5759/a000666

Two aims that are explicitly formulated in the mission statement of the European Association of Psychological Assessment (EAPA) are to “promote the interest of young professionals and scientists in the science of psychological assessment” and to “create opportunities for scientific exchange about psychological assessment” (EAPA, 2021). In an effort to address these goals, EAPA held its first-ever Winter School¹, bringing early career researchers (ECRs) and experts together. The school was hosted by the Martin Luther University Halle-Wittenberg on May 7, 2021 (online due to the pandemic situation). In this Editorial, we highlight some of the issues discussed as well as the advice and concerns expressed about what to consider when planning a career in psychological assessment. While this cannot be a comprehensive guide, it should be seen as a starting point for a more extensive discussion aimed at helping emerging scholars in the field. Of note, the European Journal of Psychological Assessment (EJPA), as the flagship journal of EAPA, often receives submissions from ECRs (as first authors) and shares EAPA’s commitment to supporting them through high-quality and swift review processes. We hope that this Editorial serves as an additional building block in supporting young scholars at the beginning of their careers.

Thinking Big, Being Innovative, and Planning Ahead While Being Employed on a Short-Term Contract

ECRs often face a paradoxical situation: They are expected to “think big” when planning new research projects, be innovative, follow a long-term plan (e.g., establishing a longitudinal study), and address important societal issues, all while working on short-term or temporary contracts that offer few guarantees for the future. Added to this is the pressure to publish work that is both numerous and of high quality. ECRs are expected to do all this while, among other things, improving their writing skills, attending conferences, networking, and working on grant applications (and frequently also having to manage teaching at an advanced level and supervising undergraduate students) – not to speak of having a life outside of academia and work (just as a thought: how is one supposed to start a family amid all these uncertainties?). Hence, one important consideration is how to balance smaller and larger research projects and publication strategies. In psychological assessment, projects can range from basic work on test adaptations, improvements and revisions of existing measures, and the creation of test norms to contributions on methodological issues – to name but a few. Working in a team and having good mentors helps with deciding how to select publishable material and how to combine findings into important, impactful publications that contribute both to knowledge and to the researcher’s profile.

Of course, “thinking big” is a long-term investment. Being innovative and helping to resolve bigger issues will support networking efforts, increase chances of getting funding, and enable follow-up research. Also, the so-called “replication crisis,” which has hit psychology particularly hard (or at least first, while other disciplines have yet to re-evaluate their practices; e.g., Shrout & Rodgers, 2018), has shown the merit of replicating and extending existing work. These merits seem to be better understood in the field of psychological assessment: Most specialized journals require evidence for the stability and replicability of findings (e.g., cross-validation with a second, independently collected sample). Taken together, working on a portfolio that covers basic and applied work around test development and evaluation, while also attempting to solve more theoretical and complex research questions, is a good foundation for a career in psychological assessment. In short, ECRs are advised not to “put all of their eggs in one basket” but to pursue parallel projects that minimize the risk of depending on the success of a single study or project. We welcome hiring committees re-evaluating current standards (e.g., rating the quality of an applicant’s publications higher than their mere number) and also weighing factors such as the innovativeness and future-mindedness of an applicant more strongly. Such practices would probably also encourage ECRs to take more risks with larger ideas, rather than reiterating what has been successful for others.
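
To make the cross-validation example above concrete, the following minimal Python sketch fits the same exploratory factor model in two halves of a sample and checks how well the factor loadings replicate. Everything in it is an illustrative assumption on our part – the simulated data, the factor_analyzer package, and the tucker_phi helper – rather than a procedure prescribed at the Winter School.

```python
# A minimal sketch of split-sample cross-validation of a factor structure.
# The simulated data, package choice, and helper function are illustrative
# assumptions, not a prescribed procedure.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer

rng = np.random.default_rng(seed=42)

# Simulated stand-in for real questionnaire data:
# 600 persons, 8 items, 2 latent factors.
latent = rng.normal(size=(600, 2))
true_loadings = np.array([[0.7, 0.0]] * 4 + [[0.0, 0.7]] * 4)
items = pd.DataFrame(latent @ true_loadings.T
                     + rng.normal(scale=0.5, size=(600, 8)))

# Split the sample into a calibration half and a validation half.
calibration = items.sample(frac=0.5, random_state=1)
validation = items.drop(calibration.index)

# Fit the same exploratory factor model independently in both halves.
fa_cal = FactorAnalyzer(n_factors=2, rotation="varimax")
fa_cal.fit(calibration)
fa_val = FactorAnalyzer(n_factors=2, rotation="varimax")
fa_val.fit(validation)

def tucker_phi(a, b):
    """Tucker's congruence coefficient between two loading vectors."""
    return a @ b / np.sqrt((a @ a) * (b @ b))

# Factor order and sign are arbitrary across samples, so compare each
# calibration factor with its best-matching validation factor.
for k in range(2):
    phis = [abs(tucker_phi(fa_cal.loadings_[:, k], fa_val.loadings_[:, j]))
            for j in range(2)]
    print(f"Factor {k + 1}: congruence with best match = {max(phis):.3f}")
```

Congruence coefficients close to 1 are conventionally read as evidence that a factor replicates across the two halves; an independently collected second sample makes for a stronger test than a random split.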

Diversifying

Tenure-track positions or (full) professorships dedicated to psychological assessment are comparatively rare. For example, the German Psychological Association’s (DGPs) chapter on personality, individual differences, and psychological assessment maintains a list of all chairs in this field in the German-speaking countries; as of May 2021, only 7 out of 108 university professorships are specifically dedicated to psychological assessment, while the others are combined with adjacent disciplines, mostly personality or educational psychology, individual differences, organizational psychology, or counseling psychology. Hence, candidates typically have to demonstrate expertise (usually via publications) in another (substantive) field to be eligible for a chair in psychological assessment. The experts’ experiences at the EAPA Winter School converged on this point, independently of their academic home country.

ECRs also need to diversify in other areas, for example, in the methods they use for their research. This notion will be familiar to those working in the field of psychological assessment, as they are used to pursuing multi-method approaches and know about the limitations of a mono-method approach. Although most advanced researchers develop preferences for certain approaches, they make sure that their research does not merely repeat a once-established assessment and analysis method.

Theory Matters

A shared concern is that new measures in the field of psychological assessment sometimes lack theoretical justification and that their nomological net seems somewhat under-developed. This is, in fact, an issue with some submissions reporting the development of new assessment tools that EJPA receives. While one could argue that assessment journals and their readership are primarily interested in the technical details of analyses and findings, there is a desire for more information on the theoretical background of the psychological constructs assessed. Providing the theoretical background, prior findings on the theoretical and empirical nomological net, and distinctions from previously existing instruments is important because these components provide evidence for (a) the need for a potentially new measure, (b) the absence of redundancy with existing measures, and (c) the avoidance of wasted resources in terms of the time of test takers and researchers alike (see, e.g., Ziegler, 2020). Publishing such information routinely will not only improve understanding of the constructs assessed but also prevent jingle-jangle fallacies and the risk of presenting old wine in new bottles (e.g., Condon et al., 2020; Ponnock et al., 2020). Additionally, it makes it easier to interpret findings substantively. Of course, the addition of theoretical background needs to be balanced against word limits and the requirement that all technical information be reported, but repositories and electronic supplementary materials provide ample space for parts of a manuscript that are less central to its main aim.

Mentoring and (Scientific) Autonomy

When it comes to career planning, the mentoring process is key. Establishing a Winter School for postdocs and a Summer School for PhD students are some of the initiatives set up by EAPA to improve networking with an emphasis on psychological assessment and to support ECRs in this regard. ECRs are particularly encouraged to develop their professional contacts, work on their networks, and engage actively with mentors, peers, and other researchers in the field. The EAPA will actively work on improving networking opportunities; the Winter School might be a starting point that, hopefully, results in more long-term mentoring efforts. Indeed, the first Winter School has already resulted in promising contacts and enhanced networking for the ECRs. The challenges mentioned earlier can be addressed with the help of more senior researchers who act as mentors and let junior team members actively contribute to research conducted in the lab. However, postdocs also need to demonstrate autonomy and visibility in the field. This can be done by working with co-authors other than the head of the lab to build one’s own network, as well as by finding a niche, for example, by becoming an expert in a certain topic. We argue that the field of psychological assessment is well-suited for ECRs to find open questions, whether by studying methods of assessment (e.g., Situational Judgment Tests or the use of process data), addressing methodological considerations such as factor-analytic questions or test takers’ responses, providing knowledge on the assessment of specific psychological constructs of broad (e.g., general intelligence; Big Five personality traits) or narrow (e.g., attention; need for cognition) individual differences variables, testing different approaches to assessing them (e.g., self- and informant reports, behavioral traces from smartphone data, or non-verbal tasks), or providing information on the reliability and validity of these assessment approaches. There is also space for basic research; for example, improving decision making when writing psychological reports or finding new ways to improve the norming of existing measures seem to be promising fields for further basic work.

Know Your Methods (and More)

Psychological assessment has strong roots in data analysis and in methodological and analytic expertise. However, new ways of collecting and analyzing data are progressing rapidly, particularly with the rise of “big data,” artificial intelligence, and machine learning (e.g., Cheung & Jak, 2018; Harari et al., 2020). Interestingly, a recent sentiment analysis of psychologists’ Twitter accounts showed an increasing interest in the topics of “Data Science,” “Artificial Intelligence,” and “Machine Learning,” whereas traditional topics of psychological assessment (e.g., terms such as validity or reliability or “classical” psychometric analyses) did not emerge as “hot topics” (Bittermann et al., 2021). It must be noted that these approaches come with specific challenges with respect to ethical considerations (e.g., data protection and consent for the use of every individual’s data) that warrant broader discussion (e.g., Favaretto et al., 2020). Despite all the opportunities that “big data” bring, basic ethical considerations must be weighed against the potential gains. Without singling out any one analytic approach, our recommendation is to be open to new data-analytic approaches, to learn programming, and to keep a finger on the pulse of new methods. Combining and integrating new and more established modes of analysis and sources of data in a meaningful way, and learning about new ways of analyzing these types of data, will also be an advantage (see the sketch below). In addition, the many large-scale panel datasets that already exist might give ECRs the opportunity to conduct secondary analyses that contribute to their research program. However, no fancy analysis method can compensate for a weak research design, psychometrically unsound instruments, or biased sampling; and using a new method only because it is available (i.e., for its own sake), without theoretical justification, does not add to the generation of new knowledge. Therefore, our experts recommend mastering the basics of study design, analysis, and interpretation while remaining open to new methodological approaches.
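
As a purely hypothetical illustration of pairing an established psychometric check with a newer machine-learning tool, consider the following Python sketch. The simulated data, the scikit-learn model, and the hand-rolled Cronbach’s alpha are our own assumptions for illustration, not methods endorsed at the Winter School.

```python
# A minimal sketch: check scale reliability with a classical index before
# feeding the items into a machine-learning model. Simulated data and the
# chosen scikit-learn model are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(seed=7)

# Simulated stand-in: 500 persons, 10 items of one scale, one outcome.
true_score = rng.normal(size=500)
items = true_score[:, None] * 0.7 + rng.normal(scale=0.6, size=(500, 10))
outcome = 0.5 * true_score + rng.normal(scale=1.0, size=500)

def cronbach_alpha(item_scores):
    """Classical internal-consistency estimate (Cronbach's alpha)."""
    k = item_scores.shape[1]
    item_vars = item_scores.var(axis=0, ddof=1).sum()
    total_var = item_scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Step 1: classical psychometrics - is the measure sound to begin with?
print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")

# Step 2: a newer tool - cross-validated machine-learning prediction.
model = RandomForestRegressor(n_estimators=200, random_state=0)
r2_scores = cross_val_score(model, items, outcome, cv=5, scoring="r2")
print(f"Cross-validated R^2: {r2_scores.mean():.2f}")
# No choice of model rescues the analysis if the reliability check (or
# other validity evidence) indicates the instrument is unsound.
```

The ordering is the point: the reliability check comes first, because even an impressive cross-validated R² would be uninterpretable if the instrument feeding the model were psychometrically unsound.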

Embrace Transparency and Open Science

The replication crisis has affected publishing in the field of psychological assessment similarly to other disciplines with regard to efforts to make research more transparent. For example, both EAPA outlets (i.e., EJPA and Psychological Test Adaptation and Development [PTAD]) and outlets from our neighboring field of personality psychology (e.g., the European Association for Personality Psychology’s European Journal of Personality and Personality Science) have implemented and promoted open science practices to increase the transparency of published studies (Back, 2019; Greiff & Allen, 2018; Greiff et al., 2020; Rauthmann, 2020; Ziegler, 2020). These initiatives require authors to make their data, code and syntax, and materials openly available in independent repositories (e.g., the Open Science Framework). In particular, ECRs are encouraged to follow the three main pillars of the open science movement: open data (i.e., providing the raw data and syntax/code), open materials (i.e., providing questionnaires and instructions and specifying analysis and experimental software), and pre-registration (i.e., stating hypotheses and an analysis plan before data analysis, or optimally data collection, begins). These strategies increase the transparency of research and support independent researchers’ endeavors to replicate findings and extend existing knowledge (e.g., when testing measurement invariance between language adaptations of the same measure or when comparing existing data with newly collected data). However, these efforts need to be balanced against limiting factors: Copyrighted instruments cannot be made openly available, participants’ sensitive personal data must be protected (e.g., by not providing data that allow identification of participants, such as email or IP addresses), and missing a priori knowledge can constrain pre-registrations (e.g., how should one specify expected fit indices when testing a newly introduced instrument?). Also, registered reports, in which authors submit their research proposal including a theoretical rationale and data analysis plans, have received increasing interest in the community and are increasingly visible in the published output (e.g., Mehlenbacher, 2019), as observed in the EAPA journals. Universities, too, increasingly and explicitly expect active engagement in open science practices from applicants for (full) professorships. Therefore, it is highly recommended that ECRs engage in open science practices and contribute to transparency and reproducibility in the field.
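
To unpack the measurement-invariance example in passing, one common formalization (standard multi-group confirmatory factor analysis notation; our addition, not drawn from the Winter School discussions) reads:

```latex
% Linear factor model for item i in group g (e.g., a language version):
x_{ig} = \tau_{ig} + \lambda_{ig}\,\eta_{g} + \varepsilon_{ig}
% Configural invariance: the same pattern of loadings holds in every group.
% Metric invariance additionally constrains the loadings to be equal:
\lambda_{ig} = \lambda_{i} \quad \text{for all } g
% Scalar invariance additionally constrains the intercepts:
\tau_{ig} = \tau_{i} \quad \text{for all } g
```

Only under (at least partial) scalar invariance can latent means be compared meaningfully across language adaptations – one reason why openly shared data, code, and materials are so valuable for such re-analyses.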

Learn to Communicate with Practitioners

Psychological assessment as a field is strongly tied to the practical use of measurement instruments: Newly developed or revised instruments must meet the demands of practitioners. Learning how to develop test materials so that they can be applied with ease, to write test manuals in a way that test users can understand, and to enable useful interpretations of the data is key to successful communication with practitioners. Practice can also inform research; researchers might learn from practitioners in which areas further research or new tools are needed. Ensuring a mutual exchange allows for the development of practice-friendly tools that might help solve “real-world” problems. Such feedback may also point to further areas of application (e.g., when a measure exists for adults only, practitioners can highlight the importance of parallel versions for adolescents or other specific target groups). Finally, applied areas such as work with special-needs target groups potentially require even closer collaboration because of the additional demands that need to be met in these cases.

Aside from basic work, ECRs are advised to always keep in mind the practical use of psychological assessment in research and other applications. This can also help with, for example, expanding normative samples or generating additional data for further psychometric analyses.

Conclusion

The EAPA wishes to be strongly involved in supporting the professional development of young researchers. By organizing winter/summer schools and conferences, it hopes to give these young researchers access to an international forum in which to develop their skills and academic networks. Psychological assessment has – alongside psychology as a whole – changed rapidly in the past decades and will, hopefully, continue on this trajectory (e.g., toward more transparency and replicability). Of course, this development comes with limited predictability of career-decision outcomes, and any recommendations for future career planning based on past experiences have their limitations. Nonetheless, we think that embracing recent developments (e.g., regarding transparency and new methodological advances), while also keeping in mind the more traditional merits of psychological assessment (e.g., multi-method approaches and a focus on validity), is a strong recommendation for all aspiring researchers. Our ability to assess individuals, their development, and their interactions with their contexts remains crucial for the development of psychology as a science. Our societies will have to face many challenges, and psychology may contribute by providing important input and advice. For example, to address the challenge of lowering our ecological footprint, being able to assess, follow, and document changes in human behavior is crucial. We hope that our contribution to the professional development of young researchers will help them make socially meaningful contributions in the future.

Experts participating in the 1st EAPA Winter School received an honorarium paid by Hogrefe. The hosts are grateful for this financial support in realizing the event.

¹Skepticism about holding a Winter School in May can be alleviated: The temperature in Halle was 6 degrees Celsius, providing suitable ambience to rightfully call it a Winter School.

References

  • Back, M. D. (2019). Increasing scientific quality in the expanding field of personality science. European Journal of Personality, 33(1), 3–6. https://doi.org/10.1002/per.2192

  • Bittermann, A., Batzdorfer, V., Müller, S. M., & Steinmetz, H. (2021). Mining Twitter to detect hotspots in psychology. Zeitschrift für Psychologie, 229(1), 3–14. https://doi.org/10.1027/2151-2604/a000437

  • Cheung, M. W.-L., & Jak, S. (2018). Challenges of big data analyses and applications in psychology. Zeitschrift für Psychologie, 226(4), 209–211. https://doi.org/10.1027/2151-2604/a000348

  • Condon, D. M., Wood, D., Mõttus, R., Booth, T., Costantini, G., Greiff, S., Johnson, W., Lukaszewski, A., Murray, A., Revelle, W., Wright, A. G. C., Ziegler, M., & Zimmermann, J. (2020). Bottom up construction of a personality taxonomy. European Journal of Psychological Assessment, 36(6), 923–934. https://doi.org/10.1027/1015-5759/a000626

  • EAPA. (2021, May 1). Our mission. https://www.eapa.science/about-eapa/our-mission

  • Favaretto, M., De Clercq, E., Gaab, J., & Elger, B. S. (2020). First do no harm: An exploration of researchers’ ethics of conduct in Big Data behavioral studies. PLoS One, 15(11), Article e0241865. https://doi.org/10.1371/journal.pone.0241865

  • Greiff, S., & Allen, M. S. (2018). EJPA introduces registered reports as new submission format. European Journal of Psychological Assessment, 34(4), 217–219. https://doi.org/10.1027/1015-5759/a000492

  • Greiff, S., Van der Westhuizen, L., Mund, M., Rauthmann, J. F., & Wetzel, E. (2020). Introducing new open science practices at EJPA. European Journal of Psychological Assessment, 36(5), 717–720. https://doi.org/10.1027/1015-5759/a000628

  • Harari, G. M., Vaid, S. S., Müller, S. R., Stachl, C., Marrero, Z., Schoedel, R., Bühner, M., & Gosling, S. D. (2020). Personality sensing for theory development and assessment in the digital age. European Journal of Personality, 34(5), 649–669. https://doi.org/10.1002/per.2273

  • Mehlenbacher, A. R. (2019). Registered reports: Genre evolution and the research article. Written Communication, 36(1), 38–67. https://doi.org/10.1177/0741088318804534

  • Ponnock, A., Muenks, K., Morell, M., Yang, J. S., Gladstone, J. R., & Wigfield, A. (2020). Grit and conscientiousness: Another jangle fallacy. Journal of Research in Personality, 89, Article 104021. https://doi.org/10.1016/j.jrp.2020.104021

  • Rauthmann, J. F. (2020). Personality science: Unified in diversity. Personality Science, 1(1), Article e5297. https://doi.org/10.5964/ps.5297

  • Shrout, P. E., & Rodgers, J. L. (2018). Psychology, science, and knowledge construction: Broadening perspectives from the replication crisis. Annual Review of Psychology, 69, 487–510. https://doi.org/10.1146/annurev-psych-122216-011845

  • Ziegler, M. (2020). Psychological test adaptation and development – How papers are structured and why. Psychological Test Adaptation and Development, 1, 1–9. https://doi.org/10.1027/2698-1866/a000002