Do social sciences and humanities behave like life and hard sciences?

Published in: Scientometrics

Abstract

The quantitative evaluation of the Social Sciences and Humanities (SSH) and the investigation of similarities between SSH and the Life and Hard Sciences (LHS) are at the forefront of scientometrics research. We analyse the scientific production of the universe of Italian academic scholars over the 10-year period 2002–2012, using a national database built by the Italian National Agency for the Evaluation of Universities and Research Institutes (ANVUR). We show that all Italian scholars in SSH and LHS are alike as far as their publishing habits are concerned: they share the same general law, a lognormal distribution. At the same time, they differ, because we measured their scientific production with the different indicators required by Italian law; after eliminating the "silent" scholars, we obtained different scaling values, which serve as proxies of their productivity rates. Our findings may be useful for developing indirect quali-quantitative comparative analyses across heterogeneous disciplines and, more broadly, for investigating the generative mechanisms behind the observed empirical regularities.
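The lognormal regularity reported above can be illustrated with a minimal sketch. The data below are synthetic (the ANVUR records are not public) and the parameters mu = 1.5, sigma = 0.8 are assumptions chosen for the example; the point is only that if per-scholar output X is lognormal, then log X is normal, so the parameters can be estimated directly from the logarithms.

```python
import numpy as np

# Synthetic per-scholar publication scores, standing in for the
# (non-public) ANVUR data; true parameters are mu = 1.5, sigma = 0.8.
rng = np.random.default_rng(42)
counts = rng.lognormal(mean=1.5, sigma=0.8, size=5000)

# If X is lognormal, log(X) is normal, so mu and sigma are estimated
# directly from the logarithms of the data.
logs = np.log(counts)
mu_hat, sigma_hat = logs.mean(), logs.std(ddof=1)
print(round(mu_hat, 2), round(sigma_hat, 2))
```

With a sample of this size the estimates recover the generating parameters closely, which is the kind of fit-and-check the paper performs, per discipline, on the real indicator values.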

Notes

  1. See http://www.scimagojr.com/countryrank.php?year=2015 (last accessed 15 December 2016). For a bibliometric analysis of Italian science, see also Daraio and Moed (2011).

References

  • Archambault, É., Vignola-Gagné, É., Côté, G., Larivière, V., & Gingras, Y. (2006). Benchmarking scientific output in the social sciences and humanities: The limits of existing databases. Scientometrics, 68(3), 329–342.

  • Ardanuy, J., Urbano, C., & Quintana, L. (2009). A citation analysis of Catalan literary studies (1974–2003): Towards a bibliometrics of humanities studies in minority languages. Scientometrics, 81(2), 347–366.

  • Cartlidge, E. (2010). Italian Parliament passes Controversial University reforms. Science, 330, 1462–1463.

  • Daraio, C., & Moed, H. F. (2011). Is Italian science declining? Research Policy, 40(10), 1380–1392.

  • Deville, P., Wang, D., Sinatra, R., Song, C., Blondel, V. D., & Barabási, A. L. (2014). Career on the move: Geography, stratification, and scientific impact. Scientific Reports, 4, 4770.

  • Egghe, L., & Rousseau, R. (1990). Introduction to informetrics. Quantitative methods in library, documentation and information science. Amsterdam: Elsevier.

  • Egghe, L., & Rousseau, R. (1996). Stochastic processes determined by a general success-breeds-success principle. Mathematical and Computer Modelling, 23(4), 93–104.

  • Evans, T. S., Hopkins, N., & Kaube, B. S. (2012). Universality of performance indicators based on citation and reference counts. Scientometrics, 93, 473–495.

  • Fanelli, D., & Glänzel, W. (2013). Bibliometric evidence for a hierarchy of the sciences. PLoS ONE, 8(6), e66938.

  • Ferrara, A., & Bonaccorsi, A. (2016). How robust is journal rating in Humanities and Social Science? Evidence from a large-scale, multi-method exercise. Research Evaluation, February 2016. doi:10.1093/reseval/rvv048.

  • Giménez-Toledo, E., Mañana-Rodríguez, J., Engels, T. C., Ingwersen, P., Pölönen, J., Sivertsen, G., et al. (2016). Taking scholarly books into account: current developments in five European countries. Scientometrics, 107(2), 685–699.

  • Guimera, R., Uzzi, B., Spiro, J., & Amaral, L. A. N. (2005). Team assembly mechanisms determine collaboration network structure and team performance. Science, 308(5722), 697–702.

  • Hicks, D. (2004). The four literatures of social science. In H. Moed, W. Glanzel, & U. Schmoch (Eds.), Handbook of quantitative science and technology studies (pp. 473–496). Dordrecht: Kluwer Academic Publishers.

  • Hicks, D., Wouters, P., Waltman, L., De Rijcke, S., & Rafols, I. (2015). Bibliometrics: The Leiden Manifesto for research metrics. Nature, 520, 429–431.

  • Huang, M., & Chang, Y. (2008). Characteristics of research output in social sciences and humanities: From a research evaluation perspective. Journal of The American Society for Information Science and Technology, 59(11), 1819–1828.

  • Jaffe, K. (2014). Social and natural sciences differ in their research strategies, adapted to work for different knowledge landscapes. PLoS ONE, 9(11), e113901.

  • Limpert, E., Stahel, W. A., & Abbt, M. (2001). Log-normal distributions across the sciences: Keys and clues. BioScience, 51(5), 341–352.

  • Linmans, A. J. M. (2010). Why with bibliometrics the Humanities does not need to be the weakest link: Indicators for research evaluation based on citations, library holdings, and productivity measures. Scientometrics, 83, 337–354.

  • Lotka, A. J. (1926). The frequency distribution of scientific productivity. Journal of the Washington Academy of Sciences, 16, 317–323.

  • Martinez-Mekler, G., Martinez, R. A., del Rio, M. B., Mansilla, R., Miramontes, P., & Cocho, G. (2009). Universality of rank-ordering distributions in the arts and sciences. PLoS ONE, 4(3), e4791.

  • Merton, R. K. (1968). The Matthew effect in science. Science, 159(3810), 56–63.

  • Moed, H. F., & Van Leeuwen, T. N. (1996). Impact factors can mislead. Nature, 381(6579), 186–186.

  • Moed, H. F. (2005). Citation analysis in research evaluation. Dordrecht: Springer.

  • Mohammadi, E., & Thelwall, M. (2014). Mendeley readership altmetrics for the social sciences and humanities: Research evaluation and knowledge flows. Journal of the Association for Information Science and Technology, 65(8), 1627–1638.

  • Nederhof, A. J. (2006). Bibliometric monitoring of research performance in the social sciences and the humanities: A review. Scientometrics, 66, 81–100.

  • Norris, M., & Oppenheim, C. (2007). Comparing alternatives to the Web of Science for coverage of the social sciences’ literature. Journal of Informetrics, 1(2), 161–169.

  • Owens, B. (2013). Judgement day. Nature, 502, 288–290.

  • Press, W. H., Teukolsky, S. A., Vetterling, W. T., & Flannery, B. P. (2007). Numerical recipes 3rd edition: The art of scientific computing. New York: Cambridge University Press.

  • Radicchi, F., Fortunato, S., & Castellano, C. (2008). Universality of citation distributions: Toward an objective measure of scientific impact. Proceedings of the National Academy of Sciences of the United States of America, 105, 17268–17272.

  • Rørstad, K., & Aksnes, D. W. (2015). Publication rate expressed by age, gender and academic position: A large-scale analysis of Norwegian academic staff. Journal of Informetrics, 9(2), 317–333.

  • Ruocco, G., & Daraio, C. (2013). An empirical approach to compare the performance of heterogeneous academic fields. Scientometrics, 97, 601–625.

  • Seglen, P. (1992). The skewness of science. Journal of the American Society for Information Science, 43, 628–638.

  • Stringer, M. J., Sales-Pardo, M., & Amaral, L. A. N. (2008). Effectiveness of journal ranking schemes as a tool for locating information. PLoS ONE, 3(2), e1683.

  • Stringer, M. J., Sales-Pardo, M., & Amaral, L. A. N. (2010). Statistical validation of a global model for the distribution of the ultimate number of citations accrued by papers published in a scientific journal. Journal of the American Society for Information Science and Technology, 61(7), 1377–1385.

  • Torres-Salinas, D., & Moed, H. F. (2009). Library catalog analysis as a tool in studies of social sciences and humanities: An exploratory study of published book titles in economics. Journal of Informetrics, 3, 9–26.

  • Uzzi, B., & Spiro, J. (2005). Collaboration and creativity: The small world problem. American Journal of Sociology, 111(2), 447–504.

  • Uzzi, B., Mukherjee, S., Stringer, M., & Jones, B. (2013). Atypical combinations and scientific impact. Science, 342(6157), 468–472.

  • van Leeuwen, T. (2006). The application of bibliometric analyses in the evaluation of social science research. Who benefits from it, and why it is still feasible. Scientometrics, 66, 133–154.

  • van Raan, A. F. (2006). Performance-related differences of bibliometric statistical properties of research groups: Cumulative advantages and hierarchically layered networks. Journal of the American Society for Information Science and Technology, 57(14), 1919–1935.

  • van Raan, A. F. J. (2008). Scaling rules in the science system: Influence of field-specific citation characteristics on the impact of research groups. Journal of the American Society for Information Science and Technology, 59(4), 565–576.

  • Verleysen, F. T., & Weeren, A. (2016). Clustering by publication patterns of senior authors in the social sciences and humanities. Journal of Informetrics, 10(1), 254–272.

  • Waltman, L., van Eck, N. J., & van Raan, A. F. J. (2012). Universality of citation distributions revisited. Journal of the American Society for Information Science and Technology, 63(1), 72–77.

  • White, H. D., Boell, S. K., Yu, H., Davis, M., Wilson, C. S., & Cole, F. T. H. (2009). Libcitations: A measure for comparative assessment of book publications in the humanities and social sciences. Journal of the American Society for Information Science and Technology, 60(6), 1083–1096.

  • Whitley, R. (2000). The intellectual and social organization of the sciences. New York: Oxford University Press.

  • Wuchty, S., Jones, B. F., & Uzzi, B. (2007). The increasing dominance of teams in production of knowledge. Science, 316(5827), 1036–1039.

  • Zuccala, A., & Cornacchia, R. (2016). Data matching, integration, and interoperability for a metric assessment of monographs. Scientometrics, 1–20.

  • Zuccala, A. (2013). Evaluating the humanities: Vitalizing 'the forgotten sciences'. Research Trends, 32, 3–6.

Acknowledgements

We thank ANVUR for providing the data used in this study. We also thank Professors D. Checchi, A. Graziosi and M. Schaerf for useful discussions, and Dr. I. Bongioanni for preliminary data analysis.

Corresponding author

Correspondence to Giancarlo Ruocco.

Appendices

Appendix 1: List of disciplinary sectors (SSDs)

See Table 3.

Table 3 List of disciplinary sectors (SSDs)of the Italian academic system

Appendix 2: Sources and nature of data

Law no. 240/2010 radically changed the way Italian professors are promoted, by creating a dual system of National Scientific Habilitation and local recruitment. Ministerial Decree no. 76/2012 established that quantitative indicators should be used to qualify minimum requirements for candidates and delegated ANVUR to produce such indicators. In June 2012 ANVUR published on its website Deliberation no. 50/2012, which specified in great detail the procedures for the computation of indicators based on publications. Based on the legislation that mandated the general criteria for the procedure, ANVUR announced that the indicators would be constructed from the data entered by scholars in their personal homepage, called "loginmiur". The Deliberation asked all Italian scholars to update the data in their homepages before a deadline, following a set of instructions. These instructions clearly delimited the kinds of publications that scholars should include when uploading metadata. In all fields of the Humanities and Social Sciences the categories of publications were defined as follows: (a) books, including monographs, critical editions, and critical translations; (b) chapters of books and journal articles; (c) articles published in A-rated journals.

The dataset was extracted on July 14, 2012, from the personal homepages of all Italian professors and researchers. The homepages are managed by Cineca, a university consortium in charge of creating and maintaining large-scale computing facilities for scientific purposes, as well as information systems and services on behalf of the Ministry of University and Research and of universities. Homepages are self-administered by individual scholars and are not accessible externally. They are routinely used by the Ministry for the management of information related to academic staff, e.g. the submission of proposals for funding or of candidatures for committees or panels. With respect to the quality of the available data, the following remarks are relevant. Individual scholars are responsible for updating the information; there is no administrative consequence or sanction for failing to update it or for entering wrong information. The control Cineca exercises on data quality is a formal one: using automatic computer queries, Cineca ensures that the data uploaded by scholars are consistent with formal definitions. The formal controls include the following:

  • for books, the ISBN code is requested and the distinction between author and editor is recorded;

  • for journals, the ISSN code is requested;

  • all articles from journals are assigned to journals whose metadata are found in a Master list, managed by Cineca, based on controls of the titles;

  • if a new journal title is included in the homepage, the system does not record the metadata until the new journal is recognized and included in the Master list.
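The formal controls above amount to checksum and registry validation. As an illustrative sketch (not Cineca's actual code), the check digits that make ISBN-13 and ISSN codes formally verifiable can be tested as follows:

```python
def isbn13_is_valid(isbn: str) -> bool:
    """ISBN-13 checksum: digits weighted alternately 1 and 3 must sum
    to a multiple of 10."""
    digits = [c for c in isbn if c.isdigit()]
    if len(digits) != 13:
        return False
    total = sum((1 if i % 2 == 0 else 3) * int(d) for i, d in enumerate(digits))
    return total % 10 == 0


def issn_is_valid(issn: str) -> bool:
    """ISSN checksum: characters weighted 8 down to 1 must sum to a
    multiple of 11; a final 'X' stands for the value 10."""
    chars = issn.replace("-", "").upper()
    if len(chars) != 8:
        return False
    total = 0
    for i, c in enumerate(chars):
        if c == "X" and i == 7:
            value = 10
        elif c.isdigit():
            value = int(c)
        else:
            return False
        total += value * (8 - i)
    return total % 11 == 0
```

For example, `issn_is_valid("0138-9130")` (the print ISSN of Scientometrics) returns `True`, while a code with a corrupted digit fails the check; a registry lookup, like the Master list matching described above, is still needed to confirm that a formally valid code refers to a real publication.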

With respect to data quality, we must distinguish between books and journal articles. For books, the actual content of the category may include non-academic books. If a book has an ISBN code, its metadata are recorded in the homepage without filters, which means that the nature of the recorded book is left to the determination of scholars. While the homepage is clearly aimed at academic purposes, many scholars tend to use it as a personal repository for all their publications, including popular science, professional texts, books aimed at a political or general audience, and the like. Although Deliberation 50 asked scholars to upload only books of an academic type, it is impossible to estimate the inflation represented by non-academic books. By contrast, for journal articles there was a massive classification effort by ANVUR under the mandate of Decree 76/2012. In particular, among the journals included in personal homepages, ANVUR selected those considered "scientific" following the same Deliberation no. 50 and excluded non-scientific ones. Among scientific journals, a small set (approximately 12% of the total) was included in the category of A-rated journals. As a consequence, for journals the possibility of data inflation is almost non-existent, since all data have been filtered using formally approved lists of journals.

With respect to the issues of quality of data, it is useful to report two quality control procedures. These may give a hint of the magnitude of the problem of data quality, while they do not resolve the problem once and for all.

The first is a quality control of the data on books carried out by ANVUR during the procedure. Full professors who were denied access to the Habilitation panel in the first round could submit an appeal. After receiving an appeal, ANVUR re-examined all publications to verify whether there were errors or, most frequently, omissions (in particular, omissions of actually existing ISBN codes) in the application, so that the candidate actually met the thresholds and could eventually be admitted, or whether the rejection was confirmed. During the appeal procedures, the publications of more than 1500 Full professors were examined manually, of which 800 were from SSH. Contrary to initial expectations, cases in which the books declared in the homepages were clearly non-academic were extremely rare (among them, one case of a tourist guide was prominent, however). In almost all cases, controversies were related to the following issues: (1) date of publication (e.g. Professor X declares book Y as published in 2011, so that it is counted in the indicator, but it is then found that publication was postponed, or that the book is not yet published); (2) wrong ISBN (e.g. Professor J includes a number not corresponding to the title of the book, leaving open the question of whether it is a mistake or a fraud, i.e. the book does not have an ISBN code); (3) dubious academic qualification of the publisher (e.g. the publisher is not a member of the list maintained by the national publishers' association (AIE), or has no website or official address). It must be remembered that Italian legislation contains a provision for the precise qualification of a "scientific book", following a long-awaited initiative, established in 2009, to create a transparent and open public archive of scientific publications. However, this provision has never materialized, so the concept of "scientific book" remains empty of specific content. Nevertheless, almost all cases treated during the appeal procedure involved controversies over books whose academic content was beyond dispute. This does not exclude the possibility that non-academic books have inflated the overall number of books. In fact, appeal procedures were typically submitted by Full professors whose overall production fell below the thresholds by a small margin, so that recognizing even a small number of previously excluded publications might reverse the outcome. It is still possible that the lists of books of those admitted from the beginning include both scientific and non-scientific books.

The second quality procedure was put in place immediately after the publication of Deliberation no. 50, in order to check whether some senior scholars would cancel their full publication lists in protest, perhaps with the goal of diminishing the aggregate figures. To detect this possibility, ANVUR monitored the figures of individual scholars daily, using a routine algorithm developed by Cineca. In the period 15 June–14 July 2012 there were no instances of massive cancellations.
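The details of the Cineca routine are not described, but a daily monitoring step of this kind can be sketched as follows (hypothetical code and thresholds): compare consecutive snapshots of per-scholar record counts and flag any scholar whose list shrinks sharply from one day to the next.

```python
def flag_mass_cancellations(snapshots, threshold=0.5):
    """snapshots: list of dicts {scholar_id: record_count}, one per day.

    Returns (day_index, scholar_id) pairs where the count dropped by
    more than `threshold` (as a fraction) relative to the previous day.
    """
    flags = []
    for day in range(1, len(snapshots)):
        prev, curr = snapshots[day - 1], snapshots[day]
        for scholar, before in prev.items():
            after = curr.get(scholar, 0)  # a missing scholar counts as 0
            if before > 0 and (before - after) / before > threshold:
                flags.append((day, scholar))
    return flags


# Hypothetical snapshots: scholar "A" deletes most records on day 2.
days = [{"A": 40, "B": 25}, {"A": 41, "B": 25}, {"A": 3, "B": 26}]
print(flag_mass_cancellations(days))  # [(2, 'A')]
```

Normal day-to-day growth (as for scholar "B") is ignored; only large deletions trip the flag, which matches the stated goal of detecting protest cancellations rather than ordinary editing.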

Appendix 3: Detailed results

See Table 4.

Table 4 Detailed results for the LHS (a, c) and SSH (b, d) disciplinary sectors

Cite this article

Bonaccorsi, A., Daraio, C., Fantoni, S. et al. Do social sciences and humanities behave like life and hard sciences?. Scientometrics 112, 607–653 (2017). https://doi.org/10.1007/s11192-017-2384-0
