The field-standardized average impact of national research systems compared to world average: the case of Italy

Scientometrics

Abstract

The study presents a time-series analysis of the field-standardized average impact of Italian research compared to the world average. The approach is purely bibliometric, based on a census of the full scientific production of all Italian public research organizations active in 2001–2006 (hard sciences only). The analysis is conducted both at the sectoral level (aggregated, by scientific discipline and for single fields within disciplines) and at the organizational level (by type of organization and for single organizations). The essence of the methodology should be replicable in all other national contexts. It offers policy-makers and administrators support for strategic analysis aimed at identifying strengths and weaknesses of national research systems and institutions.


Notes

  1. This refers to “the conjecture that EU countries play a leading global role in terms of top-level scientific output, but lag behind in the ability of converting this strength into wealth-generating innovations” (Dosi et al. 2006).

  2. http://www.arwu.org/ (last accessed 18 Feb 2011).

  3. http://www.timeshighereducation.co.uk/world-university-rankings/ (last accessed 18 Feb 2011).

  4. http://ranking.heeact.edu.tw/en-us/2010/homepage/ (last accessed 18 Feb 2011).

  5. For example, an average impact per publication 10% above the world average could correspond to below-average productivity if average output per researcher in the field examined were less than 90.9% of the world average.
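
In symbols (our notation, not the paper's): with relative impact per publication I and relative output per researcher p, both expressed as ratios to the world average, relative productivity scales as their product:

```latex
% Relative productivity = relative impact per publication (I)
% times relative output per researcher (p). Note 5's example:
\[
  I \cdot p \;=\; 1.10 \times 0.909 \;\approx\; 1.00,
  \qquad\text{so } p < 0.909 \;\Longrightarrow\; I \cdot p < 1 .
\]
```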

  6. Mathematics; Physics; Chemistry; Earth and space sciences; Biology; Biomedical research; Clinical medicine; Engineering.

  7. The case of the University of Rome “Tor Vergata” offers an example of the need for, and the complexity of, this disambiguation work: the authors identified 150 different bibliometric addresses for this institution over the six years examined.

  8. Publications in multidisciplinary journals are assigned to all the fields associated with the respective journals, and standardization is carried out with respect to the average of the XCRs of the individual fields.
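
A plausible formalization of this rule, in our notation rather than the paper's: for a publication with c citations appearing in a multidisciplinary journal linked to fields j = 1, …, k,

```latex
% Standardized impact of a multidisciplinary publication:
% citations divided by the mean expected citation rate (XCR)
% over the k fields associated with the journal.
\[
  \frac{\mathit{Cites}}{\mathit{XCR}}
  \;=\;
  \frac{c}{\frac{1}{k}\sum_{j=1}^{k}\mathit{XCR}_{j}}
\]
```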

  9. Extracted from the Thomson Reuters Journal Citation Report, 2008.

  10. SCImago country rankings. http://www.scimagoir.com/ (last accessed 18 Feb 2011).

  11. Concentration indices, shown in brackets in Table 7, are a measure of association between two variables based on frequency data, varying around the neutral value of 1. For example, in biology, the value of 1.03 for universities is a ratio of two ratios: the share of universities’ publications among all Italian publications in biology (70.0%) divided by the share of universities’ publications among all Italian publications (67.9%).
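
Written out (our notation), with P counting publications:

```latex
% Concentration index for universities in biology: universities'
% share of Italian biology output relative to their share of all
% Italian output.
\[
  CI_{\mathrm{univ,bio}}
  = \frac{P_{\mathrm{univ,bio}}/P_{\mathrm{ITA,bio}}}
         {P_{\mathrm{univ}}/P_{\mathrm{ITA}}}
  = \frac{70.0\%}{67.9\%} \approx 1.03
\]
```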

  12. It should be noted that nine of the ten ranked institutions have quite few publications. As a consequence, the subset of their output appearing in top journals is very small, and the resulting Cites/JXCR ranking may hinge on one or very few articles, whose citations can determine the overall position. While a decision-maker may be interested in identifying the top research institutions regardless of their size, this potential bias can be avoided simply by raising the minimum threshold of published articles.
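
As a minimal illustration of the thresholding remedy in note 12 (organization names and data are hypothetical; this is a sketch, not the authors' code), the following ranks organizations by mean Cites/JXCR only after discarding those below a minimum publication count:

```python
from collections import defaultdict

# Hypothetical per-publication records: (organization, citations, JXCR),
# where JXCR is the expected citation rate of the publication's journal.
pubs = [
    ("Hospital A", 40.0, 8.0),
    ("Hospital A", 12.0, 8.0),
    ("Institute B", 95.0, 9.5),  # a single, highly cited article
]

def rank_by_cites_jxcr(pubs, min_pubs=50):
    """Rank organizations by mean Cites/JXCR, excluding small samples."""
    ratios = defaultdict(list)
    for org, cites, jxcr in pubs:
        ratios[org].append(cites / jxcr)
    # Raising min_pubs screens out rankings driven by one or two articles.
    scores = {org: sum(r) / len(r)
              for org, r in ratios.items()
              if len(r) >= min_pubs}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# With min_pubs=2, Institute B's lone article no longer dominates.
print(rank_by_cites_jxcr(pubs, min_pubs=2))  # [('Hospital A', 3.25)]
```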

References

  • Abramo, G., & D’Angelo, C. A. (2007). Measuring science: Irresistible temptations, easy shortcuts and dangerous consequences. Current Science, 93(6), 762–766.

  • Abramo, G., & D’Angelo, C. A. (2011). National-scale research performance assessment at the individual level. Scientometrics, 86(2), 347–364.

  • Abramo, G., D’Angelo, C. A., & Di Costa, F. (2008). Assessment of sectoral aggregation distortion in research productivity measurements. Research Evaluation, 17(2), 111–121. doi:10.3152/095820208X280916.

  • Abramo, G., D’Angelo, C. A., & Di Costa, F. (2010). Testing the trade-off between productivity and quality in research activities. Journal of the American Society for Information Science and Technology, 61(1), 132–140.

  • Braun, T., Glänzel, W., & Grupp, H. (1995). The scientometric weight of 50 nations in 27 science areas, 1989–1993. Parts I and II. Scientometrics, 33(3), 263–293.

  • Dosi, G., Llerena, P., & Labini, M. S. (2006). The relationships between science, technologies and their industrial exploitation: an illustration through the myths and realities of the so-called ‘European Paradox’. Research Policy, 35(10), 1450–1464.

  • Ingwersen, P. (2009). Brazil research in selected scientific areas: Trends 1981–2005. In Proceedings of ISSI 2009, the 12th International Conference of the International Society for Scientometrics and Informetrics (Vol. 2, pp. 692–696). Rio de Janeiro: ISSI.

  • King, D. A. (2004). The scientific impact of nations. Nature, 430, 311–316.

  • Leydesdorff, L., & Wagner, C. (2009). Is the United States losing ground in science? A global perspective on the world science system. Scientometrics, 78(1), 23–36.

  • May, R. M. (1997). The scientific wealth of nations. Science, 275, 793–796.

  • Moed, H. F. (2005). Citation analysis in research evaluation. Dordrecht: Springer. ISBN: 978-1-4020-3713-9.

  • Moed, H. F., Burger, W. J., Frankfort, J. G., & van Raan, A. F. J. (1985). The use of bibliometric data for the measurement of university research performance. Research Policy, 14(3), 131–149.

  • Moed, H. F., Glänzel, W., & Schmoch, U. (2004). Handbook of quantitative science and technology research: the use of publication and patent statistics in studies of S & T systems. Dordrecht: Springer.

  • Radicchi, F., Fortunato, S., & Castellano, C. (2008). Universality of citation distributions: Toward an objective measure of scientific impact. Proceedings of the National Academy of Sciences of the United States of America, 105(45), 17268–17272.

  • Van Leeuwen, T. N., Visser, M. S., Moed, H. F., Nederhof, T. J., & van Raan, A. F. J. (2003). The Holy Grail of science policy: exploring and combining bibliometric tools in search of scientific excellence. Scientometrics, 57(2), 257–280.

Author information

Corresponding author

Correspondence to Giovanni Abramo.

Appendix: further analysis

Table 12 presents the top 10 organizations by percentage of publications in top journals, among those with at least 50 publications over the 2001–2006 period. All but one of these organizations are hospitals, and in all cases but one their publications receive more citations, on average, than other works in the same top journals.

Table 12 Standardized average impact of publications per Italian organization; data 2001–2006, limited to the top 10 for incidence of “top journal” publications from those organizations with a minimum of 50 total publications

Table 13 presents a further example: the top ten organizations by standardized average impact (Cites/XCR), calculated over the top 10% of their publications. Seven of these organizations also appear in Table 10, and the same three organizations hold the top three places in both tables.

Table 13 Standardized average impact of publications per Italian organization; data 2001–2006, limited to the top 10 for average Cites/XCR of their top 10% publications, from those organizations with at least 50 total publications
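
A sketch of how such a "top 10% of publications" score could be computed (our reading of the table caption; the function name and the data are hypothetical, not the authors' implementation):

```python
def top_decile_impact(ratios, fraction=0.10):
    """Mean Cites/XCR over an organization's best `fraction` of papers.

    `ratios` holds one Cites/XCR value per publication; at least one
    paper is always kept so that small portfolios still get a score.
    """
    best = sorted(ratios, reverse=True)
    n = max(1, int(len(ratios) * fraction))
    return sum(best[:n]) / n

# Example: 20 publications, so the "top 10%" is the best 2 papers.
ratios = [0.2, 0.5, 0.8, 1.0, 1.1, 1.3, 0.9, 2.4, 0.6, 1.0,
          0.7, 1.2, 3.1, 0.4, 0.9, 1.5, 0.3, 1.1, 0.8, 1.0]
print(top_decile_impact(ratios))  # (3.1 + 2.4) / 2 = 2.75
```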

Studying organizations through the impact of their total research output risks hiding differences among their internal disciplines and fields. The analysis can be drilled down to reveal this level of detail. Table 14 presents the ten best national organizations in the physics discipline, identified by Cites/XCR. The top three organizations are research institutes, all scoring above two for Cites/XCR. There are also four universities, two of which are schools for advanced studies (Pisa School for Advanced Studies; Trieste International School for Advanced Studies).

Table 14 Standardized average impact of publications in physics per Italian organization; data 2001–2006, limited to the top 10 organizations for Cites/XCR, from those with a total of at least 50 publications

The analysis can go deeper still, for example to the level of individual fields. Table 15 presents the top ten national organizations for research production in oncology, again identified by Cites/XCR.

Table 15 Standardized average impact of publications in oncology per Italian research organization; data 2001–2006, limited to the top 10 organizations for Cites/XCR, from those with a total of at least 50 publications


Cite this article

Abramo, G., D’Angelo, C.A. & Viel, F. The field-standardized average impact of national research systems compared to world average: the case of Italy. Scientometrics 88, 599–615 (2011). https://doi.org/10.1007/s11192-011-0406-x

