Article

Publisher Transparency among Communications and Library and Information Science Journals: Analysis and Recommendations

by Alexandre López-Borrull 1,*, Mari Vállez 2, Candela Ollé 1 and Mario Pérez-Montoro 2

1 Faculty of Information and Communication Sciences, Universitat Oberta de Catalunya, 08035 Barcelona, Spain
2 Faculty of Information and Audiovisual Media, Universitat de Barcelona, 08014 Barcelona, Spain
* Author to whom correspondence should be addressed.

Publications 2021, 9(4), 54; https://doi.org/10.3390/publications9040054
Submission received: 2 September 2021 / Revised: 11 November 2021 / Accepted: 19 November 2021 / Published: 24 November 2021

Abstract

The principal goal of the research study is to analyze the transparency of a selection of academic journals based on an analysis model with 20 indicators grouped into 6 parameters. Given the evident interest in and commitment to transparency among quality academic journals and researchers’ difficulties in choosing journals that meet a set of criteria, we present indicators that may help researchers choose journals while also helping journals to consider what information from the editorial process to publish, or not, on their websites to attract authors in the highly competitive environment of today’s scholarly communication. To test the validity of the indicators, we analyze a small sample: the Spanish Communications and Library and Information Science journals listed in the Scimago Journal Rank. The results confirm that our analysis model is valid and can be extrapolated to other disciplines and journals.

1. Introduction

In recent years, the scholarly communication ecosystem has undergone a series of changes, often exacerbating aspects that have been in crisis since the late 20th century [1]. Open access (a movement that seeks to make as many scientific articles and as much scholarly content as possible freely available [2]) and paradigms such as open science could change this ecosystem even further and favor the emergence of new actors. The European Commission defines open science as ‘a new approach to the scientific process based on cooperative work and new ways of disseminating knowledge, improving accessibility to and re-usability of research outputs by using digital technologies and new collaborative tools’ [3]. In such a fast-changing environment, choosing a journal in which to publish becomes more complex. Likewise, the use of the impact factor for comparison and differentiation between journals has been called into question [4]. In fact, in recent years, more than ever, initiatives critical of the use of the impact factor for research evaluation have been promoted, seeking to create and promote alternatives. These alternatives evaluate research at the level of the article rather than the journal, or take a qualitative rather than a quantitative approach. For example, DORA [5] and the Leiden Manifesto [6] have opened up discussions at an international level; in some cases, this debate has also been conducted at a national level [4]. There is thus an opportunity to analyze the quality of journals in other terms, as illustrated by Plan S, which promotes a system of quality open access academic journals [7].

1.1. Transparency as a Vector and Value in the Evolution of Scholarly Communication

For academic journals, the arrival of the internet meant both disruption and an opportunity to optimize the process of scholarly communication, improving the dissemination of knowledge through online publication [8,9]. This also led to the opportunity to promote open access to academic publications and changes in both scientific policies and business models [10]. The advances in technology have generated and consolidated open-science initiatives [11], including debate on the peer-review model [12]. This is not only an advance in terms of scientific output (whether articles or research data) but also a clear improvement in quality and transparency. It forms part of a process that affects transparency, open data (the process of defining how scientific data may be published and re-used without price or permission barriers [13]), and open government (high levels of transparency and mechanisms for public scrutiny and oversight [14]).
Academic journals also hold a wide range of (often invisible) internal data, such as their budgets, number of articles rejected, number of reviewers, response times, etc. Some of these data are perceived as being for internal use only or as important for maintaining a competitive edge. These are just a few examples of data that are available to journals’ editorial boards but are not shared with readers or prospective authors. Scholarly communication (and journals in particular) faces a set of ethical challenges that can be linked to attitudes but also to the culture of sharing data and information [15]. Now that transparency has become a key element of management, especially in the public sphere, a positive move would see academic journals offer as much information as possible about the different processes involved in their publication; this would raise journals’ prestige and make it easier to assess their quality [16]. As Fosang and Colbran point out [17], transparency is the key to quality.
The high number of journals belonging to publishers at public universities must also be considered, since the debate on the governance and financing of universities directly affects journals. Formerly, these were largely vocational undertakings with low budgets, but they are now increasingly competitive, professional journals with a more global vision. If journals are paid for with public funds, then this must also be taken into account in terms of transparency, in the same way that national laws may require universities to have transparency portals (and open data). In recent years, the academic journal ecosystem has gone through a series of changes that have made the process of choosing a journal for publishing research results more complex [1]. Changes in the ecosystem of journals, greater pressure to publish in journals in higher quartiles, the push for open access publishing, and a lack of knowledge regarding journals are some of the factors that appear in studies of the decisions doctoral students make when publishing their articles [18,19]. This affects Ph.D. students in particular, especially those opting to publish their thesis as a compendium of publications [20]. Threats to scholarly communication, including predatory journals and publishers, also need to be considered. Having more and better information on editorial processes would help improve authors’ decision-making when choosing a journal to publish in. Different assessment models are now being promoted that are critical of the weight of the impact factor and dependence on quartiles; they offer new ways to evaluate journals and choose where to publish [5,6]. Authors faced with similar rankings in indexes and undifferentiated metrics may end up making their choice for other reasons. Undoubtedly, the speed of publication and the quality of the peer review are crucial, especially at a time when some assessment agencies seem to be changing their criteria in response to the emergence and consolidation of new publishers and business models. The new transformative agreements between countries, universities, and publishers to establish payment quotas for a greater number of open articles are a good example of change. Transformative agreements (also known as ‘offsetting’, ‘read and publish’, or ‘publish and read’ agreements) have shifted the focus of scholarly journal licensing from cost containment towards open access publication [21]. Undoubtedly, the pressure to publish in open access has led publishers to change, and it is to be expected that open science and its implementation will accelerate this transformation. All of this is happening while authors are also expressing concern about the possibility of their work falling into the hands of predatory journals [22].
The quality indicators used in assessing academic journals usually include transparency, but a wider vision of transparency is required, especially in terms of the use of public funding. For example, one of the quality indicators for Plan S-compliant open access journals involves information transparency. Plan S is an initiative of cOAlition S, the European Research Council (ERC), and several European state agencies. It has attained a prominent place in the field of research and scholarly communication since its first version was published on 4 September 2018. The proposal aims to accelerate the transition to open access and ensure that, from 2021, all scientific publications derived from publicly funded projects are published in open access immediately. Plan S allows articles to be published in three ways, one of which is publication in quality open access journals. It lists several aspects to be met by journals; some are mandatory, and others are strongly recommended.
In fact, Plan S itself presents as one of its principles the control of spending on scientific publications, declaring that publication fees should be standardized and limited. As a result, there has been some movement among publishers towards greater transparency regarding prices and margins; for example, MDPI provides a breakdown of the cost of publishing its articles [23].

1.2. Transparency as a Metric for Analyzing and Comparing Journals

We propose the creation of a series of indicators to assess the transparency of a journal’s editorial process. The indicators are not linked to the content of the articles or the supplementary data but to the information the journal itself supplies about the process. Some of these elements are closely linked to elements compiled by databases and quality agencies. However, as far as we know, no classification of this type currently exists, despite the research on transparency in academic journals that has been published [24,25,26]. Such a classification could help to improve and optimize journals and could be used as a checklist to improve the information they provide.
As a preliminary measure to test its effectiveness and ability to differentiate among similar journals, we propose analyzing Social Sciences journals indexed in the Scimago Journal Rank, specifically Communication and Library and Information Science (LIS) journals, in order to see whether these journals are opting to offer the data generated when processing and publishing articles. This will enable us to gather a sample to show how far this practice is being implemented in journals. Based on this analysis, we will present proposed indicators that journals could use to increase transparency in their processes. Our goals are:
- To develop a proposal to improve journals, enabling them to have a transparency policy.
- To create a closed set of indicators for studying and comparing the transparency of academic journals.
- To study the level of transparency of Spanish Communication and LIS journals indexed in the Scimago Journal Rank.

2. Methodology

The methodology used for this study centered on the analysis of the informational content of the selected journals’ websites. The selection focused on journals in the fields of Communication and Library and Information Science published in Spain, identified using the Scimago Journal and Country Rank. Journals duplicated across the two disciplines were eliminated. The final corpus comprised 25 journals, which were examined by 4 assessors in April 2021.
The assessment used 6 parameters (own and external human resources, financial resources, efficiency of the editorial process, quality of the editorial process, transparency of policies, and transparency of article metadata) and a total of 20 indicators, each scored 0 or 1. Descriptions of the indicators can be found in Table 1; a minimal sketch of how such binary scores can be aggregated is shown below.
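As an illustration, the following minimal Python sketch (not the authors’ actual tooling; the journal and its scores are hypothetical) shows how the 0/1 scores could be recorded and aggregated per parameter, using the parameter groupings from Table 2.

```python
# Minimal sketch of the 0/1 scoring model: indicators grouped by parameter
# (groupings follow Table 2). The example journal and its scores are invented.
PARAMETERS = {
    "Own and external human resources": ["Ind1", "Ind2", "Ind3"],
    "Financial resources": ["Ind4", "Ind5", "Ind6"],
    "Efficiency of the editorial process": ["Ind7", "Ind8", "Ind9"],
    "Quality of the editorial process": ["Ind10", "Ind11", "Ind12", "Ind13"],
    "Editorial policy": ["Ind14", "Ind15", "Ind16", "Ind17"],
    "Metadata": ["Ind18", "Ind19", "Ind20"],
}

def parameter_totals(scores: dict) -> dict:
    """Sum one journal's binary indicator scores by parameter."""
    return {param: sum(scores[ind] for ind in indicators)
            for param, indicators in PARAMETERS.items()}

# Hypothetical journal complying only with Ind1 (editorial board) and
# Ind10 (manuscript review and selection process).
example_journal = {f"Ind{n}": 0 for n in range(1, 21)}
example_journal["Ind1"] = example_journal["Ind10"] = 1
print(parameter_totals(example_journal))
```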
Each journal was assessed twice, by two different assessors, to ensure the same criteria were applied, and in cases where they did not agree, a third assessor re-assessed the indicator in question.
The indicators used were based on previous work by López-Borrull et al. [28] and on the review, analysis, and subsequent selection of the transparency indicators of the Directory of Open Access Journals (DOAJ), the Spanish Foundation for Science and Technology (FECYT, from its Spanish initials), Web of Science (WOS), SCOPUS, and Plan S (see Table 2). FECYT is a public foundation under the umbrella of the Spanish Ministry of Science and Innovation whose mission is to promote scientific research of excellence. FECYT organizes the Call for the Evaluation of Editorial and Scientific Quality, through which journals can obtain the FECYT Quality Seal; this seal certifies that an academic journal complies with a series of indicators that FECYT defines and renews periodically. It should be noted that we could have added other sources, such as Latindex, but we considered that Table 2 was already comprehensive enough.

3. Results and Discussion

The methodology used enabled us to obtain results in several areas: the assessment of transparency through the proposed indicators, the generation of a general transparency index for the best and worst journals in the selected corpus, a distribution analysis of the journals, and a study of the correlation between the level of transparency and other external quantitative indicators applicable to the journals in the corpus. Our results show that the analysis system developed for this study is effective in assessing compliance with the proposed indicators. We can analyze the level of compliance with the 20 indicators and identify, in the analyzed corpus of journals, which indicators obtained the worst scores, which obtained the best, and which obtained intermediate results.
In relation to this, we also propose a visual representation of this analysis of compliance with the indicators through a bar chart showing the value obtained for each indicator and grouping each parameter’s indicators by color (as shown in Figure 1). This enables a nominal comparison between the indicator values and the distribution of results by parameter. In our study, we can generate the corresponding graph (Figure 1) and see the levels of compliance with or implementation of the 20 indicators in the selected corpus of journals. The figure is designed to show which indicators are already well established and to which indicators journals need to devote more effort. With this analysis and proposed visualization, we can see that the indicators with the lowest compliance are 2 and 3 (reviewers), 5 (itemizing costs), 17 (open-data policies), and 19 (monitoring self-citation). The indicators with an intermediate score are 4 (APC), 7 (response time), 12 (plagiarism), and 20 (metrics). The indicators with the highest compliance are 1 (editorial board), 10 (article review and selection process), 13 (indexing), 14 (code of ethics), 15 (license type), and 16 (open access policies).
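A Figure 1-style chart can be reproduced with standard plotting tools. The sketch below assumes matplotlib; the compliance counts are illustrative placeholders (only indicators 1, 13, 15, 5, and 19 echo values mentioned in the conclusions), and it simply draws one bar per indicator, coloured by parameter.

```python
# Sketch of a Figure 1-style bar chart: one bar per indicator, coloured by
# parameter. The compliance counts below are placeholders, not published data.
import matplotlib.pyplot as plt

compliance = [25, 2, 1, 12, 1, 8, 13, 6, 5, 24, 20, 14, 25, 23, 25, 22, 3, 9, 0, 15]
param_of = [0]*3 + [1]*3 + [2]*3 + [3]*4 + [4]*4 + [5]*3   # parameter index per indicator
palette = ["#4c72b0", "#dd8452", "#55a868", "#c44e52", "#8172b3", "#937860"]

plt.bar(range(1, 21), compliance, color=[palette[p] for p in param_of])
plt.xticks(range(1, 21))
plt.xlabel("Indicator")
plt.ylabel("Journals complying (out of 25)")
plt.title("Compliance per indicator, grouped by parameter colour")
plt.show()
```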
Below is a possible interpretation of the indicators with the highest and lowest compliance. Publishing the list of reviewers does not seem to be a common practice of scholarly journals. From our point of view, for increasingly global journals and bearing in mind the debate about what a predatory journal is, the greater the transparency in the editorial process, the better. At one extreme, we might find open peer review and the debate about anonymity and its validity today [12], but at least making known which people have acted as reviewers serves to support and validate the editorial process. Thus, we would point out the need for databases and quality agencies that evaluate journals to promote such indicators to help better understand and delimit what is a good practice and what is a quality scholarly journal.
As far as cost is concerned, again, this is clearly not a common practice. We would propose that this is a valid indicator for two reasons. Scientific policy related to open science and Plan S is concerned with ensuring that the cost (and the profits) are adjusted to the market. Likewise, the costs have to be sustainable in an ecosystem based on public funding, where austerity and control of expenses can directly affect research budgets.
The need for journals to have open-data policies may be less pressing. In this sense, the type of data and the disciplines themselves may help explain the low compliance. However, in the future, it seems that most strategies, plans, and funding bodies will require the sharing of data, and journals have to be clear about their strategy in relation to this, as Pampel and Dallmaier-Tiessen [29] and García-García et al. [30] also point out. The debate about who hosts and curates datasets is especially relevant when it comes to responsibility for privacy, legal, and ethical issues related to supplementary materials. This explains, for example, why even in publications related to the COVID-19 pandemic there have been low levels of data sharing [31]. A similar explanation would also make sense in relation to the criterion on self-citations. Even though none of the journals studied complied with this indicator, we think that it should remain on the list. For example, there was recent controversy stemming from a study by the National Agency for Quality Assessment and Accreditation of Spain (ANECA) concerning the ‘non-standard behaviors’ of certain journals in relation to self-citation. The controversy centered on the difficulty of drawing and understanding, as often happens with plagiarism, a clear border between what is considered correct and what is not [32]. Certain databases use the amount of self-citation as a criterion. Knowing whether a journal has practices that could affect its inclusion in a database is something that should, we believe, be known to authors before they submit their articles. It would also help improve monitoring of this behavior.
The second result we want to highlight is that our assessment system also allows for a more global analysis of the parameters. We can analyze the level of consolidation of the aspects in the corpus of journals included in the study: which indicators scored worst in this analysis, which scored best, and which obtained intermediate results. We also propose a visual representation of this analysis of compliance with the parameters through the creation of a vertical bar chart showing the value obtained for each parameter (as shown in Figure 2). This allows for a nominal comparison of the values associated with each of the parameters.
If we apply this analysis to our study, we can generate the corresponding graph (Figure 2) and see the levels of compliance with or implementation of the six parameters in the selected corpus of journals. As we can see, some areas are strongly consolidated, such as ‘quality of the editorial process’, and ‘editorial policy’. In this particular case, this may be because it is one of the quality criteria applied by DOAJ, FECYT, Web of Science, and others. However, other parameters (such as ‘financial resources’, ‘efficiency of the editorial process’, or ‘metadata’) have some way to go to raise the level of transparency of the analyzed journals.
The third result we want to highlight is that our analysis system also lets us generate a general transparency index for the best journals in the selected corpus. The best journals are those above the average general transparency score for the corpus. We can analyze the general level of transparency of these journals by aggregating their scores for all the indicators and generating a ranked distribution of the journals based on the quantitative value obtained. We propose a visual representation of this analysis of the general transparency index of the best journals in a vertical bar chart showing the value obtained by each journal when aggregating the total scores of the indicators. By adding a line, we can compare this index for each journal against the average general transparency value of the journals in the analyzed corpus (as seen in Figure 3). This enables a nominal comparison of the values of this index, ranking this subset of the best journals in the corpus according to the transparency index and showing how they are above the average general transparency level of the corpus.
If we apply this analysis to our study (25 journals from the fields of Communication and LIS), we can generate the corresponding graph (Figure 3) and compare the general transparency indices of the best journals in the selected corpus (14 in total). By introducing a line that codifies the average general transparency index of the corpus (10.92), we can see how far each journal is above this average value.
The fourth result we want to highlight complements the previous one: our analysis system also lets us generate a general transparency index for journals with the lowest compliance in the selected corpus. These journals are those below the average general transparency score for the corpus. We can analyze the general level of transparency of these journals by aggregating their scores for all the indicators and generating a ranked distribution of the journals based on the quantitative value obtained. We propose a visual representation of this analysis of the general transparency index of the worst journals in a vertical bar chart showing the value obtained by each journal when aggregating the total scores of the indicators. By adding a line, we can compare this index for each journal against the average general transparency value of the journals in the analyzed corpus. This enables a nominal comparison of the values of this index, ranking this subset of the worst journals in the corpus according to the transparency index and showing how they are below the average general transparency level of the corpus. By introducing a line that codifies the average general transparency index of the corpus (10.92), we can see how far each journal is below this average value.
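The general transparency index described in the last three paragraphs reduces to summing each journal’s 20 binary scores and comparing the total with the corpus mean. A possible sketch is shown below; the journal names and score vectors are hypothetical, and the authors’ own tooling is not described in the paper.

```python
# Sketch of the general transparency index: sum each journal's 20 binary
# indicators, compare against the corpus mean, and split the corpus into the
# above-average and below-average groups discussed above. Data are invented.
from statistics import mean

scores = {
    "Journal A": [1]*16 + [0]*4,   # complies with 16 indicators
    "Journal B": [1]*11 + [0]*9,
    "Journal C": [1]*5 + [0]*15,
}

index = {name: sum(vec) for name, vec in scores.items()}
corpus_mean = mean(index.values())

above = sorted((n for n, v in index.items() if v > corpus_mean),
               key=index.get, reverse=True)
below = sorted((n for n, v in index.items() if v <= corpus_mean),
               key=index.get, reverse=True)
print(f"corpus mean = {corpus_mean:.2f}", above, below)
```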
The fifth result we want to highlight is that our assessment system also lets us perform a distribution analysis of the selected journals in the corpus, using the number of indicators each publication complies with. We can analyze how these journals are distributed across four quartiles, where the first quartile contains the best journals according to this index and the fourth quartile the worst. We propose a visual representation of this distribution analysis by creating a box-and-whisker plot showing how the journals are distributed across the resulting quartiles (as shown in Figure 4). This enables us to see whether the distribution is symmetrical, whether the journals are clumped together at the lower or upper levels of compliance, and whether there are outliers.
If we apply this analysis to our study, we can generate the corresponding graph (Figure 4) and see the distribution of the entire set of journals in terms of the number of indicators they comply with. As we can see, the diagram shows that the distribution of the journals is fairly symmetrical, with a similar proportion of journals in all four quartiles. The journals comply with a minimum of 5 indicators and a maximum of 16 indicators. The average and median compliance of the indicators is very similar, at around 11.
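The distribution analysis amounts to computing quartiles over each journal’s total score and drawing a box-and-whisker plot. A possible sketch, using Python’s statistics module and matplotlib, is given below; the score list is invented and chosen only to echo the reported 5–16 range and median near 11.

```python
# Sketch of the Figure 4-style distribution analysis: quartiles and a box plot
# of the number of indicators each journal complies with. Scores are invented.
from statistics import median, quantiles
import matplotlib.pyplot as plt

totals = [5, 6, 7, 8, 9, 9, 10, 10, 10, 11, 11, 11, 11, 11, 12, 12, 12,
          13, 13, 13, 14, 14, 15, 16, 16]

q1, q2, q3 = quantiles(totals, n=4)          # quartile boundaries
print(f"Q1={q1}, median={median(totals)}, Q3={q3}")

plt.boxplot(totals, vert=True)
plt.ylabel("Indicators complied with (0-20)")
plt.title("Distribution of journals by indicator compliance")
plt.show()
```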
The last result we want to highlight is that our assessment system also lets us study the relationship between the level of transparency and other external quantitative indicators applicable to the journals in the selected corpus. We can see whether such a correlation exists and, if so, whether it is positive or negative. We propose a visual representation of this analysis in the form of a scatter plot showing each journal as a point placed along the X and Y axes according to the numerical value of the journal’s transparency index and the quantitative value of the selected external indicator (as shown in Figure 5). This graph can be completed by adding a trend line, which allows a degree of visual correlation to be observed.
If we apply this analysis to our study, we can generate the corresponding graph (Figure 5) and see the correlation between the transparency index of the journals in the corpus and their impact factor (specifically, the Scimago Journal & Country Rank (SJR)). As we can see in this graph, the trend line shows that there is a certain degree of positive correlation (although not a very strong one) between a journal’s SJR and its level of transparency.
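The correlation check can be sketched as follows, assuming numpy and matplotlib; the paired transparency/SJR values are invented for illustration, and the trend line is a simple least-squares fit.

```python
# Sketch of the Figure 5-style analysis: Pearson correlation and a linear
# trend line between transparency index and SJR. Paired values are invented.
import numpy as np
import matplotlib.pyplot as plt

transparency = np.array([5, 7, 9, 10, 11, 12, 13, 14, 15, 16])
sjr = np.array([0.12, 0.15, 0.18, 0.20, 0.22, 0.30, 0.28, 0.35, 0.40, 0.55])

r = np.corrcoef(transparency, sjr)[0, 1]               # Pearson correlation
slope, intercept = np.polyfit(transparency, sjr, 1)    # least-squares trend line

plt.scatter(transparency, sjr)
plt.plot(transparency, slope * transparency + intercept)
plt.xlabel("Transparency index")
plt.ylabel("SJR")
plt.title(f"Transparency vs. SJR (r = {r:.2f})")
plt.show()
```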

4. Conclusions and Recommendations

Our main conclusions are as follows. First, regarding the choice of indicators:
- The distribution of results confirms that the choice of criteria seems appropriate: the values do not all show high compliance or low compliance. There are different results that point to the possibility of comparing journals: we can see a progression and where improvements are needed.
Second, with regard to the chosen sample, though with implications for the validity of transparency analysis more broadly:
- The indicators relating to editorial policy and to journal quality are the ones with the highest levels of compliance. This relates to the requirements of secondary databases and enables us to identify a set of quality journals in different disciplines. In this sense, then, it can be pointed out that there is a consensus on the quality indicators that a journal must meet, and the great competitiveness between journals validates compliance both internally (sustainability of the journal in relation to its funders) and externally (placing in the quartiles of the databases).
- The indicators relating to metadata present a clearer area for improvement. This can also be explained by the fact that certain criteria are recommended but not required by journal indexers. Likewise, there is a need for clear metadata policies that allow for the interoperability of scholarly articles and the ability to apply data-mining and knowledge-extraction mechanisms that are only possible with quality metadata. Plan S, for example, seeks to achieve quality standards, although, for the moment, it has placed them in the field of non-mandatory supplementary indicators for journals that must or want to comply with Plan S [22].
- Indicator 19 (monitoring self-citation), with a score of 0, and indicator 5 (itemizing the costs of the publication), with a score of 1, are the lowest in the analysis (Figure 1). There is full compliance with indicators 1 (editorial board), 13 (indexing), and 15 (license type). A wide range of indicators sit in an intermediate position.
Finally, in relation to the specific sample chosen for study:
- The Revista Latina de Comunicación Social and Comunicar are clearly ahead, with over 75% compliance with the indicators studied. At the other extreme, Anales de Documentación and Tripodos are the journals with the lowest level of compliance.
- There is significant room for improvement for journals to openly provide the information and criteria they may already have and apply. In other cases, they could consider adopting them. Indeed, we believe that the indicators could be used by journals as a self-assessment tool for ongoing improvement.
The possible limitations of the study come from the choice of the sample and the discipline. The sample should be expanded, and rankings comparing academic journals could be created. The journals themselves could include, as part of their best practices, icons showing their compliance with the indicators/parameters. This would make it easier to quickly see their compliance with Plan S, transparency, etc., without having to comb through large amounts of information published in widely differing ways. The proposed indicators allow for analysis and verification of the transparency of academic journals, and they can help interpret the transparency of these academic journals (many of which are from academic publishers receiving public funding).

Author Contributions

Conceptualization, A.L.-B., M.V., and C.O.; methodology, formal analysis, investigation, data curation, writing—original draft preparation, A.L.-B., M.V., C.O., and M.P.-M. All authors have read and agreed to the published version of the manuscript.

Funding

This research and the APC were funded by the Ministerio de Ciencia, Innovación y Universidades, grant number RTI2018-094360-B-I00.

Data Availability Statement

The data presented in this study are available at https://doi.org/10.6084/m9.figshare.17060336.v1.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Abadal, E. (Ed.) Revistas Científicas. Situación Actual y Retos de Futuro; Edicions Universitat de Barcelona: Barcelona, Spain, 2017; pp. 13–18. ISBN 978-84-9168-004-8. Available online: http://eprints.rclis.org/32138/ (accessed on 11 November 2021).
2. Melero, R. Significado del acceso abierto (open access) a las publicaciones científicas: Definición, recursos copyright e impacto. Prof. Inform. 2005, 15, 255–266.
3. Comisión Europea. Commission Recommendation of 25.4.2018 on Access to and Preservation of Scientific Information. 2018. Available online: https://ec.europa.eu/digital-single-market/en/news/recommendation-access-and-preservation-scientific-information (accessed on 11 November 2021).
4. Delgado-López-Cózar, E.; Ràfols, I.; Abadal, E. Letter: A call for a radical change in research evaluation in Spain. Prof. Inform. 2021, 30.
5. DORA. San Francisco Declaration on Research Assessment. 2012. Available online: https://sfdora.org/ (accessed on 11 September 2021).
6. Hicks, D.; Wouters, P.; Waltman, L.; De Rijcke, S.; Rafols, I. Bibliometrics: The Leiden Manifesto for research metrics. Nature 2015, 520, 429–431.
7. Abadal, E.; López-Borrull, A.; Ollé Castellà, C.; Garcia-Grimau, F. El plan S para acelerar el acceso abierto: Contexto, retos y debate generado. Hipertext.Net 2019, 19, 75–83.
8. Bachrach, S.M. The journal crisis: Redirecting the blame. J. Chem. Inf. Comp. Sci. 2001, 41, 264–268.
9. Llewellyn, R.D.; Pellack, L.J.; Shonrock, D.D. The Use of Electronic-Only Journals in Scientific Research. Issues Sci. Technol. Lib. 2002. Available online: http://www.istl.org/02-summer/refereed.html?a_aid=3598aabf (accessed on 11 November 2021).
10. Keefer, A. Aproximació al Moviment “Open Access”. BiD 2005, 15. Available online: https://bid.ub.edu/15keefer.htm (accessed on 11 November 2021).
11. Pinfield, S.; Wakeling, S.; Bawden, D.; Robinson, L. Open Access in Theory and Practice: The Theory-Practice Relationship and Openness; Routledge: London, UK; New York, NY, USA, 2021. Available online: https://www.taylorfrancis.com/books/9780429276842 (accessed on 11 November 2021).
12. Björk, B.-C.; Hedlund, T. Emerging new methods of peer review in scholarly journals. Learn. Pub. 2015, 28, 85–91.
13. Murray-Rust, P. Open Data in Science. Nat. Prec. 2008.
14. Ruijer, E.; Détienne, F.; Baker, M.; Groff, J.; Meijer, A.J. The Politics of Open Government Data: Understanding Organizational Responses to Pressure for More Transparency. Am. Rev. Public Adm. 2020, 50, 260–274.
15. Baiget, T. Ética en revistas científicas. Ibersid Rev. Sist. Inf. Doc. 2010, 4, 59–65.
16. Wicherts, J.M. Peer Review Quality and Transparency of the Peer-Review Process in Open Access and Subscription Journals. PLoS ONE 2016, 11, e0147913.
17. Fosang, A.J.; Colbran, R.J. Transparency is the key to quality. J. Biol. Chem. 2015, 290, 29692–29694.
18. Tenopir, C.; Dalton, E.; Fish, A.; Christian, L.; Jones, M.; Smith, M. What motivates authors of scholarly articles? The importance of journal attributes and potential audience on publication choice. Publications 2016, 4, 22.
19. Nicholas, D.; Rodríguez-Bravo, B.; Watkinson, A.; Boukacem-Zeghmouri, C.; Herman, E.; Xu, J.; Świgoń, M. Early career researchers and their publishing and authorship practices. Learn. Pub. 2017, 30, 205–217.
20. Mason, S.; Merga, M.K.; Morris, J.E. Choosing the Thesis by Publication approach: Motivations and influencers for doctoral candidates. Aust. Educ. Res. 2020, 47, 857–871.
21. Borrego, Á.; Anglada, L.; Abadal, E. Transformative agreements: Do they pave the way to open access? Learn. Pub. 2020, 34, 216–232.
22. Inouye, K.; Mills, D. Fear of the academic fake? Journal editorials and the amplification of the ‘predatory publishing’ discourse. Learn. Pub. 2021, 34, 396–406.
23. MDPI. Article Processing Charges (APC) Information. Available online: https://www.mdpi.com/apc#why-apc (accessed on 11 September 2021).
24. Dal-Ré, R. Transparencia de las revistas españolas de medicina hacia sus lectores y autores. An. Pediatría 2019, 91, 67–70.
25. Fernández, M.T.; Guerra, J.T. Transparencia editorial en revistas científicas mexicanas de educación: Hacia una gestión integral de las políticas editoriales en las publicaciones periódicas científicas. Investig. Bibl. 2021, 35, 13–32.
26. Vercellini, P.; Buggio, L.; Viganò, P.; Somigliana, E. Peer review in medical journals: Beyond quality of reports towards transparency and public scrutiny of the process. Eur. J. Intern. Med. 2016, 31, 15–19.
27. Fair Open Access Alliance. FOAA Breakdown of Publication Services and Fees. Available online: https://www.fairopenaccess.org/foaa-breakdown-of-publication-services-and-fees/ (accessed on 11 September 2021).
28. López-Borrull, A.; Ollé-Castellà, C.; García-Grimau, F.; Abadal, E. Plan S y ecosistema de revistas españolas de ciencias sociales hacia el acceso abierto: Amenazas y oportunidades. Prof. Inform. 2020, 29, e290214.
29. Pampel, H.; Dallmaier-Tiessen, S. Open research data: From vision to practice. In Opening Science. The Evolving Guide on How the Internet Is Changing Research, Collaboration and Scholarly Publishing; Bartling, S., Friesike, S., Eds.; Springer: Heidelberg, Germany, 2014; pp. 213–224.
30. García-García, A.; López-Borrull, A.; Peset-Mancebo, M.F. Data journals: Eclosión de nuevas revistas especializadas en datos. Prof. Inform. 2015, 24, 845–854.
31. Lucas-Dominguez, R.; Alonso-Arroyo, A.; Vidal-Infer, A.; Aleixandre-Benavent, R. The sharing of research data facing the COVID-19 pandemic. Scientometrics 2021, 126, 4975–4990.
32. ANECA. Análisis Bibliométrico e Impacto de Las Editoriales Open-Access en España. Available online: http://www.aneca.es/Documentos-y-publicaciones/Evaluacion-de-la-investigacion/Informe-revistas-Open-Access (accessed on 11 September 2021).
Figure 1. Aggregate values of the indicators.
Figure 2. Levels of compliance with or implementation of the parameters.
Figure 3. Journals’ compliance with indicators (I).
Figure 4. Plot of compliance with the indicators.
Figure 5. Diagram of the correlation between the compliance value and the Scimago Journal & Country Rank.
Table 1. List and description of the indicators used for analysis of the journals.

Indicator | Title | Description of Requirement
Ind1 | Editorial board | The members and membership of the journal’s editorial board are available.
Ind2 | Reviewers | The names of the reviewers are available.
Ind3 | Information on the reviewers | The affiliation and/or origin of the reviewers are available.
Ind4 | Article publication charge (APC) | The article publication charges (APCs) are available.
Ind5 | Itemizing costs of the publication | The costs associated with article processing and publication are available (according, for instance, to FOAA [27]).
Ind6 | Funding of the publication | The publication’s funding sources (public, private, etc.) are available.
Ind7 | Response time | The estimated response time for the decision to publish articles is available.
Ind8 | Rejected articles | The number or percentage of articles rejected by the journal is available.
Ind9 | Collection of annual data on the publication | An annual information/data/stats report or infographic from the journal is available.
Ind10 | Manuscript review and selection process | The criteria applied during the manuscript review and selection process are available.
Ind11 | Sections of the publication | The characteristics that manuscripts must meet to be published in the different sections of the journal are available.
Ind12 | Plagiarism | There are mechanisms to detect plagiarism.
Ind13 | Indexing | Detailed information on the journal’s indexing is available.
Ind14 | Code of ethics | The publication’s code of ethics is available.
Ind15 | License type | The type of transfer of authors’ rights is made explicit.
Ind16 | Open access policies | The publication’s open access policy is made explicit.
Ind17 | Open-data policies | The publication’s open-data policy is made explicit.
Ind18 | Co-authorship | Each author’s role in articles must be reported.
Ind19 | Monitoring self-citation | The journal has a self-citation policy.
Ind20 | Article metrics | Article metrics are reported.
Table 2. Indicators used for analysis of journals and correspondence with sources of information.

Indicator | Parameter | Title | DOAJ | FECYT | WOS | SCOPUS | Plan S
Ind1 | Own and external human resources | Editorial board | X | X | X | X | X
Ind2 | | Reviewers | - | X | - | - | -
Ind3 | | Information on the reviewers | - | - | - | - | -
Ind4 | Financial resources | Article publication charge (APC) | X | X | - | - | X
Ind5 | | Itemizing costs of the publication | - | - | - | - | X
Ind6 | | Funding of the publication | - | - | - | - | X
Ind7 | Efficiency of the editorial process | Response time | - | - | - | - | X
Ind8 | | Rejected articles | - | - | - | - | X
Ind9 | | Collection of annual data on the publication | - | - | - | - | X
Ind10 | Quality of the editorial process | Manuscript review and selection process | X | X | X | X | X
Ind11 | | Sections of the publication | - | X | - | X | -
Ind12 | | Plagiarism | X | X | X | X | X
Ind13 | | Indexing | - | - | - | - | -
Ind14 | Editorial policy | Code of ethics | X | X | X | X | X
Ind15 | | License type | X | X | - | - | X
Ind16 | | Open access policies | X | X | - | - | X
Ind17 | | Open-data policies | - | - | - | - | X
Ind18 | Metadata | Co-authorship | - | X | - | - | -
Ind19 | | Monitoring self-citation | X | - | X | X | -
Ind20 | | Article metrics | - | - | - | - | -
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
