National culture as a correlate of research output and impact

National culture has been overlooked in discussions of research output and impact in favor of individual, socio-political, and economic factors. This study shows the relationships between the dimensions of a nation's cultural value orientation and its research output and impact. More than 60 countries were included, and Spearman correlation analysis was employed. The variables were taken from Geert Hofstede and Scimago Journal & Country Rank worksheets. This study found that (1) Power distance - the acceptance within a culture of power disparities among people - is negatively correlated with research impact; (2) Individualism - the level of independence a society maintains among its individuals - is positively correlated with research output and research impact; (3) Indulgence - the degree to which society members do not attempt to control their urges - is positively correlated with research impact; and (4) after controlling for log GDP per capita, uncertainty avoidance - the way a society deals with the fact that the future can never be controlled - is negatively correlated with research impact.


Introduction
Makri (2018) recently released a report on the increasing number of publications in various countries. She stated that it's unclear what has triggered and driven the strong gains in Egypt and Pakistan. Throughout the report, various variables believed to be responsible for the increasing number of publications, such as indexation duration, funding, global engagement, international collaboration, and political policies on science and higher education, are explained.
Several predictors of research output and impact have been identified, i.e. author characteristics, co-authorship networks, citation history, journal impact factors, tweets (Xiaomei et al., 2017), cohort effects (in terms of scientific discipline), age, career stage, gender, the country of origin of PhD holders, and the reward structure of research enactment (Claudia & Francisco, 2007). These are mostly at the individual and institutional levels. At the country level, the predictors are the number of universities, GDP per capita, control of corruption, civil liberties (Mueller et al., 2016), a country's wealth and population size, a country's value of research tradition, tenure and promotion criteria, experimental costs, IRB (Institutional Review Board) review flexibility, language barriers, and the training of new young researchers (Demaria, 2009). However, national cultural orientation (in this paper, the term is used interchangeably with national culture, national cultural value, and national culture dimension) is yet to be analyzed, and the present study assumes that individual, institutional, and structural factors are also influenced by the cultural values of a nation. Hofstede Insights (2019) defined culture as the collective mental programming of the human mind which distinguishes one group of people from another, consisting of six dimensions, i.e.
(1) power distance (PDI) - acceptance of unequal power distribution in a society; (2) uncertainty avoidance (UAI) - intolerance of ambiguity and of uncustomary thoughts and practices; (3) individualism (IDV) - projection of individuals' "I" in a society rather than "we" (collectivism); (4) masculinity (MAS) - an orientation toward toughness and competitiveness rather than tenderness and cooperativeness (femininity); (5) long term orientation (LTOWVS) - the society's preference for pragmatic, future-oriented approaches rather than time-honored, normative ones (short term normative orientation); and (6) indulgence (IVR) - the society's facilitation of a fun and enjoyable life rather than restraint (suppression of needs gratification by strict social norms).
National culture is relatively stable (Maseland & van Hoorn, 2017) and is widely used to explain various performances at the country level, such as learning and academic performance (Signorini et al., 2009). The present study hypothesized that there are correlations between the national culture dimensions and research performance indicators, i.e. research output and impact. Research performance is assumed to be mediated by research culture, which in turn is stimulated and challenged by the national culture.

Methods
All of the following data were retrieved on August 18, 2019, and compiled into a worksheet (see Underlying data (Abraham, 2019)) as the material of the present analysis: countries' region, total documents (DOC), citable documents (CITA), citations (CIT), self-citations (SELF), H-index (HINDEX), and citations per document (CPD). Principal component analysis (PCA) and the independent-samples Kruskal-Wallis H test were done using IBM SPSS Statistics version 25 for Windows to extract two major components from dimension reduction of DOC, CITA, CIT, SELF, HINDEX, and CPD, as well as to compare countries' regions in terms of the reduced dimensions. Correlation analysis was conducted using JASP version 0.10.2 for Windows, and partial correlation analysis was conducted using IBM SPSS Statistics.
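The dimension reduction itself was run in SPSS; purely as an illustrative sketch, the same correlation-matrix PCA can be reproduced in Python with numpy. The indicator values below are synthetic stand-ins, not the SCIMAGOJR data:

```python
import numpy as np

# Hypothetical indicator matrix: rows = countries, columns =
# DOC, CITA, CIT, SELF, HINDEX, CPD (values are made up for illustration).
rng = np.random.default_rng(0)
base = rng.lognormal(mean=8, sigma=1.5, size=60)   # latent "output size"
X = np.column_stack([
    base,                    # DOC
    base * 0.95,             # CITA
    base * 12,               # CIT
    base * 2,                # SELF
    np.sqrt(base),           # HINDEX
    rng.normal(10, 3, 60),   # CPD (size-independent)
])

# PCA on the correlation matrix (equivalent to standardizing first).
Z = (X - X.mean(axis=0)) / X.std(axis=0)
corr = np.corrcoef(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]          # largest eigenvalue first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()
print("Variance explained by first two components:",
      round(explained[:2].sum() * 100, 1), "%")

# Component scores for each country on the first two components.
scores = Z @ eigvecs[:, :2]
```

On real data, the first component would be expected to load on the five volume-dependent indicators and the second on CPD, mirroring the structure reported in Table 2.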

Results and Discussion
The purpose of this study is to show whether there are correlations between national cultural values and research output and impact. Because correlation is not causation, the following analysis and interpretation do not attempt to state definitively that there is a causal effect from one variable to another. Even though in this discussion cultural value orientation is often used as an explanation of research output and impact, this reflects the chronological precedence of culture, which comes to envelop a country long before the SCIMAGOJR measures are taken (Table 1). The argument is in line with the proposition of Sen (2004) that culture is a constituent of development and economic behavior, as expressed as follows: "The furtherance of well-being and freedoms that we seek in development cannot but include the enrichment of human lives through … forms of cultural expression and practice, which we have reason to value …. Cultural influence can make a major difference to work ethics, responsible conduct, spirited motivation, dynamic management, entrepreneurial initiatives, willingness to take risks, and a variety of other aspects of human behavior which can be critical to economic success." (pp. 39-40).

Amendments from Version 2
The name of the first component resulting from the PCA has been changed to 'Research Output' instead of 'Research Performance'. Results and Discussion have been combined into one section. The limitation section, now including information about the lack of time series data in the freely downloadable SCIMAGOJR dataset, has been moved to after that section.

In other words, culture can influence public policy which regulates human capital; whereas, research output and impact depend on human capital, in addition to the fact that research is a contributor to economic growth and development (Blanco et al., 2015). However, this study is cautious for not trapping itself in cultural determinism.
A principal components analysis (PCA) was done, resulting in two extracted components with a total explained variance of 92.073% (Table 2), namely:
• Component 1: "Research Output" (a synthesis of DOC, CITA, CIT, SELF, HINDEX). This component comprises volume-dependent measures (i.e., measures that expand with the quantity of publications of a country).
• Component 2: "Research Impact" (based on CPD alone). This component comprises a volume-free measure (i.e., a measure that is independent of the quantity of publications of a country). The correlation between Component 1 and Component 2 is weak (< 0.2; see also the plots of the indicators in Figure 1). It might be that CPD is more difficult to manipulate or to make an object of authors' engineering.
Descriptive statistics of SCIMAGOJR measures (Table 1) showed that the research output (DOC, CITA, CIT, SELF, HINDEX) and impact (CPD) data are not normally distributed (p of Shapiro-Wilk < .05). Therefore, correlation analysis was done with Spearman's correlation.
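As an illustrative sketch of this decision rule (Shapiro-Wilk first, Spearman if normality is rejected), using scipy on synthetic data in place of the actual worksheet:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
doc = rng.lognormal(mean=8, sigma=1.5, size=60)   # skewed, like DOC counts
cpd = rng.normal(12, 3, size=60)                  # citations per document

# Shapiro-Wilk: p < .05 -> reject normality -> use Spearman's rho.
w, p = stats.shapiro(doc)
if p < 0.05:
    rho, p_rho = stats.spearmanr(doc, cpd)
    print(f"non-normal (Shapiro-Wilk p={p:.4f}); Spearman rho={rho:.3f}")
else:
    r, p_r = stats.pearsonr(doc, cpd)
```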
To anticipate inflated Type I error, the analysis employed a significance level of q (adjusted p) = 0.00714. The four results (Table 3) are as follows. First, Power Distance (PDI) is negatively correlated with Research Impact. This could be because PDI correlates negatively with democracy (Maleki & Hendriks, 2014). A lower level of democracy reduces the opportunity of the academic community to exchange and market (in the broad sense) scientific information, as well as to debate openly. Likewise, democracy that does not flourish deters the use of research results in creating public policies. Science is co-opted or used as just a tool to achieve exclusive interests by ideologues, pundits, and political leaders; they ignore the state-of-the-art nature of the research (Branscomb & Rosenberg, 2012). In addition, PDI might manifest itself in academic writing in the form of rigid, authoritative, defensive, and dogmatic styles (Koutsantoni, 2005). All of these conditions could reduce research impact.
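The Holm (1979) step-down adjustment cited in the note to Table 3 can be sketched as follows; note that 0.00714 ≈ 0.05/7, i.e. a Bonferroni-style bound for seven tests. The raw p-values here are made up for illustration:

```python
def holm_adjust(pvals):
    """Holm (1979) step-down adjusted p-values (monotone, capped at 1)."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # ascending
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, i in enumerate(order):
        adj = min(1.0, (m - rank) * pvals[i])
        running_max = max(running_max, adj)   # enforce monotonicity
        adjusted[i] = running_max
    return adjusted

raw = [0.001, 0.008, 0.020, 0.049, 0.300, 0.700, 0.950]
adj = holm_adjust(raw)
# Compare each adjusted p against the study's significance level.
significant = [q < 0.00714 for q in adj]
```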

Second, Individualism (IDV) is positively correlated with Research Output and Research Impact. The positive correlations could be explained using the findings of Deschacht & Maes (2017). They found that in countries with more individualistic cultures: (1) scientists prioritize their self-development, (2) the records of scientific work are historically longer (usually Western countries), and (3) self-citations flourish more. This does not necessarily mean that there have been citation abuses, but that self-citation is used to refer to authors' prior works, thereby preventing unnecessary repetition of ideas in newer works (Deschacht, 2017). Although IDV and collaboration are often contested (e.g. Kemp, 2013), a "collaborative individualism" (Limerick & Cunnington, 1993), stressing both working together and self-emancipation, is possible, which explains the positive correlation.

Third, Indulgence (IVR) is positively correlated with Research Impact. This may be because IVR, the warranted kind, facilitates academic freedom (Ohmann, 2011), as stated by Jefferson (2011) regarding psychological gratification: "Difference of opinion is advantageous … [F]ree inquiry must be indulged, and how can we wish others to indulge it while we refuse it ourselves" (p. 26). Conversely, restraint (as opposed to indulgence) facilitates the destruction of goal pursuit, e.g. designing and executing impactful studies. IVR may also be related to open science (Adams, 2015), since open science increases public esteem in science. IVR may also manifest itself in a "lovely" academic writing style (Kiriakos & Tienari, 2018). This style is not dry and cold, but rather dialogical, humanistic, more reflexive, and capable of showing authors' courage and vulnerability. Compelling insights are more easily born from writings that embody those qualities; as mentioned, "a thin line exists between interesting insights and self-indulgence" (Nadin & Cassell, 2006, p. 214). Scientific authors who read such works would be attracted to cite them, increasing the works' impact. In addition, "strategic indulgence" is possible and known to be a creative process that enables one to balance academic activity (such as writing) with non-academic activities (Jia et al., 2018), fostering insight.

Fourth, LGDP is positively correlated with Research Output. This is in line with the finding of Mueller et al. (2016) that economic prosperity (per capita GDP) is one of the best predictors of a country's research output.
Partial correlation controlling for LGDP showed that the directions of correlation between variables are the same as in the results of Spearman's correlation above (Table 3), but there is an additional new result (Table 4): Uncertainty Avoidance (UAI) is negatively correlated with Research Impact. This is understandable considering that impactful research requires innovation. The characteristics of UAI, which is intolerant of ideas and practices that are ambiguous and unconventional, do not support innovation (Bauer & Suerdem, 2016). In an uncertainty-avoidant cultural orientation, it is difficult to challenge and discard dysfunctional attitudes and values that are already entrenched. Therefore, it will also be hard to produce breakthroughs in research and publication, reducing the potential for citations per document. One premise advocated by the Leiden Manifesto for Research Metrics is that "Science and technology indicators are prone to conceptual ambiguity and uncertainty and require strong assumptions that are not universally accepted" (Hicks et al., 2015, para. 21). A higher-UAI national culture would adhere to an invariance assumption that is detrimental to the development of science and the real impact of publication. A lack of openness to pluralistic approaches in impact measurement will invite citation cartels. Citations per document (CPD) will be seen reductionistically as the destination of scientific works, so that CPD easily becomes a target of manipulation.
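A partial Spearman correlation of the kind reported in Table 4 can be sketched by rank-transforming the variables, regressing the covariate out of each, and correlating the residuals; the function and the data below are illustrative assumptions, not the SPSS procedure itself:

```python
import numpy as np

def partial_spearman(x, y, covar):
    """Spearman correlation of x and y controlling for covar:
    rank-transform all three, regress the covariate's ranks out of
    x's and y's ranks, then Pearson-correlate the residuals."""
    def ranks(v):
        return np.argsort(np.argsort(v)).astype(float)  # no tie handling
    rx, ry, rc = ranks(x), ranks(y), ranks(covar)
    A = np.column_stack([np.ones_like(rc), rc])          # intercept + covariate
    res_x = rx - A @ np.linalg.lstsq(A, rx, rcond=None)[0]
    res_y = ry - A @ np.linalg.lstsq(A, ry, rcond=None)[0]
    return np.corrcoef(res_x, res_y)[0, 1]

# Sanity check: with y = -x, the partial correlation must be -1
# regardless of the covariate.
x = np.arange(20.0)
check = partial_spearman(x, -x, np.sin(x))

# Synthetic country-level data (variable names and values are made up).
rng = np.random.default_rng(1)
lgdp = rng.normal(4.0, 0.5, 80)                 # log GDP per capita
uai = rng.normal(60.0, 20.0, 80)                # uncertainty avoidance
impact = 2.0 * lgdp - 0.02 * uai + rng.normal(0.0, 0.5, 80)
r = partial_spearman(uai, impact, lgdp)
```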
In fact, we have been reminded that the production of knowledge and its memories must not forget the relevance of knowledge to diverse publics. What is needed is a "careful and conscientious citation ... [citation as] a form of engagement", in which "citation as a crude measure of impact" is only a byproduct of the reflexive action (Mott & Cockayne, 2017, pp. 2, 11). This requires lower UAI.

Research output across regions
Descriptive statistics of national culture, research output, and research impact are shown in Figure 2 and Figure 3.
Eastern Europe's superiority in terms of research output may be due to the rise of democracy, the emergence of the need for research excellence standards, the promotion of international research collaboration, and cooperation with international bodies (such as the World Bank) that enable these countries to enjoy large research grants (Henderson et al., 2012;Švab, 2004).
Henderson et al. (2012) further stated a fact about research culture in Eastern Europe, as follows: "Though not a uniform phenomenon across all disciplines or countries, some participants noted that in CEE (Central and Eastern Europe) research tends to be more dependent on political power. This can relate both to the partisan provision of financial resources and to researchers' ambitions to convince political actors." It appears that political activities are melting pots of the interests of academics, politicians, and research funders, providing work opportunities that have implications for improving research output in the region's countries. Those interests are given "energy" by the people's belief that "Our people are not perfect, but our culture is superior to others." (Kim, 2018, para. 6).
Makri's finding (2018) is echoed by a 2019 commentary noting that there is a "meeting point" between the career interests of faculty members in universities and the business interests of publishing in those countries. This is exacerbated by the relaxation of promotion standards for faculty members, so that a surge of publication occurs in rapidly growing Scopus-indexed journals: "Recently, some indexing systems, like Scopus, have also pursued the same strategy and delisted some of the low-quality journals published in the Middle East and Iran. Although some of the editors and publishers of the delisted journals have attributed these events to political issues, to be honest, I, for one, believe that in most instances, they, themselves, should bear the brunt of the situations they have for their poor work quality." (p. 4). Noteworthy is the fact mentioned by Plackett (2015) that: "The predatory journal industry exists on a spectrum-at one end, some such journals maintain they are conducting valid peer review. At the other end of the spectrum, predatory journals sometimes blackmail academics who eventually realize they've published in a journal with a negative reputation." (para. 21). That is, the issue of predatory journals in the Middle East is not an easy problem to evaluate. This argument is reinforced by Jones' (2015) argument that the flourishing of predatory journals is not the real problem. The fundamental problem, according to Jones, is information inequality; in which case, the prosocial role of librarians and publishers in keeping potential writers away from illegitimate journals may still be difficult to expect. It is not surprising that, based on the results of this present study, even though the research output of the Middle East outperforms Latin America, in terms of research impact the opposite occurs: Latin America outperforms the Middle East, as well as the Asiatic Region and Eastern Europe.
According to SCIMAGOJR data (https://www.scimagojr.com/countryrank.php?region=Latin%20America&order=cd&ord=desc), retrieved in September 2019, the six countries with the highest combination of documents and citations are Panama, Puerto Rico, Uruguay, Costa Rica, Argentina, and Chile. Regarding the literature of these countries, Ward (2016) stated its virtue: "Only with slow, careful, detailed analysis, concern, and empathy even can be liberated from the old ways of seeing" (p. xxiii). These "human qualities" of Latin America's publications may attract citations repeatedly. This explanation, nevertheless, is still speculative and requires testing in subsequent empirical studies.
Plots of national cultures, research output, and log GDP per capita (Figure 2; missing scores do not bring up the line) showed, based on low vs. high research output criteria (< -0.30 σ vs. > 0.30 σ), that among 33 countries (7 low vs. 26 high), (1) the United States, (2) China, (3) the United Kingdom, (4) Germany, and (5) Japan are the countries with the highest research output. Descriptively, in each of these countries, the national cultural orientations that play the largest and the smallest roles are, respectively: (1) individualism, long term orientation; (2) long term orientation, individualism; (3) individualism, uncertainty avoidance; (4) long term orientation, power distance; (5) masculinity, indulgence. For the countries with the lowest research output, no data are available on their national cultural orientation.

Research impact across regions
Latin America's superiority in terms of research impact cannot be separated from the orientation of studies that aspire to decolonize research itself (International Institute of Social Studies, 2019), even beginning from the decolonization of consciousness (Garza, 2010). Decolonization of research in the context of Latin America means restoring the authentic identity of society, from a condition oppressed by "capitalism, hegemony, racism, classism, sexism, etc." (Garza, p. 110) to an emancipated one. There is hope for reconnecting the daily lives of people with their families, communities, and even living creatures, from which they have been alienated by that oppression. The assumption is, "You actually cannot have meaningful, impactful research unless you engage communities" (Janes, 2017, p. 114). Meanwhile, the issues of (de-)colonization are studied very seriously by countries that have experienced a similar fate, and they become a huge source of energy for doing high-impact research, because many problems "have been attributed to the impact of …". Among the countries with the highest research impact is (2) the United Kingdom. Descriptively, in each of these countries, the cultural orientations that play the largest and the smallest roles are, respectively: (1) uncertainty avoidance, power distance; (2) individualism, uncertainty avoidance (as well as power distance). For the countries with the lowest research impact, no data are available on national cultural orientation.

The limitation of SCIMAGOJR data
There are three things to be aware of when reading the results. First, the SCIMAGOJR data (Table 1) include both journal articles and conference proceedings papers, and do not exclude other document types (i.e. short surveys, reviews) (Guerrero-Bote & Moya-Anegón, 2012). A number of countries or institutions exclude non-journal articles when evaluating their performance (e.g. Suryani et al., 2013), so the applicability of the results of this study to those countries might be limited. In this present study, data from SCIMAGOJR are used because, among other reasons, they can be downloaded for free. This limitation may affect the accuracy of the research output and impact measurements.
Second, in a number of dimensions of research output and impact measurement, Scopus, which supplies the data of SCIMAGOJR, has a number of limitations; for example: (1) Scopus has poor coverage of articles, conference papers, and book chapters compared to Crossref, Dimensions, Google Scholar, and Microsoft Academic; (2) Scopus is somewhat late in indexing in-press articles compared to all four; (3) socially, Scopus does not support open citations (Harzing, 2019). However, the limitations of Scopus are offset by its advantages, namely that Scopus is still an extensive source of quality citation data (van Eck et al., 2018).
Third, SCIMAGOJR, at least in its open access form, does not provide time series data. SCIMAGOJR data are cumulative at a particular point in time, not annual. Thus, the results of correlating various variables with SCIMAGOJR indicators are likely to suffer from long-term influences of background trends. However, the author has made a number of attempts to minimize the possibility of correlational bias. First, the author has found theoretical support confirming that national cultural orientation does not fluctuate much between years, e.g. "Hofstede et al. (2010) compare nations to organisms, citizens to cells, and cultures to DNA .... And cultures, like organisms, can stay consistent for long periods, evolve gradually over time, or adapt to sudden changes" (Whalen, 2016, p. 4). Second, the LGDP variable was controlled for (with partial correlation analysis) because it was realized that the correlation between national cultural orientation and research output and impact might be affected by a country's economic situation.

Conclusion
National culture dimensions, especially power distance, individualism, indulgence, and uncertainty avoidance, are pivotal variables to be considered in explaining research impact. In addition, the only cultural dimension that correlates with research output is individualism.
Owing to the fact that national culture is relatively enduring, countries need to measure the elasticity of their hopes and action plans in an effort to boost research output and impact, by integrating the national culture into the estimate. National culture can be integrated as a moderating variable in the predictive relationship between GDP per capita and research output and impact. Diversification of this study, based on document and author collaboration types, indexing databases, disciplines, as well as the history and development of research in a country, is a future opportunity for further study.

Jonathan P. Tennant, IGDORE, Berlin, Germany
I think at this stage, asking to do any more work would be asking too much of an already impressive manuscript.

While I am still a little cautious about the results and the correlations due to the treatment of time series data, I don't think that at the present this should stop this MS from being indexed and more widely used, discussed, and built upon.
Because of this problem, many of the correlation coefficients reported might be artificially higher than what is realistic. I think that this needs to be very carefully considered here. My apologies for not indicating this in the first report, but I could not see the data to check this. The limitations section might work better after the rest of the discussion.

My expertise on the intersection between politics and research is quite limited, and I will refrain from commenting on those elements of the discussion. Although they are, at least to me, very interesting! As much of the discussion again is based on the results, which I suspect might change given my recommendations for the methods above, I will refrain from commenting on them too much at the present. From what I can gauge though, they seem to be well thought out, contain relevant literature, and do not oversell the results too much (in their present state). My apologies for asking for potentially more analytical work to be done at this stage. I feel that it is necessary to look at the data through the lens of time though to better understand some of the results being obtained here. Keep up the great work for now!

First, … results in the discussion section. The author could also consider combining the presentation and the discussion of the results in a single section called 'Results and Discussion'.
Second, in the principal component analysis, I don't think the first component should be labeled 'research performance'. The term 'research performance' is very general and could mean many different things. Therefore, 'research performance' is not a very helpful label for the first component. The essential difference between the two components is that the first one consists of size-dependent variables (i.e., variables that increase with the number of publications of a country) while the second component consists of a size-independent variable (i.e., a variable that is independent of the number of publications of a country). The first component could be labeled 'research output' (since all variables depend on the size of the research output of a country), while the second one could be labeled 'research impact'.
No competing interests were disclosed.
The author relies strongly on statistical significance testing. My recommendation is to leave out all significance tests and instead to present confidence intervals for the correlation coefficients. Significance testing leads to problematic dichotomous thinking, as has for instance been pointed out in a recent contribution in Nature (Amrhein et al.). Following the so-called estimation statistics approach (https://en.wikipedia.org/wiki/Estimation_statistics), reporting confidence intervals is preferable over reporting significance tests. I am aware that another reviewer (Tennant) recommends performing even more significance tests. I disagree with this recommendation. I don't consider this to be good statistical practice.
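For readers who want to follow this estimation-statistics suggestion, an approximate confidence interval for Spearman's rho can be obtained via the Fisher z-transform. The 1.06 standard-error factor is a common large-sample approximation for rank correlations, and the input values below are illustrative, not taken from the study:

```python
import math

def spearman_ci(rho, n, conf=0.95):
    """Approximate CI for Spearman's rho via the Fisher z-transform,
    using the large-sample standard error ~ 1.06 / sqrt(n - 3)."""
    z = math.atanh(rho)
    se = 1.06 / math.sqrt(n - 3)
    crit = {0.95: 1.959964, 0.99: 2.575829}[conf]  # two-sided normal quantiles
    lo, hi = z - crit * se, z + crit * se
    return math.tanh(lo), math.tanh(hi)

# Hypothetical example: rho = -0.45 from n = 65 countries.
lo, hi = spearman_ci(-0.45, n=65)
```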
It would be nice if the author could deepen the analysis a bit more. This can for instance be done by showing scatter plots for the most interesting relationships between variables. In these scatter plots, the names of countries could be shown, especially for those countries that seem to display interesting behavior (e.g., outliers). This would lead to a more in-depth analysis that probably offers richer insights.
The paper uses lots of abbreviations. This makes the paper more difficult to read. My recommendation is to reduce the number of abbreviations that are used. It may also be helpful to include a table listing all abbreviations and the corresponding full terms.
Reviewer Expertise: I am an expert in scientometrics. I don't have any specific expertise on the cultural orientation of countries.
I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard, however I have significant reservations, as outlined above.

IGDORE, Berlin, Germany
The author presents an interesting piece of insight into research impact through a cultural lens, which is quite distinct from a lot of more recent studies which tend to focus on 'academic impact' metrics. As a short note, I found it useful in exposing a different dimension to the ongoing debates around research impact. Given my area of "expertise", I feel that someone who understands social research impact more could provide a great deal of additional insight here during the review process.

Data
The data are included in Figshare, as well as summarized in integrated tables. I note that there are a lot of missing data included though; is this just a case of availability? Also, I note that the Scimago database is based on Scopus data, which tends to be biased in a number of dimensions. Is it possible to make a note of this?

Abstract
The abstract jumps right into results around Individualism, Power distance, and Indulgence, without describing what these are (even briefly). This makes it difficult to understand for readers who are perhaps unfamiliar with these concepts. Perhaps a brief explanation of these could be added instead of describing the methods and the data sources, which aren't really needed?

Introduction
Just to pull out the 'correlation does not imply causation' card here; just because there is a correlation between number of publications and other external factors, does not imply a causal relationship necessarily.
There are a couple of typos (e.g. 'twits') that might just need a quick copy edit to fix. I think the Introduction does a nice job of describing the previous research, and situates the present report well within that. Not sure if the comment about China at the end of the Introduction adds too much here.

Materials and methods
So the methods are pretty simple, which is nice. But also, I think perhaps a bit too simple here given that you're performing a lot of bivariate analyses, and a couple of extra steps are recommended. First, you want to perform an assessment of normality for each data series prior to any correlation analyses, using the Shapiro-Wilk test (e.g., the shapiro.test function in R). From the output, if the p-value is greater than the pre-defined alpha level (traditionally, 0.05), this implies that the distribution of the data is not significantly different from a normal distribution, and therefore you can assume normality and use Pearson's test (Pearson's product-moment correlation coefficient [r]). If p < 0.05, you should instead perform a non-parametric Spearman's rank correlation (ρ). Secondly, once you've done this, for each test, report both the raw and adjusted p-values. The latter can be calculated using the p.adjust() function, using the 'BH' method (Benjamini & Hochberg, 1995). This method accounts for the false-discovery rate when performing multiple hypothesis tests with the same data set, which can inflate Type 1 error (i.e., in order to avoid falsely rejecting a true null hypothesis; a false positive). What this will probably do is reduce the 'significance' of some of your results too (which is why it's best to report both the raw and adjusted values). In addition to this, it seems like you have multivariate data, so multivariate analyses might be more informative here. I would strongly recommend performing a Principal Components Analysis on your data (perhaps only with the variables with more complete data), and inspecting that as a complement to the bivariate ones. This is fairly easy to do and display using in-built functions in R.
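For readers not working in R, the BH adjustment described above can be mirrored in Python; this sketch is intended to reproduce the behavior of R's p.adjust(method = 'BH') on a made-up set of p-values:

```python
def bh_adjust(pvals):
    """Benjamini-Hochberg (1995) adjusted p-values (step-up, controls the
    false-discovery rate), mirroring R's p.adjust(method = 'BH')."""
    m = len(pvals)
    # Process from the largest p-value down, taking a running minimum.
    order = sorted(range(m), key=lambda i: pvals[i], reverse=True)
    adjusted = [0.0] * m
    running_min = 1.0
    for k, i in enumerate(order):
        rank = m - k                      # rank among ascending p-values
        running_min = min(running_min, pvals[i] * m / rank)
        adjusted[i] = min(1.0, running_min)
    return adjusted

adj = bh_adjust([0.01, 0.02, 0.03, 0.20])
```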

Results
I expect that the results will change a bit given my above recommendations to the methods, so won't comment too much on them at this stage. The nice thing about PCA though is that it produces good summary plots, which might be useful here.
In the text, can the country abbreviations be given to make reading a bit easier? M, SD, and N I think need explaining here too. Lots of acronyms can get a bit confusing!

Discussion and conclusions
As above, I don't want to comment too much on the Discussion and Conclusions at the present, as I think the above recommended methods will change some of the interpretations. However, at the present there seems to be a logical progression between reported results and conclusions. Congratulations to the author on a great and interesting piece of work. I would be happy to see a revised version of this too if needed.

If applicable, is the statistical analysis and its interpretation appropriate? Partly
Are all the source data underlying the results available to ensure full reproducibility? Yes

Are the conclusions drawn adequately supported by the results? Yes
Competing Interests: No competing interests were disclosed.

Reviewer Expertise: Palaeontology, Open Scholarly Communication

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard, however I have significant reservations, as outlined above.
