Are pre-compiled citation indexes of peer-reviewed journals an adequate control for research quality? A case study of library and information science

Looks at the South African Department of Education's new recommendations for the evaluation of higher education research in South Africa, and examines two primary aspects: the use of pre-compiled journal lists from overseas, and the apparent reliance on peer review as a guarantee of quality. Pointing out that these are tried and tested standards of quality, the authors argue that there are nonetheless disciplinary differences between experimental sciences such as physics or chemistry and other disciplines that make these measures difficult to apply across the spectrum. They present an analysis of library and information science publications in the chosen lists and point to the weakness of the selection of titles in this discipline. In addition, there are extra difficulties for scientists from South Africa and the developing world in securing publication in premier international library and information science journals. The authors conclude by calling for the employment of other, additional evaluation measures in an integrated system.


Introduction
Governments, research councils and universities around the world seek to encourage and reward high quality research in different ways, including the channelling of funds to those who have produced it in the past, in the hope that these same individuals will be motivated to do it again. But the precondition for this process, the identification of research as being of 'high quality', is a complex and contentious matter, differing between disciplines, and for practical purposes some short cuts are inevitably taken.3 In this paper we focus on journal articles and suggest that the proposed use in South Africa of five pre-compiled lists of accredited journals as indicators of quality (Dept. of Education 2003: 3) may in significant respects constitute such a short cut, even though it is a refinement of previous practice. This may in turn have some deleterious consequences, at least in the library and information science (LIS) discipline. There is of course an extensive literature on the bibliometrics and evaluation of research publication, including useful work done in South Africa itself (see e.g. Jacobs 2001), but much of this is highly technical and may be inaccessible to a non-specialist. Given the importance of the topic, we hope here to stimulate debate among the broader ranks of the LIS profession. If we are accused of adventurism, wrong-headedness or bias, at least we will have succeeded in setting 'the intellectual pot boiling' (Sturrock 1998).
In general, an article as the product of research appears in an academic journal only after undergoing a process of peer review, usually anonymous, by colleagues in the field.4 The aura of authority thus acquired by the published work derives both from the fact of being published in a particular periodical, and from the degree of difficulty attached to gaining acceptance by the more prestigious titles. Such journals in their hierarchical relationships constitute a self-reinforcing network: their contents can be accessed through mainstream indexes, they are included in systems of citation analysis, and so on. Especially in the scientific, technical and medical (hereafter STM) disciplines, this system has the advantage of submitting all contributions to evaluation according to internationally agreed standards of experimental method, reproducibility and significance. The test of quality, in other words, is the same for everybody, and rightly so. By and large, as far as it is possible to tell, the system works. The most important research results do indeed reach their intended audience. But as we shall see, the picture is less clear around the edges.
A little background may be appropriate before we go on to discuss the changes that are currently being implemented in the journal accreditation procedure in this country. According to Section 3(1) of the Higher Education Act (Act 101 of 1997), the Minister of Education is empowered to propose policies and procedures for measuring research output from higher education. This he has duly done, and the new procedures to be adopted were promulgated in September 2003 and will come into force from 1 January 2005 (Dept. of Education, 2003: 2).1 The Department (DoE) has decided to use four general lists of approved academic journals. Publication of an article in a listed journal will attract subsidy; in an unlisted journal, it will not. The chosen lists are the three citation indexes of the Institute for Scientific Information (ISI) and the journal list of the International Bibliography of the Social Sciences (IBSS). These four are to be supplemented by a newly compiled list of South African journals that are not all included in the ISI and IBSS indices.2 The effect of all this is simply to revise and expand the official lists of accredited journals, tinkering with a long accepted and frequently criticised standard without apparently examining its underlying logic.
According to the Department's policy, accredited journals 'must meet the following minimum criteria to be eligible for inclusion in the list of approved journals'. However, it is unclear from the published policy exactly how the use of citation indexes and lists compiled by foreign organisations, over whose selection procedures and criteria the Department has absolutely no control, can constitute a guarantee that all the cited conditions have been satisfied.
The analysis that we offer in this article is by no means intended to be a substantive critique of the two key elements of this system, the pre-compiled citation index or list, and peer review, in and of themselves. They 'ain't broke', so they do not need to be fixed. At the same time, however, we do want to suggest that they are obviously not perfect: they are merely the best that the academy has been able to devise up to now. With regard to the first element, we argue that if the DoE in South Africa is to use general lists of accredited journals as a means of assigning research funds across all disciplines, then some disciplines will certainly do better than others. If this is the case, we recommend that the list system be employed in conjunction with other measures, some possible examples of which we discuss in the conclusion.

The pre-compiled list and its problems
A major attraction of developing and using a list is that it is clear and unambiguous, and can be easily managed: either a journal is in or it is out. Evaluation costs can therefore be kept low. By requiring that journals use peer review for submitted articles, the compilers of such lists implicitly endorse the quality of the titles included, although this is really an honour system: it is enough for the editor to affirm the practice. These are significant advantages. However, the employment of lists that are compiled by external or even overseas bodies, according to standards that may or may not be explicit, appropriate to South Africa, or fully understood, inevitably runs the risk of diluting the guarantee of quality, and may sometimes compromise it.
A well-compiled list presumably provides all concerned with what is effectively an inventory of the core journals in a given discipline, provided that some measure of impact for each title is also available. This said, there are powerful criticisms that can be made both against the use of pre-compiled lists as a scientific practice, and more specifically against the weaknesses of the particular lists that the Department here in South Africa proposes to adopt. We believe that the weaknesses that we can identify may not be atypical and are likely to be replicated across at least some other disciplines.

1. Subsidies to universities and technikons are, in part, calculated on the basis of research reports submitted by the institutions themselves (Dept. of Education 2003: 6-7). An auditor's report (a publication count), based on articles, books, chapters in books and other categories of publication, is compiled every year (n) covering the previous year (n-1). This policy is, in the Department's own caveat, 'not intended to measure all output, but to enhance productivity by recognising the major types of research output produced by higher education institutions' (Dept. of Education 2003).
Bradford's Law of Scattering (Bradford 1948) states that if journals are ranked in order of decreasing productivity of articles on a given subject, a nucleus of highly productive journals in a discipline can always be identified. However, significant articles will also be found in a large volume of non-core journals. The relationship between the number of productive journals in the core and the outlying clusters of less productive journals tends to the ratio 1 : n : n². Thus any comprehensive search for significant articles, however defined, needs to take account of the outlying clusters as well as of the core. However, such a search beyond the core journals involves rapidly diminishing returns. Of particular significance for South Africa is the fact that many developing researchers may be unable to secure acceptance in core journals, which are naturally under pressure from a multitude of authors, and which, as we argue below, may tend to undervalue research from the South. Rather than suffer delays that might affect their career prospects, they are more likely, therefore, to opt for timely publication in a journal from an outlying cluster.1 A secondary and related problem, especially in emerging non-STM disciplines, is the identification and definition of a disciplinary domain. The traditional disciplines such as physics and chemistry, or even sociology or economics, have reasonably well defined boundaries, with journal sets to match. But new cross-disciplinary subjects, such as gender, African and other area studies, HIV/AIDS and knowledge management, are constantly materialising, each with an emerging journal set of its own. The problem is that the new journal sets may not find a place in the ISI or IBSS inventories for a considerable time, if ever.
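Bradford's ratio can be illustrated numerically. The sketch below (with invented article counts, chosen so that n = 3) ranks journals by productivity, splits them into three zones of roughly equal article yield, and reports the number of journals in each zone:

```python
# Illustrative only: synthetic article counts following a Bradford-like
# distribution, used to show the 1 : n : n^2 zone ratio.

def bradford_zones(counts):
    """Split journals (ranked by productivity) into three zones,
    each contributing roughly one third of all articles, and return
    the number of journals in each zone."""
    counts = sorted(counts, reverse=True)
    total = sum(counts)
    zones, zone, cum = [], [], 0
    for c in counts:
        zone.append(c)
        cum += c
        # Close a zone once its cumulative yield reaches the next third.
        if cum >= total / 3 * (len(zones) + 1) and len(zones) < 2:
            zones.append(zone)
            zone = []
    zones.append(zone)
    return [len(z) for z in zones]

# Invented data: 1 core journal with 90 articles, 3 journals with 30,
# and 9 journals with 10, so each zone yields 90 articles.
counts = [90] + [30] * 3 + [10] * 9
print(bradford_zones(counts))  # -> [1, 3, 9], i.e. the ratio 1 : n : n^2 with n = 3
```

The point of the sketch is the last line: a comprehensive search must reach into the nine-journal outer zone to recover the final third of the articles, which is exactly the diminishing-returns problem described above.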
Whilst we accept the need for any new journal to prove its worth in terms of regularity, quality and timeliness, any rigidity in the process of recognition may cause active and productive researchers to publish outside the emerging disciplinary core, thus further delaying the establishment and recognition of that core.In some cases, such an outcome could have seriously negative social and economic consequences, by delaying the publication of significant research results.
Let us turn to the critique of the specific lists under consideration by the Department.
It is in the nature of such lists to be conservative, and this naturally implies certain biases. The five lists are strongly inclined to favour English as the language of academic discourse and publication. This recognises what is clearly an objective reality in the world of scholarly communication.
The Minister of Education, indeed, has stated that he 'acknowledges the current position of English and Afrikaans as the dominant languages of instruction in higher education and believes that in the light of practical and other considerations it will be necessary to work within the confines of the status quo until such time as other South African languages have been developed to a level where they may be used in all higher education functions' (Department of Education 2002: 10).
While we reluctantly accept this as realistic policy, we suggest that the Department may be missing an opportunity for the active promotion of multilingualism. This is a declared policy objective, yet there is no reference to possible support for publication in South African languages anywhere in the Policy and procedures for measurement of research document.
Similarly, the overseas lists and citation indexes are understandably biased in favour of the so-called 'international' journals (i.e. predominantly those published in the United States or the United Kingdom).2 We shall return to this point below, but we want to emphasise that we are most emphatically not advocating a crude 'local is lekker' philosophy at all costs. Of course South African researchers must publish internationally; and of course the communities of referees doing peer review must remain as diverse as possible. But the lists as they are made up at present are skewed.
A subtle but important point is that journals that are marginally included in or excluded from citation lists as the result of ranking by citation impact are vulnerable to 'noise and random effects' that may lead to considerable fluctuations in their rankings over quite short time periods (Rousseau 2002: 428). Thus the inclusion or exclusion of low impact journals from citation lists does not necessarily reflect significant differences in quality.
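This point can be made with a toy example (the impact scores and perturbations below are invented, not drawn from Rousseau): an identical small disturbance leaves well-separated core titles in place but reorders a closely spaced low-impact tail.

```python
# Toy illustration: the same small perturbation leaves high-impact ranks
# untouched but reorders the closely spaced low-impact tail.
impacts = {
    "A": 10.0, "B": 8.0, "C": 6.0,    # clearly separated core titles
    "D": 0.52, "E": 0.50, "F": 0.48,  # low-impact tail, gaps of only 0.02
}
# A uniform disturbance of magnitude 0.03, smaller than any core gap
# but larger than the gaps in the tail.
noise = {"A": -0.03, "B": 0.03, "C": -0.03, "D": -0.03, "E": 0.03, "F": -0.03}

def ranking(scores):
    """Journal names sorted by descending impact score."""
    return sorted(scores, key=lambda j: -scores[j])

before = ranking(impacts)
after = ranking({j: impacts[j] + noise[j] for j in impacts})
print(before)  # ['A', 'B', 'C', 'D', 'E', 'F']
print(after)   # ['A', 'B', 'C', 'E', 'D', 'F']
```

Journals D and E swap places purely because of the disturbance, which is the situation a journal near a list's inclusion threshold faces from one compilation to the next.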
The lists also tend to favour recognition of print or print-plus-electronic publication as a medium, rather than embracing new means of scholarly communication such as purely electronic journals or scholarly open archives.
Adherence to the use of these simple indexes may be predicated on a (perhaps rapidly) disappearing model of scholarly communication.3

1. The importance of this insight becomes even clearer when we consider that in 1995, ISI was indexing about 3,300 scientific journals from the 70,000 or so published worldwide, or about 4.7 percent of the literature (Gibbs, 1995: 76). This level of participation, as Christopher T. Zielinski of the World Health Organization has commented, 'is simply too little to account for the scientific output of eighty percent of the world' (cited by Gibbs, 1995: 79).
2. Rousseau argues, citing Garfield, that there is no 'scientifically valid definition of bias', but does not seem to argue against the truth of the charge (2002: 429).

Peer review and the lSI
Let us now turn to the second element in the model for research evaluation, namely peer review, which is also the second of the Department's listed criteria for accreditation of a journal (Dept. of Education 2003: 6). We propose to draw the reader's attention to several aspects of peer review practice in the special context of the positions and policies of the ISI, compiler of three of the pre-compiled citation indexes to be used by the Department.
The Institute for Scientific Information is a private body, based in Philadelphia in the United States, and perhaps best known among information workers for its untiring advocacy of citation indexing and analysis as a measure of research quality and impact. While this is a defensible position, the concept has been subjected to criticism over the years and the issue remains contentious. For example, a recently published analysis of the Research Assessment Exercises conducted in the UK in the 1990s has argued that there is serious theoretical difficulty in applying bibliometric assumptions based upon citation analysis to the assessment of the quantum of research quality. The paper argues that the evidence for a significant correlation between citation aggregates and research quality for individual entities is weak (Warner 2000). The future value of citation analysis could be to inform, but not to determine, judgements of research quality, and a combination of methods, we believe, is highly desirable.
Peer review, however, is not as important to the ISI as it may appear at first glance. In his voluminous writings on bibliometrics and related topics, the Institute's founder, Eugene Garfield, mentions peer review as a characteristic indicator of quality in scientific journals on many occasions. What is striking, however, is that he clearly does not consider it to be by any means the most important indicator of quality, as the following quotations indicate: 'Peer review of submissions, editorial board membership, and the reputation of the publisher or sponsoring society are other indicators of journal quality.' (Garfield 1990: our emphasis) 'Other indicators of quality are also important -such as whether a journal relies on peer review to assess the relevance of a submitted manuscript ...' (Garfield 1990) Indeed, Garfield goes further. He is quite critical of what he terms 'patterns of sloppiness in peer review' and recognises that the system needs reform, while acknowledging that it has made a 'monumentally valuable contribution' to maintaining quality in the scientific literature (Garfield 1993). In fact, it is not hard to find instances of 'sloppiness'. A couple of examples of scandal will serve. In 2002, it emerged that two eccentric French television personalities had succeeded in publishing papers in mainstream physics journals. The subject matter was so arcane that the physics establishment was reportedly unable to decide whether the papers were an elaborate hoax or represented a major breakthrough in the understanding of 'string theory' (Theoretical physics 2002). Similarly, there have been accusations that the respected journal Nature has published 'junk science' (Milloy 2002), and the Sokal hoax in Social Text in 1996 is widely known (Sturrock 1998). It is obviously encouraging that these cases were spotted quite quickly, even after publication, and we mention these stories to make the obvious point that the peer review procedure by itself works most of the time, especially in the experimental sciences. It cannot, however, prevent occasional slip-ups, or even put a stop to controversy. Quality control in the publication of academic research reports in the disciplinary literatures needs more than one kind of guarantee, especially if it is linked to the allocation of research funds.
In a forthcoming article we intend to explore more fully the origins of peer review in the STM disciplines, with their shared assumptions about scientific method, experimentation and the testing of reported results. The spread of peer review practices from the STM literature into the 'endlessly and happily expansive discourse of thought in general' (Sturrock 1998) may not actually amount to a category mistake, but the assumption that they can provide the same sort of guarantees in the wilder terrain may do so. This is not least because the experimental sciences share a paradigm, while the social sciences and humanities are characterised by competing schools of thought.1 It is hardly surprising, therefore, if rejection rates appear to be higher in the latter disciplines, since a Marxist, let us say, may get less than a sympathetic hearing from a Post-Modernist, even with the best will in the world.
For these reasons, we believe that the reliance in South Africa on lists and citation indexes of accredited journals, however compiled and with whatever degree of care and caution, is an inadequate strategy to certify the quality of the national research enterprise. A reading of the background literature on ISI's activities, as well as an analysis of its actual listings for our discipline (see below), leads to the conclusion that the ranking of a particular journal as high quality depends on a range of objective and subjective factors.
There is another problem for us in South Africa and the underdeveloped world. We might legitimately expect that the DoE would consider the strengthening of local journals as an important policy objective, and indeed, the inclusion of a special South African list is a clear indication that it does. It is equally clear that ISI does not regard the support of local publishing anywhere in the world as part of its mission, to put it mildly. What it terms 'international journals', apparently a coded term for US or British publications, are where it thinks the best articles should appear: 'The fundamental issue for scientists in small countries ... is this -research of international significance can and should be submitted to the international journals. Indeed, we know from extensive studies that the best papers from small countries are published in the international journals. It is quite possible that from time to time we overlook the low impact journal ... .' (Garfield, 1979: 7).1 This is possibly a naive position, as there seems to be some evidence of anti-Third World bias from international journals, even before the peer review process in some cases. To say this is not to take a radical or political position: the distinguished scientist Wieland Gevers of the University of Cape Town commented with some feeling on the topic several years ago,2 and there is little reason to suppose that things have improved.
Rousseau (2002: 429) points out that the bias may be 'inherent in the scientific community as a whole', while conceding that by using ISI products we effectively 'reduce evaluation studies to the North American standard'. In addition, it may be that the ISI's refusal to take an internationalist view is counter-productive. As a US commentator pointed out several years ago, in an article on hidden Third World science: 'The invisibility to which mainstream science publishing condemns most Third World research thwarts the efforts of poor countries to strengthen their indigenous science journals -and with them the quality of research in regions that need it most.' (Gibbs, 1995: 77) Such a system virtually guarantees that the major research libraries of the world will not subscribe to journals that fall outside what has been termed 'the magic circle of citation analysis'. As we have already noted, the system is self-perpetuating and self-reinforcing, both with regard to peer review and with regard to citation impact analysis (Zielinski, quoted by Gibbs, 1995: 78).
In this scenario, the DoE and other funding bodies will effectively subsidise South Africa's researchers, who will then publish overseas in costly 'international' journals approved by ISI, and local libraries will struggle to find the necessary foreign exchange to buy the product back. In a recent paper, Britz and Lor (2003) analyse six forms of information movement from the Southern to the Northern hemispheres, and draw attention to the need to create an ethical and balanced system for promoting an equitable flow of information.

The evaluation system and LIS research
Do 'international' lists present a balanced view of worldwide research? If not, as seems possible, is there an implicit assumption that most Third World research output can safely be ignored? In a recent and relevant study of the international visibility of the Ibero-American LIS literature, Moya-Anegón and Herrero-Solana complain that most of the Latin American and Iberian LIS journals (i.e. in Spanish or Portuguese) are not indexed by ISI (Moya-Anegón & Herrero-Solana, 2002: 54-55). The articles that do get indexed are those which fit best with the current preoccupations of the North American section of our profession. Moya-Anegón and Herrero-Solana also point out that the ISI's Social Science Citation Index hugely over-represents what some politicians now want to call the 'Anglosphere': indeed, the Latin and Latin American literature makes up less than one percent of the total content,3 as Figure 1 below shows.
Invisibility is obviously a serious problem for a research publication, and is taken seriously by the editors of affected journals. In some cases, individual journals have gone to extraordinary lengths to try to keep themselves in the ISI indexes.
The story of the Archivos de Investigacion Medica (AIM) is a salutary example. This Mexican medical journal used to be indexed by ISI but, as a result of the nation's 1982 financial collapse, was forced to suspend publication for six months.
This violated the ISI principle that journals must appear on a strict publication schedule, and AIM was dropped. All the journal's subsequent attempts to gain reinstatement, up to and including abandoning Spanish even in its title, failed to impress ISI, and in 1995 at least, the journal remained in limbo (Gibbs, 1995: 76-77). AIM was not the only victim, however. The number of science journals from less developed countries included in ISI's system dropped from eighty in 1981 to fifty by 1993 (Gibbs, 1995: 78). The problem is not only one of under-representation, but of possible distortion as well. Moya-Anegón and Herrero-Solana point to an instance: Mexico, which is in fact the third most productive nation in Latin America in terms of LIS journals, is seriously underrated in the results produced by the citation analysis method (Moya-Anegón & Herrero-Solana, 2002: 58). Despite these distortions, and the resultant invisibility, Moya-Anegón and Herrero-Solana conclude, without offering any explanation, that most Iberian and Ibero-American LIS researchers actually prefer to publish in local journals.

1. It should be noted that 'best' in this statement apparently means 'most highly cited', a connotation that is questionable, if nothing else.
2. Gevers described the quality of peer reviewing for South African life scientists as 'appalling', adding that it smacked of 'First Worldism' (Gibbs, 1995: 82).
3. The total number of LIS articles for the period 1991-2000 is 74,255, of which 43,120 were US-produced. There are 31 South African and 41 Brazilian articles covered in the same period (Moya-Anegón & Herrero-Solana, 2002: 56 [Table II]). It may be worth noting that LIS is not the only discipline for which this is a problem. For a discussion of the visibility of Argentine literary periodicals in the indexes, see Romanos de Tiratel and others, 2003; with regard to African studies, the critiques date back to the era of the printed index (Scheven, 1977).
In this context, is it likely that the Department's ISI-oriented system will be fair to local LIS researchers? As an exercise, we made a comparison between
• the journal list of Library and Information Science Abstracts (LISA), widely regarded as the premier current abstracting and indexing service for the discipline of library and information science, on the one hand,
• the list of the International Bibliography of the Social Sciences,
• and the generalist ISI citation indexes on the other.
LISA has been published since 1968 and is extensive in its coverage of the country-specific and international journals considered significant in the discipline. It covers the output of more than sixty-eight countries, in more than twenty languages, and has eclectic subject coverage within the domain of library and information science. It thus forms an appropriate base from which to compare the disciplinary coverage of the ISI and South African Department of Education lists.
In Table 1, printed as an appendix to this paper, we present the list of journals that are considered 'accredited' for the publication of articles from South African researchers in the field of library and information science. The number preceding each title is the number of articles indexed up to November 2002 in LISA. This provides a crude ranking, taking into account the fact that the journals may have been published over widely varying periods, may not have been added to the LISA service at the same time, and may have differing frequency and volume of contents. The number following each title is its International Standard Serial Number (ISSN).
Of the 476 LIS-related journals listed in the LISA database, 98 are also listed on the ISI indexes; three are included on the South African Department of Education list, and two (which are also listed on the ISI indexes) are included in the list of the International Bibliography of the Social Sciences. The South African Journal of Information Management was also included, although it is not yet abstracted by LISA. This indicates that only research and papers published in these 101 journals would be 'accredited' by this system. The latest versions of both lists were used for this check. If only English-language titles are counted, the total is reduced to 97. This may seem more than adequate, but it is also evident that the spread of titles selected within the discipline is strongly biased towards technology. Of the 101 titles, 36 are technology-focussed. There are no journals included that specialise in research about public libraries, work with children, or school librarianship, aspects of critical importance in developing and multi-cultural societies such as ours. The lists largely neglect journals dealing with policy development, indigenous knowledge systems, and issues relating to the information society.
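To put these figures in proportion, the arithmetic can be restated compactly (a sketch using only the counts reported above; the totals, not the underlying journal titles, are what the calculation assumes):

```python
# Counts reported in the comparison above.
lisa_total = 476    # LIS-related journals indexed in LISA
on_isi = 98         # of those, also on the ISI citation indexes
accredited = 101    # total accredited outlets for SA LIS researchers
tech_focused = 36   # accredited titles with a technology focus

# Only about a fifth of the discipline's indexed journals carry subsidy.
print(f"{on_isi / lisa_total:.1%} of LISA titles are ISI-listed")      # 20.6%
# Over a third of the accredited pool is technology-oriented.
print(f"{tech_focused / accredited:.0%} technology-focussed")          # 36%
print(f"{accredited - tech_focused} titles for all other specialisms") # 65
```

The sketch makes the imbalance concrete: roughly four out of five journals that LISA considers significant in the discipline fall outside the accredited pool altogether, and the pool that remains is skewed towards technology.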
A counter-argument is that papers in such areas as policy development could be submitted to accredited journals outside the LIS discipline. This is a tendentious argument, however, because it neglects the importance of exploring such topics within the context of the LIS discipline; moreover, the purpose of LIS research is to develop the discipline and ensure that important lines of thought are conveyed to its practitioners. Publishing outside the discipline tends to dilute both intentions.
These inadequacies become all the more poignant when one considers that a significant change in recent years in South African higher education has been the encouragement of faculty members in technikons to engage in research and to publish. Within the LIS discipline, the pool of potential contributors to the research culture has thus increased, and there will be increasing pressure on journals as new authors seek to publish. We recognise that rejection rates at premier international LIS journals are high, and that the factors influencing where authors decide to publish may be complex. Nevertheless, it is our contention that the pool of DoE-accredited journals is so small that many would-be contributors will be delayed or disappointed in their quest. The implication is that research in LIS of good to high quality may not necessarily find a publisher, or recognition if it does.
An additional point relates to the state of the LIS discipline in South Africa. Democratic development engenders many changes and challenges for the structure, administration and practice of library and information science: these matters are mostly of concern to practitioners within South Africa and, however well researched and written, are unlikely to engage the interest of editors and reviewers of 'international' journals.1 This is not to diminish the importance of such research and writing, but it is to underline the improbability of recognition of its value through publication in such journals. Given that the only accredited journals in LIS originating from South Africa are the South African Journal of Libraries and Information Science (approximately eight peer-reviewed articles a year), Mousaion (approximately fourteen peer-reviewed articles a year) and the South African Journal of Information Management (approximately sixteen peer-reviewed articles a year), it is evident that the chances of being published in them are correspondingly small. It is highly likely that research and development having a potential to influence LIS practice for the benefit of South African society will not find a place in an accredited journal.

Conclusion: One size does not necessarily fit all
The identification of high quality in research output is an ongoing and implicit process within the research community in any given discipline, and establishes a shifting hierarchy of significance. The process of awarding funds for research, on the other hand, establishes a hierarchy of value that overlaps with, but is not necessarily identical to, the hierarchy of quality; for one thing, it is likely to be less comprehensive, as money is scarce. Thus, not all quality research outputs have the same value, and some may be significantly more valuable than others. This discrimination is necessary from the government's point of view to limit the possibility of an uncontrolled demand for public funding. In the case of journal articles, still a dominant form of scholarly communication, only those published in approved journals attract government subsidies. Thus, the acceptance or rejection of a journal article by government for subsidy purposes determines the distribution of funds. It follows that, for any given discipline, the compilation of the list of 'accredited journals' is potentially a controversial exercise that is unlikely to satisfy everyone completely. From the researcher's viewpoint, such a list should ideally both reflect the spread of the discipline and identify as many suitable journals as possible, to increase the possibility of financial recognition for individual published work.
Our conclusions are based substantially upon an analysis of only one discipline. Furthermore, it is a discipline that largely focuses upon practice and policy development. The conclusions may not, therefore, be valid for the full range of social sciences, and may be even less so for experimental science. However, we are sufficiently exercised by the newly promulgated evaluation policy to suggest that similar analytical exercises might be conducted within other disciplines. They could serve to establish whether there is, in fact, any cause for alarm in other domains about a reliance on highly exclusive lists of approved journals. It seems to us that both components of this model, peer review itself as commonly practised and the use of select lists to confer canonical status, do not serve to encourage the local production or the local publication of quality research in South Africa. What, then, do we think might be done, with respect at least to our own discipline? If pressed, we should like to see the use of the ISI and IBSS lists considered as one element in an integrated range of measures. The relative weight that inclusion of a journal title in such citation indexes and lists should bear is a problem that will require careful consideration.
We accept their necessity, in other words, but are unconvinced of their sufficiency. The establishment of an integrated range of assessment measures could, for example, be the outcome of a review process conducted under the aegis of the professional body for our discipline in South Africa, LIASA (the Library and Information Association of South Africa). In this way some assurance might be gained that research conducted by South African LIS researchers and funded by national bodies for the benefit of citizens will be readily available in locally accessible sources, and will not become the private intellectual property of foreign interests.1 At the same time, it will encourage the survival and growth of local journals, and with them the research culture in our universities and technikons.
(a) The purpose of the journal must be to disseminate research results and the content must support high level learning, teaching and research in the relevant subject area
(b) Articles accepted for publication in the journal must be peer reviewed
(c) The majority of contributions to the journal must be beyond a single institution
(d) The journal must have an International Standard Serial Number (ISSN)
(e) The journal must be published regularly
(f) The journal must have an editorial board that includes members beyond a single institution and is reflective of expertise in the relevant subject area
(g) The journal must be distributed beyond a single institution' (Dept. of Education 2003: 6).
It is necessary to review the 'accepted' list with care because there are some oddities: College and Research Libraries News covers recent professional events and does not generally include scholarly articles. Similar doubts may be raised about Government Information Quarterly and Information Today. New Scientist would hardly count as a mainstream LIS journal. Adding language of publication as another possible restriction, the number of 'acceptable' LIS journals in the

SAJnl Libs & Info Sci 2005, 71 (1)

1. Dr Simon Donnelly, formerly Lecturer in Linguistics at the University of the Witwatersrand, makes a similar point in relation to the publication in South Africa of research articles about minority and threatened languages (Word of Mouth, interviewed by John Orr, SAfm, 18th April 2004).

Education 2003: 1).

2. The Department of Education list may be found at http://www.education.gov.za/content/documents/307.pdf [site visited 11 April 2004]; the ISI master journal list and other information may be found at http://www.isinet.com/isi/journals/ [site visited 11 April 2004]; the journal list of the International Bibliography of the Social Sciences may be found at http://www.lse.ac.uk/collections/IBSS/about/journals.htm [site visited 11 April 2004].