Electronic, Scientific, Technical and Medical Journal Publishing and its Implications: A meeting at the National Academy of Sciences - May 19-20, 2003

Library Hi Tech News

ISSN: 0741-9058

Article publication date: 1 June 2003


Citation

Gelfand, J. (2003), "Electronic, Scientific, Technical and Medical Journal Publishing and its Implications: A meeting at the National Academy of Sciences - May 19-20, 2003", Library Hi Tech News, Vol. 20 No. 6. https://doi.org/10.1108/lhtn.2003.23920fac.006

Publisher: Emerald Group Publishing Limited

Copyright © 2003, MCB UP Limited


Electronic, Scientific, Technical and Medical Journal Publishing and its Implications: A meeting at the National Academy of Sciences - May 19-20, 2003

Julia Gelfand

This symposium, "Electronic scientific, technical and medical journal publishing and its implications" held at the National Academy of Sciences in Washington, DC, May 19-20, 2003 was sponsored by the following groups of the National Academies:

  • Committee on Science, Engineering and Public Policy.

  • Board on Life Sciences.

  • Computer Science and Telecommunications Board.

In the planning stages for many months, the symposium focused on new trends and took a broad look across the scientific, technical and medical (STM) journal enterprise as it currently is configured and how it can evolve with new technologies and reader expectations. The symposium had four goals:

  1. Identify the recent technical changes in publishing and other factors that influence the decisions of journal publishers to produce journals electronically.

  2. Identify the needs of the scientific, engineering and medical community as users of journals, whether electronic or printed.

  3. Discuss the responses of not-for-profit and commercial STM publishers and of other stakeholders in the STM community to the opportunities and challenges posed by the shift to electronic publishing.

  4. Examine the spectrum of proposals that has been put forth to respond to the needs of users as the publishing industry shifts to electronic information production and dissemination.

These were laudable goals after nearly a decade of new experiences and opportunities created by emerging technologies and increasingly sophisticated user/reader expectations. After two days, however, the symposium proved most successful at sharing content and status reports rather than at defining any clear trajectory or set of directions. All the sessions were very interesting, but answers to questions were scarce. Predictions and expectations were articulated, but how the information industry will work with contributors and institutional subscribers remains to be seen, as the economic forecast for library materials budgets is bleak for the next few years.

To understand why the National Academies hosted this symposium, recall that the Academies issue two classes of reports: science for policy, and policy for science. Both kinds of reports promote, validate and disseminate scientific work through the major government agencies and the prolific journal literature. The work of this symposium falls into the latter category, for bringing together a wide spectrum of knowledgeable people leads to provocative ideas and promotes greater discussion.

After several introductions by the leadership of the National Academy of Sciences, James Duderstadt, President Emeritus and University Professor of Science and Engineering at the University of Michigan and director of its Millennium Project, delivered the keynote address. Author of the recent book Higher Education in the Digital Age: Technology Issues and Strategies for American Colleges and Universities (Greenwood Press, 2002), Duderstadt highlighted several themes and suggested that, to date, academics have dealt with these issues mainly by avoidance:

  • Price escalation in digital publishing continues.

  • Price increases in digital reference tools are significant – often as much as 600 percent over print.

  • Licensing schemes.

  • Nature of searching – "supersizing" – access to a publisher's full stable of titles through new consortia.

  • Shift of first-sale.

  • Open source strategies – new initiatives in open software.

  • The perception of the university as a public good that produces content is reinforced.

  • User is assuming more costs.

To jumpstart the program, Duderstadt addressed the changing nature of research and how important intellectual property has become. The shift in new knowledge creation poses many challenges, including how libraries have moved from a "sense of place" to a sense of utility. The evolution of digital technology remains incomplete; new roles and enterprises force constant identification of these new technologies, their implications remain unclear, and the role of stakeholders such as government and industry remains clouded. The semantic Web is another mystery, and disruptive technologies may lie ahead. Today's emphasis on access to information may conflict with openness, and the tendency to exploit commercialization poses important new constraints that temper how scholarly publishing evolves and is sustained.

The symposium was arranged around five panels. The first, on the Costs of Publication, was economically focused and moderated by Dr Floyd Bloom, who noted the many different and sometimes competing interests in STM publishing: as a scientist, reviewer, editor, reader and user he was far less concerned with publishing costs than he was as President of the American Association for the Advancement of Science. Five speakers participated in this session. Michael Keller from HighWire Press and Stanford University provided an overview that included data about the institutional marketplace; he showed that 50 percent of STM journals are published by the 20 largest global publishers and that STM has been the fastest growing segment of the media industry for the past 15 years. With the proliferation of new content, especially in digital form, and the growth of library consortia, personal subscriptions have declined sharply. The "Big Deal," in which library consortia licensed large chunks of content from the major publishers, is coming to an end as institutional budgets become more restrictive and more selective collecting becomes necessary. Animosity towards for-profit publishers is more widely sensed as libraries want more freedom in their choices.

Some costs of manuscript preparation and distribution of the final product are now more easily supported by networked applications, and other publishing expenses can be reduced. Questions about why digital products cost so much have thus prompted alternative options, including digital repositories and other forms of eScholarship initiatives. Keller shared the Stanford eJournal Study (http://ejust.stanford.edu), which tracked journal preparation and distribution costs; its conclusions confirm that money in higher education will be tight for the next few years. Keller also predicted a continuing need for peer review and more participation in preprint archives, with the publications most at risk being secondary sources such as indexing and abstracting services and other tertiary products.

Kent Anderson, Publishing Director for the New England Journal of Medicine, described how the definition of publishing is changing. Readers' desire to have content in perpetuity, and library consortia's careful analysis of usage data, will challenge publishers. Physicians are not willing to give up print, and with print costs remaining steady while online costs soar, new modalities will emerge. He called for renewed investment in the education mission: it is harder to identify who the readers are, and they in turn have greater customer service needs. Maintaining behind-the-scenes customer support and the technical skill sets now required of publishing personnel contributes to higher publishing costs.

Robert Bovenschulte, Director of the Publications Division of the American Chemical Society (ACS), identified two forces driving change in publishing: the cost of publishing and the volume of publishing. Submissions to the ACS have grown at double-digit rates over the last two years, technology pervades every aspect of STM publishing, and there are many new platforms besides the Web to support. Archiving has seriously increased costs. According to Bovenschulte, ending print publishing is a worthwhile goal and may, over time, offer 15-20 percent savings, but it will not happen soon.

Bernard Rous, Deputy Director and Electronic Publisher for the Association for Computing Machinery (ACM), characterized STM publishing as bimodal, an environment composed of print and electronic components, with much guesswork about what ePublishing will become in a future that has yet to reach a steady state. Parallel systems are needed to serve both cultures, hence the high costs. Like previous speakers, Rous said that many STM publishers did not fully anticipate the need for customer support, the role metadata would play in preparing content and creating new taxonomies, or the increased costs of licensing and negotiating agreements compared with selling traditional print subscriptions.

Gordon Tibbitts, President of Blackwell Publishing, USA, was the final speaker in this session, and he tried to show where the cost tipping points lie in the shift from print to electronic content. Readers and institutional subscribers continue to have difficulty distinguishing between the features they would like to have and those they must have. The cost shifting will reflect the complexity of layout, declining personal subscriptions and growing submissions. Tibbitts concluded that one can focus on simpler solutions, think in phases of implementation and encourage cooperative efforts to respond to increasing demands. The transition from a publishing industry to an information industry, with so much innovation and collaboration among many partners, will continue to drive change.

The second panel, on publication business models and revenue streams, was moderated by Professor Jeffrey MacKie-Mason from the University of Michigan, who also gave the opening presentation when representatives from the Berkeley Electronic Press (bepress) were absent. The defining question was how to recover production costs while inducing ongoing investment. Revenue is the transfer of value between parties, who include teachers, the public, practitioners and industry. Access points affect the structure of business models and reflect the range of universities, laboratories and governments around the world. Different forms of access and the role of government revenue provide a range of options in creating business models. The impact on science is important because it bears on three critical practices: collaboration is a premium and costly way of doing business; distribution to the Third World is essential, but that market cannot pay the same costs; and ensuring that professional peer review takes place is also costly.

Brian Crawford from Wiley, Joseph Esposito from SRI Consulting, Wendy Lougee, Director of the University of Minnesota Library, and Patrick Brown, a biochemist at Stanford, spoke at this session. Crawford provided a broad overview of access-related issues. If access drives usage, then the need for better customer relationship management grows. Access via IP authentication may not be the wave of the future, but if it is, greater enhancements will be made. Publishers will continue to distinguish between customer-paid and author-paid models in their business plans.

Esposito predicted that consolidation among publishers will continue despite threats of antitrust action. He thinks there may come to be more information providers for the STM markets than publishers. Consolidation may also take a different direction, with smaller publishers linking together to become larger companies rather than simply being absorbed or bought out by the established global players; to achieve critical mass, smaller companies may choose to have their content delivered on the platforms of larger ones. The open access movement will continue to influence pricing models, and publishers will likely respond by offering bundling opportunities. Another method publishers may adopt, to achieve downstream value migration and to target competitors, is to disintermediate the wholesalers or vendors. The group of publishers that may be most vulnerable is the not-for-profits, and Esposito identified two prospective targets: ACS and OCLC. In addition to the strategies of consolidation, bundling, downstream value migration and targeting vulnerable competitors, Esposito mentioned two product-based strategies: the creation of meta-content and the shift to Web services. In conclusion, he suggested that publishers may engage in channel diversification, seeking new sales channels, especially in the commercial sector. The publications businesses need most are those of immediate use, which will tend to shift capital investment towards more applied content.

Lougee sees tension in the academic library community between cost and value. She also sees the demise of the middleman, specifically the serials vendor, and a subtle trend from publisher as product to publisher as process, with the challenge being to move from managing fixed content to managing evolving content. Libraries will see a shift from seeking savings on subscriptions to investing in archiving solutions. New roles for libraries may include larger libraries serving as incubators, helping smaller libraries and publishers navigate these changing and challenging times.

Patrick Brown reminded the audience of several initiatives from the science community. The Public Library of Science (www.publiclibraryofscience.org/) and PubMed Central (www.pubmedcentral.nih.gov/) encourage open access, a business model conditioned on a perpetual right of access and the availability of suitable eFormats. He also suggested examining Creative Commons (www.creativecommons.org), the attribution-licensing initiative created by Professor Lawrence Lessig at Stanford. In the life sciences, two strong and widespread models of open access already exist: GenBank and the genome sequencing projects, where costs are charged to authors and their institutional sponsors.

The third panel addressed Legal Issues in Production, Dissemination and Use. Three speakers covered this subject: Ann Okerson, AUL for Collections and Technical Services at Yale University Libraries; Jane Ginsburg from the Columbia University Law School; and Michael Jensen, Professor Emeritus at the Harvard Business School. Ginsburg compiled a table of STM journal contracts based on the RoMEO (Rights Metadata for Open Archiving) Project at Loughborough University and reviewed them against three questions:

  1. What rights does copyright establish?

  2. Who owns content?

  3. Assuming the scientist or academic owns the content, what rights does the author give up?

Ann Okerson reviewed the theories and practices associated with licensing. Even though some consider licensing to be in a state of flux, it clarifies and disambiguates numerous matters, opening a dialog between publisher/provider and subscriber. In the USA, licensing is governed by state jurisdiction. There are some hot issues for libraries and many policy questions still to be resolved. For libraries, the thorniest issues concern interlibrary lending, perpetual access, archiving, and the fact that licenses tend to privilege richer libraries with larger staffs that have the expertise and time to devote to licensing. For consumers, shrinkwrap or click-through licenses can seriously disadvantage purchasers. With the movement to online-only publishing, the future will tell what evolves. Some good examples of licenses are noted on Okerson's Liblicense Web site at: www.yale.edu/~license/authors-licenses.shtml

Michael Jensen, an economist, reviewed how the Social Science Research Network (SSRN) (www.ssrn.com/) provides access to abstracts from 405 journals and the full text of 103 journals.

The second day of the symposium had a more futuristic focus. Opening remarks by Professor Daniel Atkins from the School of Information at the University of Michigan set the stage for his colleague, Professor Paul Resnick, and subsequent speakers Richard Luce, Director of the Research Library at Los Alamos National Laboratory, and Professor Hal Abelson from MIT. These three pioneers, committed to open archive architecture, are specialists in information and communication technology and founders of increasingly popular initiatives such as the Free Software Foundation and the Creative Commons. Resnick championed the democratization of feedback, demonstrating how economically viable online businesses and information providers (e.g. ebay.com and amazon.com) offer elements that could succeed in publishing, and how publishing venues such as www.merlot.org share some of the same positive attributes. Future parallels for scientific publishing may include integrating more behavioral indicators and metrics and a greater reliance on evaluation.

Luce noted that in the nearly a decade since the high-energy physics preprint server at http://arXiv.org/ went public, many other preprint servers have been launched, such as Figaro (www.aao.gov.au/), along with eScholarship initiatives such as MIT's DSpace and the University of California's CDL (http://escholarship.cdlib.org/). These models can either co-exist with current publishing models or help the current system evolve into something better able to meet the needs of the scientific community. One of the major issues associated with preprints is how to "dissect the peer review stranglehold." Other still-unresolved issues include archiving, long-term curation, the new expectations of communities of scholars and scientists, and transdisciplinary boundaries.

Abelson reviewed the different projects launched at MIT that pushed the envelope ahead of its time and demonstrated the institution's commitment to "generating, disseminating and preserving knowledge." The MIT projects are most interesting because each includes a number of partners and collaborators and operates at a scale at which much can be learned quickly in the arenas of scholarship and instruction. Clones have already surfaced at other institutions, and the horizon is bright for alternative publishing options and for using the Internet for educational purposes. DSpace and OpenCourseWare are directions scientific literature can continue to explore if it adopts a more open access model and if copyright law makes it easier to share content. Abelson's final slide asked, "Can new technologies give universities a seat at the bargaining table?" That is what we are anxious to learn.

Among the most interesting sessions was the last formal session, on What Constitutes a Publication in the Digital Environment, where Clifford Lynch, CEO of the Coalition for Networked Information, introduced three large-scale STM projects from different environments and disciplines. Monica Bradford, Executive Editor of Science at the AAAS, introduced the AAAS' new electronic product, the Signal Transduction Knowledge Environment (STKE), and explained the challenges of making this resource serve biology and the life sciences as well as computational, statistical and information science. It is clear that institutional subscriptions will have to replace income lost from other revenue streams at the AAAS. Within a year, STKE exceeded expectations in both its use of technology and its number of subscribers.

Alex Szalay, Professor of Astronomy at Johns Hopkins, demonstrated how he and his colleagues have produced a very large astronomical data set, known as the Virtual Observatory (www.us-vo.org/). He described the changing roles of a scientist establishing a presence with this kind of large, exponentially growing data set. Project direction, assuming the roles of publisher and curator, fiscal oversight, the need for more standards, templates and documentation, expanded software capabilities and new technologies are all critical to the ongoing success of such a large-scale project; what sustains the contributors is the project's contribution to teaching and research and the knowledge that it can always be improved. Emerging concepts, including standardizing distributed data, utilizing XML, SOAP and WSDL to analyze data, and the new distributed computing processes of grid services, all contribute to the goal of creating the most detailed map of the Northern sky. The project makes all the data available on the Internet, with applications in a distributed, worldwide, cross-indexed resource. The new models these scientists are creating are making data exploration the next emerging branch of science. With many challenges ahead in establishing scalable solutions for the increased flow of new data on the way, scientific publishing will become even more complex, dealing with petabytes by 2010.

Biological research was the final example. David Lipman, Director of the National Institutes of Health's National Center for Biotechnology Information (www.ncbi.nlm.nih.gov/), demonstrated the proliferation of data in functional genomics and cell biology. The scenarios are the same: more powerful software, greater ease of use, more collaboration with colleagues around the globe and more functionality make ePublishing the only viable way to handle data of this magnitude.

The final wrap-up session was led by Mary Waltham, with three respondents offering their ideas about where these fascinating challenges are leading the scientific community and the publishing industry in particular. I was unable to attend this session, but we do know that the concept of a publication is still evolving and that serious concerns remain about costs, archiving, economies of scale, disaggregation, quality control, business models, copyright, licensing, interoperability, integration, credits and rewards, partnerships and collaboration. The horizon is complex, the directions are fuzzy, and everyone seeks more granularity and an economic model that can be sustained. STM publishing will continue to challenge and haunt us, but with the cast of characters in this symposium's line-up, one could not have had a better overview of the current landscape. Selected symposium papers are available at www.nationalacademies.org/cosepup/E-pubagenda.html

Julia Gelfand(jgelfand@uci.edu) is the Applied Sciences and Engineering Librarian at the University of California, Irvine Libraries, California, USA.
