To predict the future of academic publishing, it makes sense to focus on conflicts in the existing library system. In general, the situation can be divided into the time before the advent of the internet and the time after the 1990s, which is shaped by electronic communication.
Academic publishing before the internet was centered around special libraries. The reason was that only a special library is able to provide detailed knowledge. In a single building, all the books and journals about a subject can be collected. With a general library this was not possible, because the number of shelves and the costs would explode. If no internet is available and printed books are the only medium, a special library fulfills the needs of professional engineers well. The user comes with a concrete, specialized problem, and the library provides the answer to the question.
During the transition from printed journals to online journals, a bottleneck became visible. Most specialized libraries failed to transfer their concept to the internet. Some library portals are available online, providing access to a small number of resources from a special domain. These websites have failed. The reason is that internet users are not interested in using 100 different websites; they are interested in a meta-search engine which provides as much information as possible. As a result, it is not specialized libraries but information-aggregating search engines that have become famous in the internet age. The most famous ones are Google Scholar and Elsevier-based universal search engines.
It seems that in the internet age, the concept of a specialized library has become obsolete. This has put a lot of stress on the system, because there is a need for something new, but nobody knows what the future library will look like. What is certain is the old, outdated concept of specialized libraries. The main idea was to work with constraints which reduce complexity. In a specialized library for computer science, only books and journals from this subject are welcome. Everything else is ignored and called not relevant. That means, if a user in a specialized computer science library would like to read an article about music theory, the request is denied because it doesn't fit the core subject. Additionally, all the users in a specialized library have an expert background by default. They are experts in a certain subject but not informed about all the other knowledge in the world.
The main benefit of complexity reduction is lower costs. If books outside the specialized domain are ignored, they do not have to be stored on the shelves. This saves money, human labor and physical space. The combination of low costs plus in-depth information access was the success factor for the special library.
The working hypothesis is that with the advent of the internet, the restrictions of the past do not make sense anymore. The problem is that the complexity has exploded. The world of a special library was easy to understand. But if all specialized libraries merge into a single universal library, it's unclear what the system is about.
Discipline-oriented digital libraries
Since the 1990s, some attempts have been made to build so-called “discipline-oriented digital libraries”. The idea is to transfer the concept of a special library into the internet age. This idea makes sense but fails at the same time. From the perspective of a printed special library, it makes sense. A special library is the core of academic infrastructure. Its major bottleneck is that no fulltext search is possible. Providing the same information in an online format makes a lot of sense for the users.
At the same time, these websites have failed, because the result is that lots of different search engines are available. What the user wants is a meta-search engine which can search through all the content. One option for doing so are the information aggregation websites from Elsevier and Springer, which combine the content from different special libraries. The disadvantage is that a publishing house is different from a library. So the question remains open what an internet version of a special library will look like.
The reason why it makes sense to analyze classical special libraries in detail is that they can answer the question of what academic publishing is. Academic excellence and a special library are the same. So what exactly is the difference between a normal library and a special library? It has to do with a certain sort of information. In a normal / general library, many books are about fictional topics. The novel Anna Karenina by Leo Tolstoy is an example. A specialized library won't collect such a book, even if it's world literature. A second property of a special library is that it provides in-depth knowledge about a concrete domain.
So we can summarize that specialized libraries are focused on non-fictional information with a detailed topic in mind. If a document contains lots of the special vocabulary spoken by experts in the field, and provides non-fictional content, then the document is scientific.
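The criterion can be sketched as a small heuristic. This is only an illustration of the rule stated above, not a real classifier; the vocabulary list, the threshold, and the function name are invented for the example.

```python
# Illustrative sketch: a document counts as "scientific" if it is
# non-fiction and uses enough expert vocabulary. The word list and
# threshold below are assumptions made up for this example.

EXPERT_VOCABULARY = {"backpropagation", "gradient", "eigenvalue",
                     "topology", "heuristic", "complexity"}

def is_scientific(text: str, non_fiction: bool, threshold: int = 2) -> bool:
    """Return True if the document is non-fiction and contains at
    least `threshold` distinct expert terms."""
    words = {w.strip(".,;:").lower() for w in text.split()}
    hits = words & EXPERT_VOCABULARY
    return non_fiction and len(hits) >= threshold

print(is_scientific("The gradient is computed via backpropagation.", True))
# True
print(is_scientific("A thrilling adventure across the sea.", False))
# False
```

A real system would of course need a per-discipline vocabulary, which is exactly what a special library encodes implicitly in its collection policy.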
Building a digital discipline-oriented library
Reproducing the specialized focus of an academic library in the internet age can be realized with tags. The difference between a fictional book from a general library and a scientific book is that the second one is tagged differently. A non-scientific book can be tagged with “novel, adventure”, while an academic book is tagged with “non-fiction, computer science, neural networks”. What academic users demand is content which is tagged in a certain way.
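The tag idea can be made concrete with a minimal sketch: a catalog of tagged documents and a query that keeps only the records carrying every tag a discipline demands. The catalog records and tag names are illustrative assumptions, not a real library schema.

```python
# Minimal sketch of a tag-filtered "special library". The records
# and tags are invented for this example.

catalog = [
    {"title": "Anna Karenina",
     "tags": {"novel", "fiction", "world literature"}},
    {"title": "Neural Networks from Scratch",
     "tags": {"non-fiction", "computer science", "neural networks"}},
    {"title": "Intro to Music Theory",
     "tags": {"non-fiction", "music"}},
]

def discipline_query(catalog, required_tags):
    """Emulate a special library: return only documents carrying
    every required tag."""
    required = set(required_tags)
    return [doc["title"] for doc in catalog if required <= doc["tags"]]

print(discipline_query(catalog, {"non-fiction", "computer science"}))
# ['Neural Networks from Scratch']
```

The point of the sketch is that the old physical constraint (only computer science books enter the building) becomes a query constraint, while the underlying universal catalog can contain everything.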
Specialized libraries and special journals create their reputation by focusing on a certain topic. Academic work means ignoring most of the information and orienting oneself only towards a sub-part of the problem. Roughly speaking, a journal which publishes papers from different disciplines is not an academic journal. Only a journal which is dedicated to a concrete issue is able to create a reputation.
On YouTube there is a video available which introduces a science library from the late 1970s: Life Sciences Library 1979 - Restored Version, https://www.youtube.com/watch?v=xe0lFUlz-io Even if the library is not very large and works with printed index cards, it's a state-of-the-art library even by today's standards. That means, if the internet were offline, a specialized library would provide the same or even better information than is available online. The small number of books and the missing fulltext search are not a problem in a special library, because the content can be explored manually.
Science libraries are the core of academic excellence
In the public perception, special libraries are often ignored. They don't have prestigious buildings, and in contrast to universal libraries the number of books they have to offer is small. Most special libraries are small in size. That means the number of employees is below 100, and the library is located in a single building. On the other hand, special libraries are more important for researchers than large universal libraries. What researchers do first and foremost is focus on a concrete topic. This is equivalent to becoming an expert. And special libraries are needed by the experts for reading new information.
A special library is the same as academic work. It's not possible to become an expert in everything. But there are domains like physics, mathematics, biology and so on. It makes sense to collect all the information about one domain in a single library and to avoid creating larger universal spaces. This is not rooted in a certain mentality; the simple reason has to do with costs. A special library produces lower costs than a universal library. At the same time, a special library creates a high entry barrier. Users from a different department are not allowed to enter a special library. Only mathematicians can visit a mathematical library.
In the electronic age the borders have blurred. The idea is to get access to information worldwide and from all subjects under a single search engine. This new understanding of a knowledge hub works differently from the former science library. And the question is how to maintain the usual specialized quality standard in the electronic age.
Let us give a concrete example. The hot topic in the age of Open Access is so-called predatory publishers. These are electronic journals which have lowered the entry barrier. Predatory journals are perceived by academics as low-quality journals. But what if someone creates a specialized predatory journal? That is, a low-entry-barrier journal which is dedicated to a special topic and accepts only manuscripts from this domain. Such a predatory journal would have to be called a normal academic journal, because it provides the sort of information which is needed by experts. Or let me explain it the other way around. If a researcher gets up-to-date specialized information about his domain, the researcher is happy, no matter whether the journal is called predatory or not.
In contrast, a fake journal is equivalent to the attempt to describe a subject in a general way, that is, by avoiding the specialized vocabulary of the expert. Mainstream information which is targeted at a larger audience can be called a fake science journal because it does not fulfill the needs of academics.