We are delighted to launch Hindawi’s journal reports today. These reports, developed with the help of DataSalon, showcase a range of journal metrics about the different publishing services we provide for our journals. By exposing more detailed data on our workflows – from submission through peer review to publication and beyond – we are giving researchers, partners, and funders a clearer view of what’s under the ‘journal hood’. We are also raising greater awareness of less talked-about services, such as how we are helping to make the publication process more equitable and published articles more accessible and discoverable.
This is the first phase of our journal reports, and detailed metrics are available by following the “see full report” link from each journal’s main page. In this first phase, our reports give greater insight into acceptance rates and decision times, as well as the median time in peer review and the median number of reviews per article. Alongside traditional metrics, such as citations and article views, the reports also display maps of the geographic distribution of authors, editors, and reviewers.
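To make these workflow metrics concrete, here is a minimal sketch of how figures such as an acceptance rate and median review times might be derived from per-article records. The field names and the data are invented for illustration; they are not Hindawi's actual schema.

```python
from statistics import median

# Hypothetical per-article records (invented for illustration).
articles = [
    {"days_in_review": 38, "reviews": 2, "accepted": True},
    {"days_in_review": 52, "reviews": 3, "accepted": True},
    {"days_in_review": 21, "reviews": 2, "accepted": False},
    {"days_in_review": 64, "reviews": 4, "accepted": True},
]

# Acceptance rate: accepted articles as a fraction of all decisions.
acceptance_rate = sum(a["accepted"] for a in articles) / len(articles)

# Medians are used rather than means so a few very slow (or very fast)
# articles do not distort the headline figure.
median_review_days = median(a["days_in_review"] for a in articles)
median_reviews = median(a["reviews"] for a in articles)

print(f"Acceptance rate: {acceptance_rate:.0%}")            # 75%
print(f"Median time in review: {median_review_days} days")  # 45.0 days
print(f"Median reviews per article: {median_reviews}")      # 2.5
```

Reporting the median, as the journal reports do, is a deliberate choice: review times are typically right-skewed, so the median is a fairer summary than the average.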
The final section demonstrates how we make articles more accessible and discoverable. It draws on data from Crossref’s participation reports, which we extracted from Crossref’s open API. The section includes the percentage of articles in the journal that are open access (i.e. 100%) and the proportion of corresponding authors with an ORCID iD. It also shows the extent to which abstracts and citations are open. Hindawi supports the Initiative for Open Citations (I4OC) and is a founding organisation of the Initiative for Open Abstracts (I4OA). Because our metadata is machine readable and openly available, the articles we publish are more discoverable than those of publishers who don’t make this information openly available. Open infrastructure like this is also a key building block of Open Science.
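For readers curious how this kind of data can be pulled from Crossref's open API, here is a small sketch. Crossref's REST API exposes per-journal "coverage" statistics (the same data that feeds participation reports) at `https://api.crossref.org/journals/{ISSN}`. The payload below is a hand-made illustration rather than a real response, and the exact set of coverage keys should be checked against Crossref's API documentation.

```python
# Illustrative Crossref journals-endpoint-style response (invented values).
sample_response = {
    "message": {
        "title": "Example Journal",
        "coverage": {
            "orcids-current": 0.62,      # fraction of current DOIs with ORCID iDs
            "abstracts-current": 0.98,   # fraction with open abstracts
            "references-current": 1.0,   # fraction with open references
        },
    }
}

def openness_summary(response):
    """Extract openness fractions from a Crossref journals response."""
    coverage = response["message"]["coverage"]
    return {
        "orcid": coverage.get("orcids-current"),
        "open_abstracts": coverage.get("abstracts-current"),
        "open_references": coverage.get("references-current"),
    }

summary = openness_summary(sample_response)
print(summary)
```

In practice one would fetch the live response with an HTTP client (e.g. `requests.get(...).json()`) and feed it to the same helper; because the API is open, anyone can reproduce or audit these figures.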
By sharing this information, we want to help researchers make better-informed decisions about where to publish, instead of them relying on limited and often problematic indicators, such as the Journal Impact Factor or the presence of a journal in a particular (and disproportionately western) database. We want to expose services that are becoming increasingly important for researchers practising Open Science, but which aren’t yet as valued as traditional indicators, such as machine readability; the openness of references and abstracts; and the use of persistent identifiers to ensure that different research outputs can be digitally connected. At a time when there are real concerns about the trustworthiness of science, and of publishing, we want to foster trust through openness.
Inspired by the San Francisco Declaration on Research Assessment (DORA), our journal reports are part of our commitment to an open science approach to publishing. Now 10 years old, DORA set out a key principle for responsible research assessment: that researchers should not be judged on where they publish. It has grown into a global organisation that is leading and actively promoting “the need to improve the ways in which the outputs of scholarly research are evaluated” and is providing resources to do this. There are also many other aligned initiatives, such as INORMS and the European Commission’s Towards a reform of the research assessment system, and we hope our new reports can play a role in helping this reform.
In addition, when we began creating our journal reports, Hindawi was also a member of the price and service transparency pilot organised by cOAlition S. As part of the pilot, we were tasked with providing data on aspects of our workflow (such as time to ‘first decision’ or the number of peer review reports per published article), as well as our pricing makeup. This framework now forms part of the cOAlition S Journal Comparison Service for librarians and their Journal Checker Tool for researchers. But why not also make this data openly available to everyone?
This is what we are now doing, because Hindawi is committed to taking an evidence-informed approach to publishing, which is what Open Science is all about. But getting this data together and ensuring consistency between our internal and external reports takes time, and we have a way to go.
Mathias Astell, Chief Journal Development Officer at Hindawi, commented:
“These reports mark an important step in increasing transparency of journal performance and makeup – helping researchers, institutions, funders and publishing partners to have a clearer understanding of the value a journal is providing. We believe that clearly presenting the geographic representation of our journals will start to help subvert historical geographical biases in publishing. We also believe that openly sharing the metrics that make a difference to authors’ experience is vital in moving past the problematic ways of traditionally assessing journals and research. This is the first iteration of our new reports, and we are planning to further increase transparency and openness through future phases.”
By starting to unbundle and share data on our different services, we aim to unravel the ‘black box’ of publishing and open it up to independent scrutiny. Having a more transparent and collaborative approach to our workflows will also help fuel innovation in publishing and further support researchers, funders and institutions who are committed to Open Science.