2025 June 17
Evolving the preprint evaluation world with Sciety
This post is based on an interview with the Sciety team at eLife.
Crossref is taking over the service management of Similarity Check from Turnitin. That means we’re your first port of call for questions, and your agreement will be directly with us. This is a very good thing because we have negotiated, and will continue to negotiate, the best possible set-up for our collective membership. Similarity Check participants need to take action to confirm the new terms with us as soon as possible, and before 31 August 2019. Instructions will be circulated via email in early June.
Over 100 million unique scholarly works are distributed into systems across the research enterprise 24/7 via our APIs, at a rate of around 633 million queries a month. Crossref is broadcasting descriptions of these works (metadata) to all corners of the digital universe.
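As a rough illustration of how systems pull this metadata at scale, here is a minimal Python sketch against the public Crossref REST API works endpoint; the search term, row count, and mailto address are placeholders rather than anything specified in the post above.

```python
import requests

# Minimal sketch: ask the public Crossref REST API for a few work records.
# The "mailto" parameter is the documented way to join the polite pool;
# the query term and row count here are arbitrary examples.
resp = requests.get(
    "https://api.crossref.org/works",
    params={"query": "preprint", "rows": 5, "mailto": "you@example.org"},
    timeout=30,
)
resp.raise_for_status()
for work in resp.json()["message"]["items"]:
    title = work.get("title") or ["(no title)"]
    print(work["DOI"], "-", title[0])
```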
“Pre-prints” are sometimes neither Pre nor Print (cf. https://doi.org/10.12688/f1000research.11408.1), but they do go on to be published in journals. While researchers may have different motivations for posting a preprint, such as establishing a record of priority or seeking rapid feedback, the primary motivation appears to be timely sharing of results prior to journal publication.
Our newest dedicated record type—peer review—has received a warm welcome from our members since rollout last November. We are pleased to formally integrate them into the scholarly record, giving the scholars who participated credit for their work, ensuring readers and systems dependably get from the reviews to the article (and vice versa), and making sure that links to these works persist over time.
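For readers who want to see those review-to-article links in practice, here is a hedged Python sketch using the REST API’s type:peer-review filter and the “relation” field of each record; the exact relation labels (for example “is-review-of”) depend on what each member deposited.

```python
import requests

# Hedged sketch: fetch a handful of peer review records and print the DOIs
# they point at via their "relation" metadata (e.g. "is-review-of").
resp = requests.get(
    "https://api.crossref.org/works",
    params={"filter": "type:peer-review", "rows": 5, "mailto": "you@example.org"},
    timeout=30,
)
resp.raise_for_status()
for review in resp.json()["message"]["items"]:
    relations = review.get("relation", {})
    targets = [
        obj.get("id")
        for name, objects in relations.items()
        if "review" in name
        for obj in objects
    ]
    print(review["DOI"], "->", targets or "no review relation recorded")
```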
The Crossref graph of the research enterprise is growing at an impressive rate of 2.5 million records a month - scholarly communications of all stripes and sizes. Preprints are one of the fastest growing types of content. While preprints may not be new, the growth may well be: ~30% for the past 2 years (compared to article growth of 2-3% for the same period). We began supporting preprints in November 2016 at the behest of our members. When members register them, we ensure that: links to these publications persist over time; they are connected to the full history of the shared research results; and the citation record is clear and up-to-date.
The ancient Romans performed a purification rite (“lustration”) after taking a census every five years. The term “lustrum” referred not only to the animal sacrifice (“suovetaurilia”) but also to the five-year period itself. At Crossref, we’re not exactly in the business of sacrificial rituals. But over the weekend I thought it would be fun to dive into the metadata and look at very high-level changes over this period of time.
Researchers are adopting new tools that create consistency and shareability in their experimental methods. Increasingly, these are viewed as key components in driving reproducibility and replicability. They provide transparency in reporting key methodological and analytical information. They are also used for sharing the artifacts that make up a processing trail for the results: the data, materials, analytical code, and related software on which the conclusions of the paper rely. Where expert feedback was also shared, such reviews further enrich this record. We capture these ideas and build on the earlier “article nexus” blog post with a new variation: “the research nexus.”
About 13-20 billion researcher-hours were spent in 2015 doing peer reviews. What valuable work! Let’s get more mileage out of these labors and make these expert discussions citable, persistent, and linked up to the scholarly record. As we previously shared during Peer Review Week, Crossref is introducing a new record type for the registration of peer reviews. We’re now one step closer: today, we are excited to announce that we’re open for deposits.
A number of our members have asked if they can register their peer reviews with us. They believe that discussions around scholarly works should have DOIs and be citable, to provide further context and provenance for researchers reading the article. To that end, we can announce some pertinent news as we enter Peer Review Week 2017: Crossref infrastructure is soon to be extended to manage DOIs for peer reviews. Launching next month will be support for this new resource/record type, with a schema specifically dedicated to the reviews and discussions of scholarly content.
Very carefully, one at a time? However you wish.
Last year, in Linking Publications to Data and Software, we introduced linking publication metadata to associated data and software when registering content with Crossref. This blog post follows the “whats” and “whys” with the all-important “how(s)” for depositing data and software citations. We have made the process simple and straightforward: publishers deposit data & software links by adding them directly into the standard metadata deposit via relation types and/or references. This is part of the **existing Content Registration** process and requires no new workflows.
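As a hedged sketch of the relation-based route, the Python fragment below builds the kind of relations block that sits inside a standard deposit; the element names follow the Crossref relations.xsd namespace, but the description, relationship type, and dataset DOI are illustrative placeholders to be checked against the current deposit schema documentation.

```python
import xml.etree.ElementTree as ET

REL_NS = "http://www.crossref.org/relations.xsd"
ET.register_namespace("rel", REL_NS)

# A relations <program> wraps one or more related items and is placed inside
# the work's element in a standard Crossref content registration deposit.
program = ET.Element(f"{{{REL_NS}}}program")

item = ET.SubElement(program, f"{{{REL_NS}}}related_item")
desc = ET.SubElement(item, f"{{{REL_NS}}}description")
desc.text = "Dataset underlying the reported results"  # hypothetical description

# Link the article to a supporting dataset by DOI; "isSupplementedBy" is one
# of the documented data/software relationship types.
relation = ET.SubElement(
    item,
    f"{{{REL_NS}}}inter_work_relation",
    {"relationship-type": "isSupplementedBy", "identifier-type": "doi"},
)
relation.text = "10.5555/example-dataset"  # hypothetical dataset DOI

print(ET.tostring(program, encoding="unicode"))
```

The alternative, reference-based route mentioned above works like citing any other work: the data or software DOI is simply included in the deposit’s regular citation list.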
Highlighting our community in Colombia
2025 June 05