I have a lovely mental image of the Enlightenment in full swing. Inside a coffee house in Edinburgh, or Paris, a group of like-minded thinkers are engaged in stimulating debate: David Hume, Adam Smith, Voltaire, Diderot, and the like (not Rousseau, who would probably have been ostracized for bad behaviour). Out of that discussion emerge some of the classic Enlightenment texts that we know so well.
Can we replicate that intellectual ferment in the context of present-day academic research? Today, by contrast, any academic who has a new idea has to spend around six months preparing a manuscript for academic publication before he or she can engage in serious discussion with peers. That cannot be the best way to stimulate debate.
So, enter Octopus, a new initiative that aims to make the dissemination and publication of scientific work “fast, free and fair”. It’s not quite clear who is behind this initiative – no individuals are credited, although the UK Reproducibility Network is listed as “in partnership”. UKRN’s stated goal is “ensuring the UK remains a centre for world-leading research”, which sounds a bit defensive, and which suggests (from its title) that its scope is limited to ensuring research results are replicable.
Now there is Octopus (a rather threatening name, but let’s live with it), which promises to transform the nature of research – at least, the framework being set up by Octopus seems to have that potential. According to the website, Octopus is designed to be nothing less than a replacement for “journals and papers”! That’s the entirety of academic publishing! Clearly, the project is ambitious.
How it works
Anyone with an ORCID iD can post content. Submitted content can be much smaller than a full article; the site lists eight types, including “problems”, “hypotheses”, “protocols”, and “analyses”. A key requirement is that every new publication posted must be linked to an existing content item on Octopus. There is no peer review, but once published, any content can be reviewed and rated by other users. In other words, Octopus is a little like a preprint server with smaller units of publication, plus the possibility of reviews.
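To make the workflow concrete, here is a toy sketch of the linked-publication model in Python. This is purely my own illustration, not Octopus’s actual schema or API: the class and field names are invented, and the type list is truncated to the four types named above (the site lists eight).

```python
from dataclasses import dataclass, field
from typing import Optional

# Illustrative subset only; Octopus lists eight content types in total.
CONTENT_TYPES = {"problem", "hypothesis", "protocol", "analysis"}

@dataclass
class ContentItem:
    author_orcid: str                 # any ORCID holder may post
    kind: str                         # one of CONTENT_TYPES
    title: str
    parent: Optional["ContentItem"]   # the required link to existing content
    ratings: list = field(default_factory=list)  # post-publication star ratings

class Platform:
    """A toy model of the platform: instant publication, mandatory linking."""

    def __init__(self):
        self.items = []

    def publish(self, author_orcid, kind, title, parent=None):
        if kind not in CONTENT_TYPES:
            raise ValueError(f"unknown content type: {kind}")
        # The linking rule: only the very first item can stand alone.
        if self.items and parent is None:
            raise ValueError("new content must link to an existing item")
        item = ContentItem(author_orcid, kind, title, parent)
        self.items.append(item)   # "publication" is immediate, no review gate
        return item

    def thread(self, item):
        """Walk the chain of links back to the root, oldest first."""
        chain = []
        while item is not None:
            chain.append(item.title)
            item = item.parent
        return list(reversed(chain))
```

A chain like problem → hypothesis → protocol then falls out of the parent links, which is exactly the “thread of discussion” the linking requirement should make visible.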
As I see it, the biggest advantage is that publication is instant. I could publish a hypothesis in the next five minutes. Since all new content has to be linked to existing content on the site, it should be possible to see a clear thread of discussion through the site as more content is posted.
More fundamentally, the requirement to link to existing posts appears to replicate the nature of academic thinking. You read, or you discuss something, and then you respond. There is a lovely diagram on the site of this academic thinking “chain”:
Reviews of posted content (at least judging from the dummy articles on the site) are structured – perhaps over-schematic – and based on a star rating, a bit like a film or restaurant review, it would seem:
I am concerned about the star ratings. From the screenshot above, the system looks a bit like Airbnb ratings, which, as we all know from experience, are less than ideal. Is that what is required to “keep the UK as a centre for world-leading research”? I don’t think so.
The instantaneous nature of publication on Octopus is also a drawback. It means I could publish an outlandish idea with little justification. Until and unless a paper is critiqued, it will retain some credibility by simply existing on this platform (and Octopus describes the uploading of content to this site as “publication”). I note that the major preprint servers such as bioRxiv and medRxiv do include some minimal but, I think, necessary checking, for example to make sure nobody claims to have found an immediate cure for cancer, or publishes the perfect recipe for a terrorist attack.
But there are some more substantial problems with the Octopus publication workflow. I haven’t looked closely at the uploading process for Octopus, but one question springs to mind immediately. New submissions have to be linked to existing content. You therefore have the “cold start” problem that many recommender systems face: you can only recommend something when the platform is already populated. Similarly, at the start of Octopus, you don’t have enough content to link to. Why not link to existing published content? It would appear from this requirement that Octopus is attempting to turn its back on the entirety of already published content. Why is that necessary? Without any kind of bridge from the existing infrastructure, it seems to me very unlikely that Octopus will gain widespread adoption. My instinct is that Octopus might have more chance of success as a complement to the existing publishing system, rather than working in isolation from it.
Even if Octopus becomes successful, there will be problems. Let’s imagine the content on Octopus has exploded and there are now hundreds of thousands of content items. How does a researcher find what is already on the platform? Some kind of automated discovery tool will be required.
I haven’t discussed how all this will be funded in the long term (and there is no mention of it on the site), but presumably it will continue to be free for people posting.
Overall, Octopus is a fascinating proposal for rethinking how some aspects of the research process could work more effectively. It’s far too early to say how (and if) it will be taken up, but it certainly made me think of ways in which the current research workflow could change. Who knows, Octopus may yet be the 21st century equivalent of the Enlightenment coffee house.