*Thanks to Paul and Hannah for taking the time for this interview, which is also featured in Scholastica’s white paper “Iterate to Innovate: How scholarly publishers can use Agile methodologies to respond to change more effectively.” You can access the full paper here.*
At the time of writing this blog, over 1,800,000 preprints are in circulation on the arXiv alone.
Preprint use has been climbing across scholarly disciplines as a means of sharing new research findings faster, with more than 30 preprint servers introduced over the last five years and record-breaking levels of preprint posting. Now, in the midst of COVID-19, the power of preprints as a tool for early research dissemination has become ever more apparent, as have the implications of preprint sharing — both positive and negative.
To quote comic book superhero Peter Parker, “with great power comes great responsibility,” and the pandemic has magnified how preprints can speed up the dissemination not only of promising new findings but also of faulty science. STM’s call to scholars to help stop the spread of fake news in November 2020 was a sobering reminder of how shoddy research claims can infiltrate social media and even the mainstream news.
To help scholars stay ahead of the deluge of preprints across disciplines and more rapidly assess their claims, a new community-driven initiative led by the development arm of eLife has emerged. Sciety is a platform for following new preprints along with their reviews and the commentary available via “Science Twitter.” Using an Agile software development approach, eLife and its partners have been working closely with the communities they serve to iteratively release new Sciety functionality based on scholars’ changing needs.
In the interview below, eLife’s Head of Technology Paul Shannon and Product Manager Hannah Drury discuss the aims of Sciety and their “working software first” development approach.
Can you briefly overview how you went about launching Sciety and how the idea for it evolved over time?
PS: The early thinking behind Sciety was that we wanted to develop a way to collect submissions of preprints, then collect submissions of reviews, and bring them together so that everything is housed in one place. So our initial idea was to develop a sort of publish-review-curate platform as a centralised place to store preprints and reviews. But then, around January of this year, we started to realize how large an undertaking that was going to be when the world really needed a solution sooner. Also, a lot of that infrastructure already exists, so we began thinking about how to build on it. The publishing of content can be done via existing preprint servers like the arXiv and bioRxiv, and there are peer review systems that can be integrated with them. What’s missing, we’ve found, is a way to bring preprints together with peer reviews, so we started focusing on solving that problem.
Initially we started out building a way to just link a preprint hosted on any existing repository to an associated peer review hosted elsewhere. That was useful to people, but what we found is that it was sort of a one-hit-wonder situation. A user would come in, look at a review, and then go away; that was it for their interaction. To get a better sense of what was going on, we started asking users what was missing, and they were saying things like, “I couldn’t find anything relevant to me” or “I didn’t know when to come back to find more content.” We started to realize that the experience they were looking for was a lot like “Science Twitter,” where scientists use Twitter to share preprints and reviews, and from there we decided to sort of piggyback on that. So we started to develop features where you could see content in a Twitter-like feed and then follow different communities that you could come back to regularly. We’re effectively building social networking features into an aggregation service, and that’s what Sciety is now. New users come to view a peer review on a preprint, and then they’re encouraged to stay or come back to follow all of the work being shared in the different reviewing communities.
The Sciety website notes that you’re developing this application with a “working software first” approach — can you unpack that concept and how it relates to Agile?
PS: Agile software development is all about being able to respond to business, landscape, and market climate changes very quickly. Software is there to solve problems for people, and what Agile does is really emphasize the “soft” part of that. We’ve adopted a lot of practices from what’s called “extreme programming” that enable us to change the software and get things out for feedback quickly. The term “extreme programming” comes from the fact that you take things to the extreme. The biggest extreme would be writing and deploying one line of code at a time. We’re not quite on that level, but we do sometimes deploy sets of changes to the code one at a time, each of which might take us between 10 minutes and an hour to write. And then when we add in bigger features, we initially hard code or mock up those things within the site itself to start making them visible to users faster. In those cases we take steps to make clear whatever is not quite finished yet.
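To make the “hard code or mock up” practice Paul describes a little more concrete, here is a minimal sketch of how an unfinished feature might ship behind a simple toggle, so small changes can be deployed continuously while the incomplete parts stay clearly labelled. The names (featureFlags, renderFollowButton) are invented for illustration and are not taken from Sciety’s actual codebase; TypeScript is assumed purely as an example language.

```typescript
// Illustrative sketch (not Sciety's code): an unfinished "follow" feature is
// shipped behind a flag and rendered as a clearly labelled mock-up, so it can
// be deployed in small increments and shown to users for early feedback.

type FeatureFlags = {
  followCommunities: boolean; // toggles the in-progress social feature
};

const featureFlags: FeatureFlags = {
  followCommunities: true, // flip to false to withdraw the experiment quickly
};

// Returns hard-coded placeholder markup until the real behaviour is wired up.
const renderFollowButton = (groupName: string): string => {
  if (!featureFlags.followCommunities) {
    return '';
  }
  // The label makes clear to users that this part is not quite finished yet.
  return `<button disabled title="Coming soon">Follow ${groupName} (preview)</button>`;
};

console.log(renderFollowButton('a reviewing community'));
```

The point is less the specific markup than the workflow: each small change can go live on its own, and anything mocked up is visibly marked as work in progress.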
This kind of Agile development with extreme programming requires a very whole-team approach. So even though Hannah and I are not actively developing this software from a coding standpoint in our roles, we are working with the development team all of the time as they’re writing the code and helping them to quickly implement internal and external feedback. When you look at things like the Agile Manifesto and the origins of Agile in software development, it’s all about putting users first and really getting working software out rather than writing up lengthy documentation in advance about why your software is going to be brilliant. So we know we’re putting something out there that isn’t going to be perfect, but at least we can get feedback on the bits that are working as well as the imperfect bits, and from there we can figure out the most important things to do.
HD: I think the biggest thing for me is limiting the number of assumptions that we make and limiting the amount that we design upfront. You want to get stuff in front of people to test any assumptions. So, rather than a defined final software plan, we work on the basis of experiments. Each experiment has a hypothesis and then we run the experiment and review the outcome to see what we’ve learned and decide whether to carry on in the same direction or pivot and try something else.
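Hannah’s experiment loop can also be captured as a lightweight record that some teams keep alongside the code. The sketch below is hypothetical and its field names are invented for illustration; it simply shows the hypothesis, run, review, persevere-or-pivot cycle she describes.

```typescript
// Hypothetical record of a single product experiment: state a hypothesis and
// a measurable signal up front, then log what was observed and the decision.

type Decision = 'persevere' | 'pivot' | 'inconclusive';

interface Experiment {
  hypothesis: string;   // the assumption being tested
  signal: string;       // how the outcome will be judged
  observed?: string;    // what actually happened once the experiment ran
  decision?: Decision;  // carry on in the same direction, or try something else
}

const followFeedExperiment: Experiment = {
  hypothesis: 'Researchers will return regularly if they can follow reviewing communities',
  signal: 'share of new users who follow at least one community and come back within a week',
};

// After reviewing the outcome, record what was learned and the decision taken.
const review = (exp: Experiment, observed: string, decision: Decision): Experiment => ({
  ...exp,
  observed,
  decision,
});

console.log(review(followFeedExperiment, 'observed result goes here', 'persevere'));
```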
How is Sciety a community-driven initiative and why did you choose that approach?
PS: Early on we realized that the development of this application would have to work differently from how we’ve built software at eLife before, because the problem is so unknown and we knew we’d need a high adoption rate to start having any kind of impact. To ensure we were building things people definitely wanted, we decided to really follow the “collaborate with users” tenet of Agile development, in addition to working iteratively and all of that. We also decided to make this a community-driven platform to involve different perspectives. So this is not just an eLife project but a community initiative that any organization can join.
HD: The interesting thing about combining Agile with the publishing industry is that publishing moves so slowly. One of the problems we’re facing in trying to develop something that relies on such a huge behavioral shift is that the industry moves at a sort of glacial pace, so expecting people to change their habits is a challenge. But knowing that also makes the level of interest we’ve been able to generate all the more exciting.
How are you gathering feedback for iterative software planning and development?
HD: We use a variety of methods for gathering feedback. We look at Google Analytics for quantitative feedback, and then we’re also making sure that we’re constantly gathering qualitative feedback by speaking not only with people who are using the product on a regular basis but also with potential users. I’ve found that it can be just as useful to talk to people who don’t use your product but who fall within the target market you’ve identified as it is to talk to users. In our case that is early career researchers in particular, and we’re looking to learn about the sorts of tools they’re using right now, what works well for them, and what doesn’t.
By gathering that sort of non-user feedback, what you’re doing is generating opportunities for your product to posit new ways of solving the problems your target market faces. It’s always important to be sure whatever you’re building is solving a problem that people actually have and not a problem that you think people have. Then that feedback quite literally feeds back into the development loop and helps you to know whether a particular feature is finished or not. And I don’t think that software is ever really finished in the true sense of the word.
PS: Software is kind of like a garden in that you have to keep tending to it forever and keep adding to it, and every month it looks slightly different. And the main thing about Agile, as Hannah points out, is that it’s always about having rapid feedback loops. We try to keep those loops very small so the iterations can be very rapid, and the software evolves through ironing out imperfections and then putting in the next set of changes until it gets better and better. Before, we were taking more of a modernist approach where we would map out most things upfront and have a sort of polished design before we started to build. So we would start out with stronger opinions about whether something would work and then get a bit of user feedback through testing prototypes, generally after the fact. Now our approach is much more to build something that’s not quite polished and polish it with the user, to get them involved in the process. The idea is that if the user is involved, they’re telling us what they want, so they will use it and feel more engaged with it as well. If people aren’t using a particular feature, we’ll get rid of it and put something else in its place to see if they use that instead. A lot of Agile is about having these kinds of regular increments rather than a big design upfront. By working in this way we don’t have to take on too much risk, or rather, we can take a big risk but it doesn’t last for long if it fails. So we can fail fast and learn early rather than over-investing in something and being afraid to take it back.
Do you think Sciety would have evolved to the point it has today if you weren’t following an Agile development process?
PS: We’re absolutely convinced that, had we not been doing it this way, there would be no social features inside of what we built, yet that’s what people really wanted. It all came down to the fact that, by following an Agile development process, we could show someone the website and watch them use it, or ask them to use it and give us feedback early on, rather than waiting to finish building the whole thing. And I will say that it’s been a struggle for us not to come up with solutions all of the time. Instead we’re really pushing ourselves to come up with problems to solve and then test out one small solution at a time to see if it works, rather than trying to solve all problems at once.