The Ronin Institute has launched its own YouTube channel!
The Cultivating Ensembles in STEM Education and Research (CESTEMER) conference was held at the Goodman Theatre in downtown Chicago on September 15-17, 2017. Initiated by Raquell Holmes and improvscience in 2012, it brings together a diverse mixture of scientists, artists, humanists, and performers to discuss and discover new ways of doing science in groups. I attended to share what we’ve been working on at the Ronin Institute, as well as to gather new ideas and strategies for the way forward. There are now many great venues and conferences for discussions on improving science communication, the value of creativity in our workplaces, or integrating the arts and humanities into STEM and education – CESTEMER was about all of those things, but with a unique added emphasis on group performance and play.
In addition to the regular talks, poster sessions, and keynotes, all conference attendees had opportunities to participate as performers through games and techniques drawn from theater and improv. This meant the conference was not the usual armchair experience – all attendees were co-creators of the performance that was the conference itself. Why is this important? Performance is critical to group learning because of its “show, not tell” and experiential nature. To take just one example, the workshop “Whose Idea Is It Anyway?”, run by Nancy Watt and Carolyn Sealfon, tackled the ownership of ideas in science. Workshop participants grouped together to solve a physics problem and were asked to “play” different characters drawn from several personality types. By experimenting with different characters, we were able to experience how each group solved problems based upon its willingness to build on others’ ideas, embrace mistakes as learning opportunities, share credit, and move the collaboration forward.
The intense competition to demonstrate individual “ownership” of an idea that often prevails in the academic world (coupled with an artificial scarcity perpetuated by the journal prestige system, among other things) can lead to an atmosphere of distrust. The direct experience, through workshops like this one, of how empathetic collaboration produces both better results and unexpected, serendipitous discoveries will therefore become increasingly valuable as a means for cultural change in our institutions. This bottom-up approach, coupled with more top-down changes in publication and funding incentives, will, I believe, lead to more durable cultural change than either alone. Plus it’s a much more fun way of doing science!
I presented a short talk outlining how the Ronin Institute is aiming to foster new ways of thinking about the scientific enterprise as an “ecosystem” of peers. In this ecosystem, scientists collectively empower themselves to build scientific careers in whatever mode or style works for them in the context of the rest of their lives (whether in a university setting or elsewhere). I contrasted this ecosystem idea with the usual “pipeline” metaphor, which conceives of the pursuit of autonomous research as requiring one of a set of fairly narrow career paths, controlled by a relatively small number of gatekeepers. I shared the concrete steps we have taken in cultivating our own science communities, such as face-to-face local meetups, participant-driven events like our first Unconference, and virtual meetings such as the weekly Tuesday “watercooler” and web research seminars. You can see the slides here:
In summary, CESTEMER was a really fantastic opportunity to generate new “spores” in our evolving ecosystem of science and scholarship. I thank CESTEMER for inviting us, and I’m excited for the Ronin Institute to become part of this conversation. I look forward to all these spores travelling back to each of the participants’ everyday workplaces and spreading the message that we all do our best work when we listen and play together. I plan to attend the next CESTEMER conference in 2019 and I invite anyone interested to join me!
Academics in traditional university environments tend to be keenly aware of where their university ranks, whether they like to admit it or not. Most familiar are the college-level rankings like those from the US News & World Report, which weigh the undergraduate experience heavily. However, in the research world, the notion of “excellence” has become the coin of the realm, as evidenced by a proliferation of “excellence frameworks” such as the Research Excellence Framework (UK), the German Universities Excellence Initiative, the Excellence in Research for Australia, and the Performance Based Research Fund (New Zealand). Given that many resources, from capital funds to grants and permanent positions, are doled out in accordance with rankings, where one’s institution stands goes beyond mere bragging rights. Most academics understand the arbitrary nature of such rankings, and despite regular kvetching that they are either “unfair” (usually from those at an institution “lower” in the rankings) or that they have “finally” recognized the true worth of their institution (usually from those rising in the rankings), the existence of the ranking system itself is normally taken as a given. After all, how are we to sort the worthy from the unworthy?
Samuel Moore, Cameron Neylon, Martin Paul Eve, Daniel Paul O’Donnell and Damian Pattinson have published an (ahem) excellent research paper, “Excellence R Us”: university research and the fetishisation of excellence, that comprehensively examines both the notion and the practices of “excellence” in research. Excellence, as most of the research frameworks define it, essentially boils down to some combination of ranking institutions by their scholars’ ability to publish in established prestige journals, their ability to gain external grants, and other easily measured metrics of scholarly output.
Their conclusion, in a nutshell: “excellence” is totally bogus:
…a focus on “excellence” impedes rather than promotes scientific and scholarly activity: it at the same time discourages both the intellectual risk-taking required to make the most significant advances in paradigm-shifting research and the careful “Normal Science” (Kuhn  2012) that allows us to consolidate our knowledge in the wake of such advances. It encourages researchers to engage in counterproductive conscious and unconscious gamesmanship. And it impoverishes science and scholarship by encouraging concentration rather than distribution of effort.
In other words, in the context of scientific scholarship: focusing on excellence prevents the two things that we say we want from science: careful, reproducible science and the big breakthroughs. The article covers ground familiar to those who have been following the state of academia, including discussions of the lack of reproducibility in science, the pernicious use of journal prestige to evaluate academics, and the general environment of hypercompetition in research. Many, if not most, academics are aware of these issues, which have been covered extensively in the trade press in recent years, but continue to view them through the lens of their effect on traditional tenure-track (or equivalent) faculty with established research programs. So it is refreshing that the article tackles how the rhetoric of “excellence” can restrict the range of types and styles of scholarship, issues that are close to the heart of the Ronin Institute:
There is, however, another effect of the drive for “excellence”: a restriction in the range of scholars, of the research and scholarship performed by such scholars, and the impact such research and scholarship has on the larger population. Although “excellence” is commonly presented as the most fair or efficient way to distribute scarce resources (Sewitz, 2014), it in fact can have an impoverishing effect on the very practices that it seeks to encourage. A funding programme that looks to improve a nation’s research capacity by differentially rewarding “excellence” can have the paradoxical effect of reducing this capacity by underfunding the very forms of “normal” work that make science function (Kuhn  2012) or distract attention from national priorities and well-conducted research towards a focus on performance measures of North America and Europe (Vessuri et al., 2014)
The article continues by pointing out that “excellence” is often used as a proxy for academic work that fit certain “standard” modes, which can result in a more bland and conformist world of scholarship:
Given the strong evidence that there is systemic bias within the institutions of research against women, under-represented ethnic groups, non-traditional centres of scholarship, and other disadvantaged groups (for a forthright admission of this bias with regard to non-traditional centres of scholarship, see Goodrich, 1945), it follows that an emphasis on the performance of “excellence”—or, in other words, being able to convince colleagues that one is even more deserving of reward than others in the same field—will create even stronger pressure to conform to unexamined biases and norms within the disciplinary culture: challenging expectations as to what it means to be a scientist is a very difficult way of demonstrating that you are the “best” at science; it is much easier if your appearance, work patterns, and research goals conform to those of which your adjudicators have previous experience. In a culture of “excellence” the quality of work from those who do not work in the expected “normative” fashion run a serious risk of being under-estimated and unrecognised.
As the authors point out, it is common in such pieces to identify external factors such as:
institutional administrators captured by neo-liberal ideologies, funders over-focussed on delivering measurable returns rather than positive change, governments obsessed with economic growth at the cost of social or community value
as the primary cultural driver of metric-driven “excellence”. And this is definitely a huge part of the issue (see Ronin blog posts “Graeber on the Transformation of Universities” and “Henry Heller on IP-Based Capitalism at Universities”), but it’s not the only driver. Attributing these issues purely to external forces lets the academy somewhat off the hook since, as the authors continue:
the roots of the problem in fact lie in the internal narratives of the academy and the nature of “excellence” and “quality” as supposedly shared concepts that researchers have developed into shields of their autonomy. The solution to such problems lies not in arguing for more resources for distribution via existing channels as this will simply lead to further concentration and hypercompetition. Instead, we have argued, these internal narratives of the academy must be reformulated.
In other words: academia probably needs to take a look in the mirror once in a while and question whether current norms really still serve their twin stated goals of encouraging sound “normal” scholarship as well as risky breakthroughs. I would also add: it should be enabling all scholars to participate in whatever way fits their individual talents, rather than promoting a “one-size-fits-all” notion of alpha-academic success. There is much more to the article than space allows here; it’s a good piece for anybody interested in the future of scholarship, and it includes a highly detailed bibliography.
Coda: In a nice example of walking the walk, the authors have this note about “subverting traditional scarce markers of prestige” by adopting:
a redistributive approach to the order of their names in the byline. As an international collaboration of uniformly nice people (cf. Moran et al., 2016; Hoover et al., 1987; see Tartamelia, 2014 for an explanation), lacking access to a croquet field (cf. Hassell and May, 1974), writing as individuals rather than an academic version of the Borg (see Guedj, 2009), and not identifying any excellent pun (cf. Alpher et al., 1948; Lord et al., 1986) or “disarmingly quaint nom de guerre” (cf. Mrs Kinpaisby, 2008, 298 [thanks to Oli Duke-Williams for this reference]) to be made from the ordering of our names, we elected to assign index numbers to our surnames and randomize these using an online tool.
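For the curious, the randomization step the authors describe could be sketched in a few lines of Python (a hypothetical reimplementation of their “online tool”; the surnames are taken from the byline above):

```python
import random

# Assign the surnames from the byline and randomize their order,
# as the authors describe doing with an online tool.
surnames = ["Moore", "Neylon", "Eve", "O'Donnell", "Pattinson"]
random.shuffle(surnames)  # in-place random permutation of byline order
print(surnames)
```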
Mickey von Dassow is a biologist who is interested in exploring how physics contributes to environmental effects on development and who recently joined the Ronin Institute as a Research Scholar. Here is an edited version of an interview I did with him last year (full version).
Can you describe your background?
My background is in biomechanics and developmental biology. My Ph.D. asked how feedback between form and function shapes marine invertebrate colonies. During my postdoc I worked on the physics of morphogenesis in vertebrate embryos, specifically focusing on trying to understand how the embryo tolerates inherent and environmentally driven mechanical variability. Since then I have been independently investigating interactions among ecology, biomechanics, and development of marine invertebrate embryos, as well as teaching courses.
Tell us more about Independent Generation of Research (IGoR)?
IGoR is a wiki for sharing research ideas, skills, and resources among novice, amateur, and professional scientists. The goal is to make it easier for everyone to do scientific research, regardless of how they make a living. One of the main motivations was that I often need devices that are just beyond my own skills to make, but which hobbyists with other skills could easily help me make. This got me thinking that I could do more and better science if there was an easy way for me to build collaborations with amateurs who have different skill sets. I also realized I would have much more fun doing science if I had a way to keep doing it whether or not I get the next grant or research job. Amateurs, such as Benjamin Franklin, Charles Darwin, or Grote Reber (the inventor of the radio telescope), used to be major contributors to scientific research. Today’s technologies should make it much easier for people to do science outside of a career, but we need ways to pool people’s talent and experience.
Where do you see “citizen science” going in the next 5 or 10 years?
I should say that IGoR is inspired by “citizen science,” but is a bit different from most citizen science. At the moment, most (but not all) citizen science seems to follow a model in which a few experts design a way to obtain a lot of data by getting many volunteers to do some low-skilled, repetitive task. However, there is a lot of interest in community-generated approaches (such as Public Lab, iNaturalist, OpenROV, and others), and approaches where there is real feedback between professionals and citizen scientists, involving creative and intellectual input from citizen scientists.
How does citizen science relate to the “open-science” movement?
As far as I can tell, the open-science movement seems to be focused mostly on open data and open publication models, but there are a lot of other strands to it. One strand that IGoR is definitely a part of is trying to move away from a status quo in which research is almost all done by people employed as researchers by big institutions. Open science, open source generally, citizen science, and the Maker/Hacker movement, all seem to be pushing against the divide between the professional and everyone else….
Are there particular kinds of research areas or projects that tend to fall through the cracks of traditional funding agencies (NSF, NIH etc.)?
Yes. Funding agencies and universities like high-tech science. If you use a big machine that goes ping to do it, you have a much higher chance of success than if you just need to watch something with your own eyeballs, even if the intellectual merit is the same or better. Funding agencies are also driven by fashion, so in biology anything “omics” is in, and organisms seem to be pretty much out for the moment. Finally, they are not good at funding brand new projects, or new or unknown researchers. For example, researchers often say you need to do the project before you can get funding for it, and then use the funding for the next step. This makes perfect sense: your best bet with limited money is the big lab, with lots of toys, piles of preliminary data, and oodles of publications to prove they can do the job. However, that makes it hard for new researchers, small labs, or people trying new directions. Cutting those researchers out reduces the diversity of research questions and perspectives.
My hypothesis for why “omics” and traditional model organisms dominate (even when there are better ones for particular problems) is positive feedback. If approach or field X is fashionable it will garner higher profile publications and more funding, so people doing X will have more opportunities, and other people will pay more attention to X, hence X seems even more exciting and an even better bet for funding or new hires. But, attention and funding are limited, so the more those go to X, the less they go to everything else. As I write this, it suggests that the answer is to make funding, and also publication visibility, a non-zero sum game. That gets back to finding new ways to support science, and to tell people about it, which encourage diversification of questions and approaches.
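[The positive-feedback dynamic Mickey describes can be illustrated with a toy simulation. This is purely illustrative: the reinforcement rule and all numbers are invented, not fitted to any data. Two fields split a fixed pool of attention; each round, new attention flows disproportionately to the currently larger field, and the fixed pool is renormalized so that one field’s gain is the other’s loss. A modest initial advantage compounds.]

```python
# Toy "rich get richer" model of fashion in research funding.
# All parameters are invented for illustration only.

def simulate(shares, rounds=20, reinforcement=0.5):
    """Each round, gains are superlinear in current share (~ share**2),
    then the fixed pool is renormalized (zero-sum), so an early lead
    compounds at the expense of everything else."""
    shares = list(shares)
    for _ in range(rounds):
        shares = [s + reinforcement * s ** 2 for s in shares]
        total = sum(shares)           # zero-sum: renormalize the pool
        shares = [s / total for s in shares]
    return shares

# Field X starts with a modest 55/45 advantage over field Y.
final = simulate([0.55, 0.45])
print(final)  # X's share grows; the gap widens every round
```

Dropping the renormalization step would make the game non-zero-sum, removing the mechanism by which X’s gain is everyone else’s loss, which is exactly the diversification point made above.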
What kinds of changes in the institutional structures of science (e.g., peer review, publications, promotions etc.) would encourage more citizen science, open science or independent scholarship?
I think one of the biggest things that academic institutions could do is to teach students that independent scholarship is possible. There will never be enough funding for everyone who wants to do research (and is skilled at it) to make a living doing it. However, we all know that some of the deepest conceptual advances, notably Darwin and Wallace’s theory of evolution, came from people who were not employed as scientists. There are still many important questions that can be addressed by an individual investigator on a shoestring budget.
So, if we value science (or scholarship generally), we need to create an environment in which research can be an avocation rather than a career. The most important parts of that are to make that choice socially acceptable within the scientific community, and to teach people – starting in undergrad and going through all career stages – how to make it work. There are many resources describing how to succeed in academia (or whatever other career one might choose); but, there are few, if any, guides to doing research successfully when one is not doing it for one’s job.
Are there other new models of doing research, outside of mainstream academic research institutions, that you have seen out there that inspire you?
Community labs are one that excites me a lot, and is an inspiration for IGoR. They could be great for getting novices, amateurs, and independent professionals working together to do substantive research; their main limitation is that they are few and far between. The Ronin Institute aims to create a more flexible approach to being an independent scholar, so that more professional-level scholars can do research. Even simple things like providing an institutional affiliation for applying for grants could be very helpful.
What’s your favourite organism?
Do I have to choose just one? Ctenophores might be it right now. The way they glide through the water with waves of iridescence running down bands of beating cilia, is incredibly beautiful. I love the fact that they coordinate a lot of their motion and sensation using interactions among cilia: a very different approach than most animals. They also have some very cool developmental features. For example, some of them can regenerate half or more of their body as adults, despite the fact that (for the most part) each embryonic cell forms a particular part of the body, and cannot be replaced when lost. There is a point in their development where they gain the ability to regenerate. However, I love lots of invertebrates, and I can’t look at ciliates without wondering why I don’t study them.
Read the full interview
A recent editorial in Nature, “Young scientists thrive in life after academia”, on the future of careers for today’s scientists is, on the one hand, optimistic, but on the other, deeply unsatisfying. The editorial is clearly well-intentioned, providing what it sees as hope for a generation of new scientists facing the worst funding climate and academic job market in decades. I agree with the editors that it is encouraging that people with PhDs and long periods of training are finding gainful employment.
However, the editorial has what might be called a cultural blindspot: the default assumption that doing research science is an activity that one undertakes only within a specific set of jobs performed at certain institutions, and that once you’re out of those institutions, there is neither a way to continue nor a way back. Of those who moved out of academic positions, it says:
Many had managed to stay in touch with science, and worked in a related function such as administration, outreach or publishing.
This strikes me as a disempowering message: the best one can hope for is “to stay in touch with science”. Is this really the most we can do for those who have spent many years acquiring skill and knowledge of a subject? Is doing science really like a step function: all or nothing? To be fair, the editorial doesn’t say this explicitly, but it’s all in the subtext.
After reading its description of the real struggles of today’s scientists:
The hours, the workload, the instability of postdoc positions, the expectations, the low pay, the pressure and competition, the lack of opportunities and the fear of failure: all can combine to make the early-career years difficult indeed.
One might be tempted to ask: why are academic institutions of science like that? Do they need to be? Maybe we should change them? And perhaps the institutions should adapt to the people working in them, rather than the other way around? A recent article on the news side of Nature, “the scientific 1%”, makes it clear just how much the concentration of wealth and prestige in a small number of institutions and groups leads to this intense competition and pressure. Maybe that’s a good place to start reform?
These are questions that the Nature editorial does not seem prepared to tackle. Instead the focus is on more “honest career advice”, as if the institutions themselves are fixed and unreformable. Nature also subtly reinforces its own place in this hierarchy:
More than three-quarters of them had published as a principal author and one-fifth had published a paper in a high-impact journal such as Nature.
Essentially, if these scientists published in Nature, they must be good! This elides Nature’s role in buoying an incentive system based upon an artificial scarcity of “slots” in highly prestigious journals, a system that indirectly perpetuates the academic rat race and the concentration of resources that has made aspects of research in academic environments so unpleasant in the first place.
Towards the end of the editorial, we start getting a little closer to a more expansive view of the situation in the discussion of different paths:
Science should wish them well. As Nature has pointed out before, a regular flow of bright, highly trained and scientifically literate workers heading into the wider world can only benefit society and science. It is time to normalize these sideways steps, and for universities, senior scientists and research funders to accept and embrace the different paths that young researchers choose to follow.
This is a more open-minded view, but it still begs the question: why wait for these gatekeepers to approve or “normalize” these paths? Scientists can collectively empower themselves. Because traditional academia is highly hierarchical, this notion sits in Nature’s cultural blindspot.
Ultimately we need a broader cultural shift that decouples the activity of science from specific institutions. Academic institutions may “wish them well”, but “science” cannot, because there is nobody who can speak for science as a whole: not Nature, not career advisors, not academic institutions, not even the Nobel Prize Committee. This is because nobody owns science. We don’t expect artists to stop making art if they don’t land a position at an art museum or a residency at a gallery; they keep doing their art. Yet this is the commonplace expectation in science. An editorial that explores ways of reforming the existing system to be more humane, that examines ways to empower all scientists to continue to do science (including those who may never be, nor ever want to be, university-based professors), and that values contributions regardless of affiliation or job title: now that’s an editorial I’d like to read.
Many will point out that, of course, people outside traditional academic positions still publish papers and do scientific work. Those working in biotech or pharma companies might continue to publish middle-author papers with a dozen co-authors. Or a scientist working at a conservation nonprofit might contribute the data analysis for a paper on an ecological project in collaboration with an academic. That’s not what I mean: I mean curiosity-driven research that is independently initiated while not employed at an academic institution. Nobody will stop you from doing research absent a standard institutional affiliation, but everything about the current funding and publication system will either passively or actively work against those who choose to work “outside” (see Richard Lewontin’s “Legitimation is the Name of the Game“).
I don’t mean to minimize the financial aspects and the need for jobs and income: between the postdoc crisis, the over-reliance on adjuncts, and the increasing neoliberal corporatization of universities in general, all of which we have discussed here on the Ronin blog, most scientists’ immediate concern is continuing to pay the bills while doing their work. But in a world in which steady, predictable career paths may disappear in general, we can’t think only of institutions and job titles, as even the notions of early, mid and late career are likely to change radically in the coming years. I’m not suggesting that this is necessarily all good. I wouldn’t want all institutions or universities to just melt away, for example, but a world in which only a few get stability and all the goodies while the rest do not isn’t so great either. There is an extreme libertarian, and undesirable, version of this future in which there is a Hunger Games-style race to the bottom, with even more wealth inequality. But there is a potential progressive version of this future, sketched out most compellingly by economist Guy Standing, in which basic economic security would be provided via, amongst other things, something like a universal basic income. This is a subject I discussed briefly in a previous blog post.
Open science has well and truly arrived. Preprints. Research parasites. Scientific reproducibility. Citizen science. Mozilla, the producer of the Firefox browser, has started an Open Science initiative. Open science really hit the mainstream in 2016. So what is open science? Depending on whom you ask, it may simply mean more timely and regular releases of data sets and publication in open-access journals. Others imagine a more radical transformation of science and scholarship and advocate “open-notebook” science, with a continuous public record of scientific work and concomitant release of open data. In this more expansive vision, science will ultimately be transformed from a series of static snapshots represented by papers and grants into a more supple and real-time practice in which the production of science involves both professionals and citizen scientists blending and co-creating publicly available shared knowledge. Michael Nielsen, author of the 2012 book Reinventing Discovery: The New Era of Networked Science, describes open science less as a set of specific practices than as a process to amplify collective intelligence to solve scientific problems more easily:
To amplify collective intelligence, we should scale up collaborations, increasing the cognitive diversity and range of available expertise as much as possible. This broadens the range of problems that can be easily solved … Ideally, the collaboration will achieve designed serendipity, so that a problem that seems hard to the person posing it finds its way to a person with just the right microexpertise to easily solve it.
Attempts to reform the way we do science have been underway for decades, from arXiv in the 1990s to open-access publishing in the early 2000s. The degree to which any scientific field practices open science varies considerably, but it’s fair to say that the institutional embrace of open science hasn’t exactly been speedy, despite demonstrated successes in the physical sciences and mathematics such as the Polymath project and Galaxy Zoo. In physics it has been mainstream for a while now to release manuscripts first on arXiv and big data sets through standardized repositories. In the biomedical sciences, progress has been considerably slower, perhaps due to their larger institutional and financial footprint: it’s the proverbial supertanker that needs a long time to turn around, let alone move in a different direction. Whatever the reason, open science is now firmly on the radar, and it has unleashed a torrent of opinion and criticism examining all aspects, from its practicality to its desirability.
Although there is a spectrum of responses, criticism of open science tends to fall into one of two camps, which I will call “conservative” and “radical”. This terminology is not intended to imply an association with any conventional political labels; the terms are simply used for convenience to indicate the relative degree of comfort with the institutional status quo. Let’s look at these two groups of critiques.
The conservative critique: what are all these damn people doing with my data?
The conservative response to the regular, timely release of pre-publication data could be best summarized by the phrase: “are you kidding me? why would I do that?” The apotheosis of this notion appeared in an editorial published in the New England Journal of Medicine, which described with some horror the “emergence of a new class of research parasites”. The authors further concluded that some of these parasites might not only use the data for their own publications, but might seek to examine whether the original study was correct. Many scientists took to Twitter to express their amazement that anybody would object to a re-examination of data, since falsification is presumed to be the backbone of the scientific enterprise, and the hashtag #IAmAResearchParasite trended for several days.
From the perspective of the current incentive system, however, this response is totally rational. In a model where labs or principal investigators are largely funded on “high-impact” papers and grants, there is intense pressure to keep a lid on data as long as possible, even if a collaborative process could, in principle, produce a better result. (In the software world this is often summarized by the phrase attributed to Eric Raymond: “with enough eyeballs, all bugs are shallow”.) It’s also a collective action problem: if all scientists were to release their data, it would be easier for each individual to buy into more frequent releases, but nobody wants to move first.
The second area where open-science approaches run into an entrenched form of institutional power is the battle over preprints. Preprints are well established in physics, and about 80% of preprints end up in “traditional journals”. In biology, however, allowing preprints to then be accepted by traditional publishers has been fiercely resisted, especially by “high-impact-factor” journals published by conglomerates like Elsevier, ever since e-Biomed was proposed as a biology version of arXiv back in 1999. It’s my sense that these publishers are reading the tea leaves and realizing, probably quite rightly, that eventually scientists will just cut out the middleman (the journal). Mike Eisen, one of the pioneers of e-Biomed and PLOS, has in fact explicitly proposed that we should eventually do away with journals entirely and move to a complete preprint plus post-publication peer review system. Obviously a nightmare scenario if you’re the head of a multibillion-dollar, highly profitable publishing conglomerate that benefits from the free labor of scientists (peer review has been estimated to be worth ~1.9 billion British pounds per year).
One of the usual counterarguments to post-publication peer review is that it will produce a flood of lower-quality papers. The refrain is: how will I know what to read, now that anybody can publish? This can indeed be an issue, but it need not be insurmountable. Nielsen and others have proposed a publish-then-filter model. Newer experiments such as Science Matters, which publishes single observations, and The Winnower, which archives grey literature and blogs, can also serve this purpose. One thing seems clear, especially after the Accelerating Science and Publication in Biology (ASAPbio) meeting this spring: the tide, in biomedical science at least, seems to be turning towards the acceptance of preprints, notably via the biology-specific preprint server bioRxiv. Over time this may lead to an increased willingness to explore different publishing models.
Of course, all these changes are really just baby steps towards a truly fully-fledged open science, because they still enshrine the “scientific paper” as the sole end goal. Other aspects of the open-science movement, including building tools for reproducibility and “rewarding” non-paper research products such as code, infrastructure and raw data itself, promise to be equally important but, despite the progress made in the last six years, are still largely unrewarded by the current academic system. It will probably take a true generational turnover before we see a full-throated embrace of open science, and many of those driving the changes have concluded that it is best done outside traditional academic institutions. In fact, thinking about open science solely in terms of economic incentive structures may be a wrong, or at least incomplete, way to think about it, which leads me to…
The “radical” critique: be careful what you wish for.
Arguments for open science made in response to the conservative critique tend to assume that the release of more data, code and papers is a pure good in and of itself, and downplay the political economy in which they are embedded. Indeed, as I just argued above: a fertile intellectual commons that all scholars, professional and otherwise, can use to pursue their own intellectual adventures is a worthy goal and, in an ideal world, should lead to a truly more democratic science. However, an interesting paper by sociologist David Tyfield, “Transition to Science 2.0: ‘Remoralizing’ the Economy of Science”, says essentially: not so fast.
Tyfield suggests that the release of vast troves of data, papers and research results, although potentially beneficial to science as an enterprise, could simply exacerbate the trend towards the increasing marketization and corporatization of science, and will disproportionately benefit large corporations. Several trends worry Tyfield and other scholars such as Philip Mirowski, Gary Hall and Eric Kansa, including:
the capturing of publicly funded research value by commercial platforms;
the consolidation of a different set of gatekeepers, along with yet more “metrics” of productivity used to “incentivize” scholars to work harder;
and a focus on the system-wide progress of science that ignores costs and benefits to individual humans, scientists or non-scientists.
Let’s take them in order.
1. Capturing of academic labor output by commercial interests
Academia.edu is probably familiar to scholars of all stripes as a destination for sharing their work by archiving their PDFs. Recently the site emailed selected users inviting them to join its editor program: to serve as unpaid editors recommending publications appearing on the site to others in their area of research expertise. This move was roundly criticized and led to another Twitter hashtag: #DeleteAcademiaEdu. Gary Hall wrote a paper, “Should This Be the Last Thing You Read on Academia.edu?” (available on Academia.edu!), comparing Academia.edu’s business model to Uber’s, noting that
…the majority of academics who are part of Academia.edu’s social network are the product of the state-regulated, public higher education system, as is their research (a system, it should be said, from which public funding is steadily being withdrawn). But just as Airbnb and Uber are parasitic on the public ‘infrastructure and the investment’ that was ‘made by cities a generation ago’ (roads, buildings, street lighting, etc.), so Academia.edu has a parasitical relationship to the public education system, in that these academics are labouring for it for free to help build its privately-owned for-profit platform by providing the aggregated input, data and attention value.
My own sense is that those running Academia.edu and ResearchGate are in it for idealistic reasons, and don’t see themselves as the next rapacious Uber-like company, but Hall’s point is that the business model that they operate under may force them to become increasingly extractive.
2. The tyranny of metrics
The second aspect is more subtle. Scholars are already ranked in terms of their “productivity” as measured through papers and grants. The counter-argument to the conservative reaction against open science is that it brings more research outputs into that system, thus “incentivizing” the publication or release of intermediate results, useful research by-products, code and the like. As I noted above, these outputs can form an intellectual commons that teams of scholars or individuals can build upon, thus incentivizing the generation of “public goods”. On this issue, Eric Kansa, a digital humanities scholar and practitioner of open data, exposes the limitations of relying on open-science metrics alone in his article “It’s the Neoliberalism, Stupid: Why instrumentalist arguments for Open Access, Open Data, and Open Science are not enough”:
Metrics, even better Alt-metrics, won’t make researchers or research more creative and innovative. The crux of the problem centers on a Hunger Games-style “winner take all” dynamic that pervades commerce and the Academy. A rapidly shrinking minority has any hope of gaining job security or the time and resources needed for autonomous research. In an employment environment where one slip means complete ejection from the academy, risk-taking becomes quasi-suicidal. With employment increasingly precarious, professional pressures balloon in ways that make risk taking and going outside of established norms unthinkable. Adding more or better metrics without addressing the underlying job security issues just adds to the ways people will be ejected from the research community.
Metrics, while valuable, need to carry fewer professional consequences. In other words, researchers need freedom to experiment and fail and not make every last article, grant proposal, or tweet “count.”
Kansa says that metrics, while useful up to a point, can be counterproductive because they simply add more steps to the treadmill on which scientists already operate. In other words, even though science as a whole may benefit, individual scientists may not. After all, if your job depends on producing ever more research “products” every year (whether open or not), then adding yet another set of outputs to please a search or tenure committee doesn’t seem like much fun. Unless the fundamental model for hiring and funding changes, and university administrations stop treating science as a business that must “grow”, new open-science “outputs” won’t substitute for papers and grants; they’ll just be added to them.
3. Who benefits?
Tyfield goes further in his analysis and suggests that the locus of progress is at the wrong level: open science prioritizes “scientific progress” in the abstract above improving the lot of the individual humans who comprise it:
Yet, as we have seen above, one needs only to ask the more humdrum question of “where are the jobs?” to see that the focus in such accounts is firmly at the system level of the global “data web” and the accelerated “progress” of “science” while totally neglecting that of the human individual and his/her place in such a society. It thus fails (and is likely to be seen to do so relatively quickly) precisely the test of moral economy that has triggered the breakdown of the passing order and the pursuit of transition to another: it massively rewards the undeserving and impoverishes the many and fails to deliver the “goods” of more, better and more-democratically-engaged knowledge that tackles the urgent and “wicked” problems of the multiple environmental, health and resource-based crises.
Further, he argues that open-science could:
undermine the compensation of human knowledge labour—albeit under the seemingly democratizing banner of “free information”—upon which such a system is entirely dependent. Furthermore, this analysis not only thereby destroys the “livelihoods” of a genuinely “creative” “knowledge-based economy” but also sponsors the construction of a system characterized by even more concentrated corporate control of knowledge (eg. information, data) than that of the IP-intensive corporate model of neoliberalism it ostensibly subverts.
Mirowski is even more blunt in his assessment of open-science:
It would be misguided to infer that Science 2.0 is being driven by some technological imperative to ‘improve’ science in any coherent sense. Rather, the objective of each and every internet innovation in this area is rather to further impose neoliberal market-like organization upon the previously private idiosyncratic practices of the individual scientist….Open Science 2.0 does not exist to democratize or otherwise improve research. Rather, it is engineered to position a few large firms at the electronic portals of the modern commercialization of knowledge.
This seems to me an excessively pessimistic view of open science. Many of the most exciting initiatives in open science have been grass-roots efforts, especially the push towards preprints in biology. It’s hard to see how the rise of preprints is any more market-like than the existing rush towards submission to prestige journals. I have personally supported open science for many years for its potential to improve reproducibility and to produce open-source scientific software that is beneficial to all. (Releasing scientific software under open-source licenses is now a stated goal of the scientific establishment: in days past I would have needed to spend a good deal of effort convincing otherwise skeptical senior colleagues and universities of the value of open-sourcing my own efforts. Now this is less necessary, which counts as some kind of progress.)
Nevertheless, Tyfield and Mirowski are right to point out the dangers of a pollyannaish view of the digitization of scientific practices. After all, a democratic, decentralized and open-source ethos was part of the founding principles of many of the now-dominant market players in the digital economy such as Google back in the late 1990s and early 2000s. And as these companies have grown to become even more powerful, many of their vaunted principles have given way to a more winner-take-all approach.
It is therefore not unreasonable to be concerned that similar dynamics could occur in the open-science world. No doubt they will. But that is different from saying that all open-science practices are being explicitly engineered towards the undesirable neoliberal outcome described by these authors. The challenge is partly biological in nature: how to create a system in which co-operative behaviour prevails, the benefits of open-science practices are spread across many individuals, and the encroachment of cheaters is resisted.
Reframing the question
Many arguments made in support of or in opposition to open science are ultimately unsatisfying because both sides frame “success” as individual scientists adapting to existing, fixed institutional structures and norms. We should instead turn the question around and ask: how do we use open-science approaches to retool our institutions to benefit actual living and breathing humans (scientists and non-scientists alike)? How can we use open science to enable as many people as possible who have the interest and talent to pursue science for its own sake, and to generate knowledge that is broadly useful for society, not just for elite institutions, venture capital firms or global megacorporations? This reframing takes the conversation out of the instrumentalist language of “carrots and sticks”, “rewards” and “incentives”, since, as previously discussed here on the Ronin blog, any such system can be used to serve questionable ends. We should, as Ernesto Priego says, be “working towards a type of scholarship which is about learning from each other, not about surveillance and gatekeeping”. It also means prioritizing open-science approaches that benefit not only institutions or science in the abstract but also help improve the lot of individual working scientists.
So what would this look like? If traditional measures of research “quality” (see “Excellence R Us: University Research and the Fetishisation of Excellence”) and progress of science in the abstract are not sufficient, how will we know whether open science is helping to drive an inclusive and humane approach to science? A partial list of what this might look like would include: “permission-less” innovation (the end of paywalls, end-user license agreements and data embargoes, extended to all, not just institutionally based researchers); a more equitable distribution of power and resources (a shift away from massive labs, and funding geared towards living wages for the many rather than stability and large rewards for an elite few); a rise in independent scholarship (these might be considered the “research parasites” of the NEJM editorial, but it would not be a one-way street: independent scholars would contribute back to the commons, perhaps by releasing data under GPL-style copyleft licenses); and an open notebook science that is structured to enable learning, not surveillance. There is obviously a lot of overlap here with the more traditional arguments for open science, with which I fully agree, but the structural nature of the actual working conditions for scientists, and the political economy in which they are embedded, need to be kept firmly in focus.
How to get there?
So how can we begin to build a human-centered open-science? Here’s a few places to start:
1) Strengthen existing public institutions such as libraries to support open science. Shrinking budgets have reduced libraries’ ability to perform many of their core functions, let alone the new ones that scholars need. This is an area where there is an increased role for the state (via funding and support) to provide core infrastructure for open science that is not subject to the vagaries of the market.
2) Explore platform cooperativism and commons-based models for open science. The decentralized architecture of the Internet and the libertarian promises of the so-called “sharing economy” have not magically created a nirvana where people are paid or credited fairly for the value they create. (Scholars from the social sciences have generally been way ahead of scientists and technologists in recognizing these trends.) Ownership and governance models matter much more than the technical architecture. As we develop open-science platforms we should follow, and draw inspiration from, the platform cooperativism movement, in which users have at least partial ownership or control of the platform, rather than simply being passive nodes in an “on-demand” economy as with Uber or Airbnb.
3) Push for larger-scale social and economic changes. In the long run, perhaps only larger socioeconomic changes will be sufficient to underwrite the ideal of an inclusive open science. One such change gaining steam is the push towards a universal basic income (UBI): paying all citizens a fixed basic income regardless of circumstances. Sociologists such as Guy Standing have argued that the rise of automation and the precarity of work (already a reality with the postdoctoral scholar glut) will eventually make something like a UBI a necessity (the exact form it takes, however, is critical). In science, a UBI could unlock the true benefits of open-science approaches by decoupling job security from arbitrary notions of research “productivity”. Extending the privilege of pursuing whatever truly interests them (currently enjoyed, in principle, only by tenured faculty) to scientists who lack such job protections would take a huge amount of pressure off young investigators who currently feel the need to squeeze as much work as possible out of every dataset, graduate student or postdoc. Of course, many kinds of science require more resources than a single faculty member’s paycheck, but many labs are likely bigger than they “need” to be, and a UBI could reduce the pressure (real or imagined) to “grow” one’s lab (see “The Problem with Building a Group” in Kitsune #2).
The naysayers out there (whether from the “conservative” or “radical” camp) are likely to scoff at many, if not all, of these proposed changes and dismiss them as non-starters, or quibble with the exact details. The current funding climate certainly doesn’t favour changes, but that doesn’t mean that change isn’t possible. We can start now.