As a follow-up to my recent post on Scientific Publishing, here is what Wikinomics has to say on the topic, starting p.157. (Again, that book has flaws, but it is well-researched and The Economist found it good enough to make its Best of 2007 rankings, alongside Black Swan and, more surprisingly, Super Crunchers, so what can I say...) Let's start with an unfortunately accurate description of the current situation: "Each paper is peer reviewed by two or more experts, and can go through numerous revisions before it is accepted for publishing. Frustrated authors can find their cutting-edge discoveries less cutting edge after a lumbering review process has delayed final publication by up to a year, and in some cases longer." As a matter of fact, review times of nine to twelve months are not unusual, and I know of one paper that had to wait over four years before being published, counting time spent with the authors for revisions. At least now people can post their working papers on their websites while reviewers pore over their formulas. This obviously raises the question of "whether the antiquated journal system is adequate to satisfy [scientists'] needs."
Here comes the fun stuff, about exploding collaboration in scientific research: "One study conducted by the Santa Fe Institute found that the average high-energy physicist now has around 173 collaborators. The same study found that the average number of authors per scientific paper has doubled and tripled in a number of fields. [When you read the notes at the back of the book, you learn that the number has increased "from an average of slightly over 1 to averages of 2.22 in computer science, [...], 3.75 for biomedicine, and 8.96 authors for high-energy physics."] A growing number of papers have between two hundred and five hundred authors, and the highest-ranking paper in the study had an astonishing 1,681 authors." 1,681! I'd be curious to know what exactly those 1,681 people did. (A put paper in the printer, B turned the machine on, C brought coffee for everybody, D opened the software, E frowned when looking at the data, F borrowed a book from the library...) How do you find reviewers when 1,681 people in the field are off limits? Let's not even talk about the fact that you need a book just to print all these names. (The working paper these stats are taken from is "Who is the best connected scientist? A study of scientific co-authorship networks" by M.E.J. Newman (2000).) Wikinomics mentions online e-print services such as arXiv as a better alternative that will hopefully "engage a much greater proportion of the scientific community in the peer-review process", and while the authors sound a bit idealistic, letting the scientific community learn of one's results sooner cannot be a bad thing. (Unless you made a mistake in the paper, in which case of course you will be left hoping nobody has bothered reading your work. Speed to market is a two-edged sword...)
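For the curious, "number of collaborators" in a study like Newman's is just a node's degree in a co-authorship graph: draw an edge between two scientists whenever they appear on a paper together, then count each scientist's neighbors. Here is a minimal sketch of that bookkeeping in Python with networkx; the author names and paper lists are made up for illustration, not taken from Newman's data:

```python
# Sketch: collaborator counts as degrees in a co-authorship graph,
# in the spirit of Newman's study. The paper data below is invented.
import itertools
import networkx as nx

# Each paper is represented by its author list (hypothetical data).
papers = [
    ["Alice", "Bob"],
    ["Alice", "Carol", "Dave"],
    ["Bob", "Carol"],
]

G = nx.Graph()
for authors in papers:
    # Every pair of co-authors on a paper becomes an edge.
    G.add_edges_from(itertools.combinations(authors, 2))

# A scientist's number of collaborators is their degree in the graph.
for scientist in G.nodes:
    print(scientist, G.degree(scientist))

# The "average high-energy physicist has 173 collaborators" figure is
# the mean of these degrees over the whole network:
print(sum(dict(G.degree()).values()) / G.number_of_nodes())
```

Note that degree counts distinct collaborators, so a single 1,681-author paper instantly gives each of its authors 1,680 of them.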
My favorite example of collaborative science doesn't involve proofreading, though. Instead, it is about what The Economist called "citizen science" in its Technology Quarterly issue of December 8th, 2007. For instance, volunteers in the Galaxy Zoo project download high-resolution images to their home computers and analyze them in search of galaxies. Apparently, "amateurs with just a little training can distinguish between different types of galaxy far more efficiently than computers can"; "100,000 volunteers classified over 1m galaxies in a few months." To validate the results and minimize mistakes, "each image was viewed by over 30 volunteers." (As a less technology-intensive project, but every bit as worthwhile, the PeopleFinder project aggregated notices from bulletin boards in a massive data-coding operation to help survivors of Hurricane Katrina reconnect with loved ones (Wikinomics, p.186).) This was not only made possible by the Internet, but is an unintended side effect of the increase in households with broadband connections. While volunteers surely have to sign waivers before they can access the data, it will be interesting to see what happens when an academic basks in the spotlight for a galaxy found by someone else. How do you properly credit all the volunteers, if 30 of them spotted the thing before you did? In the same vein, a designer of high-end shoes (Wikinomics, p.129) lets customers submit designs and has promised to manufacture the best ones. He isn't offering any royalties, though - he'll simply put the customer's name on the winning design. The story doesn't say how many customers gave it a try - if volunteers' brain power consistently benefits one individual and his project with nothing in return but the feeling of having helped someone, collaboration might be too lopsided to last long. After all, researchers take the time to review other people's papers for free because they know someone will review their work too.
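Mechanically, that 30-volunteers-per-image validation is just redundancy plus voting: collect many independent labels for the same image, accept the majority answer when agreement is high, and route contested images to experts. Here is a toy sketch of that idea; the consensus function, the 80% threshold, and the vote data are my own illustrative assumptions, not Galaxy Zoo's actual pipeline:

```python
# Toy sketch of consensus classification: many volunteers label the
# same image, and a strong majority becomes the accepted answer.
# Hypothetical logic, not Galaxy Zoo's real validation scheme.
from collections import Counter

def consensus(labels, min_agreement=0.8):
    """Return (winning_label, confident) for one image's votes."""
    counts = Counter(labels)
    label, votes = counts.most_common(1)[0]
    return label, votes / len(labels) >= min_agreement

# 30 hypothetical votes for one image.
votes = ["spiral"] * 26 + ["elliptical"] * 4
label, confident = consensus(votes)
print(label, confident)  # "spiral" True -> accept; otherwise escalate
```

The appeal of the scheme is that no single amateur has to be reliable; with enough independent eyes per image, occasional mistakes get outvoted.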