Friday 8 June 2012

Connotea: Bookmarks matching tag oa.new (50 items)



Ending Knowledge Cartels

Posted: 07 Jun 2012 02:04 PM PDT

 
Ending Knowledge Cartels
academhack » Academhack, (06 Jun 2012)

Times Higher Education - Leader: Let's ask profitable questions

Posted: 07 Jun 2012 01:21 PM PDT

 
Times Higher Education - Leader: Let's ask profitable questions
www.timeshighereducation.co.uk
“There are those who believe that the backing given by some major publishers to a US bill that would have banned government agencies from imposing open-access mandates was the biggest own goal in the recent history of academic publishing. Elsevier's support for the abandoned bill prompted a pledge to boycott it that has so far been signed by nearly 12,000 people. The signatories saw the legislation - known as the Research Works Act - as an aggressive and unsavoury move by the publishers to protect their profit margins in the face of perceived threats from the rise of open access. And a host of developments in recent months suggest that the push for open access may have reached a tipping point. These include tougher new open-access policies from Research Councils UK and the Wellcome Trust ... indications that the European Commission is likely to follow suit, and, as we reported last week, a petition on the White House website that hopes to persuade the Obama administration to introduce an open-access mandate for all publicly funded research. David Willetts, the UK's universities and science minister, has also restated his determination to see all state-funded research in this country made open access. However, he has been equally clear that he has no interest in destroying publishers' business models. The job of trying to pick a way through all this has fallen to a group of researchers, publishers, librarians and funders led by Dame Janet Finch, the former vice-chancellor of Keele University. The latest minutes of the group, which was convened by Mr Willetts, suggest that it will place at the heart of its road map the ‘gold’ open-access model, under which funders pay article charges to replace the subscription income lost by journal publishers. The issue of affordability for the UK is complicated by a number of factors, including whether the country goes it alone on open access. If it does, it will be paying both article charges for its own research and subscription charges for other countries' work. But the Finch group minutes suggest that article fees of around £1,500 would allow the transition to open access without additional costs to the UK academy. This compares with the $3,000 (£1,900) charged by most Elsevier journals for open access, and the $5,000 levied by its prestigious Cell titles. However, the group has decided against suggesting any benchmark fee level in its final report... But it does seem surprising that the Finch group appears unlikely to take the opportunity to at least consider whether the profits of academic publishers of all kinds - which, after all, come in large part from the public purse - are appropriate, particularly at a time when the research budget is declining in real terms. Maybe Elsevier's own goal wasn't so catastrophic after all.”

My comments to Times Higher Education about the White House petition

Posted: 07 Jun 2012 01:16 PM PDT

 
My comments to Times Higher Education about the White House petition
Mike Taylor
Sauropod Vertebra Picture of the Week #AcademicSpring, (06 Jun 2012)
“I am briefly quoted in Times Higher Education’s new article about the White House public access petition. Since my response had to be quite dramatically cut for space, here is the full text of what I sent the writer, Paul Jump: ‘The success of this petition is important for several reasons. First, it puts paid to the pernicious lie that open access isn’t important because research is useless to non-specialists. Support for the petition has come from many non-academic quarters, including patient support group Patients Like Me, Wikipedia, Creative Commons, the American Association of Law Libraries, and the Association of College and Research Libraries. Perhaps equally important, it’s attracted support from publishers – not only open-access publishers such as PLoS, BMC and InTechWeb, but also forward-thinking subscription publishers like Rockefeller University Press. It’s also featured widely in the non-academic media, appearing on the news-for-nerds sites Slashdot, Reddit, and Hacker News, in newspapers like the Guardian, and in magazines like Wired. All of this makes the crucial point that open access isn’t just an esoteric preference of a few disgruntled academics, as the hugely profitable commercial subscription-based academic publishers have consistently tried to paint it. It’s something that has huge implications for all of our lives: for health care, education, legislative deliberations, small businesses, and ultimately the health of the planet. Open-access advocates have seen this for a long time, but now the message is getting out. Irrespective of what response the Obama administration makes to the petition’s very rapid achievement of the required 25,000 signatures, what’s been said about it around the world lays waste the idea that open access is nothing more than an alternative business model for scholarly publishing. It’s a much bigger revolution than that.’”

Times Higher Education - Open-access petitioners trigger White House response

Posted: 07 Jun 2012 01:15 PM PDT

 
Times Higher Education - Open-access petitioners trigger White House response
www.timeshighereducation.co.uk
“A petition asking the Obama administration to implement an open-access mandate for all publicly funded research has reached the required number of signatories to trigger an official response. The petition has garnered nearly 26,000 signatures since it was launched on the White House website on 20 May. The administration has pledged to respond to any petitions that are signed by more than 25,000 people. The petition calls on the administration to extend the open-access mandate currently imposed by the National Institutes of Health to all federal funders of research. Its organisers, a group of advocates under the banner ‘Access2Research’, said on their website that they hoped to blow the 25,000 target “out of the water” to demonstrate to the White House that ‘this issue matters to people, not just a few publishers’. Mike Taylor, an open-access advocate and a palaeontologist affiliated with the University of Bristol, said the widespread support the petition had received from many non-academic groups, media outlets and even ‘forward-thinking’ subscription publishers put paid to the ‘pernicious lie’ that ‘open access isn’t important because research is useless to non-specialists’. ‘All of this makes the crucial point that open access isn’t just an esoteric preference of a few disgruntled academics, as the hugely profitable commercial subscription-based academic publishers have consistently tried to paint it’, he said. Stephen Curry, professor of structural biology at Imperial College London and another prominent supporter of open access, agreed that the petition would be a ‘great boost’ to the global push for open access. ‘With the UK, the EU and now the US all moving in the same direction, I very much hope that we can realise the international coordination that will be needed to make open access work worldwide,’ he said. The UK’s approach to access will be informed by the conclusions, due to be published this month, of a group of publishers, librarians and funders chaired by former Keele University vice-chancellor Dame Janet Finch and convened by David Willetts, minister for universities and science.”

Britain searches for best way to promote open access

Posted: 07 Jun 2012 01:13 PM PDT

 
Britain searches for best way to promote open access
Paul Jump for Times Higher Education
Inside Higher Ed, (07 Jun 2012)
“The group charged with thrashing out how Britain should expand access to publicly funded research has decided against setting any guideline figures for open-access article charges, raising concerns that it will not stop commercial publishers' alleged profiteering. The Working Group on Expanding Access, chaired by a former Keele University vice chancellor, Janet Finch, is basing its projections of what a wholesale switch to ‘gold’ – or ‘author pays’ – open access would cost the British academy on a ‘cost-neutral’ fee of £1,450 (about $2,250) per article published. However, according to the minutes of the group’s penultimate meeting in April, members noted that the figure was "no more than an average" that would vary according to discipline and various other factors such as take-up rates both within the UK and abroad. For these reasons, they ‘agreed strongly that it would not be appropriate’ to set a ‘benchmark’ figure for article fees... the decision not to endorse any guideline article fee will disappoint those who advocate a move to open access partly as a means to moderate what they regard as commercial publishers' excessive profit margins. Tim Gowers, the University of Cambridge professor of mathematics whose pledge in January to boycott Elsevier has so far been echoed by 12,000 academics, told Times Higher Education that he was concerned that the Finch commission was not intending to address the issue of publishers’ profits. ‘There may be good reasons for not specifying a benchmark article fee, but I see no argument against at least establishing a principle that such fees should be for the purpose of covering publication costs rather than replacing lost profits,’ Gowers said... At the very least, he said, the group should specify that ‘if two journals in the same area are of comparable quality, then funding bodies will cover publication charges at the level of the cheaper one.’ Robert Kiley, head of digital services at the Wellcome Trust and a Finch group member, said the existence of downward pressure on article fees was demonstrated by the fact that fees charged by recent start-ups were considerably less than the ‘standard’ charge of $3,000 (£1,900). But Peter Murray-Rust, reader in molecular informatics at Cambridge, said there was little market pressure on publishers to bring down costs or improve their products. ‘There is even less market force in the gold model, where publishers can charge what they like with no regulation,’ he said. ‘The market often resembles personal vanity products, where only the brand matters and cost of production is irrelevant.’”

On Predatory Publishers: a Q&A With Jeffrey Beall

Posted: 07 Jun 2012 01:06 PM PDT

 
On Predatory Publishers: a Q&A With Jeffrey Beall
Carl Elliott
Brainstorm, (05 Jun 2012)
Use the link to access the interview, which is introduced as follows: “If your incoming flow of email spam looks anything like mine, it probably features a regular invitation to submit an article to a journal you have never heard of, or to be a part of its editorial board, or maybe even to edit the journal.  The names of the publishers vary, but the invitations usually look something like this one, which arrived last week... ‘I am very pleasure that you can read this letter. Given the achievement you made in your research field, we sincerely invite you to join the Editorial Board for the Advances in Bioscience and Biotechnology (ABB). We are looking for Editorial Board members and Editor-in-Chief with renewal options.’ And if you attempt to find out more, very soon you will find yourself looking at Beall’s List of Predatory, Open-Access Publishers, a sardonic, highly informative guide to a particular sort of publishing scam.  The author of the list is Jeffrey Beall, an academic librarian at Auraria Library, University of Colorado at Denver, and the man behind the Scholarly Open Access blog.  He graciously agreed to explain the scam to me.”

A success, and a long road ahead

Posted: 07 Jun 2012 01:06 PM PDT

 
A success, and a long road ahead
Kevin Smith, J.D.
Scholarly Communications @ Duke, (04 Jun 2012)
“Last night the We the People petition to encourage public access to the results of taxpayer-funded research reached and exceeded its goal of 25,000 signatures, so we should expect a response from the White House.  Thank you all who signed the petition!  It is impressive to reach the goal in only half the allotted time. If you have not signed, please do so anyway.  The more signatures on the petition the clearer it will be that this is an issue the White House should embrace during this election year.  And an ever-growing list of signatures will also help shape the response; we want substantive action here, not simply assurances of further study. It is probably simply coincidence that on Friday the UK Publishers Association released a report purporting to show that public access to research articles after a six-month embargo, which is a move being considered by the seven Research Councils UK, would result in large-scale subscription cancellations... The timing of the report is clearly intended to influence the RCUK; indeed, a short comment on the Publishers Association website takes an extremely Chicken Little approach to the report, assuring policy makers that a six-month embargo from the RCUK would ‘cause publisher collapse.’ But there are a couple of problems with the report that should prevent policy makers and the public from accepting its assertions too readily.  And even after acknowledging that the report probably claims to prove too much, we might still ask, ‘so what.’ One problem with the report is that it is based on an excessively simple survey.  A single question was sent to 950 librarians, with 210 replies.  The question was just this: ‘If the (majority of) content of research journals was freely available within 6 months of publication, would you continue to subscribe?’ The problem, of course, is that there is insufficient context for librarians to make a reasoned response to this broad question.  What is a ‘majority?’ How available, and searchable, would these articles be?  And continue to subscribe to what? This last question is particularly important, because the Publishers Association lumped together those responses that said they would cancel everything, which is a very small number, with those who said they would cancel some journals.  The two percentages are simply combined in the press release that announces impending disaster.  But surely the publishers realize that some cancellations are going to happen regardless of public access, due to their own pricing policies... So it appears that the question was too broad, attempts by respondents to be more specific were brushed aside, and then different answers (some, inevitable, cancellations v. sweeping cancellations) were combined to create a picture of disaster with which to frighten policy makers.  But even if we acknowledge the flaws in this study, it is also useful to ask whether the result it predicts would really be a disaster... To say that scholarship will dry up if some publishers go out of business is simply not true; scholarship will continue and it will find new ways, and likely more efficient ones, to reach those who want or need to read and use it.  This is already happening, which is why the publishers are so frightened in the first place... Transformational change is coming, and if publishers cannot find a way to adapt, we should not worry over much, at least not to the point of failing to experiment with new options... 
The Publishers Association links this study to an argument that the only form of open access that should be encouraged is ‘Gold’ OA, in which (sometimes) a fee is paid by the author in order for the work to be free for access for all readers.  Certainly fully open access journals, which is the real meaning of gold OA, have an important role to play in the future.  But note that only some of them are supported by author-side fees.  The new journal eLife, which hopes to rival Science and Nature, is supported by research funders and will not charge publication fees.  And these kinds of fully OA journals are quite different, one suspects, from what the Publishers Association means by Gold OA.  They are most likely referring to open access to individual articles for which a special fee has been paid, while the other contents of the journal as a whole remain behind a subscription barrier. In short, in their ‘embrace’ of gold open access the Publishers Association is asking the public to pour more money into the inefficient system they have created, not less.  Gold OA will be part of the short term future, I believe, which is the only future we can dare to predict.  Green OA, including the kind of public access that both the White House and the RCUK are considering, will also be a part of the future, and is likely to prove the more sustainable option.  But the hybrid OA model on which the publishers want to pin their futures must be only a transitional step toward full gold OA; it is not a sustainable approach for even the short term, much less the long haul.”

York librarian validates social media metrics for research impact evaluation

Posted: 07 Jun 2012 01:02 PM PDT

 
York librarian validates social media metrics for research impact evaluation
YFile, (06 Jun 2012)
“York Business Librarian Xuemei Li has become the first researcher to ever have a study validating the usefulness of altmetrics published in an academic journal. Altmetrics is the study of social media metrics used for analyzing and informing scholarship. Li’s first research study was published in April 2012 in Scientometrics and Li’s second study was accepted by the 17th International Conference on Science and Technology Indicators, taking place in September 2012. ‘Researchers are integrating various social media tools such as blogs, wikis, Twitter and social bookmarks into their research processes to save, organize, share, and disseminate various research sources. It is even more difficult for traditional bibliometric indicators to capture the totality of research influence on the web,’ Li explains. ‘Nevertheless, the traces left by researchers and the general public through those social media tools hold big potential for measuring different research influences, and this is what altmetrics aims to measure. Altmetrics can be used to complement traditional citation-based measurements.’ Li’s first study of altmetrics sampled 1613 papers published in Nature and Science in 2007 and compared citations with reader counts. She found significant statistical correlations between citations from Web of Science (now the Web of Knowledge) and Google Scholar and reader counts from the social media bookmark tools CiteULike and Mendeley. The findings suggest that the type of scholarly influence one’s research has – as measured by these social media tools – is related to traditional citation-based impact. Li’s second study compared nearly 1400 Faculty of 1000 (F1000) post-publication peer reviews and Mendeley usage data with traditional bibliometric indicators. This study suggests that F1000 – a database that stores only the best quality biomedical articles after they’ve been published, as selected by over 10,000 faculty members worldwide – is good at acknowledging the merit of an article from (the F1000) experts’ point of view while Mendeley reader counts are more closely related to citation counts. ‘Faculty are striving to demonstrate the impact of their research in a world where the web has become a critical communications channel,’ says Cynthia Archer, York University librarian. ‘Li’s ground-breaking research serves to validate the usefulness of social media based altmetrics to monitor and track faculty research impact.’ Li and other researchers are working hard to identify, monitor, and evaluate potential social media tools towards building reliable altmetric indicators.”
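
[Editor's illustration, not from Li's study: the kind of comparison described above is typically a rank-correlation analysis between per-paper citation counts and social-bookmark reader counts. The minimal Python sketch below uses invented numbers and variable names purely to show the mechanics; the study's actual data and methods are in the Scientometrics paper.]

# Illustrative only: hypothetical per-paper counts (invented data)
from scipy.stats import spearmanr

citations = [120, 45, 300, 12, 80, 210, 5, 150]   # e.g. Web of Science citation counts (hypothetical)
readers   = [95, 30, 250, 20, 60, 180, 8, 110]    # e.g. Mendeley reader counts (hypothetical)

# Spearman's rank correlation, a common choice in altmetrics comparisons
rho, p_value = spearmanr(citations, readers)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")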

Why MIT’s Technology Review is going digital first — Tech News and Analysis

Posted: 07 Jun 2012 01:01 PM PDT

 
Why MIT’s Technology Review is going digital first — Tech News and Analysis
gigaom.com
“Magazines and newspapers of all kinds have been experimenting with paywalls, iPad apps and other methods of handling the ongoing disruption that the web and digital media have produced, but very few have taken a fully ‘digital first’ approach. MIT’s well-respected Technology Review magazine has become the latest to embrace that principle, and editor Jason Pontin says that while he isn’t turning his back on print, it is no longer the most important medium for the brand of journalism his magazine practices. I talked with Pontin on Monday about the decision, as well as several related questions — including his dislike of paywalls and what he wants to implement instead. In a note to readers published on the site, Pontin said that everything the magazine produces will be published free-of-charge on the website and will appear there first — in other words, nothing will be ‘saved’ for the printed version of the magazine, as some publications do in order to give the print version some sense of exclusivity. Some stories and content will be published first online and later in print, and others will be published simultaneously in a number of different media. And print will be just one of many forms, he said: ‘For us, print will be just another platform [and] by no means the most important. I began as a traditional print journalist, and I still delight in what print does well. But there’s almost nothing… that print now does best.‘ When I asked Pontin why he decided the magazine needed to go digital first, and what that transformation meant to him as an editor, he said that focusing on digital above all else is important because it ‘promotes innovation and excellence’ and that Technology Review is doing it because ‘we want to be a better publisher, we want to publish smarter and more link-y journalism, we want to create more beautiful and interactive designs, and want to better serve our advertising partners in more innovative ways.’ One thing Pontin said he doesn’t mean by digital first is the kind of open-door, ‘user-generated content’ approach taken by some digital-native publications... Pontin said he doesn’t agree with the idea that ‘digital-first publishers should throw open their editorial pages to so-called content from a ragbag of constituents, including non-writers and marketers...’ As he put it in his note to readers: ‘Paywalls, no matter how elegantly devised, have for me the smell of paper and ink, as if publishers were trying to revive a subscription business irremediably tied to the distribution of physical products.’ All of the different variations of paywall that the magazine tried had drawbacks, Pontin said in his interview with me, and “the all-or-nothing was the most disastrously bad of all options.” As a result, Technology Review will be rolling out some form of membership-based model over the next few months, which will ask readers to pay for specific features or methods of distribution — the site has a survey that asks what things readers might be willing to pay for, such as freedom from advertising or customized content... 
‘We’d like to have a free-for-use site, because there’s an essential linkiness that seems to demand an entirely free site — and also because MIT is an institution committed to openness — and then we want to experiment with what we think people will pay for, such as membership models, and different forms of distribution and platforms.’ Pontin also noted that at least some of what he has been able to do with Technology Review was made possible because the magazine is a not-for-profit corporation controlled by the Massachusetts Institute of Technology. Although he still has to pay for things out of a fixed budget, Pontin said this public mandate did make it somewhat easier to take risks or experiment with different models (in the same way that British newspaper The Guardian was able to take a ‘digital first’ approach thanks in part to being owned by a charitable trust rather than a for-profit or publicly-traded corporation). ‘I have enormous sympathy for established media companies,’ Pontin said. ‘There are smart editors out there and smart publishers, and at a high level the lineaments of what the future of publishing will look like are more broadly understood than new media critics sometimes allow. But in many cases those smart publishers and editors are constrained by the actual economic demands of publishing companies.’ Unless some of those constraints are removed, Pontin suggests, those entities could face a very bleak future indeed.”

Argentina takes steps towards open access law - SciDev.Net

Posted: 07 Jun 2012 11:13 AM PDT

 
Argentina takes steps towards open access law - SciDev.Net
www.scidev.net
Argentina is a step closer to becoming the first country to pass legislation to make all publicly funded research available in open access repositories. The Chamber of Deputies passed a new bill last month (23 May) stating that all national scientific institutions must provide open access (OA) archives of their research, allowing the public full access to journal articles, dissertation theses and technical reports, as well as data obtained by publicly funded projects, but excluding confidential data.

Springer are digging themselves deeper into a hole

Posted: 07 Jun 2012 10:16 AM PDT

 
Springer are digging themselves deeper into a hole
Mike Taylor
Sauropod Vertebra Picture of the Week #AcademicSpring, (05 Jun 2012)
“Oh dear, this is depressing to watch... The Problem... Last year (2011-12-01), Peter Murray-Rust of Cambridge University published an article in BMC’s Journal of Cheminformatics — which, like all BMC journals, is owned by Springer. Note that the journal is open access, and that the ‘Open Access’ button on the article’s page links to Springer’s open access page, which says: ‘All articles are published under the Creative Commons Attribution (CC-BY) license, enabling authors to retain copyright to their work...’ Yesterday, Peter found that figures from this CC BY-licenced paper, to which the authors retained copyright, had been co-opted by Springer Images, with the following claim at the bottom of the page: ‘License... This image is copyrighted by BioMed Central Ltd. This image is published with open access and made available for noncommercial purposes. For more information on what you are allowed to do with this image, please see the Creative Commons pages. If you would like to obtain permissions for the re-use or re-print of this image, please click here.’ This is a very, very bad thing. If you doubt it, consider what Springer’s attitude would be if I took material that they owned the copyright on and claimed that it was mine. It would not be pretty. Peter looked around Springer Images some more. What he found there was also not pretty. He found that they had also wrongly claimed copyright for CC BY images from Wikipedia (more details) and from PLoS. Maybe more interesting still, Peter’s browsing in Springer Images shows that they have also pre-empted copyright on non-CC materials owned by rival Big Four academic publisher Wiley. Will Wiley pursue Springer for this violation? We can only hope so — after all, we’re often told that the reason for copyright transfers is that the publishers have the resources to do these things on our behalf... Precedent... I just found out that Klaus Graf reported this very problem back in 2009. [In German: English translation.] Nothing was done about it then, but let’s be charitable and assume that’s because it never came to Springer’s attention... Springer’s Responses... First up, Bettina Goerner, Springer Science+Business Media’s Open Access Manager, who spoke with Peter: ‘Something has gone wrong. Springer is working very hard. They hope to fix it by July.’ By July?! So what we’re being told is this: Springer have a grotesque attribution, licencing and copyright problem on their Images site, whether by design or accident, which results in their gaining revenue from material that is not theirs. And they intend to continue profiting from it for another month. Not acceptable! ... But the one that provoked me to write this article is this thick wedge of doublespeak posted on Google+ by Wim van der Stelt, Executive Vice President of Corporate Strategy. I’ll quote it in full so no-one thinks I am misrepresenting it [use the link to access the full text of the letter quoted here] ... I won’t respond to this phrase by phrase — in part because Peter has already done so — but I will quote the response that I posted on SpringerOpen’s Google+ page: ‘Dear Springer, ARE YOU COMPLETELY OUT OF YOUR MINDS? Have you not been watching what’s happened to Elsevier? You have screwed up royally on Springer Images. And your response is to blame Peter Murray-Rust for exposing your copytheft? 
If you want to come out of this with any shred of credibility intact, and not as the targets of the next Cost Of Knowledge boycott, you need to PROPERLY APOLOGISE RIGHT NOW: first, to the people whose work you stole, then to Peter for your contemptible blame-shifting. Once you’ve done that, we can start to think about whether we can move forward with you. Just calling yourselves “Springer Open” is not going to get the job done.’ It’s shocking to me that, after all the developments of the last six months, with all the new awareness of what publishers are up to, and with all the active engagement with revolutionising scholarly publishing, Springer think they can make this go away by attacking the messenger. It won’t work. Springer now have a very narrow window in which to try to unwind this clodhopping manoeuvre. They need to undo all that they’ve done regarding the Springer Images debacle, and apologise unreservedly to Peter for their entirely baseless suggestion that he is somehow in the wrong for pointing out their wrongdoing. If they don’t do it, then I doubt the results will be spectacular; but they will be profound. All around the world, researchers will quietly classify Springer in the ‘just as bad as Elsevier’ bucket. We’ll stop submitting to Springer journals. We’ll stop recommending them to our friends, colleagues and students. We’ll stop volunteering as editors and reviewers. Quietly but inexorably, the life-blood will be sucked out of Springer, just as it is being from Elsevier. Oh, and Wiley? Take the chance now to get your house in order before someone notices something that you’re doing. There’s nowhere to hide misdeeds in 2012. Someone’s going to notice.”

Publishers' Association survey on subscriptions: methodological critique

Posted: 07 Jun 2012 10:16 AM PDT

 
Publishers' Association survey on subscriptions: methodological critique
poeticeconomics.blogspot.ca
“... The Association of Learned, Professional and Society Publishers (ALPSP) and the Publishers' Association just released the results of a survey asking if libraries would continue subscriptions if a majority of the content of research journals was freely available. The results of this study (downloadable from here) appear alarming - predicting catastrophic losses of subscriptions and subsequently journals and publishers! This post is a methodological critique. In summary, the recommendation of this study (against an OA mandate with a 6-month embargo) is not supported by the research presented - and most importantly, by the research omitted, which is in brief all of the considerable evidence that would counter this recommendation, including a 2006 study by ALPSP itself which is not cited... The recommendation from this [current] study is: ‘It is strongly recommended that no mandate is issued on making all or most journal articles available free of charge after a six month embargo until both libraries and publishers have had time to understand the issues better and have together taken steps to explore alternatives to a fully open access publishing model which could be mutually attractive.’ Here is the simple question asked by this ALPSP study: ‘If the (majority of) content of research journals was freely available within 6 months of publication, would you continue to subscribe? Please give a separate answer for a) Scientific, Technical and Medical journals and b) Humanities, Arts and Social Sciences Journals if your library has holdings in both of these categories.’ Methodological critique: [1] The response rate to this study was 26%. With any survey, there is always the possibility of response bias... [2] A 2006 ALPSP survey... found that cancellations by librarians would likely be minimal even with IMMEDIATE free access... For example, even if 79% of content were freely available immediately on publication, only 10% of libraries surveyed indicated that they would consider cancelling the journals. That this study, conducted by the same association, which found very different results, was not cited... [3] The recommendation of this study implies that the only possible source of revenue for scholarly journals is subscriptions. Even for traditional journals, this has never been the only source of revenue... A growing number of libraries provide funds to cover article processing fees for open access. Why did the study not ask libraries if they have such a fund, or would consider developing one if there was a mandate for open access within 6 months? Why not mention that there are now more than 7,000 fully open access journals - including profitable commercial journals, and hundreds of society journals as noted by Suber & Sutton?... [4] A key recommendation of this study is that ‘no mandate is issued on making all or most journal articles available free of charge after a six month embargo until both libraries and publishers have had time to understand the issues better and have together taken steps to explore alternatives to a fully open access publishing model which could be mutually attractive’. This is puzzling. If the researchers think that libraries and publishers should work together (I agree), then why was this a publisher-only study? Wouldn't a combined library / publisher / scholar study - like the PEER project - be a better approach? Further, this recommendation implies that the idea of libraries and publishers working together to make open access happen is a new one. 
As I've explained in a bit of detail in my previous post Society publishers: time to quit whining and make the leap to open access, these discussions have been going on for more than a decade. Why not cite some of these discussions and related research, such as the survey conducted by myself and other researchers across Canada to figure out how to help journals make the leap to open access? This is just one example!”

Protectionism Against the Past (or: Why are Copyright Terms so Long?)

Posted: 07 Jun 2012 10:14 AM PDT

 
Protectionism Against the Past (or: Why are Copyright Terms so Long?)
Julian Sanchez
Julian Sanchez, (05 Jun 2012)
“Under current law, this blog post will remain under copyright until 70 years after my death—which if I’m lucky means a century or more from the date of authorship. That’s an insanely long time when you consider that most economic studies have shown there’s almost no marginal incentive effect on production once you extend copyright terms much beyond the original span: 14 years renewable once, or 28 years total. Why would we needlessly lock away our own culture for so long? One popular answer is the Mickey Mouse Theory. Though the effective commercial lifespan of the vast majority of copyrighted works is just a few years, a very few—like some of Disney’s iconic properties—continue to be immensely profitable for much longer. The owners of these properties then throw gobs of money at Congress, which ritualistically serves up a retroactive extension whenever these come within spitting distance of the public domain in order to protect their cash cows (or mice, as the case may be). No doubt there’s something to that. Yet if that were the sole concern, you’d think the content industries would prefer a renewal structure that maxed out at the same term. The cost of renewing the registration of their profitable (or potentially profitable) works would be trivial for the labels and studios, but they’d also gain access to orphan works that nobody was making any use of. Our system, by contrast, seems perversely designed not just to provide extended protection for revenue-generating works, but to guarantee a minimal public domain. Here’s an alternative hypothesis: Insanely long copyright terms are how the culture industries avoid competing with their own back catalogs. Imagine that we still had a copyright term that maxed out at 28 years, the regime the first Americans lived under. The shorter term wouldn’t in itself have much effect on output or incentives to create. But it would mean that, today, every book, song, image, and movie produced before 1984 was freely available to anyone with an Internet connection. Under those conditions, would we be anywhere near as willing to pay a premium for the latest release? ...”

#springergate: SpringerImages should be closed down until they mend it.

Posted: 07 Jun 2012 10:05 AM PDT

 
#springergate: SpringerImages should be closed down until they mend it.
petermr's blog, (06 Jun 2012)
[Use the link to access the complete blog post as well as the comments section where debate continues.] “Five days ago I wrote to Springer about violations of copyright on their site, SpringerImages. Since then I have documented everything on this blog and those who want to know more details can read recent blogs. I have made it clear that I consider the current practice unacceptable, morally, legally, and ethically. Springer rang me yesterday, agreed to put out a factual statement about the site. They then contacted me and asked me to retract what I had said and its implications. I said I would retract the word ‘theft’. Much of the rest of what I have said is fact. Springer have not yet explained the problem. The current position is summarised by Mike Taylor http://svpow.com/2012/06/05/springer-are-digging-themselves-deeper-into-a-hole/... Springer have a grotesque attribution, licencing and copyright problem on their Images site, whether by design or accident, which results in their gaining revenue from material that is not theirs. And they intend to continue profiting from it for another month. Not acceptable! At the very least, the Springer Images site should immediately be modified to show a prominent banner stating “the copyright and licence information pertaining to these resources is wrong: contact the original creators for permissions” until the mistakes are all fixed. That is the least they can do. Since I may be asked to RETRACT opinions I shall stick to FACTs and labelled HYPOTHESES. I shall also deal ONLY with non-OA content. (The problem that alerted me was in the mislabelling of my OA CC-BY material.) I would welcome correction of what follows: [1] FACT: SpringerImages are still listing my content as “copyright BMC”, 5 days after my reporting it. [2] FACT The site is a commercial site (confirmed by Bettina Goerner). As an example, if an academic wishes to use a Springer image in a course pack it will cost USD60. [3] FACT Individual (non-corporate) membership costs USD595 (presumably per year) from the site [4] FACT: Many of the licensing algorithms (and I found it very difficult to get quotes) refer to “agents of a commercial organization” and “member of the pharmaceutical industry”. HYPOTHESIS: they also sell to industry and generate income. [5] HYPOTHESIS: Much (probably most) of the SpringerImages site is taken from non-OA material sources. [6] FACT much of it is copyrighted ‘Springer’ (various Springer companies such as Springer Verlag, Springer Medizin, etc.) [7] FACT the visitor to the site is told that they require a subscription to view the images. [8] FACT I looked for apparent, alleged, violations of third party copyright (such as Wikipedia). Out of the first ten examples I looked at all were copyrighted Springer. [9] HYPOTHESIS Some of the authors of these materials have not given Springer explicit permission to include them in Springer Images, change the copyright and resell them. [10] FACT after 5 days I have been unable to find any changes to the site as a result of reporting the problem(s). [11] FACT Springer are aware that there are images on the site that are mislabelled. [12] FACT They are continuing to sell them [13] FACT Springer have made no public announcement to customers of SpringerImages. [14] HYPOTHESIS some customers will pay for material that Springer does not have the right to sell to them [15] HYPOTHESIS some customers will pay for material that should be branded as FREE (gratis and libre)...”

Academics are revolting: the open-access frontier

Posted: 07 Jun 2012 10:04 AM PDT

 
Academics are revolting: the open-access frontier
Catherine Moffat
Overland literary journal, (05 Jun 2012)
“... Authors and creators, keen to add the ‘value’ and prestige of a publisher to their work, or just excited that someone – anyone, is interested in publishing them, often give their work away with little thought of remuneration or what’s happening to their copyright. Of course, publishers rightly argue they have the costs of editing, producing, advertising and distribution, not to mention the financial risks. It’s about adding value. The rush to publish in ‘quality’ journals at any cost is particularly true of academic publishing... Check out what’s happening at Sydney University (and coming soon to a university near you), where a retrospective performance marker of four research publications in the last three years is being used as one criterion to cull the ‘non performing’. Of course, academics, unlike your average author, generally have the benefit of an income that is independent of their publications. As Louise Adler pointed out in The Australian last year, authors and creators not publishing in academia are probably in more actual danger of perishing, given, ‘the average annual income of Australian writers has declined in the past decade from $23,000 to a character-building $11,000’... In a recent appearance at the Sydney Writer’s Festival on a panel discussing the implications of the Protect Intellectual Property Act (PIPA) and the Stop Online Piracy Act (SOPA), Jeff Jarvis stressed the need for authors to develop new business models. While this may be good advice, many authors, and academic authors in particular, risk becoming trapped in their existing arrangements, relying on proving their publication record in traditional formats. But as the cost of academic journals rise, many are asking who benefits most from the current model? Philip Soos in a piece on The Conversation, ‘The Great Publishing Swindle’, argues that what was intended as a public good has become a publishing monopoly. Danny Kingsley, also writing on The Conversation about the US Research Works Act – a Small Bill in the US, a Giant Impact for Research Worldwide – breaks it down like this: ‘Publicly funded research being undertaken by researchers who are often themselves (in Australia almost exclusively) also publicly funded, is written up and submitted to a publisher. The publisher sends it back out to the academic community to peer review the work, for no charge. Many of the editors of journals are also academics who again are doing the work gratis. The publisher then adds the journal design to the article and publishes it, charging disproportionally large subscription fees for access to the work. These fees are paid by university libraries, again, with public funding.’ And they are big fees. Each year university libraries pay millions of dollars to give scholars the right to access material that in the main has been provided by the same scholars, free of charge. Prices continue to increase while university budgets diminish and, as Adam Habib outlines, developing countries are priced out of the market. While authors are using the internet to experiment with new ways to reach their audience, many academics are also revolting and choosing open-access publishing models that offer the same peer reviewed guarantee while allowing anyone who wants it, access to their work. Take a look at this open-access.net clip for a really simple overview... Unless you’re Kim Dotcom making an estimated $175 million from alleged illegal filesharing, it’s unlikely you’ll find the FBI on your doorstep any time soon. 
I can’t help thinking, however, that we’ve been living in the digital equivalent of a frontier town, and sooner or later, for good or ill, the lawmen ride in and the frontier gets tamed. Big Copyright has been slow to mobilise and grasp the monetary potential of the web, but it’s starting to lumber to its feet, and when it gets there you can be sure the 99% are going to be left hanging. The extensions of PIPA, SOPA and the Research Works Act to copyright embedded in free trade agreements and similar actions are indications that the giant is awake now...”

MLA embraces "open access" writer agreements for journals | Inside Higher Ed

Posted: 07 Jun 2012 09:55 AM PDT

 
MLA embraces "open access" writer agreements for journals | Inside Higher Ed
www.insidehighered.com
“Literary scholars on Twitter were offering praise Tuesday for an announcement by the Modern Language Association that it is adopting a new author agreement for its journals (including the flagship PMLA) that will leave copyright with authors, enabling them to post versions in open access repositories, or on individual or departmental websites. The reactions included ‘Fantastic,’ ‘Great open access news,’ ‘very cool and important’ and ‘a watershed [for open access] in the humanities?’ The open access movement has in some ways made the most headway in the sciences, where requirements from federal agencies and other funders have many times forced journals to permit authors to post their papers in repositories that have no paywall. Humanities journals, in contrast, publish relatively little work that is the direct result of grants, so these publications (and the disciplinary groups that run them) have been able to consider these issues without government pressure. Rosemary G. Feal, executive director of the MLA, said that the association's new policy ‘was not responding at all’ to the legislation and regulations. Rather, she said, ‘we see that publishing needs are changing, and our members are telling us that they want to place their scholarship in repositories, and to disseminate work on blogs.’ Professors want to produce articles that ‘circulate freely,’ she said, and that reach as many people as possible. The new MLA policy appears to move beyond those of other humanities organizations -- although some of them have created ways to work with authors who want their scholarship in open access repositories... Many disciplinary associations have been dubious of the open access movement, saying that it would hurt their revenues from journals (either directly through subscriptions or indirectly as an incentive to become a member of the association). Feal said she did not share those concerns. ‘We believe the value of PMLA is not just the individual article, but the curation of the issue,’ she said. PMLA regularly includes thematic issues or issues where articles relate to one another. While there will be value in reading individual articles, she said, that does not replace the journal. Further, she said, the individual articles posted elsewhere could attract interest to the journal. The MLA's journal policies tend to be watched by many humanities related associations, and Feal said she hoped the MLA change would have an influence in encouraging open access. ‘While recognizing that each association and journal has its own business model, we hope that they will find ways, like the MLA, to disseminate their scholarship broadly.’ Brett Bobley, director of the Office of Digital Humanities at the National Endowment for the Humanities, said he was very pleased by Tuesday's news... He said that the MLA, ‘as one of the largest scholarly societies in the humanities,’ could inspire other groups to experiment. And Bobley said that the results could broaden the reach of new ideas being published in these journals. ‘It will be interesting to see the results of this new policy the MLA has put in place,’ he said. ‘Might it, for example, greatly expand readership to some of their published articles? We'll have to wait and see.’”

MLA Journals Adopt New Open-Access-Friendly Author Agreements

Posted: 07 Jun 2012 09:54 AM PDT

 
MLA Journals Adopt New Open-Access-Friendly Author Agreements
www.mla.org
“The journals of the Modern Language Association, including PMLA, Profession, and the ADE and ADFL bulletins, have adopted new open-access-friendly author agreements, which will go into use with their next full issues. The revised agreements leave copyright with the authors and explicitly permit authors to deposit in open-access repositories and post on personal or departmental Web sites the versions of their manuscripts accepted for publication. For more information on the new agreements, please contact the office of scholarly communication.”

ARSSF Signs Berlin Declaration | Agricultural Information Management Standards (AIMS)

Posted: 07 Jun 2012 09:53 AM PDT

 
ARSSF Signs Berlin Declaration | Agricultural Information Management Standards (AIMS)
Food and Agriculture Organization of the United Nations (FAO)
“Wed, 06/06/2012 — Sridhar.Gutam... It is a great pleasure to inform you that Agricultural Research Services Scientists’ Forum had signed Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities (w.e.f 18/5/2012) and is added to the list of signatories. Thanks & Regards... Sridhar”

The ARL/DLF/DuraSpace 2012 E-Science Institute

Posted: 07 Jun 2012 09:52 AM PDT

 
The ARL/DLF/DuraSpace 2012 E-Science Institute
duraspace.org
“The 2012 Association of Research Libraries/Digital Library Federation (ARL/DLF)/DuraSpace E-Science Institute (ESI) will be held during the 2012-2013 academic year for academic and research library audiences seeking opportunities to boost institutional support and management of e-research. Please enter your email address above to be notified when registration information is available...”

What's On @ ANU - Events- Universities: national investment or a waste of money

Posted: 07 Jun 2012 09:50 AM PDT

 
What's On @ ANU - Events- Universities: national investment or a waste of money
billboard.anu.edu.au
Use the link above to get more information about events from Australian National University (ANU): “Universities Australia CommUNIty Forums are an opportunity for people to connect with universities and influence the future education agenda. We want to encourage individuals in the community to communicate their understanding of how our universities support and shape Australia. We want to understand how Australians value universities and everything they can do for our nation. Universities Australia, the voice of Australian universities, wants to hear your voice. The first forum, hosted by The Australian National University, will cover the topic Universities: national investment or a waste of money? Moderated by the ABC’s Emma Alberici, the panel includes: [1] Kate Carnell AO: CEO of beyondblue, former ACT Chief Minister [2] Professor Simon Marginson: Higher Education, University of Melbourne [3] Professor Stephen Parker: Vice-Chancellor and President of the University of Canberra.”

Future Tense: Academic Journals and the Price of Knowledge

Posted: 07 Jun 2012 09:49 AM PDT

 
Future Tense: Academic Journals and the Price of Knowledge
Sunday 10 June 2012
Use the link to access information about the upcoming June 10, 2012 radio broadcast from Future Tense, a program of Radio National, Australia. The website offers the following description of Future Tense: “New ideas, new approaches, new technologies – exploring the edge of change… Future Tense dissects and analyses the social, cultural and economic faultlines arising from rapid transformation. Nothing is outside our brief - from politics to social media to urban agriculture. Our weekly half-hour program/podcast is available from iTunes and is broadcast across Australia on Radio National. We can also be heard throughout the Asia/Pacific region on Radio Australia, across Canada via CBC Radio and in Ireland on RTE Choice.” The upcoming broadcast is described as follows: “Librarians and academics all over the world are up in arms. They’re angry and they’re making their displeasure known—they’re as mad as hell and they're not going to take it anymore! We explore the international debate over the cost of academic journals and the implications for access to knowledge.”

Reply from and to Springer

Posted: 07 Jun 2012 09:48 AM PDT

 
Reply from and to Springer
petermr's blog, (05 Jun 2012)
[Use the link to access the full text of the letter sent by Mr. Wim van der Stelt, Executive Vice President, Corporate Strategy, Springer Science+Business Media, in reply to the blogger, Mr. Peter Murray-Rust, University of Cambridge, regarding the use and reuse of images by Springer Images.] The blog post opens as follows: “Springer have given an answer to some of my questions – I interleave my comments... ‘>>We have contacted Peter Murray-Rust, a blogger, to discuss Springer Images.  Mr Murray-Rust has drawn attention to problems with the www.springerimages.com website and Springer is working flat out to correct them. Mr Murray-Rust has, on his blog (http://blogs.ch.cam.ac.uk/pmr/2012/06/05/springergate-update-from-bettina-goerner-and-some-explanations-i-urge-that-scientific-images-should-be-free-as-in-speech-for-everyone/), made allegations that are untrue and we would like to respond to them...’ If Springer can show to my, and public satisfaction that any of my statements are untrue, then I will retract them...”

Science in the Open » Blog Archive » Added Value: I do not think those words mean what you think they mean

Posted: 06 Jun 2012 09:17 AM PDT

 
Science in the Open » Blog Archive » Added Value: I do not think those words mean what you think they mean
cameronneylon.net
“There are two major strands to the position that traditional publishers have taken in justifying the process by which they will make the, now inevitable, transition to a system supporting Open Access. The first of these is that the transition will cost ‘more money’. The exact costs are not clear but the, broadly reasonable, assumption is that there needs to be transitional funding available to support what will clearly be a mixed system over some transitional period. The argument of course is how much money and where it will come from, as well as an issue that hasn’t yet been publicly broached, how long will it last for? Expect lots of positioning on this over the coming months with statements about ‘average paper costs’ and ‘reasonable time frames’, with incumbent subscription publishers targeting figures of around $2,500-5,000 and ten years respectively, and those on my side of the fence suggesting figures of around $1,500 and two years. This will be fun to watch but the key will be to see where this money comes from (and what subsequently gets cut), the mechanisms put in place to release this ‘extra’ money and the way in which they are set up so as to wind down, and provide downwards price pressure. The second arm of the publisher argument has been that they provide ‘added value’ over what the scholarly community provides into the publication process. It has become a common call of the incumbent subscription publishers that they are not doing enough to explain this added value. Most recently David Crotty has posted at Scholarly Kitchen saying that this was a core theme of the recent SSP meeting. This value exists, but clearly we disagree on its quantitative value. The problem is we never see any actual figures given. But I think there are some recent numbers that can help us put some bounds on what this added value really is, and ironically they have been provided by the publisher associations in their efforts to head off six month embargo periods... The key data points I want to use are as follows: [1] All publisher associations and most incumbent publishers have actively campaigned against open access mandates that make the final refereed version of a scholarly article, prior to typesetting, publication, indexing, and archival, available online in any form either immediately or within six months after publication. The Publishers Association (UK) and ALPSP are both on record as stating that such a mandate would be ‘unsustainable’ and most recently that it would bankrupt publishers. [2] In a survey run by ALPSP of research libraries (although there are a series of concerns that have to be raised about the methodology) a significant proportion of libraries stated that they would cut some subscriptions if the majority of research articles were available online six months after formal publication. The survey states that it appeared that most respondents assumed that the freely available version would be the original author version, i.e. not that which was peer reviewed. [3] There are multiple examples of financially viable publishing houses running a pure Open Access programme with average author charges of around $1500. These are concentrated in the life and medical sciences where there is both significant funding and no existing culture of pre-print archives. [4] The SCOAP3 project has created a formal journal publication framework which will provide open access to peer reviewed papers for a community that does have a strong pre-print culture utilising the ArXiv. Let us start at the top...”

UCSF sets example for open access to research

Posted: 06 Jun 2012 09:16 AM PDT

 
UCSF sets example for open access to research
www.highlandernews.org
“UC San Francisco (UCSF) has become the largest scientific institution in the nation and the first UC campus to allow for public access to its research publications. The university, which publishes around 4,500 scholarly papers each year, will be following in the footsteps of other notable universities such as Harvard and the Massachusetts Institute of Technology. ‘Our primary motivation is to make our research available to anyone who is interested in it, whether they are members of the general public or scientists without costly subscriptions to journals,’ stated UCSF School of Medicine Professor Richard A. Schneider in an article by the UCSF Newsroom. Schneider spearheaded the open-access policy initiative and has eight years of experience on the UCSF Academic Senate Committee on Library and Scholarly Communication, for which he currently serves as chairman. A reason that a significant number of research institutions have not adopted the open-access policy is illustrated by the obstacle that UCSF now faces: convincing commercial publishers to change their contracts. ‘The system as it is currently designed where you have to pay to get access is only benefiting one group and that is the publishers who are making billions of dollars in profit every year—all off of our free labor,’ stated Schneider in an interview with the Highlander. According to Schneider, commercial publishers have been able to achieve profit margins of over 30 percent—an outcome accomplished by charging fees for access to scholarly journals and renowned research articles. These fees have resulted in institutions such as the University of California paying (usually around $40 million a year) for student and faculty access to large databases. Access by the general public, however, largely remains restricted to fee-based subscriptions, visiting a university library or having access only to research abstracts and summaries of articles. Despite these obstacles, the future for open-access scholarly articles remains relatively promising due to the efforts of potent research donors and organizations such as the National Institutes of Health (NIH)... Since the policy was the result of a collaboration among committees such as UC Committee on Library and Scholarly Communication, the UCSF publications will also be available on databases operated by the UC. ‘The decision is a huge step forward in eliminating barriers to scientific research. By opening the currently closed system, this policy will fuel innovation and discovery, and give the taxpaying public free access to oversee their investments in research,’ stated Schneider.”

Libraries Advocating for Open Access: Best Practices and Lessons Le...

Posted: 06 Jun 2012 09:15 AM PDT

 
Libraries Advocating for Open Access: Best Practices and Lessons Le...
www.slideshare.net
Use the link to access the slideshare posted by Iryna Kuchma, EIFL Open Access Programme Manager, for a presentation given at the Fifth Belgrade International Open Access Conference, National Library of Serbia, May 18-19, 2012.

Have medical journals missed the Web 2.0 roller coaster? - Buck - 2012 - Emergency Medicine Australasia - Wiley Online Library

Posted: 06 Jun 2012 09:14 AM PDT

 
Have medical journals missed the Web 2.0 roller coaster? - Buck - 2012 - Emergency Medicine Australasia - Wiley Online Library
onlinelibrary.wiley.com
Use the link to access the full text article (DOI: 10.1111/j.1742-6723.2012.01570.x) published in the current issue of Emergency Medicine Australasia (EMA) from Wiley. The abstract reads as follows: “Access for clinicians to medical information has never been easier or faster, with the specialty of emergency medicine leading the world in the online dissemination of information to its practitioners. This has allowed those training in and practising emergency medicine to stay abreast of their field with relatively little effort. Australasian emergency physicians are among those pioneering online medical education and the use of social media. Three years ago in this journal Cadogan provided an in-depth synopsis of many of the available Web 2.0 resources that could be of use to medical professionals, and their potential utility and popularity within the rapidly growing Australasian and international emergency medicine community. Unfortunately, it appears that his call to ‘get with the times’ has fallen on deaf ears in the medical journal publishing world.1 Medical journals in general have been slow to adapt to the online paradigm, with few really using the vast array of social media and Web 2.0 tools for distributing journal content. Modern online business and marketing models are also scarce within the journal publishing world. Although some emergency medicine journals have adopted a range of Web 2.0 tools, such as supplemental content in online articles, real-time comments and forums, social media plug-ins, videos, podcasts and more (e.g. Annals of Emergency Medicine, Academic Emergency Medicine and the Emergency Medicine Journal), others (such as Resuscitation, the American Journal of Emergency Medicine, Canadian Journal of Emergency Medicine and this journal Emergency Medicine Australasia) have shown no signs of modernising their content. The latter journals, for instance, simply provide digitised versions of their print articles, and charge significant fees for non-subscribers to purchase individual items. Meanwhile, there is discussion around the utility of social media for scientists and researchers, as many are interested in Web 2.0 tools that will help them ‘do science’ rather than ‘talk about science’. Some feel that journals should be helping develop these tools, for example data gathering and analysis, rather than simply promoting published material.2 Several issues have arisen as a result of the way in which modern medical information is created and the speed at which it can be accessed and distributed, particularly in the field of emergency medicine. These have direct implications for the particular niche of medical journal publishing and social media, and are discussed below.”

Springergate: Systematic “copyglitch” appropriation of Wikimedia content

Posted: 06 Jun 2012 09:13 AM PDT

 
Springergate: Systematic “copyglitch” appropriation of Wikimedia content
petermr's blog, (05 Jun 2012)
[Use the link to access the full blog post including the images discussed below and comments on the post.] “I have communicated with Wikip(m)edia over the apparent systematic relicensing and relabeling of their content into ‘SpringerImages’. It’s fair to say that the individuals I have heard from are seriously upset. The action is clearly a breach of copyright and therefore illegal in most jurisdictions. The problem is that Wikip(m)edia are not the owners of the content. So they can’t do anything legally. But they can and should and will (I hope) make a public fuss. They have suggested blogging it (and Wikipedia carries a great deal of public opinion). So I have: [1] Searched http://springerimages.com for “Wikipedia”. This will only give results where the string ‘Wikipedia’ is in the caption, and there are probably >>10 times more that won’t have this. But this gave 350 hits. Therefore I assume there are thousands of Wikipedia images badged as ‘SpringerImages’. [2] Looked for the licensor. In some cases this is not Springer, so I assume it is a publisher which is either owned by Springer or where they have an agreement with Springer. Note that the material taken from Wiley and PLoS is relicensed as Springer’s. I have omitted any material which does not have ‘Springer’ in the licensor. [3] Copied the results to this page for the first ten I found. I have found NO (ZILCH, NADA in Neylon-speak) entries which honour the original licensee. I therefore hypothesize that ‘ALL your open content are belong to SPRINGER’. I shall continue. No doubt Springer will say ‘terribly sorry, it was a glitch’. It’s a very profitable glitch for them. They resell other people’s content and build up a brand from it. When it goes wrong they can say ‘sorry’. For me this is similar to the infamous fake journals published by Elsevier. UPDATE: Daniel Mietchen (Wikimedia/OKFN) has just mailed me – a special page that WP uses for recording violations. It contains the phrase: ‘Sometimes, media organizations just don’t understand that in most cases, you just can’t rip an image off Commons and just use it.’ Well, Springer, you had better understand that right now. Because you have spent enough time telling US what we cannot do...”

Access Denied

Posted: 06 Jun 2012 09:12 AM PDT

 
Access Denied
Victoria Charlton
I, Science, (24 May 2012)
“Giving the public access to the research that they fund is about much more than eliminating journal pay-walls... For months, academics have been taking an uncharacteristic interest in the detailed financials of the publishing world, and, for many scientists, the fight for our right to party – no, sorry, to access largely incomprehensible journal articles – has taken on a revolutionary tone. Rumour has it, the mathematicians are revolting. (Against Dutch publisher Reed Elsevier, that is.) ... I do think we’re in danger of losing sight of the bigger picture on this one. Please, hear me out. At the beginning of May, the Minister for Science and Universities, David Willetts, promised to put ‘more data and power in the hands of the people’ by making selected journal articles accessible to anyone free of charge. According to Mr Willetts, ‘giving people the right to roam freely over publicly funded research will usher in a new era of academic discovery and collaboration’. All very noble, in theory... First, the obvious one: if access to an article is going to be free, who’s going to cover the costs of publishing it? While it’s true that publishing costs in an increasingly digital world are lower than they used to be, they are still substantial. The major academic players, such as Reed Elsevier and Wiley, rely on sophisticated – and very expensive – IT platforms to deliver their content, and while the papers themselves are written for free, administering the editorial and peer review process for 240,000 articles a year is not cheap. Elsevier employs nearly 7000 people and spends over £1bn per year ensuring that peer reviewed publications reach their intended audience. (Interestingly, whilst Elsevier earns a healthy but not astronomical profit margin of 37%, their competitor, Wiley, earns an even healthier 42% but has nevertheless so far avoided the wrath of the mathematicians). Open access then, certainly doesn’t come cheap. But who exactly are we paying to grant access to? Are we really expecting Josephine Bloggs (or Dr J. Bloggs, for that matter) to browse the International Journal of Quantum Chemistry in their lunch break? Who are these ubiquitous ‘people’ that Mr Willetts keeps referring to? And can they really be expected to ‘usher in a new era of academic discovery and collaboration’? It seems to me that two separate issues are being confused here. Certainly, the scientific community would benefit from more open streams of communication – who wouldn’t? But, quite frankly, the idea of spoon-feeding science to the public went out with the Spice Girls. Today it’s all about public dialogue... 
According to The Royal Society, ‘the approach of some organisations to the ‘open access debate’ is threatening to hinder rather than promote the exchange of knowledge between researchers’. I can’t help but agree. Large publishers, with all their resources, could and should play an important role in developing the technologies and platforms that will enable greater collaboration across the academic community. As for communicating the results of tax-funded research to the public, let’s get creative. Making sure that journalists have access to what should be their primary source material would pay far greater dividends than some of the other options being bandied about, potentially leaving some spare cash for other engagement activities...”

Beyond Open Access: Open Source Scientific Software | Techdirt

Posted: 06 Jun 2012 09:07 AM PDT

 
Beyond Open Access: Open Source Scientific Software | Techdirt
www.techdirt.com
“... Computers need software, and some of that software will be specially written or adapted from existing code to meet the particular needs of the scientists' work. This makes computer software a vital component of the scientific process. It also means that being able to check that code for errors is as important as being able to check the rest of the experiment's methodology. And yet very rarely can other scientists do that, because the code employed is not made available. A new paper in Science points out that this needs to change: ‘The publication and open exchange of knowledge and material form the backbone of scientific progress and reproducibility and are obligatory for publicly funded research. Despite increasing reliance on computing in every domain of scientific endeavor, the computer source code critical to understanding and evaluating computer programs is commonly withheld, effectively rendering these programs ‘black boxes’ in the research work flow. Exempting from basic publication and disclosure standards such a ubiquitous category of research tool carries substantial negative consequences. Eliminating this disparity will require concerted policy action by funding agencies and journal publishers, as well as changes in the way research institutions receiving public funds manage their intellectual property (IP).’ As that notes, the open exchange of knowledge and materials are obligatory for publicly-funded research, and there's no reason why it should be any different for software that is written in order to conduct the experiment. After all, this, too, has been funded by the tax-payers, who therefore have a right to enjoy the results. There may not be much they can do with it directly, but they can still benefit when other scientists are able to build on the code of others, instead of needing to re-invent the digital wheel for their own experiments. The paper makes an important point that deserves a wide audience, because it's about a public policy issue. So it's a huge pity that, ironically, it is not published under an open access licence, and can only be read by Science's subscribers.”

Open Access and Data Management - A winning combination

Posted: 06 Jun 2012 09:05 AM PDT

 
Open Access and Data Management - A winning combination
dev.figshare.com
“Today is officially Open Access Monday or #OAMonday. This is because a petition set up by John Wilbanks, Heather Joseph, Mike Carroll, and Mike Rossner requiring 'free access over the internet to scientific journal articles arising from taxpayer-funded research' has passed the required number of signatures in order to ascertain a US governmental response. Peter Murray-Rust has been quoting Churchill in reference to the fact that we appear to be at a tipping point with open access: ‘this is not the end. This is not even the beginning of the end. But it is perhaps, the end of the beginning.’ At the RSP event last week 'New Developments in Open Access', Alma Swan quoted Gandhi in a similar manner, ‘First they ignore you, then they laugh at you, then they fight you, then you win.’ It appears the inevitability of open access is unstoppable now, a point emphasised further by David Willetts’ recent announcement that the UK would be making all publicly funded research openly available to all. Also out today is a video from the Digital Curation Centre focussing on the problems with providing access to research data, which features figshare. This is key ... DCC describe the video as offering: ‘a unique insight into the importance of providing access to research data and the risks of not managing data effectively. It also explains how the DCC could help researchers, research support staff and HE institutions by offering guidance, training and tools.’ These two ideas of data management and open access overlap to form a need for open access to all research data. This is something that figshare is looking to help researchers achieve with a free open access service for research data. If you have research data, share it on figshare and help the academic community move towards this goal, which is not only good for society but also good for your career prospects.”

Open-source human genomes - Boing Boing

Posted: 06 Jun 2012 09:04 AM PDT

 
Open-source human genomes - Boing Boing
boingboing.net
“Yesterday, during a World Science Festival panel on human origins and why our species outlasted other species of Homo, geneticist Ed Green mentioned that there were thousands of sequenced human genomes, from all over the world, that had been made publicly available. Our code is open source. But where do you go to find it? Several folks on Twitter had great suggestions and I wanted to share them here. [1] The 1000 Genomes Project—organized by researchers at the Wellcome Trust, the National Institutes of Health, and Harvard—is working on sequencing the genomes of 2500 individuals. The data they've already collected is available online. Read a Nature article about The 1000 Genomes Project: Data management and community access. [2] The Personal Genome Project is interactive. Created by a researcher at Harvard Medical School, the program is aimed at enrolling 100,000 well-informed volunteers who will have their genomes sequenced and linked to anonymized medical data. Everything that's collected will be Creative Commons licensed for public use. [3] The University of California Santa Cruz Genome Browser is a great place to find publicly available genomes and sequences.”

#FreeTHOMAS - Sunlight Foundation

Posted: 06 Jun 2012 09:03 AM PDT

 
#FreeTHOMAS - Sunlight Foundation
sunlightfoundation.com
“Does information about legislation belong to Congress or to the American people? This basic question is at the heart of a fight over how Congress releases data about what it does. Americans increasingly use the Internet to make sense of the world around them, and open data opens up Congress in a way that's never been possible before. In the pre-YouTube pre-iPhone pre-Amazon days, Congress built a website -- THOMAS -- to let citizens follow legislation from home. THOMAS was revolutionary ... in 1995... Congress doesn't share the data behind THOMAS with anyone. Instead, web developers must reverse-engineer the website to transmute its pages into usable data, like assembling a puzzle from thousands of ragged pieces without a picture on the box as a guide. This slow, difficult, and time-consuming process isn't perfect, but it's responsible for how most Americans follow what's happening in Congress. The better approach is for Congress to publish the data behind THOMAS. Government regularly does this elsewhere, and ‘bulk data’ is responsible for clever new uses of information developed by citizens, journalists, and even the government itself. In upcoming days, the House is likely to pass legislative language that pays lip service to releasing THOMAS data while putting the idea in a deep freeze. This would be a disaster. But it's not too late. Tell your representative that you want Congress to publish legislative data now. PS. For more information and the latest developments, go here.”
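The contrast the post draws, reverse-engineering THOMAS's web pages versus consuming bulk data published at the source, can be sketched in a few lines. Everything below is illustrative: the HTML fragment, the JSON record, the bill number and the field names are invented for the example and do not reflect the real THOMAS markup or any official bulk-data format.

```python
# Illustrative contrast only; the markup, the JSON layout and the bill are all made up.
import json
import re

# 1) Scraping: reverse-engineer whatever markup the site happens to emit today.
scraped_page = '<div class="bill"><span>H.R. 1234</span> - Example Act of 2012</div>'
match = re.search(r'<span>(?P<number>[^<]+)</span>\s*-\s*(?P<title>[^<]+)', scraped_page)
bill_from_html = match.groupdict() if match else {}
print("from scraped HTML:", bill_from_html)  # breaks silently if the page template changes

# 2) Bulk data: parse a structured record released by the source itself.
bulk_record = '{"number": "H.R. 1234", "title": "Example Act of 2012"}'
bill_from_bulk = json.loads(bulk_record)
print("from bulk data:", bill_from_bulk)
```

The first approach is the puzzle-assembly the Sunlight Foundation describes; the second is what publishing the data behind THOMAS would make possible.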

Open Tree of Life Project Draws In Every Twig and Leaf - NYTimes.com

Posted: 06 Jun 2012 09:00 AM PDT

 
Open Tree of Life Project Draws In Every Twig and Leaf - NYTimes.com
www.nytimes.com
“... Darwin presented a detailed account of the tree of life in “On the Origin of Species.” And much of evolutionary biology since then has been dedicated to illuminating parts of the tree. Using DNA, fossils and other clues, scientists have been able to work out the relationships of many groups of organisms, making rough sketches of the entire tree of life. ‘Animals and fungi are in one part of the tree, and plants are far away in another part,’ said Laura A. Katz, an evolutionary biologist at Smith College. Now Dr. Katz and a number of other colleagues are doing something new. They are drawing a tree of life that includes every known species. A tree, in other words, with about two million branches. Until recently, a complete tree of life would have been inconceivable. To figure out how species are related to one another, scientists inspect each possible way they could be related. With each additional species, the total number of possible trees explodes. There are more possible trees for just 25 species than there are stars in the universe. Scientists have overcome this problem by developing computer programs that find the most likely relationship among species without having to consider every possible arrangement. With enough processing power, those computers can now analyze tens of thousands of species at a time. Yet these studies have thrown spotlights on only small portions of the tree of life. ‘Nobody has tried to put all these results together,’ said the leader of the new effort, Karen Cranston, a biologist at the National Evolutionary Synthesis Center in Durham, N.C. Last year, Dr. Cranston and other experts gathered at a meeting called by the National Science Foundation, where they came up with a plan for a single tree of life. On May 17, the National Science Foundation announced that it was awarding the team a three-year grant of $5.7 million. The first goal of the project, known as the Open Tree of Life, is to publish a draft by August 2013. For their raw material, the scientists will grab tens of thousands of evolutionary trees that are archived online. They will then graft the smaller trees into a single big one. These trees represent just a tiny fraction of all the known species on earth. She and her colleagues will then enlist the entire community of evolutionary biologists to make the tree more accurate. They will set up an Internet portal where scientists can upload new studies, which can then automatically be used to revise the entire tree. Even as the tree of life comes into sharper focus, it will continue to grow. Each year scientists publish descriptions of 17,000 new species. How many species there are left to discover is an open question; last year a team of scientists estimated the total to be 8.7 million, although others think it could easily be 10 times that many. When scientists publish the details of a new species, they typically compare it with known species to determine its closest relatives. They will be able to upload this new information into the Open Tree of Life. Scientists who extract DNA from the environment from previously unknown species will be able to add their information as well... Some scientists are reserving judgment on the project until they can actually see the tree on a computer screen. Roderic D. M. Page, a professor of taxonomy at the University of Glasgow, called the Open Tree of Life team ‘first class,’ but added: ‘Displaying large trees is a hard problem that has so far resisted solution. 
We are still waiting for the equivalent of a Google Maps...’” Luke Harmon, an evolutionary biologist at the University of Idaho who is not involved in the project, looks forward to using the tree to explore the history of life. One major question evolutionary biologists have long explored is why evolution runs at different speeds in different lineages. ‘We can use the tree to identify evolutionary ‘bangs’ and ‘whimpers’ through the history of life,’ he said.  It may also be possible to see how climate change has driven extinctions in the past, and to make predictions for the future...”
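The article's claim that 25 species already admit more possible trees than there are stars in the universe can be checked directly: the number of distinct unrooted, fully bifurcating trees on n labeled species is the double factorial (2n - 5)!!, a standard result in phylogenetics. The star-count figure used for comparison below is an assumption (commonly quoted estimates are on the order of 10^22 to 10^24), not a number from the article.

```python
# Count of distinct unrooted, fully bifurcating trees on n labeled taxa: (2n - 5)!! for n >= 3.
def unrooted_tree_count(n: int) -> int:
    count = 1
    for k in range(3, 2 * n - 4, 2):  # multiply the odd numbers 3, 5, ..., 2n - 5
        count *= k
    return count

n_species = 25
trees = unrooted_tree_count(n_species)
stars_estimate = 10**24  # assumed upper-end estimate of stars in the observable universe

print(f"{trees:.3e} possible unrooted trees for {n_species} species")
print("more trees than stars:", trees > stars_estimate)
```

For 25 species this gives roughly 2.5 x 10^28 trees, which is why exhaustive comparison is hopeless and heuristic search programs are needed.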

Developing a Taiwan library history digital library with reader knowledge archiving and sharing mechanisms based on the DSpace platform

Posted: 06 Jun 2012 08:59 AM PDT

 
Developing a Taiwan library history digital library with reader knowledge archiving and sharing mechanisms based on the DSpace platform
Chih-Ming Chen et al.
The Electronic Library, 30 (3), (06 Jan 2012)
Use the link above to access pay per view options for the article available in the current issue of “The Electronic Library” published by Emerald. The abstract for the article reads as follows: “Purpose – This work seeks to present a reading annotation and knowledge sharing tool, which can annotate a web page with HTML format archived by the Taiwan libraries' history digital library based on Web 2.0 technologies. Design/methodology/approach – This work adopted DSpace, an open-source institutional repository system, to implement a Taiwan Digital Library History Library with the reading annotation tool for knowledge archiving and sharing services. A quasi-experimental design method was employed to randomly assign participants to an experimental group and control group to evaluate differences in the reading performance of learners who used the proposed annotation system. A statistical analysis scheme was employed to evaluate differences in learning performance of learners while reading and learning with the proposed annotation tool. Findings – The paper finds that annotated digital material provides useful knowledge to readers. The values to those annotating and subsequent readers are the acquisition of in-depth knowledge and efficient reading. Additionally, the effect on digital libraries is that digital library content grows dynamically as readers contribute knowledge. More importantly, annotated information from different readers has very high potential for the discovery of value-added knowledge utilizing data mining techniques. Originality/value – Collecting user-generated content is a novel research issue in the library sciences field, and few studies have developed useful tools that allow readers to actively contribute their knowledge to digital libraries. This work shows how to implement such digital library systems and how the annotation tool benefits the growth of digital archives and promotes learning performance.”
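Where the abstract mentions a quasi-experimental design comparing the reading performance of an experimental group (using the annotation tool) with a control group, the kind of two-group comparison involved can be sketched as follows. The scores, group sizes and the use of Welch's t statistic are assumptions made for illustration; they are not the authors' data or their actual statistical analysis scheme.

```python
# Minimal two-group comparison sketch (hypothetical scores, not the study's data).
import math
import statistics

experimental = [78, 85, 82, 90, 88, 76, 84]  # assumed post-test scores with the annotation tool
control = [72, 70, 80, 75, 69, 74, 77]       # assumed post-test scores without it

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    return (mean_a - mean_b) / math.sqrt(var_a / len(a) + var_b / len(b))

print(f"mean difference: {statistics.mean(experimental) - statistics.mean(control):.2f}")
print(f"Welch's t: {welch_t(experimental, control):.2f}")
```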

Interview with Alma Swan

Posted: 06 Jun 2012 08:57 AM PDT

 
Interview with Alma Swan
Isabel Bernal, Sonia Hidalgo, and Oficina de Digital
Use the link to access the interview with Alma Swan, Director, SPARC European Advocacy Programmes. The interview is available in both English and Spanish. The interview has been archived in Digital.CSIC, “the Institutional Repository of the Spanish National Research Council.”

The Risks of Launching a New Services Business — Branding, Cash Flow, and the Fraught Start of PeerJ « The Scholarly Kitchen

Posted: 06 Jun 2012 08:56 AM PDT

 
The Risks of Launching a New Services Business — Branding, Cash Flow, and the Fraught Start of PeerJ « The Scholarly Kitchen
scholarlykitchen.sspnet.org
“Two weeks ago, we learned of the Fall 2012 launch of PeerJ, a new proposition in the open access (OA) publishing space. Instead of per-article charges as part of an OA business model, PeerJ is proposing to allow researchers to publish as much as they want, all for the low, low price of one $99 lifetime membership. After scooping my brain back into my skull once I’d absorbed this apparently foolhardy approach to cash flow and sustainability (a topic I’ll return to momentarily), it began to dawn on me that perhaps what PeerJ is headed toward is more akin to a freemium model, like WordPress ... PeerJ is hoping to collect cash up-front with its $99 lifetime membership fee. On the face of it, scalability would seem to be their main challenge... There is already some skepticism that PeerJ, taken at face value, can work. And there are two interesting hedges in PeerJ’s admittedly scant description of its business model: ‘Researchers will be able to purchase Lifetime Memberships, starting at just $99, giving them the rights to publish their articles in our peer reviewed journal.’ The first hedge is ‘starting at,’ a classic indication that up-charges are headed your way. The second hedge is that the Lifetime Membership grants researchers ‘rights to publish,’ but nothing more. So what could come after the ‘starting at just $99’ price? I could imagine peer-review fees, formatting fees, search engine optimization fees, press release fees, data storage and hosting charges, syndication fees, and so forth. The list could be impressive, and the ala carte final charge could be significant and recurring. As Chris Anderson outlined in 2009, freemium is usually enabled by restrictions... The freemium model only works as well as the balance of free and premium services works. That is, you have to offer enough initially to hook the customer, then hold back enough things — service omission or the prospect of service provision — so that the customer is compelled to pay for more. If I had to bet, I’d wager on PeerJ adopting feature, capacity, or customer class limits in the freemium model I’m imagining... PeerJ has stated that it will not stoop to subscriptions: ‘Subscription fees made sense in a pre-Internet world, but now they just slow the progress of science.’ Cash flow drives a lot of behaviors in businesses. For example, when PLoS first emerged, it focused on two high-quality journals possessing familiar peer-review and publication practices — an editor at the helm of each, small issues, selective editorial control, and monthly publication. However, this model didn’t provide sufficient cash flows using OA publication fees, so PLoS created PLoS ONE, a high-volume mega-journal that allowed the organization to publish more papers without singular editorial oversight or a filter that added novelty or interest criteria, allowing the publication to settle for methodological soundness. Around the same time, BioMed Central fully embraced article-processing charges and began launching dozens of OA journals, creating a high-throughput article publication environment but parceling it out through multiple titles rather than one main mega-journal. Since then, ‘predatory’ OA publishers have emerged time and again, with a common thread linking them — namely, every one seems to launch dozens or hundreds of journals simultaneously. There is a reason for OA publishing taking on this high-throughput aspect — namely, cash flow. For the OA publisher, each article is sold once and only once. 
This places significant cash flow restrictions on the OA publisher. For the (dare I say it) traditional OA publisher, the obvious answer to the question of how to increase cash flow and revenues has been and will continue to be, publish more articles more frequently. There is no clear alternative, even with supplemental revenues from institutional memberships and other secondary revenue streams, like ads. The main thrust of the OA model dictates this financial reality. The side-effects of this simple financial model are legion — the lowering of standards to accommodate bulk publishing practices; an emphasis that publishing is just a technology business in order to strip away the costs of legal, editorial, and custodial work; and advocacy to make OA publishing as prevalent as possible to further increase throughput... PeerJ has an intriguing proposition to steal customers from PLoS and BMC using a potentially novel business model — a low initial “lifetime membership” PeerJ can then upsell. By getting initial commitment from researchers, PeerJ creates a small but real switching cost if member researchers decide to try publishing at PLoS or BMC. PeerJ also gets some fast cash in the door. And if PeerJ adopts a freemium model — which I believe they will scrupulously avoid calling a subscription model — cash flow will be their main motivator, and many of their services will likely have renewal elements. Should PeerJ adopt a freemium model, it will be a new variant of OA publishing, and possibly one that could be more selective than traditional OA — that is, if the services and inherent subscriptions around ongoing publication services take hold, PeerJ may not have to publish more papers to thrive, just provide better service over time. There is a major risk to starting a services company in the freemium mode. Service companies — and I’m putting most OA publishers in this camp — run the risk of having a fairly transparent and reproducible set of value propositions on the market, with little to no protection. Hence, the publisher of PLoS can leave PLoS and take the same service to market at a much lower price point. There is nothing PLoS can do about it. (Because PLoS doesn’t protect its content, I’m wondering why PeerJ doesn’t also take all the PLoS content, but that’s a more nefarious plot by a long shot — and the fact that it would create no real value for PeerJ underscores that OA publishers are services businesses, not product businesses.) In any event, PeerJ is essentially “service replication at a lower price,” which reveals the fatal flaw of any service business — if someone can do essentially the same thing more cheaply or scalably, you’re dead. Of course, one service a publisher arguably provides is branding — by building, sustaining, and extending a brand wisely over time, a company can lend brand equity to affiliated parties, be they authors or readers in the case of publishers... The PeerJ branding start isn’t promising... Branding, cash flow, service distinctiveness, and competition — with all these elements in play, it seems like PeerJ has a hill to climb. It will be interesting to see how their initial business model, offerings, and market stance contend with vital elements that will determine their fate.”
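A minimal cash-flow sketch of the tension described above: the only figure taken from the post is the $99 entry-level lifetime membership, while the per-article charge, authors per paper and papers per group are invented assumptions, so the output illustrates the argument rather than estimating PeerJ's actual economics.

```python
# All figures except the $99 membership are assumptions for illustration only.
lifetime_fee = 99        # PeerJ's advertised entry-level lifetime membership (USD)
assumed_apc = 1350       # hypothetical per-article charge at a conventional OA mega-journal (USD)
authors_per_paper = 4    # assumed average number of member authors on a paper
papers_per_group = 3     # assumed papers the same group goes on to publish

one_off_membership_revenue = lifetime_fee * authors_per_paper   # collected once, up front
recurring_apc_revenue = assumed_apc * papers_per_group          # collected again for every paper

print(f"membership revenue from one author group, ever: ${one_off_membership_revenue}")
print(f"APC revenue from the same group over {papers_per_group} papers: ${recurring_apc_revenue}")
print(f"gap that upsells or 'freemium' services would need to close: "
      f"${recurring_apc_revenue - one_off_membership_revenue}")
```

The arithmetic restates the post's point: once the one-off membership is banked, each additional paper adds cost but no new revenue, which is why the author expects premium add-on services with renewal elements.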

Report: Six Month OA Mandate Would Cut Journal Subscriptions By Almost Half

Posted: 06 Jun 2012 08:55 AM PDT

 
Report: Six Month OA Mandate Would Cut Journal Subscriptions By Almost Half
lj.libraryjournal.com
“The U.K.’s Publishers Association released a report suggesting that libraries would cancel 65 percent of arts, humanities and social sciences journal subscriptions, and 44 percent of scientific, technical and medical ones, if the United Kingdom adopted an across-the-board open access mandate. ‘The potential effect of making journals free after a six month embargo’ was commissioned by The Publishers Association and the Association of Learned, Professional and Society Publishers [ALPSP]. Some 950 libraries around the world were sent surveys; 210 responded, of which 159 were from higher education libraries. Graham Taylor, Director of Educational, Academic and Professional Publishing at The Publishers Association, said: ‘The findings of the report are testament to the fact that a six month embargo period is too short for the ‘green’ model of open access. The Publishers Association is in full support of a funded version of open access as we hope will be recommended by the report of the Finch Committee.’ The Finch Committee, officially known as the Working Group on Expanding Access to Published Research Findings, is made up of representatives from universities, research funders, ‘the research community,’ scholarly publishers, and libraries, and is scheduled to produce a final report this month. Audrey McCulloch, Chief Executive of The Association of Learned, Professional and Society Publishers, said: ‘ALPSP is very concerned about the effect this may have on non-profit publishers, many of whom may not survive...’ The [current] report’s findings make an interesting counterpoint to an analysis by America’s National Institutes of Health, which concluded that the NIH’s 12 month mandate entailed no evident harm to publishers.”

Research methodology workshop may 2012

Posted: 06 Jun 2012 08:54 AM PDT

 
Research methodology workshop may 2012
www.slideshare.net
Use the link above to access the presentation posted to slideshare by Sarika Sawant, Ph.D. on May 25, 2012. The presentation was made at the Research Methodology Workshop, organized by the SHPT School of Library Science, SNDT Women’s University.

Report from the IFLA/EIFL/COAR/SPARC World Summit on the Information Society (WSIS) Forum 2012 Workshop, Geneva, Switzerland

Posted: 06 Jun 2012 08:52 AM PDT

 
Report from the IFLA/EIFL/COAR/SPARC World Summit on the Information Society (WSIS) Forum 2012 Workshop, Geneva, Switzerland
“On Thursday 17th May 2012 IFLA, along with its organising partners Electronic Information for Libraries (EIFL), the Confederation of Open Access Repositories (COAR) and the Scholarly Publishing and Academic Resources Coalition (SPARC), held a workshop on open access titled 'Rethinking the Agenda for Development: Open Access Policies and Practice'. This was the second year in a row that IFLA had organised a workshop on open access at WSIS, and the high-level discussion that took place showed that there is a knowledge of and an appetite for the issue amongst WSIS attendees. IFLA Director of Policy and Advocacy, Stuart Hamilton, moderated the session which featured the Chair of IFLA's Open Access Taskforce Lars Bjornshauge, Silvia Nakano, Director of the Science & Technology National Directorate of Physical Resources at the Ministry of Science, Technology and Productive Innovation in Argentina, and Eve Gray, an honorary research associate at the Centre for Educational Technology and the IP Law and Policy Research Unit at the University of Cape Town... The main objective of the workshop was to discuss how open policies are being used in support of development in developing countries, with a particular emphasis on Africa and Latin America. To do this, case studies and best practice were discussed, along with the challenges and barriers that need to be overcome to place the item on the agenda of policymakers. Presenters gave brief presentations on this from the perspective of governments, research and academia, and libraries, before an interactive discussion took place for the remainder of the workshop. The workshop's outcomes included a clear identification of the 'buy-in' needed to place open access policies on the agenda of governments and research institutions... All stakeholders - government departments and policymakers; academics and researchers; librarians and publishers - have a role to play in promoting and implementing open access policies that can support development. The value of providing free access to taxpayer funded research was clearly identified as a motivating factor for the uptake of open access, along with the importance of encouraging more output from researchers in the developing world to counter a northern bias in scholarly output and publishing. An important outcome of the workshop was information exchange between the panelists and audience participants from developing countries, particularly Africa and the Middle East, who wished to develop programmes and policies in their countries. The presentations and discussions showed that open access is in a very strong phase at the moment, with many governments developing policies to open up taxpayer-funded research, research trusts promoting open access journals and universities mandating academics and researchers to publish their results in an open access format. The issue of access to taxpayer funded research, without having to pay twice (i.e. having to pay to buy back the results of the research from third-party publishers) has also been increasingly mentioned in mainstream publications and press, especially in Europe and North America, and this will have implications for the profile of the issue in the global south. As we move towards 2015 open access will continue to gain momentum, and IFLA believes it is essential that the issue, and the benefits of adopting policies that use open access to research information to support development, stays on the agenda of WSIS.”

25,000 Advocates Urge White House to Open Taxpayer-Funded Research to Everyone

Posted: 06 Jun 2012 08:51 AM PDT

 
25,000 Advocates Urge White House to Open Taxpayer-Funded Research to Everyone
SPARC - Full Feed, (04 Jun 2012)
“The movement to make taxpayer-funded research freely available online hit a new milestone on Sunday when advocates hit their goal of 25,000 signatures to a “We the People” petition to the Obama administration. The petition, created by Access2Research (a group of Open Access advocates, including SPARC’s Executive Director, Heather Joseph), requests that President Obama make taxpayer-funded research freely available. According to the petition site’s rules, any petition securing 25,000 signatures within 30 days will be sent to the White House Chief of Staff, and will receive an official response. The Open Access petition hit the 25,000 mark in half the allotted time. ‘The community is fully engaged in sending a clear message to the Administration – access to taxpayer-funded information is in the public’s interest, and they want it now,’ said Heather Joseph, SPARC’s Executive Director. The Open Access petition builds on the National Institutes of Health’s policy, noting that that agency’s experience ‘proves that this can be done without disrupting the research process,’ and urging the president ‘to act now to implement open access policies for all federal agencies that fund scientific research.’ John Wilbanks, Senior Fellow in entrepreneurship for the Ewing Marion Kauffman Foundation and one of the creators of the petition, believes that the fast uptake by the public signals a new pace in the Open Access debate. ‘Opening access to taxpayer-funded research is no longer a policy discussion happening away from researchers, scientists and taxpayers. People are now fully part of the conversation, and that changes everything.’ ‘The next step is for the White House to issue an official response,’ said Mike Rossner, Executive Director at The Rockefeller University Press and an original sponsor of the petition. ‘Our hope is that they will act quickly and will require expansion of the successful NIH policy to all other major U.S. federal funding agencies.’ A number of key organizations outside the academic community endorsed the petition. The Wikimedia Foundation endorsed the petition and included a feature article on the English Wikipedia homepage. Patient advocacy groups from Patients Like Me to the Avon Foundation promoted the petition to their members, as did a variety of publishers, university libraries, commercial companies and advocacy organizations. For further information on the petition, its sponsors and supporting organizations see the SPARC website and the Access2Research website.”

Major Milestone for SCOAP3: Invitation to Tender sent to Publishing Partners

Posted: 06 Jun 2012 08:50 AM PDT

 
Major Milestone for SCOAP3: Invitation to Tender sent to Publishing Partners
SCOAP3 official website
“SCOAP3 is pleased to announce that it has issued its Invitation to Tender to the leading publishers of high-quality peer-reviewed journals carrying content in the field of High-Energy Physics, who had previously agreed to the key SCOAP3 principles in a qualifying market survey. This process is the culmination of one year of work, during which an international team of experts, with considered input from the publishing industry and the support of CERN, has transformed the SCOAP3 idea into a set of concrete specifications and a clear agreement framework. A detailed description of the SCOAP3 open access model is now publicly available. Publishers, who convened at CERN today to further discuss the tendering process, are now expected to submit their bids by mid-June. SCOAP3 will then review the bids and identify the publishing partners and selected journals to which a contract for Peer Review, Open Access and other publishing services could be awarded. This competitive process will take into account the price per article and a set of defined quality criteria. SCOAP3 aims to award contracts by the end of the third quarter of 2012, putting into place a detailed operational and organizational infrastructure throughout 2013 for services expected to commence in January 2014. To meet this schedule, SCOAP3 invites libraries, library consortia and funding agencies that have already pledged support, as well as those with whom partnerships are currently being forged, to engage with the process that is scheduled to unfold over the next several months as SCOAP3 builds its governance structure and concludes the agreements among all participants that are essential to begin this groundbreaking Open Access initiative.”

Sarah Stewart: A broken promise

Posted: 06 Jun 2012 08:49 AM PDT

 
Sarah Stewart: A broken promise
sarah-stewart.blogspot.com
“I am afraid I have to admit I have broken a promise that I made a couple of years ago. Exactly two years ago I promised that I would only submit articles for publication in open access journals. The reason for this was that I am very committed to making research openly and freely available to everyone, especially colleagues who live in resource-poor countries. But the reality has been more difficult than I thought it would be... There are several reasons why it has been difficult to publish in open access journals. The main reason is because there are so few suitable, quality open access journals for midwives and nurses to publish research. This is concerning for me as a midwifery academic, especially in view of the pressure put on me to publish in ‘top rated’ journals. I want to support open access journals, but at the same time, I have to think about my academic career progression, which is highly driven by publications, as far as universities are concerned. I have also turned down requests to write opinion pieces in magazine-type journals, which are not open access. But I do wonder if I have shot myself in the foot by taking this decision. If the greater midwifery audience reads these types of journals, how can I get any message across if I do not engage with them? How do you balance deeply held beliefs with every day pragmatics? For all my angst, I think the future is looking very good for open access journal publication. Only the other day, Harvard announced it is encouraging its staff to publish in open access journals because it cannot afford the incredibly expensive journal subscriptions it pays. Another glimmer of hope is the Australian Health Research Council has mandated that any research it funds must be made freely available within a year of publication, as from July 2012. As for nursing and midwifery, the time is ripe to start exploring how to support open access research publication. Any ideas?”

RCUK embargo rule ‘would cause publisher collapse’

Posted: 04 Jun 2012 08:31 AM PDT

 
RCUK embargo rule ‘would cause publisher collapse’
www.thebookseller.com
“Making journals free after a six-month embargo period would lead to libraries cancelling subscriptions, a major fall in publisher revenues, the axeing of large numbers of journals and the collapse of some small publishers, according to a new report prepared for the Association of Learned, Professional and Society Publishers (ALPSP) and the Publishers Association (PA). Research Councils UK plans to adopt a policy shortly that would stipulate that all papers produced with funding from any of the science research councils must be freely available online within six months of publication. The measure has already drawn criticism from the PA, which said in March that it would have to oppose the policy because it ‘takes no account of the role of publishers in scholarly communication, makes no reference to sustainability or the management of peer review, offers no practical policy for funding open access while dictating firm and onerous requirements for mandatory deposit on short embargoes’. Now a paper prepared for the two publisher bodies by research company Gold Leaf has ‘strongly recommended’ that no such mandate should be imposed ‘until both libraries and publishers have had time to understand the issues better and have together taken steps to explore alternatives to a fully open access publishing model’. The report claimed that science, technical and medical (STM) publishers could expect to retain full subscriptions from 56% of libraries under the new measure, with reduced or no revenues from the remaining 44%, while arts, humanities and social science (AHSS) publishers would only retain full subscriptions from 35% of libraries with reduced or no funding from the remaining 65%. If an across-the-board six-month embargo was imposed, ‘most publishers would be obliged to review their portfolios and a substantial body of journals, especially in AHSS subjects, would cease or be financially imperiled,’ the report stated. The report also said: ‘almost certainly small publishers of all kinds (commercial, learned society and those with single or only a few journals), especially those who do not engage in other types of publishing (books, online e-book collections, databases, educational software etc) would find it most difficult to accommodate the sudden withdrawal of revenues, and some would undoubtedly cease to exist.’”

A Study of Open Access Resources and Their Evaluation

Posted: 04 Jun 2012 08:29 AM PDT

GrandIR Blog/ PEER End of Project Conference: a few reflections

Posted: 04 Jun 2012 08:26 AM PDT

 
GrandIR Blog/ PEER End of Project Conference: a few reflections
grandirblog.blogspot.com
“The fact that the PEER European Project (Publishing and the Ecology of European Research) has managed to establish a fruitful communication channel between publishers and repositories was repeatedly highlighted [at] the PEER End of Project Conference held last Tue May 29th in Brussels. This ability to foster a successful collaboration between stakeholders initially at conflicting positions is undoubtedly one of the main PEER outcomes and it would be good news for the Open Access movement as a whole if these communication channels could remain open in the future. As Norbert Lossau put it, favouring pragmatism over ideology could be very useful for jointly outlining evolving business models. The second most important achievement of the PEER project was being able to establish a tested publisher-repository transfer infrastructure which can be deployed beyond the project... The figures associated with the PEER project are certainly impressive: 53,000 stage-two manuscripts (aka post-prints in SHERPA RoMEO terminology) from 241 journals published by 12 mainstream publishers were processed by the PEER Depot resulting in 22,500 EU manuscript deposits (including embargoed papers) released into six different IRs plus into a long-term preservation archive at the KB in The Hague. Two submission routes were designed: automatic publisher-driven deposit and 11,800 invitations to authors for self-archiving their papers, the latter one resulting in just 170 author deposits (or 0.2% of total PEER deposits). The large difference between deposit figures associated with the two deposit routes led PEER researchers to conclude that authors sympathise with OA but don't see self-archiving as their task, therefore ‘Green OA not being the key road to optimal scholar information systems’. The PEER Usage research - one of the three research team projects within the PEER Research strand along with Behavioural and Economics research - also proved that although current findings reflect the position of a relatively early stage in PEER development, Open Access repositories are not really a threat to publishers... While testing Green Open Access and its economic consequences for the publishing ecosystem in Europe was the main PEER goal and Green OA was the preferred workline when the Project started back in Sep 2008, the Gold Open Access route seems nowadays to be winning the hearts and minds of those trying to promote access to research output on a wide basis. PEER has produced a good deal of evidence that Green OA harms neither journals nor publishers, but in the meantime attention has shifted to Open Access and hybrid journals as a way to ensure that final publisher/PDF versions of the papers are made available. This is probably the strongest argument in favour of Gold OA, but there are also very good ones that support Green OA. As a result, a lively debate is taking place these days inside the Open Access community on which OA model should receive the main support from government bodies... And there is finally an important fact to be accounted for after watching the PEER result of 99.8% vs 0.2% automatic vs author-driven deposit: author self-archiving rates should not be systematically used as reliable indicators of the strength of Green OA, since there is nowadays a wealth of alternative ways to populate repositories that do not imply self-archiving obligations for authors. 
In fact, CRIS systems, their integration with IRs and the resulting alternative workflows for content ingest into repositories were not mentioned at all last Tuesday despite having already been proved effective by a recently released UKOLN report...”

Policy provides open access to research papers

Posted: 04 Jun 2012 08:25 AM PDT

 
Policy provides open access to research papers
UC Newsroom RSS Feed, (23 May 2012)
“The UCSF Academic Senate has voted to make electronic versions of their current and future scientific articles freely available to the public, helping to reverse decades of practice on the part of medical and scientific journal publishers to restrict access to research results. The unanimous vote of the faculty senate makes UCSF the largest scientific institution in the nation to adopt an open-access policy and among the first public universities to do so. ‘Our primary motivation is to make our research available to anyone who is interested in it, whether they are members of the general public or scientists without costly subscriptions to journals,’ said Richard A. Schneider, chair of the UCSF Academic Senate Committee on Library and Scholarly Communication, who spearheaded the initiative at UCSF... UCSF is the nation’s largest public recipient of funding from the National Institutes of Health (NIH), receiving 1,056 grants last year, valued at $532.8 million. Research from those and other grants leads to more than 4,500 scientific papers each year... but the majority of those papers are only available to subscribers who pay ever-increasing fees to the journals. The 10-campus University of California system spends close to $40 million each year to buy access to journals. Such restrictions and costs have been cited among the obstacles in translating scientific advances from laboratory research into improved clinical care. The new policy requires UCSF faculty to make each of their articles freely available immediately through an open access repository, and thus accessible to the public through search engines such as Google Scholar. Articles will be deposited in a UC repository, other national open-access repositories such as the NIH-sponsored PubMed Central, or published as open-access publications. They then will be available to be read, downloaded, mined or distributed without barriers. Hurdles do remain, Schneider noted. One will be convincing commercial publishers to modify their exclusive publication contracts to accommodate such a policy... Under terms negotiated with the NIH, a major proponent of open access, some of the premier journals only allow open access in PubMed Central one year after publication; prior to that only the titles and summaries of articles are freely available. How such journals will handle the UCSF policy remains to be seen, said Schneider. The UCSF policy gives the university a nonexclusive license to distribute any peer-reviewed articles that will also be published in scientific or medical journals. Researchers are able to ‘opt out’ ... ‘The hope,’ said Schneider, ‘is that faculty will think twice about where they publish, and choose to publish in journals that support the goals of the policy.’ UC was at the forefront of the movement to open scientific papers to the public through its libraries, and generated the first major effort to create a policy of this kind in 2006. It was a complex policy, though, requiring faculty to ‘opt in’ and for a variety of reasons failed to garner enough faculty votes across the UC system, said Schneider... In the past few years, 141 universities worldwide, including Harvard University and Massachusetts Institute of Technology, have learned from UC’s initial missteps and have created very effective blanket policies similar to the one just passed at UCSF, Schneider said. The UCSF vote was the result of a faculty-led initiative, and makes UCSF the first campus in the UC system to implement such a policy. 
It has been developed in collaboration with other UC campuses and systemwide committees, especially the systemwide Committee on Library and Scholarly Communication, with the ultimate goal of implementing the policy across all ten UC campuses. ‘This vote is very, very good news,’ said Karen Butter, UCSF librarian and assistant vice chancellor. ‘I am delighted that UCSF will join leading institutions in changing the model of scientific communications, and that UCSF authors have chosen to take control of their scholarship, providing new audiences with incredible opportunities to translate UCSF’s remarkable research into improving health care.’”
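The mechanics behind a policy like this rest on standard repository plumbing: once an article is deposited, its metadata can be harvested and indexed by outside services. As a rough illustration (not part of the UCSF announcement), the sketch below uses the OAI-PMH protocol, which repositories such as PubMed Central expose, to pull Dublin Core titles from a harvesting endpoint. The endpoint URL and date range are assumptions chosen for illustration only.

# A minimal sketch, assuming an OAI-PMH endpoint and the standard oai_dc
# metadata format; not part of the UCSF policy itself.
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

DC = "{http://purl.org/dc/elements/1.1/}"  # Dublin Core namespace

def list_titles(base_url, from_date):
    """Fetch one page of Dublin Core records and return their titles."""
    params = urllib.parse.urlencode({
        "verb": "ListRecords",
        "metadataPrefix": "oai_dc",
        "from": from_date,
    })
    with urllib.request.urlopen(f"{base_url}?{params}") as resp:
        tree = ET.parse(resp)
    return [t.text for t in tree.iter(f"{DC}title")]

if __name__ == "__main__":
    # PubMed Central's OAI-PMH service is one example of the kind of
    # repository the policy describes; the URL is an assumption.
    for title in list_titles("https://www.ncbi.nlm.nih.gov/pmc/oai/oai.cgi",
                             "2012-05-01"):
        print(title)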

Gray Is The New Black in Scholarly Literature « American Anthropological Association

Posted: 04 Jun 2012 08:24 AM PDT

 
Gray Is The New Black in Scholarly Literature « American Anthropological Association
blog.aaanet.org
“The Resource Development Committee raised funds to support AAA members and anthropologists in sharing their research faster and more efficiently. With donations for the Gray Literature Portal, AAA has partnered with the Social Science Research Network (SSRN) to create a new tool – the Anthropology and Archaeology Resource Network (AARN). What will the Anthropology and Archaeology Resource Network do? The AARN will give anthropology scholars an outlet to distribute their technical reports, gray literature, preprints, and other scholarly contributions that might not otherwise become widely accessible and distributed across disciplines. The goals of the network are to help anthropological ideas and data be widely distributed. With AARN, you can create your own account and author page to publish your work – making it free and accessible to everybody. While the research is not peer-reviewed in the traditional sense, at SSRN each work goes through three layers of review to ensure the quality of scholarly discourse, the appropriate classification, and that the objectives of the specific network are met. Once your work is uploaded and reviewed, it receives a digital object identifier (DOI) and a permanent URL for easy reference. You can also utilize AARN to conduct tailored searches... AAA partnered with SSRN in part because it is the leading digital repository of scholarly work and ranked in the top 10 publications by Google Scholar. As scholars evolve in the digital era, the Resource Development Committee is ensuring that AAA members have the needed tools to successfully share their grey literature in a reputable, open and freely accessible network.”
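A side note on the DOI mechanics mentioned above: a DOI's permanence comes from the public doi.org resolver, which redirects any DOI to wherever the record currently lives. The snippet below is a minimal sketch of that lookup; the example DOI is the DOI Handbook's own identifier, used only as a stand-in since the post names no AARN DOI.

# A minimal sketch of resolving a DOI to its current landing URL via doi.org.
import urllib.request

def resolve_doi(doi):
    """Follow the doi.org redirect chain and return the final landing URL."""
    req = urllib.request.Request(f"https://doi.org/{doi}", method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return resp.geturl()

if __name__ == "__main__":
    # 10.1000/182 is the DOI of the DOI Handbook itself; any DOI works here.
    print(resolve_doi("10.1000/182"))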

Institutional Repositories and measuring research impact

Posted: 04 Jun 2012 08:23 AM PDT

 
Institutional Repositories and measuring research impact
Manchester eScholar
Manchester eScholar Blog, (01 Jun 2012)
“For better or worse, universities and researchers are coming under increasing pressure to demonstrate the wider impact that their funded research has beyond the end of the research project. But some are questioning whether traditional methods of measuring impact (peer review, citation counts, Journal Impact Factor) are still fit for purpose in the modern research environment. The open nature of the social web is continuing to have a disruptive effect on many aspects of scholarly communication. Today's research article is increasingly likely to be disseminated via social media/bookmarking websites, making it instantly available to huge numbers of potential readers due to the innately viral nature of these services. The exciting part is that modern web technologies make it possible to track each time an article is accessed from a tweet, blog or social bookmarking site. Now researchers can see in real time how their research is being received around the globe. Transparent conversations evolve around papers as every (re)tweet, comment and annotation is available to be recorded and aggregated. This opens up the possibility to measure and define impact in new ways. A number of emerging services are taking the first steps to build impact metrics based on these new usage data - collectively they are referred to as altmetrics. The people behind these services believe altmetrics may in future be used to supplement (or replace!) traditional methods of measuring impact - they even have a manifesto... how can institutional repositories (IRs) capitalise on this broadening definition of research impact in order to benefit researchers? The continuing failure to convince many researchers of the positive benefits of depositing to their IR is a common criticism of the repository community. The availability of this trove of rich usage data represents an opportunity for IRs to demonstrate the frequency with which the research they contain is used on the web. Here at Manchester we've begun to make usage data captured using Google Analytics available through a new section of the service called 'View metrics'. Hopefully, by allowing researchers to see when, where and how often their eScholar records have been viewed and downloaded, we can demonstrate how depositing research to eScholar can contribute to overall impact. In addition to usage data, the 'View metrics' section also displays citation metrics under license from Thomson Reuters' InCites - Research Analytics as well as detailed metrics describing deposit activity. Researchers can view their own metrics as well as the metrics of their colleagues.”
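To make the idea concrete, here is a minimal, hypothetical sketch of the kind of roll-up a 'View metrics' page might perform: tallying raw usage events (views, downloads, tweets) per repository record. The event names, record identifiers, and sample data are assumptions, not the actual eScholar or Google Analytics schema.

# A hypothetical aggregation of per-record usage events into simple counts.
from collections import Counter, defaultdict

def aggregate(events):
    """Count how many times each event type occurred for each record."""
    metrics = defaultdict(Counter)
    for record_id, event_type in events:
        metrics[record_id][event_type] += 1
    return metrics

if __name__ == "__main__":
    # Hypothetical usage events for two repository records.
    sample = [
        ("escholar:record-1", "view"),
        ("escholar:record-1", "download"),
        ("escholar:record-1", "tweet"),
        ("escholar:record-2", "view"),
    ]
    for record_id, counts in aggregate(sample).items():
        print(record_id, dict(counts))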
