Monday, 14 May 2012

Connotea: Bookmarks matching tag oa.new (50 items)



Elsevier requires institutions to seek Elsevier's agreement to require their authors to exercise their rights?

Posted: 14 May 2012 05:32 AM PDT

 
Elsevier requires institutions to seek Elsevier's agreement to require their authors to exercise their rights?
nospam@example.com (Stevan Harnad)
Open Access Archivangelism, (14 May 2012)
1. Does Elsevier formally recognize that "all [Elsevier] authors can post [their accepted author manuscripts] voluntarily to their websites and institutional repositories" (quoting from Alicia Wise here)? According to Elsevier's formal policy since 2004, the answer is yes.
2. What about the "not if it is mandatory" clause? That clause seems to be pure FUD and I strongly urge Elsevier -- for the sake of its public image, which is right now at an all-time low -- to drop that clause rather than digging itself deeper by trying to justify it. The goal of the strategy is transparent: "We wish to appear to be supportive of open access, formally encoding in our author agreements our authors' right to post their accepted author manuscripts to their institution's open access repository -- but [to ensure that publication remains sustainable] we wish to prevent institutions from requiring their authors to exercise that right unless they make a side-deal with us." Not a commendable publisher strategy at a time when the worldwide pressure for open access is mounting ever higher and subscriptions are still paying the cost of publication, in full, and handsomely. If there is eventually to be a transition to hybrid or Gold OA publishing, let that transition occur without trying to hold hostage the authors' right to provide Green OA to their accepted author manuscripts by posting them free for all in their institutional repositories, exercising the right that Elsevier has formally agreed rests with the author.
Posted by stevanharnad (who is an author) to oa.mandates elsevier oa.new on Mon May 14 2012 at 12:32 UTC | info | related

How Elsevier Can Improve Its Public Image

Posted: 14 May 2012 05:29 AM PDT

 
How Elsevier Can Improve Its Public Image
nospam@example.com (Stevan Harnad)
Open Access Archivangelism, (13 May 2012)
Elsevier's public image is so bad today that rescinding its Green light to self-archive, after almost a decade of mounting demand for OA, is hardly a very attractive or viable option. And double-talk, smoke-screens and FUD are even less attractive or viable. It will hence be very helpful to researchers in providing -- and to their institutions and funders in mandating -- Open Access if Elsevier drops its "you may if you wish but not if you must" clause. It will also help to improve Elsevier's public image.
Posted by stevanharnad (who is an author) to oa.mandates oa.new on Mon May 14 2012 at 12:29 UTC | info | related

“Would [open access] be prestigious just because we say it is? I say, why not?”

Posted: 13 May 2012 03:47 PM PDT

 
“Would [open access] be prestigious just because we say it is? I say, why not?”
Gary F. Daught
Omega Alpha | Open Access, (13 May 2012)
Posted by OAopenaccess to oa.humanities oa.new on Sun May 13 2012 at 22:47 UTC | info | related

What's the difference between Elsevier and British Gas?

Posted: 13 May 2012 11:42 AM PDT

 
What's the difference between Elsevier and British Gas?
petermr's blog, (12 May 2012)
“This is a serious question and I have a serious answer. See if you can guess it. If so add a comment. You can substitute ‘FooPub’ for ‘Elsevier’ where FooPub is any #scholpub such as ACS, PLoS, Wiley, BMC, etc. You can substitute Eastern Water, Scottish Power, First Capital Connect (a train operator) and many others for ‘British Gas’. I shall continue to turn my attention to content-mining in the next few posts.”

Paywalls are backward-looking

Posted: 13 May 2012 11:41 AM PDT

 
Paywalls are backward-looking
Dave Winer's "Scripting News" weblog, (12 May 2012)
“Mathew Ingram, writing in GigaOm, offers three reasons he doesn't like paywalls. His second reason is ‘Paywalls are backward-looking, not forward-looking,’ which is the one that resonates with me. Before the Internet, news orgs had a natural paywall, the distribution system. If you wanted to read the paper you had to buy the paper. And the ink, and the gasoline it took to get it to where you are. In fact, everything that determined the structure of the news activity, that made it a business, was organized around the distribution system. But that's been over now for quite some time. And paywalls express a desperate wish to go back to a time when there was a reason to pay. Now news, if it wants to continue, must find a new reason. I think there are plenty of ideas here. But linear problem-solution thinking won't get you there. This is the box we have to get out of. Because change comes whether or not our minds can conceive of it. That's the magic of new generations. Their minds are not limited the way ours are. The model of news we used to practice was started to solve a business problem. A guy named Reuters, who lived in London, wanted to know what was going on in the markets elsewhere in Europe. He found that the faster he got this information, the more money he could make. Then he learned that he could sell access to the flow of information for even more money. Then he wanted information from America. So he invested some of his profits from Europe in better ways of doing that... That's a simplification of course, but it's the basic idea... And eventually a new model emerged, of news as entertainment that could cause people to watch advertisements... But the Internet in NYC is a lot like news flow in Europe before Reuters revolutionized it. It really sucks for most of us. Why couldn't someone make it their business to solve this problem? If they do, I believe they will become the news organization of the future. Assuming people still want to live in NYC... I think you have to look at things this way. Where are the inefficiencies, and can you do something to erase them? If so, that's probably a good business... Most big news thinkers are not business people, so they don't seem to understand this. But the tech guys do think this way. And that's why that's where the new forward-looking movement is coming from. Paywalls go the other way, they remove efficiencies. It's hard to see how, long-term, that can be profitable. What paywalls are really asking is how are the news people of the past going to hold their lock on the flow of information in the future. And that's not a great question, because the answer is they aren't. Let's hope no one does. But of course that's a lot to hope for.”

Elsevier query on: "positive things from publishers..."

Posted: 11 May 2012 09:16 PM PDT

 
Elsevier query on: "positive things from publishers..."
openaccess.eprints.org
Alicia Wise (Director of Universal Access, Elsevier) asked, on the Global Open Access List (GOAL): "[W]hat positive things are established scholarly publishers doing to facilitate the various visions for open access and future scholarly communications that should be encouraged, celebrated, recognized?" Elsevier's shameful, cynical, self-serving and incoherent clause about "mandates for systematic postings" ("you may post if you wish but not if you must"), which attempts to take it all back, should be dropped. That clause -- added when Elsevier realized that Green Gratis OA mandates were catching on -- is a paradigmatic example of publisher FUD and double-talk.
Posted by stevanharnad (who is an author) to FRPAA oa.mandates elsevier oa.new on Sat May 12 2012 at 04:16 UTC | info | related

A few Religious Studies articles showing up in SAGE Open open access “mega journal”; reviewers being solicited

Posted: 11 May 2012 08:15 AM PDT

Scientific Utopia: I. Opening scientific communication

Posted: 09 May 2012 01:52 PM PDT

 
Scientific Utopia: I. Opening scientific communication
Brian Nosek and Yoav Bar-Anan
Use the link above to access the paper posted to arXiv. The abstract reads as follows: “Existing norms for scientific communication are rooted in anachronistic practices of bygone eras, making them needlessly inefficient. We outline a path that moves away from the existing model of scientific communication to improve the efficiency in meeting the purpose of public science - knowledge accumulation. We call for six changes: (1) full embrace of digital communication, (2) open access to all published research, (3) disentangling publication from evaluation, (4) breaking the ‘one article, one journal’ model with a grading system for evaluation and diversified dissemination outlets, (5) publishing peer review, and (6) allowing open, continuous peer review. We address conceptual and practical barriers to change, and provide examples showing how the suggested practices are being used already. The critical barriers to change are not technical or financial; they are social. While scientists guard the status quo, they also have the power to change it.”

Open access

Posted: 09 May 2012 11:31 AM PDT

 
Open access
Nat Mater 11 (5), 353 (May 2012)
“Open-access (OA) publishing is in vogue, and for very good reasons. Readers — especially those in developing nations — gain from the removal of subscription barriers, authors benefit from the additional exposure their work receives, public funders put tax-payers' money to better use, and publishers benefit from the economies of scale of an author-pays publishing model. However, such a model, in which costs are spread across authors instead of subscribers, is not well suited for highly selective journals with high rejection rates, for which readers vastly outnumber authors. Indeed, whereas high-volume journals such as PLoS ONE and Scientific Reports require a modest article processing charge of $1,350, selective journals with an OA option (hybrid journals) charge higher fees ($2,700 for Physical Review Letters and $5,000 for Nature Communications), and for highly selective journals such as Nature or the Nature research journals the amount would be much higher. Some publishing companies are therefore exploring additional revenue models based on providing added-value services for authors and readers. An alternative viable option is the publishing model pursued by the Public Library of Science (PLoS), in which the cost of running selective journals is subsidized by profit-generating, high-volume journals. Of course, the content of research articles is provided to publishers for free, peer-review work is voluntary and unpaid, and printed copies have long ceased to be an absolute need. It therefore makes increasingly more sense for journals (especially those belonging to high-volume publishers) to make access to published papers free, and charge authors for manuscript services (manuscript selection, peer-review management, editing, copyediting, typesetting, web hosting) and readers for premium services. Open-access publishing is thus transforming the scholarly publishing industry from being content sellers to becoming service providers. This transformation to OA publishing of scholarly content is happening quickly, driven by decreasing costs of digital versus print, by OA self-archiving mandates adopted by funders and institutions, and by public pressure into embracing the principles of open science. A study showed that in a random sample of 1,837 articles published in 2008 across multiple disciplines, 20.4% of the papers were freely accessible, 8.5% from the publishers' websites (so-called gold OA) and 11.9% elsewhere (green OA). But how quickly is the OA literature growing? According to recent data [2], the projected yearly growth in the number of gold OA papers indexed by Thomson Reuters is 20%, whereas the total annual growth of published articles is 3.5%. One would thus expect that the output of OA journals is growing at a much faster pace than that of journals offering subscription-based access. However, as shown in Fig. 1, although early annual growth rates can be very high (approaching 100% for PLoS ONE and Nature Communications), the difference in sustained average growth rates between PLoS ONE and ACS Nano or Soft Matter — subscription-based journals with limited rejection rates — is only moderate. Of course, higher manuscript rejection rates can lead to slow output growth (Small) or inappreciable growth (this journal). This suggests that the expansion rate of OA literature will largely depend on the willingness of publishers to embrace economically viable OA publishing models [3], and on their capability to innovate products and services that provide value to researchers.
Also, at a projected 20% annual growth, by 2020 only about 27% of the papers published in that year will be gold OA. In the meantime, green OA should be adopted as widely as possible, for the benefit of researchers and the public. Many publishers, among them Nature Publishing Group (NPG), encourage authors to post their own version of the accepted paper (incorporating a reference and URL to the published version on the journal's website) in institutional repositories for public release six months after publication. Moreover, NPG and other publishers allow authors to post the originally submitted versions of manuscripts online (but not subsequent versions that evolve as a result of the editorial process), for instance on the ArXiv preprint archive. You are certainly welcome to do so."
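
[A quick check of the arithmetic behind the 27% projection above, as a minimal sketch in Python: compounding 20% annual gold OA growth against 3.5% total output growth. The roughly 7% baseline share assumed for 2011 is an illustrative choice consistent with the article's figures, not a number stated in the text.]

    # Project gold OA's share of annual output by compounding the gap
    # between the gold growth rate and the total-output growth rate.
    def gold_oa_share(base_share, base_year, target_year,
                      gold_growth=0.20, total_growth=0.035):
        years = target_year - base_year
        return base_share * ((1 + gold_growth) / (1 + total_growth)) ** years

    # Assumed baseline: 7.1% of 2011 output is gold OA (illustrative only).
    print(round(gold_oa_share(0.071, 2011, 2020), 2))  # -> 0.27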

Journal of Public Interest IP

Posted: 09 May 2012 08:33 AM PDT

 
Journal of Public Interest IP
www.piipajournal.org
Journal of Public Interest IP is a new peer-reviewed OA journal published by Public Interest Intellectual Property Advisors (PIIPA).

Open Access Understanding - signed by ANBPR and ABR

Posted: 09 May 2012 06:38 AM PDT

 
Open Access Understanding - signed by ANBPR and ABR
www.kosson.ro
"Kosson community has reached a new milestone in advocating Open Access in Romania. The Open Access Understanding - the main instrument for kick starting concrete actions received the institutional signature of the two important Library and Information Science Associations from Romania: National Association of Librarians and Public Libraries in Romania and The National Librarians Association...."

Cancel the subscription - Indian Express

Posted: 08 May 2012 04:26 PM PDT

Peer-Reviewed Open Access Publishing.

Posted: 08 May 2012 03:04 PM PDT

 
Peer-Reviewed Open Access Publishing.
peerj.com
Mission: Provide a platform to openly publish peer-reviewed scholarly research at a price that reflects the dramatic decrease in costs brought about by today's technology. We believe an open model such as this will accelerate the outcome of research for both the individual researcher and the community.
Posted by whires to oa.new on Tue May 08 2012 at 22:04 UTC | info | related

Harvard open memo says major journal publishers’ prices are “untenable”

Posted: 08 May 2012 06:24 AM PDT

 
Harvard open memo says major journal publishers’ prices are “untenable”
www.lib.neu.edu
On April 17, 2012, Harvard University’s Faculty Advisory Council on the Library issued an open memo to the Harvard community stating that “major periodical subscriptions cannot be sustained” due to high prices and unreasonable publisher practices. If this topic sounds familiar, it’s because it’s already been in the news recently – in January, mathematician Timothy Gowers blogged about these issues specifically as they relate to publishing giant Elsevier. In February, a website was created where scholars could sign on to a boycott of Elsevier; as of today over 10,000 signatures have been gathered...
Posted by hcorbett (who is an author) to oa.new on Tue May 08 2012 at 13:24 UTC | info | related

Austria innovativ: Freier Literaturzugang für alle

Posted: 08 May 2012 03:37 AM PDT

 
Austria innovativ: Freier Literaturzugang für alle
VÖBBLOG, (07 May 2012)
Posted by Klausgraf to oa.new on Tue May 08 2012 at 10:37 UTC | info | related

Open Access Policies on Scholarly Publishing in the University Context

Posted: 07 May 2012 03:23 PM PDT

OpenData BC

Posted: 07 May 2012 02:37 PM PDT

 
OpenData BC
www.opendatabc.ca
A new citizens' group for open data in BC.
Posted by heathermorrison to oa.data oa.new on Mon May 07 2012 at 21:37 UTC | info | related

Time Travel for the Scholarly Web - YouTube

Posted: 07 May 2012 02:24 PM PDT

 
Time Travel for the Scholarly Web - YouTube
www.youtube.com
Use the link above to access the video of a presentation uploaded to YouTube by rivervalleytv. The presentation Time Travel for the Scholarly Web was made by Herbert Van de Sompel, staff scientist at the Research Library of the Los Alamos National Laboratory, at the STM Innovations Seminar 2011.

Journal Article Mining and Scholarly Publishers - YouTube

Posted: 07 May 2012 02:22 PM PDT

 
Journal Article Mining and Scholarly Publishers - YouTube
www.youtube.com
Use the link above to access the presentation by Maurits van der Graaf, Pleiade Management and Consultancy, at the STM Innovations Seminar 2011. The video was uploaded to YouTube by rivervalleytv on December 28, 2011.

DuraSpace Launches 2012 Community Sponsorship Program | DuraSpace

Posted: 07 May 2012 02:12 PM PDT

 
DuraSpace Launches 2012 Community Sponsorship Program | DuraSpace
duraspace.org
“Today the DuraSpace organization announced the launch of the 2012 Sponsorship Program encouraging users of DSpace, Fedora and DuraCloud open source technologies for digital preservation and access to invest in radical collaboration by becoming DuraSpace sponsors. This software is available free of charge and DuraSpace does not receive significant funding from government agencies or private foundations. Instead DuraSpace relies on financial support from those organizations who directly benefit by using its open source software. ‘The challenge of preserving academic content is too great for any one institution to tackle alone’, says Karin Wittenborg, University Librarian, University of Virginia. ‘DuraSpace fosters collaborative activities and open-source solutions to ensure that knowledge will be accessible to future generations.’ DuraSpace Community Sponsorship information: http://duraspace.org/sponsors ... Research data and the scholarly record continue to grow while libraries remain uniquely positioned to take advantage of open access, open source and open data systems that address burgeoning data and access for future generations. DuraSpace, an independent 501(c)(3) not-for-profit organization, is a catalyst in this effort by supporting the development of DSpace (http://DSpace.org/) and Fedora (http://Fedora-commons.org/) open source software for digital repositories, and DuraCloud (http://DuraCloud.org/), a hosted service and open technology for managing content in the cloud. The combined benefits of strategic leadership, innovative solutions, community outreach and advocacy on behalf of DuraSpace communities have resulted in ongoing development and deployment of open technologies for durable, persistent access to digital data. Community collaborators in these efforts include highly respected academic institutions, government agencies, and scientific and cultural organizations. DuraSpace advocates for open access to scholarly publications and for interoperability of the supporting technologies and has helped to establish open standards and protocols by working with other open source software projects and commercial partners on integration strategies. DuraSpace is an active participant in the Digital Preservation Network (DPN), EDUCAUSE, International Open Repositories Conference, Internet2, and the National Digital Stewardship Alliance (NDSA)...”

Exercises in democracy: building a digital public library

Posted: 07 May 2012 02:10 PM PDT

 
Exercises in democracy: building a digital public library
arstechnica.com
“Most neighborhoods in America have a public library. Now the biggest neighborhood in America, the Internet, wants a library of its own. Last week, Ars attended a conference held by the Digital Public Library of America, a nascent group of intellectuals hoping to put all of America's library holdings online. The DPLA is still in its infancy—there's no official staff, nor is there a finished website where you can access all the books they imagine will be accessible. But if the small handful of volunteers and directors have their way, you'll see all that by April 2013 at the latest. Last week's conference set out to answer a lot of questions. How much content should be centralized, and how much should come from local libraries? How will the Digital Public Library be run? Can an endowment-funded public institution succeed where Google Books has largely failed (a 4,000-word meditation on this topic is offered by Nicholas Carr in MIT's April Technology Review)? As it stands, the DPLA has a couple million dollars in funding from charitable trusts like the Alfred P. Sloan Foundation and the Arcadia Fund. The organization is applying for 501(c)3 status this year, and it's not hard to imagine it running as an NPR-like entity, with some government funding, some private giving, and a lot of fundraisers. But outside of those details, very little about the Digital Public Library has been decided. ‘We’re still grappling with the fundamental question of what exactly is the DPLA,’ John Palfrey, chair of the organization’s steering committee, admitted. The organization must be a bank of documents, and a vast sea of metadata; an advocate for the people, and a partner with publishing houses; a way to make location irrelevant to library access without giving neighborhoods a reason to cut local library funding. And that will be hard to do... When people hear ‘Digital Public Library,’ many assume a setup like Google Books: a single, searchable hub of books that you can read online, for free. But the DPLA will have to manage expectations on that front. Not only are in-copyright works a huge barrier to entry, but a Digital Public Library will be inextricably tied to local libraries, many of which have their own online collections, often overlapping with other collections. An online library of America will have to strike a balance between giving centralized marching orders, and acting as an organizer of decentralized cooperation. ‘On the one hand would [the DPLA only offer] metadata? No, that’s not going to be satisfying. Or are we trying to build a colossal database? No that’d be too hard,’ Palfrey noted to the audience last Friday. ‘Access to content is crucial to what the DPLA is, and much of the usage will be people coming through local libraries that are using its API. We need something that does change things but doesn’t ignore what the Internet is and how it works.’ Wikimedia was referenced again and again throughout the conference as a potential model for the library. Could the Digital Public Library act as a decentralized national bookshelf, letting institutions and individuals alike contribute to the database? With the right kind of legal checks, it would certainly make amassing a library easier, and an anything-goes model for the library would bypass arguments over the value of any particular work. Palfrey even suggested to the audience that the DPLA fund ‘Scan-ebagoes’—Winnebagoes equipped with scanning devices that tour the country and put local area content online.
But the Wikimedia model, where anyone can write or edit entries in the online encyclopedia, could present problems for an organization looking to retain the same credibility as a local library. Several local librarians attended the conference, and voiced concerns over how to incorporate works of local significance and texts published straight to an e-book format into the national library. The easy answer is that all information should be accessible to anyone who wants it, but some curating might be necessary to make sure every library in America gets on board. Although he stipulated that his answer was speculative, Palfrey told Ars that individuals would not be contributing to the Digital Public Library, at least at the beginning. ‘Libraries have done this for a long time, [appraisal] is not a new problem,’ he said. Similarly, the Scan-ebago idea is brimming with populist appeal, but Google Books is proof that it’s not always as easy as scanning and uploading documents that people want to see online. Publisher Tim O’Reilly of O’Reilly Media played the print industry’s white knight at the DPLA’s conference, explaining to the audience how his company adapted to the prevalence of on-demand information. ‘We’ve insisted from the beginning that our books be DRM free,’ he insisted to applause. Brewster Kahle, another champion of digital (and physical) libraries and founder of the Internet Archive, suggested that the DPLA buy, say, five electronic copies of an e-book, and digitally lend them out, just like one rents a movie off Amazon or iTunes, which expires in 24 hours or a few days. When an audience member questioned Kahle on what it would take for publishers to nix DRM (or Digital Rights Management restrictions, which confine certain formats to specific e-book readers) for that rent-a-book idea to be more widely viable, Kahle replied facetiously, ‘Wanting to have a business at the end of the day?’ Kahle and O’Reilly are members of a growing number of publishing industry-types that believe that fixing books to a single e-reader platform is an unsustainable business practice that will naturally become extinct... Wishing DRM away, or convincing charitable investors that it’s not going to be a problem, could be an Achilles heel for the Digital Public Library... While content is a thorny issue, what the DPLA can leverage to establish itself as a force that won’t be ignored by content providers, is the massive amount of metadata it’s collected about books, including data for over 12 million books from Harvard’s libraries. These aren’t actual books, but details about books you can find in libraries across the country. Sure, it’s not exactly a romantic liberation of information, but this data is a roadmap to everything that’s available out there, and where users can find it. Building an API with all of this metadata is also the first step to the ideal because a digital library is useless if search doesn’t work. ‘It’s critical to think through search: how to leverage the distributed nature of the internet, and keep [content] in open formats that are linkable,’ O’Reilly said. With an open API, the organization’s extensive database could be distributed to all libraries to build their own digital public library on top of it. There are other benefits to organizing all the metadata too.
Involvement has long been an issue for local libraries, and members of the Digital Public Library’s volunteer development team suggested that the API could be used to build social applications on top of the DPLA platform, or map the database and include links to other relevant online databases of culture, like Europeana. ‘The DPLA could sponsor some research in managing all the metadata,’ David Weinberger, a member of the DPLA’s dev team, suggested. But in the meantime, the group is relying on volunteer time from developers at occasional DPLA-sponsored hackathons. By April 2013, Weinberger said, the DPLA aims to have a working API with a custom ingestion engine to put metadata from library holdings online, a substantial aggregation of cultural collection metadata and DPLA digitizations, and community-developed apps and integration. All mostly from the help of volunteers and open source enthusiasts. The problem the DPLA has now, explained Weinberger, is figuring out how to build an API that makes use of all the metadata without giving weight to information that will incorrectly classify a lot of the books. Similarly, he described the DPLA’s ‘deep, deep problem’ of ‘duping’, which happens when two caches of data describe the same book differently, leading to duplicates. Despite the challenges facing the Digital Public Library of America, it’s a concept that needs to come to fruition sooner than later... One of the earliest speakers at the conference, Dwight McInvaill, a local librarian for South Carolina’s Georgetown County Library, spoke of how important it is to digitize works for the good of the public. His own library’s digital collection gets over 2 million hits a month. ‘Small libraries serve 64.7 million people,’ he said, many of those in poverty. ‘We must engage forcefully in the bright American Digital Renaissance,’ McInvaill proclaimed. Either that, or be left in the physical book dark ages.”
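
[A minimal sketch in Python of the 'duping' problem described above: two catalog records describing the same book differently. The match-key normalization and field names are illustrative assumptions, not the DPLA's actual schema or algorithm.]

    import re

    def match_key(record):
        # Collapse case, punctuation and spacing so trivial variations of
        # the same title/author pair map to the same key.
        def clean(s):
            return ' '.join(re.sub(r'[^a-z0-9]+', ' ', s.lower()).split())
        return (clean(record.get('title', '')), clean(record.get('author', '')))

    def dedupe(records):
        seen = {}
        for rec in records:
            seen.setdefault(match_key(rec), rec)  # keep the first record per key
        return list(seen.values())

    catalogs = [
        {'title': 'Moby-Dick; or, The Whale', 'author': 'Melville, Herman'},
        {'title': 'Moby Dick, or the Whale',  'author': 'melville herman'},
    ]
    print(len(dedupe(catalogs)))  # -> 1: both records collapse to one key

A key this crude would also merge distinct editions, which is part of why the dev team calls duping a 'deep, deep problem': real record linkage has to weigh many more fields than a normalized title and author.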

Open access is no more than academic consumerism. It neither democratizes knowledge production nor communication

Posted: 07 May 2012 02:09 PM PDT

 
Open access is no more than academic consumerism. It neither democratizes knowledge production nor communication
sociologicalimagination.org
“The Open Access movement should be seen for what it is – nothing more but nothing less than a consumerist revolt, academic style. No one in this revolt is calling for what is sometimes called ‘extended peer review’ (whereby relevant non-academic stakeholders operate as knowledge gatekeepers), let alone the abandonment of science’s normal technicality. In fact, the moral suasiveness of a journalist like the Guardian’s George Monbiot rests on his support of BOTH science’s normal authorising procedures AND the free distribution of their fruits. In short, it’s all about making research cheaper to access by those who already possess the skills to do so but are held back by such ‘artificial’ barriers as publishers’ paywalls. Nothing in this dispute bears on questions concerning how one might democratise knowledge production itself (e.g. how research credit might be distributed across students, informants, etc.; how one might select research topics that people find worthwhile; how impact across many audiences might be made a desideratum for securing a research grant). Certainly there is no reason to believe that science communication/engagement is served by an open access policy to commercial scientific publications, if the target body of knowledge remains encoded as it has for the last 100 years. I take it that this is the message that Alice Bell is trying to send, perhaps too politely.”

What the UC “open access” policy should say

Posted: 07 May 2012 02:08 PM PDT

 
What the UC “open access” policy should say
Michael Eisen
it is NOT junk, (04 May 2012)
“The joint faculty senate of the ten campuses of the University of California has floated a trial balloon ‘open access’ policy. I, of course, laud the effort to move the ball forward on open access, but the proposed policy falls short in two key ways. 1) The rights reserved by the University are too limited. Rather than granting UC the right to redistribute the article, the policy should place all scholarly works produced by UC faculty under a Creative Commons Attribution License. 2) There should be no ‘opt-out’ provision. Here is my edited version of the proposal (my additions are in green)..." [Use the link above to see the edits referred to by the blogger.]

Rule Britannia! On David Willetts and open access to research.

Posted: 07 May 2012 02:07 PM PDT

 
Rule Britannia! On David Willetts and open access to research.
figshare.com
“It is a proud day to be British, for good intentions at least! UK minister of state for universities and science David Willetts announced on Tuesday in a piece in the Guardian that the UK would be making all publicly funded research openly available to all: ‘Giving people the right to roam freely over publicly funded research will usher in a new era of academic discovery and collaboration, and will put the U.K. at the forefront of open research’. Eric Merkel-Sobotta, executive vice president for corporate communications at Springer, offers a more pessimistic view with his comments in The Chronicle. Given the lack of details so far, ‘it's too early to say whether this will be a success,’ he said of the plan. ‘It looks like setting off fireworks, but nobody's really sure what holiday we're celebrating.’ He has a point. But let us not digress from the significance of this... for a long time researchers have been dismissive about the benefits of open access, when closed access publishing can help their career. It isn't the fault of researchers. I'm talking from experience here. I know the pressure of wanting to advance my career through publishing in the journals with the highest Impact Factor. It is only relatively recently that I learnt just how messed up the scientific publication process has become. Take the Impact Factor for instance... At figshare, we work on the principles of carrots and sticks. We are here for the researcher. We allow users to make all of their research publicly available, visualisable in the browser at no cost, whether this is a pdf or a video. We give you metrics on your research so you can track the true impact you are having. We try to make the barrier to this technology so low that anyone who can operate a computer can share their research outputs with the world. These are the carrots. Researchers also need sticks. The NSF in the USA have mandated that all researchers have data management plans. The UK government needs to make sure that they see this through, that UK academic institutions ensure that their researchers use the repository or whatever they are planning. The Royal Society's current study on Science as a Public Enterprise is already addressing the ever-changing face of scientific research and the way it should be disseminated. The government is committing £2 million to this; if we end up with a UK-wide version of an institutional repository, the enthusiasm for this forward-thinking mentality would be slightly lost... Martin Hall, a member of the Finch working group and vice-chancellor of Salford University, speaking on Nature blogs, reckons that ultimately we will see a transition to gold - so the real question is how long this will take. For me this raises a bigger question. The British government is pioneering in attempting to do the right thing through open access, but there is a danger that a lot of the value from this open access research will still be trapped. A good example of how to do this right is PLoS. All PLoS journals are licensed under CC-BY. Michael Eisen lists some scientific publishers that have chosen to use Creative Commons licenses with extended clauses on his blog. Mike Taylor discusses the implications of these clauses very clearly on his blog: ‘Although these additional clauses are intuitively appealing, they typically have unintended consequences that hamper the reusability of information published in this way...’ These clauses will not allow the researchers to ‘roam freely over publicly funded research’.
So my message to David Willetts and the UK government is this. Well done on such a positive move, please don't mess this up. You don't need to reinvent the wheel and you do need to mandate licensing at least as un-restrictive as CC-BY."

Open access as a matter of academic ethics: The right thing to do

Posted: 07 May 2012 02:05 PM PDT

 
Open access as a matter of academic ethics: The right thing to do
Gary F. Daught
Omega Alpha | Open Access, (03 May 2012)
“In order to succeed, open access needs to demonstrate real and practical benefits to the scholarly community and beyond. In order to instill confidence as a publishing model, open access needs to be both economically accessible and economically sustainable. However, it’s not just what open access needs to be or do to justify its serious consideration. As has been discussed on this blog before (especially here), open access brings with it a capacity to raise awareness of a values dynamic in the creation, dissemination, and use of the products of scholarly research. If for no other reason than the fact that open access presents scholars a choice (as an alternative to traditional publishing avenues), it ought to at least provoke thoughtfulness about what we need to be or do when it comes to publishing research. The decision is no longer just where will we publish (to best enhance our academic careers and reputations) but how will we publish (to best contribute to the growth of knowledge to the widest possible audience with the fewest possible barriers). In other words, open access introduces an added dynamic to academic ethics. John Willinsky and Juan Pablo Alperin, in a recent article entitled ‘The academic ethics of open access to research and scholarship’ (in Ethics and Education 6(3), October 2011, pp. 217-223), note that academic ethics in the institutional context usually focuses on issues of academic integrity and honesty... Ethics committees in the institutional context are commonly directed to investigate and recommend disciplinary action against faculty or students for infringements of academic integrity and honesty, whether through fraud (manufactured data), or plagiarism (intentionally claiming the work of another as one’s own, or failing to give proper attribution). The application of academic ethics from this negative point of view is, regrettably, too well known. The new dynamic Willinsky and Alperin propose is that we also view academic ethics as positive action... Willinsky and Alperin believe that whereas the negative aspects of academic ethics have and will be with us for a long time, ‘there is something of a time-limited opportunity for ethical action when publishing models are changing and in this unsettled period are radically split between tendencies toward increasingly restrictive (for reasons of profit) and open (for wider sharing) practices’ (p. 218)... But not all open access approaches are ethically equivalent. Publishers may sense a change in the air. Willinsky and Alperin raise concerns about ‘the commercialization of OA,’ and merely shifting the economic model from “high-priced subscription journals to high-priced article-processing fees” (p. 219). Yes, this approach brings down the user-side paywall, but it raises new barriers for researchers and scholars. Some disciplines enjoy generous grant funding that can fairly easily cover these new producer-side open access publishing costs.
But what about disciplines—like most of the humanities, including religious studies—where grant funding is scarce and department budgets are perennially tight? If scholars cannot afford to publish their research how can even open access help to enhance the production of and access to knowledge? Willinsky and Alperin do not say this explicitly, but the implication is that institutions, as part of their larger concern that faculty and students act ethically in their academic context, need to take a role in authorizing scholars to choose less costly approaches to open access, and also recognize the legitimacy of these approaches. Willinsky and Alperin specifically mention author self-archiving of research papers and articles in open access institutional repositories or websites, and the creation of new ‘scholar-publisher’ open access journals that typically do not charge article-processing fees as two low-cost alternatives to either the traditional subscription-based journal, or the open access journal that imposes article-processing fees. We are, of course, familiar with John Willinsky’s work with the Public Knowledge Project at Simon Fraser University in Canada, which developed the open source Open Journal Systems journal platform. A study referenced by Willinsky and Alperin, which surveyed journals using Open Journal Systems, found that ‘a good number of journals are making a go of it with OA. [The study notes that approximately 5,000 journals are using the OJS platform!] [T]he scholar-publisher [i.e., a scholar or group of scholars in an academic department who start a journal without seeking/requiring the support (or the costs) of a professional publisher] dominates these titles, with an average per article cost under US$200’ (p. 219).”

Hot Type: Elsevier Experiments With Allowing 'Text Mining' of Its Journals - Technology - The Chronicle of Higher Education

Posted: 07 May 2012 02:04 PM PDT

 
Hot Type: Elsevier Experiments With Allowing 'Text Mining' of Its Journals - Technology - The Chronicle of Higher Education
chronicle.com
“High-profile scholarly boycotts aren't the only way to get a big publisher's attention. Sometimes all it takes is a tweet. Not long ago, Heather A. Piwowar, a postdoctoral researcher at the University of British Columbia, found herself on the phone with six high-level employees of the science-publishing giant Elsevier. Ms. Piwowar studies patterns in the sharing and reuse of research data. (Her Twitter handle is @researchremix.) Her work depends on text mining, using computers to automatically pull certain kinds of information from large amounts of text, including databases of journal articles. Many of those are subscription-based, and can be hard to get access to. The chat with Elsevier came about because Ms. Piwowar had complained on Twitter about how little Elsevier content was openly available to text mine. Alicia Wise (@wisealic), the publisher's director of universal access, responded, saying that Elsevier content could be text-mined, which led to the phone talk and negotiations by e-mail, and eventually to an agreement between Ms. Piwowar's university and the publisher that will allow UBC researchers to dig into Elsevier content for research purposes ... Elsevier could use some good PR right now. In some quarters, the publisher has become the Great Satan of scholarly communication because of what critics see as price gouging and attempts to quash ‘the free exchange of information.’ More than 11,000 researchers, including Ms. Piwowar, have pledged that they will not review for or contribute to its journals. Ms. Piwowar is cheerfully frank about what she thinks Elsevier stands to gain from working with her. She's plugged into an active network of researchers who push for open access to data, and experiment with new ways to track and measure research activity... ‘I think I have the ear of some people in social media, and by making me happy and by clearly trying hard, they're sending a message that's different than the message that's otherwise going around about Elsevier these days,’ she says. ‘Clearly they're using me, but that's OK. I still think it can be a win-win-win-win.’ Librarians often can't talk about their dealings with publishers because of nondisclosure agreements. Unconstrained, Ms. Piwowar has made it a point to be as open as possible about the Elsevier negotiations, chronicling the process step by step on her blog. At least one prominent advocate of text-mining rights, though, doesn't love the UBC-Elsevier arrangement: Peter Murray-Rust, a professor of chemistry at the University of Cambridge. ‘The rigmarole that Elsevier put Heather Piwowar through with UBC librarians is out of order, and in any case doesn't scale across publishers, libraries, or researchers,’ he wrote in a May 1 blog post titled ‘Towards a Manifesto on Open Mining of Scholarship.’ Mr. Murray-Rust convened a Skype meeting of researchers, including Ms. Piwowar, last week to discuss what a manifesto should say. In his May 1 post, he argued that restrictions on text mining stifle opportunity for new kinds of research, waste time by forcing researchers to chase after permissions or to enter into complicated negotiations with publishers, and lead to bad science and bad policy decisions. ‘His opinion is that negotiating with Elsevier and other publishers is the wrong approach, and that we have these rights to use the material in a responsible way,’ Ms. Piwowar says. She's pragmatic about the UBC deal. 
‘I don't think this is the best way or the long-term way or the most scalable way, but if Elsevier unexpectedly wants to offer me access, let's see how that works, and let's use that to get access for UBC from other publishers, and for other universities.’ Elsevier's representatives say the boycott didn't motivate their invitation to Ms. Piwowar. ‘This is just good practice in interacting with the academic community,’ says Ms. Wise. ‘We're constantly evolving our products and services, and of course are wanting to keep pace with changes in scholarly behavior.’ The deal seems to be playing well with librarians at the University of British Columbia, who have been closely involved in the conversations. ‘I'm thrilled to see Elsevier doing this,’ says Lea Starr, associate university librarian for research services. ‘It shows that they are listening.’ For the library, negotiating a text-mining agreement was unexplored territory. Teresa Lee, the university's e-resource and access librarian, reviews contracts for UBC's digital subscriptions. It's unusual, and exciting, she says, to have a researcher so directly involved in library negotiations. She also sees this as a chance for the library to set a good precedent as it works out the details with Elsevier. ‘I think what we should look toward is crafting a model agreement that we could then turn around and use with other publishers,’ Ms. Lee says. The agreement with UBC could be a useful experiment for Elsevier as well. Ms. Wise says the company is eager to learn more about what researchers want out of text mining, and how that varies from discipline to discipline...”

Data Mining for STM Content Offers Opportunities, Obstacles

Posted: 07 May 2012 02:03 PM PDT

 
Data Mining for STM Content Offers Opportunities, Obstacles
Guest Contributor
Publishing Perspectives, (04 May 2012)
“While demand for data mining of scholarly content is mounting, lack of standardization of search technologies, interfaces and licensing terms hinders its use. When researchers make mining requests, publishers of Scientific, Technical, and Medical (STM) content generally handle them liberally. However, they have concerns if the mining results can replace, or compete with, the original content or if the mining is burdening their systems. Many publishers have publicly available mining policies, and most handle mining requests case-by-case. According to one study, open-access journals generally allow mining as part of their standard terms and conditions. For content published in a more restrictive way, however, nearly all publishers require information about the intent and purpose of the mining request [Smit and van der Graaf, 2011]. In addition, for many publishing organizations as well as the associations on whose behalf journals and books are published, the administration of complicated mining rights is labor intensive and expensive. As a consequence, the supply side of STM data mining currently lacks broad-based product offerings. On the demand side, the audience is broad, and the needs vary significantly. Content mining holds tremendous potential for unlocking scientific discoveries and a great deal more. Scientists who conduct research within pharmaceutical companies might be interested in studying all reported side effects of a particular substance. Others might be doing research on the possible correlation between certain genes and a particular disease. But the potential for data mining goes much further. In the financial world, data mining techniques are pervasive and are already deployed in numerous applications, ranging from black box trading to macro-economic analyses. Enriching these applications with additional information from STM data mining could be of great value to hedge fund managers and other finance professionals. In legal affairs, the mining of content can help in addressing challenges in IP litigation such as e-discovery. Engineers can enhance risk factor analysis and quality assurance by mining information associated with a particular substance or process. Publishers may benefit from a way to mine their own content and that of other rightsholders to open up new markets for this information. The list of STM data mining applications is endless. In discussions of the mining of scientific articles and books, there is a clear intersection between data and text mining and access. Content mining benefits from the availability of a body of content, which is sufficiently large and is relevant to a particular topic. In many cases a significant volume of that content is available only by subscription. To be as effective as possible, semantic data and text searches require access to all content, whether available through open access, through repositories, or through subscriptions. When looking at the mining issue, publishers ask some very valid questions. How can rightsholders allowing mining protect their most valued asset — their content — from theft or manipulation? Can rightsholders allow searches of their databases without allowing copies of their entire database of works being made? How does a rightsholder determine when to charge for queries and how much? Can rightsholders allow searches of their databases without creating problems for other users of the database?
Clearly text and data mining is one of the next areas in which both publishers and content users are eager to find a solution, enabling scientific discovery while realizing and protecting the value of rightsholder content. Such a voluntary solution may require the participation of an intermediary, an experienced collective management organization that can design policies and processes that effectively serve the needs of pharmaceutical companies, STM publishers, researchers and others."
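
[A minimal sketch in Python of the side-effect use case mentioned above: scanning text for sentences that mention a substance together with a reported side effect. The substance name, term list, and naive sentence splitting are illustrative assumptions; production mining pipelines rely on proper NLP tooling and licensed corpora.]

    import re

    SUBSTANCE = 'aspirin'                              # hypothetical query
    SIDE_EFFECTS = ['nausea', 'bleeding', 'tinnitus']  # illustrative terms

    def cooccurrences(text):
        # Return sentences mentioning both the substance and a side effect.
        hits = []
        for sentence in re.split(r'(?<=[.!?])\s+', text):
            low = sentence.lower()
            if SUBSTANCE in low and any(t in low for t in SIDE_EFFECTS):
                hits.append(sentence.strip())
        return hits

    corpus = ("Aspirin was well tolerated. Two patients on aspirin reported "
              "gastrointestinal bleeding. Placebo arms reported nausea.")
    for hit in cooccurrences(corpus):
        print(hit)  # -> the sentence pairing aspirin with bleeding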

Blinding us with science journals

Posted: 07 May 2012 02:01 PM PDT

 
Blinding us with science journals
www.vancouversun.com
“On Feb. 6, 2010, the prestigious medical journal the Lancet published one of the most anticipated papers in its 187-year history ... the paper retracted a previously published paper - specifically, the now infamous 1998 study in which former British surgeon Andrew Wakefield proposed, using falsified data, the existence of a link between the measles, mumps and rubella (MMR) vaccine and autism. Amazingly, the Lancet took 12 years to publish the retraction, even though it became known, within the first few years after publication, that other researchers could not reproduce Wakefield's results. Indeed, the retraction was only published after the British General Medical Council found Wakefield guilty of three dozen charges, including dishonesty and abusing developmentally disabled children for research purposes, and revoked his licence to practice medicine. And the GMC's hearing only occurred after Sunday Times journalist Brian Deer completed an investigation into Wakefield's fraudulent activities. Hence, it took an investigative reporter to bring to light one of the biggest scientific scandals in recent memory, a scandal that placed disabled children in jeopardy and that fuelled - and continues to fuel - the anti-science, anti-vaccination movement. This suggests that something is seriously wrong, not just with those who oppose science, but with science itself. Glenn Begley appears to agree with this sentiment ... Begley, the former vice-president of biopharmaceutical company Amgen, outlined, in a recent Nature article, serious problems with the scientific enterprise. Begley noted that over the last decade, scientists at Amgen identified 53 ‘landmark’ cancer studies and, before engaging in their own research, tried to reproduce the landmark studies' results. Yet in only six cases (11 per cent) were Amgen scientists able to confirm the studies' findings. And Amgen's results are not unique... the New York Times recently reported that the number of retracted studies has increased tenfold in the past decade, while the number of papers published has increased only 44 per cent. And one study in the Journal of Medical Ethics found about three quarters of retractions were due to error and one quarter due to fraud. The question, then, is why this is happening and, even more importantly, what can be done to stop it? One easy answer is that it's not the number of mistaken or fraudulent studies that has increased, but rather the number of such studies that have been discovered... Begley charges that the ‘academic system and peer-review process tolerates and perhaps inadvertently encourages ... the publication of erroneous, selective or irreproducible data.’ That's a serious charge, but there is substantial evidence to support it. Begley notes, for example, that for a young researcher to obtain tenure in a university, or for a more seasoned researcher to obtain a promotion or grant, a strong publication record, typically with publications in the most prestigious journals, is required. In a separate paper, Ferric Fang, editor in chief of the journal Infection and Immunity, and Arturo Casadevall, editor in chief of the journal mBio, echo these concerns, noting that intense pressure for positions and grants has produced a hyper-competitive environment where fraud and error, while never justifiable, become more likely...
For example, while virtually all scientific discoveries are the result of the work of many people over a long period of time, a culture that values high-profile publications rewards only those who announce a discovery first. This not only presents a distorted portrait of the nature of science, but serves to further distort that picture as it discourages communication among scientists for fear that their results may aid the work of others. And this desire of researchers to keep their work close to their chests is harmful or fatal to science, since science works, and works best, through collaboration. Given these problems, Begley, Casadevall and Fang all agree that the scientific enterprise needs to engage in a fundamental reordering of its values. Specifically, while competition itself cannot and should not be eliminated, collaboration and teamwork, and teaching and mentoring, should play a much greater role in the awarding of tenure, promotion, grants and scientific prizes. Furthermore, Begley argues that journals must similarly reconsider what they value when making publication decisions. Specifically, he notes that journals prefer to publish papers in which the hypothesis being tested is confirmed. Yet hypotheses are often not confirmed, and the failure to confirm tells us that the hypothesis might well be false. Hence, failure to publish such information presents a distorted picture, not of science, but of reality. Indeed, as epidemiologist John Ioannidis has been arguing for the better part of the last decade, the practice of ignoring ‘negative’ data - data which fails to confirm a hypothesis - has resulted in the publication of many false findings. In addition, Begley argues that it virtually guarantees that publication-hungry researchers will ‘submit selected data sets for publication, or even massage ... data to fit the underlying hypothesis.’ As Begley, Fang and Casadevall all admit, it won't be easy to make such fundamental changes to the scientific enterprise and to scientific culture. But then science has never been about what's easy. Rather, science has always been about what's true, and about what works. And as long as things remain the way they are, it isn't true, and it's not working.”

Access to the Finch Committee on Open Access

Posted: 07 May 2012 01:59 PM PDT

 
Access to the Finch Committee on Open Access
Reciprocal Space, (05 May 2012)
“The Finch Committee, set up last year by David Willetts to examine how UK-funded research findings can be made more accessible — and mentioned by the minister in his speech on the subject earlier this week — has been meeting regularly and is due to report within weeks. If you would like to find out more about the committee’s deliberations, you can. The notes of their meetings are published on the website of the Research Information Network (RIN). Chaired by Dame Janet Finch DBE (Professor of Sociology at Manchester University), the Working Group on Expanding Access to Published Research Findings (WG), to give the committee its long and proper name, is made up of representatives of researchers, universities, librarians, publishers, funders (Wellcome Trust, RCUK, HEFCE), learned societies and RIN. It has met three times to date and has a fourth and final meeting scheduled for later this month. The decision to make the meeting notes available was taken — at their 2nd meeting in December 2011 — to foster ‘two-way communication’ between members of the working group and their constituencies. I’m not sure how much of that has gone on. Oddly for a process that is considering open access, the consultation process seems to have been rather muted; the same was true of the consultation on the RCUK’s draft policy on open access, which didn’t appear to have been officially announced. Nevertheless it is evident from the notes that the working group has been consulting widely. Anyone who would like to communicate with the committee, even at this late stage, can do so: the membership and email details of its members are available via RIN (Excel file). I’ve had a look at the meeting notes and have uploaded PDF versions to a public dropbox folder in which I have highlighted what appeared to me to be the key points. Anyone wanting a fuller picture should read the full documents (links below) — they’re not very long...” [Use the link above to access the “highlights” of each meeting posted by the blogger.]

Data Diving: What lies untapped beneath the surface of published clinical trial analyses could rock the world of independent review

Posted: 07 May 2012 01:58 PM PDT

 
Data Diving: What lies untapped beneath the surface of published clinical trial analyses could rock the world of independent review
the-scientist.com
“A few weeks before Christmas 2009, the world was in the grip of a flu pandemic. More than 10,000 people had died, and roughly half a million people had been hospitalized worldwide; tens of millions had been infected. In the United States, millions of doses of Tamiflu, an antiviral medication, had been released from national stockpiles. ‘December 2009 was a point in the H1N1 outbreak where there was a lot of talk about a second or third wave of this virus coming back and being more deadly,’ says Peter Doshi, now a postdoctoral researcher at Johns Hopkins University and a member of an independent team of researchers tasked with analyzing Tamiflu clinical trials. ‘Anxiety and concern were really peaking.’ So it was no small blow when, that same month, Doshi and his colleagues released their assessment of Tamiflu showing that there was not enough evidence to merit a claim that the drug reduced the complications of influenza [1]. Their report had been commissioned by the Cochrane Collaboration, which publishes independent reviews on health-care issues to aid providers, patients, and policy makers. The findings, published in the British Medical Journal, made headlines around the world. Doshi’s group arrived at this conclusion because they’d run into a lack of available data. Some of the widespread belief that Tamiflu could blunt pneumonia and other dangerous health consequences of flu was based on a meta-analysis of several clinical trials whose results had never been published. Because the data could not stand up to independent scrutiny by the researchers, these trials were tossed out of the Cochrane review; other published trials were disqualified because of possible bias or lack of information. Just as the 2009 BMJ paper was to be published, Roche, the maker of Tamiflu, opted to do something unorthodox—the company agreed to hand over full clinical study reports of 10 trials, eight of which had not been published, so that independent researchers could do a proper analysis. Within a few weeks after the publication of its review, the Cochrane team was downloading thousands of pages of study files. Clinical study reports are massive compilations of trial documents used by regulators to make approval decisions. Doshi says he had never heard of, let alone worked with, a clinical study report. ‘This is how in the dark most researchers are on the forms of data there are. Most people think if you want to know what happened in a trial, you look in the New England Journal of Medicine or JAMA.’ And in fact, that is how many meta-analyses or systematic reviews of drugs are done. As publications amass, independent analysts gather up the results and publish their own findings. At times they might include unpublished results offered by the trial investigators, from the US Food and Drug Administration’s website, or from conference abstracts or other ‘grey literature,’ but for the most part, they rely simply on publications in peer-reviewed journals. Such reviews are valuable to clinicians and health agencies for recommending treatment. But as several recent studies illustrate, they can be grossly limited and misleading. Doshi and his colleagues began poring over the reams of information from Roche, and realized that not only had their own previous reviews of Tamiflu relied on an extremely condensed fraction of the information, but that what was missing was actually important... 
In January of this year, the group published its latest review of Tamiflu, which included the unpublished evidence obtained from Roche in 2009 [2]. The authors concluded that Tamiflu falls short of claims—not just that it ameliorates flu complications, but also that the drug reduces the transmission of influenza... Jefferson is not convinced, and the experience has made him rethink his approach to systematic review, the Cochrane method of evaluating drugs. For 20 years, he has relied on medical journals for evidence, but now he’s aware of an entire world of data that never sees the light of publication. ‘I have an evidence crisis,’ he says. ‘I’m not sure what to make of what I see in journals.’ He offers an example: one publication of a Tamiflu trial was seven pages long. The corresponding clinical study report was 8,545 pages... The big question is: What does that mean for the validity of independent reviews? ... Although summaries of clinical trials are available from the FDA, unabridged clinical study reports or the raw data are hard to come by. Keri McGrath Happe, the communications manager at Lilly Bio-Medicines, wrote in an e-mail to The Scientist that the company has a committee that reviews requests to obtain unpublished clinical trial results. ‘I can tell you that it is not common’ to have a request filled for raw data, she says. ‘Granting access to raw data isn’t as easy as opening file cabinets and handing over documents. A team has to go through each piece of data to find what specific data [are] needed to fulfill the request.’ [In addition to] being an administrative burden, handing over clinical reports or raw data is considered hazardous to the integrity of a drug’s worth. ‘The simple truth is that drug discovery is enormously expensive,’ says Jeff Francer, the assistant general counsel of the Pharmaceutical Research and Manufacturers of America (PhRMA). ‘In order for companies to engage in the immensely capital-intensive work to develop a medicine, there has to be some protection of the intellectual property. And the intellectual property is the trial data.’ The FDA tends to concur. The agency receives much more information about a drug than it ever releases. According to Patricia El-Hinnawy, an FDA public affairs officer, ‘as a matter of law and regulation, patient-level clinical trial data has been historically regarded as confidential commercial and/or trade secret information.’ The other route to obtaining unpublished results is through a Freedom of Information Act (FOIA) request, but just as with putting in a request to a company, there is no guarantee that the information will be released. Plus, ‘FOIA requests take a long time,’ says Michelle Mello, a professor of law and public health at the Harvard School of Public Health. ‘In a world where we’re concerned about being able to rapidly assess certain safety signals, this is not a route to producing timely information...’ The other argument, says Sidney Wolfe, director of the health research group at the advocacy organization Public Citizen, is that ‘it’s a moral and ethical thing too. People who are participating in clinical trials, aside from whatever possible benefit will happen to them . . . are doing it for the benefit of humanity. And if there is some lid put on some aspects of those trials, that is frustrating one important goal of research, which is sharing information.’ The question of whether results from human experiments are private information or a public good has been debated for some time. 
In 2010, the European Medicines Agency (EMA), the European Union’s equivalent of the FDA, finally made a decision. ‘We had resolved that clinical data is not commercial confidential,’ says Hans-Georg Eichler, the EMA’s senior medical officer. ‘It doesn’t belong to a company, it belongs to the patient who was in the trial.’ The EMA’s new policy is that if someone requests data from clinical trials of an approved medication, the agency will provide it. Doshi’s group took advantage of this to obtain about 25,000 pages of information on Tamiflu, which they used for their 2012 Cochrane update. Eichler says there have only been a handful of requests to date, too few to know how the policy is working out. Fulfilling such requests can be cumbersome, he says. It takes time to carefully review the data and make sure patients cannot be identified. Eichler adds that in the future he’d like to see all clinical trial results entered into a system accessible by other researchers. Under the FDA Amendments Act of 2007, the agency requires trial sponsors to post the summary results of registered trials on clinicaltrials.gov within one year of completing the trial. But few comply. A recent survey of the website found that of 738 trials that should have fallen within the mandate, just 163 had reported their results. While companies are certainly part of the problem in this case, they were actually more likely to report results than were researchers whose clinical trials had no industry backing, but were funded by foundation or government money. ‘I think it’s so important to acknowledge that is a huge problem throughout’ the clinical research enterprise, says Kenneth Getz, a professor at Tufts Center for the Study of Drug Development. And industry has made some moves to be more proactive about sharing data. Last year, the medical device company Medtronic agreed to share all of its original data regarding Infuse, a bone growth product that had been facing considerable skepticism about its efficacy. Yale professor Harlan Krumholz approached the company with a challenge: if Medtronic thinks the Infuse data can stand up to external scrutiny, then let an external group have a look. The company agreed, and a Yale University group serves as the middleman between the company and the independent reviewers. Joseph Ross, a Yale Medical School professor who’s involved in the project, says two review teams have been selected, and they should have results by the summer. Medtronic is paying $2.5 million for the external reviews, a price Ross says is small compared to what gets invested in—and ultimately earned from—a successful drug. He says it’s the first experiment of its kind... Journals are also lighting a fire under trial sponsors to provide their results to independent reviewers more quickly and completely. In 2005, the International Committee of Medical Journal Editors initiated a requirement that trials had to be registered, say on clinicaltrials.gov, in order to be published. ‘That sent shock waves,’ says Elizabeth Loder, an editor at the British Medical Journal... While Getz agrees that more data could improve meta-analyses, he cautions against ‘data dumping’—completely opening the floodgates to unpublished results. ‘I think just the idea of making more information available misses the point. 
You reach a level of data overload that makes it very hard to draw meaningful and reasonable conclusions, unless you’re an expert with a lot of time.’ But the Cochrane Collaboration’s Jefferson says bring it on. While the clinical study reports he received numbered in the thousands of pages, they were still incomplete. Roche says it provided as much as the researchers needed to answer their study questions. But accepting that response would require a trust that is clearly eroded. ‘We hold in the region of 30,000 pages. That’s not a lot,’ Jefferson says. ‘We don’t know what the total is. We’re looking at the tip of the iceberg and we don’t know what’s below the waterline.’”
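[Two of the numbers in this piece reward a quick side-by-side check; a minimal back-of-envelope sketch in Python, using only figures quoted in the article above:]

    # Reporting compliance on clinicaltrials.gov, per the survey cited above
    reported, mandated = 163, 738
    print(f"Compliance rate: {100 * reported / mandated:.0f} per cent")  # -> 22 per cent

    # Jefferson's example: a seven-page journal article versus its clinical study report
    journal_pages, csr_pages = 7, 8_545
    print(f"The clinical study report is roughly {csr_pages / journal_pages:.0f}x longer")  # -> 1221x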

They said it would never happen — but it just has

Posted: 07 May 2012 01:56 PM PDT

 
They said it would never happen — but it just has
Mike Taylor
Sauropod Vertebra Picture of the Week #AcademicSpring, (04 May 2012)
“The speed that things are happening at the moment is astonishing. Whenever we talk about the economics of open access — when I argue that it costs the community eight times as much to publish a paywalled article with Elsevier as it does to publish it as open access with PLoS ONE — I always hear the same argument in response.  And it’s a good argument.  It goes like this: ‘Yes, the total cost to libraries around the world of an Elsevier article may be eight times the cost to the author of publishing an open-access article that is free to read.  But you can only expect to save that money if libraries cancel their Elsevier subscriptions and plough that money into funding open-access publications instead.  And no library will ever do that, because the researchers that they serve need the subscriptions.’ Well, it turns out — somewhat to my own surprise, I’ll admit — that libraries will cancel their Elsevier subscriptions.  The Department of Mathematics at the Technical University of Munich has just voted to do exactly that: ‘Because of unsustainable subscription prices and conditions, the board of directors of the mathematics department has voted to cancel all of its subscriptions to Elsevier journals by 2013.’ So what does this mean?  A lot of things. 1. This is no idle far-in-the-future threat: 2013 is only one year away!  So this is an actual policy.  Something that they’re going to do. 2. Universities are not messing about.  When Harvard say they can’t afford subscriptions, they probably mean it — it’s not just a negotiating tactic. 3. Where one university department leads, others will probably follow.  Maybe initially it will be mostly maths departments in other universities; maybe it will be other departments of the Technical University of Munich; maybe it will be all of Harvard. 4. So far, this announcement is only about cancelling subscriptions and says nothing about open access.  If that’s all they do, it will be a mere cost-saving exercise and a missed opportunity.  To be truly transformational, the department needs to channel a significant chunk of its subscription savings into funding Gold OA publications. 5. Publishers who are paying attention will surely start to realise that they have pushed their exploitative prices too far, and that they don’t hold libraries in a steely grip any more.  I wonder how this will play into investment advice regarding Elsevier? This isn’t the kind of problem that can be fixed by hiring a PR person.  I’ve argued this before, but if Elsevier are going to survive, they’ll need to be much clearer in their communications, eliminate practices that alienate authors, and ultimately change their business model entirely.”
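[Taylor's 'eight times' figure is a back-of-envelope ratio, not an audited one; a minimal sketch of the arithmetic in Python, where both dollar figures are illustrative assumptions (roughly $10,500 of worldwide subscription revenue per paywalled Elsevier article, set against PLoS ONE's publication fee of $1,350 at the time):]

    # Illustrative assumptions, not audited figures
    revenue_per_paywalled_article = 10_500  # USD, assumed average subscription revenue per article
    open_access_fee = 1_350                 # USD, PLoS ONE article processing charge in 2012

    ratio = revenue_per_paywalled_article / open_access_fee
    print(f"Paywalled publishing costs the community ~{ratio:.1f}x the open-access fee")  # -> ~7.8x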

Why Networked Knowledge Makes Us Smarter than Before

Posted: 07 May 2012 01:54 PM PDT

 
Why Networked Knowledge Makes Us Smarter than Before
projectinfolit.org
Use the link above to access the transcript of the interview, introduced as follows: “When it comes to wrapping your brain around how the Internet has impacted how we think, interact, and learn, David Weinberger is always worth listening to. David is a thought-leader, writer, technologist, and a senior researcher at Harvard University’s Berkman Center for Internet and Society and Co-Director of Harvard Law School’s Library Innovation Lab. We interviewed David in April 2012, shortly after the release of his new book, Too Big to Know: Rethinking Knowledge Now That the Facts Aren’t the Facts, Experts Are Everywhere, and the Smartest Person in the Room Is the Room. We discussed how the promise of knowledge has been completely transformed in a comparatively short time, why, and what it means to libraries, to higher educational institutions, and to learning.”

World of academia bites back

Posted: 07 May 2012 01:53 PM PDT

 
World of academia bites back
“MY AFFECTION for DC Thomson deepened when I discovered that to subscribe to The Beano cost more than to simply purchase it weekly at the newsagents. The majority of publishers operated on the notion that a loyal subscriber, someone who wished to read every issue of their publication, not just whenever they could be bothered to pick one up, deserved a little reward, so the cost of an annual subscription was always less than the cover price. Unless, of course, you happened to be DC Thomson who, I learned while in discussion with the editor of The Beano many years ago, decided that the proper model for subscription should be the cover price plus the cost of an envelope and second class stamp. I thought of The Beano while pondering Biochimica et Biophysica Acta, which, as I’m sure you know, is the publication of choice for organic chemists and is a must read for every aspiring lab rat who has a tattoo of Boron on their butt. It is to chemists and biologists what Variety is to the Hollywood agent, essential reading. So guess how much an annual subscription costs? Go on. Well, it is currently £15,210. Now, before you get all upset and start sputtering into your coffee or perhaps begin to ponder just what a good deal a subscription to The Scotsman is, it should be said that for your £15,210 you do get a total of 100 issues of a quality, peer-reviewed publication that will keep you up-to-date with the latest reviews of cancers and molecular cell research but also biomembranes, too, so this is surely a snip at roughly £152.10 per issue. So why does it cost so much? Well, there are the overheads to think of. You don’t think that academics from around the world write long, detailed articles on the fruit of sometimes years of hard work for free do you? Well, actually, they do write them for free. OK, how about the scientists and academics who peer review the articles prior to publication – surely they are on a fat wad of cash? Sadly, it would appear not. The editors of the publications? As far as I can gather, some get paid and some don’t. The reason that Biochimica et Biophysica Acta costs £15,210 for a single annual subscription is that university libraries will pay £15,210 for a single annual subscription. The reason they pay £15,210 for a single subscription is that they don’t really have any choice as their chemists and biologists have to be kept informed of the latest developments in their fields. And what is more, unlike other publishers, who will allow you to subscribe only to the publications you actually wish to read, Elsevier, the Dutch publishing house behind Biochimica et Biophysica Acta, has a much better idea, which they call ‘Bundling’. A library cannot call up Elsevier and say simply: ‘Phew, we’ve had a vast ‘bring and buy’ sale and so we can now afford to take up a subscription to Biochimica Acta.’ For Elsevier, while no doubt lighting a cigar with a €100 note, will say no, you can’t have only the magazines you wish, what you have to accept is a bundle with lots of other expensive journals that you don’t really want. Elsevier are like the Soup Nazi in Seinfeld. In one episode of the American sitcom, a new soup shop opened selling the most wonderful tasting soups, but service was strictly on the proprietor’s terms. 
If you asked for something else, or commented on the high price of his Mulligatawny, he would snatch it back and say: ‘No soup for you.’ Apparently, again according to a senior professor, university libraries who try to negotiate with the company find that they are ‘ruthless about cutting off access to all their journals’. The way the system appears to operate is that we, the British public, fund universities through our taxes as well as the Research Councils UK, who provide grants for academics and scientists to conduct research into new fields. After we, the taxpayer, fund the work the scientists are required to publish their work in the appropriate academic journals which, of course, do not pay them for the articles, but claim copyright of them – 70 years will do – and then charge us, the taxpayer, to read the results of what we have funded. C’mon you have to doff your cap to the likes of Elsevier, and the other leading academic publishers such as Springer and Wiley who, in an age when publishing has taken a pounding from the rise of the internet, are apparently impregnable behind their pay walls and exorbitant subscription fees. If you wish to read a single article in one of Elsevier’s journals the cost is £19.47, Springer charges £21.60, while Wiley-Blackwell charges £25.96. When George Monbiot, the Guardian columnist, peered into the accounts of the three leading academic publishers he discovered that their returns were staggering. In 1998, before the explosion of the internet, Elsevier reported a profit margin of 36 per cent; 12 years later it remains 36 per cent with the company last year earning a profit of £724 million on a revenue of £2 billion. If I were a shareholder in Elsevier then I would be delighted and urging them to charge not just £15,000 for a subscription to Biochimica et Biophysica Acta, but £30,000. I mean, why not? Who is to stop them? Well, if a university library doesn’t like it they can go off and subscribe to another journal which carries the same content? Oh that’s right. They can’t. There isn’t one. But the fact remains that, sadly, I’m not a shareholder in Elsevier. Instead I’m a taxpayer in Scotland where, last year, Edinburgh University had to pay out £1.7m to the big three academic publishers, while Glasgow University had to pay £1.4m in subscription charges for academic journals, and St Andrews University spent £624,000 on academic subscriptions. So, as a taxpayer rather than a shareholder, I’m rather delighted that a spotlight has recently been trained on the, well, we’ll not say ‘scam’ or ‘cartel’ or ‘monopoly’; let’s call it the rather exceptionally generous trading circumstances in which the academic publishers have found themselves. A rebellion against the costs and restrictive practices of Elsevier has been stirred up with Professor Tim Gowers from Cambridge University casting himself in the role of Spartacus. The academic, who won the Fields Medal, which, as I’m sure you know, is the equivalent of the Nobel Prize for mathematics, has announced a boycott of Elsevier. He will no longer submit articles to any of the company’s journals or peer review for them and has even gone as far as to set up a website, The Cost of Knowledge, on which over 9,000 people have logged their objections to Elsevier’s practices. Then there is The Wellcome Trust who are insisting that any research paid for by them must now be published in a manner accessible by the public. 
Finally, earlier this week, David Willetts, the minister for universities and science, announced plans for the development of a website on which all government-funded research – which totals £5bn a year – could be accessible to the public. He is now in talks with Jimmy Wales, the founder of Wikipedia, to assist in the new ‘government-funded portal’. The problem will then extend to gaining access to the remaining 94 per cent of research papers still tucked away within the paid subscription journals and produced by non-British academics. While researching this column, I discovered that Glasgow University has decided to throw open its vast library of research files and experiments and puffing cauldrons to the Scottish public. If an individual or company is embarking on a specific field and thinks the university may have research and intellectual property that would benefit them, say, for example, in the development of a new type of widget, they can have the IP for free, and if they make millions as a result then it is theirs to spend as they wish. This is on the grounds that as the work was funded by the public purse it should benefit those entrepreneurial members of the public keen to put it into practice. The university would hope manners might prompt a donation or two, but there is certainly no requirement, although the donation of a subscription bundle to Elsevier would certainly lighten their fiscal load.”
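[The column's arithmetic is easy to verify; a minimal sketch in Python, using only the figures quoted above:]

    # Subscription price per issue
    annual_subscription = 15_210  # GBP, Biochimica et Biophysica Acta
    issues_per_year = 100
    print(f"Per issue: £{annual_subscription / issues_per_year:.2f}")  # -> £152.10

    # Elsevier's profit margin, from the figures Monbiot reported
    profit, revenue = 724, 2_000  # GBP millions
    print(f"Margin: {100 * profit / revenue:.0f} per cent")  # -> 36 per cent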

Promoting Open Access to Research in Academic Libraries, Dr Priti Jain

Posted: 07 May 2012 01:49 PM PDT

 
Promoting Open Access to Research in Academic Libraries, Dr Priti Jain
www.webpages.uidaho.edu
Use the link above to access the full text article published in the current issue of the open access journal “Library Philosophy and Practice.” The introduction reads as follows: “‘A commitment to scholarly work carries with it a responsibility to circulate that work as widely as possible: this is the access principle. In the digital age, that responsibility includes exploring new publishing technologies and economic models to improve access to scholarly work. Wide circulation adds value to published work; it is a significant aspect of its claim to be knowledge. The right to know and the right to be known are inextricably mixed. Open Access can benefit both’ (Willinsky, 2010). Increasingly, this capacity to close the gap between developed and less developed countries through access to information becomes more important for educational, cultural, and scientific development. OA can foster information and knowledge sharing within research, educational, and scientific communities in traditionally economically disadvantaged regions (Canada, 2009). Based on the latest literature, this paper examines academic libraries' initiatives in promoting open access. It will also look at the obstacles and challenges faced in open access with specific reference to developing countries. First of all it would be suitable to appreciate the concept of open access.”

Interviews « Figure/Ground Communication

Posted: 07 May 2012 10:01 AM PDT

 
Interviews « Figure/Ground Communication
figureground.ca
Blog by SFU MA grad Laureano Ralon featuring freely accessible interviews with academics, also designed to be accessible from a content perspective. Laureano's blog is up for two awards - please consider voting! http://westcoastsocialmediaawards.com/voting/best-personal-blog/ http://westcoastsocialmediaawards.com/voting/community-builder-award/

Controversial Research Regarding the Bird Flu has Been Published in Nature

Posted: 05 May 2012 07:01 AM PDT

 
Controversial Research Regarding the Bird Flu has Been Published in Nature
Lifestyle articles at Technorati, (04 May 2012)
“One of the controversial bird flu studies, carried out at the University of Wisconsin-Madison, was published online in the journal Nature on Wednesday. An interesting part of the news is that the research paper is open access, so that everyone can read the paper without purchase. This research has been published with the approval of the U.S. Government after a months-long international debate about its value versus its use in bioterrorism, as this paper has been referred to by some people as a ‘recipe’ for bioterrorism. In this research, headed by University of Wisconsin (UW)-Madison’s flu virologist Yoshihiro Kawaoka, researchers have reported that four mutations in a bird flu virus cause the virus to spread among ferrets in the lab. So the research showed how the H5N1 hybrid could be made and transmitted efficiently among mammals. The other paper, headed by Ron Fouchier at the Erasmus Medical Centre in Rotterdam, the Netherlands, is expected to be published in the journal Science in the very near future. This paper has many positive aspects; as I have already commented on one of the editorials in Nature, ‘to know’ is better in many cases than ‘not to know’, and I think it is better to tell the world about the preparedness, so that the scientists around the world could work in defense. ‘There are people who say that bird flu has been around for 16, 17 years and never attained human transmissibility and never will,’ said Malik Peiris, virology professor at the University of Hong Kong. ‘What this paper shows is that it certainly can. That is an important public health message, we have to take H5N1 seriously. It doesn't mean it will become a pandemic, but it can,’ said Peiris, who wrote a commentary accompanying Kawaoka's paper in Nature. ‘By identifying mutations that facilitate transmission among mammals, those whose job it is to monitor viruses circulating in nature can look for these mutations so measures can be taken to effectively protect human health,’ Kawaoka said in a statement released Wednesday through the UW.”

BC ELN Connect 10.1 May 2012

Posted: 05 May 2012 06:57 AM PDT

 
BC ELN Connect 10.1 May 2012
eln.bc.ca
“Strategy 3.2 of the BC ELN 2011-2016 Strategic Plan calls for BC ELN to ‘articulate and implement an appropriate role for BC ELN in managing open access archives (e.g. Institutional Repositories, Learning Object Repositories) and supporting adoption by BC ELN partners’. Actions identified for 2012/13 at the fall 2011 BC ELN All Partners Meeting include: [1] Investigate needs and opportunities for a collaborative institutional repository service [2] Conduct needs analysis survey [3] Investigate opportunities - technology and hosting options The BC ELN 2011-2016 Strategic Plan can be found here: http://www.eln.bc.ca/view.php?id=1947 This article highlights some existing institutional repositories (IR) in British Columbia, including the DSpace installations at University of British Columbia (UBC) and University of Victoria (U Vic), and Simon Fraser University (SFU)’s new Drupal-based IR. University of Northern British Columbia (UNBC) is planning to have an IR up and running this fall, using Islandora. British Columbia Institute of Technology (BCIT) is gearing up for their 50th anniversary with an Innovative Interfaces Content Pro IR. The article concludes with a few tips on staffing and workflow for an institutional repository service by local experts...” [Use the link above to access the full text blog post providing more details about the institutional repositories mentioned in the preceding introduction.]

A revolutionary new approach to making humanities and social sciences books free

Posted: 05 May 2012 06:56 AM PDT

 
A revolutionary new approach to making humanities and social sciences books free
Blog Admin
Impact of Social Sciences, (04 May 2012)
“Earlier this week, David Willetts announced the government’s intention to make publicly funded research available for free to readers. This announcement comes in the wake of a tumultuous few months for academic publishers. The boycott of journals published by Elsevier, the Wellcome Trust’s decision to adopt more robust Open Access policies in relation to the research that it funds and, internationally, Harvard University’s proclamation that the cost of journal subscriptions has become ‘untenable’ have added to a growing sense of crisis in the publishing community. Many academics are eager for their work to be shared more widely. They have been frustrated by the failure of publishing industry business models to reflect the open, collaborative potential of new technologies. And they are increasingly dubious about the value that publishers are providing to scholarly communities in return for the distribution monopolies many of them command. As Professor Tom Cochrane puts it in an article: ‘The fact is that the overwhelming majority of articles published in the traditional journal literature are given away by their authors, are refereed gratis by colleagues in the peer review process and are then published.’ David Willetts is, understandably, interested in securing the best possible return on public investments in research for UK taxpayers. Nonetheless, researchers operate in highly connected global communities of scholarship. The questions facing funders of research, the challenges facing publishers and the implications of the seismic shifts now taking place in landscapes of scholarly communication are truly global. The disruptive power of the internet is not just changing business models in the archetypal ‘copyright industry’ – publishing. The Internet is changing the ways in which knowledge is made and communicated... Remember Books? ... Although a great deal of attention is being paid to the ways in which academic publishing business models are failing scientific communities, very little has been paid to the crisis facing the humanities. Whereas it was once common for specialist book-length publications in the humanities and social sciences to sell up to three thousand copies, publishers of this kind of book are now pleased if they are able to sell just a few hundred, mostly to university libraries... The deep connections between books and scholarship are reflected in the ways in which scholars are trained to carry out and present their work (writing a PhD dissertation, for example), as well as in systems of academic promotion and the funding and ranking of humanities-based research and the institutions that produce it. The crisis in book publishing, then, represents a crisis of the gravest proportions for scholarship in the humanities and social sciences. It raises fundamental questions about the nature of scholarly enquiry and communication in the twenty-first century and highlights a worrying lack of connection between one of the most highly prized forms of scholarship and contemporary readerships. Dwindling sales of scholarly books clearly have consequences for academic publishers and university presses. But they also signal the profound failure of humanities and social sciences publishing to engage effectively with digital technology or to ensure that hard-won advancements in knowledge made outside the sciences are made publicly available. 
It is tempting to simply condemn academic publishers for their failure to engage more effectively with the potential of new technology to facilitate open access to knowledge and new approaches to research. However, doing so ignores the fact that academic publishing is a diverse industry which has evolved alongside universities’ systems for funding research and technologies for copying and distribution – perhaps most notably the printing press. The term ‘academic publisher’ covers giants such as Elsevier, not-for-profit university publishers, such as Oxford University Press, and smaller operations like Edward Elgar. It would be foolhardy to ignore the role that publishers have to play in a vibrant innovation ecosystem, or the very real expertise and value that the best aspects of academic publishing have to offer scholarly communities in a digital age. The commitment to opening up access to academic research expressed by David Willetts in The Guardian earlier this week is to be commended. Nonetheless, careful thought will be required to ensure that new approaches to sharing publicly funded research are able to retain the best aspects of a scholarly communication system that has evolved over several hundred years. This is why the Big Innovation Centre is choosing to partner with academic publisher and serial entrepreneur Dr Frances Pinter to pilot a revolutionary new approach to making scholarly books available for free. Knowledge Unlatched offers a commercially sustainable approach to making books available on open access licenses, while reducing costs to libraries and helping to stimulate markets for new kinds of value added content. Knowledge Unlatched aims to work with publishers and university libraries, while engaging with scholarly communities through a critical research agenda.”

Something for Nothing: The Non-Existent Benefit of Linking in the Access Copyright Deal

Posted: 05 May 2012 06:55 AM PDT

 
Something for Nothing: The Non-Existent Benefit of Linking in the Access Copyright Deal
Michael Geist
Michael Geist Blog, (04 May 2012)
“As debate over the AUCC - Access Copyright settlement spreads to campuses across the country, one of the talking points that has emerged is that the coverage of linking to content in the settlement provides some value to the education community. The model licence defines copy as: ‘any reproduction, in any material form whatever, including a Digital Copy, that is made by or as a consequence of any of the following activities ... (k) posting a link or hyperlink to a Digital Copy.’ Critics argue that this provision gives the AUCC no value as there is simply no need to license such activities. The inclusion of the provision means students will be paying something - there must be some notional part of the $26 annual fee that covers this section - for nothing. Supporters of the deal, including AUCC, claim otherwise. Indeed, the AUCC FAQ has two questions and answers on point: ‘Q5. Does the definition of ‘Copy’ in the AUCC model licence mean that AUCC accepts that posting a hyperlink to a digital copy is the same as authorizing the making of a copy and requires a licence? A. Despite the ruling of the Supreme Court of Canada in a recent defamation case, Crookes v. Newton, it is still an open issue in Canadian law whether posting a hyperlink could make a person liable for authorizing the copying of the digital work. The definition of ‘Copy’ in the model licence makes the licence and the indemnity very broad in scope. Another provision in the model licence clarifies that AUCC has accepted this definition on a ‘without prejudice’ basis and reserves the right to take a different position on the meaning of the term in any other proceeding... Q7. Would it have been better to wait until after Bill C-11, the Copyright Modernization Act, becomes law and the Supreme Court of Canada rules on fair dealing in K-12 schools before AUCC settled with Access Copyright? A. Bill C-11 and the upcoming Supreme Court decision on fair dealing are unlikely to affect the need to secure a licence for copying required readings for students for inclusion either in course packs or on course websites. Required readings is the principal category of copying covered by the model blanket licence agreement.’ The AUCC position raises two issues: first, that the issue of linking still poses a risk under Canadian law, and second, that Bill C-11 will not alter the legal implications. The AUCC is wrong on both counts. On the Supreme Court of Canada's approach to liability for linking, the court has unquestionably provided a strong foundation for arguing that there is no liability for linking to content. In Crookes v. Newton, a case focused on defamation and linking, Justice Abella states: ‘Hyperlinks thus share the same relationship with the content to which they refer as do references.  Both communicate that something exists, but do not, by themselves, communicate its content.  And they both require some act on the part of a third party before he or she gains access to the content.  The fact that access to that content is far easier with hyperlinks than with footnotes does not change the reality that a hyperlink, by itself, is content-neutral - it expresses no opinion, nor does it have any control over, the content to which it refers.’ Control over the content rests with the site that has made the content available online. Merely linking to such content does not implicate copying that would or should require permission or a licence. 
The even bigger error comes from its analysis of Bill C-11, which it examines solely from the perspective of the expanded fair dealing provision. The bill includes at least two other provisions that are directly relevant to parts of the model licence, including the issue of linking to online materials. The most obvious provision is one that AUCC has spent years lobbying for (thus making its omission from the FAQ shocking) - section 30.04 on publicly available materials on the Internet. The provision states: ‘30.04 (1) Subject to subsections (2) to (5), it is not an infringement of copyright for an educational institution, or a person acting under the authority of one, to do any of the following acts for educational or training purposes in respect of a work or other subject-matter that is available through the Internet: (a) reproduce it; (b) communicate it to the public by telecommunication, if that public primarily consists of students of the educational institution or other persons acting under its authority; (c) perform it in public, if that public primarily consists of students of the educational institution or other persons acting under its authority; or (d) do any other act that is necessary for the purpose of the acts referred to in paragraphs (a) to (c).’ The subsections that follow create several conditions, including attribution, the absence of a digital lock, and no clear opt-out notification (that is more than just a copyright notice). The whole point of the provision is to provide education with legal certainty in the use of online materials without the need for further permissions or payment. If the provision permits reproduction, communication, or performance of a work that is available online, it surely means there is no legal risk in merely linking to such a work that can be freely reproduced. In addition to the publicly available materials exception, the non-commercial user-generated content provision may also prove relevant for some electronic casebooks that incorporate some materials to create a new work for non-commercial purposes. The UGC provision (Section 29.21) includes four conditions, including an analysis of ‘substantial adverse effect’, but the provision may allow for the development of new materials where the evidence suggests that the new works actually increase (or at least do not substantially adversely affect) the market for the original works. In sum, Bill C-11 offers education far more than just an expanded fair dealing provision. While there are still problems with the bill - particularly with respect to digital locks and restrictions on lessons - there are provisions that have implications for issues such as linking to content that confirm the Access Copyright deal provides no real benefits in this regard. For AUCC to ignore these provisions in its FAQ is highly misleading and presents an inaccurate picture of one of the touchstone issues in the model licence. It also raises questions about whether AUCC has fully analyzed the impact of the forthcoming changes to Canadian copyright law, which places it in a far stronger position than it seems to realize.”

The Problem with “Open” Science | SiliconANGLE

Posted: 05 May 2012 06:53 AM PDT

 
The Problem with “Open” Science | SiliconANGLE
siliconangle.com
“One of the tragedies of the modern age is that we have to prepend the word ‘open’ onto science in order to differentiate Open Science from the regular kind — you know, the kind that’s overrun with IP restrictions and practiced by way too many academic and research organizations. It should never have come to this considering science is about reproducible results under controlled conditions, a process that intrinsically demands transparency and openness. Fortunately, the alarm bells have gone off and many in the research community are taking matters into their own hands. In my experience IP barriers in science come about in three major ways. First, there is the decreasing ability of the traditional publishing process to support reproducibility. A ‘publication’ without data, source code (the methods), and good documentation is nearly impossible to reproduce and causes many to question what is actually being shared. Some of the computational papers I have seen are so complex that reproducing the science behind them may take years (assuming you can acquire the data), something that few do due to the pressure to create novel ideas (career recognition tends to fall to those that create something new versus those that confirm a result). Second, there is a decided resistance to the open research process. Part of the research community is uncomfortable with the inherent messiness of innovation and does not want others to see the process unfold. Others want to protect their ‘ideas’ due to the competitive environment that is science today, created largely by the way science funding and tenure work. Lastly, formal IP organizations have been established at universities and other research organizations to control and license IP. I have seen so much innovation killed by these offices that it turns my stomach. While there is the occasional Gatorade success, statistics indicate that there are significantly more failures than successes and that these come at the cost of squelching innovation, the scientific process, and even business. For more information on University IP systems, see Melba Kurman’s blog post ‘Fear, Uncertainty and Doubt and University IP Strategy,’ which claims that 99% of university patents never earn money, or Kimberly Moore’s article ‘Worthless Patents,’ which shows a 53.71% abandonment rate for patents after tens of thousands of dollars have been spent to acquire them in the first place. With the emergence of networked science and the increasing reliance on computing, novel methods are emerging to address these IP barriers. These methods typically boil down to providing Open Access (publication), Open Source (software), and Open Data to support the Open Science (community). Many people are aware of the fantastic Open Access journals such as PLoS and BioMed Central; even more innovative (at least from a software point of view) are journals that accept data and software submissions and then automatically test and score the software before handing the submission off to the human reviewers (see the Insight Journal here). Progressive publishers are also experimenting with novel processes; for example, the Optical Society of America provides their Interactive Science Publishing system by enabling linking of active documents to data that can be downloaded and interactively viewed. 
Indeed, one possible business model for publishers is to get back to their roots and provide services to the community in the spirit of networked science, which could include hosting open communities, data, and software (can you imagine publishers becoming software developers?). There are serious needs for such services; just consider the need to host and archive (large) computational data, a huge and increasingly important problem as data rains down on us. ... highways that will enable us to drive happily into the future. Open Science is central to this; yet despite the fears of many, practicing it does not preclude commercialization. Gated communities representing well-crafted solutions and user experiences will always sell, and technology integration services that move ideas into practice offer significant business opportunities. Without Open Science, we run the risk of creating feudal societies that lock innovation behind intellectual walls, with closed data repositories that we don’t share and software that we have to needlessly reinvent. In the future, success should be measured not only by impact factors, but by how well research output is shared and reused – to see further, we must produce results that enable others to stand on our shoulders.”

UK to science publishers: don't follow recording industry down the tubes

Posted: 05 May 2012 06:50 AM PDT

 
UK to science publishers: don't follow recording industry down the tubes
arstechnica.com
“There's been a growing push to get more scientific research out from behind paywalls. The federal government, private funding bodies, and a number of research institutions have all adopted policies that either mandate or encourage placing papers where the public can view them. Now, it appears that the UK is considering following suit. In addition to planning to make its researchers' publications available, the country's science minister has asked Wikipedia's Jimmy Wales to advise it on how to make the underlying data accessible. The announcements came in a speech by David Willetts, the UK's Minister of State for Universities and Science. Willetts was pretty blunt about access to government-funded research, saying, ‘As taxpayers put their money towards intellectual inquiry, they cannot be barred from then accessing it.’ Like the other bodies that have formulated open access policies, however, he recognizes the value that publishers add to research publications. Publishers both exercise editorial control—deciding which research is significant enough to highlight through publication—and arrange for peer review to integrate the advice of multiple reviewers. And the UK also has a vested interest in their continued viability. London is a major hub for the publishing industry, and Willetts isn't looking to simply disrupt the existing system: "Provided we all recognize that open access is on its way, we can then work together to ensure that the valuable functions you carry out continue to be properly funded—and that the publishing industry remains a significant contributor to the UK economy." To that end, he's interested in expanding the use of two methods that are already in use, which he terms "green" and "gold." Green means that the publishers will get a period of time where they offer access exclusively to their subscribers, after which the paper is made open access (that's how the NIH's plan works). The alternative, gold approach, would be to publish in journals that provide immediate open access for a fee, with the fee being built into research budgets. That approach is already used by the UK's Wellcome Trust, a major funder of biomedical research... The Wellcome Trust isn't waiting for publishers to get with the program, though. It recently partnered with the Howard Hughes Medical Institute and the Max Planck Society to found a fully open access journal called eLife... Willetts recognizes that there are some potential hang-ups with this approach. Research doesn't go out of date as quickly in all fields, so the length of the publishers' exclusive period may need to be adjusted. In some areas (Willetts cites local history), amateurs and independent researchers make major contributions, but won't have a budget for paying to make their work open access. Finally, the UK is leery of forcing its researchers into an open access plan when other countries haven't adopted a similar one. To that end, it's going to be discussing matters with US agencies and other nations in the EU. But publications are only part of the story, and Willetts intends to focus on the rest of the research—the underlying data, things that don't get published, etc. He intends to set up a portal that will list every government-funded research project, and provide access to its papers, any databases created, etc., and to make sure the information is easy to maintain, modify, and share... Lest the publishers think they can wait this one out, Willetts had a rather stark warning for them: adapt, or bad things will happen. 
‘To try to preserve the old model is the wrong battle to fight,’ Willetts said. ‘Look at how the music industry lost out by trying to criminalize a generation of young people for file sharing. It was companies outside the music business such as Spotify and Apple, with iTunes, that worked out a viable business model for access to music over the web. None of us want to see that fate overtake the publishing industry.’”

Australian scientists call for research data to be open source

Posted: 05 May 2012 06:49 AM PDT

 
Australian scientists call for research data to be open source
www.ip-192.com
“Hackers should be role models for freeing up access to the ‘source code’ of clinical trials - patient-level data - scientists from the University of New South Wales (UNSW) in Australia argue. They are calling for the open sharing of clinical trial data in the medical research community, saying it would be instrumental in eliminating bottlenecks and duplication, and lead to faster and more trustworthy evidence for many of the most pressing health problems. Hackers revolutionized the software industry by countering the economic and cultural motivations that drove closed source software and disengagement from user needs, the researchers say. ‘Similar roadblocks plague the clinical evidence domain where, despite a rapid increase in the volume of published research, physicians still make decisions without access to the synthesized evidence they need,’ said Dr. Adam Dunn, a UNSW Australian Institute of Health Innovation Research Fellow and co-author of the study that was first published in the journal Science Translational Medicine. The call follows a wider push for free, open access to academic publications and intellectual property rights designed to turn more university research into real-world applications. Open source communities often out-perform their closed source counterparts, most notably in the software community where millions of programmers contribute code that can be used for free, by anyone. ‘If the same principles were applied to medical research, bottlenecks, biases and self-interest would be largely removed,’ said Professor Enrico Coiera, a co-author on the paper. ‘Clinical trial data is a potential goldmine. If researchers, doctors and patients were able to re-analyze and pool this data, there would be a host of questions that could start to be answered. But these meta-analyses are very uncommon because researchers and companies don’t like to share data. One solution, which has no support, is for data to be pirated. No one would win in that scenario. But everyone could be a winner if clinical research data went open source.’ While there are technical challenges around building an open source community for clinical trials, including important considerations around privacy and data quality, ‘these could be easily overcome,’ Dunn said. Less easy to overcome are the social and financial barriers. ‘Most researchers want to hold their data as long as they can as the basis for publications,’ Dunn said. ‘And unfortunately, pharmaceutical companies want to control the messages that are delivered to doctors and maximize profits rather than facilitate the cost-effective delivery of care.’”

Science research may be freed from journals’ ‘unhealthy’ paywalls — paidContent

Posted: 05 May 2012 06:48 AM PDT

 
Science research may be freed from journals’ ‘unhealthy’ paywalls — paidContent
paidcontent.org
“The UK government has told academic journal publishers it will make freely available online the publicly-funded research they currently charge for, labelling ‘paywalls’ ‘deeply unhealthy’. The news will prove unpopular with academic publishers, which license and peer-review researchers’ work and charge libraries to make it available. ‘As taxpayers put their money towards intellectual enquiry, they cannot be barred from then accessing it,’ science minister David Willetts said in a speech to the Publishers Association on Wednesday (transcript). ‘They should not be kept outside with their noses pressed to the window – whilst, inside, the academic community produces research in an exclusive space.’ A group led by Dame Janet Finch will shortly advise the government on how to accomplish Willetts’ aims online. But Willetts revealed it is likely to moot a ‘green’ option, which would see journal publishers granted a short exclusive window on publishing publicly-funded research, and a ‘gold’ option, under which the research would be openly available from the start... ‘I realise this move to open access presents a challenge and opportunity for your industry, as you have historically received funding by charging for access to a publication,’ Willetts told publishers. ‘Nevertheless, that funding model is surely going to have to change … To try to preserve the old model is the wrong battle to fight. Look at how the music industry lost out by trying to criminalise a generation of young people for file sharing.’ Many researchers were already revolting against health and science journal publisher Reed Elsevier for selling bundles of journals containing their work, rather than individual journals, to libraries. Over 11,000 people have signed a petition... The UK is currently creating a portal, Gateway To Research, to provide links to published publicly-funded research and some of the data sets which underpin them. Jimmy Wales is advising on format standards... The UK government is aligned with the European Commission, which has previously said it wants to see more free access to publicly-funded research and more open data, and claims the US Committee on Economic Development is moving in the same direction. International consensus on the moves would be important, else UK researchers could find themselves giving away their research to the world online whilst having to pay to access research from other countries, Willetts said.”

Researchers call for clinical trial data to be open source

Posted: 04 May 2012 08:35 PM PDT

 
Researchers call for clinical trial data to be open source
www.theaustralian.com.au
“MEDICAL researchers are calling for all clinical trial information to be openly shared with the entire research community to eliminate bottlenecks and duplication. ‘Despite a rapid increase in the volume of published research, physicians still make decisions without access to the synthesised evidence they need,’ said Adam Dunn, a research fellow with the Australian Institute of Health Innovation at UNSW. In a paper published in the journal Science Translational Medicine, Dunn and his co-authors say open access to clinical data could revolutionise the medical research model of silos and protected IP. ‘Clinical data is a potential goldmine,’ said Enrico Coiera, a co-author on the paper along with Ric Day and Harvard’s Kenneth Mandl. ‘If researchers, doctors and patients were able to reanalyse and pool this data, there would be a host of questions that could start to be answered. But these meta-analyses are very uncommon because researchers and companies don’t like to share data.’ The paper comes as traditional publishing houses are coming under increasing pressure from governments’ and universities’ endorsements for publicly funded research to be made available on open source platforms. This week alone, the UK government said it had enlisted the services of Jimmy Wales, the founder of Wikipedia, to help design a platform to ensure all taxpayer funded research is freely available. Harvard and MIT also announced this week they would jointly invest $US60 million ($58m) in a non-profit open source platform called edX. At a press conference, Harvard president Drew Faust and MIT president Susan Hockfield emphasised that online courses wouldn’t take away from traditional on-campus education. They said one of the goals of edX was to further research on education, such as developing online tools that could be used by on-campus students.”

Crossing the Rubicon — Is the UK Going to Enable Open Access for All Taxpayer-Funded Research by 2014?

Posted: 04 May 2012 08:21 PM PDT

 
Crossing the Rubicon — Is the UK Going to Enable Open Access for All Taxpayer-Funded Research by 2014?
David Smith
The Scholarly Kitchen, (03 May 2012)
“Yesterday, the UK Government Science Minister, David Willetts, delivered a keynote speech to the Publishers Association Annual General Meeting. What he outlined is nothing less than the desire to profoundly restructure the way UK taxpayer-funded research is disseminated. You can read his thinking for yourself as he has rather helpfully provided a companion opinion piece for the Guardian. He refers to the proposals as a seismic shift for academic publishing. Well, at least he’s not understating the government’s proposals. So what is being proposed? Quite a lot, as it turns out. ‘We [the UK government] will make publicly funded academic research free of charge to readers. . . . [This will] usher in a new era of academic discovery and collaboration and will put the UK at the forefront of open research.’ There’s no ambiguity there. The coalition government has been making noises for some time, but this is a clear statement of intent. ‘The challenge is how we will get there without ruining the value added by academic publishers.’ Quite. I’m not an anti-OA ideologue. My concerns with OA revolve around the long-term stability of the business model and issues to do with how quality filtering can best work... ‘We still need to pay for . . . functions [such as peer review], which is why one attractive model [Gold OA] has the funders of research covering the costs.’ Well, I think publishers offer far more value than just peer review. Mind you, if we haven’t done a good enough job of articulating the value-add, then it’s easy to see why it tends to boil down to this one issue... ‘Another approach, known as green, includes a closed period before wider release during which journals can earn revenues.’ Um, no. But let’s move on. This is the NIH model, of course, and I suspect a sentence or two has been removed, so we’ll just assume that he’s talking about a repository here... ‘Moving from an era in which taxpayer-funded academic articles are stuck behind paywalls for much of their life to one in which they are available free of charge will not be easy.’ Indeed! And let us study recent history, where OA has in fact carved out a meaningful niche in the overall ecosystem of scholarly publishing. Like all niches, it’s notable for its complete absence in some areas. ‘If those funding research pay open-access journals in advance, where will this leave individual researchers who can’t cover the cost?’ If anybody has any thoughts on what this means, put them in the comments below — I’m wondering who this applies to: citizen scientists? ‘If we improve the world’s access to British research, what might we get in response?’ I do wonder how much of an improvement he’s expecting to get. If this is an oblique reference to the general public’s access to research, I’d predict pressure to not fund some areas of basic research that are hard for the lay public to understand — things like the laser, for example, or James Clerk Maxwell’s work on electromagnetic theory. I mention those only because they are the basis for all of our modern information transmission systems and they had absolutely zero industrial/economic applications when they were first made public (see Dame Janet Finch, below). Oh yes, and they are insanely difficult for mere mortals to understand. ‘Does a preference for open access mean different incentives for different disciplines?’ ...
Willetts has asked Dame Janet Finch to produce a report setting out the steps needed to fulfill ‘our radical ambition.’ Apparently, she is working with ‘all interested parties,’ and her report will appear before the summer. I take this to mean that the report will be published before the House of Commons goes to recess on July 27, 2012... Dame Janet Finch is one of the four panel chairs for the Research Excellence Framework (REF), which replaces the Research Assessment Exercise. If you are not familiar with it, the REF assesses the research output of UK higher education institutions, and then money is doled out on the basis of how they rank. Now, one of the controversial aspects of the REF is the focus on the measurement of the ‘economic impact’ of research. The REF has been delayed until 2014 in order to assess the efficacy of the impact measure. The timelines for these two things would seem to overlap considerably, and it’s not too difficult to see why Dame Janet has been picked to report back. For reference, the next UK General Election is scheduled to be held on May 7, 2015. Any legislation, therefore, has to be completed by early April 2015. I think this is why the Guardian (who have clearly been further briefed) are saying that whatever it is will be up and running by 2014. Willetts has also stated that Jimmy Wales will be advising on the common standards that will have to be agreed for open access to be a success. Frankly, I think this is a poor decision. Wales does have some considerable expertise, but then there’s a pretty long list of other people and organisations who have been doing some rather important work on standards and processes and best practices. And not for nothing, there is an industry out there that did, y’know, put much of the current content up online and built various discovery and access services on top of it. But hey, what do we know? ... To its credit, the current coalition government has ring-fenced science spending (though not the other areas of academic research, apparently covered by this sweeping announcement) through the lifetime of this parliament. But past governments of whatever ideology have not exactly had a good track record when it comes to the stability of scholarly funding. Science in particular has been a dirty word in Westminster at many times. Previous governments have sacked scientific advisors when their evidence-based advice did not match what the government wished to do. I’m distinctly uncomfortable with the idea that research dissemination should be under the direct control of the government of the day; there’s too much opportunity for conflict of interest. Whether you are an OA advocate or not, I hope that you will be paying very close attention to this issue. Governments don’t always like the cold hard light of evidence. Reading between the lines, it would appear that a UK government-funded research repository consisting of articles plus all the associated data and some sort of communication layer is what is being proposed. I don’t see how this squares with the aims of preserving the services that need to be paid for, such as peer review. Logically, one would surely seek to bolt peer review onto the repository, which would rule out Gold OA as a model. By the same thinking, Gold OA would surely make the repository idea a waste of time. The data publication angle is a very interesting one as well.
There’s very little information to go on here, but putting Jimmy Wales together with the word ‘data’, I’d venture to suggest that Willetts is arguing for all research data associated with a publication to be made open access as well. That is truly revolutionary. Now, as Erasmus once wrote, one UK ministerial speech does not UK government policy make. However, the intention is clearly there. Whether the coalition has the political will to move on this is open to question, as are any legislative steps that need to be taken (and the time available for that to happen). The Rubicon has not been crossed yet, but scouts have most definitely been sent up and down the river to assess the best place to start the process. Watch this space. Carefully.”

Publishers Support Sustainable Open Access

Posted: 04 May 2012 08:21 PM PDT

 
Publishers Support Sustainable Open Access
www.stm-assoc.org
[From the International Association of Scientific, Technical, and Medical Publishers] “Publishers are committed to the widest possible dissemination of and access to the content they publish. We support any and all sustainable models of access that ensure the integrity and permanence of the scholarly record. Such options include ‘gold’ open access, whereby publication is funded by an article publishing charge paid by the author or another sponsor, a subscription-based journal, or any one of a number of hybrid publishing options. Most publishers now offer open access options and publish open access journals, and work closely with funders, institutions and governments to facilitate these developments. Gold open access provides one approach toward our shared goal of expanding access to peer-reviewed scientific works and maximizing the value and reuse of the results of scientific research. We believe that authors should be able to publish in the journal of their choice, where publication will have the greatest potential to advance their field. Institutions and funders have a key role to play in ensuring that public access policies allow for funding of peer reviewed publication and publishing services in whatever journal an author chooses. Publishers look forward to working with all stakeholders to achieve this goal and to advance scholarly communication. Signatories as of 8 March 2012...”

Harvard joins MIT in platform to offer massive online courses | Inside Higher Ed

Posted: 04 May 2012 08:19 PM PDT

 
Harvard joins MIT in platform to offer massive online courses | Inside Higher Ed
www.insidehighered.com
“After a whirlwind nine months that has witnessed a rapid rebirth of online education at elite U.S. universities in the form of massively open online courses, or MOOCs, Harvard University threw its hat into the ring Wednesday -- along with the largest investment yet in technology aimed at bringing interactive online education to hundreds of thousands of students at a time, free. Harvard will be piggybacking on MITx, the platform the Massachusetts Institute of Technology has developed for its own MOOCs, the universities jointly announced. The combined venture will be a nonprofit called edX. Harvard and MIT together have committed $60 million to the project, which is likely more than the combined venture funds raised by Coursera, Udacity and Khan Academy. Like the open courses being developed by MIT, Harvard’s open, online courses will be taught by the same professors who preside over the classroom versions. The courses will have to go through an approval process at each campus to make sure they measure up to standards of rigor and usability. ‘Certificates of mastery will be available for those motivated and able to demonstrate their knowledge of the course material,’ the universities said in a release... There is no word on what Harvard and MIT courses are next in line for MOOC adaptation, but university officials indicated that the edX offerings will include courses in humanities, social science and natural science. The edX platform will be open source, ‘so it can be used by other universities and organizations who wish to host the platform themselves,’ according to the release. While edX will initially play host to adapted versions of courses from MIT and Harvard, the institutions expect it to become a clearinghouse for open courses offered by various institutions. ‘MIT and Harvard expect that over time other universities will join them in offering courses on the edX platform,’ the universities said. ‘The gathering of many universities’ educational content together on one site will enable learners worldwide to access the course content of any participating university from a single website, and to use a set of online educational tools shared by all participating universities.’ Harvard is hardly the first top university of late to announce a foray into large-scale open teaching. Stanford and MIT made their online moves last year, and Princeton University, the University of Pennsylvania, the University of Michigan at Ann Arbor, and the University of California at Berkeley are preparing to open their first MOOCs with help from Coursera, a venture-backed company started by two Stanford engineering professors. The string of announcements represents an online education renaissance among top-tier U.S. universities... Harvard and MIT say one of their main goals with edX is to generate learning data that the universities can share freely with education researchers. The MITx platform, which will serve as the technology platform for edX, ‘already has a lot of mechanisms for understanding how students are learning,’ said Anant Agarwal, a computer science and engineering professor at MIT and the first president of edX. ‘These data will be available to researchers at MIT and Harvard and other universities around the world,’ he said. The combination of the data-rich online medium and the scale edX hopes to achieve will ‘enable [education researchers] to ask very different questions than we’ve been able to ask before,’ said Alan Garber, the provost at Harvard.
By crunching granular data on the activity of students in the edX environment, educators will be able to get a sense not only of how well they perform on high-stakes tests but also ‘how well they acquire and apply the information months after a class has ended,’ Garber said. In a subtle swipe at the proprietary companies (like Coursera and Udacity) that have also built platforms through which top-tier universities can run MOOCs, L. Rafael Reif, the MIT provost, suggested that the ethic of transparency and public-mindedness Harvard and MIT bring to the table will make edX a more generous and responsible curator of the learning data that MOOC platforms will accumulate. In an e-mail to Inside Higher Ed on Wednesday, Daphne Koller and Andrew Ng, the co-founders of Coursera, called the nonprofit vs. for-profit argument ‘a red herring and a non-issue.’ Coursera, which has raised $16 million from outside investors, recently announced that it will be staging MOOCs on behalf of several of Harvard and MIT’s peers, including Princeton, Stanford and Penn. Of all the recent entrants to the business of MOOC hosting, Coursera seems most likely to compete directly with edX. (Udacity and Udemy work with individual instructors, not institutions; and Khan Academy does not work directly with either.) Koller and Ng emphasized that despite its status as a for-profit company, Coursera was founded on educational principles that its two professorial founders carried over from their academic posts at Stanford. ‘All ... courses on the Coursera platform are free to students worldwide,’ Ng wrote. ‘Our partnerships with universities are non-exclusive and have no time commitment. Universities and their instructors retain the IP on the content they produce. Our partners have all chosen Coursera because they believe it provides students with the best learning experience.’ In a phone call with reporters following Wednesday’s news conference, Agarwal, the edX president, said there is plenty of demand for high-quality, free, interactive online courses. ‘The more of these there are, the better,’ he said.
However, officials at MIT and Harvard said that eventually edX will have to figure out a way to be self-sustaining, and it does not have a business plan -- another aspect it shares with Coursera... ‘MIT and Harvard will use the jointly operated edX platform to research how students learn and how technologies can facilitate effective teaching both on-campus and online,’ the universities said in a release. ‘The edX platform will enable the study of which teaching methods and tools are most successful.’ In what has quickly become one of the strongest undercurrents of the messaging from these and other top institutions, the presidents of both colleges took care to underscore their commitment to the on-campus student experience during a news conference...”
