Connotea: Bookmarks matching tag oa.new (50 items) |
- Persistent myths about open access publishing
- Natural England makes GI datasets available under Open Government Licence - Directions Magazine
- Don't Pay Twice
- EC-Survey on Scientific Information in the Digital Age: Open Science matters!
- Wellcome joins chorus calling for free online access to medical research - FierceBiotechIT
- Professor starts e-text company to electrify textbook field
- Around the Web: Some resources on the Panton Principles & open data
- Changes Coming For Open Access To Research In Europe | Intellectual Property Watch
- Clinical Trial Data Should Be Open for Review - in Infectious Disease, Flu & URI from MedPage Today
- Research Data Curation Bibliography
- BBC News - Data wars: Unlocking the information goldmine
- Some Observations on the Practice of Open Data as Opposed to Its Promise
- How open is corporate data in Open Government Partnership countries?
- Trust Acts to Open Research Findings to the Public - NYTimes.com
- How to shed the predatory label? Open peer review!
- Radio National PM April 11 2012 Open Access by mattoddchem on SoundCloud - Create, record and share your sounds for free
- Carl Malamud's Box of Goodies - On The Media
- OSDD Malaria Meeting, University of Sydney February 2012 - YouTube
- Elsevier Boycott Nears 10,000 Signatures
- The Digital Public Domain: Foundations for an Open Culture | International Communia Association
- Study: Sharing Patents, Rather Than Blocking Others, Encourages Innovation And Market Success | Techdirt
- The “Academic Spring” — Shallow Rhetoric Aimed at the Wrong Target
- Gekkan DRF: Digital Repository Federation Monthly
- Toward an Index of All Stories: Previewing Small Demons - ProfHacker - The Chronicle of Higher Education
- 21st century smarter government is 'data-centric' and 'digital first,' says US CIO - O'Reilly Radar
- Norton Scientific Journal: Researchers Call for Open-access Journals
- World Bank decides to make its research open access
- One Man's Quest to Make Information Free - Businessweek
- Elsevier responds to my text mining request
- Memorandum of Understanding between DataCite and re3data.org
- Academic publishing: Open sesame | The Economist
- PhD students: Don’t ‘occupy publishing’, just do your bit
- Scientists use public-database search to identify novel receptor with key role in type-2 diabetes - Office of Communications & Public Affairs - Stanford University School of Medicine
- Point of No-Return for Open Access
- Why PMC and UKPMC Should Harvest From Institutional Repositories
- FRPAA v. America Competes Act
- EU and World Bank step up pressure to make research available for free
- Scam Publisher Fools Swedish Cranks
- A (free) roundup of content on the Academic Spring
Persistent myths about open access publishing Posted: 17 Apr 2012 02:04 PM PDT Science news, comment and analysis | guardian.co.uk, (17 Apr 2012) “A spate of recent articles in the Guardian have drawn attention to lots of reasons why open access to research publications is reasonable, beneficial and even inevitable. But two recent letters columns in the Guardian, headlined "Information that we want to be free" and "Better models for open access", have perpetuated some long-running misconceptions about open access that need to be addressed. It's not surprising that for-profit, barrier-based publishers are fighting to stem the tide, by misinformation if necessary, but researchers and the general public need not be taken in. Richard Mollet, chief executive of the Publishers Association, claims that ‘publishers shoulder the administrative burden of filtering three million submissions to 20,000 journals.’ They do not: researchers, donating their time, do this. Publishers' role in the peer-review process is two steps removed from the coalface: they do not pay peer reviewers, nor in most cases do they pay the editors who handle the reviews, but only the administrative layer above the editors. Publishers and their representatives consistently perpetuate the idea that they provide peer review. We must recognise this claim for the landgrab that it is. Dr Robert Parker, chief executive of the Royal Society of Chemistry (which is itself a barrier-based publisher, though a not-for-profit one) points out that, ‘Open access does not mean free, as many readers may have assumed, with many costs involved including managing systems and content.’ Of course management and infrastructure can't be provided at zero cost – no one has claimed it can – but the important point is that open access is much more cost-efficient.
For Elsevier, the biggest of the barrier-based publishers, we can calculate the total cost per article as £1,605m subscription revenue divided by 240,000 articles per year = £6,689 per article. By contrast, the cost of publishing an article with a flagship open access journal such as PLoS ONE is $1,350 (£850), about one eighth as much. No one expects open access to eliminate costs. But we can expect it to dramatically reduce them, as well as making research universally and freely available. In a recent article in the New Statesman, Dave Carr and Robert Kiley of the Wellcome Trust explain how the Human Genome Project placed its results in the public domain, and that the result has been that a $3.8bn project has achieved $796bn of economic impact with enormous implications for health care. These are the kinds of economic and societal opportunities that become possible when barriers to exploiting the results of research are taken down. We need to look beyond open access's economic effect on existing barrier-based publishers and see the effect it will have on society as a whole. Rick Bradford complains that under an author-pays open access regime ‘only people affiliated to an organisation which will pay on their behalf can publish.’ A. G. Gordon of London amplifies: ‘Had open-access journals been present in the past ... Einstein, as a patents clerk in 1905, would never have been able to afford to publish his four ground-breaking papers.’ Both Bradford and Gordon are evidently unaware that many open-access journals are free to authors as well as readers, being funded by government bodies, professional associations or similar ... Further, many fee-paying open access publishers offer waivers to authors without institutional funding... Finally, in his letter to the Guardian Prof Gordon McVie suggested that: ‘The Wellcome Trust can take a lead by removing impact factor as one of the major determinants of screening grant applications.’ Happily, it has already done so. 
Its open access policy states that the trust ‘affirms the principle that it is the intrinsic merit of the work, and not the title of the journal in which an author's work is published, that should be considered in making funding decisions’. This is a very important development. Probably the greatest impediment to more universal open access at the moment is researchers' fear that unless they place their work in established, prestigious, barrier-based journals, they will be at a disadvantage when competing for grants. This places the researcher's interest directly in opposition to the community's, which needs the work to be as widely available as possible. By promising to evaluate grant applicants on the quality of their work rather than the brand name with which it's associated, funding bodies can work to remove this fear.” |
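The cost comparison in the excerpt above is simple arithmetic and easy to check. A minimal back-of-envelope sketch, using only the figures quoted in the article (the £850 figure is the article's own conversion of the $1,350 PLoS ONE fee; variable names are illustrative):

```python
# Figures as quoted in the article, not independently verified.
elsevier_revenue_gbp = 1_605_000_000  # annual subscription revenue (£1,605m)
elsevier_articles = 240_000           # articles published per year
plos_one_fee_gbp = 850                # PLoS ONE fee, quoted as $1,350 (~£850)

# Division reproduces, to within rounding, the quoted per-article cost.
cost_per_article = elsevier_revenue_gbp / elsevier_articles  # 6687.5

# Ratio of subscription cost to the open access fee: roughly 8x,
# matching the article's "about one eighth as much".
ratio = cost_per_article / plos_one_fee_gbp

print(f"Cost per subscription article: £{cost_per_article:,.0f}")
print(f"Ratio to PLoS ONE fee: {ratio:.1f}x")
```

The exact quotient is £6,687.50, so the article's £6,689 presumably reflects unrounded revenue or article counts; either way the headline conclusion (subscription publishing costing roughly eight times the open access fee) holds.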
Natural England makes GI datasets available under Open Government Licence - Directions Magazine Posted: 17 Apr 2012 02:03 PM PDT www.directionsmag.com “From the 1st April 2012 Natural England has made its publicly available Geographic Information datasets available under the Open Government Licence. This means that datasets about areas of significance for the natural environment such as our protected site boundaries, habitat inventories, open access land and scheme agreements are now made available under a perpetual licence for commercial and non-commercial reuse. The Open Government Licence is designed to allow anyone - businesses, individuals, charities and community groups - to re-use public sector information without having to pay or get specific permission. It is designed to ensure everyone benefits from government information published under the UK Government’s transparency agenda. We are now able to use the licence, as we’ve secured copyright exemptions from Ordnance Survey under the Public Sector Mapping Agreement. Details of our datasets and the licence are available here. This change means that our datasets can be used in services such as Google Earth and in your own websites, services and applications without restrictions so long as data isn’t used in a misleading way and that you make a link to the licence on our website. We would welcome feedback and thoughts on how this will change the way our data can be used. Natural England is the government’s independent advisor on the natural environment. Established in 2006, our work is focused on enhancing England’s wildlife and landscapes and maximising the benefits they bring to the public. [1] We establish and care for England’s main wildlife and geological sites, ensuring that over 4,000 National Nature Reserves and Sites of Special Scientific Interest are looked after and improved.
[2] We work to ensure that England’s landscapes are effectively protected, designating England’s National Parks, Areas of Outstanding Natural Beauty and Marine Conservation Zones, and advising widely on their conservation. [3] We run England’s Environmental Stewardship green farming schemes that deliver over £400 million a year to farmers and landowners, enabling them to enhance the natural environment across two thirds of England’s farmland. [4] We fund, manage, and provide scientific expertise for hundreds of conservation projects each year, improving the prospects for thousands of England’s species and habitats. [5] We promote access to the wider countryside, helping establish National Trails and coastal trails and ensuring that the public can enjoy and benefit from them.” |
Don't Pay Twice Posted: 17 Apr 2012 02:02 PM PDT www.dontpaytwice.org [Use the link above to access the “Citizen Campaign supported by the American Medical Student Association and Universities Allied for Essential Medicines.” The Campaign is supporting the Tax Day Call for Action, April 17, 2012 and the Right to Research Coalition’s FRPAA Day of Action, April 25, 2012. The post provides links, making it easy for participants to email Congress, tell friends, and learn more.] A brief overview of the campaign reads as follows: “[1] Earlier this year, the publishing industry attempted to take away the public’s access to taxpayer-funded research through the Research Works Act (H.R. 3699). [2] The bill threatened to charge the American public twice for taxpayer-funded research: once to fund the research and again to see the results. It would have deprived patients and physicians of the latest advances and slowed the pace of innovation so central to American leadership in science and technology. [3] Halted by the efforts of patients, students, and taxpayers, the Research Works Act was pulled by its legislative sponsors just hours after Reed Elsevier withdrew its official support of the bill. In their strategic withdrawal, Reed Elsevier made clear it was not motivated by a change of heart, but by the overwhelming opposition and outcry to the bill. [4] But the job to bring public access to taxpayer-funded research is not done. Support the Federal Research Public Access Act (H.R. 4004/S. 2096): shorten the embargo period to public access to no more than six months, not a year, and extend this practice from NIH to other key federal research agencies. [5] It's the right thing for students, patients, and the economy.” |
EC-Survey on Scientific Information in the Digital Age: Open Science matters! Posted: 17 Apr 2012 02:01 PM PDT EDaWaX, (17 Apr 2012) “...the European Commission has just published the results of a consultation regarding accessibility and preservation of digital publications and research data in the European Union. Commissioner Neelie Kroes, responsible for the digital agenda for Europe, launched this consultation in July 2011 to seek views on access to and preservation of digital scientific information – to be more precise, the survey broached the issues of Open Access for scientific publications, accessibility of research data and digital long-term preservation. The purpose of the consultation was to gather information from as many sources as possible and receive important input for the future development of policy options in the area of scientific information in the digital age. 1,140 persons from 42 countries responded to the online survey. The largest group of respondents was from Germany (422 answers), followed by people from France (129) and the UK (127). The majority of respondents were individual researchers (37.6 %). The remainder were citizens (27.5 %), university/research institutes (8.4 %), libraries (7.3 %), publishers (6.4 %), international organisations (4.3 %), research funding organisations (1 %) and national, regional or local governments (0.8 %). Regarding the most relevant field for us – access to research data – the report stated that ‘…the vast majority of respondents (87 %) disagreed or disagreed strongly with the statement that there is no access problem for research data in Europe.’ The major barriers to accessing research data are: [1] lack of funding to develop and maintain the necessary infrastructures (80 %), [2] the insufficient credit given to researchers for making research data available (80 %), [3] and insufficient national/regional strategies/policies (79 %).
Almost 90 % of respondents held the opinion that research data that is publicly available or results from public funding has to be, as a matter of principle, available for reuse and free of charge on the Internet. Even when launching the survey, Mrs. Kroes had a clear vision regarding accessibility and availability of publicly funded research: ‘The results of publicly funded research should be circulated as widely as possible as a matter of principle. The broad dissemination of knowledge, within the European Research Area and beyond, is a key driver of progress in research and innovation, and thus for jobs and growth in Europe. Our vision is Open Access to scientific information so that all of us benefit as much as possible from investments in science. To accelerate scientific progress, but also for education, for innovation and for other creative re-use. For the same reason we must preserve scientific records for future generations’”... |
Wellcome joins chorus calling for free online access to medical research - FierceBiotechIT Posted: 17 Apr 2012 01:57 PM PDT www.fiercebiotechit.com “Wellcome Trust no longer wants to pay for medical research that ends up guarded behind a pay wall, and the U.K.'s largest private funder of medical research is considering several ways to bring a proverbial wrecking ball to such pay walls and make research papers available for free online under an open-access framework. The charity's efforts are a direct assault on the model that the publishers of go-to journals such as Nature and Science have used to charge universities millions of dollars for access to academic research. Wellcome, which pumps more than £600 million ($952 million) per year into medical research, aims to attach some strings to its grants that would, for instance, require that research funded by the charity become available free to the public within 6 months of publication, The Guardian reports. Evidently, more than 9,000 researchers have pledged their support to a grassroots effort dubbed the ‘academic spring’ that calls for a boycott of pay-only academic journals and support of open access publishing, according to The Guardian. Wellcome isn't endorsing a boycott of the established journals that charge subscription fees, but the charity has backed a plan to create an open-access journal called eLife that would compete with Nature and Science. And that move alone could propel the ‘academic spring’ movement." |
Professor starts e-text company to electrify textbook field Posted: 17 Apr 2012 01:57 PM PDT JSOnline.com Business, (16 Apr 2012) “M. Ryan Haley, a University of Wisconsin-Oshkosh economics professor, has started a company he hopes will undercut the academic textbook publishing industry and help college students save a lot of money. CoreTxt Plus Inc. is distributing a free digital statistics textbook to UW-Oshkosh students. ‘We bypassed the middleman, which is the people making all the money off our students,’ Haley said. ‘They're putting new editions out every few years now, and it's absurd. Statistics hasn't changed in 150 years.’ Haley estimates the e-text has saved UW-Oshkosh students taking the economics and business statistics class $100,000 to $150,000 during the four semesters it has been used. The textbook was written by Haley and three other professors under a grant from the U.S. Department of Education. It includes study questions written by business professors across the campus who will be teaching the students in future semesters. The book was reviewed by academics at other schools, the same way publishing houses review texts, Haley said. Professors who use the book can customize about 15% of it to their teaching style and needs, Haley said. Students who are more comfortable with a paper textbook can go to the university's copy center and have one made for about $15, he said. CoreTxt has copyrighted the content of the book with help from WiSys Technology Foundation Inc., the tech transfer agent for the UW System, said Maliyakal E. John, WiSys managing director. ‘The opportunity we see is that we can turn around these types of expert books in a shorter time frame, and the students are tremendously served,’ John said.
CoreTxt is looking to sell the book at other schools at a very low cost and to create digital textbooks for other large-enrollment, introductory-level classes, Haley said...” |
Around the Web: Some resources on the Panton Principles & open data Posted: 17 Apr 2012 01:53 PM PDT Confessions of a Science Librarian, (16 Apr 2012) [Use the link above to access the complete list of resources described in the blog post.] “As part of a workshop on Creative Commons, I'm doing a short presentation on Open Data and The Panton Principles this week to various members of our staff. I thought I'd share some of the resources I've consulted during my preparations. I'm using textmining of journal articles as a example so I'm including a few resources along those lines as well.” |
Changes Coming For Open Access To Research In Europe | Intellectual Property Watch Posted: 16 Apr 2012 01:51 PM PDT www.ip-watch.org “Pressure is growing in Europe for open, free access to research results, particularly if they are publicly funded. The European Commission (EC) said this week it will propose a plan for open access soon, while the Wellcome Trust and Research Councils UK are cracking down on researchers who don’t comply with their policies. Changes in the value chain enabled by the internet make sharing of scientific knowledge economically possible, European Digital Agenda Commissioner Neelie Kroes said at an 11 April meeting of the European Federation of Academies of Sciences and Humanities in Rome. Open access (OA) should apply at least to all research partly funded by taxpayers, and indeed it holds true for all scientific and scholarly research as well, she said. The EC is readying a communication and recommendation on the way forward on OA to research results, Kroes said. It will reflect the EC decision to make all outputs funded under the EU Horizon 2020 Research and Innovation program openly accessible, she said. The proposal will also examine the role of electronic infrastructures in supporting OA, and how to motivate researchers to share, she said. There are limits to openness and costs associated with it, such as personal data protection, Kroes said. There may sometimes be security reasons for not disseminating research widely, and private investments to defend. ‘But for me, these are exceptions, not the rule,’ she said. The only clear foes of open access are some major scientific publishers who believe their “raw material” may dry up and their subscription revenues from non-open access journals fall once libraries need fewer subscriptions, said Kroes spokesman Ryan Heath. The European Publishers Council doesn’t have a formal policy on OA, which is a business model issue for individual publishers, said Executive Director Angela Mills Wade.
The main challenge the EC faces is to ensure that publicly funded research results ‘actually get deposited in OA repositories by the authors,’ Heath told Intellectual Property Watch. Best practices include funders such as the Wellcome Trust, and institution mandates such as the University of Liege, which require deposit and will only take into account deposited research for performance reviews, he said. But there is also the problem of getting publishers to offer ‘transition paths,’ such as the opportunity to make individual articles in a journal open access via a one-time, up-front payment, Heath said. This is called a hybrid model because other articles in the same journal may stay non-OA, he said. If double-dipping – charging full subscription prices and up-front charges for the OA articles – is avoided, then the hybrid model can ‘offer a glide path’ from a subscription-only to an OA-only scenario without necessarily affecting a publisher’s viability, he said. In the area of scientific data, as opposed to articles, there are no entrenched business models or positions yet, Heath said. The public sector is taking the lead in making the information available and it “can be a win-win situation with publishers,” he said. They have an interest in seeing that the data underlying the articles they publish is available, curated and preserved, he said. The EC will pilot a project that asks funded researchers to make their data open access in fields where that is appropriate, he said. The UK-based Wellcome Trust is preparing to launch a new digital journal, eLife, which it says will serve as a platform for speeding scientific advancement by making results available as quickly as possible, openly and in a way that helps others build upon them. Last year, the trust published an open access policy under which authors of research papers it funds must maximise the opportunities to make their results available for free.
The policy requires electronic copies of any research papers that have been accepted in a peer-reviewed journal, and are supported in whole or part by the trust, to be made available as soon as possible and, in any case, within six months of the journal publishers’ official date of final publication. The trust said it will give grant-holders additional money, through their institutions, to cover OA charges in order to meet its requirements. The policy encourages, and, where it pays an open access fee, requires, authors and publishers to licence research papers so they can be freely copied and re-used, provided such uses are fully attributed. The trust is also bearing down on scientists it funds to make sure they comply with the requirement to make their results publicly available for free within six months, Director Mark Walport said in a 9 April interview with The Guardian. Currently only about 55 percent of research papers to which the trust’s funding contributes are compliant, Head of Digital Services Robert Kiley and Policy Advisor Dave Carr told Intellectual Property Watch in written comments. ‘It is simply not acceptable to us that nearly half of the publications we fund are blocked behind subscription walls,’ they said. Proportionate sanctions are needed to ensure that trust-funded researchers and their institutions take responsibility for making their published research findings freely available in a way that guarantees the greatest possible benefit, they said. The trust recognises that some researchers face legitimate difficulties in complying, and will continue to help them address the issues, they said. Possible new approaches might include a requirement that institutions officially confirm that all publications associated with a grant had been made OA before the final instalment is paid, Kiley and Carr said. Another might be to make future funding conditional on compliance, they said... 
Meanwhile, Research Councils UK (RCUK) continues to seek feedback on proposed revisions to its policy on access to research outputs, spokeswoman Jane Wakefield said. The councils are part of a national working group on expanding access to published research findings, which is expected to propose a plan of action and make recommendations to the governments and others, according to the draft. In the proposed policy, open access means unrestricted, online access to peer-reviewed and published scholarly research papers. Under the changes, open access will include unrestricted use of manual and automated text and data mining tools, and unlimited reuse of content with proper attribution, the RCUK said. The policy will require that research papers paid for by the councils be published in journals that meet their open access standards. In addition, it said, it won’t support researchers who use publishers who embargo papers for more than six months from date of publication (12 months for research paid for by the Arts and Humanities Research Council and the Economic and Social Research Council). The revised policy is expected to be approved this summer, according to the draft.” |
Clinical Trial Data Should Be Open for Review - in Infectious Disease, Flu & URI from MedPage Today Posted: 16 Apr 2012 01:49 PM PDT www.medpagetoday.com “Original clinical study reports, which contain far more detail than published randomized trials, should be made available to independent researchers seeking to verify efficacy and safety claims, a group of Cochrane reviewers argued. In support of this argument, the history of the influenza antiviral oseltamivir (Tamiflu), which was approved by the FDA in 1999, was cited by Peter Doshi, PhD, from Johns Hopkins University in Baltimore, and colleagues in an article published online in PLoS Medicine. In the run-up to the expected H1N1 influenza pandemic in 2009, the U.S. stockpiled a $1.5 billion supply of the drug to be used in case an outbreak occurred before a vaccine could be developed. This action was done in light of claims by the Department of Health and Human Services that the antiviral would reduce hospitalizations, and by the Advisory Committee on Immunization Practices that its use could prevent disease complications. The widespread belief in oseltamivir's efficacy, according to Doshi's group, was based on a meta-analysis of 10 trials conducted by the manufacturer prior to licensure. The authors of that meta-analysis, which was published in 2003, suggested that the drug could reduce the likelihood of secondary complications. But the authors pointed out that the FDA, which was aware of these clinical trials, concluded that oseltamivir had not been shown to reduce complications and required a statement on the drug's label to that effect. The current authors requested the full clinical study reports from Roche, the manufacturer of oseltamivir, with the goal of independently scrutinizing all the available data. They reported receiving 3,200 pages in response. 
They subsequently were able to obtain more data through a Freedom of Information request to the European Medicines Agency, but consider this just a fraction of the manufacturer's data... ‘This information has turned our understanding of the drug's effects on its head,’ the researchers stated. ‘We challenge industry to either provide open access to clinical study reports or publicly defend their current position of [randomized controlled data]...’ However, a group of authors from several European regulatory agencies responded in a PLoS Medicine perspective article that there are valid reasons for not opening trial data up to the public. First is the importance of patient confidentiality, which could be compromised, according to lead author Hans-Georg Eichler, MD, of the European Medicines Agency in London. Secondly, they challenged the notion that so-called ‘independent’ research groups are necessarily free of conflict of interest. ‘Personal advancement in academia, confirmation of previously defended positions, or simply raising one's own visibility within the scientific community may be powerful motivators,’ wrote Eichler and colleagues, also in PLoS Medicine. Furthermore, unrestricted availability of data could lead to promulgation of misleading information and possible health scares among the public. The regulators suggested several steps that could help resolve these conflicts, such as the establishment of adequate, but not excessive, standards of protection for patients and quality requirements for meta-analyses. They also called for openness not only on the part of industry, investigators, and regulators, but also on the part of those who conduct independent data re-analyses.” |
Research Data Curation Bibliography Posted: 16 Apr 2012 01:47 PM PDT [Use the link above to access the bibliography on data curation published by Digital Scholarship. “Established in 2005 by Charles W. Bailey, Jr., Digital Scholarship provides information and commentary about digital copyright, digital curation, digital repository, open access, scholarly communication, and other digital information issues. Digital Scholarship's digital publications are open access. Both print and digital publications are under versions of the Creative Commons Attribution-Noncommercial License. All works include sources that are in English.” More information about Digital Scholarship can be found at <http://digital-scholarship.org/about/overview.htm> “Charles W. Bailey, Jr. is the publisher of Digital Scholarship. He has over 30 years of information and instructional technology experience, including 24 years of managerial experience in academic libraries. From 2004 to 2007, he was the Assistant Dean for Digital Library Planning and Development at the University of Houston Libraries. From 1987 to 2003, he served as Assistant Dean/Director for Systems at the University of Houston Libraries. From 1976 to 1986, he served as the head of the systems department at an academic medical library, a systems librarian at a research library, a technical writer at a bibliographic utility, and a media librarian at an academic media center.” A biographical sketch for Charles W. Bailey, Jr., author of the following bibliography on digital curation, can be found at <http://digital-scholarship.org/cwb/cwbaileyprofile.htm> ] An introduction to the bibliography reads as follows: “The Research Data Curation Bibliography includes selected English-language articles and technical reports that are useful in understanding the curation of digital research data in academic and other research institutions.
For broader coverage of the digital curation literature, see the author's Digital Curation and Preservation Bibliography 2010. The ‘digital curation’ concept is still evolving. In ‘Digital Curation and Trusted Repositories: Steps toward Success,’ Christopher A. Lee and Helen R. Tibbo define digital curation as follows: ‘Digital curation involves selection and appraisal by creators and archivists; evolving provision of intellectual access; redundant storage; data transformations; and, for some materials, a commitment to long-term preservation. Digital curation is stewardship that provides for the reproducibility and re-use of authentic digital data and other digital assets. Development of trustworthy and durable digital repositories; principles of sound metadata creation and capture; use of open standards for file formats and data encoding; and the promotion of information management literacy are all essential to the longevity of digital resources and the success of curation efforts.’ This bibliography does not cover books, digital media works (such as MP3 files), editorials, e-mail messages, interviews, letters to the editor, presentation slides or transcripts, unpublished e-prints, or weblog postings. Coverage of technical reports is very selective. Most sources have been published from 2000 through 2011; however, a limited number of earlier key sources are also included. The bibliography includes links to freely available versions of included works. If such versions are unavailable, italicized links to the publishers' descriptions are provided. Such links, even to publisher versions and versions in disciplinary archives and institutional repositories, are subject to change. URLs may alter without warning (or automatic forwarding) or they may disappear altogether. Inclusion of links to works on authors' personal websites is highly selective. Note that e-prints and published articles may not be identical." |
BBC News - Data wars: Unlocking the information goldmine Posted: 16 Apr 2012 01:41 PM PDT www.bbc.co.uk “In his 1950 paper entitled Computing Machinery and Intelligence, computer scientist Alan Turing opens with the words: "I propose to consider the question, 'Can machines think?'" Sixty years on, the idea of intelligent computers seems a little less ridiculous. Technology leaps forward year by year with ever greater processing powers, bigger clouds, faster internet connections, sleeker interfaces and cleverer self-learning algorithms. It poses the question: are computers getting closer to human intelligence by being able to make value judgements, understand concepts and process a world that is not just black and white in real time? The answer is yes. Not only is this a significant step forward, but it also has the potential to fundamentally change the world we know... If technology can give data meaning, whether it is voice, video, text or images, for instance, think of the prospect of using this insight to create a more predictable, and consequently less volatile, world... The reason why understanding meaning in information has generated a lot of excitement is because of its enormous potential. Today, thanks to technology's mass appeal and accessibility, on a daily basis we collectively produce 2.5 quintillion bytes of data, and the growth rate is so high that 90% of all information ever created was produced in the last two years alone. The value of this information to organisations who want to keep ahead of the curve is huge. In the business world, information has traditionally been housed in databases. This has been an effective way to keep information safe and stored in a uniform fashion. However, the database was created as a way to work around the limitations computers had half a century ago.
As the majority of information flowing through organisations today is unstructured or 'human information' (90%), such as text, email, video, and audio, and therefore impossible to box into traditional databases, being able to manage that information so you know where it is and process it so you know what it means is essential. For instance, you could store customer-service call centre recordings into a database if you really had to, but you would have no idea whether the customer was happy or not. This is exactly the type of insight businesses want to know after all. To understand them, organisations have to take a new approach to managing and uncovering the meaning of their data. The next-age information platforms make sense of, and understand the meaning of, information and use it to solve problems, in order to give businesses a better understanding of themselves, their customers and the competition. This is what has the analyst community so excited, due to the sheer size of the market opportunity. Customer browsing and purchasing habits create sizeable data trails and, coupled with social-media integration, can allow content to be effectively targeted. However, it has to be noted that businesses can only be effective if decisions can be made in real time. Business data is of little use if companies are not quick enough to act on it, which is often the case. Organisations that truly want to understand their customers and keep up with changing demands need to change their organisation culturally in order to do this successfully. The challenge is that having tens or hundreds of thousands of customers, each with potentially thousands of data points connected with them, results in a very large amount of data being collected. Part of this might be structured data that can be easily categorised, for example, gender, age, and geography, but the majority will be unstructured, human information. 
The content and sentiment of a product review, for instance, is not something that can be slotted neatly into a standard database, but the ability to tap into information like this has huge implications for driving sales or safeguarding a reputation. The key is to understand that data - leaving it exactly where it is - in order to create an infinitely scalable platform, and a powerful basis for analysis and action throughout the entire enterprise. By applying meaning, we can cut through the Gordian knot of trying to find that ‘perfect’ database, and get to the heart of the issue - being able to process 100% of the information, structured and unstructured, to unlock real business value. By applying the ability to understand meaning, businesses can get a view of all of their data, not only the 10% of ‘neat’, structured data, but the whole 100%. What we can do now has never been possible before: the next IT revolution is happening in the ‘I’ - the information - not the ‘T’.” |
Some Observations on the Practice of Open Data as Opposed to Its Promise Posted: 16 Apr 2012 01:40 PM PDT Some Observations on the Practice of Open Data As Opposed to Its Promise The Journal of Community Informatics 8 (2), (04 Feb 2012) [Use the link above to access the full text of the article published in the Journal of Community Informatics: a global e-journal. Community Informatics is an open access journal published using Open Journal Systems] The abstract reads as follows: “This note is a contribution to the continuing debates and analyses about what can and should be done to make public data open. In this note, I share some observations about current practices surrounding public data. In general, these observations lead to the insight that absolutely open public data is and will continue to be rare. Instead, various types of data are apt to be more or less open, and the reasons for the degree of openness may vary from one situation to another, that is by type of data, by country, by type of institution, etc.” |
How open is corporate data in Open Government Partnership countries? Posted: 16 Apr 2012 01:36 PM PDT Open Knowledge Foundation Blog, (16 Apr 2012) “Today, the day before the Open Government Partnership meeting starts in Brasilia, OpenCorporates is publishing a major new report into access to company data in OGP countries, and the picture is not good. Out of a possible 100 points, the average score was just 21, with several major countries (including Spain, Greece and Brazil) scoring zero. A score of 100 means that the company register is an open data register, making detailed information free for reuse under an open licence and making the information available as open data. A score of zero means the central register cannot even be searched without payment or registration. The highest score is the Czech Republic's, at 50, though the UK will achieve a score of 70 when it starts publishing a limited set of data under an open licence in July. Virtually all OGP countries score very badly for openness of company data, with several – including countries such as Spain, Greece and Brazil – effectively closed for the public, civil society and the wider world, undermining corporate governance, and providing a fertile ground for corruption, money laundering, organised crime, and tax evasion... a summary of the data should be available below...” [Use the link above to access the charts discussed in the above bookmark.] |
Trust Acts to Open Research Findings to the Public - NYTimes.com Posted: 16 Apr 2012 01:35 PM PDT www.nytimes.com “The trend toward open access scientific publishing gained strength last week when the Wellcome Trust, the second-largest nongovernmental funder of scientific research in the world, said it was considering sanctions against scientists who do not make the results of their research freely available to the public. The London-based trust, which funds £650 million, or about $1 billion, of medical and scientific research every year, already has a policy requiring researchers to “maximize the opportunities to make their results available for free” and a presumption in favor of publication in open access journals available on the Internet. But Sir Mark Walport, the trust’s director, told the Guardian newspaper that only 55 percent of Wellcome-funded researchers comply. Scientists often prefer to publish in journals that refuse to make the work available without paying a fee. One option reportedly under consideration is to withhold the last installment of a grant until the research is publicly available; another option would be to make grant renewal contingent on open access publication. The open access movement arose in response to the high subscription fees for scientific journals, which in some cases can amount to thousands of dollars a year. Initiated by scientists, the movement has grown rapidly in recent years, partly because of support from university librarians who saw their acquisitions budget swallowed up by rising subscription costs. The success of journals such as PLoS One, an on-line journal that began by publishing 138 articles in 2006 and is now the largest scientific publication in the world, has also encouraged imitation. Last year the Wellcome Trust announced that it would begin a new open-access journal, eLife, aimed at competing with prestigious subscription-based publications such as Science and Nature. — D. D. GUTTENPLAN” |
How to shed the predatory label? Open peer review! Posted: 15 Apr 2012 06:08 PM PDT www.openbiomed.info, (13 Apr 2012) “There are more than 50 questionable open access publishers on Jeffrey Beall’s List of Predatory Open-Access Publishers. Some questionable journals publish independently of any publisher. How has this disease spread? Here are my thoughts and evidence: [1] Plug-and-play content management such as Open Journal Systems (OJS) provides a no-cost easy way to set up a professional journal publishing platform. For better or worse, this respected attempt by the Public Knowledge Project aimed at ‘improving the scholarly and public quality of research’ has simultaneously offered a fast-track opportunity for publisher copycats to launch an ‘open access’ scholarly publishing operation. To be fair to OJS, there are more than a dozen publishing software packages listed in the Open Access Directory launched by Simmons College. [2] Nearly every journal concerned with its reputation in fields such as medicine makes considerable effort to adopt a credible peer review mechanism, but the traditional method of author-blinded reviewing does not provide a way for the submitting author to see the identity of the reviewer who detected deficiencies or judged their research deficient. A journal admitting the lack of sufficiently knowledgeable reviewers may ask an author to provide names and contact information of potential reviewers in her/his specialty, creating a different kind of competing interest. [3] When one of the questionable publishers recently sent me an email solicitation to publish in one of their journals, I also noticed that they charged an article-processing charge (APC) of $400-$600. That amount is very low, considering that open access journals with strong peer review reputations have a business model that requires 200%-500% of that amount for an APC...
[4] A code of conduct published by the Open Access Scholarly Publishing Association (OASPA) is voluntary, and the OASPA itself has not taken fellow publishers to task about the predatory evidence that seems to exist for more than 50 questionable open access publishers. [5] The stagnant global economy has created the highest unemployment rates in modern history, and would-be entrepreneurs with computer skills and time on their hands could find opportunity by simulating a closed system of peer review and rapid publishing turn-around, satisfying the need of researchers to find a publisher when competition for a place in important journals is fierce... I think there is something that can be done to create trust that goes even beyond aligning with the OASPA code of conduct: adopt careful and scientifically-based peer review that is open and available for authors, readers, and the institutions that are providing APC subsidies as part of the Compact for Open-Access Publishing Equity (COPE). I encountered open peer review more than 9 years ago, as the editor of a new open access journal on the BioMed Central (BMC) platform. BMC announced that journals could consider publishing the peer reviews as part of an article’s history, signed by the review authors. The editorial staff I initially led declined to make open peer review mandatory, but at least one BMC journal has embraced it. Here is their own description: ‘Biology Direct offers a novel system of peer review, allowing authors to select suitable reviewers from the journal’s Editorial Board; making the peer-review process open rather than anonymous; and making the reviewers’ reports public, thus increasing the responsibility of the referees and eliminating sources of abuse in the refereeing process...’ It may occur to you that a well-written review becomes a scholarly publication, much in the way a book review does in other disciplines. One potential downside is the same concern raised by my own editorial staff.
In a relatively small community of like-minded professionals, would an honest review that appeared harsh not be appreciated in the spirit of improvement and/or considered a threat to friendship? I guess we have to ask ourselves why more open access journals did not adopt open peer review, once someone as respected as NCBI Director David Lipman became the editor of Biology Direct...” [Use the link above to access the full text, providing examples of reviews written during the open peer review process.] |
Radio National PM April 11 2012 Open Access by mattoddchem on SoundCloud Posted: 15 Apr 2012 05:36 PM PDT soundcloud.com Use the link above to access the brief news report from the Australian Broadcasting Corporation's Radio National. The report focuses on the Cost of Knowledge boycott and the recent announcement from the Wellcome Trust regarding their open access journal eLife. A brief description of OA and its benefits is also offered. The report was uploaded to SoundCloud, a social media site for audio sharing. |
Carl Malamud's Box of Goodies - On The Media Posted: 15 Apr 2012 05:25 PM PDT www.onthemedia.org Use the link above to access Alex Goldman’s interview of Carl Malamud on the program “On the Media.” An overview of the program reads as follows: “This week on the show, we're talking with Carl Malamud, open government advocate and director of public.resource.org. Malamud's interested in what are called ‘incorporation by reference.’ The phrase describes laws that have references to safety standards embedded in them. For instance, a law that says your car has to have turning signals will also include standards about how bright or how large the turn signals have to be. Malamud's gripe is that frequently, those standards are written by private companies, and those companies hold the copyright to the standards. That means that even though the standards are part of a law everyone has to follow, citizens have to pay -- sometimes several hundreds of dollars -- for the privilege of looking at them. So Malamud's come up with a simple, albeit expensive solution. He spent over $7,000 to purchase 73 public safety standards that have been incorporated by reference into law. Then he printed them out, and recently, he's been shipping the documents (which weigh thirty pounds, all-in) to various governmental agencies and standards development organizations. He plans to publish the documents online for the public soon, but he's giving the various agencies and organizations until May 1st to comment on his plan. If it doesn't become a copyright lawsuit, Malamud plans to start publishing these standards to the web regularly. You can listen to the segment from the show on the embedded player below...” |
OSDD Malaria Meeting, University of Sydney February 2012 - YouTube Posted: 15 Apr 2012 05:24 PM PDT www.youtube.com Use the link above to access a series of YouTube videos covering the Open Source Drug Discovery (OSDD) Malaria Meeting in Sydney, Australia from February 2012. “OSDD is a CSIR [Council of Scientific and Industrial Research]-led Team India Consortium with global partnership with a vision to provide affordable healthcare to the developing world by providing a global platform where the best minds can collaborate & collectively endeavor to solve the complex problems associated with discovering novel therapies for neglected tropical diseases like Malaria, Tuberculosis, Leishmaniasis, etc.” “OSDD is a community of students, scientists, researchers, academicians, institutions, corporations and anyone who is committed to discovery of drugs in an open source mode. Open Source Drug Discovery emphasizes integrative science through collaboration, open-sharing, taking up multifaceted approaches and accruing benefits from advances on different fronts of new drug discovery. Because the open source model is based on community participation, it has the potential to self-sustain continuous development by generating a storehouse of alternatives towards continued pursuit for new drug discovery. Numerous academic and research institutions along with industries are partnering with CSIR in the Open Source Drug Discovery Project. The OSDD approach is to conduct early stage research in an open environment in a highly collaborative fashion involving best minds from across the world. In the development stage of the drug it collaborates with partners like contract research organizations in the pharmaceutical sector or public sector institutions with development capability. The conduct of research and development in the countries having disease burden, yet having competencies, will bring in skills at highly affordable scale.
To deliver drugs to the market OSDD relies on the generic industry business model. The generic drug industry has played a key role in making drugs affordable and accessible in the developing world. The drugs that come out of OSDD will be made available like a generic drug without any IP encumbrances so that the generic drug industry can manufacture and sell it through their channels anywhere in the world. The generic drug industry based market model creates the environment of affordability. OSDD brings in the concept of open source, crowd source, open science, open innovation and product development partnership concepts on the same platform and leaves delivery of drugs to market forces.” |
Elsevier Boycott Nears 10,000 Signatures Posted: 15 Apr 2012 05:18 PM PDT Singularity Hub, (14 Apr 2012) “The call to action by Cambridge professor Timothy Gowers to boycott Elsevier last January has resounded with academics, as the number of signatures recently crossed the 9,600 threshold. Researchers across the world have come out in droves at times to sign the boycott at The Cost Of Knowledge website. The boycott targets Elsevier specifically for charging “exorbitantly high prices for subscriptions,” which forces libraries to buy expensive journal bundles rather than selecting individual titles... and for supporting measures such as SOPA, PIPA, and the Research Works Act, all of which attempt to control the free exchange of information for corporate profit (Elsevier publicly rescinded its support for the Research Works Act). The boycott has been covered by Nature, the Chronicle of Higher Education, and most recently, The Guardian. But, as Harvard professor Stuart Shieber pointed out on his blog about the daily signups, asking “Have scientists lost interest again?”, the signatures started to taper off in early March, according to his numbers, which are shown in the following chart... While the momentum appeared to be flatlining, a resurgence in the number of signatures occurred in mid-March around the time that Singularity Hub brought the Elsevier boycott to the attention of its readers. After a few weeks, however, the daily signups withered to fewer than 10 per day. But the recent media coverage about the Wellcome Trust, the second largest nongovernmental funding agency of medical research in the world, reaffirming its support of the Open Access movement seemed to spur on the boycott as 273 researchers signed up in a single day. Though the 10,000 signature milestone is on the horizon, the risk of losing momentum is real, just by looking at the chart. Another flatlining could make hitting 10,000 signatures weeks away.
Yet, the milestone is vital to the Open Access movement as it is likely to trigger another firestorm of media coverage that will help keep the public informed about the importance of open access to scientific research... Open access for academic research is reminiscent of the famous Greek myth about Sisyphus, a wise and prudent ruler, who was condemned by the gods to repeatedly roll a rock up a mountain where at the top, it would roll back down again. Albert Camus in his essay Myth Of Sisyphus, within which he explores the absurdity of human existence, said, ‘[The gods] had thought with some reason that there is no more dreadful punishment than futile and hopeless labor.’ The Sisyphean damnation is a fitting metaphor for the Open Access movement faced with the absurdity of publicly-funded scientific research locked behind paywalls as it struggles to free academic research from an antiquated journal system. With the 10,000 signature milestone in sight, it will take only a few hundred more signatures to help researchers take a big step toward breaking free from its own Sisyphean tragedy.” |
The Digital Public Domain: Foundations for an Open Culture | International Communia Association Posted: 15 Apr 2012 05:17 PM PDT www.communia-association.org [Use the link above to access the announcement from the Communia Association regarding the publication of a new book; the full-text PDF is available. “The mission of the Communia Association is to foster, strengthen, and enrich the Public Domain. To fulfill its mission, the Communia Association and its Members will raise awareness in, educate about, advocate for, offer expertise on and research about the Public Domain. The Association aims at maintaining and reinforcing a network of European and international organisations that constitute reference for policy discussion and strategic action on all issues related to the public domain in the digital environment and related topics.”] “The book ‘The Digital Public Domain: Foundations for an Open Culture’, edited by Melanie Dulong de Rosnay and Juan Carlos De Martin as an output of the Communia Thematic Network, which took place between 2007 and 2011 and is at the origin of the Communia Association, is out in all formats (hardback, paperback, and digital editions) and can be purchased on the website of Open Book Publishers. The book is under a CC Attribution license and the PDF can be downloaded here: The Digital Public Domain: Foundations for an Open Culture. Citation reference: Melanie Dulong de Rosnay, Juan Carlos De Martin, (eds.), The Digital Public Domain: Foundations for an Open Culture, Open Book Publishers, Cambridge, UK, 2012, 220 p. This book brings together essays by academics, librarians, entrepreneurs, activists and policy makers, who were all part of the EU-funded Communia project. Together the authors argue that the Public Domain — that is, the informational works owned by all of us, be that literature, music, the output of scientific research, educational material or public sector information — is fundamental to a healthy society.
The essays range from more theoretical papers on the history of copyright and the Public Domain, to practical examples and case studies of recent projects that have engaged with the principles of Open Access and Creative Commons licensing. The book is essential reading for anyone interested in the current debate about copyright and the Internet. It opens up discussion and offers practical solutions to the difficult question of the regulation of culture in the digital age.” |
Study: Sharing Patents, Rather Than Blocking Others, Encourages Innovation And Market Success | Techdirt Posted: 15 Apr 2012 05:16 PM PDT www.techdirt.com “There's been plenty of research over the years (much of which we've pointed to here) showing that the sharing of information and knowledge -- including information and knowledge that leads to innovation breakthroughs -- can actually help companies thrive. Studies on the early success of Silicon Valley by AnnaLee Saxenian focus heavily on how information sharing among companies -- even those in competition with each other -- helped make Silicon Valley so successful. That's because the breakthroughs opened up new markets and expanded them in ways that allowed multiple players to thrive. To put it another way: if, by sharing information, companies were able to reach major market-changing breakthroughs faster, there would be more than enough benefit to go around as the new markets expanded. Thus, the ‘cost’ of having competitors with the same knowledge was dwarfed by the ‘benefit’ of having the innovation and the resulting market expansion. Gene Cavanaugh points us to a new study that appears to reiterate this basic point, but focusing directly on situations with patents. The research, by economist Gilad Sorek, found that the free-licensing of patents to competitors actually increases the likelihood that a company's profits will grow as the result of a particular innovation. In other words, contrary to what many believe (that the best thing to do with a patent is to restrict others from using it), this research suggests that openly sharing that information for free actually tends to help the patent holder in the long run by opening up new opportunities that increase their profit. ‘The study, to be published in a forthcoming issue of Economics Letters, shows that the benefits of giving up patent protection outweigh the risks of surrendering a share of the market. By inviting further research, Sorek says, the original innovator is able to stimulate demand for its product.
The company may lose a share of the market, but its product ultimately becomes more valuable as a result of the extended innovation effort.’ The research points out that such open and free licensing acts as a way to get free research and development from other companies that help expand the original innovator's market. This paper certainly seems to match what we've seen in other research in the past and, yet again, raises significant questions about the way many companies today manage their patent portfolios, as well as how they view the process of innovation itself.” |
The “Academic Spring” — Shallow Rhetoric Aimed at the Wrong Target Posted: 15 Apr 2012 05:14 PM PDT The Scholarly Kitchen, (12 Apr 2012) “Recently, the term ‘academic spring’ has appeared in relation to statements from the Wellcome Trust and the Elsevier boycott started by mathematician Timothy Gowers, mainly in coverage in the Guardian, which seems to have become a cheerleader in the open access (OA) movement. The term ‘academic spring’ was apparently introduced by a librarian named Barbara Fister. The term borrows from the Arab Spring, a set of uprisings that started in 2010 during which thousands if not millions rose up against tyrannical regimes, risking and sometimes losing their lives in an attempt to create more representative governments... The ‘academic spring’ as envisioned by coverage in the Guardian and in Fister’s blog post is altogether different from the Arab Spring and, I would argue, is a term inappropriately and cynically borrowed. How can you cheapen the striving of an entire culture for representative government by equating that with complaints about paywalls? But there is a deeper problem with equating a boycott of publishers or a call for open access with a revolution against great and domineering powers — that is, by targeting publishers, academics are targeting the wrong power-players in the academic marketplace. It’s becoming increasingly clear that there is a more plausible candidate for the role of villain in this overcharged environment, a powerful group exploiting academics, taking taxpayer funded research for itself, and shifting risk from itself to those it ostensibly values. But it’s not publishers. It’s the universities themselves.
Yesterday, I published a review of ‘How Economics Shapes Science.’ It becomes clear from this carefully written book that one of the main economic transformations in science careers is publication — academics publish in order to turn information that has no inherent economic value for them (it is non-rival and non-excludable) into things that are rival and excludable, and therefore have economic value (namely, priority and prestige). Academics accomplish this transformation through the act of citable publication. Journal publishers create and maintain the venues that establish and record priority and lend prestige, including the third-party validation, the underlying coordination of peer-review, the maintenance and integration of platforms, the dissemination and longevity of materials, and so forth. This is the case no matter the underlying business model of the publisher. The ‘academic spring’ is meant to force publishers to stop erecting paywalls. As an alternative to readers paying, some funders and activists want journals to be funded largely by research dollars derived from the researchers’ own grants (and by extension, granting entities). These pre-paid articles are then freely available for anyone to read. This makes little rational sense given the economics of science, which may explain why the Gowers petition has only received 9,000 signatures (as compared with the 30,000+ the initial PLoS petition garnered years ago), and why a recent similar petition in the US failed to even gain 1,000 signatures (but don’t tell the Guardian that — it doesn’t fit with their narrative): [1] Scientists sense there is little to be gained. Given the transformation accomplished by publishing (from economically valueless for the authors to economically valuable for the authors), the only economic value an author derives is from prestige and priority. (There is no economic value to scientists retaining copyright, either, kids.)
Therefore, the best venue for publication (the best brand) and the first recorded finding are what the rational player in this market would seek. There is no clear reason to have unqualified readers who add nothing to either prestige or priority value. The citation advantage myth demonstrates this — OA publication adds to readers but not qualified readers, those who care about prestige and priority. Therefore, the value of OA is not increasing significantly, with stagnant prices demonstrating this. [2] Funding overlaps are obvious and problematic. Gold OA article publication fees are typically paid out of research funds. When OA publication was less common, this aggregate amount may have been trivial. Now, it is deep into millions of dollars, if not into the hundreds of millions, money flowing directly from research grants and awards to OA publishers. Subscription dollars, on the other hand, flowed from other budgets — department overhead, library acquisitions, and personal. By insisting on a system of research-dollar funded OA, academics are depleting their available grant funding to pay publishers. This is not completely rational. Instead, it is a looming Tragedy of the Commons around research funding. [3] The Pogo problem. Pogo’s most famous quote is, ‘We have met the enemy and he is us.’ Scholarly publishers are often academic centers or societies themselves. Depleting funds from these publishers in order to accomplish a dubious outcome (more readers who matter at best peripherally to prestige and priority) turns out to create self-inflicted wounds. Publishing enterprises that subsidize societies have a harder time as competitive initiatives like PubMed Central and OA repositories take traffic away, driving down ad impressions and usage stats librarians use to value subscriptions, leading to decreased revenues from publications. As these subsidies diminish, membership fees increase, grants shrink, or other fees are created for each community. 
Or, the academic or society publisher embraces OA to compensate for lower subscription revenue prospects, and thereby moves us closer to the research funding Tragedy of the Commons. Meanwhile, without any publisher’s hand in this, library budgets have been shrinking as a percentage of university revenues for decades, despite the fact that universities have been actively recruiting more and more scientists and researchers, benefiting mightily from their work, and over-producing PhDs. More and more academics are being forced to accept soft-money jobs and adjunct positions as universities pad their endowments and shield themselves from the risks of full-time faculty and dedicated research staffs. And the intellectual property of federally funded research has become the legal property of these same universities in the United States at least, many of which make millions of dollars exploiting it while some scientists hit it big. And academics are protesting publishers? Publishers — OA or subscription — are essentially partners with scientists and researchers, creating a clear, well-attenuated, and highly valuable route to transforming ideas into careers. Of course, there are more journals, and many more important journals, than there were 30 years ago. But ask yourself these questions: [1] Who shrank the library budgets while simultaneously courting researchers and research dollars? [2] Who’s training too many PhDs for the economy to absorb, while generally (through commission or omission) misleading PhD candidates about how viable a PhD will be? [3] Who has moved away from tenure-track positions and intramural funding? [4] Who has become too reliant on blunt and unreliable metrics like impact factor and h-index to rank faculty? [5] Who holds the IP for federally funded research in the United States, and exploits that without returning it to the taxpayer? 
Many of these university-driven practices make publication in journals all the more valuable — getting funding has become more tooth-and-nail at the researcher level, and a brutal h-index year can spell the end for some. This is why pay-for-publication has flourished recently. Competition for grants, status, priority, and prestige is all the fiercer because there’s such a reliance on soft money now. Instead of ‘publish or perish,’ it’s ‘fund or fail.’ As it has done with the Monbiot rant and other open access retreads, the Guardian is giving full throat to these rhetorical bombshells. For some reason, the editors of this particular paper — which I otherwise admire — have lost their objectivity when it comes to topics like open access and the Gowers petition. That’s too bad, but perhaps they’ll come to their senses before too long. For there to be a true ‘academic spring,’ academics would need to demand that administrators return to spending the same percentage on library acquisitions they spent, well, let’s say 10 years ago; require that they be paid to do research and not be forced to constantly scrounge for grants; hold universities accountable for telling students what their chances of good careers in the sciences actually are, rather than just padding their enrollment numbers; and so forth. Confronting administrators with ideas like these would take real courage, not the empty courage of hitting the kid closest to you out of frustration while the big kid walks away with your lunch money. Complaining about a partner that helps academics accurately and reliably transform economically non-viable information into highly valuable academic credit? That seems like academic silly season, not academic spring.” |
Gekkan DRF: Digital Repository Federation Monthly Posted: 15 Apr 2012 05:12 PM PDT drf.lib.hokudai.ac.jp Use the link above to access Special Issue, No. 26 from March 2012, of the monthly newsletter from the Digital Repository Federation of Japan. Topics covered in the newsletter include: “What is an Open Access Journal,” a report on “The 5th SPARC Japan Seminar,” and “Burgeoning of OA Megajournals.” |
Posted: 15 Apr 2012 05:09 PM PDT chronicle.com “How useful might a proper-name index to all novels and movies be? If, for example, you wanted to discover all instances of Peeps in fiction, or every time someone drove a Mustang, or took a picture of Big Ben? That is the humble goal of Small Demons, a new service (still in beta) that aims to allow readers to discover connections between works, as well as to more fully understand the way novels and movies they love represent the world. Valla Vakili, the co-founder and CEO of Small Demons, has a great post describing the stichomancy of fictional detail–of the ring of truth that emerges as one pursues details both within and across universes. Small Demons calls those universes the Storyverse, and will make it possible for readers to search for people, places, and things across an impressive range of works. You can also browse works, and see what connections they contain. (Note that the connections have to be direct, and by name–veiled allusions, or quotations/paraphrases that don’t directly invoke the source, won’t be mentioned.) What’s nice is that Small Demons will show you a snippet of text containing an instance of the reference. Naturally, Small Demons is also prepared to sell you a copy of books that you’ve discovered using their interface. Small Demons will depend on the contributions of readers, and is scaling up to a system of editing and curating, complete with badges for demonstrated expertise. You can tell Small Demons will be good, because it has a trailer by Adam (Lonely Sandwich) Lisagor... You can also see a presentation by Vakili to the O’Reilly Media Tools of Change in Publishing Conference from February. Registration (free) is required–you can either register with an e-mail address or with Facebook or Twitter–until it is out of beta. Right now, Small Demons is strongly weighted toward contemporary works, which I guess means there will be plenty of opportunity for us Victorianists to earn badges! 
You might start with William Gibson’s Pattern Recognition or Nick Bilton’s I Live in the Future & Here’s How It Works.” |
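The kind of proper-name index Small Demons describes is, at bottom, an inverted index from names to the works and snippets that mention them. Here is a minimal sketch, assuming a tiny in-memory corpus and literal string matching; the watch-list of names, work titles, and texts are all invented for illustration, and, as the article notes, anything short of a direct mention is missed:

```python
from collections import defaultdict

def build_index(corpus, names):
    """Map each tracked name to the works (and context snippets) that mention it.

    corpus: {work_title: full_text}. Matching is literal, so veiled
    allusions or paraphrases are not captured.
    """
    index = defaultdict(list)
    for work, text in corpus.items():
        for name in names:
            pos = text.find(name)
            if pos != -1:
                # keep a short snippet of context around the first mention
                snippet = text[max(0, pos - 20):pos + len(name) + 20].strip()
                index[name].append((work, snippet))
    return dict(index)

# Invented two-work corpus and watch-list
books = {
    "Novel A": "She parked the Mustang beneath Big Ben at dusk.",
    "Novel B": "A box of Peeps sat untouched on the dashboard.",
}
idx = build_index(books, {"Big Ben", "Mustang", "Peeps"})
```

A real system would add entity resolution (the car ‘Mustang’ versus the horse) and contributor curation, which is where the reader editing and badges the article mentions would come in.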
21st century smarter government is 'data-centric' and 'digital first,' says US CIO - O'Reilly Radar Posted: 13 Apr 2012 11:28 AM PDT radar.oreilly.com [Use the link above to access the transcription of the interview described below.] “Any nation's top government IT executive has a tough gig in the 21st century. The United States chief information officer, for instance, has an immense budget to manage — an estimated $80 billion dollars in annual federal IT spending. US CIO Steven VanRoekel (@StevenVDC), who started work in the White House just over eight months ago, must address regulatory compliance on privacy and security, and find wasteful spending. As the nation's federal CIO, he has inherited a staggering challenge: evolve the nation's aging IT systems toward a 21st century model of operations. In the age of big data, he and everyone who works with him must manage a lot of petabytes, and do much more with less. He must find ways to innovate to meet the needs of the federal government and the increased expectations of citizens who transact with cutting-edge IT systems in their personal and professional lives. When he was named to the position, he told the New York Times: ‘We're trying to make sure that the pace of innovation in the private sector can be applied to the model that is government.’ From adjusting to the needs of an increasingly mobile federal workforce to moving to the cloud to developing a strategy for big data, it's safe to say that VanRoekel has a lot on his plate. When he was named to the post, it was also safe to say that there were reasons to be hopeful about his prospects. Under VanRoekel, FCC.gov got a long overdue overhaul to reboot as an open government platform. In the process, he and his team tapped into open source, the cloud, and collective intelligence. He brought a dot-com mentality to the FCC, including a perspective that ‘everything should be an API’ that caught the eye of some tech observers. 
He worked with an innovative new media team that established a voice for @FCC on social media, where there had been none before, and an FCC.gov/live livestream that automatically detected the device a viewer used to access it. VanRoekel is the man who told me in April that "the experiences that live outside of FCC.gov should interact back into it. In a perfect world, no one should have to visit the FCC website." Instead, he said, you'd go to your favorite search engine or favorite app, and open data from the FCC's platform would be baked into it. ‘If we think of citizens as shareholders, we can do a lot better,’ he said. ‘Under the Administrative Procedure Act, agencies will get public comments that enlighten decisions. When citizens care, they should be able to give government feedback, and government should be able to take action. We want to enable better feedback loops to enable that to happen.’ After VanRoekel spoke at the FOSE conference this month, I walked with him to the Old Executive Office Building, next to the White House, to dig a bit deeper into some of the broad strokes he outlined in his speech that morning. (The images below come from pictures taken during his presentation.) The Office of Management and Budget is widely expected to release a strategy on mobile and data-centric government in the near future. Our interview, which follows, touches upon many of the issues outlined above and provides some insight into the strategic thinking of one of the key players in the Washington tech policy world...” |
Norton Scientific Journal: Researchers Call for Open-access Journals Posted: 13 Apr 2012 11:27 AM PDT openPR.com - Newsfeed, (13 Apr 2012) [Use the link above to access the press release posted by Norton Scientific Journal on OpenPR, a web service enabling self-posted press releases. Norton Scientific Journal is a blog with the mission “to effectively advance and diffuse knowledge to every part of the world.” ] “(openPR) - Britain’s Wellcome Trust, one of the largest research charities worldwide, has expressed its support for scientists who want to make their work accessible to all. Officials at the organization hinted at a plan to introduce a free online journal that could rival established academic publications. Researchers are now demanding that their work be opened to the public, believing that in this way progress in scientific research will speed up. Moreover, publicly funded research should not be exclusive to private publishing houses, as the findings must be available to all. The Trust appears to be advocating that charity- and publicly funded scientific research be accessible to anyone who wants to read it. It is evident that the Wellcome Trust does not want to pay for medical studies that end up only in private hands, so it is now considering ways to bring research papers under an open-access framework. Most of the world’s scientific research, estimated at around 1.5 million new articles every year, is released only through journals owned by a few big publishing companies like Wiley, Springer and Elsevier. Influential journals such as the New England Journal of Medicine, Nature and Science are accessible only via paid subscription. And because of frustration with the expense of academic journals, researchers staged a boycott of the biggest publisher worldwide, Elsevier. Over 9,200 said they will no longer submit manuscripts to, nor act as peer reviewers for, its journals. 
With this intervention from the second-largest non-government funder of medical research, the movement gained a strong ally in its demand to open online journals. ‘The broad principle is obviously correct, publicly-funded research should be in the public domain as soon as possible,’ said the Labour chair of the House of Commons science and technology committee. And in fact, if you look at what really makes information dissemination effective, you will find that open content obviously spreads faster, has more influence and reaches a wider audience — it can even be used in ways that the authors do not expect. The Wellcome Trust provides financial assistance in the form of grants so that researchers can pay publishers to make their work available for free. Those who do not open their work for public access in accordance with the Trust’s terms can be sanctioned in future grant applications. The government also appeared to be giving its assent to calls for open-access journals. During the launch of its innovation strategy in December, the minister for universities and science said that he would like to see all state-funded research released in the public domain. The director of the Wellcome Trust, Sir Mark Walport, announced that they are in the final stages of introducing a high-caliber scientific journal called eLife...” |
World Bank decides to make its research open access Posted: 13 Apr 2012 11:25 AM PDT arstechnica.com “The World Bank is the frequent target of criticism, protests, and even riots. Critics feel the organization exerts a disproportionate influence on the economies and policies of less powerful nations. As a large, transnational bureaucracy it can be impenetrable at the best of times, more so if those manning the fiduciary barricades are resistant to change. Perhaps it is the persistence and extravagance of its detractors, but for whatever reason, the World Bank is taking steps toward greater transparency. It announced yesterday that it would be instituting a new "Open Access policy for its research outputs and knowledge products" beginning July 1. The policy, it said, ‘for authors, enables the widest possible dissemination of their findings and, for readers, increases their ability to discover pertinent information.’ The policy's full title is ‘World Bank Open Access Policy for Formal Publications,’ and the Bank says it will apply to ‘manuscripts and all accompanying data sets... that result from research, analysis, economic and sector work, or development practice... that have undergone peer review or have been otherwise vetted and approved for release to the public; and... for which internal approval for release is given on or after July 1, 2012’, as well as the final reports prepared by outside parties for the Bank. As this policy's first phase, the World Bank has launched the Open Knowledge Repository, where all the Bank's public products will reside for public reading and download. The World Bank products covered by the new policy are by default to be published under a liberal Creative Commons license (CC BY). Third-party material will be licensed under a somewhat more restrictive policy (CC BY-NC-ND 3.0). 
According to Creative Commons' Diane Peters, the Repository ‘reinforces scholarship norms’ and ‘has been built with an eye toward maximizing interoperability, discoverability, and reusability by complying with Dublin Core metadata standards and the Open Archives Initiative protocol for metadata harvesting.’ Over 2,100 books and papers from 2009-2012 are already available in the Repository... The new policy and the Repository are the latest stages in the World Bank's efforts at openness. They were preceded by April 2010's Open Data Initiative, which allowed access to "more than 7,000 development indicators, as well as a wealth of information on World Bank projects and finances," and the Access to Information Policy (July 2010)... Only two months ago, the Federal Research Public Access Act was introduced in Congress. That act would ‘significantly shorten the waiting period between publication in a subscription journal and the point where a paper is made open access, dropping it from a year to six months. It would also expand the scope of the policy, applying it to any federal agency with a budget of $100 million or more.’ Quite aside from the benefits it may bring to the institution or organization willing to chance it, open access is one of those habits of thought that seem difficult to reverse once they gain any kind of purchase on the public imagination. (Not that there isn't always someone who will try.) An exciting question which follows from this announcement is what kind of tools enterprising geeks might build on what could be an unprecedented amount of data on international economic development.” |
One Man's Quest to Make Information Free - Businessweek Posted: 13 Apr 2012 11:24 AM PDT www.businessweek.com “Carl Malamud once told a senior official at the Securities and Exchange Commission that he wanted to put the agency’s filings online. “I just don’t think people who use the Internet are going to be interested in this stuff,” Malamud remembers the official saying in 1993. Malamud bought all the filings and put them online anyway, using a computer borrowed from his friend Eric Schmidt. (Yes, that Eric Schmidt.) About a year later, he took them down. That prompted more than 17,000 day traders, investment clubs, and business school professors to beg the agency for free Web access to its records. You might know it today as the SEC’s ‘Edgar’ database. More than 15 years after that stunt, Malamud is still making the same argument: If you make government information free and easily accessible, there’s no telling who’ll start using it or what good ideas will spring up. ‘Every time I put something online there’s a huge audience,’ says Malamud, founder of Public.Resource.Org, a nonprofit that advocates for government transparency... Now Malamud’s taking on the best practices for construction, industry, and manufacturing. They’re written by hundreds of nonprofits known as ‘standards development organizations.’ About 3,000 of these standards are referenced but not fully spelled out in federal law. Let’s say you’re building a hospital for the Department of Veterans Affairs. Federal law says you need to follow the ‘Life Safety Code,’ a set of rules on how to prevent fire, smoke, and toxic fumes from injuring tenants. To get a print copy of the code you’d have to send a payment of $82.95 to the National Fire Protection Association, which wrote it. Malamud argues that the code is a law—and therefore part of the public domain—and should be distributed freely. NFPA maintains that it holds a copyright on its standard. Malamud is forcing the issue. 
Last month he bought and copied 73 different standards and boxed them with American flag packing tape. He sent the packages to the 10 organizations that wrote the standards to ‘put them on notice’ that he plans to publish the specs online beginning in May. He’s been doing something similar over the last five years with state building codes, gradually uploading them to the Web after a federal appeals court ruled they’re part of the public domain. Not one standards organization has sued him to uphold its copyright on those codes. Malamud is hoping for the same result with the federal standards. ‘We are very serious about doing this and intend to see it through to completion,’ he says. Sound arcane? Malamud’s mission will spur investment, says Stephen Schultze, the associate director of Princeton University’s Center for Information Technology Policy, who’s worked on similar projects with him in the past. ‘You’ll have a round of innovation,’ says Schultze. ‘Other people will find ways to make [the information] easily searchable and create guides for businesses—plumbers and electricians.’ Or how-to apps on building to code for homeowners. There’s an economic case for copyrights. It’s expensive to develop good standards—you need panels of engineers, industry representatives, and consumer advocates weighing in. Some organizations recoup their costs by charging to test products for manufacturers and granting their seal of approval; others rely on selling standards to make ends meet. ‘This source of revenue gives us great independence,’ says Jim Shannon, president of the NFPA. ‘We don’t want the industries to pay for standards. Then there will be industry-dominated standards.’ Malamud doesn’t argue with that. But charging for public information ‘comes up against this brick wall,’ he says, ‘which is the U.S. Constitution.’” |
Elsevier responds to my text mining request Posted: 13 Apr 2012 11:24 AM PDT Research Remix, (13 Apr 2012) “Yesterday Elsevier responded to my text mining request. David Tempest, Universal Access Team Leader, emailed a letter with proposed addendum licensing terms to me and my university librarian. It has been clear to everyone that I am blogging these interactions — there was no request to keep the letter confidential, so I include it in full below. Briefly: [1] the agreement would permit some types of text mining of subscribed Elsevier content for authorized users in my university — a win, given that standard publisher contracts explicitly forbid all text mining. [2] the agreement places full responsibility on my university itself to install and support ‘the text mining system’ [3] the agreement forbids releasing ‘all or any portion of the Subscribed Products… to anyone other than an Authorized User and other than as publishing the text mining results via scholarly communication...’ What does this mean? 1. After negotiation, Elsevier permits the results of text mining to be included in scholarly communication, but does not permit text-mining over its literature for citizen science or research tools. I explicitly asked Elsevier about these use cases (twice) and they have excluded them from their proposed agreement. I did not develop these use cases as gotcha questions — they were existing plans for my real research, and they need text mining access. 2. This took a really long time. Not long compared to what some researchers have gone through for text mining access, but long! And this is only for one publisher. I guess I’d have to go through this again and again with Wiley and Springer and Nature and AAAS and all subscription publishers if I want text mining access to all the literature already covered by subscription agreements? 3. It isn’t clear that my university can or will agree to these terms. 
My university probably doesn’t have the resources to install and maintain a text mining system. I’m just a short-term postdoc with no grant funding for this: I can’t help support this infrastructure. A researcher-driven solution I could handle myself. The problem is that lightweight solutions aren’t allowed when content must be treated as a protected resource. So. That’s where it stands. My university is reading the letter and deciding what to do. I thank Elsevier for engaging with me on this. I believe we both approached this in good faith. Although these new contract terms will hopefully be useful to me and other researchers at UBC, I’m disappointed: I was hoping Elsevier would take this opportunity to work with me to experiment with new ways to support researchers building on top of the scholarly literature. In contrast: You want to text mine Open Access content? No problem. It just works. Ross Mounce carries PLoS full text around on a USB stick. Let’s move to that kind of a publishing model now, please? The kind of publishing model where the interests of publishers and researchers and research progress are all aligned.” |
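The contrast the author draws, that open access content ‘just works’ for mining, is easy to make concrete: with full text on disk (as in the PLoS-on-a-USB-stick example), the lightweight, researcher-driven tooling she describes is a few lines of code, with no contract terms about where the system may run. A minimal sketch, assuming articles have already been downloaded locally as XML; the sample article and term list below are invented for illustration:

```python
import re
from collections import Counter

def term_counts(xml_text, terms):
    """Count occurrences of each query term in an article's text.

    Strips XML tags naively with a regex; a real pipeline would use a
    proper parser and target specific article sections.
    """
    plain = re.sub(r"<[^>]+>", " ", xml_text).lower()
    words = re.findall(r"[a-z0-9\-]+", plain)
    counts = Counter(words)
    return {t: counts[t.lower()] for t in terms}

# Invented stand-in for a locally stored full-text article
article = ("<article><body><p>CD44 expression rose; "
           "cd44 was measured twice.</p></body></article>")
print(term_counts(article, ["CD44", "insulin"]))  # → {'CD44': 2, 'insulin': 0}
```

Looping the same function over a directory of downloaded files is all that a basic corpus-wide survey requires, which is precisely the kind of use the proposed Elsevier terms would route through a university-maintained ‘text mining system’ instead.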
Memorandum of Understanding between DataCite and re3data.org Posted: 13 Apr 2012 11:22 AM PDT re3data.org, (13 Apr 2012) “We are pleased to announce a Memorandum of Understanding between DataCite and re3data.org. With this Memorandum of Understanding DataCite and re3data.org define their efforts to enhance accessibility and visibility of data sets. DataCite is an international consortium to facilitate easier access to scientific research data on the internet, increase acceptance of research data as legitimate, citable contributions to the scientific record, and to support data archiving that will allow results to be verified and re-purposed for future study. Both partners aim to promote sharing, increased access, and better visibility of research data. Core of this cooperation is the support of this common aim. DataCite and re3data.org will intensify their dialogue on the following topics: [1] Communication and information exchange on general developments at DataCite and re3data.org [2] Dialogue on the development of metadata schemas, data models and web service interfaces [3] Development of complementary user services..” |
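The ‘web service interfaces’ in agreements like this typically mean OAI-PMH, the Open Archives Initiative Protocol for Metadata Harvesting, used together with Dublin Core metadata of the kind both partners reference. Below is a minimal sketch of parsing an OAI-PMH ListRecords response into (title, identifier) pairs; the sample response is invented, and a real harvester would fetch it from a repository's base URL with `verb=ListRecords&metadataPrefix=oai_dc` and follow resumption tokens:

```python
import xml.etree.ElementTree as ET

# Namespaces defined by the OAI-PMH and Dublin Core specifications
OAI = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"

def parse_list_records(xml_text):
    """Extract (dc:title, dc:identifier) pairs from a ListRecords response."""
    root = ET.fromstring(xml_text)
    out = []
    for record in root.iter(OAI + "record"):
        md = record.find(OAI + "metadata")
        if md is None:
            continue  # deleted records carry no metadata
        title = md.find(".//" + DC + "title")
        ident = md.find(".//" + DC + "identifier")
        out.append((title.text if title is not None else None,
                    ident.text if ident is not None else None))
    return out

# Invented single-record response for illustration
SAMPLE = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
 <ListRecords>
  <record>
   <header><identifier>oai:example.org:1</identifier></header>
   <metadata>
    <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
               xmlns:dc="http://purl.org/dc/elements/1.1/">
     <dc:title>Example Data Set</dc:title>
     <dc:identifier>doi:10.1234/example</dc:identifier>
    </oai_dc:dc>
   </metadata>
  </record>
 </ListRecords>
</OAI-PMH>"""
records = parse_list_records(SAMPLE)
```

Harvesting like this, repository by repository, is how an aggregator can surface data sets registered with re3data.org alongside the DOIs DataCite mints for them.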
Academic publishing: Open sesame | The Economist Posted: 13 Apr 2012 11:21 AM PDT www.economist.com “PUBLISHING obscure academic journals is that rare thing in the media industry: a licence to print money. An annual subscription to Tetrahedron, a chemistry journal, will cost your university library $20,269; a year of the Journal of Mathematical Sciences will set you back $20,100. In 2011 Elsevier, the biggest academic-journal publisher, made a profit of £768m ($1.2 billion) on revenues of £2.1 billion. Such margins (37%, up from 36% in 2010) are possible because the journals’ content is largely provided free by researchers, and the academics who peer-review their papers are usually unpaid volunteers. The journals are then sold to the very universities that provide the free content and labour. For publicly funded research, the result is that the academics and taxpayers who were responsible for its creation have to pay to read it. This is not merely absurd and unjust; it also hampers education and research. Publishers insist that high prices are necessary to ensure quality and cover the costs of managing the peer-review process, editing and distribution. High margins, they say, are evidence of their efficiency. Clearly the cost of producing a journal is not zero. But the internet means it should be going down, not up. Over the past decade many online journals and article repositories have emerged that are run on a shoestring. Some have been set up by academics who are unhappy with the way academic publishing works. (Since January some 9,500 researchers have joined a boycott of Elsevier.) In several cases the entire editorial boards of existing journals have resigned to start new ones with lower prices and less restricted access. But the incumbent journals are hard to dislodge. Researchers want their work to appear in the most renowned journals to advance their careers. 
Those journals therefore have the pick of the best papers, remain required reading in their fields and have strong pricing power as a result. What is to be done? There is a simple way both to increase access to publicly funded research and to level the playing field for new journals. Government bodies that fund academic research should require that the results be made available free to the public. So should charities that fund research... There are some hopeful signs. The British government plans to mandate open access to state-funded research. The Wellcome Trust, a medical charity that pumps more than £600m ($950m) a year into research, already requires open access within six months of publication, but the compliance rate is only 55%. The charity says it will ‘get tough’ on scientists who publish in journals that restrict access, for example by withholding future grants, and is also launching its own open-access journal. In America, a recent attempt (backed by journal publishers) to strike down the existing requirement that research funded by the National Institutes of Health should be made available to all online has failed. That is good news, but the same requirement should now be extended to all federally funded research. Open access to research funded by taxpayers or charities need not mean Armageddon for journal publishers. Some have started to embrace open access in limited ways, such as letting academics post their papers on their own websites or putting time limits on their pay barriers. But a strongly enforced open-access mandate for state- and charity-funded research would spur them to do more. The aim of academic journals is to make the best research widely available. Many have ended up doing the opposite. It is time that changed.” |
PhD students: Don’t ‘occupy publishing’, just do your bit Posted: 13 Apr 2012 11:19 AM PDT bench twentyone, (12 Apr 2012) “Academics are getting more and more unhappy about the state of scientific publishing in the UK in what is becoming known as the ‘academic spring’. Timothy Gowers wrote in the THE recently about why he – like many academics – is boycotting the publisher everyone seems to resent most: Reed Elsevier. There’s also a campaign going on, to try to force journals to widen their access. Researchers already post links to their research papers on their academic home pages – perhaps they could just post the whole paper there without ever bothering to give it to Elsevier? Well, of course, the problem is that this process wouldn’t automatically be subject to peer review, there would be no collection of work in a single location and so on. It would also take a paradigm shift in attitudes of truly heroic proportions for anything like this to be adopted... As a PhD student hoping to eventually get a job, it would not be appropriate for me to start hurling insults at people like Elsevier. I actually have a better opinion of them than many scientists. I happen to think that academic publishers do care about the problems with the current system, and are trying to make it better... So what can a PhD student do to help? I say, to start with, we should begin to make our research findings as freely available as possible. In an effort to promote this sort of thinking, I have made a decision. As I near the end of my PhD I will submit as many as possible of the compounds I made during my research to ChemSpider SyntheticPages, a free repository of synthetic chemistry procedures. If databases like this were used consistently and widely, we might eventually move organically towards a situation where information is much more freely available. In fact, there are such initiatives springing up across most of the sciences. 
Physical scientists already have arXiv (although that’s old), for example. And in an unexpected (to me) move, the Wellcome Trust have just announced they will be putting in place sanctions for researchers they fund. So I challenge you – if you are a PhD student – to start making your research data available for free, as far as you reasonably can. My first ChemSpider SyntheticPages entry is here: DOI 10.1039/SP540...” |
Posted: 13 Apr 2012 11:18 AM PDT med.stanford.edu [Use the link above to access the bookmarked article from the news webpage of Stanford University School of Medicine found below. The full text article, published as open access in Proceedings of the National Academy of Sciences (PNAS), is also available. <http://www.pnas.org/search?fulltext=Atul+Butte&submit=yes> doi:10.1073/pnas.1114513109] “Using computational methods, Stanford University School of Medicine investigators have strongly implicated a novel gene in the triggering of type-2 diabetes. Their experiments in lab mice and in human blood and tissue samples further showed that this gene not only is associated with the disease, as predicted computationally, but is also likely to play a major causal role. In a study published online April 9 in Proceedings of the National Academy of Sciences, the researchers combed through freely accessible public databases storing huge troves of results from thousands of earlier experiments. They identified a gene never before linked to type-2 diabetes, a life-shortening disease that affects 4 percent of the world’s population. These findings have both diagnostic and therapeutic implications. The study’s senior author is Atul Butte, MD, PhD, associate professor and chief of systems medicine in pediatrics; its first author is Keiichi Kodama, MD, PhD, a staff research scientist in Butte’s group. Ordinarily, cells throughout the body, alerted to the presence of sugar in the blood by insulin, hungrily slurp it up for use as an energy source. But excessive blood-sugar levels — diabetes’ defining feature — eventually damage blood vessels, nerves and other tissues... Type-2 diabetes ... results from a phenomenon called insulin resistance: the tendency of cells in tissues throughout the body — but especially in fat, liver and muscle — to lose sensitivity and ignore the insulin’s ‘gravy train’ signal. 
Drugs now used to treat insulin resistance can’t reverse the progression to full-blown type-2 diabetes. ‘We don’t really have a good grasp of the molecular pathology that makes people get it in the first place,’ said Butte, who is also director of the Center for Pediatric Bioinformatics at Lucile Packard Children’s Hospital. In searching for risk-increasing genes over the past 10 years, scientists have used two approaches to hunt them down. One way is to look for variations in genes’ composition — deviations in their chemical sequences that correlate with a higher likelihood of contracting a particular condition... And so a second approach to understanding our genes has been devised. This latter method flags differences in genes’ activity levels, for example in diseased vs. normal tissues, for each of the 20,000 genes in the entire genome. Both types of approaches have generated staggering amounts of data — far more than can fit onto the pages of standard, peer-reviewed journals, whose editors routinely demand (as do federal-government funding agencies) that researchers park their experiments’ results in online, public repositories accessible to others. Now, investigators such as Butte are starting to reach in, drill down and pull out a treasure-trove of potentially valuable information. In this study, the Stanford scientists wanted to know which genes showed especially marked changes in activity, as indicated in earlier comparisons of diabetic vs. healthy tissue samples (notably fat, muscle, liver and beta cells, the only cells in the body that release insulin). Mining public databases, they located 130 independent gene-activity-level experiments — in rats, mice and humans — comprising 1,175 separate individual samples in all. Then, integrating that data, they searched for those genes that showed activity-level differences in the most experiments. 
They zeroed in on a single gene, called CD44, whose activity changed substantially in diabetic tissues compared with healthy tissues in 78 of the 130 experiments. The chance of this occurring “just due to dumb luck,” Butte said, was vanishingly small: less than one in 10 million-trillion. The uptick in CD44’s activity was especially pronounced in the fat tissue of people with diabetes, he said — intriguing, because obesity is known to be a strong risk factor for type-2 diabetes. The gene was interesting in itself. CD44 codes for a cell-surface receptor not found on fat cells, although those cells do have surface molecules that bind to it. Rather, this receptor sits on the surface of scavenger cells called macrophages (from the Greek words for “big eater”) that can cause inflammation. In obese individuals, macrophages migrate to and take up positions in fat tissue. (Indeed, as many as half the cells in a big potbelly can be macrophages.) Recent medical research has strongly implicated inflammation in initiating type-2 diabetes...” |
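The odds Butte quotes can be illustrated as a binomial tail calculation. This is a minimal sketch, not the paper's actual statistics: it assumes, hypothetically, that each of the 130 independent experiments would flag a given gene by chance 5% of the time (the article does not state the real null model the researchers used).

```python
from math import comb

def binom_tail(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): exact sum of the upper tail."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# 78 of 130 independent experiments flagged CD44 as differentially active.
# Hypothetical null: a 5% chance of a spurious flag per experiment.
p_chance = binom_tail(78, 130, 0.05)
print(p_chance)
```

Even under this crude model, observing 78 or more hits out of 130 is astronomically unlikely — well below the "less than one in 10 million-trillion" (one in 10^19) figure quoted.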
Point of No-Return for Open Access Posted: 13 Apr 2012 11:17 AM PDT @ccess, (11 Apr 2012) “The Open Access movement is gaining momentum by the day... The call for action by Tim Gowers may have marked a point of no return for the open access movement. It almost seemed as if scientists suddenly and collectively came to realize the absurdness of a situation that they had taken for granted for all too long... There has been a lot of media attention recently for open access. The Guardian has put open access on its front page in an article on the Wellcome Trust’s move in favor of open access. Sir Mark Walport, director of the Wellcome Trust said that his organisation ‘would soon adopt a more robust approach with the scientists it funds, to ensure that results are freely available to the public within six months of first publication’. Another major event has been the announcement by the World Bank to ‘become the first major international organization to require open access under copyright licensing from Creative Commons—a non-profit organization whose copyright licenses are designed to accommodate the expanded access to information afforded by the Internet’. Starting April 10, 2012, the World Bank has launched a repository as a ‘one-stop-shop for most of the Bank’s research outputs and knowledge products, providing free and unrestricted access to students, libraries, government officials and anyone interested in the Bank’s knowledge. Additional material, including foreign language editions and links to datasets, will be added in the coming year. This move is especially significant since the bank is not just making their work free of charge, but also free for use and reuse.’ But with the increased media attention comes the danger that we may lose sight of what is meant by the term ‘open access’. With everyone talking about ‘open access’ as if this were one clearly defined term, it has become more urgent than ever to have clarity on this issue. 
It was one of the reasons for the start of the @ccess Initiative where this blog is posted. Because open access can range from somewhat restricted (only free reading) to completely unrestricted (completely free for use and reuse) we have proposed to coin the term @ccess for free and unrestricted access to information in accordance with the BBB definition. Another reason for the @ccess Initiative, and a matter of increasing importance, is the EASE of access to information. As more and more information becomes available through open access, the difficulty of finding the right information will also increase. The use of a great number of institutional repositories can only work when all these repositories are adequately cross-linked and working together, a nearly impossible task to accomplish. A better option would be to reduce the number of repositories by limiting these to big (inter)national organisations like WHO, World Bank, FAO and others. Another option still, and one I personally favor, can run in parallel with the last option above. This option is the storage and management of information with specialized scientific communities as I have described in another blog and on the @ccess communities page of this website. To give an example, and the one that we are actually working on: together with MalariaWorld we are developing a comprehensive database of malaria-related publications. At the same time we will ask researchers to deposit their manuscripts and data in an open access repository that is linked to the database. This database will also link to open access articles. For restricted access publications we will seek to get as many manuscripts as possible deposited in the database as well. The community will eventually provide open access to all information, provide a platform for collaboration and information exchange, and serve as a communication platform for everyone seeking information on, or working on, malaria. 
Other communities can be formed using this model. In this way we would move towards a system of interlinked scientific communities and easy access to pertinent information through these communities. This model would also maximize the chances for scientific collaboration and innovation. The combination of open access and the participation of scientists and citizens in the scientific enterprise will change the way that science is done. Networked scientific communities will have far better chances to tackle the world’s toughest problems, not least because open access would give equal opportunities to people in developing countries to profit from, and contribute to, science. To quote Peter Suber: ‘What you want is for everybody to have access to everything, and not just the north making its research available to the south or vice versa. Open access should be a two-way street.’ The proposed structure for scientific @ccess communities would be perfectly suited for this task.” |
Why PMC and UKPMC Should Harvest From Institutional Repositories Posted: 13 Apr 2012 11:16 AM PDT openaccess.eprints.org “PubMed & PubMed Central are wonderful resources, but not nearly as resourceful or wonderful as they easily could be. (1) PMC & UKPMC should of course be harvesting or linking institutional repository (IR) versions of papers, not just PMC/UKPMC-deposited and publisher-hosted papers. (2) Funders should be mandating IR deposit and PMC harvesting rather than direct PMC/UKPMC deposit. By thus making funder mandates and institutional mandates convergent and collaborative instead of divergent and competitive, this will motivate and facilitate adoption of and compliance with institutional mandates: institutions are the universal providers of all research output, funded and unfunded. (3) IRs should mandate immediate deposit irrespective of publisher OA policy: if authors wish to honor publisher OA embargoes, they can set access to the deposit as Closed Access during the embargo and rely on providing almost-OA via the IR's email eprint request button. (4) Funder mandates should require deposit by the fundee -- the one bound by the mandate -- rather than by the publisher, who is not bound by the mandate, and indeed is in conflict of interest with it. (5) Publishers (partly to protect themselves from rival publishers' free-loading, partly to discourage funder mandates, and partly out of simple misunderstanding of network capability) are much more likely to endorse immediate institutional self-archiving than institution-external deposit. This is yet another reason funders should mandate institutional deposit and metadata harvesting instead of direct institution-external deposit.” |
FRPAA v America COMPETES Act Posted: 12 Apr 2012 03:03 PM PDT Scholarly Communication News@BC, (12 Apr 2012) “Recent (March 29) testimony to the Subcommittee on Investigations and Oversight of the House Committee on Science, Space, and Technology seems to have re-drawn the battle lines over open access mandates, with one side touting the reintroduction of the Federal Research Public Access Act (FRPAA), and the other claiming that the existing provision for an executive branch working group in the America COMPETES Act of 2010 works, and should be left alone. The hearing was called, ‘Federally Funded Research: Examining Public Access and Scholarly Publication Interests,’ though FRPAA was not yet being considered in Committee -- it has to attract more cosponsors before committees will plan hearings on it... Stuart Shieber, Director of the Office for Scholarly Communication at Harvard University, and Elliot Maxwell, Fellow of the Communications Program at Johns Hopkins University, testified on its behalf, saying it would provide better access to research results and economic benefits. Scott Plutchak of the Health Sciences Library at U. Alabama, Birmingham, H. Dylla of the American Institute of Physics, and Dr. Crispin Taylor, Executive Director of the American Society of Plant Biologists, argued that FRPAA was a ‘blanket approach’ that would restrict innovation, publisher profits and flexibility, and ultimately access. Plutchak, Dylla, and Taylor all spoke of the America COMPETES Act as more likely than FRPAA to foster a ‘robust’ and ‘nuanced’ system that encourages flexibility and federal agency/publisher partnerships. Taylor's testimony repeatedly urged a ‘sensible, flexible, and cautious’ approach. 
Maxwell's testimony referred to his own research on the economics of journal publishing in the NIH-mandate environment, in which he found not only that there was no evidence that the NIH policy harmed journal subscription rates, profits, or ability to carry out peer-review, but that in the environment of the NIH mandate, new journals continued to proliferate and existing journals increased profit margins. Shieber's testimony touched on other economic potentials of Open Access, such as text mining, which would of course depend on comprehensive availability of open-source articles. He also pointed out ‘systemic problems’ in the existing journal publishing structure that prevent the ‘widest possible dissemination’ of scholarly materials. He had two recommendations for solving this distribution problem: [1] More institutions should emulate open access policies like Harvard's, which mandates that faculty grant a license to the university to distribute their scholarly articles through article repositories; [2] Scholars should, when possible, publish research in open-access journals, which repositions the cost of journal publishing from libraries on behalf of readers to funding agencies and/or employing institutions on behalf of authors. 
The anti-FRPAA position was represented neatly by Taylor, who said, ‘A centralized approach discourages innovation by driving traffic away from innovators, including publishers, thus minimizing scientific and commercial opportunities.’ Taylor's position, though, rests on three assumptions: One, that publishers have a greater claim on innovation in this area than scholars, universities, university libraries, and federal agencies; two, that publishers' current contributions to academic publishing (coordinating peer review, page composition, copyediting, and listing and linking of bibliographic and reference data) would be unsustainable under the mandated deposit models championed by Shieber and Maxwell; and three, that publishers would operate in good faith and continue to work toward an open access model of publication without mandates to do so. The first two are arguable positions, though Maxwell has demonstrated that the second position seems to fall apart under close scrutiny. The third position, given the meteoric rise in subscription prices and the growth of academic publisher profits during a recession, seems hard to support." |
EU and World Bank step up pressure to make research available for free Posted: 12 Apr 2012 03:02 PM PDT www.universityworldnews.com “Three significant blows were struck this week for the international cause of achieving open access to scientific research. Neelie Kroes (pictured), vice-president of the European Commission, who is responsible for the Digital Agenda for Europe, has confirmed that the commission is drawing up a proposal to open up access to the results of research funded under its proposed €85 billion (US$111 billion) Horizon 2020 research programme. The World Bank announced that it is to make findings of research that it funds freely available under Creative Commons licensing. And the Wellcome Trust, one of the world’s largest biomedical charities, announced that it will launch its own free online publication to compete with subscription-based journals and enable scientists to make their research findings freely available. Kroes said... ‘We already have the infrastructure supporting open access. Researchers, funding bodies and the public are already using and re-using thousands of publications hosted around the world in e-Infrastructures like OpenAIRE. ‘This is important. Not just because it helps scientists and science to progress. But because we should never forget that the number one research funder in Europe is the taxpayer. And they deserve to get the largest possible reward from that investment.’ Kroes said there was no reason why subscription access only models should remain dominant for access to research publications in an era when distribution costs approach zero. Her speech came a day after the World Bank announced details of its open access policy, which will take effect from 1 July. 
Two years after opening its vast storehouse of data to the public, the bank is consolidating more than 2,000 books, articles, reports and research papers in a search-engine friendly Open Knowledge Repository, and allowing the public to distribute, reuse and build upon much of its work – including commercially. The bank says the repository is a one-stop-shop for most of its research outputs and knowledge products, providing free and unrestricted access to students, libraries, government officials and anyone interested in its knowledge. Other material, including foreign language editions and links to datasets, will be added in the coming year. Further, the World Bank will become the first major international organisation to make much of its research output available under Creative Commons licensing. This will mean that any user located anywhere in the world will be able to read, download, save, copy, print, reuse and link to the full text of the bank’s research work, free of charge... The Wellcome Trust, which provides £400 million (US$636 million) a year in funds for research on human and animal health, announced on 10 April that it too would throw its weight behind efforts by scientists to make their work freely available to all. It said it would launch its own free online publication to compete with existing academic journals in an effort to force publishers to increase free access. Currently most scientific journals are only available by subscription. Sir Mark Walport, head of the Wellcome Trust, said: ‘One of the important things is that up until now if I submit a paper to a journal I've been signing away the copyright, and that's actually ridiculous...’ Speaking to BBC Radio’s Today programme, he said the paradox was that peer review was one of the biggest costs of publishing papers: scientists do it for free and then the fruits of their review work are ‘locked behind a paywall’. 
This week’s moves will be welcomed by nearly 9,000 researchers who signed up to a boycott of journals that restrict free sharing, initiated by Tim Gowers, the British mathematician. It is part of a campaign that supporters call the 'academic spring', due to its aim to revolutionise the spread of knowledge. The European Commission's Kroes stressed that its digital openness proposal would be about sharing both findings and data. On data, she said the world was just beginning to realise how significant a transformation of science the openness enabled by ICT infrastructures can mean. ‘We [are entering] the era of open science,’ she said. ‘Take ‘Big Data’ analysis. Every year, the scientific community produces data 20 times as large as that held in the US Library of Congress.’ Big data needs big collaboration. Without that, it is not possible to collect, combine and conclude results from different experiments, in different countries and disciplines, she added, citing the example of genome sequencing... ‘That is why we've invested in high-speed research networks like GÉANT. Today, GÉANT is connecting millions of researchers, scholars, educators and students. That is why we want to promote ever better and open infrastructures for research collaboration...’ The UK government has already signalled its intention to press for increased access to public knowledge or data created by publicly funded research and universities... In the long run there is a huge potential cost saving to make, since British universities spend £200 million a year on subscriptions to electronic databases and journals and many of Britain’s big universities spend around £1 million a year with publishers, according to a report in the Guardian.” |
Scam Publisher Fools Swedish Cranks Posted: 12 Apr 2012 03:00 PM PDT Aardvarchaeology, (12 Apr 2012) "Perennial Aard favourites N-A. Mörner and B.G. Lind have published another note in a thematically unrelated journal. It's much like the one they snuck past peer review into Geografiska Annaler in 2009 and which Alun Salt and I challenged in 2011. The new paper is as usual completely out of touch with real archaeology, misdating Ales stenar by over 1000 years and comparing it to Stonehenge using the megalithic yard. No mention is made of the fact that this unit of measurement was dreamed up by professor of engineering cum crank archaeoastronomer Alexander Thom and has never had any standing in academic archaeology. The megalithic yard does not exist. At first I thought, damn, they've managed to game the system again. But then I looked into the thing some more and came to the conclusion that this time, Mörner & Lind have been scammed, poor bastards. The journal they've published in is named the International Journal of Astronomy and Astrophysics. It's an on-line Open Access quarterly, and though it has an ISSN number for a paper version as well, this is not held by any Swedish library. This may not be cause for suspicion, because the journal is new: its first four issues appeared last year. The Head Editor is a professor of astronomy at a young English university that is quite highly ranked within the UK. So far, it may look like Mörner & Lind have simply published in a low-impact but legit academic venue. But let's have a look at the publishers of IJAA, Scientific Research Publishing (SCIRP). This outfit publishes from Irvine, CA, but its web site is registered in Wuhan, China, where its president Huaibei "Barry" Zhou is based. He is apparently a physicist. According to a 2010 statement by Zhou to Nature News, he co-founded SCIRP in 2006 or 2007. In the five or six years since, the firm has launched over 150 on-line Open Access journals. Uh-oh. 
Suspicions about SCIRP began to gather in December 2009, when Improbable Research, the body behind the IgNobel Prize, said the publisher might offer "the world's strangest collection of academic journals". Improbable Research pointed out that at the time, SCIRP's journals were repurposing and republishing decade-old papers from bona-fide journals, sometimes repeating the same old paper in several of its journals, and offering scholars in unrelated fields places on editorial boards. This was taken up by Nature News in January 2010, when they contacted Zhou and received the explanation that the old papers had appeared on the web site by mistake after having been used to mock up journals for design purposes. "They just set up the website to make it look nice", said Zhou. While he had otherwise represented himself as president of SCIRP, Zhou now told Nature News that he helped to run the journals in a volunteer capacity. The piece reports that SCIRP had listed several scholars on editorial boards without asking them first, in some cases recruiting the names of people in completely irrelevant fields. In other cases, scholars had agreed to join because a SCIRP journal's name was similar to that of a respected publication in their field. Recruitment efforts by e-mail had apparently been intensive and scattershot. Now, what is this really about? Why is SCIRP cranking out all of these fly-by-night fringe journals that anybody can read for free? The feeling across the web is that it's most likely a scam utilising a new source of income: the "author pays" model built into bona fide Open Access publishing. A kinder way to put it would be that SCIRP is a pseudo-academic vanity press... But Mörner & Lind's new paper has clearly not been vetted by any competent scholar. This suggests that anybody can publish anything in SCIRP's International Journal of Astronomy and Astrophysics as long as they pay the fee. 
Its Head Editor tells me by e-mail that he is ‘concerned about the refereeing process and should investigate’. And as for the other 150 SCIRP journals? Well, what can you tell me, Dear Reader?” |
A (free) roundup of content on the Academic Spring Posted: 12 Apr 2012 02:57 PM PDT Higher Education Network | guardian.co.uk, (12 Apr 2012) [Use the link above to access the full text, which provides a list of articles covering the “Academic Spring” recently published by the Guardian and suggestions for additional reading from around the web.] “As far back as 2004 the seeming contradiction of publicly-funded research made available only at prohibitive cost through journals has attracted the attention of HE leaders and policy makers. In July of that year, journalist Donald MacLeod reported on what MPs were calling ‘a revolution in academic publishing, which would make scientific research freely available on the internet’. At the time, Sir Keith O'Nions, director general of the research councils, said: ‘I think it would be a pretty brave decision of the government at the present time to say it has sufficient confidence in the open access business model ... to shift rapidly from something it knows and trusts to an open access model.’ And there the case rested. Fast forward to the present day: add the near-ubiquitous use of social media, the growth of the 'copyleft' movement which seeks to allow work to be shared more freely, and a blog by a Cambridge mathematician announcing that he would no longer be submitting papers to Elsevier, the largest publisher of scientific journals, and the Academic Spring was born...” |