April 25, 2012

Journal Publishers in China Vow to Clamp Down on Academic Fraud - SCIENTIFIC AMERICAN

By David Cyranoski of Nature magazine
China's roughly 5,300 home-grown journals have been a receptacle for much of the nation's misconduct-tainted research
The China Association for Science and Technology (CAST) in Beijing has taken the lead among the country's publishers in trying to clamp down on academic misconduct. This month, it issued a declaration from the 1,050 journals it oversees -- part of increasingly aggressive nationwide efforts to clean up China's bloated scientific publishing industry and bring its home-grown journals, in both English and Chinese, up to international standards.
In the declaration, journal editors in chief and affiliated society presidents commit to following CAST guidelines issued in 2009. The document defines many types of fraud and lists possible penalties for miscreant authors -- from written warnings to blacklisting or informing home institutions and funding agencies about the misconduct. Reviewers who abuse their privilege -- by, for example, plagiarizing an article -- can also face blacklisting and public disclosure.
That is a step in the right direction, says Chun-Hua Yan, associate editor-in-chief of the CAST-administered Journal of Rare Earths, based in Beijing. Yan says that many editors had not been aware that some subtle forms of wrongdoing -- such as favoring papers on the basis of personal relations or offering honorary authorship -- were types of misconduct. "There are some soft or grey areas. These are now more clear to all the editors," he says. Suning You, president of the Chinese Medical Association Publishing House in Beijing, which has 126 journals administered by CAST, agrees. "The declaration will purify the academic environment to create first-class medical journals, thus achieving social and economic benefits," he says.
Clampdown on misconduct
China's academia and government alike have taken measures to curb misconduct in recent years, with institutions such as Zhejiang University in Hangzhou taking the lead (see Nature 481, 134-136; 2012). The CAST declaration itself follows the announcement of rules from China's education ministry that require universities to monitor misconduct closely (see Nature 483, 378-379; 2012).
The country's roughly 5,300 home-grown journals have been a receptacle for much of this misconduct-tainted research. Two years ago, the government vowed to get rid of the most problematic publications (see Nature 467, 261; 2010), but that weeding process hasn't happened yet.
Yan says that the latest declaration will put pressure on journals to fall in line. "Many are just commercial journals, just there to make money," he says. "We cannot make an announcement that 'these are bad journals', but we can show the right way to publish."
A stronger incentive -- money -- might force the issue. According to Yan, China's finance ministry is starting a program that will spend 100 million renminbi (US$16 million) per year to improve journals. By the end of 2012, a committee will rank the country's publications into three tiers on the basis of their international and Chinese impact factors and other measures of international influence, such as the number of overseas subscriptions and the number of foreign editorial-board members. Journals ranked in the first tier will get a bonus of 100,000 renminbi per year, and those in the second, 50,000 renminbi. Third-tier publications will get nothing.
Yan says that the money could as much as double his journal's current budget, and allow the publication to waive publishing fees for top papers, train young researchers in how to write scientific papers, invite international advisory-board members to China to discuss possible improvements, and enhance software for electronic submission and review systems. He hopes that some Chinese-language journals will become internationally relevant, "followed by scientists around the world".
But there are skeptics. Cong Cao, a science-policy researcher specializing in China at the University of Nottingham, UK, says that neither the extra funding nor the editors' declaration will have much of an impact. China's 5,300 journals account for roughly one-third of the world's science and technology journals and, by Cao's estimate, publish around 600,000 papers per year. That, he says, "represents a huge business". The journals attract "those who have to fill institutionally set publication requirements", adds Cao. "The real question that China's scientific leadership as well as scientific publishers have to consider is: does China really need that many journals in the first place?"
This article is reproduced with permission from the magazine Nature. The article was first published on April 25, 2012.

April 16, 2012

A Sharp Rise in Retractions Prompts Calls for Reform - The New York Times

By CARL ZIMMER

In the fall of 2010, Dr. Ferric C. Fang made an unsettling discovery. Dr. Fang, who is editor in chief of the journal Infection and Immunity, found that one of his authors had doctored several papers.        
It was a new experience for him. “Prior to that time,” he said in an interview, “Infection and Immunity had only retracted nine articles over a 40-year period.”
The journal wound up retracting six of the papers from the author, Naoki Mori of the University of the Ryukyus in Japan. And it soon became clear that Infection and Immunity was hardly the only victim of Dr. Mori’s misconduct. Since then, other scientific journals have retracted two dozen of his papers, according to the watchdog blog Retraction Watch.
“Nobody had noticed the whole thing was rotten,” said Dr. Fang, who is a professor at the University of Washington School of Medicine.
Dr. Fang became curious how far the rot extended. To find out, he teamed up with a fellow editor at the journal, Dr. Arturo Casadevall of the Albert Einstein College of Medicine in New York. And before long they reached a troubling conclusion: not only that retractions were rising at an alarming rate, but that retractions were just a manifestation of a much more profound problem — “a symptom of a dysfunctional scientific climate,” as Dr. Fang put it.
Dr. Casadevall, now editor in chief of the journal mBio, said he feared that science had turned into a winner-take-all game with perverse incentives that lead scientists to cut corners and, in some cases, commit acts of misconduct.
“This is a tremendous threat,” he said.
Last month, in a pair of editorials in Infection and Immunity, the two editors issued a plea for fundamental reforms. They also presented their concerns at the March 27 meeting of the National Academies of Sciences committee on science, technology and the law.
Members of the committee agreed with their assessment. “I think this is really coming to a head,” said Dr. Roberta B. Ness, dean of the University of Texas School of Public Health. And Dr. David Korn of Harvard Medical School agreed that “there are problems all through the system.”
No one claims that science was ever free of misconduct or bad research. Indeed, the scientific method itself is intended to overcome mistakes and misdeeds. When scientists make a new discovery, others review the research skeptically before it is published. And once it is, the scientific community can try to replicate the results to see if they hold up.
But critics like Dr. Fang and Dr. Casadevall argue that science has changed in some worrying ways in recent decades — especially biomedical research, which consumes a larger and larger share of government science spending.
In October 2011, for example, the journal Nature reported that published retractions had increased tenfold over the past decade, while the number of published papers had increased by just 44 percent. In 2010, The Journal of Medical Ethics published a study finding that the recent raft of retractions was a mix of misconduct and honest scientific mistakes.
Several factors are at play here, scientists say. One may be that because journals are now online, bad papers are simply reaching a wider audience, making it more likely that errors will be spotted. “You can sit at your laptop and pull a lot of different papers together,” Dr. Fang said.
But other forces are more pernicious. To survive professionally, scientists feel the need to publish as many papers as possible, and to get them into high-profile journals. And sometimes they cut corners or even commit misconduct to get there.
To measure this claim, Dr. Fang and Dr. Casadevall looked at the rate of retractions in 17 journals from 2001 to 2010 and compared it with the journals’ “impact factor,” a score based on how often their papers are cited by scientists. The higher a journal’s impact factor, the two editors found, the higher its retraction rate.
The highest “retraction index” in the study went to one of the world’s leading medical journals, The New England Journal of Medicine. In a statement for this article, it questioned the study’s methodology, noting that it considered only papers with abstracts, which are included in a small fraction of studies published in each issue. “Because our denominator was low, the index was high,” the statement said.
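For readers who want to see the arithmetic, the following short Python sketch shows how a retraction index of this kind might be computed and set against impact factor. It is an illustration only, not the editors' own code: the journal names and figures are invented placeholders, and the per-1,000-articles scaling is an assumption consistent with the description above.

    # Hypothetical sketch of a "retraction index" calculation.
    # All journal names and numbers are made-up placeholder values.
    journals = [
        # (name, retractions 2001-2010, articles with abstracts 2001-2010, impact factor)
        ("Journal A", 12, 4000, 30.0),
        ("Journal B", 5, 6500, 9.0),
        ("Journal C", 1, 3200, 2.5),
    ]

    def retraction_index(retractions, articles_with_abstracts):
        # Retractions per 1,000 published articles with abstracts
        # (the scaling factor is an assumption, not taken from the study).
        return 1000 * retractions / articles_with_abstracts

    for name, retracted, published, impact in journals:
        print(f"{name}: impact factor {impact:5.1f}, "
              f"retraction index {retraction_index(retracted, published):.2f}")

With real counts in place of the placeholders, the two columns could then be compared to look for the positive relationship the editors report.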
Monica M. Bradford, executive editor of the journal Science, suggested that the extra attention high-impact journals get might be part of the reason for their higher rate of retraction. “Papers making the most dramatic advances will be subject to the most scrutiny,” she said.
Dr. Fang says that may well be true, but adds that it cuts both ways — that the scramble to publish in high-impact journals may be leading to more and more errors. Each year, every laboratory produces a new crop of Ph.D.’s, who must compete for a small number of jobs, and the competition is getting fiercer. In 1973, more than half of biologists had a tenure-track job within six years of getting a Ph.D. By 2006 the figure was down to 15 percent.
Yet labs continue to have an incentive to take on lots of graduate students to produce more research. “I refer to it as a pyramid scheme,” said Paula Stephan, a Georgia State University economist and author of “How Economics Shapes Science,” published in January by Harvard University Press.       
In such an environment, a high-profile paper can mean the difference between a career in science and leaving the field. “It’s becoming the price of admission,” Dr. Fang said.
The scramble isn’t over once young scientists get a job. “Everyone feels nervous even when they’re successful,” he continued. “They ask, ‘Will this be the beginning of the decline?’ ”
University laboratories count on a steady stream of grants from the government and other sources. The National Institutes of Health accepts a much lower percentage of grant applications today than in earlier decades. At the same time, many universities expect scientists to draw an increasing part of their salaries from grants, and these pressures have influenced how scientists are promoted.
“What people do is they count papers, and they look at the prestige of the journal in which the research is published, and they see how many grant dollars scientists have, and if they don’t have funding, they don’t get promoted,” Dr. Fang said. “It’s not about the quality of the research.”
Dr. Ness likens scientists today to small-business owners, rather than people trying to satisfy their curiosity about how the world works. “You’re marketing and selling to other scientists,” she said. “To the degree you can market and sell your products better, you’re creating the revenue stream to fund your enterprise.”
Universities want to attract successful scientists, and so they have erected a glut of science buildings, Dr. Stephan said. Some universities have gone into debt, betting that the flow of grant money will eventually pay off the loans. “It’s really going to bite them,” she said.
With all this pressure on scientists, they may lack the extra time to check their own research — to figure out why some of their data doesn’t fit their hypothesis, for example. Instead, they have to be concerned about publishing papers before someone else publishes the same results.
“You can’t afford to fail, to have your hypothesis disproven,” Dr. Fang said. “It’s a small minority of scientists who engage in frank misconduct. It’s a much more insidious thing that you feel compelled to put the best face on everything.”
Adding to the pressure, thousands of new Ph.D. scientists are coming out of countries like China and India. Writing in the April 5 issue of Nature, Dr. Stephan points out that a number of countries — including China, South Korea and Turkey — now offer cash rewards to scientists who get papers into high-profile journals. She has found these incentives set off a flood of extra papers submitted to those journals, with few actually being published in them. “It clearly burdens the system,” she said.
To change the system, Dr. Fang and Dr. Casadevall say, start by giving graduate students a better understanding of science’s ground rules — what Dr. Casadevall calls “the science of how you know what you know.”
They would also move away from the winner-take-all system, in which grants are concentrated among a small fraction of scientists. One way to do that may be to put a cap on the grants any one lab can receive.
Such a shift would require scientists to surrender some of their most cherished practices — the priority rule, for example, which gives all the credit for a scientific discovery to whoever publishes results first. (Three centuries ago, Isaac Newton and Gottfried Leibniz were bickering about who invented calculus.) Dr. Casadevall thinks it leads to rival research teams’ obsessing over secrecy, and rushing out their papers to beat their competitors. “And that can’t be good,” he said.
To ease such cutthroat competition, the two editors would also change the rules for scientific prizes and would have universities take collaboration into account when they decide on promotions.
Ms. Bradford, of Science magazine, agreed. “I would agree that a scientist’s career advancement should not depend solely on the publications listed on his or her C.V.,” she said, “and that there is much room for improvement in how scientific talent in all its diversity can be nurtured.”
Even scientists who are sympathetic to the idea of fundamental change are skeptical that it will happen any time soon. “I don’t think they have much chance of changing what they’re talking about,” said Dr. Korn, of Harvard.
But Dr. Fang worries that the situation could become much more dire if nothing happens soon. “When our generation goes away, where is the new generation going to be?” he asked. “All the scientists I know are so anxious about their funding that they don’t make inspiring role models. I heard it from my own kids, who went into art and music respectively. They said, ‘You know, we see you, and you don’t look very happy.’ ”

April 12, 2012

Honest work - NATURE

Nature 484, 141 (12 April 2012), doi:10.1038/484141b
The plagiarism police deserve thanks for defending the honour of the PhD.
Last week, Hungary's President Pál Schmitt was forced to resign because of plagiarism detected in his 1992 PhD thesis on physical education. Tivadar Tulassay, rector of Budapest's prestigious Semmelweis University, showed admirable courage by standing up to the Hungarian establishment and revoking the doctorate a few days earlier, after experts appointed by the university declared that Schmitt's thesis “failed to meet scientific and ethical standards”. Tulassay, a cardiovascular researcher, has since assumed personal responsibility for his university's decision to revoke Schmitt's title.
The affair has remarkable parallels with that of Germany's former defence minister, Karl-Theodor zu Guttenberg, who resigned in March last year after his own PhD thesis, in law, had been revoked by the University of Bayreuth.
Like Schmitt, zu Guttenberg tried at first to deny plagiarism charges, then to underplay them, and he enjoyed powerful political support — until protests by a movement of honest PhD holders made his situation untenable. Plagiarism hunters have other prominent personalities in their sights, and are not necessarily going to be stopped just because a thesis is not in electronic form — if suspicion is high, they will digitize it themselves.
In many central European countries, an academic title is a decided advantage for a political career; clearly, some ambitious politicians think nothing of obtaining such a title by cheating. We can thank the plagiarism hunters — whatever their individual motives — for exposing dishonesty among those who govern us, and for defending the honour of a PhD. The only safe doctorate these days is an honestly acquired one.

April 4, 2012

A lot of science is just plain wrong

Suddenly, everybody’s saying it: the scientific and medical literature is riddled with poor studies, irreproducible results, concealed data and sloppy mistakes.
Since these studies underpin a huge number of government policies, from health to the environment, that’s a serious charge.
Let’s start with Stan Young, Assistant Director of Bioinformatics at the US National Institute of Statistical Sciences. He recently gave evidence to the US Congress Committee on Science, Space and Technology about the quality of science used by the US Environmental Protection Agency.
Some might think, he said, that peer review is enough to assure the quality of the work, but it isn’t. “Peer review only says that the work meets the common standards of the discipline and, on the face of it, the claims are plausible. Scientists doing peer review essentially never ask for data sets and subject the paper to the level of examination that is possible by making data electronically available.”
He called for the EPA to make the data underlying key regulations, such as those on air pollution and mortality, available. Without it, he said, those papers are “trust me” science. Authors of research reports funded by the EPA should provide, at the time of publication, three things: the study protocol, the statistical analysis code, and an electronic copy of the data used in the publication.
Further, he calls for data collection and analysis to be funded separately, since they call for different skills; when data collection and analysis are funded together, there is a natural tendency for authors not to share the data until the last ounce of information has been extracted. “It would be better to open up the analysis to multiple teams of scientists.”
The problem of data access is not unique to the EPA, or the US. Despite the open data claims made by the UK Government, many sets of data in the social sciences gathered at government expense are not routinely available to scholars, a point made at a conference last month at the British Academy under the auspices of its Languages and Quantitative Skills programme.
Often this is data that is too detailed, sensitive and confidential for general release but that can be made available to researchers through organisations such as the Secure Data Service, which is funded by the Economic and Social Research Council. But complaints were made at the conference that SDS data is three years late in being released.
Accessibility of data was also among the points made in a damning survey of cancer research published last week in Nature (1). Glenn Begley spent ten years as head of global cancer research at the biotech firm Amgen, and paints a dismal picture of the quality of much academic cancer research. He set a team of 100 scientists to follow up papers that appeared to suggest new targets for cancer drugs, and found that the vast majority – all but six out of 53 “landmark” publications – could not be reproduced.
That meant that money spent trying to develop drugs on the basis of these papers would have been wasted, and patients might have been put at risk in trials that were never going to result in useful medicines. “It was shocking,” Dr Begley told Reuters. “These are the studies that the pharmaceutical industry relies on to identify new targets for drug development. But if you’re going to place a $1 million or $2 million or $5 million bet on an observation, you need to be sure it’s true. As we tried to reproduce these papers we became convinced that you can’t take anything at face value.”
He suggests that researchers should, as in clinical research, be blinded to the control and treatment arms, and that they should be obliged to report all data, negative as well as positive. He also recounted to Reuters a shocking story of a meeting at a conference with the lead author of one of these irreproducible studies. Begley took the author through the paper line by line, explaining that his team had repeated the experiment 50 times without getting the result reported. “He said they’d done it six times and got this result once, but put it in the paper because it made the best story. It’s very disillusioning.”
Intense academic pressure to publish, ideally in prestige journals, and the failure of those journals to make proper checks have both contributed to the problem. Journal editors – even those at Nature, where Begley’s study was published – seem reluctant to acknowledge it. Nature published an editorial that seemed to place the blame on sloppy mistakes and carelessness, but I read Begley’s warning as much more fundamental than that, as did many of those who commented on the editorial.
This website has identified a few examples of implausible results published in distinguished journals, but the editors of those journals don’t seem very bothered. In an era when online publishing offers instant feedback and an essentially limitless capacity to publish data, the journals are too eager to sustain their mystique, and too reluctant to admit to error. That said, retractions have increased ten-fold over the past decade, while the literature itself has grown by only 44 per cent, according to evidence given to a US National Academy of Sciences committee last month.
Stan Young, however, does not blame the editors. In an article in last September’s issue of Significance (2), he and his colleague Alan Karr argue that quality control cannot be exercised solely at the end of the process, by throwing out defective studies, let alone left to the replication stage. It must be exercised at every stage, by scientists, funders, and academic institutions.
“At present researchers – and, just as important, the public at large – are being deceived, and are being deceived in the name of science. This should not be allowed to continue”, Young and Karr conclude.

References
1. Raise standards for preclinical cancer research, by C. Glenn Begley and Lee M. Ellis, Nature 483, pp. 531-533, 29 March 2012
2. Deming, data and observational studies, by S. Stanley Young and Alan Karr, Significance, September 2011, pp. 116-120

April 2, 2012

Hungarian President Resigns in Plagiarism Scandal

BUDAPEST, Hungary (AP) — Hungarian President Pal Schmitt resigned Monday because of a plagiarism scandal regarding a doctoral dissertation he had written 20 years ago on the Olympics.

Rector quits after plagiarism scandal

The rector of Budapest's Semmelweis University has announced his resignation, the latest twist in a plagiarism scandal surrounding President Pal Schmitt.
Tivadar Tulassay's announcement followed last week's decision to strip Schmitt of his 1992 doctorate, after an investigative committee set up by the university found he had copied 'word-for-word' large passages of other academics' work in his thesis.
According to the university website Tulassay resigned because he felt he had lost the trust of his political superiors and had been left alone to handle Schmitt's case.
Earlier on Sunday Schmitt told national radio that only the law courts, rather than the university, could revoke his doctorate.
Prime Minister Viktor Orban, a close ally of Schmitt, told public radio in a recent interview that the president alone must decide whether he should resign from the largely ceremonial post.
Opposition parties, meanwhile, called on him to resign and on Saturday hundreds of people marched from the city center to the presidential palace in the castle district.
The demonstrators joined a handful of people who have set up a small protest camp outside the palace.
Chanting 'Resign' and holding placards that read 'Bullschmitt' and 'Pal the lying doctor', they expressed their frustration at Schmitt's public interview on Friday night, in which he denied any wrongdoing and said he would stay in office.
'I am here because I have had enough of our politicians, who have been lying to our faces on a daily basis. I demand Pal Schmitt's resignation because he did not get his education through honest means and cannot represent Hungary,' said Balint, a 37-year-old freelance designer. skynews.com.au
