Tags: atlantic, medical research, misinformation, research
Today I experienced one of those small miracles where the entire universe seems to converge to say “Yes, I agree with you!”: I was e-mailed an article that expresses everything I have been examining and thinking about for the last eight months, namely, can any medical-research studies be trusted? In recent months I have become increasingly involved in researching various medical discussions. Initially disgusted by the number of people quoting statistics and “they say” aphorisms on the internet without citing any kind of research, I turned to peer-reviewed medical journals, government agencies, and well-established professional societies, which seemed promising. Boy, was I wrong.
The first problem I encountered was a marked dearth of research on certain topics even when preliminary research and letters to the editor stressed the need for follow-up studies. Why had no one taken on the topics so easily presented to them?
My second problem was faulty or insufficient research. How were “peer-reviewed” journals approving studies that drew on narrow demographics or extremely small participant pools? And what about the literature reviews and topic analyses that incorporated data ten years old or older? Or the studies that measure the long-term effects of a drug or procedure when “long-term” means six months?
The final straw was directly contradictory data between comparable research studies. What could account for one study claiming that vaccinating pregnant women in the third trimester prevents influenza in newborns 63% of the time, while another study claims that the protection is negligible (both supplying their methods and hard numbers)? While the bias of certain professional organizations (often funded by pharmaceutical companies) was obvious in some cases, even bias cannot sway hard numbers, or so I believed.
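One unglamorous explanation is plain sampling noise: with small trial populations, the 95% confidence interval around an efficacy estimate can be wide enough that a “63% effective” result and a near-negligible result are statistically compatible. A minimal sketch of the arithmetic (the trial sizes and counts below are hypothetical, chosen only for illustration):

```python
import math

def wald_ci(successes: int, n: int, z: float = 1.96):
    """Normal-approximation 95% confidence interval for a proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return max(0.0, p - half), min(1.0, p + half)

# Hypothetical small trials: study A reports ~63% protection (12 of 19),
# study B reports ~25% (3 of 12). Their confidence intervals still overlap.
a_lo, a_hi = wald_ci(12, 19)
b_lo, b_hi = wald_ci(3, 12)
print(f"Study A: {a_lo:.2f} to {a_hi:.2f}")
print(f"Study B: {b_lo:.2f} to {b_hi:.2f}")
print("Intervals overlap:", a_lo < b_hi and b_lo < a_hi)
```

A fancier interval (such as the Wilson score) would change the numbers but not the lesson: small samples leave plenty of room for headline-level contradiction between honest studies.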
So when “Lies, Damned Lies, and Medical Science” by David H. Freedman in the November issue of the Atlantic showed up in my inbox, I was more than thrilled to learn I was not alone. The article follows self-proclaimed “meta-researcher” John Ioannidis, who, along with his team, has shown “that much of what biomedical researchers conclude in published studies—conclusions that doctors keep in mind when they prescribe antibiotics or blood-pressure medication, or when they advise us to consume more fiber or less meat, or when they recommend surgery for heart disease or back pain—is misleading, exaggerated, and often flat-out wrong.”
We have become so accustomed to doctors, nutritionists, and scientists retracting studies, or to newer studies refuting older ones, that we rarely question why this happens. Most people, I would surmise, assume that science and research have improved over time and thus provided us with new evidence and information. But, according to Ioannidis, this is not the case; faulty research, again and again, is. The common errors he lists range from “what questions researchers posed, to how they set up the studies, to which patients they recruited for the studies, to which measurements they took, to how they analyzed the data, to how they presented their results, to how particular studies came to be published in medical journals.”
Does this mean scientists and researchers are lazy? Ignorant? Inherently evil? Why would such important and literally life-altering work be composed so shoddily? Bias. (Hmm…sound familiar?) It turns out that, even unintentionally, bias has a way of working itself into every step of the research process and greatly influencing outcomes, whether it is self-inflicted or the product of outside pressure such as that from those funding the research. (I cannot help but point out the irony here: we must question whether bias played a role in Ioannidis’ research on research.) And while bias is the source of the faulty research, a number of factors perpetuate the misinformation, including sensationalism, a lack of thorough research (i.e., ignoring or missing later refuting studies), and a lack of duplication of experiments.
So what is the point of research refuting research? I’ll stop summarizing the article and give you a chance to decide for yourself. But let me leave you with this last thought: Ioannidis’ research, like medical research, provokes us to examine things we held to be true. And just like medical research, it seems to come up short, leaving us with the question, “well what can we do about it?” Perhaps further research is needed. ; )
Is Google Making Us Dumb? 21% (and me) said Yes. March 5, 2010. Posted by pupfiction in Uncategorized.
Tags: future, google, intelligence, internet, research
A portion of the Pew Research Center’s project “The Future of the Internet IV” examines an article written by Nicholas Carr in the summer of 2008 entitled “Is Google Making Us Stupid?” (published in the Atlantic Monthly). The study asked respondents whether they agreed with Carr that human intelligence (measured by IQ) would not have increased, and might even have decreased, by 2020. The totals were as follows: 76% did not agree with Carr, 21% did, and 2% did not respond. What is most interesting about this study is the lengthy comments that follow, in which respondents explain their stances on human intelligence and how it will be affected by Internet information searching. Most discussion revolves around the issues we are well aware of: engaged reading has turned into skimming and jumping; what we used to have to remember we can now always look up; and (on the positive side) the number of resources available has exploded (although one could also start the debate on quality versus quantity).
Carr argues that our thinking is changing from a more strenuous method to a less vigorous one. He states that “the ease of online searching and distractions of browsing through the web were possibly limiting his capacity to concentrate.” I see changes in my own thinking daily and have always believed this to be a product of information overload. Never one to have an attention problem, I too find myself unable to read an entire article, feeling the need to skim and move on. Many of Carr’s adversaries don’t see this as a negative thing. They believe that we are now required to process information this way because we have so many sources available to us. While these objectors have a point, one has to ask where this will end. When will we max out? Information resources continue to proliferate, and there has to come a point when we say “enough is enough.” If we spend all day skimming and comparing, when does the actual thinking and decision-making take place? Clearly, I agree with Carr and his small set of followers that Google (symbolically representing the Internet as a whole) will make us dumb.
Another point the commenters seem to miss is that knowledge and intelligence are not the same thing. While someone may gain a great deal of knowledge from browsing the Internet, this does not mean they can harness or process it in any useful way. You may, however, take someone with limited knowledge and a lot of intelligence and teach them to do amazing things. True, an intelligent person with no knowledge cannot do much, but a person with a plethora of knowledge and no intelligence is just as useless. In conclusion, I would have to say that though “Google” will inevitably make us more knowledgeable, it cannot make us think more clearly.
Visual Impact: Worldmapper.org February 24, 2010. Posted by pupfiction in Uncategorized.
Tags: information, maps, reference, research, visualization
Not to bombard you with reference resources today, but I stumbled across another great site that will easily keep you occupied for a while. Worldmapper.org provides more than 700 world maps (over half of which are available in PDF form) that showcase various statistics by resizing countries to show their impact visually. A few are even animated, displaying how the world has changed over a number of years. All of the maps link to Excel spreadsheets with detailed statistics as well as sources. The organization is run by a group of college professors. I am going to include some of the most astounding maps below so you can see for yourself what an impact these can have, but make sure to check out the whole list of maps here.
Access to Research: Overcoming Barriers Report December 17, 2009. Posted by dataduchess in InformationIssues.
Tags: frustration, information, openaccess, research
A recent report from the Research Information Network, Overcoming Barriers: Access to Research Information, reaches the same conclusions that those of us who do research (in any field) have already experienced firsthand:
The report’s key finding is that access is still a major concern for researchers. Although researchers report having no problems finding content in this age of electronic information, gaining access is another matter due to the complexity of licensing arrangements, restrictions placed on researchers accessing content outside of their own institution and the laws protecting public and private sector information. This means that research into important information resources can be missing. Researchers report that they are frustrated by this lack of immediate access and that this slows their progress, hinders collaborative work and may well affect the quality and integrity of work produced.
No Library Here – Please See Google December 8, 2009. Posted by dataduchess in Uncategorized.
Tags: cutbacks, frustration, legal, libraries, research
I came across this picture last week, and although I chuckled a little, my reaction was to roll my eyes and groan at the state of libraries in this country and the lack of respect for their resources and the skills of their librarians, as more and more companies and agencies cut staff and resources to save money. The picture came from a blog post highlighting that BusinessWeek magazine recently closed its library, on the heels of the Wall Street Journal doing the same earlier this year.
Even with my awareness of the usefulness of a skilled librarian and a well-inventoried library, it didn’t hit home until I needed access to a resource that is no longer provided. This morning, for my work, I needed to find a specific set of cases to support my legal argument, and although I knew they were out there, and I knew exactly how to find them, I could not find any library within a reasonable distance that still had the resources I sought. It took me three times as long to work out a different approach to the problem, and that was only to find the resources, not to use them. Either my clients are going to get an ugly bill or I have to discount my time; either way, a good library would have made everyone happier.
Librarian Fail, Again December 2, 2009. Posted by pupfiction in Uncategorized.
Tags: databases, libraries, literacy, research, study
Project Information Literacy (PIL) has just released a report entitled “Lessons Learned: How Students Find Information in the Digital Age,” and the Free Range Librarian blog has some interesting things to say about it, namely, what many of us are painfully aware of: “that students rarely if ever consult librarians.” The study compiled the responses of 2,318 students from six different campuses (major universities as well as smaller community colleges). The major findings include:
1. Many students in the sample reported being curious, engaged, and motivated at the beginning of the course-related and everyday life research process. Respondents’ need for big-picture context, or background about a topic, was the trigger for beginning course-related (65%) or everyday life research (63%).
2. Almost every student in the sample turned to course readings—not Google—first for course-related research assignments. Likewise, Google and Wikipedia were the go-to sites for everyday life research for nearly every respondent.
3. Librarians were tremendously underutilized by students. Eight out of 10 of the respondents reported rarely, if ever, turning to librarians for help with course-related research assignments.
4. Nine out of 10 students in the sample turned to libraries for certain online scholarly research databases (such as those provided by EBSCO, JSTOR, or ProQuest) for conducting course-related research, valuing the resources for credible content, in-depth information, and the ability to meet instructors’ expectations.
5. Even though it was librarians who initially informed students about using online scholarly research databases during freshman training sessions, students in follow-up interviews reported turning to instructors as valued research coaches as they advanced through the higher levels of their education.
6. The reasons students procrastinate are no longer driven by the same pre-Internet fears of failure and lack of confidence that were once part of the college scene in the 1980s. Instead, we found that many of the digital natives in the sample (40%) tended to delay work on assignments as they juggled competing course demands from other classes.
These findings, as always, drive home my perennial argument that outreach is largely overlooked or ineffectual. This can be the librarians’ fault, the professors’ fault, or a combination of both. While students understandably feel more comfortable speaking to the professors whom they see on a daily basis, those professors should feel comfortable directing students to the librarians, and librarians should encourage this. That is the ideal situation, but how often is it actually practiced?
Furthermore, the students’ use of the same few databases time and time again leads me to believe that library web sites need to become more transparent and navigable. In fact, almost every day I help a student who says they just use EBSCO, ProQuest, or Gale without understanding the differences between the actual subscriptions (e.g., whether they are using EBSCO’s ERIC or EBSCO’s MLA Bibliography). While I don’t suggest an aggregated search across all of the library’s subscriptions, I do suggest an aggregated search that queries all the databases for a given subject. While many libraries group databases under subject-specific research guides, none that I know of search all the databases for a particular subject. I think this would greatly help students. Does anyone use such a search feature or know of a library that does? What have the results been like?
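For what it’s worth, the mechanics of such a subject-scoped search are not exotic: register each licensed database under one or more subjects, then fan a query out to every connector for that subject and merge the hits. A minimal sketch in Python, where the connectors are stand-ins for whatever vendor APIs (EBSCO, ProQuest, Gale) a library actually licenses; every function and name here is hypothetical:

```python
from typing import Callable, Dict, List

# Hypothetical connectors: in practice each would wrap a real vendor API.
def search_eric(query: str) -> List[str]:
    return [f"ERIC: {query} (hit 1)"]

def search_mla(query: str) -> List[str]:
    return [f"MLA Bibliography: {query} (hit 1)"]

# Each subject maps to the databases a librarian would group under it,
# mirroring the subject-specific research guides many libraries already keep.
SUBJECT_DATABASES: Dict[str, List[Callable[[str], List[str]]]] = {
    "education": [search_eric],
    "literature": [search_mla],
}

def subject_search(subject: str, query: str) -> List[str]:
    """Query every database registered under one subject and merge results."""
    results: List[str] = []
    for connector in SUBJECT_DATABASES.get(subject, []):
        results.extend(connector(query))
    return results
```

The student picks a subject, not a vendor, so the ERIC-versus-MLA distinction stops mattering to them; real systems would also need deduplication, relevance ranking, and per-database error handling.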
See the full report here: PIL_Fall2009_Year1Report_12_2009
Mobile Tech in the Classroom November 9, 2009. Posted by pupfiction in Uncategorized.
Tags: academia, education, mobile, research, smartphone
Anyone who has worked in education or an academic setting can vouch that technology for technology’s sake never ends well. The result? Pointless and poorly made PowerPoint presentations, Smartboard snafus, and futile attempts to seamlessly integrate multimedia with more traditional lecturing. That is why, when I stumbled across Mobile Behavior’s “Five Approaches to Mobile Technology in the Classroom,” I was understandably skeptical. Mobile Behavior, a commercial project but one with research at its core, points to some interesting studies on the effect of integrating mobile technologies into the classroom.
Most promising, perhaps, is the study by Digital Millennial, which found that such integration raises the test scores of students in low-income areas. Of course, the economic specter always looms, and one wonders who will fund such devices, especially in these days of reduced funding, when school programs are constantly being cut.
Celebrate Open Access Week! October 22, 2009. Posted by pupfiction in Uncategorized.
Tags: information, legal, openaccess, pubmed, research
The first annual Open Access Week (which began this Monday, the 19th, and ends on the 23rd) is part of a growing movement that started with a “day of action” in 2007. Openaccessweek.org is a site dedicated to the week and explains the rapidly growing movement in greater detail.
In short, the week, and the movement in general, is a collaboration among colleges, universities, and professional and academic organizations to make research freely searchable and accessible to all. The movement was started, in part, by the National Institutes of Health with its unprecedented, completely free access to the health publications in its database PubMed, many of which are full text.
Though most Open Access blogs, movements, and sites tend to overlook it, the “Consolidated Appropriations Act of 2007,” signed by President Bush, was a landmark piece of legislation: it required the National Institutes of Health to include complete electronic versions of research findings in PubMed Central.
For those of you who are still unsure what exactly constitutes an Open Access Publication, Earlham College explains:
“An Open Access Publication is one that meets the following two conditions:
- The author(s) and copyright holder(s) grant(s) to all users a free, irrevocable, worldwide, perpetual right of access to, and a license to copy, use, distribute, transmit and display the work publicly and to make and distribute derivative works, in any digital medium for any responsible purpose, subject to proper attribution of authorship, as well as the right to make small numbers of printed copies for their personal use.
- A complete version of the work and all supplemental materials, including a copy of the permission as stated above, in a suitable standard electronic format is deposited immediately upon initial publication in at least one online repository that is supported by an academic institution, scholarly society, government agency, or other well-established organization that seeks to enable open access, unrestricted distribution, interoperability, and long-term archiving (for the biomedical sciences, PubMed Central is such a repository).”
Want to know how you can join the fight? Add your signature to the Budapest Open Access Initiative here.