Impact
During a Hangout On Air (HOA) there was a discussion about Open Access journals. Brent Neal wrote a follow-up post
Open Access
where I mentioned that journal impact factors should be discussed, as they play a role in assessing the quality of a journal. Some Open Access journals are good and some are not so good. How can you tell? There's some nuance and disagreement about impact factors, but I'll get to that later.
First, I want to give a little background and continue the conversation about Open Access journals. In Brent's post, he mentioned predatory publishers and that we have all gotten spam from them, i.e., requests to consider Open Access journal X when we publish our next manuscript. One of the negative sides of predatory Open Access that I've experienced is related to peer review and the role of the editor. After you have done your job reviewing a manuscript and recommended whether it should be accepted for publication, sent back for major revision, or rejected outright, the editor takes the recommendations from all referees into consideration and informs the author(s) of his/her decision.

The problem is that some predatory Open Access journals charge the authors a significant fee, sometimes more than $1,500. In the case I am thinking of, the manuscript was poorly written and was essentially what is known as a quick communication, but it was being submitted as a full research article. The manuscript was very verbose in order to justify full-article status. The editor kept pushing to accept it, and to accept it as a full research article. I can only guess that the motivation was the fee charged to the author(s).
Setting aside predatory journals, how does one assess the quality of a journal and, more importantly, of a specific journal article? I'll discuss an example. You probably hear scientists on G+ request peer-reviewed citations when "debating" with people. I put debating in quotes because people often don't know what it really means; it does not mean arguing, but I'll save that for another post. In a debate with a commenter on one of my posts (sorry, I couldn't find the comment to link), he finally gave a link to a peer-reviewed article in Bulletins in Insectology. I'm not an entomologist, so I have no idea of the accuracy or impact of that particular article. So what do you do?
Phone a friend
Like on the show Who Wants to Be a Millionaire, one option is to ask an expert. Maybe you know an entomologist. One of the great things about G+ is that you might actually have one in your circles. Alas, I don't know any, or at least couldn't think of one. The next option is to assess the quality of the journal using impact factors.
Impact Factor (IF)
Journal Citation Reports are produced by the Institute for Scientific Information (ISI) and can be found on the ISI Web of Knowledge site, owned by Thomson Reuters. You have to have a subscription to the site, so this ties in with the Open Access discussion from an access point of view as well. A journal's impact factor for a given year is the number of citations received that year by items the journal published in the previous two years, divided by the total number of citable items it published in those two years. The idea of IF is that it gives you a sense of the average importance, or impact, of a journal's articles. You can imagine where there can be problems with this, e.g., what if a journal publishes a low number of articles per year? The Wikipedia article linked below goes through more explanation and some alternatives, like PageRank. A good example from that article is a single paper that was cited over 6,000 times, while the other articles in that journal were cited far less.
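To make the formula concrete, here's a minimal sketch of the calculation with made-up numbers (none of these figures come from a real journal):

```python
# Sketch of a 2012 impact factor for a hypothetical journal.
# All values below are invented purely for illustration.

# Citations received in 2012 to items the journal published in 2010 and 2011
citations_2012_to_2010_items = 180
citations_2012_to_2011_items = 120

# "Citable items" (articles, reviews) the journal published in 2010 and 2011
citable_items_2010 = 150
citable_items_2011 = 130

impact_factor_2012 = (
    (citations_2012_to_2010_items + citations_2012_to_2011_items)
    / (citable_items_2010 + citable_items_2011)
)

print(round(impact_factor_2012, 3))  # 300 / 280 -> 1.071
```

Note that the numerator counts citations from one year only, while the denominator counts the items published over the two preceding years.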
Getting back to the Bulletins in Insectology journal, I looked it up in the Citation Reports. Its impact factor is 0.44. In the post with the "debate" I had referenced an article in the Proceedings of the National Academy of Sciences (PNAS). Its impact factor is 9.737. Just for reference, Science has an IF of 31.027 and Nature has an IF of 38.597. You'll see them along with other details in the citation report, in the attached figure. Here's the problem: most people agree that you can't really compare IFs across different disciplines. One reason is that some research might take longer to complete and publish. So if one discipline churns out more publications, that will affect the IF. The number of articles from Bulletins in Insectology is only 42. Remember, IF divides by the number of citable items, so a low number of items actually works in a journal's favor.
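The point about a small number of citable items can be shown with a quick sketch (all numbers here are made up, not taken from any real journal): with few items in the denominator, a handful of extra citations moves the IF far more than it would for a large journal.

```python
# How denominator size affects IF sensitivity (hypothetical numbers).
def impact_factor(citations, citable_items):
    """Citations in year Y to items from Y-1 and Y-2, over those items."""
    return citations / citable_items

small = impact_factor(19, 42)          # a small journal, ~0.45
small_bump = impact_factor(29, 42)     # 10 extra citations -> ~0.69

large = impact_factor(1350, 3000)      # a large journal, also 0.45
large_bump = impact_factor(1360, 3000) # the same 10 extra citations -> ~0.45

print(round(small_bump - small, 3))  # 0.238
print(round(large_bump - large, 3))  # 0.003
```

Ten extra citations raise the small journal's IF by about 0.24 but the large journal's by only 0.003, which is one reason comparing IFs across journals of very different sizes is dicey.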
Entomology is a reasonable category for comparing Bulletins in Insectology with other journals in the same discipline. In the next figure below, you'll see a screenshot of the first page of results (sorted by IF) for journals in entomology. The IFs range from 13.589 down to 1.926. Without being able to "phone a friend," one would conclude that Bulletins in Insectology is either an obscure journal, a new journal, or one that is not ranked highly in entomology.
So the Bulletins in Insectology article that was linked is peer reviewed, which is good, but it likely has issues that prevented it from being published in a better journal in the field of entomology.
In our own fields of research, we often don't pay much attention to IF because we know which journals our peers and colleagues are publishing in. Unfortunately, some academic administrations will use impact factors to judge the quality of someone's track record for promotion purposes. Still, if a journal is not in your field, IF is one way to assess its quality, albeit with the caveats mentioned.
http://en.wikipedia.org/wiki/Impact_factor
Homer GIF via Reddit
#ScienceSunday