It is not a trivial task. Beall’s list was imperfect, and it no longer actively exists. No universally accepted rating of scholarly journals exists, yet scholars need to know which journals are reputable and which are not. A huge industry of predatory journals has sprung up across the world, and it does an amazing job of camouflaging its crap depositories as legitimate publications. In the past, pay-to-publish journals were overwhelmingly predatory, but the rise of the open-access movement has made that criterion obsolete. In addition, there are too many weak journals, put together by well-meaning scholars but never reaching an acceptable level of quality. Because they receive few submissions, they have to water down their acceptance standards.
Some universities and even individual departments engage in sophisticated procedures for whitelisting and blacklisting journals. These do work to some extent, but they are costly and may ignite internal conflicts. I want to offer junior scholars and their T&P committees some advice for a quick-and-dirty check using readily available resources.
Here is my simple method, in two steps:
1. Go to www.scimagojr.com and look for the journal there. If it is in the first two quartiles, it is likely a good journal. If it is in quartile 3 or 4, it is still likely a legitimate journal, although not that great. If the journal is not found there, go to step 2. Scimagojr feeds from the Scopus database (where you could also go directly, but it is not as user-friendly). It captures a great number of good journals, but not all of them. Some good journals, especially in North America, simply never bothered to get indexed there; this is especially true of those published directly by universities and scholarly societies. That is why you should go to step 2 if the journal is not found.
2. Find the names of the editor and the editorial board. If you do not know who the editor is, go to scholar.google.com and see how many people cite the person and what his or her h-index is. In the social sciences, if the number is lower than about 8-10, we are dealing with a junior scholar who has not yet built a solid reputation in the field; the standards differ from field to field. By the way, here is the Sac State’s Hall of fame according to Google. There are some exceptions, like the Harvard Educational Review, which is edited by graduate students and is one of the top journals in the world. However, it is indexed by Scopus, so you would catch it in step 1 above. There may be other exceptions, but as a general rule, no respectable scholar will lend his or her name to a crappy journal. After all the technology, all the indices, all the tricks, the bottom line is still that someone’s personal reputation matters. And it works.
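The two steps above amount to a simple decision procedure, which can be sketched as a small function. This is only an illustration: the quartile and h-index values would come from your own manual lookups on scimagojr.com and scholar.google.com, the function name and thresholds are hypothetical, and the ~8-10 cutoff applies only to the social sciences.

```python
def assess_journal(sjr_quartile=None, editor_h_index=None, h_threshold=8):
    """Rough two-step verdict on a journal.

    sjr_quartile:   1-4 if the journal is listed on scimagojr.com, else None.
    editor_h_index: the editor's Google Scholar h-index, if known.
    h_threshold:    rough social-science cutoff (~8-10); varies by field.
    """
    # Step 1: check the SCImago (Scopus-based) quartile.
    if sjr_quartile in (1, 2):
        return "likely a good journal"
    if sjr_quartile in (3, 4):
        return "likely legitimate, though not that great"
    # Step 2: not indexed -- fall back on the editor's reputation.
    if editor_h_index is not None and editor_h_index >= h_threshold:
        return "editor has a solid reputation; probably legitimate"
    # Fails both tests: do not dismiss, but dig further.
    return "needs more digging (ask an expert in the field)"
```

For example, `assess_journal(sjr_quartile=2)` returns "likely a good journal", while an unindexed journal with an unknown or junior editor falls through to the "needs more digging" case described below.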
Neither of these two tests is perfect. If a journal fails both, do not dismiss it, but you will need to do some more digging. The easiest thing is to ask someone you believe is an expert in the specific field. The problem with education and other cross-disciplinary fields is that they are federations of various disciplines applied to a common phenomenon. Each sub-discipline has a different hierarchy of journals, and some are split into several traditions or approaches. This is why we should rely on some external evidence. While it is not perfect, it is better than nothing.