Turku University Library

Evaluation based on scientific publications

This guide is based on the LibGuide created by Tritonia Academic Library.

Remember when analyzing journals

  • Most of the tools analyze journals as publication channels. These tools are not intended for evaluating the quality of a single article.
  • Most of the tools are based on citations, so you can't compare results between disciplines.
  • More information on indicators for evaluating journals is available in the Finnish national guide to publication metrics.


Analyzing journals

Impact Factor (IF) is the oldest and most widely used indicator for measuring the impact of journals. Clarivate Analytics has the exclusive right to the Impact Factor; IF values are available only in the Web of Science and Journal Citation Reports (JCR) databases and are updated annually.

The Journal Impact Factor for 2017 is calculated as follows:

First, citations in 2017 to items published in the journal in 2015 and 2016 are added up. The sum is divided by the total number of citable items published in the journal in 2015 and 2016. The result is the Journal Impact Factor.
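
The calculation above can be sketched in Python. The citation and item counts below are hypothetical examples, not real JCR data:

```python
# Journal Impact Factor for 2017, using hypothetical example numbers.
# Real IF values are available only in Clarivate's Journal Citation Reports.

citations_2017_to_2015_items = 150  # citations in 2017 to items published in 2015
citations_2017_to_2016_items = 210  # citations in 2017 to items published in 2016
citable_items_2015 = 80             # citable items published in 2015
citable_items_2016 = 100            # citable items published in 2016

impact_factor_2017 = (
    (citations_2017_to_2015_items + citations_2017_to_2016_items)
    / (citable_items_2015 + citable_items_2016)
)

print(f"Journal Impact Factor 2017: {impact_factor_2017:.2f}")  # prints 2.00
```

With these example numbers, 360 citations divided by 180 citable items gives an IF of 2.00.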

Restrictions of the impact factor

  • The impact factors of journals of different disciplines are not comparable!
  • The impact factor favours disciplines where the publication rate is fast and new articles are cited quickly. IF is best suited for the natural sciences and medicine.
  • Journals that publish many articles (especially review articles) usually have higher IF values than those that publish only a few articles.
  • Journals in specialised fields tend to have low IF numbers.
  • When calculating IF values, only citations from journals indexed in the Clarivate Analytics Journal Citation Reports (JCR) database are taken into account. Citations from other journals and from monographic publications are therefore not included in the IF calculation. However, journal self-citations are included.

Do it yourself: finding IF numbers of a specific journal

[Screen capture of the JCR search box, visualising the written instructions.]

1. Access the Journal Citation Reports (JCR) database.

2. Write the name of the journal into the search box.

3. Now you can see information on the journal of interest and its IF values from the past few years.

4. If you want to know the journal's position in its subject category, scroll down the page until you see the link "Click here to view Rank, Cited Journal Data, Citing Journal Data, Box Plot, and Journal Relationships". Then select "Rank" on the left.



Do it yourself: comparing journals in a certain category

1. Access the Journal Citation Reports (JCR) database.

2. Select "Browse by Category".

3. In the menu on the left you will find "Select Category". From there, select the category you want to explore. Remember that it is advisable to compare only journals within the same category.

4. Scroll down and select "Submit".

5. If you see only the names of the journals, go to the top of the page and select "Journals by Rank".



Publication Forum is a rating and classification system to support the quality assessment of research output. The evaluation is performed by discipline-specific Expert Panels. Publication Forum is part of the university funding model established by the Ministry of Education and Culture.

Publication Forum levels:
1 = basic level
2 = leading level
3 = highest level

Restrictions of Publication Forum

  • The purpose of the tool is not to evaluate single articles.
  • Publication Forum levels are suitable only for analyzing large numbers of publications.
  • Publication Forum levels 2 and 3 are awarded to a limited number of publications per discipline panel.
  • User guide for the Publication Forum classification

Do it yourself: search for the Publication Forum level of a certain journal

1. Go to Publication Forum (link above).

2. Write the name of the journal and select Search.

3. Search results appear.


Do it yourself: search for the Publication Forum levels of all journals on a certain topic

1. Go to Publication Forum (link above).

2. Select whether you want to search by panels or by categories (for example MinEdu, Web of Science or Scopus fields).

3. Click Search.

4. You can also download results as an Excel file.

Scopus has several tools to evaluate journals:

  • CiteScore is based on citations. For example, the CiteScore for the year 2019 is calculated in the following way:
    First, citations received in 2016–2019 to documents published in the journal in 2016, 2017, 2018 and 2019 are added up. The sum is divided by the total number of documents published in the journal in 2016, 2017, 2018 and 2019. The result is the CiteScore of the journal.
  • SCImago Journal Rank (SJR) measures a journal's prestige by taking into account the subject field, quality and reputation of the journal. It accounts for both the number of citations received by a journal and the prestige of the journals the citations come from (based on the SJR score). For example, if journals A and B receive the same number of citations, the SJR indicator of journal A is higher if its citations come from more prestigious journals than journal B's.

    SJR is calculated from Elsevier's Scopus citation data over a three-year period. Journal self-citations have a limited effect on the indicator value. SJR is available in Scopus or free of charge through the SCImago Journal & Country Rank service.

  • Source-Normalized Impact per Paper (SNIP) measures a source's contextual citation impact, taking into account the characteristics of the source's subject field. SNIP is calculated from the citation data of Elsevier's Scopus database over a three-year period. SNIP is available in Scopus or free of charge through the CWTS Journal Indicators website.
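
The CiteScore formula described above can be sketched the same way as the Impact Factor. The numbers below are hypothetical examples, not real Scopus data:

```python
# CiteScore 2019, using hypothetical example numbers.
# Real CiteScore values come from Elsevier's Scopus database.

# Citations received in 2016-2019 to documents the journal published in 2016-2019
citations_2016_2019 = 1200

# Documents published in the journal in 2016-2019
documents_2016_2019 = 400

cite_score_2019 = citations_2016_2019 / documents_2016_2019
print(f"CiteScore 2019: {cite_score_2019:.1f}")  # prints 3.0
```

Note the difference from the Impact Factor: both the citation window and the publication window span the same four years, and the denominator counts all documents rather than only citable items.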

SJR and SNIP in comparison
  • SJR is best for subject fields where authors rapidly cite other works and which have a limited number of core journals. SNIP is better suited for evaluating journal impact in more heterogeneous fields, where journals are not the main publication channel.
  • Life and Health Sciences usually have the highest SJR values (SJR tends to widen the differences between journals and enhances the position of the most prestigious ones), whereas Technology and Social Sciences tend to have the highest SNIP scores.
  • The SNIP values of small and multidisciplinary journals are best viewed with caution, as the scores may vary greatly from year to year.


Do it yourself: journal metrics in Scopus

1. Go to Scopus.

2. Select Sources from the top of the page.

3. If you want to search for a specific journal, select Title from the drop-down menu and write the name of the journal into the search box.

[Screen capture of the journal title search, visualising the written instructions.]

4. On the results page you can see the CiteScore, SJR and SNIP values for the journal.


Do it yourself: comparing journal categories in Scopus

1. Access the Scopus database.

2. Select Sources from the top of the page.

3. If you want to compare journals in a certain category, click the search box and select the category you are interested in from the menu.

[Screen capture of selecting a category, visualising the written instructions.]

4. Scroll down the page and select Apply.

5. On the results page you can choose how to sort the results, for example by CiteScore or SJR.

As open access publishing has increased, so have attempts at fraud. So-called predatory open access journals collect overly large author fees, or their peer review does not fulfill the set requirements. Notification of the publication fee may also arrive only after the article is ready for publishing.

To analyze the reliability of the publisher, you can use the following check-list:

• does the publisher disclose full contact information, including an address?
• is information available on who owns the publisher and in which country the publisher operates?
• who are the members of the journal's editorial board and do they provide full information on their institutional affiliation?
• is there information on the author fees?
• are the previous articles written by well-known scholars and institutions and are they of a high-quality in your opinion?
• what does the peer-review process entail and is there information on it?
• is the publication listed in the DOAJ directory?
• is the publisher a member of OASPA?
• are the terms of the publishing agreement and copyright reasonable?

The increase in open access publishing has also attracted fraudulent publishers. These players are called predatory journals, predatory publications, or predatory publishers. They exploit the open access publishing business model, in which authors pay a fee to make their article freely available.

Predatory journals do collect author fees, but they do not fulfil the requirements of proper scientific publishing, such as the peer review process. In practice, predatory journals publish, without any quality control, everything they are paid to publish. The only purpose of these journals is to make money. Note, though, that collecting author fees alone does not make a journal predatory; legitimate journals also collect author fees.

Some characteristics of predatory journals

  • they steal or imitate legitimate journals’ identities & logos
  • the scope of the journal is unreasonably broad (”International Journal of Science”)
  • they give incomplete, fake, or no information on the editorial board
  • the publisher has a large number of journals, but only a few of them actually publish anything
  • they make false claims that the journal is indexed in well-known scholarly databases (for example Scopus, Web of Science)
  • they imply value by using misleading or fake metrics (fake impact factors)
  • they try to lure researchers to publish by contacting them with flattering e-mails
  • they claim to have a peer review process, but they do not, or it is suspiciously fast
  • they provide incomplete, incorrect, or no information about author fees

More information:

Think. Check. Submit. checklist

Simon Linacre (2022): The Predator Effect: Understanding the Past, Present and Future of Deceptive Academic Journals