Evaluation based on scientific publications

This guide is based on the LibGuide created by Tritonia Academic Library.

Remember when analyzing journals

  • Most of the tools analyze journals as publication channels. These tools are not intended for evaluating the quality of a single article.
     
  • Most of the tools are based on citations, so results cannot be compared across disciplines.

Analyzing journals

Impact Factor (IF) is the oldest and most widely used indicator for measuring the impact of journals. Clarivate Analytics holds the exclusive rights to the Impact Factor; IF values are available only in the Web of Science and Journal Citation Reports (JCR) databases and are updated annually.

The Journal Impact Factor for 2017 is calculated as follows:

First, citations in 2017 to items published in the journal in 2015 and 2016 are added up. The sum is divided by the total number of citable items published in the journal in 2015 and 2016. The result is the Journal Impact Factor.
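As an arithmetic sketch, the calculation above can be written as a short function. The citation and item counts used below are hypothetical example numbers, not real journal data:

```python
# Minimal sketch of the 2017 Journal Impact Factor calculation.
# All counts below are hypothetical, not real journal data.

def impact_factor(citations, citable_items):
    """Citations in year Y to items published in Y-1 and Y-2,
    divided by the citable items published in Y-1 and Y-2."""
    return citations / citable_items

# Example: 150 citations in 2017 to items from 2015-2016,
# 60 citable items published in 2015-2016.
print(impact_factor(150, 60))  # 2.5
```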



Restrictions of the impact factor
 

  • The impact factors of journals of different disciplines are not comparable!
  • The impact factor favours disciplines where the publication rate is fast and new articles are cited quickly. IF is best suited to the natural sciences and medicine.
  • Journals that publish many articles (especially review articles) usually have higher IF values than those that publish only a few articles.
  • Journals in specialised fields tend to have low IF numbers.
  • When IF values are calculated, only citations from journals indexed in the Clarivate Analytics Journal Citation Reports (JCR) database are taken into account. Citations from other journals and from monographic publications are therefore not included in the IF calculation. Journal self-citations, however, are counted.

Do it yourself: finding IF numbers of a specific journal

1. Access the Journal Citation Reports (JCR) database.

2. Write the name of the journal into the search box.

3. You can now see the journal's information and its IF values from the last few years.

4. If you want to know the journal's position in its subject category, scroll down the page until you see the link "Click here to view Rank, Cited Journal Data, Citing Journal Data, Box Plot, and Journal Relationships". You can then select "Rank" on the left.

 

 


Do it yourself: comparing journals in a certain category

1. Access the Journal Citation Reports (JCR) database.

2. Select "Browse by Category".

3. In the menu on the left you will find "Select Category". From there, select the journal category you want to explore. Remember that it is advisable to compare only journals within the same category.

4. Scroll down and select "Submit".

5. If you see only the journal names without rank data, go to the top of the page and select "Journals by RANK".

 

 

Publication Forum is a rating and classification system that supports the quality assessment of research output. The evaluation is carried out by discipline-specific expert panels. Publication Forum is part of the university funding model established by the Finnish Ministry of Education and Culture.

Publication Forum levels:
1 = basic level
2 = leading level
3 = highest level


Restrictions of Publication Forum

  • The purpose of the tool is not to evaluate single articles.
  • Publication Forum levels are suitable for use only when analyzing a large number of publications.
  • Publication Forum levels 2 and 3 are awarded to a limited number of publications per discipline panel.

Do it yourself: search Publication Forum level of a certain journal

1. Go to Publication Forum (link above).

2. Write the name of the journal and select Search.

3. Scroll the page down to see the result.

 


Do it yourself: search Publication Forum levels of all journals of a certain topic

1. Go to Publication forum (link above).

2. Select whether you want to use MinEdu, Web of Science or Scopus fields and then choose the category of your interest.

3. Click Search and scroll the page down to see the results.

4. You can also download results as an Excel file.

Scopus has several tools to evaluate journals:

  • CiteScore is based on citations. For example, CiteScore for the year 2017 is calculated in the following way:
    First, citations in 2017 to items published in the journal in the years 2014, 2015 and 2016 are added up. The sum is divided by the total number of articles published in the journal in 2014, 2015 and 2016. The result is the CiteScore of the journal.
     
  • SCImago Journal Rank (SJR) measures a journal's prestige by taking into account the subject field, quality and reputation of the journal. It accounts both for the number of citations a journal receives and for the prestige of the journals those citations come from (based on their SJR scores). For example, if journals A and B receive the same number of citations, journal A's SJR indicator is higher if its citations come from more prestigious journals than journal B's.

    SJR is calculated from Elsevier's Scopus citation data over a three-year period. Journal self-citations are given reduced weight in the calculation. SJR is available in Scopus or free of charge through the SCImago Journal & Country Rank service.

  • Source-Normalized Impact per Paper (SNIP) measures a source's contextual citation impact, taking into account the characteristics of the source's subject field. SNIP is calculated from the citation data of Elsevier's Scopus database over a three-year period. SNIP is available in Scopus or free of charge through the CWTS Journal Indicators website.
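Like the Impact Factor, the CiteScore calculation described above is simple arithmetic over a three-year window. A minimal sketch, using hypothetical counts rather than real journal data:

```python
# Minimal sketch of the CiteScore 2017 calculation.
# All counts below are hypothetical example values.

def cite_score(citations, articles):
    """Citations in year Y to items published in Y-1..Y-3,
    divided by the articles published in Y-1..Y-3."""
    return citations / articles

# Example: 360 citations in 2017 to items from 2014-2016,
# 120 articles published in 2014-2016.
print(cite_score(360, 120))  # 3.0
```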

 
SJR and SNIP in comparison
  • SJR works best in subject fields where authors cite other works rapidly and where there is a limited number of core journals. SNIP is better suited to evaluating journal impact in more heterogeneous fields, where journals are not the main publication channel.
  • Life and health sciences usually have the highest SJR values (SJR tends to widen the differences between journals and strengthens the position of the most prestigious ones), whereas technology and the social sciences tend to have the highest SNIP scores.
  • The SNIP values of small and multidisciplinary journals are best viewed with caution, as the scores may vary greatly from year to year.
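The prestige-weighting idea behind SJR (journals A and B in the example above) can be illustrated with a toy calculation. This is not the actual SJR algorithm, whose details are considerably more involved, and all journal names and prestige scores below are made up:

```python
# Toy illustration of prestige-weighted citation counting, in the
# spirit of SJR. NOT the real SJR algorithm; all values are made up.

prestige = {"X": 3.0, "Y": 1.0}   # hypothetical prestige of citing journals

# (citing journal, number of citations) received by each journal
citations = {
    "A": [("X", 10)],   # 10 citations from prestigious journal X
    "B": [("Y", 10)],   # 10 citations from less prestigious journal Y
}

def weighted_score(received):
    """Sum of citations, each weighted by the citing journal's prestige."""
    return sum(prestige[src] * n for src, n in received)

# A and B receive the same raw citation count, but A scores higher
# because its citations come from a more prestigious source.
print(weighted_score(citations["A"]))  # 30.0
print(weighted_score(citations["B"]))  # 10.0
```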

 

Do it yourself: journal metrics in Scopus

1. Go to Scopus

2. Select Sources from top of the page.

3. If you want to search for a specific journal, select Title from the drop-down menu and type the name of the journal into the search box.

4. On the result page you can see the CiteScore, SJR and SNIP values for the journal.

 


Do it yourself: comparing journal categories in Scopus

1. Access the Scopus database.

2. Select Sources from top of the page.

3. If you want to compare journals in a certain category, click the search box and select the category you are interested in from the menu.

4. Scroll down the page to select Apply.

5. On the result page you can choose how to sort the results, for example by CiteScore or SJR.

As open access publishing has increased, so have attempts at fraud. So-called predatory open access journals charge excessively large author fees, or their peer review does not meet the expected standards. Notification of the publication fee may also arrive only after the article is ready for publishing.

To analyze the reliability of a publisher, you can use the following checklist:

• does the publisher disclose full contact information, including an address?
• is information available on who owns the publisher and in which country the publisher operates?
• who are the members of the journal's editorial board and do they provide full information on their institutional affiliation?
• is there information on the author fees?
• are the previous articles written by well-known scholars and institutions and are they of a high-quality in your opinion?
• what does the peer-review process entail and is there information on it?
• is the publication listed in the DOAJ directory?
• is the publisher a member of OASPA?
• are the terms of the publishing agreement and copyright reasonable?
