
Scholarly Communications & Publishing

What is Research Impact?

Research Impact can be defined differently depending on the field, institution, or organization, but at its core it boils down to the influence and benefits that are derived from the work. 

The definition can expand to include who is benefiting from or being influenced by the research: academia, society, the economy, healthcare, governmental policy, etc. 

Research Impact is not always as straightforward as a citation count; the scope of the impact is often iterative and nonlinear. 

Impact can be measured for:

  • Authors
  • Articles
  • Journals 

What is a good impact factor? This is highly dependent on the specific field of the journal, and you should not compare impact factors across disciplines. A "good" impact factor in one field might be higher on average than in another. For example, an impact factor of 5 might be considered excellent in engineering but poor in medicine. 

Why does research impact matter?

Research Impact can affect:

  • Funding - The higher the impact of the research, the more likely you are to receive funding. 
  • Value Creation - Shows how the research has created value outside of academia and how it has contributed to improving society. 
  • Advancement - The research can show how it advances understanding, academic theory, or research methods across disciplines. 
  • Innovation - Going from research to real world applications, creating new processes, products, or services.

Measuring Research Impact

There are several different methods of measuring your research impact. They include:

  • Bibliometrics - Traditional measurements like citation counts
  • Altmetrics - Tracks online engagement in more nontraditional formats
  • Stakeholder Feedback - Feedback or analysis from groups or organizations that are directly impacted by the research
  • Case studies - Studies that examine the impact of the original work by looking at how the research has influenced changes in society, helped develop new technology, and/or improved services.

Author Impact

Author impact is usually calculated from citation metrics. Citation metrics look at how many times an article has been cited by others in their work, and they can be used to gauge an author's productivity and influence. The most common author metric is the h-index. 

Think of the h-index as the yardstick we can use to compare individual authors. 

An h-index is calculated by counting the number of publications for which an author has been cited by other authors at least that same number of times. For example, if an author has an h-index of 15, that means 15 of the author's articles have been cited at least 15 times each. If the author writes a 16th article and it is only cited twice, the h-index remains at 15; it rises to 16 only once 16 articles have each been cited at least 16 times.
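
As a rough illustration (a minimal sketch, not tied to Google Scholar or any other database), an h-index could be computed from a list of per-article citation counts like this in Python:

    def h_index(citation_counts):
        # Sort citation counts from highest to lowest.
        sorted_counts = sorted(citation_counts, reverse=True)
        h = 0
        for rank, citations in enumerate(sorted_counts, start=1):
            # h grows only while the article at this rank has at least
            # as many citations as its rank.
            if citations >= rank:
                h = rank
            else:
                break
        return h

    # Example from the text: 15 articles cited 15+ times each, plus a 16th cited twice.
    print(h_index([15] * 15 + [2]))  # prints 15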

Sites such as Google Scholar, Web of Science, and Scopus can help provide this information. Please note that MSOE does not have institutional access to Web of Science or Scopus.

There are several ways you can increase your impact:

  • Consider creating an ORCiD - ORCiD is an open, non-profit, international registry of unique and persistent identifiers for individual researchers. 
  • Share your research more widely via nontraditional means
  • Build your reputation, brand, and network
  • Increase accessibility to your work 

Article Impact 

Article impact can help determine whether an article is important to a particular field of study. It can also give you an indication of the popularity or contested nature of the topic itself. While citation counts can be used in evaluating an article, they should not be relied upon as the sole indicator of quality; they tell only one piece of the puzzle. It is also important to note that some disciplines have a lower number of field-specific journals and/or lower usage, so their counts should not be compared with those of other disciplines. 

Article metrics should be derived from traditional sources as well as from altmetrics (sources like social media). 

Journal Impact

Journal impact measures how often a journal is cited. While it cannot tell you whether a journal is good, it can be a marker of visibility or importance within a field. There is no guarantee that your article will be read, shared, or cited, but publishing in a highly cited journal can lend you a level of prestige and give your resume/CV that extra edge. However, please note that journal impact is not the only factor you should consider when you publish; journal rankings, reputation, best fit, etc. should also inform where you choose to publish. 

The most commonly known journal metric is the Journal Impact Factor (JIF), which is sourced from the Journal Citation Reports (JCR) in Web of Science (a database). 
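
As a rough sketch of how the JIF is calculated (the standard two-year formula; the numbers below are hypothetical, not taken from any real journal): a journal's impact factor for a given year is the number of citations received that year to items it published in the previous two years, divided by the number of citable items it published in those two years.

    # Hypothetical example of the two-year Journal Impact Factor calculation.
    citations_to_prior_two_years = 250   # citations received this year to items from the prior two years
    citable_items_prior_two_years = 100  # articles and reviews published in those two years
    jif = citations_to_prior_two_years / citable_items_prior_two_years
    print(jif)  # 2.5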

Finding the Right Journal for You

Looking for journals to publish in can be quite a daunting task. There are thousands of journals out there, including predatory journals looking to scam researchers and the public. How does one choose? The following resources can assist you in selecting a journal:

  • SciRev - Provides information on journal response times, review durations, review rounds, revision allowances, and more, based on researchers' experiences. 
  • Think. Check. Submit. - Provides a range of tools and resources to help you identify trusted publishers. 
  • Ulrichsweb - An easy-to-search source of detailed information on more than 300,000 periodicals of all types: academic and scholarly journals, e-journals, peer-reviewed titles, popular magazines, newspapers, newsletters, and more.
  • Journal/Author Name Estimator (JANE) - An open access, web-based tool that searches via keywords to provide you with a list of relevant journals. 

Predatory Journals 

What is a predatory journal?

A predatory journal is one that takes advantage of researchers by employing deceptive or unscrupulous practices. In 2019, a group of stakeholders from 10 different countries came together to create an official definition of predatory publishing:

"Predatory journals and publishers are entities that prioritize self-interest at the expense of scholarship and are characterized by false or misleading information, deviation from best editorial and publication practices, a lack of transparency, and/or the use of aggressive and indiscriminate solicitation practices."

Grudniewicz, A., Moher, D., Cobey, K. D., Bryson, G. L., Cukier, S., Allen, K., ... & Ciro, J. B. (2019). Predatory journals: no definition, no defence. Nature, 576(7786), 210-212.

Common tactics to look for:

  • A webpage filled with partially or fully fabricated journals, articles, editors, or impact factors.
  • Advertising an expedited peer review process.
  • Exorbitant fees - these may or may not be disclosed. 

How is predatory publishing harmful?

  • Bad Reputation - Publishing in a predatory journal can leave a stain on your scholarly reputation and the reputation of your institution, and it can even affect your career advancement. 
  • No Peer-Review Process - A predatory journal may boast of a peer-review process that is suspiciously fast, when in truth it has little to no peer review. Good peer review takes time, and predatory journals don't care to put in the effort. 
  • Lost or Hard to Find Work - Reputable publishers are committed to preserving the work they've published; predatory journals are not. Any work you publish with a predatory journal could vanish at any point, which makes it difficult to verify your publication record. Predatory journals will also claim that they're indexed in or accessible through major databases when this is not at all true.

How to Spot a Predatory Journal 

Key things to look for:

  • On the Journal's website
    • Broad or vague journal scope
    • No "About us" page
    • Spelling and grammatical errors
    • No contact information, or an odd address for a business location
    • Incorrect metrics/ indexing claims
  • Look at the articles they've published
    • Numerous articles published by the same author or group of authors
    • Poorly written article titles and abstracts
    • Poorly researched articles 
    • Few/ No published articles available for you to view 
    • Bad reputation among other authors or journal review sites. 
  • Look at their editorial board
    • Hard to find or nonexistent editorial board on the website 
    • If editors are listed - are they recognized as experts in their field?
    • Look up the editors' qualifications or credentials - are they legitimate?
  • Unsolicited Emails
    • You may get solicited out of the blue to submit for publication or review
    • The email may be poorly written, awkward, or unprofessional, and contain spelling or grammatical errors. 
    • The email may be filled with flattering, pushy language. 
  • Unclear Processes & Fees
    • They may boast of a faster-than-average publication process
    • Their peer-review process may be obscured or only vaguely described
    • Fee structure is unclear & hard to find. 
    • Their "Instructions for Authors" page is unclear, vague, or poorly written
    • Their fees differ noticeably from those of reputable journals 

If you have suggestions for how to make this page better, please contact Elizabeth Jerow, Library Director (jerow@msoe.edu).