
Scholarly Communication

OU Libraries guides scholars in matters relating to scholarly communication, which we define as the formal and informal ways research and scholarly works are created, evaluated, disseminated, preserved, used, and transformed.

Funding for publications

The Senate University Research Committee (URC) has a fund to support publication costs related to publishing books, reprints, and page charges.

The URC Books, Reprints, and Page Charge Reimbursements - Award Amount: $1,000 max
Information about the fund is available on the campus intranet, and requests should be sent to the current committee chairs.

Lakshmi Raman (CAS) CO-CHAIR 2025-2026  raman@oakland.edu
Olga Ehrlich (SON) CO-CHAIR 2025-2026 oehrlich@oakland.edu

*For publication costs related to open-access publishing, see the Open Access Fund tab.

Selecting Where to Publish

Selecting Journals

The following tools can help you find journals in your discipline or journals related to your manuscript.

  • DOAJ (Directory of Open Access Journals) - the largest index of open access journals. Includes a filter to find journals that do not charge publishing fees.
  • Eigenfactor.org - uses the whole citation network to factor in discipline differences
  • Journal Citation Reports database - lists journals by category so you can examine their impact factor within that field.
  • SCImago Journal and Country Rank - lists journals by category and you can examine their rank within that field. Also allows you to view all Open Access journals by subject area.
  • Ulrich's Periodicals Directory - provides information on a journal, its publisher, its open access status, and the databases that index it.
  • Scopus – compares journals using SCImago
  • Google Scholar Metrics - uses its own h5-index (an h-index computed over the last five years) to compare journals; see the sketch after this list.
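
To make the h-index idea concrete (Google Scholar's h5-index is simply an h-index computed over a journal's articles from the last five complete years), here is a minimal Python sketch; the citation counts are invented example data, not real journal figures.

  def h_index(citations):
      """Return the h-index: the largest h such that h articles
      each have at least h citations."""
      counts = sorted(citations, reverse=True)
      h = 0
      for rank, cites in enumerate(counts, start=1):
          if cites >= rank:
              h = rank
          else:
              break
      return h

  # Invented citation counts for one journal's recent articles
  print(h_index([25, 19, 12, 9, 7, 4, 2, 1, 0]))  # prints 5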

Manuscript matching tools:

  • Jane (Journal/Author Name Estimator) - compares your document to millions of documents in MEDLINE to find the best matching journals, authors, or articles. You still need to evaluate the suggested journals yourself.
  • SPI-Hub journal finder tool - the Scholarly Publishing Information Hub from the Center for Knowledge Management at Vanderbilt University Medical Center.
  • Web of Science - Match Manuscript tool (you need to create an account or log in to Web of Science).
  • Jot - a free, open-source web application that matches manuscripts in the fields of biomedicine and the life sciences with suitable journals (Yale.edu).

Publisher tools:

Book publishers

Scams and Fake Journals
Unfortunately, academics who must publish to advance in their careers are increasingly targeted by internet scams: fake journals dupe researchers into paying a publication fee and then either never publish the work or publish it without meeting academic standards for quality. Publishing in bogus journals (sometimes referred to as “predatory journals”) can negatively impact your career.

Journals that appear to publish on a wide variety of unrelated topics are generally a red flag. Scammers have become adept and polished in their methods, so it is best to spend some time up front reviewing the publications you are considering before sending off your manuscript.

Comparing Journals

When considering publishing in a journal or reviewing a colleague's scholarship record, you should conduct a comprehensive review of each journal. Many factors influence the current scholarly publishing system, and you cannot simply rely on the "reputation" of a known publisher or journal.

Step 1: Always read other research published in the journal. Does it match the Aim & Scope? Is it sound research?

Step 2: Consider the following indicators for each journal:

  • URL for journal homepage
  • How long has the journal been published?
  • Who is the publisher?
  • What is the publisher's business model? For-profit, non-profit?
  • Is the publisher a member of COPE (Committee on Publication Ethics) or publicly committed to ethics in publishing as outlined by COPE?
  • Where is the journal indexed? If fully OA, is the journal indexed in DOAJ? (See the sketch after this list.)
  • Is the scope of the journal clearly defined?
  • How is peer review done?
  • Who's on the editorial board?
  • What are the journal's citation-related metrics and its rank within the discipline?
  • How frequently are articles published in the journal subject to retraction?
  • What is the acceptance rate of the journal?
  • What is the required copyright/licensing agreement? 
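
For the DOAJ indexing question in the checklist above, one practical approach is to query DOAJ's public search API. The Python sketch below (using the requests library) is only a rough illustration: the endpoint path and the "total" field in the response are based on DOAJ's published API documentation and may change, so check the current docs before relying on it.

  import requests

  def in_doaj(issn):
      """Return True if a journal with this ISSN is found in DOAJ.
      Assumes the public search endpoint /api/search/journals/ is available."""
      url = f"https://doaj.org/api/search/journals/issn:{issn}"
      resp = requests.get(url, timeout=10)
      resp.raise_for_status()
      data = resp.json()
      # The search response is expected to include a 'total' count of matches
      return data.get("total", 0) > 0

  # Example: PLOS ONE's electronic ISSN
  print(in_doaj("1932-6203"))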

*For more on journal publishing ethics, view the Code of Conduct and Best Practice for Journal Editors.
*For more on evaluating journals, view the Principles of Transparency guide from the OASPA. 

What about the impact factor of a journal?

There are many different methods for comparing journals based on various metrics. Informed and careful use of these data is essential. Journals from different disciplines cannot be easily compared.

  • Journal impact factor attempts to measure a journal's "importance" by calculating the number of times its articles are cited (the standard two-year formula is sketched after this list).
  • Eigenfactor is a rating of the total "importance" of a scientific journal based on the number of incoming citations, with citations from highly ranked journals weighted to make a larger contribution than those from poorly ranked journals.
  • SCImago Journal Rank (SJR) is similar to Eigenfactor but is based on Scopus data.
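
As context for the first bullet, the widely used two-year impact factor for a given year is the number of citations received that year to the journal's items from the previous two years, divided by the number of citable items (articles and reviews) published in those two years. A minimal Python sketch of the arithmetic, using invented numbers:

  # Hypothetical counts for one journal, used only to illustrate the arithmetic
  citations_in_2024_to_2022_2023_items = 600
  citable_items_published_2022_2023 = 250

  jif_2024 = citations_in_2024_to_2022_2023_items / citable_items_published_2022_2023
  print(jif_2024)  # 2.4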

*For more in-depth information, see Assessing Journal Quality: Impact Factors (Boston College Libraries).

*For more on research impact and metrics, see the Research Impact tab.

 

The Problem with Journal Impact Factors

The journal impact factor was originally created as a tool to help librarians identify journals to purchase, not as a measure of the scientific quality of a journal or a particular article. With that in mind, it is critical to understand that the Journal Impact Factor has a number of well-documented deficiencies as a tool for research assessment. 

Drawbacks of traditional metrics

  • Using the journal impact factor (JIF) to determine journal quality is flawed. The JIF is an average citation count for a journal, and a single highly cited article can skew that average for the entire journal (see the sketch after this list).
  • JIFs don’t weigh whether citations are positive or negative, and large citation counts don’t reflect the quality of an individual article or of the journal overall.
  • JIFs are susceptible to manipulation by journal editors, and the data used to calculate them are neither transparent nor openly available to the public.
  • JIFs vary widely between disciplines and can’t accurately represent long-term impact.
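
To illustrate the skew described in the first bullet, the short Python sketch below compares the mean (the JIF-style average) with the median for an invented set of citation counts in which one article is cited far more than the rest.

  from statistics import mean, median

  # Invented citation counts for ten articles in one journal:
  # nine modestly cited papers plus one highly cited outlier
  citations = [2, 3, 1, 4, 2, 0, 3, 2, 1, 180]

  print(mean(citations))    # 19.8 -- the average is pulled up by a single paper
  print(median(citations))  # 2.0  -- the typical article is cited only a couple of times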

 

CALL FOR CHANGE

Declaration on Research Assessment (DORA)

The San Francisco Declaration on Research Assessment (DORA) grew out of a 2012 gathering of the American Society for Cell Biology (ASCB) and a group of editors and publishers of scholarly journals who recognized the need to improve the ways in which the outputs of scientific research are evaluated. The declaration, released in 2013, calls on the world scientific community to eliminate the role of the journal impact factor in evaluating research for funding, hiring, promotion, or institutional effectiveness. To date, 26,573 individuals and organizations in 168 countries have signed DORA.

(From the San Francisco Declaration on Research Assessment: Putting Science into the Assessment of Research)

Read more about the ongoing work of DORA.