Faculty Decisions on Serials Subscriptions Differ Significantly from Decisions Predicted by a Bibliometric Tool.

Author(s)
Sue F. Phelps
Keywords
evidence summary
serials
bibliometry
Bibliography. Library science. Information resources
Z

URI
http://hdl.handle.net/20.500.12424/3144511
Online Access
https://doaj.org/article/832f8c4d06954f8fb99592f2b5043709
Abstract

 Objective – To compare faculty choices of serials subscription cancellations to the scores of a bibliometric tool.
 
Design – Natural experiment. Data were collected on faculty valuations of serials, and the California Digital Library Weighted Value Algorithm (CDL-WVA) was used to measure the value of the same journals to a particular library. These two sets of valuations were then compared.
 
 Setting – A public research university in the United States of America.
 
 Subjects – Teaching and research faculty, as well as serials data.
 
Methods – Experimental methodology was used to compare faculty valuations of serials (based on their journal cancellation choices) to bibliometric valuations of the same journal titles (determined by CDL-WVA scores), and to identify the match rate between the faculty choices and the bibliometric data. Faculty were asked to select titles to cancel that totaled approximately 30% of the budget for their disciplinary fund code; this “keep” or “cancel” choice was the binary variable for the study. Usage data were gathered for articles downloaded through the link resolver for titles in each disciplinary dataset, and the CDL-WVA score was determined for each journal title based on utility, quality, and cost effectiveness.
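As a rough illustration of the data just described, the Python sketch below shows how per-title records for one disciplinary fund code might be represented. The field names and example values are assumptions made for illustration, and the CDL-WVA score is treated as a precomputed input, since the summary does not give the algorithm's actual formula for combining utility, quality, and cost effectiveness.

    # Hypothetical per-title records for one disciplinary fund code.
    # faculty_keep encodes the binary choice: 1 = "keep", 0 = "cancel".
    # downloads is link-resolver usage; cdl_wva is treated as a precomputed score.
    titles = [
        {"title": "Journal A", "cost": 1200.0, "downloads": 340, "cdl_wva": 0.82, "faculty_keep": 1},
        {"title": "Journal B", "cost": 2500.0, "downloads": 55, "cdl_wva": 0.31, "faculty_keep": 0},
        {"title": "Journal C", "cost": 900.0, "downloads": 210, "cdl_wva": 0.64, "faculty_keep": 1},
    ]

    # Faculty were asked to cancel titles totalling roughly 30% of the fund's budget.
    fund_budget = sum(t["cost"] for t in titles)
    cancellation_target = 0.30 * fund_budget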
 
Titles within each dataset were ranked from highest to lowest CDL-WVA score within each fund code, with subscription cost used to break ties between titles with the same score. The journal titles selected for comparison were those that ranked above the approximately 30% of titles chosen for cancellation, by both the faculty choices and the CDL-WVA scores.
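Continuing the hypothetical records from the previous sketch, the ranking step might look as follows. Two points here are interpretations rather than statements from the summary: ties on CDL-WVA score are assumed to be broken by lower subscription cost, and the roughly 30% cut is assumed to apply to the fund's cost rather than to the number of titles.

    # Rank titles: highest CDL-WVA score first; assumed tie-break by lower cost.
    ranked = sorted(titles, key=lambda t: (-t["cdl_wva"], t["cost"]))

    # Walk up from the bottom of the ranking, marking titles as the algorithm's
    # "cancel" set until roughly 30% of the fund's cost has been accumulated.
    cancelled_cost = 0.0
    for t in reversed(ranked):
        if cancelled_cost < cancellation_target:
            t["cdl_keep"] = 0
            cancelled_cost += t["cost"]
        else:
            t["cdl_keep"] = 1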
 
Researchers estimated the odds ratio between a faculty choice to keep a title and a CDL-WVA score indicating that the title should be kept. The p-value for that result was less than 0.0001, indicating a negligible probability that the result arose by chance. They also applied logistic regression to quantify the association between the numeric CDL-WVA score and the binary variable of the faculty choices; the p-value for this relationship was also less than 0.0001, again indicating that the result was unlikely to be due to chance. A quadratic model plotted alongside the linear model followed a similar pattern, with a p-value of 0.0002 for the comparison, indicating that the quadratic model's fit is also unlikely to be explained by random chance.
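The two statistical steps can be sketched as follows, using small synthetic arrays purely to show the mechanics rather than the study's data: scipy's fisher_exact yields an odds ratio and p-value from the 2x2 table of faculty versus algorithm decisions, and a statsmodels logistic regression relates the binary faculty choice to the numeric CDL-WVA score. Whether the authors used these particular estimators is not stated in the summary.

    import numpy as np
    from scipy.stats import fisher_exact
    import statsmodels.api as sm

    # Synthetic illustration only: 1 = keep, 0 = cancel.
    faculty_keep = np.array([1, 1, 1, 0, 0, 1, 0, 1, 1, 0])
    cdl_keep = np.array([1, 1, 0, 0, 1, 1, 0, 1, 0, 0])
    cdl_score = np.array([0.9, 0.8, 0.7, 0.2, 0.5, 0.85, 0.1, 0.75, 0.4, 0.3])

    # 2x2 table: rows are faculty keep/cancel, columns are algorithm keep/cancel.
    table = [
        [int(np.sum((faculty_keep == 1) & (cdl_keep == 1))), int(np.sum((faculty_keep == 1) & (cdl_keep == 0)))],
        [int(np.sum((faculty_keep == 0) & (cdl_keep == 1))), int(np.sum((faculty_keep == 0) & (cdl_keep == 0)))],
    ]
    odds_ratio, p_value = fisher_exact(table)

    # Logistic regression of the binary faculty choice on the numeric CDL-WVA score.
    logit = sm.Logit(faculty_keep, sm.add_constant(cdl_score)).fit(disp=0)
    print(odds_ratio, p_value, logit.pvalues)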
 
Main Results – The authors point to three notable findings. First, the match rate between faculty valuations and bibliometric scores for serials was 65%. This exceeds the 50% rate that would indicate random association, but it also indicates a statistically significant difference between faculty and bibliometric valuations. Second, the match rate with the bibliometric scores for titles that faculty chose to keep (73%) was higher than for titles they chose to cancel (54%). Third, the match rate increased with higher bibliometric scores.
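The three match rates above follow from the same keep/cancel comparison. Continuing the synthetic arrays from the previous sketch, they could be computed as:

    # Overall match rate: share of titles where faculty and algorithm agree.
    overall_match = float(np.mean(faculty_keep == cdl_keep))

    # Match rate among titles faculty chose to keep, and among titles they cancelled.
    keep_match = float(np.mean(cdl_keep[faculty_keep == 1] == 1))
    cancel_match = float(np.mean(cdl_keep[faculty_keep == 0] == 0))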
 
Conclusions – Though the authors identify only a modest degree of similarity between faculty and bibliometric valuations of serials, they note that there is more agreement for higher valued serials than for lower valued ones. With that in mind, librarians might focus future faculty review on the lower scoring titles, taking into consideration that unique faculty interests may drive selection at that level and would need to be balanced against the mission of the library.
Date
2016-03-01
Type
Article
Identifier
oai:doaj.org/article:832f8c4d06954f8fb99592f2b5043709
1715-720X
https://doaj.org/article/832f8c4d06954f8fb99592f2b5043709
Collections
OAI Harvested Content
