Learning From Others About Research Evidence (editorial)

Author(s)
Alison Brettle
Keywords
research evidence
LIS research
evidence
research designs
research methodologies
Bibliography. Library science. Information resources
Z
DOAJ:Library and Information Science
DOAJ:Social Sciences

URI
http://hdl.handle.net/20.500.12424/1530789
Online Access
https://doaj.org/article/4327cc03750548ac925d2f17bfe2da80
Abstract
Welcome to the June issue of EBLIP, our first to be published with an HTML version as well as PDFs for each article. I hope you enjoy and find the alternative formats useful. As usual, the issue comprises an interesting range of evidence summaries and articles that I hope you will find useful in applying evidence to your practice.

When considering evidence, two recent trips to Edinburgh got me thinking about the wide range of study designs or methods that are useful for generating evidence, and also how we can learn about their use from other professions.

The first trip was as part of the cadre of the LIS DREaM project (http://lisresearch.org/dreamproject/). DREaM has been set up by the LIS Research Coalition to develop a sustainable LIS research network in the UK. As part of this, a series of workshops aims to introduce LIS practitioners to a wider range of research methods, thus expanding the methods used in LIS research. Indeed, a quick scan of the contents of this issue shows a preponderance of surveys, interviews, and citation analysis, suggesting that broadening our knowledge of methods may well be a useful idea. The workshops are highly interactive and, at each session, experts from outside the LIS discipline introduce particular research methods and outline how they could be used in LIS applications. As a result, I can see the value of, and understand when to use, research methods such as social network analysis, horizon scanning, ethnography, discourse analysis, and repertory grids, as well as knowing that data mining is something I'm likely to avoid! So far I've shared my new knowledge with a PhD student who was considering her methodology, and incorporated my new knowledge of horizon scanning into a bid for research funding. The next (and more exciting) step is to think of a situation where I can apply one of these methods to examining an aspect of LIS practice.

The second trip was the British Association of Counselling and Psychotherapy Research Conference, an event which I've attended for the last few years (don't ask!). Each time, I've been struck by both the similarities and differences between counselling and LIS research in the UK. Counselling research is conducted by a relatively small number of individuals and, as in LIS, the vast majority of practitioners don't engage in writing any research up for publication (Clapton, 2010). Particular types of research dominate in counselling, but most are highly qualitative in manner, e.g. using biographical approaches. I can't immediately see how these could become widely used in LIS, but I do find it fascinating to hear about different approaches and the evidence they provide. Like many of the things that LIS professionals do, counselling and psychotherapy is a complex intervention, and it is not always immediately apparent what has caused an effect. It may well be that the counsellor is only one of a number of elements that has led to a positive outcome or a change in effect. This makes it difficult to generate evidence about the effectiveness of counselling, similar, for example, to trying to generate evidence regarding the effectiveness of information literacy.

Due to political drivers, there is increasing interest in (and resistance to) a more evidence based approach in counselling and psychotherapy. One of the main objections to evidence based practice (EBP) in counselling is that the medical model or paradigm of EBP, and the view that the randomized controlled trial (RCT) is the method of choice for providing high quality evidence on the effectiveness of services, doesn't fit with the way counsellors provide services to their clients. Each client is seen as an individual, and therapy is provided according to a client's particular needs at that time rather than following a set manual or course. This makes it impossible to assess in a "randomized controlled" manner, before even beginning to worry about the ethical and practical implications of conducting an experimental study.

The unsuitability of the RCT has also been raised regarding generating evidence for EBLIP (e.g. Banks, 2008); however, "best evidence" doesn't need to be an RCT. The definition of EBLIP provided by Booth (2006) mentions best quality evidence (generated from research, among other elements) but makes no mention of particular research designs. In addition, both Eldredge (2004) and Crumley and Koufogiannakis (2002) have argued for the consideration of a wide range of study designs as evidence within EBLIP, a viewpoint with which I have long agreed. After all, it is much more important to choose a design that is suitable to answer the question at hand and provide good quality evidence than to use a "good quality" design at the expense of finding relevant evidence. Bearing that in mind, I'm racking my brains to think of how I can use webometrics and techniques from history to investigate my practice. At the same time, I urge you to think widely about research evidence, try exploring some different methodologies, and see what evidence they can reveal.
Date
2012-06-01
Type
Article
Identifier
oai:doaj.org/article:4327cc03750548ac925d2f17bfe2da80
1715-720X
https://doaj.org/article/4327cc03750548ac925d2f17bfe2da80
Collections
OAI Harvested Content

Related items

Showing items related by title, author, creator and subject.

  • Multi-sectoral data linkage for intervention and policy evaluation
    Lyons, Ronan (2016-07-05)
  • Data Quality in Cross-National Surveys: The Quality Indicators Response Rate, Nonresponse Bias and Fieldwork Efforts
    Halbherr, Verena (2016-07-05)
  • Trialling a new Survey Project Management Portal on the European Values Study 2017
    Brislinger, Evelyn; Kurti, Dafina; Davari, Masoud; Klas, Claus-Peter (2018-07-03)