Welcome to the June issue of EBLIP, our first to be published with an HTML version as well as PDFs for each article. I hope you enjoy and find the alternative formats useful. As usual, the issue comprises an interesting range of evidence summaries and articles that I hope you will find useful in applying evidence to your practice.

When considering evidence, two recent trips to Edinburgh got me thinking about the wide range of study designs or methods that are useful for generating evidence, and also how we can learn about their use from other professions.

The first trip was as part of the cadre of the LIS DREaM project (http://lisresearch.org/dreamproject/). DREaM has been set up by the LIS Research Coalition to develop a sustainable LIS research network in the UK. As part of this, a series of workshops aims to introduce LIS practitioners to a wider range of research methods, thus expanding the methods used in LIS research. Indeed, a quick scan of the contents of this issue shows a preponderance of surveys, interviews, and citation analysis, suggesting that broadening our knowledge of methods may well be a useful idea. The workshops are highly interactive and, at each session, experts from outside the LIS discipline introduce particular research methods and outline how they could be used in LIS applications. As a result, I can see the value of, and understand when to use, research methods such as social network analysis, horizon scanning, ethnography, discourse analysis, and repertory grids – as well as knowing that data mining is something I'm likely to avoid! So far I've shared my new knowledge with a PhD student who was considering her methodology, and incorporated my new knowledge of horizon scanning into a bid for research funding.
The next (and more exciting) step is to think of a situation where I can apply one of these methods to examining an aspect of LIS practice.

The second trip was to the British Association for Counselling and Psychotherapy Research Conference, an event which I've attended for the last few years (don't ask!). Each time, I've been struck by both the similarities and differences between counselling and LIS research in the UK. Counselling research is conducted by a relatively small number of individuals and, as in LIS, the vast majority of practitioners don't engage in writing any research up for publication (Clapton, 2010). Particular types of research dominate in counselling, but most are highly qualitative in manner, e.g. using biographical approaches. I can't immediately see how these could become widely used in LIS, but I do find it fascinating to hear about different approaches and the evidence they provide. Like many of the things that LIS professionals do, counselling and psychotherapy is a complex intervention, and it is not always immediately apparent what has caused an effect. It may well be that the counsellor is only one of a number of elements that has led to a positive outcome or a change in effect. This makes it difficult to generate evidence about the effectiveness of counselling, similar, for example, to trying to generate evidence regarding the effectiveness of information literacy.

Due to political drivers, there is an increasing interest in (and resistance to) a more evidence based approach in counselling and psychotherapy. One of the main areas of resistance towards evidence based practice (EBP) in counselling is that the medical model or paradigm of EBP, with its view that the randomized controlled trial (RCT) is the method of choice for providing high quality evidence on the effectiveness of services, doesn't fit with the way counsellors provide services to their clients.
Each client is seen as an individual, and therapy is provided according to a client's particular needs at that time rather than following a set manual or course. This makes it impossible to assess in a "randomized controlled" manner, before even beginning to worry about the ethical and practical implications of conducting an experimental study.

The unsuitability of the RCT has also been raised regarding generating evidence for EBLIP (e.g. Banks, 2008); however, "best evidence" doesn't need to be an RCT. The definition of EBLIP provided by Booth (2006) mentions best quality evidence (generated from research, among other elements), but makes no mention of particular research designs. In addition, both Eldredge (2004) and Crumley and Koufogiannakis (2002) have argued for the consideration of a wide range of study designs as evidence within EBLIP, a viewpoint with which I have long agreed. After all, it is much more important to choose a design that is suitable to answer the question at hand and provides good quality evidence than to use a "good quality" design at the expense of finding relevant evidence. Bearing that in mind, I'm racking my brains to think of how I can use webometrics and techniques from history to investigate my practice. At the same time, I urge you to think widely about research evidence, try exploring some different methodologies, and see what evidence they can reveal.