Point-of-Care Healthcare Databases Are an Overall Asset to Clinicians, but Different Databases May Vary in Usefulness Based on Personal Preferences

Carol D. Howe

Abstract


Objective – To evaluate the usefulness of three point-of-care healthcare databases (BMJ Point-of-Care, Clin-eguide, and Nursing Reference Centre) in clinical practice.

Design – A descriptive study analyzing questionnaire results.

Setting – Hospitals in the two largest health regions of Alberta, Canada (at the time of this study), with a small number of responses from a third health region.

Subjects – A total of 46 Alberta hospital personnel answered the questionnaire, including 19 clinicians, 7 administrators, 6 nurses, 1 librarian, 1 preceptor, and “some” project coordinators. Subjects were chosen using a non-probability sampling method.

Methods – The researchers developed an online questionnaire consisting of 17 questions and posted it on the University of Calgary’s Health Sciences Library and the Health Knowledge Network websites. The questions, in general, asked respondents how easy the databases were to search and use, whether the database content answered their clinical questions, and whether they would recommend the databases for future purchase. Most questions required a response for each of the three databases. The researchers collected quantitative data by using a Likert scale from 1 to 5, with 5 being the most positive answer and 1 being the most negative. They collected qualitative data by asking open-ended questions.

Main Results – With regard to ease of searching, BMJ Point-of-Care (BMJ) received the greatest proportion of responses (71%) at level 5. Nursing Reference Centre (NRC) received a smaller proportion (56%) at level 5. Clin-eguide received 59% of responses at level 5, but it also received the greatest proportion of responses at the next highest level (level 4). Respondents rated all three databases similarly at levels 1 and 2.

Regarding how easy the resources were to learn, most respondents rated all three databases as easy to learn (BMJ, 77%; Clin-eguide, 72%; and NRC, 68%). Very few respondents thought any of the databases were difficult to learn.

The researchers gleaned from open-ended questions that the respondents generally thought all three databases were faster and easier to use than the conventional databases they had used. Respondents did not always agree with one another, however, about which features they liked or why.

With regard to content, most respondents agreed that the information in all three databases was relevant to their needs (94.6% for Clin-eguide and 87.9% for BMJ and NRC). Respondents also generally agreed that all three databases answered their questions to a high degree. Clin-eguide had the highest percentage of answers at levels 4 and 5 and the lowest percentage of answers at level 2. NRC was the reverse, with the lowest percentage of answers at levels 4 and 5 and the highest percentage of answers at level 2. Still, the researchers felt that all three databases answered respondents’ questions to a similar degree. In the open-ended questions, respondents voiced additional likes and dislikes about content, but again, answers among respondents were not consistent with one another.

Respondents were asked how often they would use the resource if it were available through their library. The majority of BMJ users reported that they would use it extensively or moderately. About 36% and 39% of NRC users reported they would use it extensively or moderately, respectively, while 43.5% and 34.8% of Clin-eguide users reported they would use it extensively or moderately, respectively. When asked if they would recommend the resource for the library, 84.8% would recommend Clin-eguide, 75% would recommend BMJ, and 67.6% would recommend NRC. The open-ended questions generally indicated that respondents would recommend all three databases.

Regarding how respondents preferred training on these resources, users preferred online tutorials to learn Clin-eguide and NRC. Users preferred website tips and instruction to learn BMJ. The least preferred methods of training for all three databases were live demonstration and classroom training.

Conclusion – None of the databases particularly stood out with regard to usability and content. The respondents generally liked all three databases.

It is important to note, however, that detailed comparisons among the databases were difficult to make. First, respondents did not always give an answer for all three databases for a given question. Because of this, and to present a more meaningful analysis, the researchers often reported the number of respondents who answered a certain way as a percentage rather than a number. Second, although the respondents generally liked all three databases, opinions about likes and dislikes were not consistent among respondents. For example, one respondent thought the NRC and Clin-eguide interfaces were more difficult to navigate than BMJ, while another respondent thought BMJ had the harder-to-navigate interface. The researchers felt that respondents’ prior experience with the databases may have influenced their preferences. They were unable to determine if the respondents’ professional interests had any influence on their preferences. Inconsistent responses made it difficult for researchers to assign an overall value to a given database. Therefore, this survey did not help to make definitive purchasing decisions. The researchers felt they would have to look at each resource much more carefully to make such a decision.

The researchers noted several ideas for future research of this sort. They acknowledged that the sample size was too small to establish statistical significance and thought that better marketing of the questionnaire might have increased the response rate. They also thought it would be interesting to observe respondents using the databases in real time to learn what information they require in their daily work, how long it takes them to find it, and what they do with it once they find it.




Evidence Based Library and Information Practice (EBLIP)