David Guest wrote:

> One for Geoff.

If you mean Geoff Sayer, he works for Tom Bowden at Healthlink now, doesn't he, not for HCN?
> The current blurb from HCN quotes a BMJ article
> http://www.bmj.com/cgi/rapidpdf/bmj.39003.640567.AEv1, "Googling for a
> diagnosis—use of Google as a diagnostic aid: internet based study".
>
> The study looked at 26 cases from the NEJM Case Records articles in
> 2005. The researchers identified 3 to 5 key terms for each article and
> ran them through Google, which found the correct diagnosis in 15 of the
> 26 selected cases or about 60%. HCN concludes that using the internet to
> diagnose rare conditions or presentations is wrong 40% of the time.

That is an incorrect conclusion based on the evidence presented in the paper. A correct assertion based on that particular study's results would be that Google fails to suggest the correct diagnosis in 42% (95% confidence interval 23% to 62%) of cases drawn from a small subset of rare diseases.

The researchers explicitly noted that Google does not suggest diagnoses, and thus it cannot "get it wrong" - it only provides prompts and suggested possibilities. The end user (in their case, the researchers themselves, blinded to the true diagnosis) has to use their clinical judgement to pick the best candidates from the first few pages of hits returned by Google, then follow those up using more conventional resources, and perhaps arrange the appropriate investigations. Their study design also forced them to pick something; a stronger design would have allowed them to declare, for some cases, that none of the Google results really fit the case.

> It goes on to suggest that doctors would be well served by using their
> MD Reference tool which searches Harrison's, Murtagh and CURRENT
> Diagnosis and Treatment Series and other more reliable sources.

Well, if one is flogging evidence-based medicine products, I think that one's advertising and promotional material needs to be evidence-based...
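(Aside: as a quick sanity check of those numbers - this is my own sketch, not the paper's stated method, which presumably used an exact binomial interval - a Wilson score interval on the 11 misses out of 26 cases reproduces the reported range to within a few percentage points:

```python
from math import sqrt

def wilson_interval(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    centre = p + z * z / (2 * n)
    margin = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    denom = 1 + z * z / n
    return (centre - margin) / denom, (centre + margin) / denom

# Google "missed" the diagnosis in 11 of 26 cases, i.e. 42%.
lo, hi = wilson_interval(11, 26)
print(f"failure rate CI: {lo:.0%} to {hi:.0%}")  # -> failure rate CI: 26% to 61%
```

which is close to the 23% to 62% quoted, as expected given the approximation and the small sample.)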
In this case, since the details of the cases and the exact search terms used by the investigators are all provided (see http://www.bmj.com/cgi/data/bmj.39003.640567.AE/DC1/1), it is but an hour or two's work to plug those search terms into HCN's MD Reference Tool product and see what the results are, using the methods described in the BMJ paper. Perhaps someone with access to the HCN MD Reference Tool might like to do this little study for them and publish the results here, and/or submit them as a letter to the Editor of the MJA? (I'm happy to help, but don't have access to the HCN products in question.)

> I am not convinced and am reminded by all this of the Fed's Health
> InSite, http://www.healthinsite.gov.au/. Superficially this looks like a
> worthwhile endeavour but I cannot but help get the feeling that they
> should close it down, put a redirect on the home page and use a tenth
> the money to get medical experts to make an entry in the wikipedia,
> http://www.healthinsite.gov.au/.

Yes, I distinctly recall that Health Insite received $6 million funding in a Federal Budget some years ago. $6m would have paid for quite a lot of time for relevant local experts to collaborate on a set of medical wikipedia pages with edit rights restricted to trusted contributors - something like http://www.ganfyd.org, but with the question of who can and can't contribute thought through a bit better, and with money to help kick the process along. Not much money is needed: offering a grand or so to get experts to spend a few hours over a weekend or two writing a page on a relevant topic goes a long way. Even allowing for editorial overheads (eg employing one or two professional copy editors to tidy up what the health professionals write), that $6m of Healthinsite funding might have resulted in the creation and maintenance of several thousand medical wikipedia pages by now, which themselves would include all the links that Health Insite contains.
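To make the back-of-envelope arithmetic behind that "several thousand pages" claim explicit (every figure here is an illustrative assumption drawn from the paragraph above, not real budget data):

```python
# Back-of-envelope sketch of the wiki-funding argument.
# All figures are illustrative assumptions, not actual budget numbers.
budget = 6_000_000        # Health Insite funding (AUD)
fee_per_page = 1_000      # "a grand or so" per expert-written page
editorial_overhead = 0.5  # assume half goes to copy editors, admin, hosting

writing_budget = budget * (1 - editorial_overhead)
pages = writing_budget // fee_per_page
print(f"roughly {pages:,.0f} expert-written pages")  # -> roughly 3,000 expert-written pages
```

Even with a generous 50% overhead assumption, the same money plausibly funds thousands of expert-written pages.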
Tim C

_______________________________________________
Gpcg_talk mailing list
[email protected]
http://ozdocit.org/cgi-bin/mailman/listinfo/gpcg_talk
