Evidence Based Practice

Protecting resources, promoting value: a doctor’s guide to cutting waste in clinical care – Academy of Medical Royal Colleges – November 2014

Posted on November 21, 2014. Filed under: Evidence Based Practice, Medicine | Tags: |

Protecting resources, promoting value: a doctor’s guide to cutting waste in clinical care – Academy of Medical Royal Colleges – November 2014

Cut NHS waste through NICE’s ‘do not do’ database – NICE – 6 November 2014


Smart governance for health and well-being: the evidence – WHO – 2014

Posted on October 14, 2014. Filed under: Evidence Based Practice, Public Hlth & Hlth Promotion | Tags: |

Smart governance for health and well-being: the evidence – WHO – 2014

ISBN 978 92 890 5066 1

“Governance for health describes the attempts of governments and other actors to steer communities, whole countries or even groups of countries in the pursuit of health as integral to well-being. This study tracks recent innovations to address the priority determinants of health and categorizes them into five strategic approaches to smart governance for health. It relates the emergence of joint action by the health and non-health sectors, public and private actors and citizens, all of which have increasing roles to play in achieving seminal changes in 21st-century societies.

The chapters presented here were initially commissioned as papers to provide the evidence base for a study to support the new European policy framework for health and well-being, Health 2020. Calling for a health-in-all-policies, whole-of-government and whole-of-society approach, Health 2020 uses governance as a lens through which to view all technical areas of health. This book provides access to background papers for the study on governance for health in the 21st century, published by the WHO Regional Office for Europe in 2012. Prepared by eminent experts, the chapters provide further detail on the issues raised, and culminate in a comprehensive depiction of what constitutes smart governance for health in the 21st century.”


A formative evaluation of Collaboration for Leadership in Applied Health Research and Care (CLAHRC): institutional entrepreneurship for service innovation – Health Serv Deliv Res Sept 2014;2(31)

Posted on October 14, 2014. Filed under: Evidence Based Practice, Health Mgmt Policy Planning, Health Systems Improvement | Tags: |

A formative evaluation of Collaboration for Leadership in Applied Health Research and Care (CLAHRC): institutional entrepreneurship for service innovation – Health Serv Deliv Res Sept 2014;2(31) “Background Collaborations for Leadership in Applied Health Research and Care (CLAHRCs) are a time-limited funded initiative to form new service and research collaboratives in the English health system. Their aim is to bring together NHS organisations and universities to accelerate the translation of evidence-based innovation into clinical practice. In doing so, CLAHRCs are positioned to help close the second translation gap (T2), which is described as the problem of introducing and implementing new research and products into clinical practice. Objectives In this study, we draw on ideas from institutional theory and institutional entrepreneurship to examine how actors may engage in reshaping existing institutional practices in order to support, and help sustain efforts to close the T2. Our objective was to understand how the institutional context shapes actors’ attempts to close the T2 by focusing on the CLAHRC initiative. Methods The study employed a longitudinal mixed-methods approach. Qualitative case studies combined interview data (174 in total across all nine CLAHRCs and the four in-depth sites), archival data and field notes from observations, over a 4-year period (2009–13). Staff central to the initiatives were interviewed, including CLAHRC senior managers; theme leads; and other higher education institution and NHS staff involved in CLAHRCs. Quantitative social network analysis (SNA) employed a web-based sociometric approach to capture actors’ own individual (i.e. ego) networks of interaction across two points in time (2011 and 2013) in the four in-depth sites, and their personal characteristics and roles. Results We developed a process-based model of institutional entrepreneurship that encompassed the different types of work undertaken. First, ‘envisaging’ was the work undertaken by actors in developing an ‘embryonic’ vision of change, based on the interplay between themselves and the context in which they were situated. Second, ‘engaging’ was the work through which actors signed up key stakeholders to the CLAHRC. Third, ‘embedding’ was the work through which actors sought to reshape existing institutional practices so that they were more aligned with the ideals of CLAHRC. ‘Reflecting’ involved actors reconsidering their initial decisions, and learning from the process of establishing CLAHRCs. Furthermore, we employed the qualitative data to develop five different archetype models for organising knowledge translation, and considered under what founding conditions they are more or less likely to emerge. The quantitative SNA results suggested that actors’ networks changed over time, but that important institutional influences continued to constrain patterns of interactions of actors across different groups. Conclusion The development of CLAHRCs holds important lessons for policy-makers. Policy-makers need to consider whether or not they set out a defined template for such translational initiatives, since the existence of institutional antecedents and the social position of actors acted to ‘lock in’ many CLAHRCs. Although antecedent conditions and the presence of pre-existing organisational relationships are important for the mobilisation of CLAHRCs, these same conditions may constrain radical change, innovation and the translation of research into practice. 
Future research needs to take account of the effects of institutional context, which helps explain why many initiatives may not fully achieve their desired aims.”


Evidence for Success. The guide to getting evidence and using it – Evaluation Support Scotland, Knowledge Translation Network – August 2014

Posted on October 2, 2014. Filed under: Evidence Based Practice, NGOs / Third Sector |

Evidence for Success. The guide to getting evidence and using it – Evaluation Support Scotland, Knowledge Translation Network – August 2014

“The guide offers easy to follow, step-by-step guidance and resources to support organisations to use evidence to influence policy and practice. It is for anyone who wants to use evidence to improve policy and practice, regardless of the level of experience they have in doing so. Therefore, it is intended that this guide will also be of value to a wide range of stakeholders, including: practitioners, service managers, funders and commissioners, and policy makers and planners.”


Evidence for Success Guide – Evaluation Support Scotland – 20 August 2014

Posted on September 9, 2014. Filed under: Evidence Based Practice |

Evidence for Success Guide – Evaluation Support Scotland – 20 August 2014

“The “Evidence for Success” guide was produced by the Knowledge Translation Network (KTN), which was established in 2012 to facilitate and share learning about effective knowledge translation and dissemination activities, and was published in August 2014.

The guide offers easy to follow, step-by-step guidance and resources to support organisations to use evidence to influence policy and practice. It is for anyone who wants to use evidence to improve policy and practice, regardless of the level of experience they have in doing so. Therefore, it is intended that this guide will also be of value to a wide range of stakeholders, including: practitioners, service managers, funders and commissioners, and policy makers and planners.”


Center for Evidence and Practice Improvement (CEPI) – AHRQ – [AHRQ = US Agency for Healthcare Research and Quality]

Posted on July 23, 2014. Filed under: Evidence Based Practice, Health Informatics, Preventive Healthcare, Primary Hlth Care | Tags: |

Center for Evidence and Practice Improvement (CEPI) – AHRQ – [AHRQ = US Agency for Healthcare Research and Quality]

CEPI consists of five divisions:

Evidence-Based Practice Center Program
U.S. Preventive Services Task Force Program
Division of Decision Science and Patient Engagement
Division of Health Information Technology
Division of Practice Improvement

CEPI is also home to AHRQ’s National Center for Excellence in Primary Care Research (NCEPCR)


Registries for Evaluating Patient Outcomes: A User’s Guide: 3rd Edition – AHRQ [AHRQ = US Agency for Healthcare Research and Quality] – April 2014 – now available in ebook format

Posted on July 23, 2014. Filed under: Evidence Based Practice | Tags: |

Registries for Evaluating Patient Outcomes: A User’s Guide: 3rd Edition – AHRQ [AHRQ = US Agency for Healthcare Research and Quality] – April 2014 – now available in ebook format


How do we know if a program made a difference? A guide to statistical methods for program impact evaluation – 2014

Posted on June 24, 2014. Filed under: Evidence Based Practice, Public Hlth & Hlth Promotion | Tags: , |

How do we know if a program made a difference? A guide to statistical methods for program impact evaluation – 2014

Lance, P., D. Guilkey, A. Hattori and G. Angeles. (2014). How do we know if a program made a difference? A guide to statistical methods for program impact evaluation. Chapel Hill, North Carolina: MEASURE Evaluation.
MEASURE Evaluation is funded by the U.S. Agency for International Development (USAID) and the U.S. President’s Emergency Plan for AIDS Relief (PEPFAR).

“Abstract:
This manual provides an overview of core statistical and econometric methods for program impact evaluation (and, more generally, causal modelling). More detailed and advanced than typical brief reviews of the subject, it also strives to be more approachable to a wider range of readers than the advanced theoretical literature on program impact evaluation estimators. It thus forms a bridge between more basic treatments of the essentials of impact evaluation methods and the more advanced discussions. It seeks to discuss impact evaluation estimators in a thorough manner that does justice to their complexity, but in a fashion that is approachable.

The manual is targeted to: public health professionals at programs, government agencies, and NGOs who are the consumers of the information generated by program impact evaluations; professionals serving the aforementioned role in any area of programming that influences human welfare; graduate students in public health, public policy and the social sciences; technical staff at evaluation projects; journalists looking for a more nuanced understanding of the steady stream of impact (and, more broadly, causal) studies on which they are asked to report; analysts at health analytics organizations; and so on.”
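For readers new to the area, the following minimal sketch (in Python, with made-up numbers that are not taken from the manual) illustrates one of the simplest estimators such a manual covers, difference-in-differences: the programme effect is estimated as the change in the treated group minus the change in a comparison group.

# Illustrative sketch only -- hypothetical figures, not from the MEASURE Evaluation manual.
# Mean outcome (e.g., % of facilities meeting a service standard) before and after a programme.
treated_before, treated_after = 42.0, 55.0
control_before, control_after = 40.0, 47.0

# Difference-in-differences: the change in the programme group minus the change
# in the comparison group, netting out the shared time trend.
did = (treated_after - treated_before) - (control_after - control_before)
print(f"Estimated programme impact: {did:.1f} percentage points")  # 6.0 points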


Sir Humphrey and the professors: What does Whitehall want from academics? – University of Manchester – April 2014

Posted on June 3, 2014. Filed under: Evidence Based Practice, Health Policy, Knowledge Translation |

Sir Humphrey and the professors: What does Whitehall want from academics? – University of Manchester – April 2014

A survey of senior civil servants’ views on the accessibility and utility of academic research and expertise


Observational Evidence and Strength of Evidence Domains: Case Examples. Research White Paper – AHRQ – April 2014

Posted on May 6, 2014. Filed under: Evidence Based Practice, Research | Tags: |

Observational Evidence and Strength of Evidence Domains: Case Examples. Research White Paper – AHRQ – April 2014

O’Neil M, Berkman N, Hartling L, Chang S, Anderson J, Motu’apuaka M, Guise JM, McDonagh M. Observational Evidence and Strength of Evidence Domains: Case Examples. Research White Paper. (Prepared by the AHRQ Scientific Resource Center under Contract No. 290-2012-00004-C). AHRQ Publication No. 14-EHC001-EF. Rockville, MD: Agency for Healthcare Research and Quality. April 2014.

“Structured Abstract

Background. Systematic reviews of health care interventions most often focus on randomized controlled trials. However, certain circumstances warrant consideration of observational evidence, and such studies are increasingly being included as evidence in systematic reviews.

Methods. To illustrate the use of observational evidence, we present case examples of systematic reviews in which observational evidence was considered as well as case examples of individual observational studies and how they demonstrate various strength of evidence domains in accordance with current AHRQ Evidence-based Practice Center methods guidance.

Results. In the presented examples, observational evidence is used when randomized controlled trials are infeasible or raise ethical concerns, lack generalizability, or provide insufficient data. Individual study case examples highlight how observational evidence may fulfill required strength of evidence domains, such as study limitations (reduced risk of selection, detection, performance, and attrition); directness; consistency; precision; and reporting bias (publication, selective outcome reporting, and selective analysis reporting), as well as additional domains of dose-response association, plausible confounding that would decrease the observed effect, and strength of association (magnitude of effect).

Conclusions. The cases highlighted in this paper demonstrate how observational studies may provide moderate- to (rarely) high-strength evidence in systematic reviews.”


Making sense of evidence in management decisions: the role of research-based knowledge on innovation adoption and implementation in health care. Health Serv Deliv Res 2014;2(6)

Posted on April 17, 2014. Filed under: Evidence Based Practice, Health Mgmt Policy Planning | Tags: , |

Making sense of evidence in management decisions: the role of research-based knowledge on innovation adoption and implementation in health care. Health Serv Deliv Res 2014;2(6)

Kyratsis Y, Ahmad R, Hatzaras K, Iwami M, Holmes A..

Extract from the Abstract

“Background
Although innovation can improve patient care, implementing new ideas is often challenging. Previous research found that professional attitudes, shaped in part by health policies and organisational cultures, contribute to differing perceptions of innovation ‘evidence’. However, we still know little about how evidence is empirically accessed and used by organisational decision-makers when innovations are introduced.

Aims and objectives
We aimed to investigate the use of different sources and types of evidence in innovation decisions to answer the following questions: how do managers make sense of evidence? What role does evidence play in management decision-making when adopting and implementing innovations in health care? How do wider contextual conditions and intraorganisational capacity influence research use and application by health-care managers?”


End of the road for homeopathy? – Croakey – 9 April 2014

Posted on April 9, 2014. Filed under: Complementary & Altern Care, Evidence Based Practice | Tags: |

End of the road for homeopathy? – Croakey – 9 April 2014

A new report from the NHMRC on the evidence base for homeopathy raises questions about the funding of a range of homeopathic treatments. Many thanks to Lorretta Marron, CEO, Friends of Science in Medicine, for this overview.

“Will homeopathy finally disappear into history? The National Health and Medical Research Council (NHMRC) “concludes that the assessment of the evidence from research in humans does not show that homeopathy is effective for treating the range of health conditions considered”. It’s now official: according to Australia’s peak medical research body, homeopathy doesn’t work!”

… continues on the site


Which doctors take up promising ideas? New insights from open data – Nesta – 28 January 2014

Posted on March 4, 2014. Filed under: Evidence Based Practice, General Practice, Knowledge Translation, Primary Hlth Care | Tags: , |

Which doctors take up promising ideas? New insights from open data – Nesta – 28 January 2014

“The report looks at early adoption of promising new ideas across primary care in England and argues that analysing open data can help public services gain a greater understanding of their take up of innovations.

Key findings
No single group of GP practices was a serial early adopter of all the innovations reviewed, but groups of early adopters were identified around specific types of innovations.
Larger GP practices are in a better position to explore and introduce new innovations, while neighbouring practices tended to have similar rates and patterns of adopting new innovations.
GPs rely on a range of resources to identify and learn about innovations – including informal local networks, personal relationships, and information systems. Fellow GPs and national guidance were particularly influential sources of information.
Local intermediaries – such as Academic Health Science Networks and Clinical Commissioning Groups – have an important role to play in the adoption process.

This report demonstrates a rising opportunity to inform practitioners and patients by making use of open data. Analysis of primary care open data shows the potential to chart GP surgeries’ uptake of promising innovations in technologies, drugs and practices.”

… continues

 


How do government agencies use evidence? – Socialstyrelsen: the National Board of Health and Welfare, Sweden – 1 June 2013

Posted on February 25, 2014. Filed under: Evidence Based Practice, Health Mgmt Policy Planning, Health Policy |

How do government agencies use evidence? – Socialstyrelsen: the National Board of Health and Welfare, Sweden – 1 June 2013

Extract from the executive summary:

“Significant research gaps remain in our understanding about what happens inside government agencies in relation to the production, commissioning, assessment and incorporation of research-based evidence into their policy advice and their program delivery and review activities. Practices and capabilities vary enormously across types of public agencies, levels of government, and policy areas. Understanding these patterns and potentialities better would help focus attention on effective methods for improving the quality of decision-making through evidence-informed processes.”

… continues on the site


Matching form to function: Designing organizational models to support knowledge brokering in European health systems – European Observatory on Health Systems and Policies – 2013

Posted on January 23, 2014. Filed under: Evidence Based Practice, Health Libnship, Knowledge Translation | Tags: |

Matching form to function: Designing organizational models to support knowledge brokering in European health systems – European Observatory on Health Systems and Policies – 2013

John N. Lavis, Nasreen Jessani, Govin Permanand, Cristina Catallo, Amy Zierler, BRIDGE Study Team

Extract from the key messages

“Credible, competent knowledge brokers in European health systems will ideally organize themselves so as to: inform policy-making using the best available health systems information; inform the production, packaging and sharing of health systems information based on current and emerging policy-making priorities; and employ (and continuously improve) information-packaging and interactive knowledge-sharing mechanisms that are based on a solid understanding of the policy-making context.
The BRIDGE criteria can be used to assess an existing or planned organizational model.”

… continues


How can knowledge brokering be advanced in a country’s health system? – European Observatory on Health Systems and Policies – 2013

Posted on January 16, 2014. Filed under: Evidence Based Practice, Health Libnship, Knowledge Translation | Tags: |

How can knowledge brokering be advanced in a country’s health system? – European Observatory on Health Systems and Policies – 2013

Authors
John N. Lavis, McMaster University, Canada and Harvard School of Public Health, USA
Govin Permanand, Evidence and Information for Policy, WHO Regional Office for Europe, Denmark
Cristina Catallo, School of Nursing, Ryerson University, Canada and McMaster University, Canada
BRIDGE Study Team, which includes Josep Figueras, Mark Leys, David McDaid, Gabriele Pastorino and John-Arne Røttingen

“What’s the problem?
• The overarching problem is that there is a lack of attention given to ‘what to do next’ to advance knowledge brokering in many European countries’ health systems. This problem can be understood by considering four sets of interrelated issues within any given country’s health system:
– untapped potential for health systems information to inform policy-making;
– missed opportunities to take stock of the current state of knowledge brokering and to prioritize enhancements to information-packaging mechanisms, enrichments to interactive knowledge-sharing mechanisms, and adaptations to organizational models that support knowledge brokering;
– lack of alignment of support for knowledge brokering, including incentives and requirements for using promising knowledge-brokering mechanisms and models; and
– limited reach of existing efforts to advance knowledge brokering.”

… continues


Evidence gap maps – a tool for promoting evidence-informed policy and prioritizing future research – The World Bank – 1 December 2013

Posted on January 14, 2014. Filed under: Evidence Based Practice, Research | Tags: |

Evidence gap maps – a tool for promoting evidence-informed policy and prioritizing future research – The World Bank – 1 December 2013

Snilstveit, Birte; Vojtkova, Martina; Bhavsar, Ami; Gaarder, Marie. 2013. Evidence gap maps – a tool for promoting evidence-informed policy and prioritizing future research. Policy Research Working Paper No. WPS 6725. Washington, DC: World Bank Group.

“Evidence-gap maps present a new addition to the tools available to support evidence-informed policy making. Evidence-gap maps are thematic evidence collections covering a range of issues such as maternal health, HIV/AIDS, and agriculture. They present a visual overview of existing systematic reviews or impact evaluations in a sector or subsector, schematically representing the types of interventions evaluated and outcomes reported. Gap maps enable policy makers and practitioners to explore the findings and quality of the existing evidence and facilitate informed judgment and evidence-based decision making in international development policy and practice. The gap map also identifies key “gaps” where little or no evidence from impact evaluations and systematic reviews is available and where future research should be focused. Thus, gap maps can be a useful tool for developing a strategic approach to building the evidence base in a particular sector. This paper provides an introduction to evidence-gap maps, outlines the gap-map methodology, and presents some examples.”
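To make the idea concrete, the short sketch below (purely hypothetical interventions, outcomes and study counts, not drawn from the paper) shows the basic structure of a gap map: an intervention-by-outcome grid in which empty cells flag where evidence is missing.

# Hypothetical illustration of an evidence gap map: rows are intervention types,
# columns are outcomes, and each cell counts the impact evaluations or systematic
# reviews found, so zero-count cells reveal the "gaps".
gap_map = {
    ("cash transfers", "school enrolment"): 14,
    ("cash transfers", "child nutrition"): 6,
    ("school feeding", "school enrolment"): 9,
    ("school feeding", "child nutrition"): 0,   # a gap: little or no evidence
}
for (intervention, outcome), n_studies in gap_map.items():
    flag = "GAP" if n_studies == 0 else ""
    print(f"{intervention:15} x {outcome:17}: {n_studies:2d} studies {flag}")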


NHS England Research and Development Strategy Consultation – January 2014

Posted on January 13, 2014. Filed under: Evidence Based Practice, Research |

NHS England Research and Development Strategy Consultation – January 2014

“The vision of the Research and Development Strategy is to:

Support development of high quality commissioning underpinned by research and innovation,
Support NHS England in becoming an excellent organisation by encouraging a culture that values and promotes research and innovation
Create an evidence based decision making culture
Ensure research undertaken or commissioned by NHS England is patient centred
Offer every patient the opportunity to take part in research (where practical)
Contribute to economic growth

The strategy’s aim is to:

promote the use of research and the use of evidence obtained from high quality research,
support the NHS outcomes framework objectives by building the evidence base,
identify best practice to commission research that delivers benefits for patients and their families,
support the development of evidenced based innovative practice.

There are six objectives to be delivered by 2018; these outline how NHS England will deliver to the vision, the aim and its priorities. Each objective has an associated outcome and impact highlighted.”

… continues on the site


Learning from research: systematic reviews for informing policy decisions. A quick guide – Alliance for Useful Evidence – December 2013

Posted on December 17, 2013. Filed under: Evidence Based Practice, Health Policy | Tags: |

Learning from research: systematic reviews for informing policy decisions. A quick guide – Alliance for Useful Evidence – December 2013


The influence of cost-effectiveness and other factors on NICE decisions – Centre for Health Economics, University of York – November 2013

Posted on December 10, 2013. Filed under: Evidence Based Practice, Health Economics, Health Mgmt Policy Planning, Health Policy | Tags: , |

The influence of cost-effectiveness and other factors on NICE decisions – Centre for Health Economics, University of York – November 2013

“Abstract

Background: The National Institute for Health and Care Excellence (NICE) emphasises that cost-effectiveness is not the only consideration in health technology appraisal and is increasingly explicit about other factors considered relevant. Observing NICE decisions and the evidence considered in each appraisal allows us to ‘reveal’ its implicit weights.

Objectives: This study aims to investigate the influence of cost-effectiveness and other factors on NICE decisions and to investigate whether NICE’s decision-making has changed through time.

Methods: We build on and extend the modelling approaches in Devlin and Parkin (2004) and Dakin et al (2006). We model NICE’s decisions as binary choices: i.e. recommendations for or against use of a healthcare technology in a specific patient group. Independent variables comprised: the clinical and economic evidence regarding that technology; the characteristics of the patients, disease or treatment; and contextual factors affecting the conduct of health technology appraisal. Data on all NICE decisions published by December 2011 were obtained from HTAinSite [www.htainsite.com].

Results: Cost-effectiveness alone correctly predicted 82% of decisions; few other variables were significant and alternative model specifications led to very small variations in model performance. The odds of a positive NICE recommendation differed significantly between musculoskeletal disease, respiratory disease, cancer and other conditions. The accuracy with which the model predicted NICE recommendations was slightly improved by allowing for end of life criteria, uncertainty, publication date, clinical evidence, only treatment, paediatric population, patient group evidence, appraisal process, orphan status, innovation and use of probabilistic sensitivity analysis, although these variables were not statistically significant. Although there was a non-significant trend towards more recent decisions having a higher chance of a positive recommendation, there is currently no evidence that the threshold has changed over time. The model with highest prediction accuracy suggested that a technology costing £40,000 per quality-adjusted life-year (QALY) would have a 50% chance of NICE rejection (75% at £52,000/QALY; 25% at £27,000/QALY).

Discussion: Past NICE decisions appear to have been based on a higher threshold than the £20,000 – £30,000/QALY range that is explicitly stated. However, this finding may reflect consideration of other factors that drive a small number of NICE decisions or cannot be easily quantified.”
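As a rough, illustrative back-calculation (not the authors’ fitted model), the quoted probabilities are consistent with a simple logistic (binary-choice) curve in cost per QALY centred at £40,000; the sketch below reproduces the three quoted figures under that assumption.

import math

# Illustrative only: a logistic curve whose midpoint and slope are back-solved
# from the quoted 50% rejection probability at £40,000/QALY and 75% at £52,000/QALY.
def p_reject(cost_per_qaly, midpoint=40_000, slope=math.log(3) / 12_000):
    # slope chosen so the logit rises by ln(3) over £12,000, i.e. 50% -> 75%
    return 1.0 / (1.0 + math.exp(-slope * (cost_per_qaly - midpoint)))

for cost in (27_000, 40_000, 52_000):
    print(cost, round(p_reject(cost), 2))  # ~0.23, 0.50, 0.75 -- close to the quoted 25/50/75%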


Grading the Strength of a Body of Evidence When Assessing Health Care Interventions for the Effective Health Care Program of the Agency for Healthcare Research and Quality: An Update – November 2013

Posted on December 3, 2013. Filed under: Evidence Based Practice | Tags: |

Grading the Strength of a Body of Evidence When Assessing Health Care Interventions for the Effective Health Care Program of the Agency for Healthcare Research and Quality: An Update – November 2013

Berkman ND, Lohr KN, Ansari M, McDonagh M, Balk E, Whitlock E, Reston J, Bass E, Butler M, Gartlehner G, Hartling L, Kane R, McPheeters M, Morgan L, Morton SC, Viswanathan M, Sista P, Chang S. Grading the Strength of a Body of Evidence When Assessing Health Care Interventions for the Effective Health Care Program of the Agency for Healthcare Research and Quality: An Update. Methods Guide for Comparative Effectiveness Reviews (Prepared by the RTI-UNC Evidence-based Practice Center under Contract No. 290-2007-10056-I). AHRQ Publication No. 13(14)-EHC130-EF. Rockville, MD: Agency for Healthcare Research and Quality. November 2013.

“We briefly explore the rationale for grading strength of evidence, define domains of concern, and describe our recommended grading system for systematic reviews. The aims of this guidance are twofold: (1) to foster appropriate consistency and transparency in the methods that different EPCs use to grade strength of evidence and (2) to facilitate users’ interpretations of those grades for guideline development or other decisionmaking tasks. Because this field is rapidly evolving, future revisions are anticipated; they will reflect our increasing understanding and experience with the methodology.”


Dissemination and implementation of clinical practice guidelines in Belgium – Health Services Research (HSR) Brussels: Belgian Health Care Knowledge Centre (KCE) – 21 November 2013

Posted on December 3, 2013. Filed under: Evidence Based Practice | Tags: |

Dissemination and implementation of clinical practice guidelines in Belgium – Health Services Research (HSR) Brussels: Belgian Health Care Knowledge Centre (KCE) – 21 November 2013

“The objective of this study is to identify the optimal dissemination and implementation strategies for clinical guidelines in order to propose avenues for improvement in Belgium.

Sections of the scientific report

This report has three parts:
• An overview of the systematic literature reviews on the efficacy of the strategies for guideline dissemination and implementation (chapter 1);
• A qualitative study to describe the landscape of guidelines in Belgium i.e. the different organizations and the links between them (chapter 2);
• The discussion of proposals to improve the future dissemination and implementation of guidelines in Belgium, with an involvement of representatives of major associations at stake (chapter 3). ”

… continues


The challenges of evidence: provocation paper for the Alliance for Useful Evidence, NESTA – November 2013

Posted on December 3, 2013. Filed under: Evidence Based Practice | Tags: , |

The challenges of evidence: provocation paper for the Alliance for Useful Evidence, NESTA – November 2013

by Dr Ruth Levitt


Communication and Dissemination Strategies To Facilitate the Use of Health-Related Evidence – AHRQ – November 2013

Posted on December 3, 2013. Filed under: Evidence Based Practice | Tags: , |

Communication and Dissemination Strategies To Facilitate the Use of Health-Related Evidence – AHRQ – November 2013

McCormack L, Sheridan S, Lewis M, Boudewyns V, Melvin CL, Kistler C, Lux LJ, Cullen K, Lohr KN. Communication and Dissemination Strategies To Facilitate the Use of Health-Related Evidence. Evidence Report/Technology Assessment No. 213. (Prepared by the RTI International–University of North Carolina Evidence-based Practice Center under Contract No. 290-2007-10056-I.) AHRQ Publication No. 13(14)-E003-EF. Rockville, MD: Agency for Healthcare Research and Quality; November 2013.

“Structured Abstract

Objectives. This review examined how to best communicate and disseminate evidence, including uncertain evidence, to inform health care decisions. The review focused on three primary objectives—comparing the effectiveness of: (1) communicating evidence in various contents and formats that increase the likelihood that target audiences will both understand and use the information (KQ 1); (2) a variety of approaches for disseminating evidence from those who develop it to those who are expected to use it (KQ 2); and (3) various ways of communicating uncertainty-associated health-related evidence to different target audiences (KQ 3). A secondary objective was to examine how the effectiveness of communication and dissemination strategies varies across target audiences, including evidence translators, health educators, patients, and clinicians.

Data sources. We searched MEDLINE®, the Cochrane Library, Cochrane Central Trials Registry, PsycINFO®, and the Web of Science. We used a variety of medical subject headings (MeSH terms) and major headings, and used free-text and title and abstract text-word searches. The search was limited to studies on humans published from 2000 to March 15, 2013, for communication and dissemination, given the prior systematic reviews, and from 1966 to March 15, 2013, for communicating uncertainty.

Review methods. We used standard Evidence-based Practice Center methods of dual review of abstracts, full-text articles, and abstractions, and quality ratings and group consensus to resolve disagreements. We used group consensus to grade strength of evidence.

Results. The search identified 4,152 articles (after removing duplicates) for all three KQs. After dual review at the title/abstract stage and full-text review stage, we retained 61 articles that directly (i.e., head to head) compared strategies to communicate and disseminate evidence. Across the KQs, many of the comparisons yielded insufficient evidence to draw firm conclusions. For KQ 1, we found that investigators frequently blend more than one communication strategy in interventions. For KQ 2, we found that, compared with single dissemination strategies, multicomponent dissemination strategies are more effective at enhancing clinician behavior, particularly for guideline adherence. Key findings for KQ 3 indicate that evidence on communicating overall strength of recommendation and precision was insufficient, but certain ways of communicating directness and net benefit may be helpful in reducing uncertainty.

Conclusions. The lack of comparative research evidence to inform communication and dissemination of evidence, including uncertain evidence, impedes timely clinician, patient, and policymaker awareness, uptake, and use of evidence to improve the quality of care. Expanding investment in communication, dissemination, and implementation research is critical to the identification of strategies to accelerate the translation of comparative effectiveness research into community and clinical practice and the direct benefit of patient care.”


International Profiles of Health Care Systems, 2013 – The Commonwealth Fund – 14 November 2013

Posted on November 27, 2013. Filed under: Evidence Based Practice, Health Economics, Health Informatics, Health Mgmt Policy Planning, Health Policy | Tags: |

International Profiles of Health Care Systems, 2013 – The Commonwealth Fund – 14 November 2013

S. Thomson, R. Osborn, D. Squires, and M. Jun, International Profiles of Health Care Systems, 2013, The Commonwealth Fund, November 2013.

“This publication presents overviews of the health care systems of Australia, Canada, Denmark, England, France, Germany, Japan, Italy, the Netherlands, New Zealand, Norway, Sweden, Switzerland, and the United States. Each overview covers health insurance, public and private financing, health system organization and governance, health care quality and coordination, disparities, efficiency and integration, use of information technology and evidence-based practice, cost containment, and recent reforms and innovations. In addition, summary tables provide data on a number of key health system characteristics and performance indicators, including overall health care spending, hospital spending and utilization, health care access, patient safety, care coordination, chronic care management, disease prevention, capacity for quality improvement, and public views.”


Standards of Evidence: an approach that balances the need for evidence with innovation – Nesta – October 2013

Posted on November 19, 2013. Filed under: Evidence Based Practice | Tags: , |

Standards of Evidence: an approach that balances the need for evidence with innovation – Nesta – October 2013

“This paper provides an overview of the Nesta Standards of Evidence. Our aim is to find alignment with academically recognised levels of rigour, whilst managing to ensure impact measurement is appropriate to the stage of development of a variety of different products, services and programmes.”

… continues


Social Media and Public Policy: what is the evidence? – Alliance for Useful Evidence – September 2013

Posted on October 1, 2013. Filed under: Evidence Based Practice, Health Policy | Tags: , |

Social Media and Public Policy: what is the evidence? – Alliance for Useful Evidence – September 2013

“This report considers whether social media data can improve the quality and timeliness of the evidence base that informs public policy. Can the myriad of human connections and interactions on the web provide insight to enable government to develop better policy, understand its subsequent impact and inform the many different organisations that deliver public services?”


The role of evidence in policy formation and implementation. A report from the Prime Minister’s Chief Science Advisor [NZ] – September 2013

Posted on September 17, 2013. Filed under: Evidence Based Practice, Health Policy |

The role of evidence in policy formation and implementation. A report from the Prime Minister’s Chief Science Advisor [NZ] – September 2013

ISBN 978-0-477-10404-3 (paperback)
ISBN 978-0-477-10405-0 (PDF)

“At the request of the Prime Minister, this report has been designed to explore in greater detail the issues that were brought to light in an earlier discussion paper, Towards better use of evidence in policy formation (2011). This paper extends that discussion and makes some specific suggestions as to how to improve the use of robust evidence in policy formation and evaluation.”


What Is Evidence-Based Policy? – Melbourne Institute of Applied Economic and Social Research – August 2013

Posted on September 17, 2013. Filed under: Evidence Based Practice, Health Policy | Tags: |

What Is Evidence-Based Policy? – Melbourne Institute of Applied Economic and Social Research – August 2013

Paul H. Jensen, Melbourne Institute Policy Brief No. 4/13
ISSN 2201-5477 (Print)
ISSN 2201-5485 (Online)
ISBN 978-0-7340-4321-4

“In this Policy Brief, the rationale underpinning the ‘evidence-based’ approach to public policy is carefully explained, as are the pros and cons of the different methods used to construct the evidence base. “


Evidence Aid – website for use following humanitarian crises, natural disasters, major healthcare emergencies

Posted on July 30, 2013. Filed under: Disaster Management, Evidence Based Practice |

Evidence Aid – website for use following humanitarian crises, natural disasters, major healthcare emergencies


Best Care at Lower Cost: The Path to Continuously Learning Health Care in America – Committee on the Learning Health Care System in America, Institute of Medicine – 2013

Posted on May 14, 2013. Filed under: Evidence Based Practice, Health Economics, Health Informatics, Health Mgmt Policy Planning | Tags: , |

Best Care at Lower Cost: The Path to Continuously Learning Health Care in America – Committee on the Learning Health Care System in America, Institute of Medicine – 2013

ISBN-10: 0-309-26073-6    ISBN-13: 978-0-309-26073-2

“America’s health care system has become too complex and costly to continue business as usual. Best Care at Lower Cost explains that inefficiencies, an overwhelming amount of data, and other economic and quality barriers hinder progress in improving health and threaten the nation’s economic stability and global competitiveness. According to this report, the knowledge and tools exist to put the health system on the right course to achieve continuous improvement and better quality care at a lower cost.

The costs of the system’s current inefficiency underscore the urgent need for a systemwide transformation. About 30 percent of health spending in 2009–roughly $750 billion–was wasted on unnecessary services, excessive administrative costs, fraud, and other problems. Moreover, inefficiencies cause needless suffering. By one estimate, roughly 75,000 deaths might have been averted in 2005 if every state had delivered care at the quality level of the best performing state. This report states that the way health care providers currently train, practice, and learn new information cannot keep pace with the flood of research discoveries and technological advances.

About 75 million Americans have more than one chronic condition, requiring coordination among multiple specialists and therapies, which can increase the potential for miscommunication, misdiagnosis, potentially conflicting interventions, and dangerous drug interactions. Best Care at Lower Cost emphasizes that better use of data is a critical element of a continuously improving health system; mobile technologies and electronic health records, for example, offer significant potential to capture and share health data better. In order for this to occur, the National Coordinator for Health Information Technology, IT developers, and standard-setting organizations should ensure that these systems are robust and interoperable. Clinicians and care organizations should fully adopt these technologies, and patients should be encouraged to use tools, such as personal health information portals, to actively engage in their care.

This book is a call to action that will guide health care providers; administrators; caregivers; policy makers; health professionals; federal, state, and local government agencies; private and public health organizations; and educational institutions.”


What Works: evidence centres for social policy – London, Cabinet Office – March 2013

Posted on March 19, 2013. Filed under: Evidence Based Practice, Health Mgmt Policy Planning, Health Policy |

What Works: evidence centres for social policy – London, Cabinet Office – March 2013

What Works Centres [UK]

“The What Works Network, a key action in the Civil Service reform plan, will consist of two existing centres of excellence – the National Institute for Health and Clinical Excellence (NICE) and the Educational Endowment Foundation – plus four new independent institutions responsible for gathering, assessing and sharing the most robust evidence to inform policy and service delivery in tackling crime, promoting active and independent ageing, effective early intervention, and fostering local economic growth.

 
This initiative will build upon existing evidence-based policy making. These independent specialist centres will produce and disseminate research to local decision makers, supporting them in investing in services that deliver the best outcomes for citizens and value for money for taxpayers. The centres will also feed insights into the heart of government to inform national decision-making. It is the first time a government anywhere has set up such a model at a national level.”

… continues

The four new centres are:
What Works Centre for Local Economic Growth
What Works Centre for Ageing Better
What Works Centre for Crime Reduction
What Works Centre for Early Intervention

Alliance for Useful Evidence

“The Alliance for Useful Evidence champions the use of evidence in social policy and practice. We are an open–access network of individuals from across government, universities, charities, business and local authorities in the UK and internationally.”


Systems Approaches to Knowledge Mobilization: Scan of Initiatives – Prepared for: Chronic Disease Interventions Division Public Health Agency of Canada – September 2012

Posted on January 15, 2013. Filed under: Chronic Disease Mgmt, Evidence Based Practice |

Systems Approaches to Knowledge Mobilization: Scan of Initiatives – Prepared for: Chronic Disease Interventions Division Public Health Agency of Canada – September 2012

Prepared by: Jamie Gamble, Imprint Consulting Inc.

Extract from the Foreword:

“Preventing chronic disease is complex. Solutions require a multiplicity of players, in and outside of the health system, working together using integrated, multifaceted approaches. The more chronic disease prevention (CDP) efforts can be guided by ‘what works’, the greater the chance of success. However, because what works for CDP is complex, more traditional approaches used for evidence-based medicine are not a good fit.

A promising new approach for mobilizing evidence and knowledge in order to improve CDP efforts is to apply concepts and tools from complex systems science to better link evidence and action. This approach includes giving more attention to ‘system gaps’ (as opposed to evidence gaps), better aligning the needs and interests of researchers and practitioners, focusing on systems that allow for continuous learning and adaptation, and implementing methods that enable real-time feedback about what is working, for whom, under what conditions and at what cost. In short, there is a need to develop approaches for mobilizing knowledge and evidence that better equip us to learn about what works in the dynamic and diverse environments within which CDP efforts are currently being undertaken.”

… continues


Understanding Clinical Practice Guidelines: A Video Series Primer – Health Council of Canada – 2012

Posted on December 4, 2012. Filed under: Evidence Based Practice | Tags: , |

Understanding Clinical Practice Guidelines: A Video Series Primer – Health Council of Canada – 2012
 
“The Health Council of Canada has developed a series of videos to provide an overview of clinical practice guidelines (CPGs) in Canada through the eyes of those who design, disseminate, and use them.

CPGs are evidence-based recommendations that help health care professionals make better clinical decisions. When designed and used properly, CPGs can play an important role in the Canadian health care system.

These videos are meant to offer greater insight into what CPGs are, how they are used, how they are disseminated and implemented, and what impact they can have.”


Bridging the gap: Why some people are not offered the medicines that NICE recommends – IMS Health – November 2012

Posted on December 4, 2012. Filed under: Evidence Based Practice, Pharmacy | Tags: , |

Bridging the gap: Why some people are not offered the medicines that NICE recommends – IMS Health – November 2012

by Peter Stephens

Extract from the Executive Summary:

“The extent of variation in the uptake of NICE recommended medicines across the NHS in England has been described before. The reasons for such variations are less well understood.

With the encouragement of the Metrics Oversight Group, a joint Department of Health, Industry and NHS group, IMS Health conducted a qualitative study during the summer and autumn of 2012 to help investigate the reasons for that variation.”


What counts as good evidence? – Provocation Paper for the Alliance for Useful Evidence – November 2012

Posted on November 21, 2012. Filed under: Evidence Based Practice | Tags: |

What counts as good evidence? – Provocation Paper for the Alliance for Useful Evidence – November 2012

Sandra Nutley, Alison Powell and Huw Davies, Research Unit for Research Utilisation (RURU), School of Management, University of St Andrews

Extract:

“Making better use of evidence is essential if public services are to deliver more for less. Central to this challenge is the need for a clearer understanding about standards of evidence that can be applied to the research informing social policy. This paper reviews the extent to which it is possible to reach a workable consensus on ways of identifying and labelling evidence. It does this by exploring the efforts made to date and the debates that have ensued. Throughout, the focus is on evidence that is underpinned by research, rather than other sources of evidence such as expert opinion or stakeholder views.

After setting the scene, the review and arguments are presented in five main sections:”

… continues on the site


Methods for Benefit and Harm Assessment in Systematic Reviews – AHRQ – November 2012

Posted on November 20, 2012. Filed under: Evidence Based Practice, Research | Tags: , |

Methods for Benefit and Harm Assessment in Systematic Reviews – AHRQ – November 2012

“Structured Abstract

Introduction. Systematic reviewers are challenged by how to report and synthesize information about benefits and harms of medical interventions so that decisionmakers with varying preferences can better assess the balance of benefit and harm. Quantitative approaches exist for assessing benefits and harms, but it is unclear whether they are applicable to systematic reviews.

Objectives. The objectives of this report are: (1) to describe the challenges of quantitative approaches for assessing benefits and harms, (2) to describe methodological characteristics of existing quantitative approaches for assessing benefits and harms, (3) to determine the role of values and preferences in assessing benefits and harms across each step of a systematic review and (4) to formulate principles for assessing benefits and harms in systematic reviews.

Process. We formed a multidisciplinary team with expertise in clinical medicine, systematic reviews, statistics, and epidemiology. The team reviewed the literature on quantitative approaches for assessing benefits and harms of medical interventions, and held 12 weekly meetings to establish consensus about: 1) the challenges in assessing benefits and harms; 2) the methodological characteristics of approaches that have been used; and 3) the role of values and preferences when assessing benefits and harms in systematic reviews.

The team used that information to formulate principles for analyzing benefits and harms in systematic reviews so that decisionmakers are able to weigh the benefits and harms for a given population. An external panel of experts provided input in this process.

Results. Our team identified numerous challenges for the assessment of benefits and harms. The main challenges relate to the selection of health outcomes important to patients, information asymmetry (e.g., reliable and robust data on benefits with sparse data on harms), calculation of statistical uncertainty if benefit and harm are put on the same scale using a benefit-harm comparison metric, and consideration of patient preferences.

We identified 16 quantitative approaches for the assessment of benefits and harms. Twelve of the methods can be used in a systematic review because the methods can be applied with the types of summary data that are typically reported and do not require individual patient data. Simpler approaches, such as the ratio of the number needed to treat to the number needed to harm, may be suitable for relatively simple decisionmaking contexts where relevant benefit and harm outcomes are few in number and similar in importance. More complex approaches are needed for decisionmaking contexts having a large number of relevant benefits and harms.

For individual-level decisions, values and preferences are key for determining the balance of benefit and harm. Choices are made by decisionmakers that are informed by the preferences of patients and other considerations. These choices, and therefore preferences, have an important role in determining how benefits and harms are assessed in systematic reviews. These choices and preferences also affect how guideline developers frame recommendations, how regulatory bodies make decisions at the population level, and how clinicians, patients, and other end users make decisions at the individual level.
The team formulated principles to conduct comparative assessments of benefits and harms in the context of a systematic review. For example, we recommend that systematic reviews define the decisionmaking context, report the sources of evidence used (e.g., estimates of baseline risks or treatment effects), be explicit about if and how patient preferences are considered, and provide a rationale for choosing a particular quantitative approach for comparative assessment of benefits and harms.

Conclusion. Quantitative approaches for comparative assessment of benefits and harms have strengths and limitations. The choice of a particular approach depends on the decisionmaking context, the quality and quantity of available data, and the epidemiological-statistical expertise of the systematic review team. A quantitative approach may help to improve the transparency of a review, relative to a qualitative approach, by being explicit about how benefits and harms are estimated and compared. Such transparency may help decisionmakers give proper consideration to complex information about benefits and harms.”
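
A brief illustrative note (not part of the AHRQ report): the ratio of the number needed to treat (NNT) to the number needed to harm (NNH) mentioned in the abstract can be computed directly from absolute risk differences. The sketch below uses invented event rates purely to show the arithmetic.

# Hypothetical illustration of the NNT/NNH ratio described in the abstract above.
# All event rates are made-up example values, not data from the AHRQ report.

def number_needed(control_rate: float, treated_rate: float) -> float:
    """Inverse of the absolute risk difference (NNT for benefits, NNH for harms)."""
    return 1.0 / abs(control_rate - treated_rate)

# Benefit: treatment lowers a bad outcome from 20% to 15%, so NNT = 1 / 0.05 = 20.
nnt = number_needed(control_rate=0.20, treated_rate=0.15)

# Harm: treatment raises an adverse event from 2% to 4%, so NNH = 1 / 0.02 = 50.
nnh = number_needed(control_rate=0.02, treated_rate=0.04)

# An NNT/NNH ratio below 1 (here 20 / 50 = 0.4) suggests, on this simple metric,
# that one extra benefit accrues more often than one extra harm.
print(f"NNT = {nnt:.0f}, NNH = {nnh:.0f}, NNT/NNH = {nnt / nnh:.2f}")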

Read Full Post | Make a Comment ( Comments Off on Methods for Benefit and Harm Assessment in Systematic Reviews – AHRQ – November 2012 )

Paying Wisely: Reforming Incentives to Promote Evidence-Based Decisions at the Point of Care – Center on Health Care Effectiveness – October 2012

Posted on November 20, 2012. Filed under: Comparative Effectiveness Research, Evidence Based Practice, Health Economics |

Paying Wisely: Reforming Incentives to Promote Evidence-Based Decisions at the Point of Care – Center on Health Care Effectiveness – October 2012

Extract

“Congress has invested heavily in comparative effectiveness research in order to augment the clinical information that patients and physicians need to make sound decisions at the point of care. The availability of research evidence alone, however, does not guarantee that it will be used to make decisions (Timbie et al. 2012; Esposito et al. 2010). We know, for example, that many evidence-based services are underused and that many practices persist despite a lack of evidence for their effectiveness (McGlynn et al. 2003).

In response, policymakers are looking to reform financial incentives in the fee-for-service physician payment system to encourage evidence-based care—that is, decisions based on evidence of treatment effectiveness. Although various proposals to reform provider incentives have been put forth, most focus on transforming the organization and coordination of health care at the system level rather than on how to reward an individual clinician’s use of evidence at the point of care. This paper adds an important perspective by describing how current financial incentives in the fee-for-service system lead to the overuse and underuse of services at the point of care by physicians and other clinicians. It also explores how prominent payment reform options may reward more evidence-based clinical decisions. Based on this analysis, we conclude that a combination of payment reforms—grounded in re-calibrated FFS incentives—may be the most effective way to enhance evidence-based decision making at the point of care.”

Read Full Post | Make a Comment ( Comments Off on Paying Wisely: Reforming Incentives to Promote Evidence-Based Decisions at the Point of Care – Center on Health Care Effectiveness – October 2012 )

Questions of life and death. An investigation into the value of health library and information services in Australia – October 2012

Posted on November 13, 2012. Filed under: Evidence Based Practice, Knowledge Translation | Tags: |

Questions of life and death. An investigation into the value of health library and information services in Australia – October 2012

An initiative of Health Libraries Inc, supported by the Australian Library and Information Association (ALIA)

Read Full Post | Make a Comment ( Comments Off on Questions of life and death. An investigation into the value of health library and information services in Australia – October 2012 )

Developing a framework for establishing clinical decision support meaningful use objectives for clinical specialties – RAND – 2012

Posted on October 23, 2012. Filed under: Evidence Based Practice, Health Informatics, Knowledge Translation | Tags: , |

Developing a framework for establishing clinical decision support meaningful use objectives for clinical specialties – RAND – 2012

Extract from the preface

“The federal electronic health record (EHR) incentive program includes clinical decision support (CDS) as a central requirement of improving health outcomes; however, a process for identifying and prioritizing the most-promising targets for CDS has not been established. CDS provides those involved in care processes with general and person-specific information, intelligently filtered and organized, at appropriate times, to enhance health and health care.

This report describes a protocol for eliciting high-priority targets for electronic CDS for individual clinical specialties, which could serve to inform policymakers’ deliberations and establishment of CDS meaningful use objectives. Researchers from the RAND Corporation tested the protocol with four clinical specialties: oncology, orthopedic surgery, interventional cardiology, and pediatrics. A CDS target was defined as a clinical performance gap having one or more CDS opportunities that can be implemented to address the gap.  A CDS opportunity is defined as a specific CDS intervention that could be expected to address a clinical performance gap. CDS opportunities include existing CDS tools or interventions that might be developed in the short term.”

… continues on the site

Read Full Post | Make a Comment ( Comments Off on Developing a framework for establishing clinical decision support meaningful use objectives for clinical specialties – RAND – 2012 )

Dissemination and Adoption of Comparative Effectiveness Research Findings When Findings Challenge Current Practices – RAND – 2011

Posted on October 23, 2012. Filed under: Evidence Based Practice, Knowledge Translation, Research | Tags: , |

Dissemination and Adoption of Comparative Effectiveness Research Findings When Findings Challenge Current Practices – RAND – 2011

by Eric C. Schneider, Justin W. Timbie, D. Steven Fox, Kristin R. Van Busum, John Caloyeras

“Insufficient evidence regarding the effectiveness of medical treatments has been identified as a key source of inefficiency in the U.S. healthcare system. Variation in the use of diagnostic tests and treatments for patients with similar symptoms or conditions has been attributed to clinical uncertainty, since the published scientific evidence base does not provide adequate information to determine which treatments are most effective for patients with specific clinical needs. The federal government has made a dramatic investment in comparative effectiveness research (CER), with the expectation that CER will influence clinical practice and improve the efficiency of healthcare delivery. To do this, CER must provide information that supports fundamental changes in healthcare delivery and informs the choice of diagnostic and treatment strategies. This report summarizes findings from a qualitative analysis of the factors that impede the translation of CER into clinical practice and those that facilitate it. A case-study methodology is used to explore the extent to which these factors led to changes in clinical practice following five recent key CER studies. The enabling factors and barriers to translation for each study are discussed, the root causes for the failure of translation common to the studies are synthesized, and policy options that may optimize the impact of future CER — particularly CER funded through the American Recovery and Reinvestment Act of 2009 — are proposed.”

Read Full Post | Make a Comment ( Comments Off on Dissemination and Adoption of Comparative Effectiveness Research Findings When Findings Challenge Current Practices – RAND – 2011 )

Implementing evidence-based programmes in children’s services: key issues for success – Department for Education [UK] – September 2012

Posted on October 23, 2012. Filed under: Child Health / Paediatrics, Evidence Based Practice, Knowledge Translation | Tags: |

Implementing evidence-based programmes in children’s services: key issues for success – Department for Education [UK] – September 2012

“Evidence suggests that a carefully planned and well-resourced implementation is key to better outcomes and programme success. Across disciplines, implementation researchers have devised a number of frameworks that can be used to encourage the best practice in implementation and greatest fidelity to the original programme.

This report brings together the latest international thinking about the key issues relating to the implementation of evidence-based programmes, utilising both published work and expert opinion. The aim is to provide a summary of issues that should be considered and planned for by those about to start implementing a new programme in order to increase the chances of success; to draw attention to sources of further information; and to share lessons that have been learned by others when implementing similar programmes.

The research consisted of a literature review undertaken initially using snowballing techniques following the identification of key experts in the field. This was followed by a systematic search of electronic databases for previous reviews of implementation studies. For the second section of the report, electronic database searches were carried out for published academic papers relating to the MST, FFT, MTFC, and KEEP programmes.”

… continues on the site

Read Full Post | Make a Comment ( Comments Off on Implementing evidence-based programmes in children’s services: key issues for success – Department for Education [UK] – September 2012 )

Establishing a Vibrant Evidence Paradigm to Support Innovation – Avalere Health, LLC – September 2012

Posted on October 16, 2012. Filed under: Evidence Based Practice | Tags: |

Establishing a Vibrant Evidence Paradigm to Support Innovation – Avalere Health, LLC – September 2012

Read Full Post | Make a Comment ( Comments Off on Establishing a Vibrant Evidence Paradigm to Support Innovation – Avalere Health, LLC – September 2012 )

Evidence in management decisions (EMD) – advancing knowledge utilization in healthcare management – NHS National Institute for Health Research – August 2012

Posted on September 14, 2012. Filed under: Evidence Based Practice, Health Mgmt Policy Planning | Tags: |

Evidence in management decisions (EMD) – advancing knowledge utilization in healthcare management – NHS National Institute for Health Research – August 2012

Read Full Post | Make a Comment ( Comments Off on Evidence in management decisions (EMD) – advancing knowledge utilization in healthcare management – NHS National Institute for Health Research – August 2012 )

Academic health science networks – Department of Health [UK] – 21 June 2012

Posted on June 22, 2012. Filed under: Evidence Based Practice, Knowledge Translation | Tags: |

Academic health science networks – Department of Health [UK] – 21 June 2012

“The goal of AHSNs will be to improve patient and population health outcomes by translating research into practice and developing and implementing integrated health care systems. The document sets out the draft designation and establishment process.

Every local NHS organisation should aspire to be affiliated to its local AHSN, which would act as a gateway for any NHS organisation needing support or help with innovation, and provide industry with focused points of access to the NHS.”

Read Full Post | Make a Comment ( Comments Off on Academic health science networks – Department of Health [UK] – 21 June 2012 )

Translating evidence into allied health practice: A review of the literature – Clinical Education and Training Queensland

Posted on June 1, 2012. Filed under: Allied Health, Evidence Based Practice |

Translating evidence into allied health practice: A review of the literature – Clinical Education and Training Queensland

“Executive summary
The translation of evidence into allied health practice is critical to improving health care outcomes for patients within Queensland Health facilities. It forms the final step in the process of evidence based practice and currently, there is a lack of evidence for determining the best method of translating evidence into allied health professional practice. To date, a number of different strategies to implement evidence in clinical practice have been investigated. These include the use of educational materials, educational meetings, educational outreach visits, local opinion leaders, audit and feedback, and a tailored multifaceted combination approach using the above interventions. The majority of literature to date relates to medical or nursing professional practice in community health care settings in Northern America and Europe. Overall, the effect of these educational interventions is small, with a maximum of 10% change in professional practice and less than 5% change in patient outcomes if investigated. Many studies have included a number of different intervention strategies, making it difficult to determine which component of the intervention was most effective in influencing practitioner behaviour. Modest improvements in professional practice were found when programs were specifically tailored to address identified barriers to behavioural change. There are only seven clinical trials (two in an Australian setting) to date that have investigated strategies to change allied health professional practice. These studies suggest that a multifaceted combination program can result in greater adherence to desired practice in the short term, such as increased discussion of alternative medications during pharmacy consultations and increased preventative treatments in periodontal care. Due to the limited evidence base in allied health practice to date, it is not possible to identify the most effective strategies to develop a comprehensive approach to facilitate the translation of evidence into allied health practice in Queensland Health facilities.”

Read Full Post | Make a Comment ( Comments Off on Translating evidence into allied health practice: A review of the literature – Clinical Education and Training Queensland )

Embedding of research into decision-making processes – WHO Alliance for Health Policy and Systems Research – April 2012

Posted on May 23, 2012. Filed under: Evidence Based Practice, Health Mgmt Policy Planning, Health Policy, Knowledge Translation, Research | Tags: |

Embedding of research into decision-making processes – WHO Alliance for Health Policy and Systems Research – April 2012

Adam D Koon, Devaki Nambiar, Krishna D Rao

Background paper commissioned by the Alliance for Health Policy and Systems Research to develop the WHO Health Systems  Research Strategy

“Objectives
This study is concerned with the uptake of research evidence in policy decisions for health and the factors which are conducive for this. Specifically, this study seeks to:

(a) Present a conceptual understanding of institutional embeddedness and apply it to the context of research in policy making in health. Further, through a review of the literature, document the institutional arrangements that facilitate the embedding of research use in the policy-making domain.

(b) Present country case studies to illustrate the embeddedness of research use in policy-making and the contextual and institutional factors that create enabling conditions for it.

We examine these questions from the perspective of the six WHO building blocks – service delivery, health workforce, information, medical products, financing and governance. Information is sourced from the existing literature and from country case studies.”

Read Full Post | Make a Comment ( Comments Off on Embedding of research into decision-making processes – WHO Alliance for Health Policy and Systems Research – April 2012 )

Public access to publicly-funded research – By David Willetts, Minister of State for Universities and Science (attending Cabinet) – 2 May 2012

Posted on May 4, 2012. Filed under: Evidence Based Practice, Research | Tags: , |

Public access to publicly-funded research – By David Willetts, Minister of State for Universities and Science (attending Cabinet) – 2 May 2012

Speech delivered to the Publishers Association annual general meeting, London

“I am very grateful for this opportunity to set out the Government’s approach to accessing and publishing research findings.

We are very fortunate to have such outstanding science and research capacity in this country. It is second in its range and volume only to the US. When it comes to the output generated from the funding that goes in, it is quite simply the most productive in the world. And no other country produces such a high proportion of work that is excellent. The recent review by Reed Elsevier, amongst others, provides the rigorous evidence behind these statements. With 1% of the world’s population and 4% of its researchers, we produce 6% of the world’s academic articles and 14% of those which are most highly cited. There are about 1.7 million academic articles published around the world, of which about 120,000 come from UK research. Thanks to the quality and success of our publishing industry, meanwhile, 400,000 of the world’s academic papers are published in the UK. If the rest of Britain performed like our research and publishing community, we would have rather fewer economic problems to tackle.”

… continues

Wiki founder to build open access site for UK research – The Conversation – 2 May 2012

“The British government has enlisted the services of Wikipedia in a push to make all taxpayer-funded academic research from the UK freely available online – regardless of whether it is also published in a subscription-only journal.

The move is to be announced by the universities and science minister, David Willetts, when he addresses the Publishers Association on Wednesday (British time).”

… continues

The other media story that dwarfs the News fiasco – Crikey – 3 May 2012

by Guy Rundle

“Quietly this week, while the UK was in uproar about the activities of the last big media company in a dying industry, something of far greater import happened in the world of media and information. The UK government announced that it would be making all research papers generated within its public universities available openly, online, for free.”

Read Full Post | Make a Comment ( Comments Off on Public access to publicly-funded research – By David Willetts, Minister of State for Universities and Science (attending Cabinet) – 2 May 2012 )

Enabling Health Care Decisionmaking Through Clinical Decision Support and Knowledge Management – AHRQ – April 2012

Posted on April 26, 2012. Filed under: Evidence Based Practice, Health Informatics, Knowledge Translation | Tags: , , |

Enabling Health Care Decisionmaking Through Clinical Decision Support and Knowledge Management – AHRQ – April 2012

Evidence Report / Technology Assessment No. 203
AHRQ = Agency for Healthcare Research and Quality [US], Prepared by: Duke Evidence-based Practice Center, Durham, North Carolina

“Structured Abstract

Objectives: To catalogue study designs used to assess the clinical effectiveness of clinical decision support systems (CDSSs) and knowledge management systems (KMSs), to identify features that impact the success of CDSSs/KMSs, to document the impact of CDSSs/KMSs on outcomes, and to identify knowledge types that can be integrated into CDSSs/KMSs.

Data Sources: MEDLINE®, CINAHL®, PsycINFO®, and Web of Science®.

Review Methods: We included studies published in English from January 1976 through December 2010. After screening titles and abstracts, full-text versions of articles were reviewed by two independent reviewers. Included articles were abstracted to evidence tables by two reviewers. Meta-analyses were performed for seven domains in which sufficient studies with common outcomes were included.

Results: We identified 15,176 articles, from which 323 articles describing 311 unique studies including 160 reports on 148 randomized controlled trials (RCTs) were selected for inclusion. RCTs comprised 47.5 percent of the comparative studies on CDSSs/KMSs. Both commercially and locally developed CDSSs effectively improved health care process measures related to performing preventive services (n = 25; OR 1.42, 95% confidence interval [CI] 1.27 to 1.58), ordering clinical studies (n = 20; OR 1.72, 95% CI 1.47 to 2.00), and prescribing therapies (n = 46; OR 1.57, 95% CI 1.35 to 1.82). Fourteen CDSS/KMS features were assessed for correlation with success of CDSSs/KMSs across all endpoints. Meta-analyses identified six new success features: integration with charting or order entry system, promotion of action rather than inaction, no need for additional clinician data entry, justification of decision support via research evidence, local user involvement, and provision of decision support results to patients as well as providers. Three previously identified success features were confirmed: automatic provision of decision support as part of clinician workflow, provision of decision support at time and location of decisionmaking, and provision of a recommendation, not just an assessment. Only 29 (19.6%) RCTs assessed the impact of CDSSs on clinical outcomes, 22 (14.9%) assessed costs, and 3 assessed KMSs on any outcomes. The primary source of knowledge used in CDSSs was derived from structured care protocols.

Conclusions: Strong evidence shows that CDSSs/KMSs are effective in improving health care process measures across diverse settings using both commercially and locally developed systems. Evidence for the effectiveness of CDSSs on clinical outcomes and costs and KMSs on any outcomes is minimal. Nine features of CDSSs/KMSs that correlate with a successful impact of clinical decision support have been newly identified or confirmed.”

Read Full Post | Make a Comment ( Comments Off on Enabling Health Care Decisionmaking Through Clinical Decision Support and Knowledge Management – AHRQ – April 2012 )

Using Comparative Effectiveness Research to Inform Policymaking – Commonwealth Fund – 4 April 2012

Posted on April 24, 2012. Filed under: Evidence Based Practice | Tags: , |

Using Comparative Effectiveness Research to Inform Policymaking – Commonwealth Fund – 4 April 2012

by David Squires

“Comparative effectiveness research (CER) assesses alternative treatments or diagnostic options for the same condition. Such research can prove useful for clinicians and patients as a tool to inform decisions about treatment and care. It also has potential to inform policymaking, such as decisions over which treatments to cover and at what price. In the United States, the 2009 Recovery Act for the first time provided significant funding for CER, and the 2010 Affordable Care Act went further, establishing an independent institute to commission such research—the Patient-Centered Outcomes Research Institute—with dedicated long-term funding.

Several industrialized countries have operated organizations conducting and commissioning CER for many years. In some countries these bodies are government agencies, while in others they are freestanding organizations with more independence. Policymakers often use the research these organizations generate to determine the content of publicly provided health benefits—for example, to decide whether a new drug should be covered under regional or national formularies. Other uses include negotiating pricing arrangements with drug companies or designing “value-based” cost-sharing arrangements, wherein patients pay more out-of-pocket for drugs deemed less effective than their alternative. As the Patient-Centered Outcomes Research Institute develops and CER becomes more widely available, U.S. decision-makers can learn from international experiences using CER to drive health care toward improved quality and value.”

… continues on the site

Looks at:
England’s National Institute for Health and Clinical Excellence
France’s National Authority for Health (HAS)
Germany’s Institute for Quality and Efficiency in Health Care
Australia’s Pharmaceutical Benefits Advisory Committee

Read Full Post | Make a Comment ( Comments Off on Using Comparative Effectiveness Research to Inform Policymaking – Commonwealth Fund – 4 April 2012 )

Coverage with evidence development, only in research, risk sharing or patient access scheme? A framework for coverage decisions – University of York, Centre for Health Economics – 10 April 2012

Posted on April 24, 2012. Filed under: Evidence Based Practice, Health Economics, Health Technology Assessment | Tags: |

Coverage with evidence development, only in research, risk sharing or patient access scheme? A framework for coverage decisions – University of York, Centre for Health Economics – 10 April 2012

This research paper has been published simultaneously with a shortened version of the paper in Value in Health

“Context
Until recently, purchasers’ options regarding whether to pay for the use of technologies have been binary in nature: a treatment is covered or not covered. However, policies have emerged which expand the options – for example, linking coverage to evidence development, an option increasingly used for new treatments with limited/uncertain evidence. There has been little effort to reconcile the features of technologies with the available options in a way that reflects purchasers’ ranges of authority.

Methods
We developed a framework within which different options can be evaluated. We distinguished two sources of value in terms of health: the value of the technology per se; and the value of reducing decision uncertainty. The costs of reversing decisions are also considered.

Findings
Purchasers should weigh the expected benefits of coverage against the possibility the decision may need to be reversed and the possibility adoption will hinder/prevent evidence generation. Based on the purchaser’s range of authority and the features of the technology, different decisions may be appropriate. The framework clarifies the assessments needed to establish the appropriateness of different decisions. A taxonomy of coverage decisions consistent with the framework is suggested.

Conclusions
A range of coverage options permit paying for use of promising medical technologies despite their limited/uncertain evidence bases. It is important that the option chosen be based upon not only the expected value of a technology but also the value of further research, the anticipated effect of coverage on further research, and the costs associated with reversing the decision.”

Read Full Post | Make a Comment ( Comments Off on Coverage with evidence development, only in research, risk sharing or patient access scheme? A framework for coverage decisions – University of York, Centre for Health Economics – 10 April 2012 )

Information based interventions for injury recovery – Sax Institute – February 2012

Posted on March 27, 2012. Filed under: Evidence Based Practice, Health Mgmt Policy Planning | Tags: , , |

Information based interventions for injury recovery – Sax Institute – February 2012

Collie, A., Palagyi, A., McClure, R., and Clay, F. Information Based Interventions for Injury Recovery: An Evidence Check – rapid review brokered by the Sax Institute for the Motor Accidents Authority of NSW, 2012.

“Executive Summary
Background
Vehicle-related traumatic injuries are a major public health problem. A leading cause of both morbidity and mortality, motor vehicle-related injuries cause a range of physical, cognitive and psychological disabilities that may seriously impact on the quality of life of affected individuals and their families. There is now substantial evidence that provision of compensation arising from personal injury, such as transport injury, causes harm. A number of local and international research studies now suggest that interaction with the compensation system itself is a source of frustration for those injured and may impact client outcomes. Conversely, within the cohort of people with compensable injury, compensation systems have a unique opportunity to positively impact the client’s recovery by providing effective and efficient treatment and rehabilitation services, information and education to the injured person. Compensation authorities are well positioned to promote information and education based interventions to facilitate the recovery of injured persons following transport accidents. In this context it is important to review the academic literature regarding effective information and education based interventions for promoting recovery from injury to determine approaches that may be applicable in the Australian injury compensation setting.”

Read Full Post | Make a Comment ( Comments Off on Information based interventions for injury recovery – Sax Institute – February 2012 )

Looking back, moving forward. Capturing lessons and building the evidence base for health informatics. The Connecting for Health Evaluation Programme – 15 March 2012

Posted on March 27, 2012. Filed under: Evidence Based Practice, Health Informatics | Tags: , |

Looking back, moving forward. Capturing lessons and building the evidence base for health informatics. The Connecting for Health Evaluation Programme – 15 March 2012

University of Birmingham, Health Service Journal, NHS Connecting for Health

Contents

Unfinished business: A national research programme to evaluate the effect of IT on Patient Care

Evaluation crucial to learning from both disasters and triumphs

The impact of eHealth on the Quality and Safety of Care (Parts 1 and 2)

Electronic Blood Tracking: improving blood management and patient safety

The Electronic Prescription Service in Primary Care: Great oaks..?

Lessons learned from implementation of nationally shared electronic patient records in England, Scotland, Wales and Northern Ireland

Understanding the Implementation and Adoption of the NHS Care Records Service (CRS) in English Secondary Care

Should there be greater structuring and coding of the medical record?

Understanding the impact of information technology on interactions between patients and healthcare professionals: the INTERACT-IT study

Commentary on this from eHealth Central 23 March 2012

Read Full Post | Make a Comment ( Comments Off on Looking back, moving forward. Capturing lessons and building the evidence base for health informatics. The Connecting for Health Evaluation Programme – 15 March 2012 )

Background Paper on Conceptual Issues Related to Health Systems Research to Inform a WHO Global Strategy on Health Systems Research – 29 February 2012

Posted on March 20, 2012. Filed under: Evidence Based Practice, Health Mgmt Policy Planning, Health Systems Improvement, Research | Tags: , |

Background Paper on Conceptual Issues Related to Health Systems Research to Inform a WHO Global Strategy on Health Systems Research – 29 February 2012

Steven J. Hoffman et al

“This paper was commissioned to provide a conceptual underpinning for the WHO Global Strategy on Health Systems Research that is currently under development. It reviews existing definitions, terms, conceptual models, taxonomies, standards, methods and research designs which describe the scope of health systems research as well as the barriers and opportunities that flow from them. It addresses each of the five main goals of the WHO Strategy on Research for Health, including organization, priorities, capacity, standards and translation. Any feedback would be greatly appreciated and can be sent by email to Steven Hoffman (hoffmans@mcmaster.ca).”

“Abstract
Health systems research is widely recognized as essential for strengthening health systems, getting cost-effective treatments to those who need them, and achieving better health status around the world. However, there is significant ambiguity and confusion in this field’s characteristics, boundaries, definition and methods. Adding to this ambiguity are major conceptual barriers to the production, reproduction, translation and implementation of health systems research relating to both the complexity of health systems and research involving them. These include challenges with generalizability, comparativity, applicability, transferability, standards, priority-setting and community diversity. Three promising opportunities exist to mitigate these barriers and strengthen the important contributions of health systems research. First, health systems research can be supported as a field of scientific endeavour, with a shared language, rigorous interdisciplinary approaches, cross-jurisdictional learning and an international society. Second, national capacity for health systems research can be strengthened at the individual, organizational and system levels. Third, health systems research can be embedded as a core function of every health system. Addressing these conceptual barriers and supporting the field of health systems research promises to both strengthen health systems around the world and improve global health outcomes.”

Read Full Post | Make a Comment ( Comments Off on Background Paper on Conceptual Issues Related to Health Systems Research to Inform a WHO Global Strategy on Health Systems Research – 29 February 2012 )

Guidance for Evidence-Informed Policies about Health Systems – PLoS Medicine – series of articles – 2012

Posted on March 20, 2012. Filed under: Evidence Based Practice, Health Mgmt Policy Planning, Health Policy, Health Systems Improvement | Tags: |

Bosch-Capblanch X, Lavis JN, Lewin S, Atun R, Røttingen J-A, et al. (2012) Guidance for Evidence-Informed Policies about Health Systems: Rationale for and Challenges of Guidance Development. PLoS Med 9(3): e1001185. doi:10.1371/journal.pmed.1001185

Summary Points
Weak health systems hinder the implementation of effective interventions; policies to strengthen such systems need to draw on the best available evidence.
Health systems evidence is best delivered in the form of guidance embedded in policy formulation processes, but health systems guidance is poorly developed at present.
The translation of research on problems, interventions, and implementation into decisions and policies that affect how systems are organised is one challenge facing the development of health systems guidance.
The development of guidance that is timely and usable by the broad range of health systems stakeholders, and of methods to appraise the quality of health systems guidance, are additional challenges.
Further research is needed to adapt existing approaches (e.g., those used in clinical guidelines) to produce meaningful advice that accounts for the complexity of health systems, political systems, and contexts.
This is the first paper in a three-part series in PLoS Medicine on health systems guidance.

Lavis JN, Røttingen J-A, Bosch-Capblanch X, Atun R, El-Jardali F, et al. (2012) Guidance for Evidence-Informed Policies about Health Systems: Linking Guidance Development to Policy Development. PLoS Med 9(3): e1001186. doi:10.1371/journal.pmed.1001186

Summary Points
Contextual factors are extremely important in shaping decisions about health systems, and policy makers need to work through all the pros and cons of different options before adopting specific health systems guidance.
A division of labour between global guidance developers, global policy developers, national guidance developers, and national policy developers is needed to support evidence-informed policy-making about health systems.
A panel charged with developing health systems guidance at the global level could best add value by ensuring that its output can be used for policy development at the global and national level, and for guidance development at the national level.
Rigorous health systems analyses and political systems analyses are needed at the global and national level to support guideline and policy development.
Further research is needed into the division of labour in guideline development and policy development and on frameworks for supporting system and political analyses.
This is the second paper in a three-part series in PLoS Medicine on health systems guidance.

Guidance for Evidence-Informed Policies about Health Systems: Assessing How Much Confidence to Place in the Research Evidence – PLoS Medicine – 20 March 2012

Simon Lewin
Summary Points
“Assessing how much confidence to place in different types of research evidence is key to informing judgements regarding policy options to address health systems problems.
Systematic and transparent approaches to such assessments are particularly important given the complexity of many health systems interventions.
Useful tools are available to assess how much confidence to place in the different types of research evidence needed to support different steps in the policy-making process; those for assessing evidence of effectiveness are most developed.
Tools need to be developed to assist judgements regarding evidence from systematic reviews on other key factors such as the acceptability of policy options to stakeholders, implementation feasibility, and equity.
Research is also needed on ways to develop, structure, and present policy options within global health systems guidance.
This is the third paper in a three-part series in PLoS Medicine on health systems guidance.”

Read Full Post | Make a Comment ( Comments Off on Guidance for Evidence-Informed Policies about Health Systems – PLoS Medicine – series of articles – 2012 )

Clinical Nurse Specialists’ Role in Selecting & Using Knowledge to Improve Practice & Develop Practice-based Policies Designed to Promote Optimum Patient Outcomes – Canadian Health Services Research Foundation – 30 January 2012

Posted on February 14, 2012. Filed under: Evidence Based Practice, Nursing | Tags: |

Clinical Nurse Specialists’ Role in Selecting & Using Knowledge to Improve Practice & Develop Practice-based Policies Designed to Promote Optimum Patient Outcomes – Canadian Health Services Research Foundation – 30 January 2012

by Dr. Joanne Profetto-McGrath (Principal Investigator) and Dr. Anna Ehrenberg, Faculty of Nursing, University of Alberta and Susan Young and Wendy Hill, Capital Health Authority, Edmonton, Alberta

“Main Messages

Clinical Nurse Specialists (CNSs) are advanced practice nurses with expert knowledge and skills in a specific area of practice (Canadian Nurses Association, 2003). The role of the CNS in the field of evidence-based practice has largely been ignored, in spite of the fact that it is pivotal to the facilitation of research into practice in the clinical setting. The published literature is limited in terms of how CNSs access and transfer research knowledge in making clinical decisions. Therefore, the purpose of this study was to identify and develop a preliminary understanding of the approaches utilized by CNSs to select and use research knowledge in their daily practice, with the long-term aim of developing concrete strategies for this group beyond obtaining and disseminating evidence.”

… continues on the site

Read Full Post | Make a Comment ( Comments Off on Clinical Nurse Specialists’ Role in Selecting & Using Knowledge to Improve Practice & Develop Practice-based Policies Designed to Promote Optimum Patient Outcomes – Canadian Health Services Research Foundation – 30 January 2012 )

Sicily statement on classification and development of evidence-based practice learning assessment tools – December 2011

Posted on December 6, 2011. Filed under: Educ for Hlth Professions, Evidence Based Practice |

Sicily statement on classification and development of evidence-based practice learning assessment tools
Julie K Tilson et al
BMC Medical Education 2011, 11:78 doi:10.1186/1472-6920-11-78

“Background
Teaching the steps of evidence-based practice (EBP) has become standard curriculum for health professions at both student and professional levels. Determining the best methods for evaluating EBP learning is hampered by a dearth of valid and practical assessment tools and by the absence of guidelines for classifying the purpose of those that exist. Conceived and developed by delegates of the Fifth International Conference of Evidence-Based Health Care Teachers and Developers, the aim of this statement is to provide guidance for purposeful classification and development of tools to assess EBP learning.

Discussion
This paper identifies key principles for designing EBP learning assessment tools, recommends a common taxonomy for new and existing tools, and presents the Classification Rubric for EBP Assessment Tools in Education (CREATE) framework for classifying such tools. Recommendations are provided for developers of EBP learning assessments and priorities are suggested for the types of assessments that are needed. Examples place existing EBP assessments into the CREATE framework to demonstrate how a common taxonomy might facilitate purposeful development and use of EBP learning assessment tools.

Summary
The widespread adoption of EBP into professional education requires valid and reliable measures of learning. Limited tools exist with established psychometrics. This international consensus statement strives to provide direction for developers of new EBP learning assessment tools and a framework for classifying the purposes of such tools.”

Read Full Post | Make a Comment ( Comments Off on Sicily statement on classification and development of evidence-based practice learning assessment tools – December 2011 )

Uncertainty, evidence and irrecoverable costs: informing approval, pricing and research decisions for health technologies – Centre for Health Economics University of York – October 2011

Posted on October 19, 2011. Filed under: Evidence Based Practice, Health Economics, Health Technology Assessment | Tags: |

Uncertainty, evidence and irrecoverable costs: informing approval, pricing and research decisions for health technologies – Centre for Health Economics University of York – October 2011

by Karl Claxton, Stephen Palmer, Louise Longworth, Laura Bojke, Susan Griffin, Claire McKenna, Marta Soares, Eldon Spackman and Jihee Youn
CHE Research Paper 69

“Abstract
The general issue of balancing the value of evidence about the performance of a technology and the value of access to a technology can be seen as central to a number of policy questions. Establishing the key principles of what assessments are needed, as well as how they should be made, will enable them to be addressed in an explicit and transparent manner. This report presents the key findings from MRC and NIHR funded research which aimed to: i) Establish the key principles of what assessments are needed to inform an only in research (OIR) or Approval with Research (AWR) recommendation. ii) Evaluate previous NICE guidance where OIR or AWR recommendations were made or considered. iii) Evaluate a range of alternative options to establish the criteria, additional information and/or analysis which could be made available to help the assessment needed to inform an OIR or AWR recommendation. iv) Provide a series of final recommendations, with the involvement of key stakeholders, establishing both the key principles and associated criteria that might guide OIR and AWR recommendations, identifying what, if any, additional information or analysis might be included in the Technology Appraisal process and how such recommendations might be more likely to be implemented through publicly funded and sponsored research. The key principles and the assessments and judgments required are discussed in Section 2. The sequence of assessment and judgment is represented as an algorithm, which can also be summarised as a simple set of explicit criteria or a seven-point checklist of assessments. The application of the checklist of assessments to a series of four case studies in Section 3 can inform considerations of whether such assessments can be made based on existing information and analysis in current NICE appraisal and in what circumstances additional information and/or analysis could be useful. In Section 4, some of the implications that this more explicit assessment of OIR and AWR might have for policy (e.g., NICE guidance and drug pricing), the process of appraisal (e.g., greater involvement of research commissioners) and methods of appraisal (e.g., should additional information, evidence and analysis be required) are drawn together.”

Read Full Post | Make a Comment ( Comments Off on Uncertainty, evidence and irrecoverable costs: informing approval, pricing and research decisions for health technologies – Centre for Health Economics University of York – October 2011 )

Understanding Whole Systems Change in Healthcare: The Case of Emerging Evidence-informed Nursing Service Delivery Models – Canadian Health Services Research Foundation – 7 October 2011

Posted on October 18, 2011. Filed under: Evidence Based Practice, Knowledge Translation, Nursing | Tags: |

Understanding Whole Systems Change in Healthcare: The Case of Emerging Evidence-informed Nursing Service Delivery Models – Canadian Health Services Research Foundation – 7 October 2011
Nancy Edwards, Doris Grinspun

“The imperative to deliver the best care possible drives research on best practices in nursing, but what does it take to spread a guideline or recommendation from one or two units or organizations to a system-wide innovation that benefits all patients and providers and the healthcare system as a whole? What cost drivers and increased benefits come with spreading a best practice; and what supports, sustains or gets in the way of spreading evidence-informed change?

Those were the questions we set out to answer in our four-year program of research called Evidence-Informed Models of Nursing Service. Funded by the Canadian Health Services Research Foundation and other partners, the program’s goal was to improve understanding of how health systems introduce, support and spread evidence-informed innovations.

Researchers from across Canada participated in the five projects that made up our program of research, and its main focus was the best practice guidelines initiative of the Registered Nurses Association of Ontario (RNAO). Eight years after the association launched the project, the guidelines are being implemented across Canada and internationally. However, for these the longest (except for study 2, which actually looks at three innovations introduced in Ontario before RNAO launched its guideline initiative). We looked at nursing guidelines because nurses are with patients around the clock, in every sector of healthcare, and getting nurses to base their work on up-to-date, evidence-based practices is central to delivering safe care and optimizing patient, organizational and system outcomes. The learnings of this study about spreading innovations apply to all healthcare professions and sectors.

Read Full Post | Make a Comment ( Comments Off on Understanding Whole Systems Change in Healthcare: The Case of Emerging Evidence-informed Nursing Service Delivery Models – Canadian Health Services Research Foundation – 7 October 2011 )

Integrating Clinical Decision Support Into Workflow—Final Report – AHRQ – September 2011

Posted on October 4, 2011. Filed under: Evidence Based Practice, Health Informatics, Knowledge Translation | Tags: , |

Integrating Clinical Decision Support Into Workflow—Final Report – AHRQ – September 2011

Doebbeling BN, Saleem J, Haggstrom D, et al. Integrating Clinical Decision Support Into Workflow—Final Report. (Prepared by Indiana University, Regenstrief Institute under Contract No. HSA290200600013-3.) AHRQ Publication No. 11-0076-EF. Rockville, MD: Agency for Healthcare Research and Quality. September 2011.

Agency for Healthcare Research and Quality – AHRQ Publication No. 11-0076-EF

“Structured Abstract

Purpose: The aims were to (1) identify barriers and facilitators related to integration of clinical decision support (CDS) into workflow and (2) develop and test CDS design alternatives.

Scope: To better understand CDS integration, we studied its use in practice, focusing on CDS for colorectal cancer (CRC) screening and followup. Phase 1 involved outpatient clinics of four different systems—120 clinic staff and providers and 118 patients were observed. In Phase 2, prototyped design enhancements to the Veterans Administration’s CRC screening reminder were compared against its current reminder in a simulation experiment. Twelve providers participated.

Methods: Phase 1 was a qualitative project, using key informant interviews, direct observation, opportunistic interviews, and focus groups. All data were analyzed using a coding template, based on the sociotechnical systems theory, which was modified as coding proceeded and themes emerged. Phase 2 consisted of rapid prototyping of CDS design alternatives based on Phase 1 findings and a simulation experiment to test these design changes in a within-subject comparison.

Results: Very different CDS types existed across sites, yet there are common barriers: (1) lack of coordination of “outside” results and between primary and specialty care; (2) suboptimal data organization and presentation; (3) needed provider and patient education; (4) needed interface flexibility; (5) needed technological enhancements; (6) unclear role assignments; (7) organizational issues; and (8) disconnect with quality reporting. Design enhancements positively impacted usability and workflow integration but not workload.

Conclusions: Effective CDS design and integration requires: (1) organizational and workflow integration; (2) integrating outside results; (3) improving data organization and presentation in a flexible interface; and (4) providing just-in time education, cognitive support, and quality reporting.”

Read Full Post | Make a Comment ( Comments Off on Integrating Clinical Decision Support Into Workflow—Final Report – AHRQ – September 2011 )

The Learning Health System and its Innovation Collaboratives – Update Report – Institute of Medicine Roundtable on Value & Science-Driven Health Care – 2011

Posted on September 22, 2011. Filed under: Clin Governance / Risk Mgmt / Quality, Evidence Based Practice | Tags: , , |

The Learning Health System and its Innovation Collaboratives – Update Report – Institute of Medicine Roundtable on Value & Science-Driven Health Care – 2011

“By the year 2020, ninety percent of clinical decisions will be supported by accurate, timely, and up-to-date clinical information, and will reflect the best available evidence.” – Charter, IOM Roundtable on Value & Science-Driven Health Care

Read Full Post | Make a Comment ( None so far )

Communicating Risks and Benefits: An Evidence-Based User’s Guide – FDA – August 2011

Posted on September 7, 2011. Filed under: Evidence Based Practice | Tags: , |

Communicating Risks and Benefits: An Evidence-Based User’s Guide – FDA – August 2011

Baruch Fischhoff, PhD, Noel T. Brewer, PhD, & Julie S. Downs, PhD, editors

US Department of Health and Human Services, Food and Drug Administration, Risk Communication Advisory Committee and consultants
 
Extract from the introduction:

“Organizations bear economic, legal, and ethical obligations to provide useful information about the risks and benefits of their products, policies, and services. Failure to fulfill those obligations can be costly, as seen with Three Mile Island, Hurricane Katrina, Vioxx, and other cases when people believe that they have been denied vital information. Less dramatic versions of these problems arise with poorly handled produce recalls, badly labeled appliances, and confusing medication instructions. Financial analysts estimate that 70% of a typical private firm’s assets are intangibles, like goodwill, that can be lost when communications fail. Public institutions’ reputations often depend on their ability to communicate.

Risk communication is the term of art used for situations when people need good information to make sound choices. It is distinguished from public affairs (or public relations) communication by its commitment to accuracy and its avoidance of spin. Having been spun adds insult to injury for people who have been hurt because they were inadequately informed. Risk communications must deal with the benefits that risk decisions can produce (e.g., profits from investments, better health from medical procedures), as well as the risks — making the term something of a misnomer, although less clumsy than a more inclusive one.

The risk communication research literature is large and diverse, including results from many contributing disciplines (e.g., psychology, decision science, sociology, communications) and a wide range of applications. Unfortunately, the norms of academic research make it inaccessible to outsiders, filling it with jargon and technical details. Moreover, academic researchers’ theoretical interests often lead to studying communication processes in isolation, leaving gaps as to how research results apply to complex, real-world situations. Unable to access the research literature, practitioners rely on their intuition, unproven best practices, and popular accounts of psychological research.

This guide seeks to fill that gap, making evidence-based communication possible. The chapters that follow cover key topics in risk communication, focusing on three questions:

(1) What does the science say about that aspect of human behavior?
(2) What are the practical implications of those scientific results?
(3) How can one evaluate communications based on that science?

These questions assume that sound communications must be evidence-based in two related ways. One is that communications should be consistent with the science — and not do things known not to work nor ignore known problems. The second is communications should be evaluated — because even the best science cannot guarantee results. Rather, the best science produces the best-informed best guesses about how well communications will work. However, even these best guesses can miss the mark, meaning that they must be evaluated to determine how good they are and how they can be improved.”   … continues

Read Full Post | Make a Comment ( None so far )

Canada’s Strategy for Patient-Oriented Research. Improving health outcomes through evidence-informed care – Canadian Institutes of Health Research – August 2011

Posted on September 7, 2011. Filed under: Evidence Based Practice, Patient Participation, Research |

Canada’s Strategy for Patient-Oriented Research. Improving health outcomes through evidence-informed care – Canadian Institutes of Health Research – August 2011

“This document sets out a vision and strategy to improve health outcomes and enhance patient care through the levers of research. The underlying premise of the document is that greater uptake of research-based evidence will improve the health of Canadians while improving the cost-effectiveness of the health care system.”

Read Full Post | Make a Comment ( None so far )

Procedures and requirements for meeting the 2011 NHMRC standard for clinical practice guidelines

Posted on August 22, 2011. Filed under: Clin Governance / Risk Mgmt / Quality, Evidence Based Practice | Tags: , , |

Procedures and requirements for meeting the 2011 NHMRC standard for clinical practice guidelines 
 
Extract from the synopsis of publication

“The 2011 NHMRC Standard replaces the NHMRC standards and procedures for externally developed guidelines (2007) and will apply to all guidelines seeking NHMRC approval which have commenced development after 1 January 2011.

NHMRC procedures and requirements to meet the 2011 NHMRC standard for clinical practice guidelines – Summary for developers

A summary document, including a list of what has changed compared to the 2007 NHMRC Standard, has also been prepared.

National Health and Medical Research Council. Procedures and requirements for meeting the 2011 NHMRC standard for clinical practice guidelines. Melbourne: National Health and Medical Research Council; 2011.

National Health and Medical Research Council. Procedures and requirements for meeting the 2011 NHMRC standard for clinical practice guidelines – Summary for developers. Melbourne: National Health and Medical Research Council; 2011.

Read Full Post | Make a Comment ( None so far )

An evidence-based health workforce model for primary and community care – 6 August 2011

Posted on August 8, 2011. Filed under: Evidence Based Practice, Workforce |

An evidence-based health workforce model for primary and community care – 6 August 2011
Leonie Segal and Matthew J Leach
Implementation Science 2011, 6:93 doi:10.1186/1748-5908-6-93

“Abstract (provisional)

Background
The delivery of best practice care can markedly improve clinical outcomes in patients with chronic disease. While the provision of a skilled, multidisciplinary team is pivotal to the delivery of best practice care, the occupational or skill mix required to deliver this care is unclear; it is also uncertain whether such a team would have the capacity to adequately address the complex needs of the clinic population. This is the role of needs-based health workforce planning. The objective of this article is to describe the development of an evidence-informed, needs-based health workforce model to support the delivery of best-practice interdisciplinary chronic disease management in the primary and community care setting using diabetes as a case exemplar.”

… continues on the site

Read Full Post | Make a Comment ( None so far )

Learning What Works: Infrastructure Required for Comparative Effectiveness Research – Institute of Medicine Workshop Summary – 25 July 2011

Posted on July 26, 2011. Filed under: Evidence Based Practice | Tags: , , |

Learning What Works: Infrastructure Required for Comparative Effectiveness Research – Institute of Medicine Workshop Summary – 25 July 2011

Full text

“Evidence is the cornerstone of a high-performing healthcare system. It is essential for patients and clinicians to know which treatments work best for whom if they are to make informed, collaborative care decisions. Despite this need, only a small fraction of health-related expenditures in the U.S. have been devoted to comparative effectiveness research. Recent activities—such as the creation of the Patient-Centered Outcomes Research Institute—are beginning to address this shortfall, bringing the importance of clinical research and evidence development to the forefront of health policy discussions.

As part of its Learning Health System series of workshops, the IOM’s Roundtable on Value & Science-Driven Health Care hosted a workshop to discuss capacity priorities to build the evidence base necessary for care that is more effective and delivers higher value for patients. Participants explored issues such as data linkage and improvement; study coordination and results dissemination; research methods innovation; and the training and size of the workforce, all as they relate to improved medical decision making.”

Read Full Post | Make a Comment ( None so far )

What Is the Impact of Using Evidence-Based Treatments for Posttraumatic Stress Disorder and Depression in Veterans? – RAND – 2011

Posted on July 7, 2011. Filed under: Evidence Based Practice, Mental Health Psychi Psychol | Tags: |

What Is the Impact of Using Evidence-Based Treatments for Posttraumatic Stress Disorder and Depression in Veterans? – RAND – 2011

“If all veterans suffering from major depression and posttraumatic stress disorder were to receive evidence-based treatments, policy simulations suggest that cost savings generated would be $138 million (15 percent) over two years.”  RAND research brief.

Read Full Post | Make a Comment ( None so far )

Reducing spending on low clinical value treatments – Audit Commission, Health briefing, April 2011

Posted on June 24, 2011. Filed under: Clin Governance / Risk Mgmt / Quality, Evidence Based Practice, Health Economics | Tags: |

Reducing spending on low clinical value treatments – Audit Commission, Health briefing, April 2011

“This briefing looks at primary care trusts’ (PCTs) spending on low clinical value treatments and how some PCTs have successfully reduced their spending in this area. By low clinical value treatments we mean those treatments that are either clinically ineffective or not cost-effective.

Most, if not all, PCTs have identified reducing low clinical value treatments within their Quality, Innovation, Productivity and Prevention (QIPP) plans. No single national list of low clinical value treatments exists and PCTs have been developing their own approaches.

The aim is to free up money spent on low clinical value treatments and use it either to deliver a PCT savings plan or to invest in services with better clinical outcomes. Deciding where to spend money and the clinical effectiveness of services commissioned will be just as relevant for GP consortia as they take control of the NHS budget.

Our analysis shows that it is possible for PCTs to reduce their expenditure on low clinical value treatments if they make efforts to do so. Nationally we estimate that a reduction in PCT spending of between £179 million and £441 million is achievable. By looking at the actual and estimated spending and PCT population numbers at the PCTs we visited, it appears that for every person in a PCT’s population an annual reduction in spending of £10 is possible. Nationally, this would suggest an annual reduction in spending of about £500 million. Hospitals would not make the same saving, but there would be increased capacity and money available for treatments of higher clinical value. However, the opportunities will vary for each PCT and some may decide that  securing potentially modest reductions is not worth the effort required. For others it may be significant. The Audit Commission has developed a tool to help PCTs identify the likely local potential for reducing spending.

This briefing summarises how PCTs are engaging with this challenge and sets out the progress some PCTs have made towards ensuring the NHS provides the right treatments for the right people.”
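
As a rough arithmetic check of the headline figure quoted above – and assuming England’s population at the time was in the region of 50 million, an illustrative round number rather than one given in the briefing – the per-head estimate scales up as follows:

£10 saved per person per year × ~50 million people ≈ £500 million per year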

Read Full Post | Make a Comment ( None so far )

A composite index for the benchmarking of eHealth Deployment in European acute Hospitals. Distilling reality in manageable form for evidence based policy – European Commission Joint Research Centre – May 2011

Posted on May 25, 2011. Filed under: Evidence Based Practice, Health Informatics | Tags: , |

A composite index for the benchmarking of eHealth Deployment in European acute Hospitals. Distilling reality in manageable form for evidence based policy – European Commission Joint Research Centre – May 2011

Authors: Cristiano Codagnone and Francisco Lupiañez-Villanueva

“Abstract
Benchmarking is an important pillar of European policy making and has acquired a ‘quasi-regulatory’ role within the Open Method of Coordination, in that it helps the Commission and Member States to set targets and monitor their achievement. After at least a decade of policy efforts and investments of public money to digitalise healthcare delivery, it is a good time to take stock of where we stand in terms of take-up, usage and impact. Applying state-of-the-art multivariate statistical analysis to data from a survey of eHealth deployment in Acute European Hospitals funded by Unit C4 of DG INFSO, JRC-IPTS researchers have constructed a composite indicator of take-up and usage of eHealth in European hospitals, as well as a typology of impacts. This combined analysis clearly shows how, if methodological and substantive policy issues are properly integrated, benchmarking can really contribute to the policy process and help decision makers fill existing gaps and invest in promising directions.”

Read Full Post | Make a Comment ( None so far )

Implementation of Medical Research in Clinical Practice – Forward Look – European Science Foundation – 11 May 2011

Posted on May 25, 2011. Filed under: Evidence Based Practice, Research | Tags: |

Implementation of Medical Research in Clinical Practice – Forward Look – European Science Foundation – 11 May 2011

Full text

Extract from the Executive Summary

“Medical care has improved beyond recognition over the past half century. An important contribution to this improvement has come through clinical research. Clinical research includes different stages, from basic-oriented research and disease-oriented research with animal models, through translational research and patient-oriented research, to outcome research.

When clinical research has been successfully implemented in clinical practice it can answer important questions relevant to practitioners and provide the evidence necessary to underpin practice.

It is important, however, not to become complacent and to strive for continual improvement. There is still much clinical decision-making that is not informed by evidence, and much research is carried out in ways that are not methodologically robust.

This Forward Look examines how the quality of research can be improved, and how research results can better be implemented in practice. These issues were comprehensively analysed and finally discussed and debated by more than 90 leading experts from Europe and the rest of the world in a series of workshops culminating in a consensus conference held in October 2010 at the Council of Europe in Strasbourg.

After rigorous debate and discussion, identifying gaps and highlighting best practice, a number of recommendations and conclusions were drawn, the principal ones of which were as follows:”   … continues

Read Full Post | Make a Comment ( None so far )

NICE Pathways – launched 10 May 2011

Posted on May 10, 2011. Filed under: Evidence Based Practice | Tags: , |

New online tool brings all related NICE guidance together for first time – 10 May 2011

“Today (Tuesday 10 May), the National Institute for Health and Clinical Excellence (NICE) has launched NICE Pathways at its annual conference in Birmingham. An online tool for health and social care professionals, NICE Pathways brings together all connected NICE guidance on a topic in a user-friendly electronic flowchart.

Previously there has been no easy way to see at a glance everything NICE has said on a specific condition, for example diabetes, across all its separate published guidance. For the first time ever, this digital resource will allow users to quickly view and navigate NICE guidance and other tools on any given topic across an entire care pathway. For example, the postnatal care pathway considers everything from the baby’s first 24 hours up until the first 2 – 8 weeks.

The 18 pathways launched today cover alcohol-use disorders, anaemia management in chronic kidney disease, breast cancer, chronic heart failure, chronic kidney disease (CKD), chronic obstructive pulmonary disease (COPD), dementia, depression, diabetes, diabetes in pregnancy, diet, glaucoma, neonatal jaundice, physical activity, postnatal care, smoking, stroke, and venous thromboembolism (VTE) prevention.

Covering the whole range of different types of NICE advice, including health technology appraisals, clinical guidelines, public health and social care advice, quality standards and implementation tools, this is part of a wider move to provide a more personalised, audience-focused way of looking at NICE guidance.

Users do not need to understand how NICE classifies its guidance to read everything NICE has said on a particular topic. They will now be able to easily select the sections of guidance they need. This new resource will also greatly facilitate access to NICE guidance for commissioners, who need to commission care across a whole pathway.

Individual pathways also link to other related pathways – for example the diet pathway links with the physical activity pathway. NICE Pathways will continue to develop by including more content and more topics as new NICE guidance is published and by adding new features such as linking to the evidence behind NICE recommendations.”  … continues

Read Full Post | Make a Comment ( None so far )

Evidence for Social Policy and Practice: Perspectives on how research and evidence can influence decision making in public services – NESTA – April 2011

Posted on May 10, 2011. Filed under: Evidence Based Practice, Health Mgmt Policy Planning, Research |

Evidence for Social Policy and Practice: Perspectives on how research and evidence can influence decision making in public services – NESTA – April 2011

“NESTA is the National Endowment for Science, Technology and the Arts – an independent body with a mission to make the UK more innovative.”

“From criminal justice and children’s services to poverty reduction, this report contains essays from organisations using different methodologies and approaches to generate evidence and influence policy and practice in a number of service areas.

The idea that policy and practice should be underpinned by rigorous evidence is internationally accepted, yet there is recognition that the level of rigour in evaluating ‘what works’ in social policy remains limited. In a time of public service reform and more decentralised decision making, the need for timely, accessible and reliable evidence is becoming ever more important.

At NESTA we both use and produce evidence through a combination of practical programmes and research projects. We are keen to learn from the organisations featured here – as well as others working in the field – to help strengthen the evidence base and improve the sharing of this knowledge.”

Read Full Post | Make a Comment ( None so far )

Using evidence: Advances and debates in bridging health research and action – Atlantic Health Promotion Research Centre – 2010

Posted on May 4, 2011. Filed under: Evidence Based Practice, Health Mgmt Policy Planning, Knowledge Translation, Public Hlth & Hlth Promotion |

Using evidence: Advances and debates in bridging health research and action – Atlantic Health Promotion Research Centre – 2010
Editor: Renée F. Lyons, 2010   ISBN 978-0-7703-8051-9

“About the Monograph
This monograph grew out of a symposium on health and knowledge translation (KT) held at 13 Norham Gardens, Green Templeton College, University of Oxford, in May 2008. The purpose of the symposium was to examine ways of thinking about and doing knowledge translation, and to debate issues central to moving research into action. Case examples were provided from health services and policy, clinical practice, and public health.”

Read Full Post | Make a Comment ( None so far )

Towards better use of evidence in policy formation: a discussion paper – NZ – April 2011

Posted on May 4, 2011. Filed under: Evidence Based Practice, Health Mgmt Policy Planning, Health Policy |

Towards better use of evidence in policy formation: a discussion paper – NZ – April 2011

Sir Peter Gluckman KNZM FRSNZ FRS
Chief Science Advisor to the Prime Minister
April 2011

Extract from the foreword:
“It is important to separate as far as possible the role of expert knowledge generation and evaluation from the role of those charged with policy formation. Equally, it is important to distinguish clearly between the application of scientific advice for policy formation (‘science for policy’) and the formation of policy for the operation of the Crown’s science and innovation system, including funding allocation (‘policy for science’). This paper is concerned with the former. A purely technocratic model of policy formation is not appropriate in that knowledge is not, and cannot be, the sole determinant of how policy is developed. We live in a democracy, and governments have the responsibility to integrate dimensions beyond that covered in this paper into policy formation, including societal values, public opinion, affordability and diplomatic considerations while accommodating political processes.”

Read Full Post | Make a Comment ( None so far )

Health Inequalities National Support Team Diagnostic Workbook. Assessment of Services to Reduce Diabetes-related Mortality – 1 April 2011

Posted on April 29, 2011. Filed under: Diabetes, Evidence Based Practice, Public Hlth & Hlth Promotion |

Health Inequalities National Support Team Diagnostic Workbook. Assessment of Services to Reduce Diabetes-related Mortality – 1 April 2011

“Includes potential key actions to reduce mortality.  Identifying strengths and effective practice and making tailored recommendations on how to address gaps in service delivery.”

“This workbook was developed by the Health Inequalities National Support Teams (HINST) with 70 local authorities covering populations in England. Local areas could use this approach when analysing whether population-level improvements could be achieved from a set of best-practice and established interventions. This is offered as a useful resource for commissioners: use is NOT mandatory.”

Read Full Post | Make a Comment ( None so far )

A Generic Diagnostic Framework for Addressing Inequalities in Outcome at Population Level from Evidence-based Interventions – 31 March 2011

Posted on April 29, 2011. Filed under: Evidence Based Practice, Public Hlth & Hlth Promotion |

A Generic Diagnostic Framework for Addressing Inequalities in Outcome at Population Level from Evidence-based Interventions – 31 March 2011

Identifying strengths and effective practice and making tailored recommendations on how to address gaps in service delivery / Chris Bentley – DH Health Inequalities National Support Team

“This generic workbook is the overarching guide and template for a diagnostic approach to analysing whether a population level outcome will be achieved from a set of evidence-based interventions. It is the master workbook of a series of diagnostic workbooks developed by the Health Inequalities National Support Team (HINST), while working with the 70 local authorities covering populations in England with the highest levels of disadvantage and poorest health. The programme finished work in March 2011, but the Department of Health is publishing its key outputs for local commissioners and providers to use if they so wish. Any of the areas of work within local partnerships that affect the health of the population could be explored using the generic set of questions in this workbook. The HINST has already developed these generic questions into specific topic based workbooks, each selected for the importance of its potential impact on health and wellbeing, and also on mortality and life expectancy in the short, medium or long term.”

Read Full Post | Make a Comment ( None so far )

NIH launches Web resource on complementary and alternative medicine – 26 April 2011

Posted on April 29, 2011. Filed under: Complementary & Altern Care, Evidence Based Practice | Tags: |

NIH launches Web resource on complementary and alternative medicine – 26 April 2011

“Evidence-based information for health care providers

A new online resource, designed to give health care providers easy access to evidence-based information on complementary and alternative medicine (CAM), was unveiled today by the National Center for Complementary and Alternative Medicine (NCCAM) of the National Institutes of Health.

With this new resource, providers will have the tools necessary to learn about the various CAM practices and products and be better able to discuss the safety and effectiveness of complementary and alternative medicine with their patients.” 

… continues on the site

Read Full Post | Make a Comment ( None so far )

Reducing expenditure on low clinical value treatments: A health briefing – Audit Commission [UK] – 14 April 2011

Posted on April 21, 2011. Filed under: Evidence Based Practice, Health Economics, Health Systems Improvement | Tags: |

Reducing expenditure on low clinical value treatments: A health briefing – Audit Commission – 14 April 2011

“This health briefing suggests that the NHS could save up to £500 million a year by carrying out fewer ineffective or inefficient treatments.

‘Reducing expenditure on low clinical value treatments’ argues that a single approach to defining these low value treatments could help to reduce the duplication of effort between primary care trusts (PCTs) and help to ensure consistency across the country.

The briefing considers some PCTs’ efforts to decommission treatments of low clinical value. The approaches they took and the list of treatments they targeted varied. The Commission is not advocating any particular list, but the types of low value treatments identified included:

Those considered to be relatively ineffective, eg a tonsillectomy.
Those where more cost-effective alternatives are available, eg not performing a hysterectomy in cases of heavy menstrual bleeding.
Those with a close benefit and risk balance in mild cases, eg wisdom teeth extraction.
Potentially cosmetic procedures, eg orthodontics.

Decommissioning treatments can free up money that could be better spent on other treatments, but decisions can be controversial. The briefing shows how strong leadership within PCTs, as well as good communication between PCTs and GPs, patients and the public, are crucial success factors.

A simple and easy to use online tool has also been developed that allows the user to identify savings opportunities against the ‘Croydon list’.”

Using the Reducing PCT expenditure on treatments with low clinical value online tool

Read Full Post | Make a Comment ( None so far )

Variations in health care: The good, the bad and the inexplicable – King’s Fund – 14 April 2011

Posted on April 14, 2011. Filed under: Clin Governance / Risk Mgmt / Quality, Evidence Based Practice, Surgery | Tags: |

Variations in health care: The good, the bad and the inexplicable – King’s Fund – 14 April 2011

press release

“A new report from The King’s Fund has found persistent and widespread variations across England in patients’ chances of undergoing surgery for common medical conditions.” 

“The report Variations in health care: The good, the bad and the inexplicable, outlines differences in admission rates for several routine interventions by analysing the geographical variation in health care provision in the NHS in England. Thirty-six different procedures were selected for analysis because they were either:

generally recognised to be clinically effective, or
there is uncertainty regarding their intervention, and/or
there are cost-effective alternatives available for conducting surgery – for example, treatment as a day case, rather than being admitted as an inpatient.

Evidence suggests that medical opinion and/or doctor preferences and attitudes have a substantial influence over which treatment patients will receive and are a major source of variation. Studies have also found that patients, if fully informed about their options, will often choose differently from their doctors and are less likely to elect for surgery than control groups.”

Read Full Post | Make a Comment ( None so far )

AHRQ’s Effective Health Care Program Data Points Publication Series Available

Posted on April 5, 2011. Filed under: Comparative Effectiveness Research, Evidence Based Practice | Tags: |

AHRQ’s Effective Health Care Program Data Points Publication Series Available

Agency for Healthcare Research and Quality [US]

“The DEcIDE (Developing Evidence to Inform Decisions about Effectiveness) Network announces a new publication series Data Points that will be available on AHRQ’s Effective Health Care Program Web site. This series will offer new information and insights on the use of health care services and interventions for the treatment, management, and diagnosis of diseases, as well as the variations and potential disparities across patient subpopulations.  Reports will provide brief descriptive statistics, background information, and analytic tables on a variety of specific, focused topics related to medical diagnoses, treatments, services, and patient populations.  Specific reports will present new statistics on topics such as disease incidence, prevalence, and burden of illness, as well as outcomes such as readmission, morbidity, and mortality.  The publication series will generally summarize the basic demographic and geographic breakdowns with additional details available in statistical tables that can be downloaded from the Effective Health Care Program Web site.  The first Data Points reports describe the incidence and prevalence of diabetic foot ulcers and some of its major complications in Medicare beneficiaries.  Three reports are now available on the Effective Health Care Program Web site. “

Read Full Post | Make a Comment ( None so far )

Knowledge for Improvement: Special Supplement – BMJ – April 2011

Posted on April 5, 2011. Filed under: Clin Governance / Risk Mgmt / Quality, Evidence Based Practice, Knowledge Translation |

Knowledge for Improvement: Special Supplement – BMJ – April 2011

From the Institute for Healthcare Improvement

“In April 2010, some of the best thinkers and architects of the science of quality improvement in health care gathered in England to look deeply at the state of the knowledge underpinning the discipline. For several days, thanks to the sponsorship of the UK’s Health Foundation, IHI, and the Dartmouth Institute for Health Policy and Clinical Improvement, meeting participants examined the strengths and limitations and new needs for an epistemology that’s now fueling a truly global movement for quality and safety. The work of this colloquium, organized by Dartmouth’s Paul Batalden and the Health Foundation’s Dale Webb and Paul Bate, is now the focus of some 22 essays just published by BMJ Quality & Safety (previously Quality and Safety in Health Care) in a special, OPEN ACCESS supplement entitled “Knowledge for Improvement.” Co-edited by Paul Batalden and IHI Senior Editor Frank Davidoff, the articles cover six major areas: the structure of improvement knowledge; discovering and defining sources of evidence; the social determinants of action; the importance of cross-disciplinary work; the challenges of professional education; and rethinking methods of inference. Let the reading, and learning, begin!”

Read Full Post | Make a Comment ( None so far )

The guideline advantage – American Cancer Society, American Diabetes Association, American Heart Association, American Stroke Association

Posted on April 1, 2011. Filed under: Evidence Based Practice | Tags: |

The guideline advantage – American Cancer Society, American Diabetes Association, American Heart Association, American Stroke Association

From Healthcare IT News 31 March 2011
National health groups team up for quality

“DALLAS – Three major health organizations, the American Cancer Society, American Diabetes Association and American Heart Association/American Stroke Association, have collaborated to create a quality improvement program aimed at improving outpatient care nationwide. Working with electronic health records providers from around the country, the program will provide doctors with the ability to easily gather, access and report on important data that can ultimately lead to improved care and outcomes for patients.

The program, called The Guideline Advantage, targets four of the 10 leading causes of death in the United States today, according to the Centers for Disease Control and Prevention – heart disease, cancer, stroke and diabetes.

Modeled after the American Heart Association/American Stroke Association’s Get With The Guidelines quality suite of programs, the program was first launched in 2009 as Get With The Guidelines-Outpatient, and focused on cardiovascular health. Now, as The Guideline Advantage, the program provides the basis for evaluating and improving outpatient treatment for – and prevention of – these four diseases, which share many similar risk factors.

The Guideline Advantage measures and compares the quality of care given by doctors and other healthcare providers in practices and clinics outside the hospital setting. The goal is for providers to implement the evidence-based guidelines for caring for patients who have or who are at-risk for these conditions, and help improve the way they provide that care. Through the use of electronic health records, the program will also develop a rich database of information for future heart disease, stroke, cancer and diabetes research.”

Read Full Post | Make a Comment ( None so far )

How can we improve guideline use? A conceptual framework of implementability – 22 March 2011

Posted on March 29, 2011. Filed under: Evidence Based Practice | Tags: |

How can we improve guideline use? A conceptual framework of implementability – 22 March 2011
Anna R Gagliardi, Melissa C Brouwers, Valerie A Palda, Louise Lemieux-Charles and Jeremy M Grimshaw
Implementation Science 2011, 6:26 doi:10.1186/1748-5908-6-26

Abstract (provisional)
Background
Guidelines continue to be underutilized and a variety of strategies to improve their use have been suboptimal. Modifying guideline features represents an alternative, but untested way to promote their use. The purpose of this study was to identify and define features that facilitate guideline use, and examine whether and how they are included in current guidelines.
Methods
A guideline implementability framework was developed by reviewing the implementation science literature. We then examined whether guidelines included these or additional implementability elements. Data were extracted from publicly available, high-quality guidelines reflecting primary and institutional care, reviewed independently by two individuals who resolved conflicts through discussion, and then by the research team.
Results
The final implementability framework included 22 elements organized in the domains of adaptability, usability, validity, applicability, communicability, accommodation, implementation and evaluation. ”  … continues on the site

Read Full Post | Make a Comment ( None so far )

Finding What Works in Health Care: Standards for Systematic Reviews – Institute of Medicine – 2011

Posted on March 24, 2011. Filed under: Clin Governance / Risk Mgmt / Quality, Evidence Based Practice, Health Systems Improvement | Tags: , |

Finding What Works in Health Care: Standards for Systematic Reviews – Institute of Medicine – 2011

Authors:  Jill Eden, Laura Levit, Alfred Berg, and Sally Morton, Editors; Committee on Standards for Systematic Reviews of Comparative Effectiveness Research; Institute of Medicine

“Healthcare decision makers – including doctors – increasingly turn to systematic reviews for reliable, evidence-based comparisons of health interventions. Systematic reviews identify, select, assess, and synthesize the findings of similar but separate studies. In this report, the IOM recommends standards for systematic reviews of the comparative effectiveness of medical or surgical interventions.”

ISBN-10: 0-309-21053-4
ISBN-13: 978-0-309-21053-9

Read Full Post | Make a Comment ( None so far )

NEHI outlines national plan for comparative effectiveness research – 17 February 2011

Posted on February 18, 2011. Filed under: Clin Governance / Risk Mgmt / Quality, Evidence Based Practice, Health Informatics | Tags: , , |

NEHI outlines national plan for comparative effectiveness research – 17 February 2011

by Bernie Monegain, Editor

NEHI – The National Network for Health Innovation [US]

“CAMBRIDGE, MA – NEHI, an independent nonprofit national network for health innovation, Thursday proposed a national strategy for disseminating comparative effectiveness research (CER) findings that would be led by the newly created Patient-Centered Outcomes Research Institute (PCORI).

NEHI also recommends that the CER challenge extend to other federal agencies administering programs to enhance the healthcare system’s capabilities to utilize evidence, such as the Office of the National Coordinator for Health Information Technology, which is charged with implementing federal support for the deployment of electronic medical records and clinical data infrastructure.”

…continues on the site

Read Full Post | Make a Comment ( None so far )

Achieving an Exceptional Patient and Family Experience of Inpatient Hospital Care – IHI White Paper – 2011

Posted on February 14, 2011. Filed under: Evidence Based Practice, Health Systems Improvement, Patient Participation | Tags: |

Achieving an Exceptional Patient and Family Experience of Inpatient Hospital Care – IHI White Paper – 2011

Balik B, Conway J, Zipperer L, Watson J. Achieving an Exceptional Patient and Family Experience of Inpatient Hospital Care. IHI Innovation Series white paper. Cambridge, Massachusetts: Institute for Healthcare Improvement; 2011.

“In response to growing interest from the hospital community in better understanding and improving the experience of patients and their families during hospitalization, the Institute for Healthcare Improvement (IHI) conducted an in-depth review of the research, studied exemplar organizations, and interviewed experts in the field. Our aim was to identify the primary and secondary drivers of exceptional patient and family inpatient hospital experience (defined as care that is patient centered, safe, effective, timely, efficient, and equitable), as measured by the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) survey’s “willingness to recommend” the hospital.

The project identified five primary drivers of exceptional patient and family inpatient hospital experience of care: leadership; staff hearts and minds; respectful partnership; reliable care; and evidence-based care.”

Read Full Post | Make a Comment ( None so far )

Selecting Patients for ICD Implantation: Are Clinicians Choosing Appropriately? – JAMA January 2011

Posted on January 25, 2011. Filed under: Cardiol / Cardiothor Surg, Evidence Based Practice |

Non–Evidence-Based ICD Implantations in the United States
Sana M. Al-Khatib,  et al    JAMA 2011:305(1):43-49
Abstract

Context Practice guidelines do not recommend use of an implantable cardioverter-defibrillator (ICD) for primary prevention in patients recovering from a myocardial infarction or coronary artery bypass graft surgery and those with severe heart failure symptoms or a recent diagnosis of heart failure.

Objective To determine the number, characteristics, and in-hospital outcomes of patients who receive a non–evidence-based ICD and examine the distribution of these implants by site, physician specialty, and year of procedure.

Design, Setting, and Patients Retrospective cohort study of cases submitted to the National Cardiovascular Data Registry-ICD Registry between January 1, 2006, and June 30, 2009.

Main Outcome Measure In-hospital outcomes.

Results Of 111 707 patients, 25 145 received non–evidence-based ICD implants (22.5%). Patients who received a non–evidence-based ICD compared with those who received an evidence-based ICD had a significantly higher risk of in-hospital death (0.57% [95% confidence interval {CI}, 0.48%-0.66%] vs 0.18% [95% CI, 0.15%-0.20%]; P <.001) and any postprocedure complication (3.23% [95% CI, 3.01%-3.45%] vs 2.41% [95% CI, 2.31%-2.51%]; P <.001). There was substantial variation in non–evidence-based ICDs by site. The rate of non–evidence-based ICD implants was significantly lower for electrophysiologists (20.8%; 95% CI, 20.5%-21.1%) than nonelectrophysiologists (24.8% [95% CI, 24.2%-25.3%] for nonelectrophysiologist cardiologists; 36.1% [95% CI, 34.3%-38.0%] for thoracic surgeons; and 24.9% [95% CI, 23.8%-25.9%] for other specialties) (P<.001 for all comparisons). There was no clear decrease in the rate of non–evidence-based ICDs over time (24.5% [6908/28 233] in 2006, 21.8% [7395/33 965] in 2007, 22.0% [7245/32 960] in 2008, and 21.7% [3597/16 549] in 2009; P <.001 for trend from 2006-2009 and P = .94 for trend from 2007-2009).

Conclusion Among patients with ICD implants in this registry, 22.5% did not meet evidence-based criteria for implantation.

Selecting Patients for ICD Implantation: Are Clinicians Choosing Appropriately?
Alan Kadish, Jeffrey Goldberger
JAMA. 2011;305(1):91-92.

Read Full Post | Make a Comment ( None so far )

Comparative Effectiveness Review Methods: Clinical Heterogeneity – AHRQ – 27 September 2010

Posted on October 25, 2010. Filed under: Evidence Based Practice | Tags: , |

Comparative Effectiveness Review Methods: Clinical Heterogeneity – AHRQ – 27 September 2010
AHRQ = Agency for Healthcare Research and Quality [US]

AHRQ’s Effective Health Care Program released a new report, Comparative Effectiveness Review Methods: Clinical Heterogeneity prepared by AHRQ’s RTI International–University of North Carolina Evidence-based Practice Center.  The report explores best practices for addressing clinical heterogeneity in systematic reviews and comparative effectiveness reviews. Patients, clinicians, policymakers, and others assert that systematic reviews typically focus on broad populations and, as a result, often lack information relevant to individual patients or patient subgroups.  The report concluded that clear evidence-based guidance on addressing clinical heterogeneity in systematic reviews and comparative effectiveness reviews is not available currently but would be valuable to AHRQ’s Evidence-based Practice Centers and to others conducting systematic reviews internationally.

Read Full Post | Make a Comment ( None so far )

Redesigning the Clinical Effectiveness Research Paradigm: Innovation and Practice-Based Approaches: Workshop Summary – 2010

Posted on October 5, 2010. Filed under: Comparative Effectiveness Research, Evidence Based Practice, Research | Tags: |

Redesigning the Clinical Effectiveness Research Paradigm: Innovation and Practice-Based Approaches: Workshop Summary – 2010

LeighAnne Olsen and J. Michael McGinnis; Roundtable on Value & Science-Driven Health Care; Institute of Medicine

ISBN-10: 0-309-11988-X
ISBN-13: 978-0-309-11988-7

Extract from the description:

“The Institute of Medicine Roundtable on Value & Science-Driven Health Care’s vision for a learning healthcare system, in which evidence is applied and generated as a natural course of care, is premised on the development of a research capacity that is structured to provide timely and accurate evidence relevant to the clinical decisions faced by patients and providers. As part of the Roundtable’s Learning Healthcare System series of workshops, clinical researchers, academics, and policy makers gathered for the workshop Redesigning the Clinical Effectiveness Research Paradigm: Innovation and Practice-Based Approaches. Participants explored cutting-edge research designs and methods and discussed strategies for development of a research paradigm to better accommodate the diverse array of emerging data resources, study designs, tools, and techniques. Presentations and discussions are summarized in this volume.”

Read Full Post | Make a Comment ( None so far )

Report to the President and the Congress on Comparative Effectiveness Research – US Department of Health and Human Services – 30 June 2009

Posted on October 4, 2010. Filed under: Comparative Effectiveness Research, Evidence Based Practice |

Report to the President and the Congress on Comparative Effectiveness Research – US Department of Health and Human Services – 30 June 2009

The Annual Report on Comparative Effectiveness Research contains information describing current Federal activities on comparative effectiveness research and recommendations for such research conducted or supported from funds made available by the Recovery Act.

Comparative Effectiveness Research Funding

Read Full Post | Make a Comment ( None so far )

Handbook on Impact Evaluation: Quantitative Methods and Practices – International Bank for Reconstruction and Development – 2010

Posted on October 4, 2010. Filed under: Evidence Based Practice, Health Systems Improvement | Tags: , |

Handbook on Impact Evaluation: Quantitative Methods and Practices – International Bank for Reconstruction and Development – 2010

Khandker, S.R., Koolwal, G.B., & Samad, H.A. (2010). Handbook on Impact Evaluation: Quantitative Methods and Practices. The International Bank for Reconstruction and Development.

ISBN: 978-0-8213-8028-4
eISBN: 978-0-8213-8029-1
DOI: 10.1596/978-0-8213-8028-4

Extract from the foreword:

“Handbook on Impact Evaluation: Quantitative Methods and Practices makes a valuable contribution in this area by providing, for policy and research audiences, a comprehensive overview of steps in designing and evaluating programs amid uncertain and potentially confounding conditions. It draws from a rapidly expanding and broad-based literature on program evaluation—from monitoring and evaluation approaches to experimental and nonexperimental econometric methods for designing and conducting impact evaluations.”

Read Full Post | Make a Comment ( None so far )

Registries for Evaluating Patient Outcomes: A User’s Guide – AHRQ – 2nd ed – 27 September 2010

Posted on October 4, 2010. Filed under: Evidence Based Practice, Health Informatics, Health Mgmt Policy Planning, Patient Safety | Tags: |

Registries for Evaluating Patient Outcomes: A User’s Guide – AHRQ – 2nd ed – 27 September 2010

AHRQ’s Effective Health Care  Program has released the handbook, Registries for Evaluating Patient Outcomes: A User’s Guide 2nd Edition.  Originally published in 2007, the handbook has been completely updated with four new sections addressing emerging topics in registry science:

When To Stop a Registry; Use of Registries in Product Safety Assessment
Linking Registry Data
Technical and Legal Considerations
Interfacing Registries and Electronic Health Records

Gliklich RE, Dreyer NA, eds. Registries for Evaluating Patient Outcomes: A User’s Guide. 2nd ed. (Prepared by Outcome DEcIDE Center [Outcome Sciences, Inc. d/b/a Outcome] under Contract No. HHSA29020050035I TO3.) AHRQ Publication No. 10-EHC049. Rockville, MD: Agency for Healthcare Research and Quality. September 2010.

Read Full Post | Make a Comment ( None so far )

Online Search, Consultation, and Reporting (OSCAR) System – US Indian Health Service

Posted on September 15, 2010. Filed under: Aboriginal TI Health, Evidence Based Practice, Public Hlth & Hlth Promotion | Tags: |

Online Search, Consultation, and Reporting (OSCAR) System

“The Indian Health Service (IHS) Health Promotion/Disease Prevention (HP/DP), Behavioral Health, and Improving Patient Care (IPC) Programs are creating an inventory of Best (i.e., Evidence-Based) Practice, Promising Practice, Local Effort (BP/PP/LE), Resources, and Policies occurring among American Indian/Alaska Native (AI/AN) communities, schools, work sites, health centers/clinics, and hospitals.

The purpose of this inventory is to:

Assist our AI/AN communities with getting the information and health services they need;
Form an IHS database of Best Practices, Promising Practices, Local Efforts, Resources, and Policies that can be easily accessed on the IHS website;
Improve informed consultation with Tribal and Urban programs by facilitating transparency in IHS and IHS supported activities; and,
Highlight the great work that occurs in the field.”

Read Full Post | Make a Comment ( None so far )

Medico-Legal Research Using Evidence-Based Medicine

Posted on September 14, 2010. Filed under: Evidence Based Practice |

Medico-Legal Research Using Evidence-Based Medicine
Caroline Young
LAW LIBRARY JOURNAL Vol. 102:3 [2010-25]

Ms. Young provides an introduction for legal researchers to locating and evaluating medical information in the context of evidence-based medicine. Topics covered include defining evidence-based medicine, using and selecting bibliographic databases for medical research, and applying the methods of evidence-based medicine to the process of medical research and evaluating information retrieved.

Read Full Post | Make a Comment ( None so far )

Reducing the use of ineffective health care interventions – January 2010

Posted on September 14, 2010. Filed under: Evidence Based Practice | Tags: , |

Reducing the use of ineffective health care interventions – January 2010

Working Paper 2010/5 January 2010
A report by the Centre for Health Economics Research and Evaluation for NSW Treasury

“This report covers international and Australian models for reducing the use of ineffective interventions, also described as disinvestment. Disinvestment is a development of Health Technology Assessment (HTA). Conventionally HTA has focussed on the introduction of new technologies. Although medical technology is advancing rapidly, there remain very many technologies in use which have not been subject to formal HTA. This has stimulated a growing interest in disinvestment.

This review identified a number of case studies and pilot projects. There is limited information available on the mechanisms used, and no rigorous evaluations of their impact. The most developed model is that of NICE which has recently embarked on providing guidance for disinvestment. A number of technologies have been reviewed; but there is limited information available on how these were identified, how disinvestment is implemented, or what the effect has been. There is substantial resistance to any active disinvestment. Across the various case studies, appraisal of candidate technologies seems most likely to be triggered by expert opinion.

In Australia, disinvestment is also generally passive. Technologies may be removed from funding or reimbursement if new research demonstrating harms or inefficacy becomes public. More generally, technologies fall into disuse, and are gradually replaced by new or improved technologies. Even when guidelines or funding rules are changed, there is generally continued use of an existing technology.

This review has found that active disinvestment has generally been removal of funding for ineffective and/or unsafe technologies, usually initiated by new evidence of inefficacy or harm. Disinvestment is more likely to be passive, ie driven by changes in medical practice, as a procedure or treatment gradually falls out of use over time. There are very few instances of disinvestment, or appraisal for disinvestment, driven by considerations of cost-effectiveness. There are considerable difficulties implementing disinvestment in ineffective health care practices.”

…continues

Read Full Post | Make a Comment ( None so far )

Improving Access to Psychological Therapies (IAPT): data handbook (v1.0) – NHS – August 2010

Posted on August 12, 2010. Filed under: Evidence Based Practice, Mental Health Psychi Psychol | Tags: |

Improving Access to Psychological Therapies (IAPT): data handbook (v1.0) – NHS – August 2010

Guidance on recording and monitoring outcomes to support local evidence-based practice
 
This handbook aims to provide guidance on all data collection issues within the IAPT programme, such as: utilising outcomes data for patient-centred care and improving clinical practice and service quality; help with routine outcome measurement using standard clinical metrics; and using clinical records, which will form the basis of the future National Data Set.
 
Improving Access to Psychological Therapies – services

Read Full Post | Make a Comment ( None so far )

Knowledge Brokering. Exploring the process of transferring knowledge into action – Leeds Institute of Health Sciences – May 2010

Posted on June 22, 2010. Filed under: Evidence Based Practice | Tags: , |

Knowledge Brokering. Exploring the process of transferring knowledge into action  – Leeds Institute of Health Sciences –  May 2010  – final report

A research project – Transferring knowledge into action – funded by the MRC (Medical Research Council) completed in May 2010.

Authors: Dr Vicky Ward, Dr Simon Smith, Dr Samantha Carruthers, Dr Susan Hamer,  Professor Allan House

Read Full Post | Make a Comment ( None so far )

Basing Health Care on Empirical Evidence – Issue Brief from Mathematica Policy Research, Inc – May 2010

Posted on June 15, 2010. Filed under: Evidence Based Practice, Health Mgmt Policy Planning, Health Systems Improvement |

Basing Health Care on Empirical Evidence – Issue Brief from Mathematica Policy Research, Inc – May 2010

by Jill Bernstein, Deborah Chollet, and Stephanie Peterson

“Federal [US] health reform emphasizes the development of evidence-based practice to improve the quality and effectiveness of health care and reduce unnecessary spending. Evidence-based practice uses findings from comparative effectiveness research, which compares the results of alternative treatments to identify what works best. Moving evidence into  practice, however, requires developing new information, reporting systems, and approaches to provider and consumer education. This brief reviews initiatives under way to put evidence into practice. While many of these initiatives demonstrate great potential for quality improvement, and some demonstrate potential for cost savings, their results can differ among care settings, localities, and patient populations.”

Read Full Post | Make a Comment ( None so far )

Evidence That Consumers Are Skeptical About Evidence-Based Health Care – 3 June 2010

Posted on June 4, 2010. Filed under: Evidence Based Practice, Patient Participation | Tags: |

Health Affairs, doi: 10.1377/hlthaff.2009.0296
(Published online June 3, 2010)
 
Evidence That Consumers Are Skeptical About Evidence-Based Health Care
Kristin L. Carman et al 
 Abstract
“We undertook focus groups, interviews, and an online survey with health care consumers as part of a recent project to assist purchasers in communicating more effectively about health care evidence and quality. Most of the consumers were ages 18–64; had health insurance through a current employer; and had taken part in making decisions about health insurance coverage for themselves, their spouse, or someone else. We found many of these consumers’ beliefs, values, and knowledge to be at odds with what policy makers prescribe as evidence-based health care. Few consumers understood terms such as “medical evidence” or “quality guidelines.” Most believed that more care meant higher-quality, better care. The gaps in knowledge and misconceptions point to serious challenges in engaging consumers in evidence-based decision making. ”
 
Communication toolkit mentioned in the paper

Read Full Post | Make a Comment ( None so far )

Preventing Alzheimer’s Disease and Cognitive Decline – [US] Agency for Healthcare Research and Quality – April 2010

Posted on May 24, 2010. Filed under: Aged Care / Geriatrics, Evidence Based Practice | Tags: , |

Preventing Alzheimer’s Disease and Cognitive Decline – [US] Agency for Healthcare Research and Quality – April 2010

Prepared for: Agency for Healthcare Research and Quality, U.S. Department of Health and Human Services
Prepared by: Duke Evidence-based Practice Center, Durham, North Carolina
AHRQ Publication No. 10-E005

Suggested Citation:
Williams JW, Plassman BL, Burke J, Holsinger T, Benjamin S. Preventing Alzheimer’s Disease and Cognitive Decline. Evidence Report/Technology Assessment No. 193. (Prepared by the Duke Evidence-based Practice Center under Contract No. HHSA 290-2007-10066-I.) AHRQ Publication No. 10-E005. Rockville, MD: Agency for Healthcare Research and Quality. April 2010.

“Objectives: To assess whether previous research on purported risk or protective factors for Alzheimer’s disease (AD) and cognitive decline is of sufficient strength to warrant specific recommendations for behavioral, lifestyle, or pharmaceutical interventions/modifications targeted to these endpoints.”

…continues

Read Full Post | Make a Comment ( None so far )
