Modern Healthcare writes: About 70% of more than 5,700 agencies in the U.S. got four or five stars on patient satisfaction, according to federal ratings posted Thursday. But while patients may be giving high marks for satisfaction, previous data found the agencies had only mediocre performance on outcomes. The CMS' Home Health star ratings are drawn from consumers' responses to surveys known as HCAHPS, which ask home health patients about the professionalism and the communication skills of care teams, among other things. But many of these agencies aren't reducing the need for hospital admissions, improving patients' mobility, or managing their pain.
The following is a summary of the latest quarterly quality data refresh on CMS Home Health Compare, updated 1/28/2016.
Note: the number of measures in each group is shown in parentheses.
A) Quality of Care
- Managing daily activities (3 measures) 14Q3_15Q2
- Managing pain and treating symptoms (5 measures) 14Q3_15Q2
- Treating wounds and preventing pressure sores (4 measures) 14Q3_15Q2
- Preventing harm (9 measures – including HHCAHPS Survey Summary Star Rating) 14Q3_15Q2
- Preventing unplanned hospital care (4 measures) 14Q2_15Q1
B) Patient survey results (5 measures) 14Q3_15Q2
Note: Data collection period for the quality of patient care star ratings
The quality of patient care star ratings are calculated by combining data for a number of individual quality measures, so they’re based on data from a combination of the data collection time periods shown above.
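The note above says the star ratings combine scores from multiple individual measures. As a minimal illustrative sketch only (assuming a simple unweighted average of measure scores scaled 0-100, mapped to half-star increments; CMS's actual methodology weights and risk-adjusts measures):

```python
def star_rating(measure_scores):
    """Illustrative only: combine individual quality-measure scores
    (each already scaled 0-100) into a 1-5 star rating by simple
    averaging. CMS's real methodology is more involved; this sketch
    just shows the combine-then-round idea behind a composite rating."""
    avg = sum(measure_scores) / len(measure_scores)
    # map the 0-100 average onto 1.0-5.0 stars, in half-star steps
    stars = 1 + 4 * avg / 100
    return round(stars * 2) / 2
```

For example, an agency scoring 50 on every measure would land at 3.0 stars under this toy scheme, while uniformly high scores would approach 5.0.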
PatientsLikeMe is a free patient network where people can connect with others who have the same disease or condition and track and share their own experiences. Early in December 2015, PatientsLikeMe was awarded a $900,000 grant from the Robert Wood Johnson Foundation (RWJF) to help jumpstart changes that will amplify the patient voice in the measurement of healthcare performance. A portion of the grant funded a collaboration between PatientsLikeMe and the National Quality Forum (NQF) to develop, test, and facilitate the broader use of patient-reported outcome measures (PROMs) to assess patient-reported health status. While PROMs have been used in clinical research, they are rarely used in routine clinical care to assess provider performance. In such settings, performance is primarily assessed by what was done to the patient (using process measures) and what happened to the patient (using clinical outcome measures), but not always by what may be most important to the patient.
The grant came as value-based purchasing is gaining ground in both the public and private sectors, with the Centers for Medicare & Medicaid Services (CMS) setting aggressive targets for linking performance related to quality, value and patient-centered care to payment. “Measuring what is relevant, useful and actionable for patients has never been more important,” said PatientsLikeMe Co-founder and President Ben Heywood. “This initiative will help quantify the patient experience at the clinical level, so that real patient outcomes can start to prompt changes in behavior, help tailor care, and improve reimbursement. With it, we’ll start to move the whole system toward more patient-centered care.”
A study published in the Journal of Patient Safety found Twitter to be a relevant data source for obtaining the patient perspective on medical errors. Error-reporting systems are widely regarded as critical components of improving patient safety, yet current systems do not effectively engage patients. The authors of the study, "The Potential of Twitter as a Data Source for Patient Safety," sought to assess Twitter as a source for gathering the patient perspective on errors. Working with publicly accessible English-language tweets from any geography containing highly relevant phrases, such as "doctor screwed up," posted between January and August 2012, the authors independently reviewed tweets against set criteria and selected those relevant to patient safety. Of 1,006 tweets analyzed, 839 (83%) identified the type of error: 26% were procedural errors, 23% medication errors, 23% diagnostic errors, and 14% surgical errors. A total of 850 (84%) identified a tweet source, 90% of which were by the patient and 9% by a family member. A total of 519 (52%) identified an emotional response: 47% expressed anger or frustration, 21% humor or sarcasm, and 14% sadness or grief. Of the tweets, 6.3% mentioned an intent to pursue malpractice litigation. The study concludes that Twitter may provide an opportunity for health systems and providers to identify and communicate with patients who have experienced a medical error. Further research is needed to assess the reliability of the data.
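The percentages above can be checked against the reported counts. A small sketch (the category labels and data structure here are our own; the paper's raw tweet set is not reproduced):

```python
# Tally the study's reported counts against its stated percentages.
total = 1006  # tweets analyzed
categories = {
    "identified error type": 839,
    "identified a tweet source": 850,
    "identified an emotional response": 519,
}
for label, count in categories.items():
    pct = round(100 * count / total)  # 839 -> 83%, 850 -> 84%, 519 -> 52%
    print(f"{label}: {count}/{total} ({pct}%)")
```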
Read the full study: "The Potential of Twitter as a Data Source for Patient Safety," Journal of Patient Safety.
This month the Centers for Medicare and Medicaid Services (CMS) added outpatient and ambulatory surgery centers to the list of health care sites that should send out patient satisfaction surveys. Like the surveys already in use in hospitals and offices, this initiative starts out as a voluntary reporting program. But in time, the surveys will be mandatory and CMS will publicly report aggregate data. CMS is starting to use the surveys for “value-based purchasing,” where they calculate reimbursement rates for health care sites based on the quality of work that they do.
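To make the value-based purchasing idea concrete, here is a purely hypothetical sketch of a quality-linked payment adjustment. The function name, ±2% cap, and 0-1 quality score are our own assumptions for illustration; actual CMS VBP formulas are far more involved:

```python
def adjusted_payment(base_payment, quality_score, max_adjustment=0.02):
    """Hypothetical illustration of value-based purchasing: scale a
    base reimbursement up or down (here by at most +/-2%) according
    to a 0-1 quality score. Not an actual CMS formula."""
    # a score of 0.5 is neutral; above raises payment, below lowers it
    return base_payment * (1 + max_adjustment * (2 * quality_score - 1))
```

Under this toy scheme, a top-scoring site would be paid 2% above the base rate and a bottom-scoring site 2% below it.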
An interesting article on WBUR.org (Boston’s NPR Radio Station) discusses how these patient satisfaction surveys are on the rise, but how these surveys may be harming patient satisfaction in the process of measuring it because of how they are administered.
The following is a summary of the latest quality data refresh (updated Dec 10th) on CMS Hospital Compare:
HCAHPS measures: data updated to report for 14Q2_15Q1
H_STAR_RATING (for all measures and summary star rating): Survey of patients’ experiences Star Ratings: 14Q2_15Q1
HQA measures: data updated to report for 14Q2_15Q1
Heart Attack: AMI_7a, AMI_8a
Heart Attack outpatients: OP_3b, OP_5, OP_2, OP_4
Heart Failure: HF_2 (note: no data reported for HF_1, HF_3)
Pneumonia: PN_6
Surgical Care Improvement Project: SCIP_INF_1, SCIP_INF_3, SCIP_VTE_2, SCIP_CARD_2, SCIP_INF_2, SCIP_INF_9
Emergency Department Care measures data updated to report 14Q2_15Q1
ED_1b, ED_2b, OP_18b, OP_20, OP_21, OP_23
Data not updated for 2 measures: EDV (13Q1_13Q4), OP_22 (13Q1_13Q4)
Preventive Care measures data updated to report for 14Q2_15Q1
IMM_2, IMM_3_OP_27_FAC_ADHPCT
Stroke Care measures data updated to report for 14Q2_15Q1
STK_4, STK_5, STK_1, STK_2, STK_3, STK_6, STK_8, STK_10
Blood Clot Prevention and Treatment measures data updated to report for 14Q2_15Q1
VTE_1, VTE_2, VTE_6, VTE_3, VTE_4, VTE_5
Pregnancy & Delivery Care measure data updated to report for 14Q2_15Q1
Surgical Complications measures: data not updated; still reporting data for the previously posted time periods
HAI measures: data updated to report for 14Q2_15Q1
HAI_1_SIR, HAI_1a_SIR, HAI_2a_SIR, HAI_3_SIR, HAI_4_SIR, HAI_5_SIR, HAI_6_SIR
Except for one measure, HAI_2_SIR, which reports data for a single quarter only: 15Q1
Readmissions & Deaths measures: data not updated, still reporting 11Q3_14Q2
Use of Imaging measures data not updated, still reporting for 13Q3_14Q2:
OP_8, OP_9, OP_11, OP_10, OP_13, OP_14
Payment & value of care measures: data not updated, still reporting 11Q3_14Q2
PAYM_30_AMI, PAYM_30_HF, PAYM_30_PN
Except for one measure, MSPB_1: 14Q1_14Q4
Structural measures (6 measures): data updated to report 14Q1_14Q4 for all measures except one, OP_25: 13Q1_13Q4
HIT Measures (2 measures) – data not updated, still reporting for 13Q1_13Q4
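The reporting-period codes used throughout the list above (e.g. 14Q2_15Q1) denote an inclusive span of calendar quarters. A small helper to expand one, assuming that YYQn_YYQn convention (the function name is ours):

```python
def expand_period(code):
    """Expand a reporting-period code like '14Q2_15Q1' into the list
    of calendar quarters it covers, assuming the code means an
    inclusive span from the first quarter to the second."""
    start, end = code.split("_")
    sy, sq = int(start[:2]), int(start[3])
    ey, eq = int(end[:2]), int(end[3])
    quarters = []
    y, q = sy, sq
    while (y, q) <= (ey, eq):  # tuple comparison walks the span in order
        quarters.append(f"20{y:02d} Q{q}")
        q += 1
        if q > 4:  # roll over into the next year
            q, y = 1, y + 1
    return quarters
```

So 14Q2_15Q1 covers 2014 Q2 through 2015 Q1, i.e. four quarters of data behind a single refresh.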
Findings from a new study published in JAMA show that doctors who entered data into computerized health records during patients' appointments engaged in less positive communication, and their patients less often rated their care as excellent, Reuters Health writes. Safety-net clinics serve populations with limited English proficiency and limited health literacy, who experience communication barriers that contribute to disparities in care and health. Implementation of electronic health records in safety-net clinics may affect communication between patients and health care professionals. The researchers used data from encounters between 47 patients and 39 doctors at a public hospital between 2011 and 2013 to study associations between clinicians' computer use and their communication with patients who have diverse chronic diseases. The study reports that high computer use by clinicians in safety-net clinics was associated with lower patient satisfaction and observable differences in communication.
Maryland Health Care Commission Releases Performance Comparison Report of Maryland's Commercial Health Benefit Plans
Martina Dolan | December 1, 2015
The Maryland Health Care Commission released its “2015 Comprehensive Quality Report: Comparing the Performance of Maryland’s Commercial Health Benefit Plans” and an abridged consumer edition. The two reports serve as sources of information for employers and consumers when choosing a health plan.
The MHCC Comprehensive Quality Report 2015 provides detailed, health benefit plan specific indicators of quality and performance based on measures that include: health care effectiveness through clinical performance, member satisfaction with the quality of health care service delivery, as well as health benefit plan descriptive features and quality initiatives.
Both reports are available at http://mhcc.maryland.gov/mhcc/pages/apcd/apcd_quality/apcd_quality_hbp.aspx.
An interesting post from CMWF discusses bringing consumer data into health care for better patient engagement. Amazon and other vendors deliver just about everything but health care, yet they probably know more about your habits and behaviors than your doctor does. That may be changing as health care providers begin using the consumer profiling tools that shape advertisements to get to know their patients beyond the examining room.