Discharge summaries provided to owners of pets newly diagnosed with cancer exceed recommended readability levels

Julia E. Medland, DVM, MS; Steven L. Marks, BVSc, MS; and Joanne L. Intile, DVM, MS

Department of Clinical Sciences, College of Veterinary Medicine, North Carolina State University, Raleigh, NC
Abstract

OBJECTIVE

To analyze the readability of discharge summaries distributed to owners of pets newly diagnosed with cancer.

SAMPLE

118 discharge summaries provided to pet owners following initial consultation.

PROCEDURES

A database search identified records of new patients presented to the North Carolina State Veterinary Hospital medical oncology service between June 2017 and January 2019. Owner-directed portions of the summaries provided at the time of discharge were copied into a single document and stripped of all identifying information. Readability was assessed with 2 established measures: the Flesch-Kincaid Grade Level (FKGL) and Flesch Reading Ease (FRE) tests.
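
For context, the standard published forms of these 2 measures are shown below; the specific online calculator used may differ in implementation details such as syllable counting, so these are offered only as reference formulas.

FRE = 206.835 − 1.015 × (total words / total sentences) − 84.6 × (total syllables / total words)

FKGL = 0.39 × (total words / total sentences) + 11.8 × (total syllables / total words) − 15.59

Lower FKGL values and higher FRE scores indicate easier text. On Flesch's conventional scale, FRE scores of 60 to 70 correspond to plain language (roughly an eighth- to ninth-grade reading level), scores of 30 to 50 are considered difficult (college level), and scores below 30 very difficult (college-graduate level).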

RESULTS

Mean ± SD FKGL was 11.9 ± 1.1 (median, 11.9; range, 8.6 to 15.5; target ≤ 6), and the mean ± SD FRE score was 43 ± 5.9 (median, 42.7; range, 25.5 to 58.1; target ≥ 60). There were no significant differences in FKGL or FRE scores among discharge summaries for patients with the 4 most common tumor types diagnosed or the described treatment options. Ninety-three percent (110/118) of summaries were scored as difficult or very difficult to read.

CLINICAL RELEVANCE

Owner-directed written information regarding a diagnosis of cancer at a single teaching hospital exceeded readability levels recommended by the American Medical Association and NIH and was above the average reading level of most US adults. Efforts to improve readability are an important component of promoting relationship-centered care and may improve owner compliance and patient outcomes.

Contributor Notes

Corresponding author: Dr. Intile (jlintile@ncsu.edu)
