By Kamila Novak, KAN Consulting
During my almost three decades in the clinical trial arena, I have read hundreds of informed consent documents (ICDs). In the 90s, they used to be straightforward and short, typically about seven pages, written by people in clinical operations. When the ICD had 12-13 pages, we thought it was long.
Time passed. Studies got more complex, regulations grew stricter, privacy laws were implemented, and sponsors wished to be covered on all sides. ICDs got much longer, with 30 pages no longer considered unusual, and the longest ICD I have seen totaled 64 pages — the length of a short novel! Parts of ICDs are still written by people in clinical operations, but other parts are now written by lawyers and data privacy officers whose language is not so easy to understand.
Length aside, ICDs are difficult to read for other reasons. They have section titles, but apart from that, they look like rivers of text flowing on and on, with nothing to catch the eye, nothing to flag important things at first glance, hardly any bullet points, no colored or bold text, and no tables, diagrams, or figures. Just the flow of text.
When I read ICDs in preparation for audits, I found myself skipping lines or feeling unsure whether I had read the previous sentence and, of course, I did not remember what I had read on page nine by the time I reached page 27. I do not expect study participants to find their reading experience any different. Does anyone calculate the readability score of ICDs? There are established methods and formulas to do that (see “Recommended Literature”), yet I have never seen the score included in the ICD review and approval form.
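The readability formulas referenced above can be computed in a few lines. The sketch below uses the published Flesch Reading Ease and Flesch-Kincaid Grade Level constants, but the syllable counter is a naive vowel-group heuristic of my own rather than a dictionary lookup, so the scores should be treated as rough estimates.

```python
# Rough sketch of the Flesch readability formulas. The constants are the
# published ones; the syllable counter is a naive heuristic (assumption),
# so results are approximate.
import re

def count_syllables(word: str) -> int:
    """Approximate syllables as runs of vowels; drop a common silent 'e'."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1 and not word.endswith(("le", "ee")):
        count -= 1
    return max(count, 1)

def readability(text: str) -> dict:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / len(sentences)   # average words per sentence
    spw = syllables / len(words)        # average syllables per word
    return {
        "flesch_reading_ease": 206.835 - 1.015 * wps - 84.6 * spw,
        "flesch_kincaid_grade": 0.39 * wps + 11.8 * spw - 15.59,
    }

# Short, plain sentences score high on reading ease and low on grade level.
sample = ("You are invited to take part in a research study. "
          "Taking part is your choice. You may stop at any time.")
scores = readability(sample)
```

Running such a check on a draft ICD — with a proper syllable dictionary, or an established readability library — would take minutes and could easily be added to the ICD review and approval form.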
Yes, electronic informed consents can improve understanding if they are well designed and their whole multilayer potential is used, including hyperlinks to terms often posted by patient organizations and advocates on patient associations’ websites, videos, articles and other resources, quizzes, etc. I do not count ICDs written in a text editor, converted into a PDF, and placed on a tablet with an electronic signature tool as electronic ICDs. They are not any better than those on paper; in fact, they may be worse as I cannot clearly see two or more pages simultaneously.
When And Where ICDs Go Wrong
Involvement of people outside clinical operations and pressure to get the master ICD ready “yesterday” often result in documents that raise my eyebrows. Let me illustrate with several real-life examples.
Case 1: Acute Myocardial Infarction Study
The ICD is not extremely long at 17 pages. However, it is for an indication where minutes matter to save the patient’s life. As an investigator, do you have half an hour to discuss the study with the patients before you start treating them? No, you do not. You start inserting the catheter, tell the patient there is a study with treatment that might reduce the damage to the heart muscle, and request in haste: “Sign here.” The patient is consented, the signed ICD (with a spill of blood on the signature page) is on file, and we have met our ethical obligations. Everybody is “happy.” Needless to say, there is neither a return to the ICD nor a longer discussion once the patient’s life is out of danger. The correct way? Read Good Clinical Practice guidelines paragraph 4.8.15 on how to handle consent in emergency situations: life-saving procedures can be performed without consent, and patients are fully consented as soon as they are out of danger. Of course, everything should be described in detail in the protocol and approved by the IRB. Why was it not planned this way? Why did IRBs not request it?
Case 2: Thyroid Cancer Study
This is the study with the 64-page ICD. While it is unacceptably long in any case, patients with thyroid cancer are typically elderly, are more likely to have visual impairments, and have difficulty focusing, remembering what they read 10 minutes ago, or readily grasping new information or concepts. Would you understand and remember anything once you had finished reading this ICD? I doubt I would. If you submit such an ICD to an IRB in Sweden, where the maximum acceptable length of an ICD is part of local regulations, they will not read it and will tell you: “Come again when you cut it down to three pages.” Why do IRBs elsewhere not do the same? They have the authority to do so.
Case 3: Hemophilia A Study
The target patient population includes boys from 2 months to 6 years of age. The ICD has a long section advising the patient — yes, the patient — not to father a child and to use double contraception, plus an appended consent form for a pregnant partner. No, I am not joking. The study, including the ICD, was approved. Did anybody read the ICD? Where was the common sense?
Case 4: First-Line Treatment In A Newly Diagnosed, Unresectable, Or Metastatic Pancreatic Cancer Study
The ICD is 36 pages long, and reading it requires concentration. However, our patients have just received news of their diagnosis. Their world is upside down, with all plans for their future burnt to ashes. They are shocked, confused, deeply disturbed, and have thousands of thoughts swirling through their minds. Can we expect them to focus on what they read? No, they would be robots, not humans, if they could.
Case 5: A Vaccine Study In Sub-Saharan Africa
The ICD length is excellent, a mere seven pages. But it is in the country’s official language, which is not spoken by most people at home. Some people learn it at school if they are fortunate enough to attend. Of the hundreds of parents who agreed to include their children in this study, fewer than 1% were literate (in either the official language or the one spoken at home). Of the literate ones, only two people were fluent in the official language and, hence, consented in it. All others listened to the official recorded translation. A quiz determined their immediate understanding; the passing grade was 90%. If they answered incorrectly more than once, the consent discussion had to be repeated with another investigator. I consider that an excellent approach. However, what about long-term remembrance and understanding? Consented parents received a copy of a document they could neither read nor understand – a useless piece of paper.
Consequences Of Poorly Written ICDs
While auditing the study in the fifth scenario, I noticed some parents did not come with their children to the required study visits after vaccination, or came but refused their child’s blood sampling. Data from these visits and the blood analysis results were part of the critical data needed to evaluate vaccine effectiveness and safety. A discussion with investigators revealed a community-held belief that the blood would be sold. While such rumors exist and there is a real basis behind them (where do you think biobanks get samples from? Regrettably, not all agents procure samples lawfully), I could not help thinking there was another reason in play: after one, three, six, or more months, the participants likely did not remember why the visits and sampling were important, and the ICD copy they had certainly could not trigger any relevant memory. Had the ICD included the pictures they saw in the clinic, those memories might have been rekindled.
If I do not understand the importance of something, I pay little attention. If I do understand, I am likely to comply. The importance of critical procedures and the reason why we need the results should be explained to participants. They may be illiterate or not highly educated, but that does not mean they cannot understand a well-articulated explanation. They deserve respect, explanations, and our efforts to help them understand.
Is something similar true in other studies and other geographies? Could a lack of consenting effectiveness be one of the reasons for participants being lost to follow-up, treatment prescription non-compliance, missed visits, or refused procedures?
Although I have not seen any study on the correlation between the way we present information in ICDs and study participants’ compliance with procedures, I am convinced it exists. Immediate understanding of what I am consenting to is important, but so is my long-term understanding. Presentation of information affects our ability to memorize and understand. All teachers and students know that.
The Way Forward
I appeal to all involved in developing informed consents: Think about the “consumer” of your ICD, the study participant, and put yourself in their shoes. Read ISO 24495-1, Plain language — Part 1: Governing principles and guidelines, which explains how to use plain language so that readers can find what they need, understand it, and use it, taking into account readers’ level of interest, expertise, and literacy skills, as well as the context in which they will use the document. What can we do?
- Project managers: Allocate enough time for ICD development and justify it to get the buy-in.
- Writers: Be creative and be visual, even when you write a paper-based ICD.
- Reviewers: Evaluate readability and plain language use, and request improvement when needed.
- Functional area managers: Remember that creativity needs space and an environment to flourish; providing them is your job.
We have standard operating procedures for ICD development; however, they should be supplemented by training on creativity to help write better ICDs. Together with investigators revisiting important topics with the participants during the study, we will see an improved long-term understanding and compliance, better data, and better outcomes.
About The Author:
Kamila Novak, MSc, got her degree in molecular genetics. Since 1995, she has been involved in clinical research in various positions in pharma and CROs. Since 2010, she has been working as an independent consultant focusing on QA & QC, being a certified auditor for several ISO standards, risk management, medical writing, and training. She is a member of the Society of Quality Assurance (SQA), the World Medical Device Organization (WMDO), the European Medical Writers’ Association (EMWA), the Drug Information Association (DIA), Continuing Professional Development (CPD) UK, and other professional societies.
Recommended Literature:
- ICH E6 (R2), November 2016
- ISO 24495-1 Plain language — Part 1: Governing principles and guidelines, June 2023
Helgesson, G., & Eriksson, S. (2011). Does Informed Consent Have an Expiry Date? A Critical Reappraisal of Informed Consent as a Process. Cambridge Quarterly of Healthcare Ethics, 20(1), 85-92.
Tam, N. T., Huy, N. T., Thoa, L., Long, N. P., Trang, N. T., Hirayama, K., & Karbwang, J. (2015). Participants' understanding of informed consent in clinical trials over three decades: systematic review and meta-analysis. Bulletin of the World Health Organization, 93(3), 186–198H. https://doi.org/10.2471/BLT.14.141390.
Pietrzykowski, T., & Smilowska, K. (2021). The reality of informed consent: empirical studies on patient comprehension—systematic review. Trials, 22, 57. https://doi.org/10.1186/s13063-020-04969-w.
Bertoli, A. M., Strusberg, I., Fierro, G. A., Ramos, M., & Strusberg, A. M. (2007). Lack of correlation between satisfaction and knowledge in clinical trials participants: A pilot study. Contemporary Clinical Trials, 28(6), 730-736. https://doi.org/10.1016/j.cct.2007.04.005.
Flesch, R. (1948). "A new readability yardstick." J Appl Psychol 32 (3): 221-233.
Flesch, R. (1949). The art of readable writing. New York, Harper.
Kincaid, J., R. Fishburne, R. Rodgers and B. Chissom (1975). Derivation of New Readability Formulas (Automated Readability Index, Fog Count and Flesch Reading Ease Formula) for Navy Enlisted Personnel., Springfield, IL: Naval Technical Training Command.
Kloza, E. M., P. K. Haddow, J. V. Halliday, B. M. O'Brien, G. M. Lambert-Messerlian and G. E. Palomaki (2015). "Evaluation of patient education materials: the example of circulating cell free DNA testing for aneuploidy." J Genet Couns 24 (2): 259-266.
McCall, W. A. and L. M. Crabbs (1926). Standard test lessons in reading. New York City: Teachers College, Columbia University.
Niemiec, E., D. Vears, P. Borry and H. C. Howard (2018). "Readability of informed consent forms for whole exome/genome sequencing." European Journal of Human Genetics 26: 814-815.
Office of Human Subjects Research - Institutional Review Board. Johns Hopkins Medicine. (2016). "Informed Consent Guidance - How to Prepare a Readable Consent Form." from https://www.hopkinsmedicine.org/institutional_review_board/guidelines_policies/guidelines/informed_consent_ii.html
Paasche-Orlow, M. K., H. A. Taylor and F. L. Brancati (2003). "Readability Standards for Informed-Consent Forms as Compared with Actual Readability." New England Journal of Medicine 348 (8): 721-726.
Sullivan, L., P. Sukumar, R. Crowley, E. McAuliffe and P. Doran (2020). "Readability and understandability of clinical research patient information leaflets and consent forms in Ireland and the UK: a retrospective quantitative analysis." BMJ Open 10 (9): e037994.
Wang, L.-W., M. J. Miller, M. R. Schmitt and F. K. Wen (2013). "Assessing readability formula differences with written health information materials: Application, results, and recommendations." Research in Social and Administrative Pharmacy 9 (5): 503-516.