The inter-rater reliability of mental capacity assessments

Research output: Contribution to journal › Article › peer-review

31 Citations (Scopus)

Abstract

Background
Assessing mental capacity involves complex judgements, and there is little available information on the inter-rater reliability of capacity assessments. Assessment tools have been devised in order to offer guidelines. We aimed to assess the inter-rater reliability of judgements made by a panel of experts rating the same interview transcripts in which mental capacity had been assessed.

Method
We performed a cross-sectional study of consecutive acute general medical inpatients in a teaching hospital. Patients had a clinical interview and were assessed using the MacArthur Competence Assessment Tool for Treatment (MacCAT-T) and Thinking Rationally About Treatment (TRAT), two capacity assessment interviews. The assessment was audiotaped and transcribed. The raters were asked to judge whether they thought that the patient had mental capacity based on the transcript. We then divided participants into three groups: those in whom there was unanimous agreement that they had capacity; those in whom there was disagreement; and those in whom there was unanimous agreement that they lacked capacity.

Results
We interviewed 40 patients. We found a high level of agreement between raters' assessments (mean kappa = 0.76). Those thought unanimously to have capacity were more cognitively intact, more likely to be living independently and performed consistently better on all subtests of the two capacity tools, compared with those who were unanimously thought not to have capacity. The group in whom there was disagreement fell in between.
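
For context, the kappa statistic reported above is typically Cohen's kappa, which corrects the observed agreement between raters for the agreement expected by chance; the abstract does not specify the exact variant used, so the following standard formulation is offered only as an illustration:

\[
\kappa = \frac{p_o - p_e}{1 - p_e}
\]

where \(p_o\) is the observed proportion of agreement between two raters and \(p_e\) is the proportion of agreement expected by chance. When more than two raters are involved, averaging the pairwise kappa values (as a "mean kappa") is a common summary, and values in the region of 0.76 are conventionally interpreted as substantial agreement.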

Conclusions
This study indicates that clinicians can rate mental capacity with a good level of consistency.
Original language: English
Article number: N/A
Pages (from-to): 112-117
Number of pages: 6
Journal: International Journal of Law and Psychiatry
Volume: 30
Issue number: 2
DOIs:
Publication status: Published - Mar 2007
