Using parallel assessments of neuropsychological status to explore conceptual flexibility

Lucy McIvor | Trainee Clinical Psychologist

The capacity to deal with information and concepts flexibly is a subtle yet essential requirement for everyday living. It may represent the consideration of alternatives, abstracting from the concrete to the generalisable, changing tack when the situation requires it, learning from feedback, following what ‘works’, persisting until we get it ‘right’, and holding the ‘bigger picture’ in mind. Is this perhaps the definition of ‘cognitive flexibility’?
 
The SPANS-X1 is a test battery used to measure neuropsychological functioning in individuals with acquired brain injury. The Conceptual Flexibility Index (CFI) of the SPANS-X aims to measure cognitive flexibility (CF). It combines two subtests that possess elements of concept formation, thinking laterally and flexibly, and combining concepts into superordinate categories. Currently, the CFI does not perform like other SPANS-X index scores on measures of reliability (such as Cronbach’s alpha, alternate-version and test–retest reliability). This raises several considerations. Crucially, there is the matter of whether ‘cognitive flexibility’ exists. If a construct cannot be reliably measured, arguably, it cannot be classed as a construct. Conversely, if we do assume its existence, consideration must be given to how the SPANS-X CFI could improve its performance and capture the construct of CF more accurately.
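For readers less familiar with internal-consistency reliability, Cronbach’s alpha can be illustrated with a short sketch. The item structure and scores below are invented purely for illustration and are not SPANS-X data or scoring code:

```python
# Minimal sketch of Cronbach's alpha for k items (illustrative data only).
def cronbach_alpha(items):
    """items: one inner list of scores per item, aligned across participants."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    sum_item_vars = sum(var(item) for item in items)
    # Each participant's total score across all items.
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum_item_vars / var(totals))

# Hypothetical scores for 3 items across 5 participants.
scores = [
    [2, 4, 3, 5, 4],
    [3, 4, 2, 5, 3],
    [2, 5, 3, 4, 4],
]
alpha = cronbach_alpha(scores)
```

Values closer to 1 indicate that the items vary together, which is the sense in which an index such as the CFI would be expected to behave as a coherent scale.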

Within the domain of brain injury research, CF is often treated as synonymous with task- or set-shifting.2 CF is therefore viewed as a specific skill, and has historically been examined using shifting paradigms (e.g. the Wisconsin Card Sorting Test3). The outcome of this convention is that CF has been operationalised by the tasks that are used to measure it.4 This circular reasoning has spawned neuropsychology’s own chicken-and-egg debate: which came first, the accepted construct of ‘CF’ or its associated tasks? It is also likely that the ability to shift task or set is just one piece of the cognitive flexibility puzzle,2 and that when individuals show a ‘flexible response’ in everyday life, away from ‘laboratory conditions’, an interaction of multiple cognitive subsystems (e.g. attentional mechanisms, perception, inhibition) occurs. In addition, task demands and contextual factors (e.g. an individual’s effort level and previous knowledge) will likely contribute. The use of and focus on individual test paradigms such as set or task shifting may therefore not be sufficient to capture the full range of a person’s CF.
 
Our current research aims to add two items of greater difficulty to each of the existing CFI subtests, and to add four further CF subtests that theoretically align with the current CFI. To achieve this, we will draw inspiration from a full range of existing neuropsychological tests purporting to measure ‘CF’ (the Wisconsin Card Sorting Test3, the D-KEFS Sorting Test5, the Hayling and Brixton Tests6, the Category Test of the Halstead-Reitan battery7, and the Rule Shift Cards and Modified Six Elements Tests of the BADS8). Our aim in doing this is to avoid using individual paradigms in isolation, instead combining a diverse range of paradigms to gain an overarching view of CF. By introducing multiple diverse subtests that aim to measure the same concept, we aim to observe which subtests covary with one another. Observed covariance within these subtests would suggest measurement of the same construct (possibly CF). We will also gauge participants’ effort levels, to ensure we are considering CF as a property of the cognitive system highly influenced by context. We will further look carefully at the validity of these neuropsychological tests: not only whether variable performance on the tests listed correlates with the nature and severity of brain injury, but also whether an individual’s scores on the listed tests correlate with relevant, real-world outcomes (e.g. social and occupational functioning).
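The covariance logic described above (subtests intended to tap the same construct should correlate with one another) can be sketched as a pairwise correlation matrix. The subtest names and scores below are hypothetical, and Pearson’s r is used purely for illustration, not as the analysis the study will necessarily adopt:

```python
# Hypothetical scores on three candidate CF subtests for 6 participants.
subtests = {
    "sorting": [10, 14, 9, 16, 12, 11],
    "rule_shift": [8, 13, 9, 15, 11, 10],
    "category": [12, 15, 10, 17, 13, 12],
}

def pearson_r(xs, ys):
    """Pearson correlation between two aligned score lists."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Pairwise correlations: consistently high values would be one line of
# evidence that the subtests measure a shared construct.
names = list(subtests)
corr = {(a, b): pearson_r(subtests[a], subtests[b])
        for i, a in enumerate(names) for b in names[i + 1:]}
```

In practice the study would inspect such a matrix (or a factor-analytic equivalent) across all new and existing CFI subtests, alongside effort and validity checks.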
 
Within our study, we will also ask participants to complete an online version of the Wisconsin Card Sorting Test (WCST). The WCST demonstrates a ceiling effect in non-clinical samples, which limits its reliability; however, it correlates well with certain functional outcomes in patient populations (e.g. returning to work). If participant performance on the WCST and the new/established CF subtests of the SPANS-X correlate, we may infer that the SPANS-X CFI might also predict useful outcomes.

Lucy McIvor is a trainee clinical psychologist at the Salomons Institute for Applied Psychology. Her published research explores the neuropsychological basis of emotional difficulties and the enactment of violence and self-harm. Her doctoral thesis will explore the construct of ‘cognitive flexibility’ and the validity and reliability of the SPANS-X Conceptual Flexibility Index. Data collection will commence in 2023 and completion is expected by 2024. 

  1. Burgess, G.H.  (2022). Short Parallel Assessments of Neuropsychological Status-Extended (SPANS-X). Oxford: Hogrefe. 
  2. Ionescu, T. (2012). Exploring the nature of cognitive flexibility. New Ideas in Psychology, 30(2), 190–200. doi.org/10.1016/j.newideapsych.2011.11.001
  3. Berg, E. A. (1948). A simple objective technique for measuring flexibility in thinking. The Journal of General Psychology, 39(1), 15-22. doi.org/10.1080/00221309.1948.9918159
  4. Dajani, D. R., & Uddin, L. Q. (2015). Demystifying cognitive flexibility: Implications for clinical and developmental neuroscience. Trends in Neurosciences, 38(9), 571-578. doi.org/10.1016/j.tins.2015.07.003
  5. Delis, D. C., Kaplan, E., & Kramer, J. H. (2001). Delis-Kaplan Executive Function System (D–KEFS) [Database record]. APA PsycTests. doi.org/10.1037/t15082-000
  6. Burgess, P. W., & Shallice, T. (1997). The Hayling and Brixton Tests. Thames Valley Test Company.
  7. Halstead, W., & Settlage, P. (1943). Grouping behavior of normal persons and persons with lesions of the brain. Archives of Neurology and Psychiatry, 49, 489–506. doi.org/10.1001/archneurpsyc.1943.02290160011001
  8. Wilson, B. A., Alderman, N., Burgess, P. W., Emslie, H. C. & Evans, J. J. (1996). The Behavioural Assessment of the Dysexecutive Syndrome. Thames Valley Test Company.