Computerized Cognitive Training with Older Adults: A Systematic Review

  • Alexandra M. Kueider ,

    akueider@jhsph.edu

    Affiliation Department of Mental Health, Johns Hopkins University Bloomberg School of Public Health, Baltimore, Maryland, United States of America

  • Jeanine M. Parisi,

    Affiliation Department of Mental Health, Johns Hopkins University Bloomberg School of Public Health, Baltimore, Maryland, United States of America

  • Alden L. Gross,

    Affiliation Hebrew SeniorLife, Harvard Medical School, Boston, Massachusetts, United States of America

  • George W. Rebok

    Affiliation Department of Mental Health, Johns Hopkins University Bloomberg School of Public Health, Baltimore, Maryland, United States of America

Abstract

We conducted a systematic review to examine the efficacy of computer-based cognitive interventions for cognitively healthy older adults. Studies were included if they met the following criteria: average sample age of at least 55 years at the time of training; participants did not have Alzheimer’s disease or mild cognitive impairment; and the study measured cognitive outcomes as a result of training. Theoretical articles, review articles, and book chapters that did not include original data were excluded. We identified 151 studies published between 1984 and 2011, of which 38 met inclusion criteria and were further classified into three groups by the type of computerized program used: classic cognitive training tasks, neuropsychological software, and video games. Reported pre-post training effect sizes for intervention groups ranged from 0.06 to 6.32 for classic cognitive training interventions, 0.19 to 7.14 for neuropsychological software interventions, and 0.09 to 1.70 for video game interventions. Most studies reported that older adults did not need to be technologically savvy in order to successfully complete or benefit from training. Overall, findings are comparable to or better than those from reviews of more traditional, paper-and-pencil cognitive training approaches, suggesting that computerized training is an effective, less labor-intensive alternative.

Introduction

Within 20 years, older adults will account for almost 25% of the U.S. population [1]. From a healthcare perspective, a major concern with an aging population is a higher prevalence of age-related impairment in cognitive function. This expanding aging population highlights the need to identify quick, effective, low-cost solutions to delay pathological cognitive decline associated with aging [2]. Developing interventions that can preserve cognitive function can also help to maintain quality of life and independence well into old age. With the help of new technology, novel cognitive training platforms, including computers and video games, can be readily disseminated to an older population.

Interest in training programs designed to improve cognitive abilities in older adults has been growing steadily in recent years. Ample evidence now suggests cognitive training interventions can improve cognitive performance in healthy older adults [3]–[6] and that these gains are robust up to five years after training [7]. A recent systematic review and meta-analysis reported that for healthy older adults, training improved performance in specific cognitive domains relative to control conditions [5].

Traditional cognitive training programs are delivered in individual or group format by a trained instructor, and differ primarily with regard to trained abilities (e.g., memory), length and frequency of training, and specific strategies practiced (e.g., method of loci for memory). Many traditional cognitive training programs require face-to-face contact (but see [8]), which entails identifying a convenient meeting location, coordinating schedules, and travel time. Further, traditional face-to-face training programs can be expensive. An hour of traditional cognitive training delivered by a bachelor’s-level trainer can cost $15, while an occupational therapist can charge up to $100 an hour [9]. These staff costs do not include the cost of equipment and materials. Given the importance of cognitive training for maintaining healthy cognitive function, cost-effective alternatives are needed. Computer-based cognitive interventions are a potentially cost-effective alternative to traditional training programs.

Additionally, not only can computer-based interventions be more cost effective, they can also be more easily disseminated, reaching special populations that would otherwise not receive such interventions. Older adults who are homebound or who live in an assisted living or nursing home facility and have limited access to transportation are difficult to recruit for traditional cognitive training programs. Computerized training programs could offer a more flexible, personalized alternative to traditional cognitive training programs, allowing for easier access and dissemination to persons with access to technology. In addition, computerized programs provide real-time performance feedback and can adjust to the user’s ability level, keeping the activity engaging and fun. Poor adherence can be a challenge with traditional cognitive training programs (e.g., [10]). Computer and video games are designed to be fun and exciting and may motivate older adults to stick with the training program.

In recent years the popularity of brain exercise products, currently a $300 million worldwide industry, has skyrocketed and is estimated to achieve between $2 and $8 billion in revenue by 2015 as the baby boomer generation continues to age [11]. The market is currently inundated with commercial brain exercise programs that claim to improve memory, attention, creativity, and delay Alzheimer’s disease and cognitive decline. However, few of these programs have been rigorously tested in empirical scientific studies with older adults, which is paramount to establish the efficacy of computerized training for aging individuals [10], [12].

Given the extensive body of research reporting that older adults can benefit from cognitive training interventions, and given the personal computing revolution, the present systematic review summarizes the last 25 years of research on computerized training to address the following two questions: (1) What types of computerized training programs have been used to influence cognitive outcome measures among cognitively normal, community-dwelling older adults? (2) What is the strength of evidence that computerized cognitive training interventions influence cognition in this population? These questions are critically important in an expanding area of public health research because computerized training programs can capitalize on the increasing prevalence of personal computers among older adults, and on the growing number of older adults, to improve cognitive function and delay cognitive decline in later life.

Methods

We conducted a review of studies on computerized training protocols for cognitively healthy older adults published or in press prior to July 2011. To identify relevant studies, we searched computerized databases (PsycARTICLES, PsycINFO, PubMed, Scopus) using combinations of the following key words for the population: aging, aged, elderly, old, older adult(s), and oldest-old; for cognitive function: cognitive, cognitive abilities, cognition, memory, psychomotor speed, and speed of processing; for interventions: action games, computer(s), computerized training, enhancement, interactive gaming, intervention, video games, virtual reality, and training. We also identified studies from reference lists in retrieved articles, unpublished dissertations, and conference abstracts. The supporting PRISMA checklist is available as supporting information (see Checklist S1).

To be included in the present review, the mean age of the study sample had to be at least 55 years at the time of training. Participants could not have a diagnosis of mild cognitive impairment or Alzheimer’s disease. Studies must have been published in English and have used a computerized approach targeted at any aspect of cognitive function. Computerized interventions included any study using an electronic game or task that involved participant interactions to produce visual feedback on a display device. Studies were excluded if the computerized training program sought only to evaluate the efficacy of the program itself (e.g., [13]), aimed to determine cognitive abilities that predicted success with training (e.g., [14]), or did not include cognitive outcome measures (e.g., [15], [16]); if results from younger adults could not be separated from those of older adults (e.g., [17]); or if the article reviewed previous findings in the literature (e.g., [18]). In addition, only outcome measures of cognition were collected from eligible studies; outcome measures of everyday functioning, quality of life, and mood were excluded from this review. When outcomes were measured over several follow-up periods (e.g., post-training, 3 months, 6 months), only immediate post-training data were used to calculate effect sizes.

To synthesize study findings, studies were grouped into one of three categories based on the type of computerized training participants received: classic cognitive training tasks; neuropsychological software; or video games. Classic cognitive training tasks train specific aspects of cognition (e.g., processing speed or memory) using guided practice on standardized tasks. Neuropsychological software programs (e.g., NeuroPsychological Training, Colorado Neuropsychological Test) are designed to enhance multiple cognitive domains using a variety of tasks, can provide instant performance feedback, and are mostly self-guided, allowing participants to progress through tasks at their own pace. Video games can include electronic or computerized games in which the player manipulates images on a screen to achieve a goal. Within each of the three types of computerized training programs, studies were further grouped and reviewed according to interventions that targeted a single cognitive domain or interventions that targeted multiple cognitive domains.

Effect Size Calculations

Effect sizes for treatment effects reported by studies included partial eta-squared (η²), with values closer to 1.0 indicating a stronger effect, and Cohen’s d. When not reported in a study, standardized Cohen’s d effect sizes were derived from the mean differences between scores on the post-training and pre-training cognitive outcome measures for the computer-trained and control groups. Effect sizes (Cohen’s d) were standardized by dividing the mean differences for the computer-trained and control groups by the pooled standard deviations of each cognitive outcome measure. For timed tests in which lower numbers indicate better performance, the direction of the association was reversed for ease of interpretation. Within the three categories of training, median effect sizes were calculated for each cognitive domain.
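For concreteness, the calculation described above can be written out as follows. The notation is ours (the review does not give an explicit formula), and it assumes the within-group pre-post formulation used for the intervention-group effect sizes summarized in the Abstract, with pre- and post-training variances weighted equally:

\[
d = \frac{\bar{X}_{\mathrm{post}} - \bar{X}_{\mathrm{pre}}}{SD_{\mathrm{pooled}}},
\qquad
SD_{\mathrm{pooled}} = \sqrt{\frac{SD_{\mathrm{pre}}^{2} + SD_{\mathrm{post}}^{2}}{2}},
\]

where \(\bar{X}_{\mathrm{pre}}\) and \(\bar{X}_{\mathrm{post}}\) are a group’s mean scores on a given cognitive outcome before and after training. The sign of d is reversed for timed tests so that positive values always indicate improvement, and the median of the resulting d values within each training category and cognitive domain yields the summary effect sizes reported below.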

Results

Initially, 151 computerized training studies published between 1984 and 2011 were deemed relevant to the current review. Each study was reviewed, and information pertaining to the study design, sample characteristics (e.g., age, cognitive status), cognitive outcomes, and the means and standard deviations of cognitive tests before and after training in the experimental and control groups was extracted.

Based on the stated inclusion and exclusion criteria, 38 of the 151 publications were eligible for the current review (Figure 1). Common reasons for exclusion included pre-existing memory impairments in the sample (e.g., [19], [20]), average age less than 55 years (e.g., [15]), and not measuring cognitive function (e.g., [21]–[23]).

Figure 1. Identification of studies in the systematic review.

https://doi.org/10.1371/journal.pone.0040588.g001

The 38 studies examined in this review used a total of 69 different cognitive measures encompassing global and domain-specific cognitive abilities. Tables 1, 2, and 3 summarize pertinent information from included studies in each of the three types of interventions. Examples of global cognitive measures included the Alzheimer’s Disease Assessment Scale-Cognitive Behavior section (ADAS-Cog), Repeatable Battery for the Assessment of Neuropsychological Status (RBANS), and the Wechsler Adult Intelligence Scale (WAIS). Domain-specific cognitive measures included memory (e.g., word list recall), executive functions (e.g., Trail Making Test Part B), information processing and psychomotor speed (e.g., Digit Symbol Substitution, Useful Field of View), and visual spatial functions.

Table 1. Classic Cognitive Training Studies Reported by Age Range, N, Intervention, Control, Duration, Significant Findings, and Effect Sizes.

https://doi.org/10.1371/journal.pone.0040588.t001

Table 2. Neuropsychological Software Studies Reported by Age Range, N, Intervention, Control, Duration, Significant Findings, and Effect Sizes.

https://doi.org/10.1371/journal.pone.0040588.t002

Table 3. Video Game Studies Reported by Design, Age Range, N, Intervention, Control, Duration, Significant Findings, and Effect Sizes.

https://doi.org/10.1371/journal.pone.0040588.t003

Classic Cognitive Training Tasks

Twenty-one studies used cognitive domain-specific programs including speed of processing, memory, attention, and perception in older adults 61 to 95 years of age (Table 1). Across all classic cognitive training studies the median effect size for each cognitive domain was 0.69 for reaction time, 1.30 for processing speed, 0.89 for working memory, 0.39 for executive function, 0.52 for memory, 0.39 for visual spatial abilities, and 0.57 for attention.

Duration.

Training lasted from two weeks to 24 weeks, with sessions held daily to three times weekly. Two studies required only that participants meet a specific time requirement for training, which ranged from 10 hours [24] to 12 hours [25]. Another study allowed participants to use their personal computers as much as they liked over the course of a year [26].

Reaction time (3 studies).

Reaction time is the amount of time needed to process and respond to a stimulus and is critical for handling information [27], [28]. Studies that implemented computerized balance training programs, in which participants received real-time visual postural feedback, reported conflicting results about the benefits of training for reaction time. Bisson et al. [29] and Lajoie [30] reported improved simple reaction times after training (biofeedback group: d = 0.69, virtual reality group: d = 0.22 [29]; and d = 1.17 [30]), whereas Hinman [31] found no improvement in simple reaction time in the intervention group compared to controls.

Processing speed (5 studies).

Processing speed is the ability to quickly process information. Results of five studies [9], [24], [32]–[34] suggested that speed of processing interventions, which varied in duration from two to 12 weeks, significantly improved processing speed scores (Useful Field of View), with effect sizes (Cohen’s d) ranging from 1.09 [32] to 2.19 [34]. Three studies [9], [32], [33] reported improvements on visual spatial abilities (Road Sign Test) with effect sizes ranging from 0.13 [32] to 0.51 [9], but no impact of training on executive function (Trail Making Test Part B) or psychomotor speed (Digit Symbol Substitution Test). Roenker et al. [34] reported improvements on choice reaction time (d = 0.68 [34]), while Vance et al. [24] reported larger improvements in the intervention group on a measure of visual sensory function and attention (d = 0.23) compared to controls.

Memory (5 studies).

Memory is the ability to retain, store, and recall information [35]. There are many different types of memory (e.g., recall, recognition, episodic, verbal, visual, and working memory) and various training strategies that can be used to enhance it (e.g., rehearsal, categorization, visualization, pegword, method of loci). Finkel and Yesavage [36] compared computer-assisted instruction with traditional classroom-based mnemonic training. Participants in both groups improved on memory (mean word list recall: intervention: d = 0.45; control: d = 0.39), and the differences between the two groups were not significant (p = .26). Findings suggest computer-assisted instruction, which is less labor intensive, may serve as a viable alternative to more traditional classroom-based training.

Two studies [37], [38] examined the effects of repetition lag training over a period of three weeks. Repetition lag training focuses on learning a list of words and involves a variety of retrieval and encoding processes responsible for memory recall and recognition. Although both studies reported multiple measures of verbal memory (California Verbal Learning Test [37]; shopping list memory, face-name association [38]) were not impacted by training, results in other cognitive domains varied.

Conflicting results were observed on processing speed and working memory measures. Jennings and colleagues [37] found a positive impact of training on processing speed (Digit Symbol Substitution Task: Recollection group: d = 1.30; Recognition group: d = 0.58), as well as working memory (Self-Ordered Pointing Task: d = 2.30; N-back task: effect sizes ranged from 1.19 to 3.92 for the Recollection group and 0.89 to 2.82 for the Recognition group), whereas Lustig and Flegal [38] reported no benefits in either processing speed (pattern comparison test) or working memory (Self-Ordered Pointing Task), but positive results on executive function (Trail Making Test Part B; Integrated Sentence group: d = 0.34, Strategy Choice group: d = 0.27).

Two studies, both 12 weeks in duration [39], [40], examined the effects of working memory training on older adults’ cognition and reported positive results on working memory tasks. Buschkuehl and colleagues [39] reported improved performance on two training-specific measures of working memory and reaction time. Additionally, improved performance transferred to non-trained working memory tasks (block span: d = 1.34; digit span: d = 0.29) and visual memory (visual free recall: d = 0.26). However, there was no benefit of training on a verbal memory measure (verbal free recall).

Similarly, Li et al. [40] reported positive results for participants who received spatial working memory training. Training significantly improved performance on practiced spatial working memory tasks (task accuracy: d = 0.88) and reaction time (d = 0.96) and resulted in positive transfer to working memory measures (spatial three-back, numerical-two back, and numerical three-back, all versions of the N-back task).

Executive function (3 studies).

Executive function encompasses a broad spectrum of abilities including planning, cognitive flexibility, and abstract thinking skills. Results of three studies which trained executive function over a span of three [41], [42] to five weeks [43] varied, and included improvements in reaction time [41], [42] and measures of executive function [43]. Dahlin et al. [43] reported participants in the intervention group showed improved performance after training on the recall of numbers, letters, colors, and spatial locations, and tasks requiring the continuous updating, categorization, and association of presented material, suggesting executive function abilities are modifiable in older adults following computerized training. On non-trained cognitive tasks, modest improvements were observed on measures of working memory (digit symbol: d = 0.35; digit span backwards: d = 0.60) and phonemic fluency (letter fluency: d = 0.37), and may suggest that executive function training has limited generalizability to other cognitive domains [43].

In two separate studies [41], [42], participation in dual task training to improve executive control skills, which provided continuous, individualized feedback, improved accuracy on the dual tasks and decreased reaction times in the variable and fixed priority intervention groups, with effect sizes ranging from 2.23 (dual-cross modality transfer task) to 6.32 (dual-within modality transfer task) [42]. There was no significant difference in improvement between the two intervention groups (η² = 0.01), suggesting that both were equally effective. Results from the two studies suggest that dual-task training reduced reaction time, task-set costs (η² = 0.60) [41], and dual-task costs (η² = 0.47) [41] relative to controls.

Attention (1 study).

Selective attention is the process by which an individual directs or focuses on specific auditory or visual stimuli in the environment. Modality-specific selective attention training (i.e., visual and auditory), in which participants were taught strategies to reduce the impact of sensory modalities on task performance, was used for eight weeks [44]. Training resulted in larger improvements on attention tasks (divided attention effect sizes ranged from 1.20 to 4.07; selective attention effect sizes ranged from 0.20 to 1.64), as indicated by decreased reaction time interference (effect sizes ranged from 0.52 to 0.57) and increased accuracy (effect sizes ranged from 0.06 to 0.48).

Training-specific improvements transferred to other cognitive domains as well. Working memory significantly improved after training in both the intervention (1-back portion of the N-back; d = 0.48; 2-back: d = 0.25) and control groups (d = 0.46). The effect of training on executive function was mixed: the intervention group improved more than the control group on a measure of executive function/processing speed (Symbol Digit Modalities Test: d = 0.34; Trail Making Test: d = 0.61), but there was no impact of training on executive function (Stroop Color Word Test) or verbal memory (Hopkins Verbal Learning Test).

Multiple cognitive domains trained (4 studies).

Based on ample evidence suggesting little transfer of cognitive training effects to untrained cognitive abilities [10], some investigators have trained multiple cognitive domains with a single intervention to better characterize transfer effects and generalizability of findings across domains. Four studies [25], [26], [45], [46] used interventions that targeted multiple cognitive domains including aspects of memory, executive function, visual spatial ability, and processing speed. Twelve hours of training with tasks that depended on attention and visual spatial ability significantly improved executive function (N-back: d = 4.12) and visual spatial abilities (tracking tasks: d = 2.71; selective attention task accuracy: d = 4.09) [25].

A 24-week computer course training complex cognitive tasks targeting multiple cognitive abilities improved performance on some measures of memory (Rivermead Behavioral Memory test immediate recall: d = 0.67 and delayed recall: d = 0.57; Free and Cued Selective Reminding Test (FCSRT) long delay recall: d = 0.52), and executive function (Trail Making Test part B: d = 0.27) [45]. No significant differences were observed for measures of immediate memory (FCSRT short delay), executive function (Stroop Color Word Test), or verbal fluency.

Ralls [46] administered a training intervention which focused on improving logical reasoning and spatial ability prior to a six-week computer course on the basics of computer use. Participants who received the intervention significantly improved on a measure of spatial orientation (paper folding test), compared to controls. However, results indicated no effect of training on measures of logical reasoning after controlling for pre-training cognitive ability.

One study assessed the impact of computer and internet use on older adults’ cognition [26]. Training consisted of a basic computer course in which participants were taught how to operate a computer and perform simple tasks (e.g., word processing), while the intervention equipped participants with a computer and Internet access for one year with no specific usage instructions. Results indicated participants who received both the Training and the Intervention (the intervention group) showed higher total scores on a measure of memory (Visual Verbal Learning Test: d = 0.52) and better flexibility scores on a measure of task switching (Concept Shifting Task: d = 0.08) over time compared to participants in the Training, No Intervention group. Participants in the Training, No Intervention group were faster on a measure of executive function (Stroop Color Word Test) compared to all the other groups, with small effect sizes ranging from 0.12 (No Contact Control) to 0.08 (No Treatment/No Intervention group). No combination of intervention or training impacted processing speed (Letter-Digit Substitution Test) or reaction time (Motor Choice Reaction Test).

Neuropsychological Software

The second body of research included nine studies that used neuropsychological software designed to test and enhance multiple domains of cognition in older adults aged 60 to 94 years (Table 2). Across all neuropsychological software studies the median effect size for each cognitive domain was 4.00 for processing speed, 0.45 for working memory, 0.39 for executive function, 0.56 for memory, 0.59 for visual spatial abilities, and 0.36 for attention.

Duration.

Training sessions lasted from three weeks to 12 weeks with sessions ranging from once weekly to five times weekly.

Memory (3 studies).

Two studies that used explicit and implicit memory tasks from the Colorado Neuropsychology Test (CNT) software for nine weeks to assess the effect of training on memory reported positive outcomes. Rasmusson and colleagues [47] reported participants who received training improved on a measure of episodic memory (Rivermead Behavioral Memory Test; d = 0.94) and verbal learning and memory (Hopkins Verbal Learning Test; d = 0.31). Rebok et al. [48] also used memory tests from the CNT software and reported positive results after training. Standardized scores on the CNT were used to show improvement across training. Improvements were larger for implicit memory tests (ΔZ = 2.08) compared to explicit memory tests (ΔZ = 0.83).

Blackford [49] used the self-paced Einstein Memory Trainer software to examine the effect of memory training via computerized instruction versus a traditional group format. The Einstein Memory Trainer program focused on names and faces, method of loci, peg words, important dates, and phone numbers, was self-paced, and gave performance feedback. The computerized intervention group used the Einstein Memory Trainer software, whereas the traditional classroom intervention group learned from a software manual. The computer control group used cognitive rehabilitation software designed to improve problem-solving and conceptual skills. On a measure of visual spatial ability (Wechsler Memory Scale Visual Reproduction test (WMS-VR)), the classroom intervention (d = 1.24), computer intervention (d = 0.19), and computer control (d = 0.59) groups improved more than no-contact controls. On a delayed measure of visual spatial ability (delayed WMS-VR), the classroom intervention improved more than the no-contact controls (d = 1.11). Training did not affect executive function (Trail Making Test parts A and B) or measures of memory (California Verbal Learning Test; Name-Face Association Test). There was no evidence to support the superiority of computer training over group-based training.

Multiple cognitive domains trained (6 studies).

Sweep Seeker, a stand-alone module in Posit Science InSight software packages, was used for five weeks to train visual perception and working memory in a lab or home-based setting [50]. Results suggested both interventions were equally effective at training visual perception and working memory (p = 0.36 for group differences). Training resulted in improved performance on trained perceptual tasks of medium (d = 0.85) and high difficulty (d = 0.88). On untrained tasks, perceptual discrimination improved significantly (d = 0.45) for trained participants, suggesting benefits of training transferred to untrained perceptual tasks.

Two studies implemented training protocols using a program designed by Posit Science for eight to ten weeks which focused on improving multiple cognitive abilities [4], [6]. In both studies, training improved measures of processing speed (d = 7.14 [4]; d = 0.87 [6]) and auditory memory and attention (RBANS Auditory Memory/Attention: d = 0.25 [4]; d = 0.23 [6]). Additionally, training improved several other areas including verbal memory (Rey Auditory Verbal Learning Test total word recall: d = 0.27 and delayed word recall: d = 0.20 [6]; forward word recognition span: d = 2.92 [4]) and working memory (digit span: d = 3.00 [4]; digit span backwards: d = 0.26; letter number sequencing: d = 0.23 [6]). However, training did not appear to affect episodic memory (Rivermead Behavioral Memory Test) [6].

The Integrated Cognitive Stimulation and Training Program (ICSTP) was designed to combine paper-and-pencil based training activities with two computer software programs, Sound Smart and Captain’s Log, to simultaneously train multiple cognitive abilities [51]. For non-impaired intervention group participants, performance improved on a measure of logical memory (Wechsler Memory Scale-Logical Memory total recall: d = 0.62) after training. No previous studies had integrated traditional paper-and-pencil based methods and computer technology in training.

The NeuroPsychological Training (NPT) program was designed to stimulate cognitive domains related to attention, language, memory, perception, and reasoning [52]. Post-training performance on practiced tasks (recognition figures list: d = 0.56 and face-name learning task: d = 0.43) was higher in the intervention group compared to the control group. Training effects were noted on a measure of memory (paired-associate recall test: d = 0.68) for participants in the training group, but not for wait-list controls. Both groups improved on a transfer task, which measured place-word learning, but gains were larger in the intervention group (d = 1.01) compared to controls (d = 0.36).

One study compared the effect of 12 weeks of training, using either the CogniFit Personal Coach, a personalized cognitive training software, or classic computer games (e.g., Tetris, snake, puzzles, Memory Simon, memory pairs) that significantly engaged cognitive processing [53]. Participants in the CogniFit Personal Coach group improved on all eight cognitive domains measured (focused attention: d = 0.63; sustained attention: d = 0.35; memory recognition: d = 0.50; memory recall: d = 0.48; visual spatial learning: d = 0.51; visual spatial working memory: d = 0.43; executive function: d = 0.42; mental flexibility: d = 0.39), while participants in the classic computer games group showed improvement in only four domains (focused attention: d = 0.29; sustained attention: d = 0.37; memory recognition: d = 0.33; mental flexibility: d = 0.27). Participants with lower baseline cognitive function benefited most from CogniFit training.

Video Games

Eight studies investigated the effects of video games as a means of improving the cognitive abilities of older adults aged 50 to 87 years (Table 3). Across all video game studies the median effect size for each cognitive domain was 0.77 for reaction time, 0.72 for processing speed, 0.25 for executive function, 0.21 for attention, and 0.69 for global cognition.

Unlike neuropsychological software, most video games (with the exception of Nintendo Wii’s Big Brain Academy) were not originally designed to improve various aspects of cognition and thus are less targeted towards a specific cognitive domain. Commercially available video games included Big Brain Academy, Rise of Nations, and Medal of Honor, while classic video/computer games included Pac-Man, Donkey Kong, Tetris, and Atari video games (e.g., Breakout, Crystal Castles, Galaxian, Frogger, Kaboom). One study used a combination of classic cognitive training tasks and video games [54].

Duration.

Training sessions lasted from two weeks to 11 weeks, with sessions ranging from twice weekly to five times weekly. Two studies only required participants to meet a specific time requirement for training which ranged from 2 hours [55] to 5 hours per week [56]. One study had no time requirements and allowed participants to play video games as long as they liked [57].

Processing speed (1 study).

Clark and colleagues [55] studied the effect of playing Pac-Man or Donkey Kong on processing speed for seven weeks. Results indicated that, at post-test, the mean reaction time for the intervention group was faster compared with controls on both compatible (responded to stimuli directly in front of their finger; d = 0.33) and incompatible (responded to stimuli opposite their finger; d = 0.56) tasks, and that this did not result from a speed-accuracy trade-off.

Attention (1 study).

To assess the impact of video games on visual attention, older adults were assigned to one of four conditions: UFOV (Useful Field of View) training, Medal of Honor video game, Tetris (a video game control), or a no-contact control group [54]. Medal of Honor, a first person shooter game, has been shown to improve a number of visual and attentional abilities in younger adults and was the main intervention under study, whereas Tetris was selected because previous studies reported little or no effect on visual attention performance of college students [58]. After training, the UFOV group improved significantly more on a processing speed measure (UFOV task) compared to the Medal of Honor (d = 0.73), Tetris (d = 0.72), and no-contact control (d = 0.98) groups, while the Medal of Honor group significantly improved on the processing speed measure compared to the Tetris (d = 0.72) and no-contact control (d = 0.31) groups.

Multiple cognitive domains trained (6 studies).

Six studies that trained older adults to play various video games (e.g., SuperTetris, Rise of Nations, Crystal Castles, Big Brain Academy) over a span of three to 11 weeks reported positive results in multiple cognitive domains (e.g., reaction time, multiple types of memory, executive function), but results varied significantly between studies.

Two independent studies [56], [59] reported improved reaction time (d = 0.97 [59]; d = 1.11 [56]) after using Nintendo Super Tetris for five weeks [56] or a variety of Atari games (Breakout, Galaxian, Frogger, Kaboom, Ms. Pacman, Pengo, and Qix) for 11 weeks [59]. Conflicting results were reported for a measure of executive function (Stroop Color Word Test). Nintendo Super Tetris [56] appeared to improve executive function abilities in both intervention and control groups (d = 0.37), while a variety of Atari games had no effect [59]. In addition, Dustman and colleagues [59] reported participants in the intervention and control groups improved on an executive function/processing speed measure (Symbol Digit Modalities Test: d = 0.25). The intervention did not affect psychomotor speed, verbal or visual memory (Benton Visual Retention Test), or visual motor tracking (Trail Making Test Part B).

In contrast to the previous study [59] that found no impact of video game training on a measure of executive function (Stroop Color Word Test), a more recent study reported positive results in executive control for participants who played Microsoft Game Studios’ Rise of Nations, a real-time strategy game thought to improve executive functioning, for four to five weeks when compared to no-contact controls [60]. After training, older adults significantly improved on tasks related to executive control (η² = 0.42), working memory (N-back: η² = 0.10), visual short-term memory (Visual Short Term Memory task: η² = 0.09), and reasoning abilities (Raven’s Advanced Matrices: η² = 0.11). Video game training also had a positive effect on task-switching (η² = 0.17), with performance peaking after 23.5 hours of training. No effect of training was seen on measures of visual spatial abilities (Functional Field of View Task, attentional blink task, operation span task) or reaction time.

Not only do video games improve specific cognitive domains for older adults, evidence suggests they can affect global cognitive functioning as well. Two studies that used a variety of video games for eight weeks reported improved global cognitive functioning [57], [61]. Atari’s Crystal Castles, an arcade video game, was hypothesized to improve perceptual motor skills and cognitive functioning of older adults [61]. After training, participants significantly improved on global measures of cognition (WAIS-R full scale IQ: d = 0.77, verbal: d = 0.39, and performance: d = 0.71 subtests) and psychomotor speed (d = 0.88; Rotary Pursuit: d = 0.61), whereas controls showed no improvements [61]. In a study by Torres [57], global cognitive performance improved for participants who played a variety of video games (QBeez, Super Granny 3, ZooKeeper, Penguin Push, Bricks, Pigyn). After training, participants showed less cognitive decline, as indicated by lower scores on a measure of global cognition (ADAS-Cog: d = 0.67), than both active and no-contact control groups.

One study used a Nintendo Wii video game specifically marketed for brain training, Big Brain Academy [62]. After four weeks, participants significantly improved on Wii tasks as illustrated by a large effect size (d = 1.70). Although participants showed significant improvement on Wii specific tasks, these positive effects did not transfer to measures of crystallized, fluid, or perceptual speed ability tests.

Discussion

This systematic review summarized the types of computerized training that have been studied in older adults and explored the evidence of training benefits among older adults. Based on this review, all three approaches to computerized training (classic cognitive training tasks, neuropsychological software, and video games) appear to hold promise for improving cognitive abilities in cognitively normal, community-dwelling older adults, who are at higher risk of cognitive decline as they age. Studies that used classic cognitive training and neuropsychological software had the most rigorous designs, with 57% (n = 12) of classic cognitive training and 89% (n = 8) of neuropsychological software studies using a randomized controlled trial design. In addition, studies using these two approaches had larger sample sizes relative to the video game studies (Table 4).

Table 4. Descriptive Statistics of Computerized Cognitive Training Studies.

https://doi.org/10.1371/journal.pone.0040588.t004

Effect sizes reported in this systematic review are comparable to or better than those reported in non-computerized cognitive training interventions. A meta-analysis of classic memory training interventions reported an average standardized pre-post training gain of 0.73 standard deviations [63]. A more recent meta-analysis, which analyzed the effect of memory training on specific memory abilities, reported effect sizes ranging from 0.06 (face-name delayed recall outcome measures) to 1.10 (short-term memory outcome measures) when comparing healthy older adults in treatment conditions to controls [5]. Given the similarity between computer-based and traditional cognitive training interventions, our findings justify pursuing computer-based interventions in the future.

Classic Cognitive Training Tasks

Based on the evidence reviewed, classic cognitive training interventions improved reaction time, processing speed, working memory, executive function, memory, visual spatial ability, and attention. For reaction time effect sizes ranged from 0.22 [29] to 1.17 [30] with a median effect size of 0.69; for processing speed effect sizes ranged from 0.54 [9] to 3.28 [34] with a median effect size of 1.30; for working memory effect sizes ranged from 0.25 [44] to 3.92 [37] with a median effect size of 0.89; for executive function effect sizes ranged from 0.08 [40] to 6.32 [42] with a median effect size of 0.39; for memory effect sizes ranged from 0.26 [39] to 0.67 [45] with a median effect size of 0.52; for visual spatial ability effect sizes ranged from 0.13 [32] to 4.09 [25] with a median effect size of 0.39; for attention effect sizes ranged from 0.20 to 4.07 [44] with a median effect size of 0.57. Together, these findings suggest the benefits of such computerized training programs are highly comparable to more traditional approaches.

While significance tests were not performed, working memory, executive function, and processing speed appear to be more amenable to change with classic cognitive training tasks. These domains had the largest effect sizes when compared with those of reaction time, memory, visual spatial abilities, and attention.

Neuropsychological Software

Although results varied according to the specific intervention, overall, neuropsychological software programs appear to positively impact cognitive performance. With the exception of Blackford [49] all reviewed studies found benefits of training on memory. Effect sizes ranged from 0.20 [6] to 2.92 [4] with a median effect size of 0.56. Visual spatial abilities improved across two studies with effect sizes ranging from 0.19 to 1.24 [49] with a median effect size of 0.59. Across four studies, measures of working memory improved after training with effect sizes ranging from 0.23 [6] to 3.00 [4] with a median effect size of 0.45. Processing speed effect sizes ranged from 0.87 [6] to 7.14 [4] with a median effect size of 4.0.

Overall, neuropsychological software appears to be least effective in the domains of attention and executive function, while the domains of memory and visual spatial ability appear more amenable to change with this approach.

Video Games

Based on the evidence reviewed, video games appear to be an effective means of enhancing reaction time, processing speed, executive function, and global cognition in older adults. Effect sizes for reaction time ranged from 0.33 [55] to 1.11 [56] with a median effect size of 0.77; for processing speed effect sizes ranged from 0.31 to 0.98 [54] with a median effect size of 0.72; for executive function effect sizes ranged from 0.11 to 0.42 [60] with a median effect size of 0.25; and for global cognition effect sizes ranged from 0.39 to 0.71 [61] with a median effect size of 0.69.

Video game training appeared to have the largest impact on measures of reaction time and processing speed as these cognitive domains had the largest effect sizes. Results were less consistent across studies on measures of executive function and memory and may be explained by the differences in the cognitive tests used to measure these abilities. It is also possible that video game interventions are not an effective means of changing executive function and memory in older adults.

Computer-based cognitive training programs offer several advantages over traditional cognitive training programs, including the ability to individualize training according to the individual’s needs and to reach home-bound or institutionalized older adults. Additionally, computerized programs could be a more cost-effective alternative that offers the possibility of more widespread dissemination among older adults. Because computerized interventions require less face-to-face training, administration costs could be significantly reduced. Computerized cognitive interventions also offer a self-paced individualized experience, allowing individuals to focus only on areas that need improvement. This individualized format also could benefit older adults who experience performance anxiety in a more traditional group-format intervention.

The results from individual studies suggest older adults do not need to be technologically savvy to benefit from training. Many of the older adult participants in the reviewed studies had no prior experience with the technologies (i.e., video games, computers) used in the intervention studies and yet they were still able to benefit from these novel approaches. Previous research has shown participants’ prior use of computers was not significantly associated with acquisition of computer skills during training sessions, suggesting older adults can benefit from novel technologies [64].

Despite the common misperception that older adults do not enjoy learning to use new technology, perceptions of the computerized training programs were positive among the older adults who completed training [65], [66]. Although many older adults reported anxiety about using unfamiliar technology at the beginning of training, most reported high levels of satisfaction after training was completed. Some older adults stated they could use their new video game skills to connect more with their grandchildren [57], whereas others were very willing to learn to use video games and believed they could be a positive form of mental exercise [54].

It is important to note that inconsistencies in findings across studies may be due to several factors not related to the training program itself, including different cognitive outcome measures and modifications of the training program. However, several limitations of this review need to be mentioned. First, the large variability in the types of training techniques used, as well as the length of protocols, makes it difficult to determine the optimal type and dose of computer-based interventions. Second, due to the wide variety of cognitive measures used, control variables in multivariable models, and training interventions, we were unable to conduct a traditional meta-analysis. Meta-analysis assumes effect estimates all have the same underlying meaning, which is violated in the present set of studies because of the wide variability in the type and length of training, as well as the cognitive outcome measures used to report results. Thus, estimated effects from each study are not equivalent and should not be combined using meta-analysis.

While it is possible that publication bias may lead to inflated effect sizes, every effort was made to locate and include results from unpublished studies. Three dissertations were included in the current review [46], [49], [54], as well as unpublished results presented at a conference [57]. Finally, studies that included older adults with mild cognitive impairment (MCI) were excluded from the current review. Because the current diagnostic criteria for MCI only became well known after 1999 [67], articles published prior to 2000 may have inadvertently included MCI participants. However, even though MCI criteria were not defined until 1999 [67], this group of individuals was well known and described in the literature as, among other terms, those with incipient dementia or isolated memory impairment [68]–[70].

Older adults are now the fastest-growing segment of Internet users [71]. According to a 2010 Pew Internet and American Life survey [72], 78% of adults aged 50–64 years and 42% of adults older than 65 years use the Internet. This is a sharp increase from 2000, when only 50% of adults aged 50–64 years and 15% of adults older than 65 years used the Internet [72]. As ownership of personal computers continues to grow and more older adults have access to the Internet [73], cognitive training programs need to take fuller advantage of these outlets to improve cognitive function and delay cognitive decline in later life.

While there is evidence that computerized cognitive interventions are beneficial for cognitively healthy, community-dwelling older adults, there is a need for future research. More well-designed randomized controlled trials with larger sample sizes are necessary to confirm these results. Computerized training may be a solitary, individual activity, and long-term adherence to such programs may be quite limited. Future studies should investigate this aspect of computer training. Furthermore, future studies should examine the efficacy and feasibility of web-based programs geared towards older adults.

Supporting Information

Author Contributions

Conceived and designed the experiments: AMK JMP ALG GWR. Analyzed the data: AMK. Wrote the paper: AMK JMP ALG GWR.

References

  1. U.S. Census Bureau (2011) Statistical abstract of the United States: 2011. Washington, D.C.: Government Printing Office.
  2. Brookmeyer R, Johnson E, Ziegler-Graham K, Arrighi HM (2007) Forecasting the global burden of Alzheimer’s disease. Alzheimers Dement 3(3): 186–191.
  3. Ball K, Berch DB, Helmer KF, Jobe JB, Leveck MD, et al. (2002) Effects of cognitive training interventions with older adults: A randomized controlled trial. JAMA 288(18): 2271–2281.
  4. Mahncke HW, Connor BB, Appelman J, Ahsanuddin ON, Hardy JL, et al. (2006) Memory enhancement in healthy older adults using a brain plasticity-based training program: A randomized, controlled study. Proc Natl Acad Sci U S A 103(33): 12523–12528.
  5. Martin M, Clare L, Altgassen AM, Cameron MH, Zehnder F (2011) Cognition-based interventions for healthy older people and people with mild cognitive impairment. Cochrane Database Syst Rev (1): CD006220.
  6. Smith GE, Housen P, Yaffe K, Ruff R, Kennison RF, et al. (2009) A cognitive training program based on principles of brain plasticity: Results from the Improvement in Memory with Plasticity-based Adaptive Cognitive Training (IMPACT) study. J Am Geriatr Soc 57(4): 594–603.
  7. Willis SL, Tennstedt SL, Marsiske M, Ball K, Elias J, et al. (2006) Long-term effects of cognitive training on everyday functional outcomes in older adults. JAMA 296(23): 2805–2814.
  8. Dunlosky J, Kubat-Silman A, Hertzog C (2003) Training monitoring skills improves older adults’ self-paced associative learning. Psychol Aging 18(2): 340–345.
  9. Wadley VG, Benz RL, Ball KK, Roenker DL, Edwards JD, et al. (2006) Development and evaluation of home-based speed-of-processing training for older adults. Arch Phys Med Rehabil 7(6): 757–763.
  10. Rebok GW, Carlson MC, Langbaum JB (2007) Training and maintaining memory abilities in healthy older adults: Traditional and novel approaches. J Gerontol B Psychol Sci Soc Sci 62 Spec No 1: 53–61.
  11. Fernandez A (2011) Transforming brain health with digital tools to access, enhance, and treat cognition across the lifespan: The state of the brain fitness market. Accessed 2011 Jul 1.
  12. George DR, Whitehouse PJ (2011) Marketplace of memory: What the brain fitness technology industry says about us and how we can do better. Gerontologist.
  13. Jamieson BA, Rogers WA (2000) Age-related effects of blocked and random practice schedules on learning a new technology. J Gerontol B Psychol Sci Soc Sci 55B(6): P343.
  14. Ownby RL, Czaja SJ, Loewenstein D, Rubert M (2008) Cognitive abilities that predict success in a computer-based training program. Gerontologist 48(2): 170–180.
  15. Czaja SJ, Sharit J (1993) Age differences in the performance of computer-based work. Psychol Aging 8(1): 59–67.
  16. Hollis-Sawyer L, Sterns HL (1999) A novel goal-oriented approach for training older adult computer novices: Beyond the effects of individual-difference factors. Educ Gerontol 25(7): 661–684.
  17. Kramer AF, Larish JF, Strayer DL (1995) Training for attentional control in dual task settings: A comparison of young and old adults. J Exp Psychol Appl 1(1): 50–76.
  18. Thompson G, Foth D (2005) Cognitive-training programs for older adults: What are they and can they enhance mental fitness? Educ Gerontol 31(8): 603–626.
  19. Barnes DE, Yaffe K, Belfor N, Jagust WJ, DeCarli C, et al. (2009) Computer-based cognitive training for mild cognitive impairment: Results from a pilot randomized, controlled trial. Alzheimer Dis Assoc Disord 23(3): 205–210.
  20. Weybright EH, Dattilo J, Rusch FR (2010) Effects of an interactive video game (Nintendo Wii) on older women with mild cognitive impairment. Ther Recreation J 44(4): 271–287.
  21. Shapira N, Barak A, Gal I (2007) Promoting older adults’ well-being through internet training and use. Aging Ment Health 11(5): 477–484.
  22. Sherer M (1996) The impact of using personal computers on the lives of nursing home residents. Phys Occup Ther Geriatr 14(2): 13–31.
  23. White H, McConnell E, Clipp E, Branch LG, Sloane R, et al. (2002) A randomized controlled trial of the psychosocial impact of providing internet training and access to older adults. Aging Ment Health 6(3): 213–221.
  24. Vance D, Dawson J, Wadley V, Edwards JD, Roenker D, et al. (2007) The Accelerate Study: The longitudinal effect of speed of processing training on cognitive performance of older adults. Rehabil Psychol 52(1): 89–96.
  25. Cassavaugh ND, Kramer AF (2009) Transfer of computer-based training to simulated driving in older adults. Appl Ergon 40(5): 943–952.
  26. Slegers K, van Boxtel MPJ, Jolles J (2009) The effects of computer training and internet usage on cognitive abilities of older adults: A randomized controlled study. In: Slegers K, van Boxtel MPJ, editors. Successful cognitive aging: The use of computers and the internet to support autonomy later in life. Maastricht, Netherlands: Neuropsychological Publishers. pp. 41–58.
  27. Botwinick J, Brinley JF, Birren JE (1957) Set in relation to age. J Gerontol 12: 300–305.
  28. Kramer AF, Madden DJ (2008) Attention. In: Craik FIM, Salthouse TA, editors. The handbook of aging and cognition (3rd ed.). New York, NY: Psychology Press. pp. 189–249.
  29. Bisson E, Contant B, Sveistrup H, Lajoie Y (2007) Functional balance and dual-task reaction times in older adults are improved by virtual reality and biofeedback training. Cyberpsychol Behav 10(1): 16–23.
  30. Lajoie Y (2004) Effect of computerized feedback postural training on posture and attentional demands in older adults. Aging Clin Exp Res 16(5): 363–368.
  31. Hinman MR (2002) Comparison of two short-term balance training programs for community-dwelling older adults. J Geriatr Phys Ther 25(3): 10–20.
  32. Edwards JD, Wadley VG, Meyers RS, Roenker DR, Cissell GM, et al. (2002) Transfer of a speed of processing intervention to near and far cognitive functions. Gerontology 48(5): 329–340.
  33. Edwards JD, Wadley VG, Vance DE, Wood K, Roenker DL, et al. (2005) The impact of speed of processing training on cognitive and everyday performance. Aging Ment Health 9(3): 262–271.
  34. Roenker DL, Cissell GM, Ball KK, Wadley VG, Edwards JD (2003) Speed-of-processing and driving simulator training result in improved driving performance. J Hum Fact Ergon Soc 45: 218–233.
  35. Baddeley AD, Hitch G (1974) Working memory. In: Bower GH, editor. The psychology of learning and motivation: Advances in research theory. New York, NY: Academic Press. pp. 47–89.
  36. Finkel SI, Yesavage JA (1989) Learning mnemonics: A preliminary evaluation of a computer-aided instruction package for the elderly. Exp Aging Res 15(3–4): 199–201.
  37. Jennings JM, Webster LM, Kleykamp BA, Dagenback D (2005) Recollection training and transfer effects in older adults: Successful use of a repetition-lag procedure. Aging Neuropsychol C 12(3): 278–298.
  38. Lustig C, Flegal KE (2008) Targeting latent function: Encouraging effective encoding for successful memory training and transfer. Psychol Aging 23(4): 754–764.
  39. Buschkuehl M, Jaeggi SM, Hutchison S, Perrig-Chiello P, Dapp C, et al. (2008) Impact of working memory training on memory performance in old-old adults. Psychol Aging 23(4): 743–753.
  40. Li S, Schmiedek F, Huxhold O, Rocke C, Smith J, et al. (2008) Working memory plasticity in old age: Practice gain, transfer, and maintenance. Psychol Aging 23(4): 731–742.
  41. Bherer L, Kramer AF, Peterson MS, Colcombe S, Erikson K, et al. (2005) Training effects on dual-task performance: Are there age-related differences in plasticity of attentional control? Psychol Aging 20(4): 695–709.
  42. Bherer L, Kramer AF, Peterson MS, Colcombe S, Erikson K, et al. (2008) Transfer effects in task-set cost and dual-task cost after dual-task training in older and younger adults: Further evidence for cognitive plasticity in attentional control in late adulthood. Exp Aging Res 34(3): 188–219.
  43. Dahlin E, Neely AS, Larsson A, Backman L, Nyberg L (2008) Transfer of learning after updating training mediated by the striatum. Science 320(5882): 1510–1512.
  44. Mozolic JL, Long AB, Morgan AR, Rawley-Payne M, Laurienti PJ (2011) A cognitive training intervention improves modality-specific attention in a randomized controlled trial of healthy older adults. Neurobiol Aging 32(4): 655–668.
  45. Klusmann V, Evers A, Schwarzer R, Schlattmann P, Reischies FM (2010) Complex mental and physical activity in older women and cognitive performance: A 6-month randomized controlled trial. J Gerontol A Biol Sci Med Sci 65(6): 680–688.
  46. Ralls RS (1998) Age and computer training performance: A test of training enhancement through cognitive practice [dissertation]. College Park, MD: University of Maryland College Park.
  47. Rasmusson DX, Rebok GW, Bylsma FW, Brandt J (1999) Effects of three types of memory training in normal elderly. Aging Neuropsychol C 6(1): 56–66.
  48. Rebok GW, Rasmusson DX, Brandt J (1996) Prospects for computerized memory training in normal elderly: Effects of practice on explicit and implicit memory tasks. Appl Cogn Psych 10(3): 211–223.
  49. Blackford RC (1990) Geriatric memory training: Computer versus group instruction [dissertation]. Pasadena, CA: Fuller Theological Seminary.
  50. Berry AS, Zanto TP, Clapp WC, Hardy JL, Delahunt PB, et al. (2010) The influence of perceptual training on working memory in older adults [electronic article]. PLoS ONE 5(7): 1–8.
  51. Eckroth-Bucher M, Siberski J (2009) Preserving cognition through an integrated cognitive stimulation and training program. Am J Alzheimers Dis Other Demen 24(3): 234–245.
  52. Bottiroli S, Cavallini E (2009) Can computer familiarity regulate the benefits of computer-based memory training in normal aging? A study with an Italian sample of older adults. Neuropsychol Dev Cogn B Aging Neuropsychol Cogn 16(4): 401–418.
  53. Peretz C, Korczyn AD, Shatil E, Aharonson V, Bimboim S, et al. (2011) Computer-based, personalized cognitive training versus classical computer games: A randomized double-blind prospective trial of cognitive stimulation. Neuroepidemiology 36(2): 91–99.
  54. Belchior PDC (2008) Cognitive training with video games to improve driving skills and driving safety among older adults [dissertation]. ProQuest Information & Learning.
  55. Clark JE, Lanphear AK, Riddick CC (1987) The effects of videogame playing on the response selection processing of elderly adults. J Gerontol 42(1): 82–85.
  56. Goldstein JH, Cajko L, Oosterbroek M, Michielsen M, van Houten O, et al. (1997) Video games and the elderly. Soc Behav Personal 25(4): 345–352.
  57. Torres A (2008) Cognitive effects of video games on older people. ICDVRAT 19: 191–198.
  58. Green CS, Bavelier D (2003) Action video game modifies visual selective attention. Nature 423(6939): 534–537.
  59. Dustman RE, Emmerson RY, Steinhaus LA, Shearer DE, Dustman TJ (1992) The effects of videogame playing on neuropsychological performance of elderly individuals. J Gerontol 47(3): P168–P171.
  60. Basak C, Boot WR, Voss MW, Kramer AF (2008) Can training in a real-time strategy video game attenuate cognitive decline in older adults? Psychol Aging 23(4): 765–777.
  61. Drew B, Waters J (1986) Video games: Utilization of a novel strategy to improve perceptual motor skills and cognitive functioning in the non-institutionalized elderly. Cogn Rehab 4(2): 26–31.
  62. Ackerman PL, Kanfer R, Calderwood C (2010) Use it or lose it? Wii brain exercise practice and reading for domain knowledge. Psychol Aging 25(4): 753–766.
  63. Verhaeghen P, Marcoen A, Goossens L (1992) Improving memory performance in the aged through mnemonic training: A meta-analytic study. Psychol Aging 7(2): 242–251.
  64. Saczynski JS, Rebok GW, Whitfield KE, Plude DJ (2004) Effectiveness of CD-ROM memory training as a function of within-session autonomy. Int J Cogn Technol 9(1): 25–33.
  65. Lee B, Chen Y, Hewitt L (2011) Age differences in constraints encountered by seniors in their use of computers and the internet. Comput Hum Behav 27(3): 1231–1237.
  66. Schmiedek F, Bauer C, Lovden M, Brose A, Lindenberger U (2010) Cognitive enrichment in old age: Web-based training programs. GeroPsych 23(2): 59–67.
  67. Petersen RC, Smith GE, Waring SC, Ivnik RJ, Tangalos EG, et al. (1999) Mild cognitive impairment: Clinical characterization and outcome. Arch Neurol 56: 303–308.
  68. Flicker C, Ferris SH, Reisberg B (1991) Mild cognitive impairment in the elderly: Predictors of dementia. Neurology 41: 1006–1009.
  69. Tierney MC, Szalai JP, Snow WG, Fisher RH, Chi H (1996) A prospective study of the clinical utility of ApoE genotype in the prediction of outcome in patients with memory impairment. Neurology 46: 149–154.
  70. Minoshima S, Giordani B, Berent S, Frey KA, Foster NL, et al. (1997) Metabolic reduction in the posterior cingulate cortex in very early Alzheimer’s disease. Ann Neurol 42: 85–94.
  71. Hart TA, Chaparro BS, Halcomb CG (2008) Evaluating websites for older adults: Adherence to ‘senior-friendly’ guidelines and end-user performance. Behav Info Technol 27(3): 191–199.
  72. Pew Internet and American Life Project (2010) Changes in internet use by age, 2000–2010. Accessed 2011 Jul 24.
  73. Gamberini L, Alcaniz M, Barresi G, Fabregat M, Ibanez F, et al. (2006) Cognition, technology and games for the elderly: An introduction to ELDERGAMES project. PsychNology J 4(3): 285–308.