This was a five-year project funded by the European Research Council, and it is the catalyst behind much of the work featured in the lab. The project had two main aims. The first was to further our understanding of how the information we pick up through our senses influences the way we time our movements; the second was to see whether we can use different types of technology to create dynamic patterns of sensory information (visual and auditory) that help people improve how they control their movements in a sporting context (e.g. putting a ball in golf) or a health-related one (e.g. walking in people with Parkinson's).
Controlling the timing of our movements to successfully perform an action (e.g. directing our eyes to read this text, drinking from a cup or catching a ball) is something we do without thinking twice. Yet even simple activities such as these raise interesting and challenging questions. How does the brain control the timing of these types of movement so that we do not knock over the cup we want to drink from or miss the ball we want to catch? How does the information we pick up through our senses (e.g. the location of the cup or the flight path of the ball) influence our decisions about when and how to act? How do we learn skilled movements that require precise timing, and what happens when the areas of the brain associated with timing actions break down (for example, in Parkinson's disease or stroke)? Can specially engineered patterns of sensory information be used to improve the timing of our movements in a health-related context (e.g. rehabilitation) or a sports-related one (e.g. learning a new skill)?
To answer these questions, the first part of the TEMPUS-G project involved carrying out a number of experiments to help us understand how the brain tunes into sensory information (visual and auditory) picked up from our interactions with the surrounding environment. To start with, we looked at the difference between moving to a beat and moving to a regularly changing continuous sound. We found that the additional sensory information provided by the continuous sound created a timing reference for movement and, as a result, improved the consistency of timing between movements. We have also shown that for a particular type of event, such as a ball rolling down a sand-papered ramp, we can control the timing of our actions using sound alone: the changing pattern of sound intensity provides information for the brain about when the ball will arrive at the catching point, and this information is then used by the brain to control the timing of the catching action.
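To give a feel for how a changing intensity pattern can specify arrival time, here is a minimal sketch (an illustration of the general idea, not the lab's actual analysis). It assumes a sound source approaching at a constant speed whose intensity falls off with the inverse square of distance; under those assumptions, the time remaining until arrival can be recovered from the current intensity and its rate of change. All numbers are hypothetical.

```python
import numpy as np

def time_to_arrival_from_intensity(intensity, dt):
    """Estimate time-to-arrival from a sampled intensity profile.

    Assumes constant closing speed and intensity proportional to
    1/distance**2, in which case time-to-arrival = 2 * I / (dI/dt).
    """
    dI = np.gradient(intensity, dt)
    return 2.0 * intensity / dI

# Simulate a source approaching the listening point at constant speed.
v = 2.0                        # closing speed in m/s (hypothetical)
r0 = 4.0                       # starting distance in m (hypothetical)
dt = 0.01                      # sample interval in s
t = np.arange(0.0, 1.5, dt)
r = r0 - v * t                 # distance remaining at each sample
I = 1.0 / r**2                 # inverse-square intensity

tau_est = time_to_arrival_from_intensity(I, dt)
tau_true = r / v               # actual time remaining until arrival

# The intensity-based estimate tracks the true time-to-arrival.
print(round(tau_est[50], 2), round(tau_true[50], 2))
```

The point of the sketch is that the listener never needs to know the distance or speed separately: the ratio of intensity to its rate of change is enough to time an interception.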
We have also looked at examples in sport, such as deciding whether we can get through a gap between two defenders in rugby (Watson et al., 2010) or stop a free kick that bends on its way to the goal in soccer (Dessing & Craig, 2010). To study these dynamic sporting actions, we further developed our state-of-the-art immersive, interactive virtual reality set-up so that players can move in the virtual environment as if they were in a real-life sporting scenario (see Technology We Use for more details).
Understanding what is important in the way these patterns of sensory information change over time has helped us to develop artificial patterns of sensory information that could be used to improve the timing of our actions. In collaboration with engineers, we have developed a visual guide using a programmable LED display that can be used in the context of learning a skill or speeding up movement in patients with Parkinson's disease. We are currently testing how well healthy adults and people with Parkinson's can use this guide. We have also developed sounds that represent the temporal framework within which a movement should be controlled. Again, these sounds are being used to improve the consistency of movement in both sporting and health contexts (e.g. improving gait in Parkinson's). Finally, one of the most exciting developments in the project has been exploring how movement-based games controllers can be used as a means of interacting with sensory guides developed in a virtual environment. We are looking into displaying patterns of visual and acoustic information to improve mobility in people with Parkinson's or who have had a stroke.