Kenneth I. Forster


Research

My interests center on the structure of the human language processor and the mechanisms by which we code and store linguistic expressions. The central question is whether the language processor can be subdivided into distinct, autonomous levels of processing corresponding to lexical, syntactic, and semantic analysis, or whether in practice these levels are so intermixed that no real division is possible. Typical research topics include the effects of sentence context on word perception and, more generally, the effects of semantic plausibility on sentence processing.

Another central interest is visual word recognition, and the nature of the retrieval mechanisms that give us such rapid, effortless, and accurate access to the properties of words. Do these mechanisms use an associative memory, as proposed in neural network approaches, or is there some kind of serial scanning mechanism that searches rapidly through the words in our mental lexicon? Much of our work here concerns the phenomenon of masked priming, in which orthographically related words prime each other even when the subject is completely unaware of the prime.

Click here for a discussion of masked priming.
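For readers unfamiliar with the paradigm, the following is a minimal sketch of the usual trial structure in plain Python: a forward mask, a very brief lowercase prime, and then an uppercase target requiring a lexical decision. The durations, the example items, and the console presentation are illustrative only (a console cannot genuinely mask the prime), and they should not be taken as the exact parameters used in our experiments.

    import time

    def show(frame, duration_ms):
        """Print one display frame and hold it for roughly the given duration."""
        print(f"{frame:^20}")
        time.sleep(duration_ms / 1000.0)

    def masked_priming_trial(prime, target):
        """One trial: forward mask, brief lowercase prime, uppercase target."""
        show("#" * len(target), 500)    # forward mask, roughly 500 ms
        show(prime.lower(), 50)         # brief prime, normally below awareness
        print(f"{target.upper():^20}")  # target remains until the response
        t0 = time.time()
        response = input("word or nonword? (w/n): ").strip().lower()
        rt_ms = (time.time() - t0) * 1000.0
        return response, rt_ms

    # A related prime versus an unrelated control prime for the same target.
    print(masked_priming_trial("contrast", "CONTRACT"))
    print(masked_priming_trial("emperor", "CONTRACT"))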

Another key interest is understanding how various types of ungrammaticality affect the task of sentence matching, where subjects have to decide whether two printed sequences of words are the same or not. Simple ungrammaticalities, such as agreement violations or word-order errors, have a strong effect on the time taken to perform this task, but more complex types of ungrammaticality apparently have no effect at all. Other sentence processing tasks, however, do not show this pattern. Solving this puzzle should provide some insight into the nature of the syntactic processor.
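
To make the task concrete, here is a small set of hypothetical sentence-matching items (invented for illustration, not actual experimental stimuli), together with the correct same/different response for each pair; the sketch is in plain Python.

    # Hypothetical sentence-matching items: the subject decides whether
    # the two lines are word-for-word identical.
    items = [
        ("the boys were late for school", "the boys were late for school"),   # grammatical, same
        ("the boys was late for school",  "the boys was late for school"),    # agreement violation, same
        ("the boys late were for school", "the boys late were for school"),   # word-order error, same
        ("the boys were late for school", "the girls were late for school"),  # different
    ]

    def correct_response(a, b):
        """The correct decision for a trial: 'same' or 'different'."""
        return "same" if a.split() == b.split() else "different"

    for top, bottom in items:
        print(f"{top:<32} | {bottom:<32} -> {correct_response(top, bottom)}")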

We are also working with the MAZE task, an alternative to self-paced word-by-word reading. This technique forces the incremental integration of each new word into the structure of the preceding context.

Click here for a demonstration of the MAZE technique.
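
The sketch below gives a rough sense of how a MAZE trial proceeds (plain Python, for illustration only; the sentence, the distractor words, and the display format are invented, not items from our experiments). At every position the reader must choose which of two alternatives continues the sentence, so each word has to be integrated as soon as it appears.

    import random
    import time

    # Hypothetical MAZE frames: each correct continuation is paired with an
    # alternative that cannot continue the sentence.
    frames = [
        ("The",     "x-x-x"),   # a neutral first frame
        ("old",     "ran"),
        ("man",     "into"),
        ("fell",    "table"),
        ("asleep.", "house."),
    ]

    def run_maze(frames):
        """Present each word pair and record the choice and decision time."""
        times = []
        for correct, foil in frames:
            pair = [correct, foil]
            random.shuffle(pair)   # randomise left/right placement
            left, right = pair
            t0 = time.time()
            choice = input(f"{left:>12}   |   {right:<12}  choose (l/r): ").strip().lower()
            times.append((correct, (time.time() - t0) * 1000.0))
            if (left if choice == "l" else right) != correct:
                print("Incorrect choice -- the trial ends here.")
                break
        return times   # per-word decision times, analogous to reading times

    print(run_maze(frames))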
