Cognition And Perception Are Separate After All?

By Neuroskeptic | July 21, 2015 4:28 pm

Can our beliefs, motivations and emotions influence our visual perception? Are cognition and perception ultimately inseparable?

A lot of recent psychological research says “yes” to both questions. For instance, it has been claimed that carrying a heavy backpack makes a hill look – not just feel – steeper. Other researchers say that feeling sad makes things seem darker – not just metaphorically, but literally.

However, according to a new paper by Yale psychologists Chaz Firestone & Brian J. Scholl, all of these claims for ‘top-down’ cognitive influence on perception are mistaken: Cognition does not affect perception: Evaluating the evidence for ‘top-down’ effects.

Firestone and Scholl begin by noting the recent buzz surrounding top-down influence:

The emergence of so many empirical papers reporting top-down effects of cognition on perception has shifted the broader consensus in cognitive science.

Indeed, such alleged top-down effects have led several authors to declare that the revolution in our understanding of perception has already occurred, proclaiming as dead not only a “modular” perspective on vision but often the very distinction between perception and cognition itself.

However, they say, while there have been over 160 published reports of top-down influence, all of these experiments suffer from one or more ‘pitfalls’ that call the results into serious question.


For instance, Firestone and Scholl identify ‘perception vs. judgement’ as one of the six pitfalls. As they put it,

Many alleged top-down effects on perception live near the border of perception and cognition, where it is not always obvious whether a given cognitive state affects what we see, or instead only our inferences or judgments made on the basis of what we see… Any time an experiment shifts perceptual reports, it is possible that this shift reflects changes in judgment, rather than perception.

They take as an example the claim that

throwing a heavy ball (rather than a light ball) at a target increases estimates of that target’s distance (Witt et al., 2004). One interpretation of this result (favored by the original authors) is that the increased throwing effort actually made the target look farther away… However, another possibility is that… after having such difficulty reaching the target with their throws, subjects may have simply concluded that the target must have been farther away than it looked.

Firestone and Scholl go on to cite a 2009 study which used the same ball-throwing task but in which the participants were asked more detailed questions. In this study it emerged that the participants were indeed basing their distance judgements on more than just pure visual perception.

This is a long paper, and in the course of it the authors discuss numerous other cases in which, they say, ‘top-down’ influence over perception has been wrongly inferred for various reasons.

However, could there still be hope for the idea of cognitive penetration of perception? Firestone and Scholl readily agree that attention can affect visual perception, making attended objects seem brighter, nearer, and larger than unattended ones. In fact, they say that mistaking ‘attentional effects’ for perceptual ones is one of the six pitfalls of this field of research.

Yet isn’t attention itself a top-down, cognitive process? The authors acknowledge this point. However, they say that while attention can come under ‘top-down’ control, e.g. when we decide to attend to something, our attention can be, and often is, drawn to things ‘automatically’; in other words, it can arise ‘bottom-up’, out of visual perception. They see attention as a special case, quite different from the other claimed kinds of top-down process.

Nonetheless, attention does seem to be a ‘chink in the armor’ of the otherwise ‘impenetrable’ perceptual module.

Firestone C, & Scholl BJ (2015). Cognition does not affect perception: Evaluating the evidence for ‘top-down’ effects. Behavioral and Brain Sciences, 1-77. PMID: 26189677

  • Marc Clint Dion

    I think there may be something to this idea. I’ve recently noticed that when I am thinking intensely while walking around, I lose the ability to see. I become almost completely unaware of my surroundings, and it’s as if my eyes cannot see what’s around me until my focus returns to what is happening around me. It could be that attention must actually be directed to the eyes for them to be visually effective, and that when we are distracted, we don’t notice that information from the eyes is not being processed because we are not paying attention to it. This raises the possibility that what we focus on can affect what we see.

  • SuchindranathAiyer

    They have always (since the 1970s) been treated as different in the behavioural sciences, but cognition proceeds through the filter of perception. This is what makes rational discussion based on facts almost impossible among human beings! Particularly in “Politics” and “Religion”.

  • Rolf Degen

    I wonder what the authors would have to say about this study:

    It suggests that conformity pressure influences not only subjects’ reports about their perceptions but even early, unconscious visual perceptual processing. That’s hard to believe, indeed.

    • Neuroskeptic

      Good point. I wonder, however, whether this EEG paper suffers from the problem described in “Systematic biases in early ERP and ERF components as a result of high-pass filtering”, which I blogged about here.

      The problem is that certain EEG filters can distort the signal in such a way that information is sent “back in time” so later peaks influence earlier ones. The authors say that this can explain some reported claims of top-down modulation of early EEG components.

      I don’t know if it applies in this case!
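      To make the “back in time” point concrete, here is a minimal sketch of why zero-phase (forward-backward) filtering is acausal: a late deflection leaks into the pre-stimulus baseline. The one-pole high-pass filter and its parameters below are illustrative assumptions, not the filters analysed in the cited paper.

```python
# Sketch: why zero-phase (forward-backward) filtering is acausal.
# The one-pole high-pass filter and its parameters are illustrative
# assumptions, not the filters from the cited EEG paper.

def highpass(x, alpha=0.995):
    """Causal one-pole high-pass filter: y[n] = alpha*(y[n-1] + x[n] - x[n-1])."""
    y = [0.0] * len(x)
    for n in range(1, len(x)):
        y[n] = alpha * (y[n - 1] + x[n] - x[n - 1])
    return y

def zero_phase_highpass(x, alpha=0.995):
    """'filtfilt'-style zero-phase filtering: filter forward, then backward."""
    forward = highpass(x, alpha)
    return highpass(forward[::-1], alpha)[::-1]

# Simulated trace: flat baseline (samples 0-499), then a late "component".
sig = [0.0] * 500 + [1.0] * 100 + [0.0] * 400

causal = highpass(sig)
acausal = zero_phase_highpass(sig)

# The causal filter leaves the zero baseline untouched...
print(max(abs(v) for v in causal[:500]))   # 0.0
# ...but the backward pass smears the late component into the baseline,
# so an "early effect" can appear that was never in the raw signal.
print(max(abs(v) for v in acausal[:500]))  # nonzero
```

      The same acausal smearing occurs with any forward-backward filter (e.g. SciPy’s `filtfilt`); it is the price paid for eliminating phase distortion.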

  • Nacho Sanguinetti

    I recently had to review some papers on this effort-induced distance overestimation. I think the work is very inconclusive in itself; the methodology seems sub-optimal for categorical demonstrations. I read a couple of critiques of these backpack experiments. One, by Firestone himself, is very strongly opinionated.

    I guess the point is that you really need a perfect experimental paradigm; otherwise you will get data on something else. As you mention, judgement, or some other cognitive process that intervenes in the subject’s behaviour but not necessarily in their perception. In some cases, artifacts of the demand characteristics of the experiment.

    For example, it was shown that subjects are more likely to overestimate distances and slopes when they are wearing a backpack if, post hoc, they mention the possibility that the experiment was about what effect the backpack would have. Also, if people were given an excuse for carrying the backpack (for example, fake muscle recordings), then they would not overestimate the distances any more.

    Who is being deceived? The experimental demands of wearing a backpack.
    Durgin FH, Baird JA, Greenburg M, Russell R, Shaughnessy K, Waymouth S.

    Does energy expenditure affect the perception of egocentric distance? A failure to replicate experiment 1 of Proffitt, Stefanucci, Banton, and Epstein (2003).
    Hutchison JJ, Loomis JM.

    The Various Perceptions of Distance: An Alternative View of How Effort Affects Distance Judgments
    Adam J. Woods, John W. Philbeck, and Jerome V. Danoff

    How “Paternalistic” Is Spatial Perception? Why Wearing a Heavy Backpack Doesn’t—and Couldn’t—Make Hills Look Steeper
    Chaz Firestone

  • Lyndon

    I have trouble parsing this because I do not think we have a standard concept of perception. If we define perception as some end state, say that which comes into our field of vision, then the interplay between cognition and perception seems to be an endless muddle. Think of “what you see” when you read a sentence. I am thinking of misreadings, of seeing words that were not there or not complete, of rereading and re-seeing a sentence, of reading very blurry text, and so on.

    Specifically, what exactly are we going to define as the structures of perception during the making-out of a very blurry word? Trying to separate the visual system from its interactions with the knowledge system in such cases seems a bizarre route to take. I am thinking of how we perceive/process/see all the tokens of the letter “@”, for instance.

    If we define perception in a narrow way it may give us limited insight into the post-perceptual processing that is performed. Under such a narrow conception we will have a stark difference between our concept of perception and what we are seeing, which would be strange to many. Taken to the extreme, if we define perception as purely the nervous activity in the retina, then it would of course be separate from what we usually call cognition. But that would obviously be far from what most people think of as perception.

    However, the wariness towards many of these social-psychology cognition-affects-perception examples may be warranted.

    • Nacho Sanguinetti

      I’m going to think out loud.

      I think reading is a very complicated example, but I still think it has separate perceptual aspects and cognitive aspects. Of course they might interact at some point, but I don’t think it’s as muddy.
      For example, if you read blurry text, the reading can be crystal clear, but you are still seeing blurry words. Your capacity to interpret those objects may not affect the blurriness you perceive.
      That is actually a good experiment: make humans compare words in terms of how blurry they are, and compare words to pseudowords. See if real words are perceived as sharper than pseudowords. I have a feeling that initially you will find an effect (words will be sharper), but it could possibly be explained by one of these pitfalls, like “the subject interpreting the experiment the way the experimenter wants it to”.

      Also, try reading words cut horizontally in half. You can read them, but you are not cognitively filling in the rest of the word for your visual system.

      In relation to what you say, that when you read you sometimes read words that were not there, or the opposite: I’m sure this happens all the time. But do you read them, or do you see them? It may have something to do with attention. You are probably perceiving the right words, but concentrating your resources on interpreting. (Of course, perception requires some level of interpreting: “this is a banana, this is a tree”, maybe even “this is the word banana and this is the word tree”.) At some point, trying to make semantic sense may funnel your attention elsewhere, creating these cognitive effects where you misread.

      • waltinseattle

        Very good explanation of how these concepts are a set of nested processes: how we perceive the reality of missing print, but also “perceive” the intended, predicted message. Ha. When the A.I. can figure out those distorted “prove you are not a bot” tests next to the post comment…

  • LorenAmelang

    I’m in the predictive coding camp. As Zoe Jenkin says of Lupyan:

    In “Cognitive penetrability of perception in the age of prediction: Predictive systems are penetrable systems”, Lupyan makes the case that information stored anywhere in the mind can bear on the contents of perception, depending on how relevant that information is to a particular task. If this picture is right, then the implications for cognitive architecture are vast. It would imply not only that cognitive states can on occasion alter perceptual processing, but that they always do, because the contents of perception are generated based on these system-wide priors. On this picture, there is no proprietary store of information that is distinctive of perception, or of any particular sensory modality. The same body of information can bear on the formation of perceptual and cognitive states. Since many ways of drawing the distinction between perception and cognition invoke precisely this kind of difference, Lupyan’s conclusion would entail that no such difference between perception and cognition can be found.

    Seems to me what Firestone & Scholl have shown is that it is dangerous to work with “stores of information” to which subjects have conscious access. Of course it is infinitely harder to design experiments around subconscious knowledge that neither participants nor our culture yet have words for, but as we discover ever more sophisticated steps along the grid/place/direction/speed cell continuum, I’m convinced we will find objective perceptual space and body map cells that govern both perception and cognition. And we will find that your perception is dramatically different from mine, despite our (as yet) having no words to describe the differences.

    • Neuroskeptic

      Thanks for the comment, very interesting.

      But couldn’t predictive coding apply within a modular framework? Couldn’t the visual system have a set of predictions (say) which are segregated from the predictions within other faculties?

      • LorenAmelang

        As I said in my reply to Nacho Sanguinetti, visual perception _could_ have its own impenetrable set of predictions. It is my own direct experience that has convinced me it does not. In our world of cameras and video, we tend to assume everyone perceives the same objective reality, we all know what it is “supposed to” look like. We have no words for describing any other shapes for it. But when your perceptual pathway can’t produce that expected result you face grim choices, and for me, keeping my perception and cognition in the same experiential space was essential. The alternative was insanity. Thankfully I found the perception maps are quite accessible to cognition, though we haven’t yet evolved language to share that experience.

    • Nacho Sanguinetti

      What do you mean by this? Could you explain please.

      “I’m convinced we will find objective perceptual space and body map cells that govern both perception and cognition.”

      • LorenAmelang

        Perception that is not correlated to body movement in 3D space is dangerously ambiguous, as is easily demonstrated by familiar illusions. Perhaps you could say that each instantaneous “frame” of de novo visual perception is impenetrable, but as soon as you have perceived the same scene from two different vantage points, some part of your brain is comparing, analyzing, interpreting, and storing the 3D structure of the scene as part of your “place” memory. Once you’ve “seen” a place from multiple vantage points, you can never go back to the impenetrable de novo state.

        One could argue that this space map (which I call the Psychoros, ‘The dancing-ground of the soul’) is unique to perception and impenetrable to cognition. But my experience tells me it is not. Mine was traumatically broken early in my childhood, and I’ve come to recognize how I adapted. As I move through the world, there are “attention points” I compulsively look at. My perceptual systems use them as gateways between segments of 3D space that don’t otherwise connect. While I’m cognitively locked onto the gateway image, my perceptual parameters get reconfigured – sometimes dramatically – in ways similar to what happens when we adapt to lenses or prisms imposed in our visual path. We’re already dangerously past the set of concepts most people use to think about space, so I’ll stop there…

        But my direct experience tells me that my sense of body size and shape, posture and physical stress, movement speed, visual perspective, and even emotional state are all intimately based on this sense of place. We already know about grid cells, place cells and speed cells. If we had shared concepts for their effects on perception, we could probably discover how they work together to create what we think is the real world.

        • Morbeau

          Fascinating, thank you. My brain does something similar – when it’s calm and unattached to a worry, it reliably starts to play back segments of 3D memories of places. It’s fond of Ireland and Mount St. Helen’s apparently (and lots of others, but those ones are prominent for no reason).

  • mambo_bab

    Even if cognition differs from perception only quantitatively, there should still be some effects, perhaps in the form of exaggeration or undersizing. Though there are some differences between cognition and perception, I think the distinction is not intrinsic.

  • tim faber

    I am confused by their perception vs. judgment distinction. “We can directly see that an object is red, and also conclude or infer that an object is red.” Do they draw the line when a finished visual ‘percept’ is sent to the prefrontal cortex for ‘cognitive’ processing?

    • waltinseattle

      I am reminded of how a horse in a field of wind-blown grass does not wait to perceive a predator lurking: every little noise or signal above background is definitely the beast. White or blue plastic bags are great “triggers” this way, particularly on a windy day, not so much at other times. Again, I will vote for the top-down view, but laugh at the neuroadvertising hopes of the social experimenters.

  • smut clyde

    Then this paper came out in Psychological Science, as if to demonstrate Firestone and Scholl’s point that the pendulum has swung so far towards the top-down-perception paradigm that grotesquely wrong studies are being accepted into print because they fit so neatly into the reviewers’ expectations:

    “Sadness Impairs Color Perception” (C. A. Thorstenson, A. D. Pazda, A. J. Elliot)
    (thoroughly demolished within days at PubPeer and by Andrew Gelman).

    • waltinseattle

      My take is that the top-down aspects are not mere artifacts of the experiment but are part of the complex polyspace of the subject. I may or may not feel the uphill climb differently, but if I am primed by my own everloving fears to see lions, tigers and spacemen after me, well, the hill is flat! This is top-down, it is real for my experience, and the lab guy can’t touch that. Overvalenced (internal) data are a more nuanced complex than the social “science” experiments want to deal with, perhaps.

  • bowlweevils

    I find it nearly impossible to evaluate the comments because there is no indication of whether the commenter has actually read the Firestone & Scholl article or not, or has read the article and the responses, or is just responding based on the content of the blog post.

    While the blog post is interesting, any comment attempting to evaluate the argument put forth by Firestone & Scholl without actually reading the article with the evidence they supply is pretty much worthless.

    Furthermore, many of the comments fail at a basic level of approaching the subject matter that is covered in every Intro to Psych course: this is not something that can be resolved by introspection. What you think about your experiences is not up to the task. We are dealing with complex experiments that provide data intended to address phenomena that occur in a time frame of less than 100ms.

    Your thoughts about what your brain is doing at this level of precision and resolution are as relevant as your thoughts about how particular enzymes in your stomach are catalyzing reactions among the chemical components of a meal you ate, or the electro-dynamic alterations of the calcium and potassium channels in a neural branch as you flex a muscle of your big toe. You simply cannot do it.

    Your thoughts may lead to methods of gathering relevant data using appropriate methods, but your thoughts are not in and of themselves valid evidence.




About Neuroskeptic

Neuroskeptic is a British neuroscientist who takes a skeptical look at his own field, and beyond. His blog offers a look at the latest developments in neuroscience, psychiatry and psychology through a critical lens.

