By: Daniel Tippens
Let’s understand our sensory systems in a common-sense way, as picking out the five senses we are familiar with: vision, touch, audition, and so on. A sensory system, as understood in this paper, is characterized by certain sense organs, specific brain areas dedicated to processing the information those sense organs receive, and a certain kind of perceptual experience which results from that processing. The eyes are the sense organs for vision, brain areas such as V1-V4 are dedicated to processing visual information, and visual experiences have a certain kind of feel to them which is different from audition and touch. The way things look, experientially, isn’t the same as the way things sound.
In the 1980s, Jerry Fodor published his landmark book Modularity of Mind. There he argues that our sensory systems are “modular.” Here is a first-pass understanding of a module. A module is something that is functionally independent; it is a functional part of a larger system which can be knocked out or removed without destroying the rest of the system. Suppose your TV has speakers plugged into it so that you can have sound. If you were to unplug and remove the speakers, the rest of the TV would still work. You could, for instance, still watch a movie with subtitles. The speakers would count as a module in this case.
Similarly, our senses can be knocked out without the rest of our minds being destroyed. Someone can get hit in the head and lose sight while still being fully capable of thinking clearly and smelling their coffee. So our sensory systems seem to be modular in this way.
Now, Fodor does think that our senses are modules in that they are functionally independent, but he understands the modularity of our senses in an even deeper way. In fact, he lists about nine properties which are present, in varying degrees, in our sensory systems and which make them modular.
For example, our senses operate quickly and automatically, insofar as they generate experiences very fast and outside of our conscious control. When we open our eyes, we almost immediately experience our surroundings, and without our consent. Our senses also respond to specific sorts of inputs. Vision responds to light waves, audition to sound waves, touch to pressure, etc.
There are plenty of other things Fodor thinks make our senses “modular,” on his understanding of the term. But there is one property in particular that I wish to discuss and employ in this paper, and that is the property of informational encapsulation. This property, I think, can help to explain the relationship between two of our senses, vision and flavour.
A system is informationally encapsulated (or “encapsulated” from here on out) when information from outside of the system cannot be accessed or used in the system’s information processing procedures. Consider the steps that precede visual experience:
First: Our eyes receive inputs.
Second: Brain regions dedicated to vision, such as V1-V4, process those inputs.
Third: An output is produced.
Fourth: We have a visual experience.
If vision were encapsulated, no information from outside the visual system would affect the way the second step occurs. The easiest way to understand this is to consider the famous Müller-Lyer illusion, shown below.
In this illusion, the two parallel lines are actually equal in length. However, due to the different directions of the flanking arrows, we visually experience the lines as being different lengths. The bottom line looks shorter than the top line.
Now, I have told you that these lines are the same length, but if you are still skeptical, go ahead and pull out a measuring device and prove it for yourself. Once you have done this, let’s assume you really do believe me that the lines are the same length. Despite believing that the lines are equal in length, you still can’t help but see them as being different lengths.
What this shows, for Fodor, is that information from your beliefs does not get accessed by your visual system when vision is generating your experience, i.e., performing step two. If vision could access your beliefs, then we should expect that upon believing that the lines are of equal length, you would see them as equal in length. Since that does not happen, Fodor holds that vision is encapsulated from beliefs.
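The four processing steps above, and Fodor’s encapsulation claim, can be put in the form of a toy model (my illustration, not anything Fodor himself offers). The idea is just that an encapsulated module’s processing step takes only its proprietary sensory input, so a belief held elsewhere in the mind has no channel into step two; a hypothetical penetrable module, by contrast, could let the belief override the illusion. The function names and string labels are, of course, made up:

```python
# Toy model only: not a claim about real neural processing.

def encapsulated_vision(retinal_input):
    """Step two consults nothing but the sensory input itself."""
    if retinal_input == "muller-lyer lines":
        return "lines look unequal"  # the illusion always wins
    return "veridical percept"

def penetrable_vision(retinal_input, beliefs):
    """A hypothetical penetrable system: beliefs can reach step two."""
    if retinal_input == "muller-lyer lines":
        if "the lines are equal" in beliefs:
            return "lines look equal"  # belief overrides the illusion
        return "lines look unequal"
    return "veridical percept"

beliefs = {"the lines are equal"}
# Encapsulation predicts the belief makes no difference to the percept:
print(encapsulated_vision("muller-lyer lines"))         # lines look unequal
print(penetrable_vision("muller-lyer lines", beliefs))  # lines look equal
```

On this sketch, the Müller-Lyer result is evidence that human vision behaves like the first function, not the second.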
For Fodor, vision and the other senses are encapsulated, at least to some notable degree, not only from beliefs but from all other cognitive states as well, such as desires and intentions. Even if I want the lines to be of equal length, that doesn’t change my visual experience of them. Call the systems which contain and use things like beliefs, desires, and intentions cognitive systems, and say that all of these cognitive systems together make up what we will call cognition. If there are cases where cognition can influence sensory systems such as vision, call these cases of cognitive penetration. Sensual undertones of the term aside, if a sensory system is cognitively penetrable, then it is *not* encapsulated relative to cognition, since information from within cognition is being accessed by that system.
Encapsulation is also a relative property, in that something can be encapsulated relative to some things and not others. Even if we agree with Fodor that the visual system is encapsulated relative to cognition, it could still be the case that our visual system is not encapsulated relative to other systems, such as the other senses. Indeed, cross-modal perception research has shown this to be the case.
Consider the famous sound-induced flash illusion, which you can experience by checking out the YouTube video linked in the endnotes. Subjects sit in front of a computer screen. In one situation, experimenters present the subjects with two visual flashes in rapid succession. Each flash is accompanied simultaneously by a loud *beep*. In the test situation, subjects are presented with only one flash and two beeps. Here, subjects report seeing a second flash accompanying the second beep, despite the fact that no second flash was presented. The beep that subjects hear causes them to see another, illusory, flash. Vision has accessed auditory information. When one sensory system influences another in this way, call this sensory penetration. In the sound-induced flash illusion, audition sensorily penetrates vision, so vision is not encapsulated relative to audition.
While the jury is still out on whether cognitive states like beliefs and desires can influence sensory systems, cross-modal perception research has pretty much shown that our sensory systems are not encapsulated relative to one another. Experiments much like the sound-induced flash illusion seem to show that all of our sensory systems regularly access information from one another when generating the perceptual experiences that we enjoy. This might lead one to think that the notion of a sensory system being encapsulated relative to other sensory systems is quite useless.
Against this, I wish to suggest that the notion of encapsulation is still useful for understanding the relationship between two sensory systems: flavour and vision. I argue that vision is informationally encapsulated from flavour. Though it is possible that vision influences flavour perception, i.e., that vision sensorily penetrates flavour, the visual system is not influenced by the flavour system. What we see is not influenced by what we taste. I then suggest a couple of reasons for this. The first is that these two systems occupy a very unique anatomical relationship: they do not have overlapping receptive fields (we will define these, and see why this matters, in part II of this piece). The second is an introspective argument that vision does not seem to be influenced by flavour. Taken together, the inference to the best explanation is that the visual system is encapsulated relative to the flavour system.
Informational encapsulation and sensory penetration – what do I mean by these?
Before moving on, it is worth reviewing a bit of what we have discussed so far. We have seen that encapsulation is the idea that a system cannot access information from other systems in order to process information and generate its outputs. Encapsulation is also to be understood as a relative term; a system could be encapsulated relative to some systems, such as cognitive ones, but not others, such as sensory ones.
For more terminological recap, a sensory system is cognitively penetrable if it can be influenced by cognition, in which case the sensory system is *not* encapsulated relative to cognition. A sensory system is sensorily penetrable when it can be influenced by another, or many, sense modalities, in which case the penetrated system is *not* encapsulated relative to at least one other sense modality. I will say that a system is penetrable if it is influenced by either cognition or sensory systems.
I intend to argue that vision is informationally encapsulated from flavour. This entails that flavour does not sensorily penetrate vision. Before I argue this, it is important that I outline an even more specific understanding of “encapsulation” than what is currently on the table, and also elaborate on what would count as an instance of sensory penetration. But first, let’s start with “encapsulation.” For me, this should be understood as encapsulation at the level of experience, and not necessarily at the level of processing. Let me explain.
Consider an ad-blocker on your computer. The ad-blocker regularly receives information from pop-ups, but discards it before you can see it on your screen. Analogously, perhaps the visual system accesses information from cognition, but discards or suppresses it prior to experience. In this way, it could be that vision is encapsulated at the level of experience, but not at the level of processing. Another way to put this would be to say that perhaps there can be cognitive penetration of visual processing, but not cognitive penetration of visual experience. In the Müller-Lyer illusion, it could be that the visual system does in fact access information from my beliefs, but discards, or suppresses, that information, such that I always experience the lines as being different lengths. Jesse Prinz, for example, suggests this kind of thing as an explanation for the Müller-Lyer illusion.
So if a system is penetrable at the level of processing, it doesn’t follow that it is penetrable at the level of experience, as we have just seen. However, if a system is penetrable at the level of experience, it follows that it is penetrable at the level of processing. This is simply because any change in experience, it is assumed, is a result of a change in processing. As such, experiential penetration entails processing penetration. As mentioned earlier, when I argue that vision is encapsulated from flavour, I am saying that the encapsulation is at the level of experience.
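The processing/experience distinction can be made concrete with a small sketch in the same toy-model spirit (again my illustration, not Prinz’s actual account): the module reads the outside information during processing, so there is penetration at the level of processing, but drops it before the output stage, so the experience is untouched.

```python
# Toy sketch of the ad-blocker idea: outside information is accessed
# during processing but suppressed before the experiential output.

def visual_module(retinal_input, beliefs):
    # Processing-level access: the belief information really is read in...
    accessed = set(beliefs)
    # ...but it is discarded before the output that fixes experience,
    # so the illusory percept survives regardless of what is believed.
    experience = "lines look unequal"
    return experience, accessed

experience, accessed = visual_module("muller-lyer lines", {"the lines are equal"})
assert "the lines are equal" in accessed   # penetration at the level of processing
assert experience == "lines look unequal"  # no penetration at the level of experience
```

Note that the reverse situation is impossible on this picture: a change in experience with no change in processing has nowhere to come from, which is just the entailment claimed above.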
Now, we should be clear on what does, and does not, count as an instance of sensory penetration, as this will be important later on. Roughly, a case of sensory penetration is one in which the experience associated with one sense, e.g., vision, is directly causally dependent upon information within another sense, say audition, and is not the result of a quirk. To illustrate the direct causal dependence criterion, consider the following.
Let’s say that you are playing baseball and you hear a fire truck passing by behind you in the street. You turn to look at the fire truck. In this case, auditory information has caused you to turn your head, which changed your visual experience. Although your visual experience of the fire truck was causally dependent on auditory information, it was not directly causally dependent, in the sense that the change in your visual experience was mediated by your turning your head. In this case, your visual experience was indirectly causally dependent on audition.
To understand this direct causal relation a bit better, let’s take a look at the ventriloquism illusion. In this illusion, one’s eyes and attention remain fixed on the ventriloquist’s puppet while one hears the ventriloquist uttering speech sounds. Vision registers the location of the speech as the puppet’s mouth (since it is moving), while audition registers the location of the speech as the ventriloquist’s mouth. What happens in this illusion is that you hear the speech as coming from the puppet’s mouth. Vision has sensorily penetrated audition. Your eyes, ears, and attention remained fixed, but visual information directly influenced where you heard the sound as coming from, in a way unmediated by things like turning one’s head.
I also said that sensory penetration cannot be the result of a quirk. This is intended to rule out cases of synesthesia. There are many kinds of synesthesia, but really only one worth noting here: intermodal synesthesia. Intermodal synesthesia is when stimulation of one sense is sufficient to generate an experience in a different sense. For example, seeing the color red could cause a synesthete to experience the taste of coffee. The flavour experience of coffee, here, seems to be directly causally dependent upon the information within vision, which would seem to count it as a case of sensory penetration.
However, synesthesia is widely believed to be a quirk. Estimates of its incidence vary, ranging from 1 in 20 to 1 in 2000. Whichever statistic is right, synesthesia is rare compared to sensory penetration cases such as the ventriloquism effect and the sound-induced flash illusion, which occur across the board in the population.
Synesthesia, then, is the result of quirky wiring in the brain. Since I am concerned with how the ordinary human brain works, I will set aside cases of synesthesia, not including them as cases of sensory penetration. In the next post, we will discuss why I think that vision is encapsulated from flavour.
Daniel Tippens is co-founder of The Electric Agora. He is also a research technician in the S. Arthur Localio Laboratory at New York University School of Medicine.
 Fodor, Jerry A. (1983). Modularity of Mind: An Essay on Faculty Psychology. Cambridge, Mass.: MIT Press. ISBN 0-262-56025-9.
 Feel free to take a look at the illusion here: https://www.youtube.com/watch?v=D3Z1cxA2Tp0
 Firestone, Chaz; Scholl, Brian, “‘Top-Down’ Effects Where None Should Be Found: The El Greco Fallacy in Perception Research,” Psychological Science, 2014.
 Stokes, Dustin, “Cognitive Penetrability of Perception,” Philosophy Compass, 8 (7): 646-663, 2013.
 Macpherson, Fiona, Cross modal experience, Proceedings of the Aristotelian Society 111 (3pt3):429-468 (2011).
 Prinz, Jesse, “Is the Mind Really Modular?” in Contemporary Debates in Cognitive Science, Blackwell Publishing, 2006.
 Bertelson, P. (1999). Ventriloquism: A case of cross-modal perceptual grouping. In Aschersleben, G., Bachmann, T., and Müsseler, J., editors, Cognitive Contributions to the Perception of Spatial and Temporal Events, pages 347–362. Elsevier, Amsterdam.
 Vroomen, J., Bertelson, P., and de Gelder, B. (2001). Auditory-visual spatial interactions: Automatic versus intentional components. In de Gelder, B., de Haan, E., and Heywood, C., editors, Out of Mind, pages 140–150. Oxford University Press, Oxford
 O’Callaghan, Casey, “Synesthesia vs. Cross-Modal Illusions,” in Sensory Blendings: New Essays on Synaesthesia, ed. Ophelia Deroy, Oxford University Press, forthcoming.
Responses to “Vision is (probably) Informationally Encapsulated from Flavour (Part I)”
This all sounds plausible, but where do you put attention in your model, as in the ventriloquism example? Smell (and taste) are pretty vague senses in humans (slow, usually unable to parse out complex mixtures, hard to verbalise), so highly informative senses like sight dominate in terms of our reported experience. We know attention can have top-down effects at the level of the sense organ – e.g., the famous example of suppression of auditory nerve signals when a cat is visually attending to a target – so would it count if aromas affect visual attention, as in spiders?
And similarly, in the case of the olfactory-visual Stroop test, I would think one does not have any conscious access to the priming effect, and so would not report a difference, even though it can be measured behaviourally.
As you will see briefly in the next essay, for me (and most others in this field), if attentional shifting *alone* is the thing that is responsible for a change in an experience in another modality, then it does not count as a case of sensory penetration.
The reason is that attention shifting is, basically, analogous to turning one’s head and reorienting one’s attention overtly. All you are doing when you shift attention is changing the inputs that get filtered through for further processing, not changing the processing itself (see the Stokes essay on cognitive penetration in the endnotes for more details on this).
Re: your comments on how vision is “highly informative” and smell and taste are “vague,” I think it is important to note that vision is a *spatial* sense, while taste and smell are *chemical* senses. Taste and smell primarily provide information about chemical content, whereas vision (and audition) provide information about the spatial location of various objects.
So in one sense vision and audition are highly informative – e.g., in their spatial capacities – and in another sense they aren’t – e.g., in their chemical capacities.
I’m familiar with the visual illusions, and with how they are resistant to belief. I wonder, however, about other perceptual feelings that subjectively seem to me to involve some type of visual penetration. For example, as an endurance runner, as I have become more fit, hills subjectively appear to me to be less steep than they did previously. I wonder if this is because the sense of ease vs. discomfort can come to dominate the percept as the discomfort increases, and maybe can interfere psychologically. Another example, however, is in basketball: the sense of the hoop becoming larger as confidence increases. These are just subjective anecdotes, but I wonder, Dan, if you have background to explain this. Thanks.
Actually, there have been a lot of studies involving the kinds of cognitive penetration effects that you describe (things like: does feeling confident increase perceived hoop size? Does believing you are poor make coins look larger?). However, most of them are flawed.
The studies all used the same methods, which went something like this:
Subjects were divided into two groups – the test and control group.
The test group would wear a heavy backpack and look at a hill. The experimenters would then ask them to judge how steep the hill was.
The control group would not be given a heavy backpack, and the experimenters would then ask them to judge how steep the hill was.
The results were that the test group judged the hill to be steeper than the control group.
This type of study abounded over the past two decades or so. Literally hundreds of studies following this exact kind of method were published, all finding the exact same kind of results (for things like being hungry and perceived apple size, and so on).
However, Chaz Firestone and Brian Scholl (both at Yale) basically falsified these studies. They found that what was causing the test groups to have different judgments was actually *response bias*, and not their actual experience (see the paper in endnote 3 to look into how they proved this for yourself). Response bias is when subjects consciously or subconsciously guess what the experimenters are testing for, and this affects their responses, i.e., the results of the experiment.
So, I am at least unconvinced that things like feeling hungry or believing you are poor actually cognitively penetrate experience. Like I said in the paper, the jury is still out.
sethleon2015: Personally, I find footholds are much larger on the same climb than they used to be.
DanT – when I used the term vague, I meant in terms of amount of data.
Emberson et al. see “expectation-based feedback” as understandable as learning, but acting via top-down suppression at an earlier stage of processing. I would read this as within-modality penetration.
Firestone and Scholl also see semantic priming as learning. Both could be seen as cognitive effects on perception, though mechanistically perhaps quite different.
Re: vagueness – gotcha.
Re: the Firestone and Scholl paper you referenced, that is actually an argument against the claim that there are top-down effects on perception. Indeed, Scholl and Firestone gave guest lectures at NYU, which I attended, where they presented the paper you cited, the paper I cited, and one other paper, all of which argue that there are no top-down effects on perception. I also attended a debate between Scholl and Gary Lupyan where Scholl argued that there are no (interesting) top-down effects on perception.
I just say this because it seemed like you placed Firestone and Scholl as teetering into the camp that holds that there are top-down effects on perception. So, for cautious clarity: they are not in that camp.
It is also worth noting that you said the “mechanisms” could be quite different. Indeed, for the debate about cognitive penetration, the devil is in the details, which in this case means the mechanisms. If cognitive penetration happens through the front end via attention, or the back end via memory activation (through semantic priming), Scholl and Firestone don’t think these count as genuine cases of cognitive penetration.
No, I was contrasting similar phenomena, presumably in different brain regions/pathways, that Emberson (and the papers they cite) and Firestone each characterize as learning, but one is “low” level (Emberson compares it to classical conditioning) and the other possibly “higher”, given the latter looks more flexible. But might the former mechanism underlie certain phenomena we call attentional?
The odour-visual Stroop result I cited earlier could be interpreted as Firestone suggests. Other Stroop phenomena are affected by hypnosis, so again that might support higher level attentional mechanisms (though in the case of pain perception, I believe there are top-down mechanisms inferable from imaging studies).
A reference for human “Corticofugal Modulation of Peripheral Auditory Activity”
That is, both visual and auditory top-down modulation of LGN and MGN “gating” by attention occurs, and in the auditory system even further down. I would see this as a direct causal pathway, and guess that perceptually it will be experienced as a diminution of the targeted “nuisance” sense-datum :). Similar gating of olfactory inputs at the olfactory bulb or medio-dorsal thalamus has been suggested but does not have much support [Tham 2009].
Very interesting. Two questions:
1. What would you say to someone arguing for there being an influence in the sense of a mutual feedback loop between taste and vision? In other words, vision clearly influences taste (that, together with the food being hot and the glutamate triggering umami receptors, being the foundation for Auguste Escoffier’s invention of modern French haute cuisine), so couldn’t that effect (namely, something tasting good) affect in turn my further visual perceptions? For example, I sit down in Escoffier’s restaurant at the Ritz. Visually, I am primed (by design) in innumerable ways to have a positive view of the food. Then, the food, which is hot (service à la Russe) and overwhelming my olfactory bulb, and which is loaded with glutamate from the veal stock Escoffier used as a base in so many of his recipes, drives my umami receptors crazy, confirming/reinforcing/influencing what I see/notice. In other words, while there is certainly a difference between visual stimuli and my interpretation of them, might I not notice things I would otherwise miss, or overlook others, having been influenced by taste?
2. To the extent that this may occur, could not my filling in of the gap in my visual field per my optic nerve blindspot be influenced by these effects as well?
nice to hear from you 🙂
Stay tuned for part II tomorrow. I think I discuss some of the things you brought up in your comment.