By Daniel Tippens
Some notes on attention and sensory penetration
I intend to argue that vision is informationally encapsulated (just “encapsulated” from here on out) from flavor. In the last post, we discussed the ideas of encapsulation and sensory penetration. The notion of encapsulation that I am working with takes place at the level of experience, and sensory penetration takes place when one sense modality directly influences another sense modality’s respective experience and is not a result of a quirk. If one sense X is sensorily penetrated by Y, then X is *not* encapsulated from Y.
There is still one more note here which needs to be made. At the beginning of the last post, I said that the following steps precede visual experience:
First: Our eyes receive inputs.
Second: Brain regions dedicated to vision, such as V1-V4, process those inputs.
Third: An output is produced.
Fourth: We have a visual experience.
This isn’t quite accurate, or, at least, not detailed enough. There is still another step that should be inserted, that of attentional selection:
First: Our eyes receive inputs.
Second: Selective attention selects certain inputs to enter into processing within the visual system.
Third: Brain regions dedicated to vision, such as V1-V4, process those inputs.
Fourth: An output is produced.
Fifth: We have a visual experience (which may or may not just be step 4).
When we look at things in our environment, we can selectively attend to some things over others. Indeed, at this very moment you are selectively attending to this word instead of the word “output” several lines above. Our eyes are bombarded with inputs at any given moment, far too many for our visual system to be able to handle. So, attention acts as a gatekeeper, allowing some inputs to enter into further visual processing, and not others.
There are two types of attention: overt and covert. Overt attention is when one actually moves one’s eyes in the direction of the stimulus one winds up attending to. This could be captured by recalling the case of the firetruck in the last post. Covert attention takes place when one’s eyes remain fixed, but one shifts one’s attention around nonetheless. Think of when you are at a party, in a conversation with someone, and you want to keep an eye on what your spouse is doing in the corner of the room. In order to avoid being rude, you are likely to keep your eyes fixed on your interlocutor, but covertly attend to your spouse.
We need to discuss covert attention in order to rule out one last case that does not count as sensory penetration. We said that sensory penetration is when one sense directly influences another sense’s resulting experience and is not a result of a quirk (thereby ruling out synesthesia). But now consider this case involving covert attention.
Imagine that you are at the movie theatre. On the screen, everything is silent, and characters are moving around. You are attending to the center of the movie screen. A sound is presented on the left-hand side of the screen, thereby drawing your covert attention to it. Your visual experience has been influenced by audition and was not the result of a quirk, but it still doesn’t count as a case of sensory penetration. The reason is that the sound merely shifted your attention, which changed the inputs that went through for visual processing, but didn’t alter the processing of those inputs. So, our final understanding of sensory penetration is when one sense directly influences another sense’s resulting experience, is not a result of a quirk, and is not a result of a shift in attention alone.
What is flavor?
In order to argue that vision is encapsulated from flavor, we should get clear on what flavor is. Many people confuse “taste” with “flavor.” Experientially, taste is the experience which results from gustatory receptors on the tongue and around the mouth. It is widely believed that there are five basic tastes (and five basic taste experiences): sweet, sour, bitter, salty, and umami.
But we don’t just experience the five basic tastes. We experience chocolate, watermelon, and creamy soup. We experience flavors. In fact, most of our flavor experience doesn’t come from taste, so where do these extra experiential qualities come from? Enter olfaction (smell), which comes in two types: orthonasal and retronasal. Orthonasal olfaction takes place when odorants (the inputs to our olfactory receptors in the nose) enter the nose from the outside. When you think of orthonasal olfaction, think of sniffing as the paradigm action which facilitates this kind of smell.
Though orthonasal olfaction doesn’t really play a role in flavor experience, retronasal olfaction does. Retronasal olfaction takes place when odorants enter the nose through the nasal cavity in the back of the mouth. If you have ever smoked hookah, basically the entirety of the flavor you experience is due to retronasal olfaction. If you eat a Skittle and pinch your nose, you will notice that much of its flavor seems to disappear. Retronasal olfaction’s role in flavor experiences also explains why food tastes so different when you have a cold.
Interestingly, our flavor experiences seem to be sensorily penetrated by every single modality. Touch can influence flavor perception, as when we eat a creamy soup. Audition can also play a role, as when we experience the flavor of potato chips to differ depending on their crunchiness. Vision, too, has some causal influence. The color of the food, or even its packaging, can influence our flavor experiences.
It is important to note the difference between a sensory experience partially constituting the experience of flavor, as opposed to merely causally influencing it. The color of a food might causally influence our flavor experience, but it doesn’t constitute it. We don’t experience the colors as being spatially located in the mouth, and things don’t taste red. Taste and retronasal olfactory experiences constitute the experience of flavor, but colors only causally influence, i.e., sensorily penetrate, flavor experiences.
So flavor experience, for the purposes of this paper, is the experience one has thanks to retronasal olfaction and taste, which can be sensorily penetrated, i.e., directly causally influenced by, the other sense modalities, and is spatially located in the mouth. What the flavor system tracks are, *ahem*, flavors; chemical properties of the foodstuff that one ingests. As Barry Smith has put it, “Chefs and nature create flavors, not brains.”
Vision is encapsulated from flavor: an inference to the best explanation argument.
Here I present a three-part argument for the claim that vision is encapsulated from flavor at the level of experience, i.e., that flavor does not sensorily penetrate vision at the level of experience. There are two things that typically trigger sensory penetration which cross-modal perception researchers have uncovered. One is known as the spatial-temporal rule, and the other is cross-modal correspondence. The first part of my argument seeks to show that the spatial-temporal rule is never satisfied between flavor and vision due to their anatomical relationship. The second part is an introspective argument to support the conclusion that while cross-modal correspondence may allow for flavor to sensorily penetrate vision at the level of processing, it does not obtain at the level of experience. The third part uses the conclusions from the first two arguments as the basis of an inference to the best explanation claim that vision is encapsulated from flavor.
Cross-modal perception researchers tend to agree that the reason our sense modalities exchange information with one another is to resolve conflict about a common subject matter. This is known as the principle of intersensory discrepancy. Consider again the ventriloquist effect. When one sees the puppet’s mouth moving and hears the ventriloquist’s voice, audition and vision are receiving discrepant information about the source of the speech. Audition and vision recognize the disagreement, and then resolve the conflict. One auditorily experiences the source of the speech as coming from the puppet’s mouth.
Now, when do two senses recognize a common subject matter? One of the most important triggers is spatial-temporal overlap. The puppet’s mouth movements and the ventriloquist’s voice are spatially and temporally close in proximity. Indeed, the ventriloquist is intentionally matching his puppet’s mouth movements with his speech sounds. When this coincidence in space and time obtains between stimuli in two different senses, sensory penetration occurs. This is known as the spatial-temporal rule (just the “spatial rule” from now on).
So, it is a necessary condition for sensory penetration of the kind in the ventriloquist effect that the spatial rule obtain. Note that I said of the kind in the ventriloquist effect, as we will see later that there is another kind of sensory penetration which occurs.
The question is, then, do flavor and vision ever satisfy the spatial rule? If not, this kind of sensory penetration can’t occur between them. I think not, because in order for the spatial rule to be satisfied, two sense modalities must have the same object in both of their receptive fields at the same time. The two modalities must have overlapping receptive fields, within which some common object rests. This is simply never the case for vision and flavor.
Here, “receptive field” refers to the physical field of space that a sense modality has access to at a given time when one’s body remains fixed. This means that receptive fields between modalities overlap in different ways. First, some receptive fields subsume others. The receptive field of audition subsumes, for the most part, the receptive fields of all of the other modalities. I can hear things that I see, smell, touch, and taste, but I cannot taste, smell, touch, or see everything that I hear. Consider when you hear a seagull far off in the distance behind you, and you keep your body fixed.
Second, some receptive fields partially overlap. I can see some of the things that I smell or touch, but not all of them, as is the case with potent objects touching my back. Third, receptive fields have potential overlap. When I shout out to you from behind, you can turn your head and bring your visual receptive field to overlap with your auditory receptive field.
In order for the spatial rule to obtain, an object must be within the overlapping receptive fields of the two modalities. This simply never happens with flavor and vision, for they stand in a very unique anatomical relationship with one another. The receptive field of vision is always outside of the mouth, a place that flavor simply doesn’t have access to, not even potentially.
It is worth noting that vision is the only sense modality that stands in this relationship with flavor. Touch and audition both have partial overlapping receptive fields with flavor, as you can feel the texture of the food as you hear your teeth chomping on it. But for the visual system, flavor takes place right under our very noses, and for the flavor system, the visual system’s receptive field is always just out of reach. As such, vision and flavor never satisfy the spatial rule and so they can never engage in the kind of sensory penetration that takes place in the ventriloquist effect.
There is another trigger that leads to sensory penetration, which is known as cross-modal correspondence. There are different accounts of cross-modal correspondence, but we can understand it, for now, as an associative relation between two modalities. Consider how vision can influence flavor. Seeing the color of the food one is about to ingest can influence the flavor that one experiences. This takes place despite the fact that the spatial rule isn’t satisfied, as we concluded earlier.
So, sensory penetration can occur in some cases without the spatial rule being satisfied. Between flavor and vision, there are a few associations, but we will focus on the pervasive association between color and flavor. One then might argue the following. Look, an associative relation is symmetrical. If x is associated with y, then y is associated with x. So if colors are associated with flavors, and this association is sufficient to cause sensory penetration of flavors by vision, then the principle of symmetry says that flavors are associated with colors, and this association ought to be sufficient to allow for flavor to penetrate vision, in some cases.
I have yet to see any research that addresses how flavor can influence vision (something I will discuss briefly later), and so introspective arguments, bolstered by ordinary language, must suffice here. Remember the distinction between sensory penetration at the level of processing, and sensory penetration at the level of experience. Introspectively it doesn’t seem that flavors influence the colors that we see. It could be that the association between colors and flavors does allow for sensory penetration of the visual system at the level of processing, but introspectively this doesn’t register in one’s visual experiences, which is what I am arguing for.
Ordinary language seems to supplement the introspective argument well. In conversation many of us have noted that the rich colors of foods influence our flavor experience, but we don’t talk about the converse holding as well. We don’t talk about how eating something while looking at a painting influences the way it looks.
But still a keen introspector could raise the following objection: but wait, sometimes I see a food and it looks delicious, but to my surprise, it actually tastes terrible. Subsequently, I see that food as being gross, or repulsive. Doesn’t this count as a case of flavor sensorily penetrating vision?
Here it is important to draw a distinction between low-level and high-level perception. Let’s say you and I are looking at a blue circular object that is a blueberry. You are fortunate enough to have the concept of a blueberry, while I am not. When you and I look at the blue circular object, you see it as a blueberry, while I simply see a blue-circular object. The properties that I experience, color, shape, etc. are known as low-level properties. The property that you experience, being a blueberry, is a high-level property. You have deployed the concept of a blueberry within your experience.
In the case of eating something which looks good, experiencing its terrible flavor, and then seeing the food as repulsive, flavor is influencing the high-level properties that one visually experiences. When most researchers, and I, talk about encapsulation and penetration, they are referring to encapsulation at a low-level. This is because it is obvious that all sorts of things can influence high-level perception, cognition included. An oncologist who has learned a lot in pathology will look at a scan and see something in it as a tumor, while you and I will only see certain low-level properties in the image. So the important question is, can flavor influence vision at a low-level? The aforementioned objection says nothing to support the conclusion that flavor can influence vision at a low-level.
Since the spatial-temporal rule can never be satisfied between flavor and vision, and associations aren’t sufficient to allow flavor to sensorily penetrate vision at the level of experience, the inference to the best explanation is that vision is encapsulated from flavor at the level of experience.
For a while perception researchers seemed to focus exclusively on understanding how the visual system works. This could have been because the visual system occupies an enormous part of the brain, because researchers assumed that the senses were encapsulated relative to one another and therefore could be studied in isolation, or perhaps because researchers assumed that the principles governing how the visual system works would extend seamlessly to the other modalities.
Unfortunately, all three reasons turned out to be problematic. That the visual system occupies a large amount of neural real estate doesn’t justify studying it exclusively; it turns out that our senses (for the most part) are not encapsulated relative to one another; and non-visual sense modalities abide by different principles when creating our experiences.
Cross-modal perception research has been blossoming ever since these things were discovered. However, I fear that another bias toward studying certain systems over others is happening again. Most research within the cross-modal perception field seems to focus on the relationship between the spatial senses (vision, audition, and touch). Fewer researchers investigate the relationship between the chemical senses (olfaction and flavor), and even fewer research the relationship between the chemical senses and the spatial senses.
Though a few people are considering the ways in which vision can influence flavor, I have yet to see a single group discussing how flavor influences vision. I suspect there is, actually, a good reason for this. Most of us have always recognized the conclusion that I argued for here, that vision is encapsulated from flavor. At the very least, we all recognize that this is probably true.
Daniel Tippens is co-founder of The Electric Agora and a research technician at New York University School of Medicine.
Technically speaking, attention doesn’t select the inputs to the eyes, it selects representations that have been created from those inputs. But for simplicity’s sake I will just talk as though attention selects inputs.
 C. Spence, B. Smith, M. Auvray, Confusing Tastes with Flavours, in Perception and Its Modalities, edited by Dustin Stokes, Mohan Matthen and Stephen Biggs, Oxford: Oxford University Press, 2014.
 M. Auvray, C. Spence, The multisensory perception of flavour, Consciousness and Cognition, 17 (2008), pp. 1016–1031.
 Aradhna Krishna and Maureen Morrin, “Does Touch Affect Taste? The Perceptual Transfer of Product Container Haptic Cues.” Journal of Consumer Research: April 2008.
 K. Knöferle, C. Spence, Crossmodal correspondences between sounds and tastes, Psychonomic Bulletin & Review, DOI 10.3758/s13423-012-0321-z.
 Spence, Charles, On the psychological impact of food colour, Flavour, 2015, 4:21, doi:10.1186/s13411-015-0031-3.
 Small, D. M., Gerber, J. C., Mak, Y. E., and Hummel, T. (2005). Differential neural responses evoked by orthonasal versus retronasal odorant perception in humans. Neuron, 47:593–605.
 Smith, Barry, On the Nature of Sensory Experience: The Case of Taste and Tasting, Phenomenology and Mind Online Journal, 2013, pp. 292-313. ISSN 2280-7853 (print), ISSN 2239-4028.
 Welch, Robert B.; Warren, David H., Immediate perceptual response to intersensory discrepancy, Psychological Bulletin, Vol 88(3), Nov 1980, 638-667.
 O’Callaghan, Casey, Perception and Multimodality, Oxford Handbook to Philosophy and Cognitive Science, Eric Margolis, Richard Samuels, and Stephen Stich, Editors Oxford University Press, forthcoming.
 But for further argument against the importance of the spatial-temporal rule, see Spence, C. Just how important is spatial coincidence to multisensory integration? Evaluating the spatial rule, Ann N Y Acad Sci. 2013 Aug;1296:31-49. doi: 10.1111/nyas.12121. Epub 2013 May 24.
 It is certainly plausible that eating a delicious food while looking at a painting could alter one’s visual experience in an indirect way. For example, if the food is delicious, it could cause a change in one’s mood, which could cognitively penetrate one’s visual experience. But that would only be a case of cognitive penetration, and not sensory penetration.
 It is still controversial whether or not experience contains high-level properties. For a discussion on this issue, see: Bayne, Tim, Perception and the Reach of Phenomenal Content, Forthcoming in The Philosophical Quarterly. To be reprinted in Macpherson, F. and Hawley, K. (forthcoming) (eds.) The Admissible Contents of Experience, Oxford: Blackwell.
20 responses to “Vision is (probably) Informationally Encapsulated from Flavor (Part II)”
Hi Dan, very good!
One thing I’m not clear on is your description of the lack of spatio-temporal overlap between flavor and vision. Yet you also describe flavor as composed of both taste and smell, and the orthonasal aspect of smell comes from outside the nose. So isn’t there some overlap there or am I misunderstanding?
Also your example of the radiologist made me think of another case. This is from memory so I may not be getting it right. I saw some research on children in India who had been blinded (by cataracts I think). When they get corrective surgery they have to learn how to interpret what their visual information means. An example was that a black and white drawing of a Guernsey cow just looked like black and colored areas on a page to them. Not a cow. So the experience is very different and not just based on where the attention goes. I’m wondering how this fits in with the definition of the vision faculty, and if so much of the visual experience potentially gets defined away, what is the eventual benefit of the reductive definition.
Thanks for the compliment! That’s nice to hear, as this is a condensed version of my writing sample for grad school 🙂
Anyway, sorry for the confusion about flavour. I suppose I didn’t make it clear enough in my essay. Flavour experience is constituted by retronasal olfaction and taste alone (for the purposes of this essay), as some other things, such as somatosensation (pain sensation, thermal sensation, and texture sensation), are also included as constitutive of flavour experience in some corners.
So, since taste and retronasal olfaction only detect chemical elements which are in one’s mouth, flavour’s receptive field is *in the mouth only.*
Re: the second part of your comment: I’m not quite sure what to make of the case you mentioned about certain people with cataracts. You said that they learn to “interpret” their visual experiences. This could be understood in a couple of ways. Let me illustrate by talking about something known as adaptation.
Suppose I place “inversion glasses” on you (which is something researchers can do), which causes you to have an inverted visual experience. The sky is in the bottom half of your visual field, and the ground is in the top half, etc.
After a few days or a week, you “adapt” to this, which simply means you are able to behave as though you have a normal visual experience. You can circumnavigate the world as normal, reach for objects normally, etc.
There are two ways to explain what adaptation is such that you can behave normally. On one interpretation, your actual *experience* is re-oriented, such that you come to experience the world right-side up despite having the goggles on. The other explanation is that your experience stays the same (inverted), but your motor centers are able to adapt to the new visual experience such that your actions align with your inverted experience such that you, from a third-party perspective, behave as everyone else does.
In the case of the cataracts children, when you say they “interpret” their visual experience, it could mean either one of the things I just mentioned. Perhaps “interpret” just means they cognitively, or behaviorally, learn to act normally despite their abnormal experience, and the experience remains the same. Or, perhaps “interpret” means that their actual visual experience changes. I simply don’t know which is true :/
I’m not sure what the terms ‘information’ and ‘experience’ mean here, and what is different about ‘processing’ them. In terms of what I have called the BioComputing Phenomenology Thesis (biocomputing is more than conventional computing), informational semantics alone is insufficient.
In terms of vision and taste, the biocomputing pathways could carry a combination of both informational and phenomenological ‘data’.
I was kind of expecting that answer about the retronasal aspect of smell being the one related to flavor. That explains what I was confused about.
On the second question I don’t think I was very clear. My understanding was that the indian children had been blind do to lack of the access to the medical care that could repair the non-CNS visual problem. There was never an issue with the neural information processing. Yet when the external visual problem was cured there remained differences in the visual experience. As I understand it the children missed a key developmental stage related to the interpretation of the visual information somewhat analogous to how it becomes much more difficult to learn language at a later age. This led to the children not being able to link black and white drawings of cows to their actual physical referent. Since the information ‘input’ in these children is the same as in any normal person yet the visual experience is different I was wondering how that might relate.
This again is from memory of a PBS panel I saw a few years back and I haven’t actually looked at the topic in any formal way and don’t have a link to the relavent studie(s).
“an associative relation is symmetrical”: only in a couple of mathematical usages of the term. In most empirical sciences, the appearance of correlation can be due to direct causation (X->Y), so experimental (or other) evidence is needed to exclude the latter. In epidemiology at least, the terms “association” and “risk factor” refers to a correlation with and a correlate of a disease where the true state of nature about causation is not (yet) known.
More generally, I think your argument pretty good, pace skeptical objections about whether “low” and “high” level processing can really be reliably split apart (eg are top-down processes low or high).
Very informative! Question: would color connotations count as high or low level?
As I mentioned in a comment for the Jerry Fodor “special” last round (found here: http://theelectricagora.com/2015/10/14/this-weeks-special-jerry-fodors-special-sciences-or-the-disunity-of-science-as-a-working-hypothesis/comment-page-1/#comment-437) I believe that our mental and behavioral sciences are still in need of founding theory from which to build. While I do essentially agree with the conclusion of your essay, I think we’d find that once a functional model of the human mind does become established, you’d then be able to develop your position far more effectively. But given that I believe my own model of the human mind does happen to be quite useful, there is also the potential for me to use it to personally assess this issue.
I see no technical reason that taste couldn’t alter visual experiences, except that apparently evolution did not find this helpful. Conversely the provided sound/flash and line length illusions, must have been helpful as general heuristics from which to interpret the world, and even given their sometimes faulty implications. But if it were useful for visual perceptions to become more bluish while enjoying tasty food, though reddish for the opposite, for example, then I’m quite sure that such mutations would have prevailed, and quite regardless of our ability to view the chewing process to the degree that we smell it.
Dan, I do not know if this observation gives you much to work with, since I haven’t actually addressed the specific logic and terms which you’ve presented. I will however say that I did find the essay enjoyable, and essentially because I’m very curious about how standard cognitive science functions. I must hope that you and others are able to consider me something beyond “a crackpot” for suggesting that the field does still require a working model of the human mind from which to function. Physicists before Newton mustn’t have known that they were “pre Newton” either! Thus I’d rather not criticize technicals issues from your paper, by means of a model that you and others have not yet become familiar with. Perhaps some day I’ll be given a post here from which to outline my human mind model, which may thus give me more freedom in this capacity. Of course if you and others were to find the model useful, or perhaps find ways to improve it, the tool would likely spread beyond us. Wouldn’t it be amazing if we here were to help the field develop its own founding theory from which to function?
“As I understand it the children missed a key developmental stage related to the interpretation of the visual information somewhat analogous to how it becomes much more difficult to learn language at a later age. This led to the children not being able to link black and white drawings of cows to their actual physical referent.”
Well, I can’t be sure about the case you are talking about, but it sounds like they had some kind of visual agnosia. Can you take a look at this wikipedia link on the condition and tell me if you think either apperceptive or associative agnosia sounds like an explanation for what was going on with the children you mentioned?
When I talk of the symmetry of associations, I am basically talking about this: http://plato.stanford.edu/entries/associationist-thought/#AssSym
“would color connotations count as high or low level?”
I am pretty sure most people agree that colors count as low-level properties.
Did you mean something different with color “connotations?”
Hi Philosopher Eric,
I agree that it is conceivable that it could have turned out that vision is sensorily penetrated by flavour. Perhaps flavour systems sensorily penetrate vision in other species besides us.But it seems that for us, our visual system is encapsulated from flavour.
And don’t worry, I don’t think of you as a “crackpot.” Just a pot with some cracks 😛
Hi Dan the associative agnosia matches my memory of what the panel was describing. Perhaps an aspect of the the visual cortex needs a functioning eye to develop properly.
At this link:
I took a look at this study of children in India treated for congenital and developmental cataracts. They conclude:
‘Most cases of poor outcome occurred in the congenital cataract group, especially those operated later in life and are attributed to early visual deprivation and attending amblyopia. Early detection and surgery, optical rehabilitation, and close follow-up are essential for good outcome. This is especially crucial in those aged less than 1 year.’
So they don’t mention the agnosia, but the amblyopia (lazy eye) appears to also be a result of the eye-brain function not developing properly due to the cataracts interference. Interesting how brain development depends on external connections, although not so directly connected to your post which concerns normal function.
Thanks for a couple of highly interesting and thought provoking essays. As far as I can see (in this case not very far as I have no expertise and little knowledge of the area) you are right and it.says something quite intriguing about our cognition. Interesting to see how this progresses
If color/shape connotations are low level, what happens when I taste a repulsive pea soup and the color now switches from being reminiscent of a pastel shade of green to reminding one of vomit? Or the atrocious refried beans whose shape, instead of being a vaguely interesting and amorphous blob, now distinctly reminds one of something much more unpleasant?
Sorry for the delayed response 🙂
“If color/shape connotations are low level, what happens when I taste a repulsive pea soup and the color now switches from being reminiscent of a pastel shade of green to reminding one of vomit?”
Well, if you are saying that there are cases where eating something repulses makes the food’s color *remind* you or causes you to *think* of vomit, then this wouldn’t count as a case of flavour sensorily penetrating vision. It would simply be that flavour is causing certain events in cognition.
Similarly with the case of the atrocious refried beans “reminding” someone of something much more unpleasant: this sounds like it is just a case of flavour causing a certain memory event in cognition.
If, on the other hand, you are saying that tasting a repulsive pea soup actually changes the color that you experience from one shade to another, I simply deny this. Or at least, I deny that this kind of thing is something that obviously does happen.
From my introspective perspective, this doesn’t happen. I think flavour can cause you to experience a different *high* level property in vision, such as seeing the soup *as* repulsive, but it can’t change the low level properties (like color) in your visual experience.
Hi Dan, these were two interesting articles on a subject I honestly had not considered. I don’t have much to add other than to suggest that experiments be conducted; we might just be surprised, even if the effects are slight.
One possible, arguable form of flavor penetrating vision might be its effect on what we imagine we see. Imagined visual images (with eyes closed) are considered to use the same visual systems. It is possible that input from the eyes themselves is too strong a signal to be overcome, but that the systems (when no other input is available) can be influenced. For example, if I close my eyes and eat something sour or bitter, it is hard to visualize items that are sweet, and vice versa.
But someone with expertise in visual processing could correct me on whether that (imagined images) counts as a form of vision.
Interesting comment re: flavour influencing visual imagination.
I can see the plausibility of this, though I don’t think that imagined images count as a form of vision in the sense I am using it here.
The reason for this is that “vision,” as I am using it here, necessarily involves the processing of inputs registered by the eyes; i.e., a necessary condition for the kind of vision I am talking about is stimulation of the sense organs. This is not a necessary condition for imagined visual images to be created.
If you don’t accept this, there is still something else worth mentioning. While the visual system may be used in some way to create images, there is still a large part of cognition involved, specifically long-term memory. So it wouldn’t be clear whether tasting something is affecting long-term memory recall or something within the visual system.
I do like your speculation above, as well as Dantip’s response. I’ve been trying to play it cool here, given that I believe that I’ve developed a great functional model of human consciousness — I’d rather not be seen as some kind of “arrogant know-it-all.” Regardless, this is how I personally consider “imagined visions”:
In the diagram below, observe that I provide three varieties of conscious input to the conscious processor, or “thought.”
“Qualia” would be the punishment/reward input, “senses” would be where things such as vision exist, and I designate “memory” as a recording of past consciousness (which the conscious processor is apparently able to use as well). If something is imagined, however, I’d place this under conscious processing itself, or “thought.” Sure, the mentioned inputs could play a part in building a given imagined image (certainly memory), but that’s to be expected. Observe that if you imagine a car right now, “processing” is what should actually produce it (obviously with a great deal of memory input to work from), not outside information.
Also consider dreams as we sleep, which most certainly concern imagination. Input from memory as well as from qualia should be prominent in our dreams, but can we also taste something while dreaming? Well, yes, though if it does not emanate through the tongue, I’d say that this would be an “imagined flavor,” similar to an “imagined image.” Can an imagined flavor in a dream penetrate an imagined image? Well, I don’t see why not!
Anyway, I do hope to interest you and others in my own functional model of human consciousness. I don’t believe that any other such tool has yet come on the market, even given our great need for one.
Hi Dan, I was going to say that it could hint at how regular vision might be influenced (though the effects could be much smaller to nothing). But your final paragraph is a solid counter to that. Nice.
The colour of odours, or in fact perceptual illusions among wine tasters:
Hi David Duffy,
I haven’t read over the whole article yet, but it looks like they are investigating orthonasal olfaction (smell) and its influence on colors in wine-tasting subjects (subjects who are routine wine tasters).
Since orthonasal and retronasal olfaction are quite different, at *least* functionally, the study you mentioned doesn’t, I think, threaten what I’m saying here. Perhaps you have read more and could correct me?
Hi Dantip – it was meant as obliquely relevant to dbholmes’ suggestion, but in reverse: even trained individuals imagine flavours based on colour cues. That is, vision influences (reported) aroma/flavour at a high level, which you already knew. They conclude that we just don’t have a good conscious grasp of our experiences of flavour, and thus impoverished semantics, even though the chemical senses are so critical to life.
Novak et al. (2015) present evidence for vision-olfaction integration in recognizing negative emotion in faces. They suggest, based on the neuroanatomy and functional imaging work, that the only influences from olfaction to vision will be top-down:
‘…a visual and an olfactory network, each comprising an emotion convergence area (i.e., amygdala), a multisensory area (pSTS for vision, OFColf for olfaction), and a unisensory area (extrastriate cortex/EC for vision, PPC for olfaction).’
However, I am quite taken by the idea that a human face is a single quale…
Ah gotcha, I always seem to be one step behind you!