
Jul 01

How would our brains need to be altered to make us perceive new qualities?

Suppose we wanted to re-engineer the human brain in a way that would allow us to perceive entirely new kinds of qualities.  How could that be done?  Suppose for example that we have developed a new kind of neuron that is sensitive to magnetic fields, and we want to connect them to our brains in a way that would give us conscious awareness of magnetism, using qualities that differ from those generated by the other senses.  What sort of structural alterations would be needed?  The answer is unclear; and the fact that it is unclear is important, because it points to a serious gap in our understanding of the neurobiology of consciousness.

The purpose of this post is to justify those statements.  Let me start with a bit of background.

We have, as everybody knows, several distinct sensory systems:  vision, hearing, touch, olfaction, etc.  Each of those systems is implemented differently in the brain, but at the level of conscious awareness, they have certain things in common.  Most importantly, input from each sense is perceived in terms of qualities.  For vision the qualities include color and brightness; for hearing they include pitch and loudness; etc.

Our full perceptual world can be decomposed into several orthogonal quality domains, one for each sensory system.  We can compare qualities within a domain, but not across domains.  Thus we can compare the color or brightness of two points in visual space, and we can compare the pitch or loudness of two tones, but we have no way of comparing a color with a pitch, except at a metaphorical level.  (There is an interesting caveat: in people with synesthesia, the separation between sensory domains breaks down.  That’s an important fact, but a discussion of it would lead me astray.)

The perceptual separation between sensory systems is mirrored at the brain level.  Each sensory system is implemented by a distinct set of brain components.  There are subcortical areas such as the cochlear nuclei; there are dedicated areas of the thalamus (for every sense except olfaction); and, most importantly, there are dedicated primary and secondary areas of the cerebral cortex.  Thus for vision we have the primary visual area V1 in the occipital lobe, and secondary areas V2, V3, etc. adjoining it.  For hearing we have the primary auditory area A1 in the temporal lobe, and secondary areas A2 etc. adjoining it.  And so on.  The secondary areas project to higher-level parts of the cerebral cortex, which integrate information from multiple sensory systems.

Thus if we want to add a new sensory system to the brain, up to a certain point it is reasonably clear what we should do.  We should allocate a portion of the cerebral cortex as a dedicated primary area.  Our magnetic-sensing neurons can either project there directly, or, if they need to be reformatted, they can be routed through a newly allocated part of the thalamus.  Our primary magnetic area can send its output to adjoining secondary magnetic areas which extract useful features, analogous to the way secondary visual areas extract features such as shape and color from raw visual signals.
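The wiring proposed above can be made concrete with a toy sketch: the pathway as a small directed graph of processing stages. This is purely illustrative; the stage names are assumptions invented for the sketch, not anatomical claims, and the graph says nothing about how the final, unclear step would actually produce awareness:

```python
# Toy sketch of the proposed pathway for a new "magnetic" sense,
# modeled as a directed graph of processing stages (hypothetical names).

from collections import defaultdict

projections = defaultdict(list)

def project(src, dst):
    """Record a projection (directed connection) from one stage to another."""
    projections[src].append(dst)

# Sensor -> thalamic relay -> primary cortical area, mirroring the other senses.
project("magnetic_receptors", "magnetic_thalamic_relay")
project("magnetic_thalamic_relay", "primary_magnetic_area")

# Primary area feeds adjoining secondary areas that extract features;
# the projection out of the secondary areas is the step the post calls unclear.
project("primary_magnetic_area", "secondary_magnetic_area")
project("secondary_magnetic_area", "higher_integrative_cortex")

def reaches(src, dst, seen=None):
    """Depth-first check that a signal can flow from src to dst."""
    seen = seen or set()
    if src == dst:
        return True
    seen.add(src)
    return any(reaches(n, dst, seen) for n in projections[src] if n not in seen)

print(reaches("magnetic_receptors", "higher_integrative_cortex"))  # True
```

The sketch only establishes connectivity; the open question in the post is precisely what structure the last projection would need, which no graph of this kind captures.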

But what then?  What can we do with the resulting secondary signals so that the person whose brain this is will gain a conscious awareness of magnetism, perceived in terms of new qualities?  Clearly we need to send the signals to other parts of the cerebral cortex, but which parts, and how should the projections be structured?

The answers are not at all clear.  We can try “broadcasting” the output to every other part of the cortex, in an essentially unstructured way, but it seems very doubtful that that will work.  But if not that, then what?

The fact that we don’t know how to do this points to an important gap in our understanding of sensory processing.  A student of neuroscience, after learning about the structure of the visual system—how it extracts features such as color, motion, and binocular disparity—might get the impression that there are no longer any deep mysteries about perception:  that it’s just a matter of working out all the niggling details.  Nothing could be further from the truth.  We have substantial understanding of the low-level computational processing of sensory signals, but we don’t understand how they give rise to distinct perceptual qualities.  In fact we don’t even understand, at the level of brain function, what a perceptual quality is.

That’s an important thing for philosophers to know.  The “hard problem of consciousness”—as David Chalmers likes to put it—is not just hard at a philosophical level.  It is also unsolved at an empirical level.

 

10 comments


  1. stevenjohnson

    Have any specific neural structures been implicated in cases of blindsight?

    Have there been any anomalies in memory formation in cases of blindsight?

    (There was a report of another sensory phenomenon where the victims were unconscious of receiving inputs, but I’m sorry I’m drawing a blank on anything other than my thought, “That’s like blindsight.”)

    I’m not quite sure why you say synesthesia is misleading in this context. If it’s because you don’t see any reason to consider it a relict transitional phase, I get your point. But it seems to suggest that, although consciousness is hard, it may not be quite as big a problem as philosophers assume. But maybe I’m misled because so much of my thinking (my life) is unconscious?

    1. Bill

      Hi Steven. Blindsight — let me begin by pointing out — is sort of the reverse of what I was talking about. In blindsight a sensory input is created that influences behavior without giving rise to conscious awareness — I was talking about adding a sensory input in a way that does give rise to awareness.

      That being said, the structure most commonly associated with blindsight is the superior colliculus. It receives input from the eyes, projects (indirectly) to the cerebral cortex, and is involved in controlling eye movements and attention.

      I didn’t mean to say that synesthesia is misleading, just that it is a complication that I didn’t want to deal with in this post. The most interesting question about synesthesia, to me, is whether people who have it experience a differently structured space of qualities, or whether they experience a similar set of qualities but frequently experience qualities that are not appropriate for a given stimulus modality. (I hope that isn’t worded too opaquely.)

      Best regards, Bill

      1. stevenjohnson

        Hi Bill, thanks for the answer.

        If the superior colliculus coordinates visual input, the cerebral cortex, and eye movements and attention, then at a guess one aspect that distinguishes blindsight from your opposite, an engineered magnetic sense, is coordination. The very recent popular report of the claustrum coordinating (timing, purportedly) inputs to create consciousness is similarly suggestive. So possibly your imaginary neuroengineer should try to dedicate a new structure that connects the parts of the brain that process spatial orientation; the muscles that align magnetic-sensing hairs or whatever particular organs detect magnetic fields; and the cerebral cortex? All very speculative of course, taking off from one example (I do wish I could remember the other report…it’s frustratingly close to coming to awareness.) But the notion that awareness is related to the need to focus the sense on particular parts of the environment does seem natural to me. Otherwise a vague unlocalized presumption that doesn’t impinge on the consciousness would seem to be sufficient.

        As to your remarks on synesthesia? I believe I understand you perfectly. But for me an equally interesting question is whether synesthesia is a dysfunction or disease? Are the multiple qualities truly inappropriate, a deficit like Daltonism? Or are the differences in qualities merely a variation with negligible impact? I guess a philosopher’s way of asking it would be, are qualia causal?

        1. Bill

          Regarding your first paragraph, I think in the long run it will probably be more productive to look at behavior, particularly speech. If I give a person an input that he is aware of, and ask him whether he is aware of it, he will say “yes”. If I give him an input that he is not aware of (such as a blindsight stimulus), he will say “no”. What is the cause of that difference in verbal behavior?

          Regarding the second issue, I’m committed to the belief that qualia are causal. If a person stubs his toe and says “My toe hurts”, I argue that the cause of that speech act is the pain in his toe — and pain is a quality. I should add that I prefer to speak of “qualities” rather than “qualia”, because to many philosophers the word “qualia” carries some mystical significance that I don’t think is very useful.

          Thanks for the comments, Bill

  2. Matt Skaggs

    Do the Artificial Intelligence folks have an approach that deals with this? It seems like AI would require some model that at least takes sensory input and translates it into perception.

    1. Bill

      Hi Matt. AI does deal with giving systems sensory inputs that provide information for their computations, but the question of whether that process should be called “perception” goes beyond AI into the realm of philosophy. Anyway, the question I was addressing in this post relates specifically to humans: how could you add inputs to the human brain in a way that would cause us to perceive them as qualities such as a color or a pitch? Nobody really has a clear idea at present how to implement qualities in an AI system.

  3. Peter Smith

    There is the reverse problem of those who suffer from petit mal.
    I started suffering from this problem four years ago and it is a most extraordinary experience. Segments of one’s conscious experience go missing and one is entirely unaware that there are missing segments. One consciously experiences a seamless timeline that seems complete. But it isn’t complete and ordinarily one has no way of knowing there are missing segments. It is only by viewing something with a well defined timeline that one becomes aware that portions of the timeline are missing. I am guessing here, but it seems as if the link between phenomenological experience and the conscious mind is broken for short periods of time. And then with no phenomenological input the conscious mind loses awareness.

    1. Bill

      Hi Peter, I don’t see in what sense that is the reverse of the problem I described, but I agree that it’s a very interesting and important problem. I would probably describe it in a different way, though. It’s a sort of temporal analog of the visual blind spot. A more direct analogy is to a phenomenon that everybody actually experiences: saccadic time warping. Every time we make a saccadic eye movement, our awareness is suppressed during the time our eyes are moving (a fraction of a second), and subjective time is warped before and after the movement in a way that eliminates the perception of a gap. That phenomenon has been studied pretty extensively, and a number of philosophers have discussed it, including Dennett. You can look at Wikipedia’s article on saccadic masking if you’re interested in more information.

      Best regards, Bill

      1. Peter Smith

        Hi Bill, yes, I did put it poorly. Thanks for the reference about saccadic masking. You say that our awareness is suppressed during saccades. Would that be only visual awareness or general awareness? If general awareness that would make it very similar to petit mal. In that case petit mal could well be a defect in the visual saccade system.

        1. Bill

          As far as I know there is no deficit in general awareness.

          There’s another phenomenon worth mentioning, by the way. It sometimes happens that people lose vision in a portion of the visual field, due to a stroke or some other kind of brain damage. This is known as a “scotoma”. In most cases the person is not directly aware that anything is missing, even if the scotoma is quite large. This is another example of the general principle that absence of awareness does not imply awareness of absence, which applies to many situations.
