Hack 58. Watch Yourself to Feel More
Looking at your skin makes it more sensitive, even if you can't see what it is you're feeling. Look through a magnifying glass and it becomes even more sensitive.
The skin is the shortest-range interface we have with the world. It is the only sense that doesn't provide any information about distant objects. If you can feel something on your skin, it is next to you right now.
Body parts exist as inward-facing objects: they provide touch information. But they also exist as external objects: we can feel them with other body parts, see them, and (if you're lucky) feel and see those of other people. [Hack #64] and [Hack #93] explore how we use vision to update our internal model of our body parts. But the integration of the two senses goes deeper, so much so that looking at a body part enhances the sensitivity of that body part, even if you aren't getting any useful visual information to illuminate what's happening on your skin.
5.7.1. In Action
Kennett et al.1 tested how sensitive people were to touch on their forearms. In controlled conditions, people were asked to judge whether they were feeling two tiny rods pressed against their skin or just one. The subjects made these judgments in three conditions. The first two are the most important, providing the basic comparison: subjects were either in the dark, or in the light and looking at their arm (but with a brief moment of darkness so they couldn't actually see their arm at the instant the rods touched it). Subjects allowed to look at their arms were significantly more accurate, indicating that looking at the arm, even though it provided no useful information about the touch itself, improved tactile sensitivity.
The third condition is the most interesting and shows just how pervasive the effect can be. Subjects were shown their forearm through a magnifying glass (still with darkness at the actual instant the rods touched the skin). In this condition, their two-point discrimination was nearly twice as precise as their sensitivity in the dark!
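To make the comparison concrete, here is a minimal simulation of a two-point discrimination task under the three conditions. The thresholds and noise values are invented for illustration; they are not figures from Kennett et al., only a caricature of the pattern the experiment found (looking beats darkness, magnified looking beats both).

```python
import random

random.seed(1)

# Illustrative two-point discrimination thresholds in mm; a lower
# threshold means a more sensitive forearm. These numbers are made up.
thresholds = {"dark": 30.0, "looking": 24.0, "magnified": 16.0}

def judge(separation_mm, threshold_mm, noise_mm=4.0):
    """Report 'two rods' if the felt separation exceeds a noisy threshold."""
    return separation_mm + random.gauss(0, noise_mm) > threshold_mm

def accuracy(threshold_mm, separation_mm=25.0, trials=10_000):
    """Proportion of trials on which two rods 25 mm apart are judged as two."""
    hits = sum(judge(separation_mm, threshold_mm) for _ in range(trials))
    return hits / trials

for condition, t in thresholds.items():
    print(f"{condition:>9}: {accuracy(t):.2f} judged 'two'")
```

Run with these toy numbers, the "magnified" condition correctly reports two rods on almost every trial, while the "dark" condition misses most of them, mirroring the near-doubling of precision reported in the study.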
This is astounding for at least two reasons. First, it shows that visual attention can improve our sensitivity in another domain, in this case touch. There is no necessity for touch to interact with vision like this; the senses could remain independent until far later in processing. Imagine if the double-click rate setting on your mouse changed depending on what was coming down your Internet connection; you'd think that was pretty odd. But for the brain this kind of interaction makes sense, because we control where we look, and events often spark input to more than one of our senses at a time.
The second reason this is astounding is because it shows how a piece of technology (the magnifying glass) can be used to adjust our neural processing at a very fundamental level.
5.7.2. How It Works
Touch information is gathered together in the parietal cortex (consult the crib notes in [Hack #7] if you want to know where that is), in an area called the primary somatosensory cortex. You'll find neurons here arranged into a map representing the surface of your body [Hack #12], and you'll find polysensory neurons. These respond particularly strongly when visual and tactile input synchronize, and are suppressed when the two inputs are discordant; it seems there's a network here that integrates information from both senses, either within the somatosensory map of the body or in a similar map nearby.
This theory explains why brain damage to the parietal cortex can result in distortions of body image. Some patients with damaged parietal lobes will point to the doctor's elbow when asked to point to their own elbow, for example.
This hack and [Hack #64] show that short-term changes in our representation of our body are possible. Individual cortical neurons that respond to stimulation of the skin can change which area of skin they are responsible for very rapidly. If, for example, you anesthetize one finger so that it no longer provides touch sensation to the cortical cells previously responsible for responding to sensation there, those cells will begin to respond to sensations on the other fingers.2 In the magnifying glass condition, the resources devoted to tactile sensitivity of the skin appear to adjust in the same way, adding tactile resolution to match the extra resolution the magnifying glass has artificially given vision.
5.7.3. In Real Life
This experiment explains why, in general, we like to look at things as we do them with our hands, or listen to them with our ears, like watching the band at a gig. We don't just want to see what's going on; looking actually enhances the other senses as well.
Perhaps this is also why first-person shooter games have hit upon showing an image of the player's hands on the display. Having hands where you can see them may actually remap your bodily representation to make the screen part of your personal (or near-personal) space, and hence give all the benefits of attention [Hack #54] and multimodal integration (such as the better sense discrimination shown in this hack) that you get there.
5.7.4. End Notes