Bouba/kiki effect

The bouba/kiki effect is a non-arbitrary mapping between speech sounds and the visual shape of objects. It was first observed by the German-American psychologist Wolfgang Köhler in 1929, originally using the pseudowords “takete” and “baluba”; the now-standard names “bouba” and “kiki” come from Ramachandran and Hubbard’s 2001 replication.

The presence of these “synesthesia-like mappings” suggests that this effect might be the neurological basis for sound symbolism.



Transient Smartphone “Blindness”

Transient Smartphone “Blindness”
N Engl J Med 2016; 374:2502-2504, June 23, 2016

Manufacturers are producing screens with increased brightness to offset background ambient luminance and thereby allow easy reading. Hence, presentations such as we describe are likely to become more frequent.

The eye that had viewed the smartphone had much lower retinal sensitivity than the eye that had been covered (this interocular difference is what the patients perceived as transient monocular blindness).

After approximately 20 minutes, responses from both eyes were very similar.

Non-conscious perception of emotional signals

Neural bases of the non-conscious perception of emotional signals
Marco Tamietto & Beatrice de Gelder
Nature Reviews Neuroscience 11, 697-709 (October 2010)

Many emotional stimuli are processed without being consciously perceived.
Recent evidence indicates that subcortical structures have a substantial role in this processing.
These structures are part of a phylogenetically ancient pathway that has specific functional properties and that interacts with cortical processes.
There is now increasing evidence that non-consciously perceived emotional stimuli induce distinct neurophysiological changes and influence behaviour towards the consciously perceived world.
Understanding the neural bases of the non-conscious perception of emotional signals will clarify the phylogenetic continuity of emotion systems across species and the integration of cortical and subcortical activity in the human brain.

Subcortical face processing

Subcortical face processing
Mark H. Johnson
Nature Reviews Neuroscience 6, 766-774 (October 2005)

The model illustrates how faces are processed through both a subcortical face-detection route (involving the superior colliculus, pulvinar and amygdala) and a cortical route.
The subcortical route modulates processing in structures that are fed by the cortical pathway and are involved in face identification (fusiform gyrus and inferior occipital gyrus), facial expression (amygdala, orbitofrontal cortex, sensorimotor cortex) and eye gaze (superior temporal sulcus).

Seeing Less Helps The Brain Hear More

Crossmodal Induction of Thalamocortical Potentiation Leads to Enhanced Information Processing in the Auditory Cortex
Emily Petrus, et al.
Neuron 81(3):664–673, 5 February 2014


•Visual deprivation improves frequency selectivity of A1 neurons
•Visual deprivation improves sound discrimination performance by A1 neurons
•Visual deprivation strengthens thalamocortical synapses in A1, but not in V1
•Crossmodal changes are more effectively recruited than unimodal changes in adults

Sensory systems do not work in isolation; instead, they show interactions that are specifically uncovered during sensory loss.
To identify and characterize these interactions, we investigated whether visual deprivation leads to functional enhancement in primary auditory cortex (A1).
We compared sound-evoked responses of A1 neurons in visually deprived animals to those from normally reared animals.

Here, we show that visual deprivation leads to improved frequency selectivity as well as increased frequency and intensity discrimination performance of A1 neurons.
Furthermore, we demonstrate in vitro that in adults visual deprivation strengthens thalamocortical (TC) synapses in A1, but not in primary visual cortex (V1).
Because deafening potentiated TC synapses in V1, but not A1, crossmodal TC potentiation seems to be a general property of adult cortex.
Our results suggest that adults retain the capability for crossmodal changes whereas such capability is absent within a sensory modality. Thus, multimodal training paradigms might be beneficial in sensory-processing disorders.

journalistic version:
Seeing Less Helps The Brain Hear More
February 05, 2014
A few days in the dark can improve an animal’s hearing, scientists report this week in the journal Neuron.
This temporary loss of visual input seems to trigger favorable changes in areas of the brain that process auditory information.

These changes appear even when the loss of visual input occurs after the critical period of early childhood.

Riddoch’s phenomenon

The Blind Woman Who Sees Rain, But Not Her Daughter’s Smile
May 26, 2014

Imagine a world that is completely black. You can’t see a thing — unless something happens to move. You can see the rain falling from the sky, the steam coming from your coffee cup, a car passing by on the street.

This was the world that Milena Channing claimed to see, back in 2000, shortly after she was blinded by a stroke at 29 years old. But when she told her doctors about these strange apparitions, they looked at her brain scans (the stroke had destroyed basically her entire primary visual cortex, the receiving station of visual information to the brain), and told her she must be hallucinating.

“You’re blind and that’s it,” Channing remembers them saying to her.

Frustrated and convinced these visions were real, Channing made her way from doctor to doctor until she finally found one who believed her: Dr. Gordon Dutton, an ophthalmologist in Glasgow.
He told her he’d once read about such a case — a soldier in World War I who, after a bullet injury to the head, could only see things in motion.

Riddoch’s phenomenon, Dutton told her it was called, named for the Scottish neurologist George Riddoch, who first described it in 1917. And then he prescribed her … a rocking chair!

Decoding visual object perception from fMRI

Figure: the fusiform face area (FFA, red) and the parahippocampal place area (PPA, blue).
A human observer, who was only given signals from the FFA and PPA of each participant, was able to estimate with 85% accuracy which of the two categories (faces or places) the participants were imagining.
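The decoding logic sketched above can be illustrated with a toy simulation. Everything here is an assumption for illustration (the effect sizes, the noise level, and the simple “whichever ROI responds more strongly wins” rule); none of these parameters come from the study:

```python
import random

random.seed(0)

def simulate_trial(category):
    """Synthetic ROI responses: imagining a face raises the FFA signal,
    imagining a place raises the PPA signal (hypothetical effect sizes)."""
    ffa = random.gauss(1.0 if category == "face" else 0.0, 0.5)
    ppa = random.gauss(1.0 if category == "place" else 0.0, 0.5)
    return ffa, ppa

def decode(ffa, ppa):
    """Decision rule analogous to the human observer's judgement:
    whichever ROI responds more strongly names the imagined category."""
    return "face" if ffa > ppa else "place"

# 200 simulated trials, half faces and half places.
trials = [(simulate_trial(c), c) for c in ["face", "place"] * 100]
correct = sum(decode(*signals) == label for signals, label in trials)
accuracy = correct / len(trials)
print(f"decoding accuracy: {accuracy:.0%}")
```

With well-separated category responses the toy decoder lands well above chance, which is the qualitative point of the FFA/PPA result, not a reproduction of the reported 85% figure.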

Decoding mental states from brain activity in humans
John-Dylan Haynes and Geraint Rees
Nature Reviews Neuroscience 7, 523-534 (July 2006)

Recent advances in human neuroimaging have shown that it is possible to accurately decode a person’s conscious experience based only on non-invasive measurements of their brain activity. Such ‘brain reading’ has mostly been studied in the domain of visual perception, where it helps reveal the way in which individual experiences are encoded in the human brain.
The same approach can also be extended to other types of mental state, such as covert attitudes and lie detection. Such applications raise important ethical issues concerning the privacy of personal thought.

cited by:
Introduction to Neuroeconomics: how the brain makes decisions
Coursera. July 2014

Failure of Inhibitory Control on an Ambiguous Distractor

Figure: bidirectional interactions between the LPFC and MT+.

Greater disruption due to failure of inhibitory control on an ambiguous distractor.
Tsushima Y, Sasaki Y, Watanabe T
Science 314:1786-1788 (15 December 2006)

Second, the results may reveal important bidirectional interactions between a cognitive controlling system and the visual system.
The LPFC, which has been suggested to provide inhibitory control on task-irrelevant signals (22–26), may have a higher detection threshold for incoming signals than the visual cortex.
Task-irrelevant signals around the threshold level may be sufficiently strong to be processed in the visual system but not strong enough for the LPFC to notice and, therefore, to provide effective inhibitory control on the signals (Fig. 4A).
In this case, such signals may remain uninhibited, take more resources for a task-irrelevant distractor, leave fewer resources for a given task (32, 33), and disrupt task performance more than suprathreshold signals.
On the other hand, suprathreshold coherent motion may be noticed, may be given successful inhibitory control by the LPFC, and may leave more resources for a task (Fig. 4B) (22–26).
This mechanism may underlie the present paradoxical finding that subthreshold task-irrelevant stimuli activate the visual area strongly and disrupt task performance more than some suprathreshold stimuli.
It could also be one of the reasons why subthreshold stimuli often lead to relatively robust effects (2, 11, 14).
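The proposed mechanism can be read as two detection thresholds, with disruption greatest for signals that clear the lower (visual cortex) threshold but not the higher (LPFC) one. The sketch below is a hypothetical illustration of that logic only; the threshold values are invented, not taken from the paper:

```python
# Hypothetical thresholds; the paper reports the qualitative ordering
# (visual cortex threshold < LPFC threshold), not numeric values.
VISUAL_THRESHOLD = 0.2   # visual cortex registers the motion signal
LPFC_THRESHOLD = 0.5     # LPFC notices it and can apply inhibitory control

def task_disruption(signal):
    """Return a qualitative disruption level for a task-irrelevant signal."""
    if signal <= VISUAL_THRESHOLD:
        return "none"    # too weak to be processed at all
    if signal <= LPFC_THRESHOLD:
        return "high"    # processed by visual cortex, escapes LPFC inhibition
    return "low"         # noticed by LPFC, so inhibitory control engages

print(task_disruption(0.1))   # none
print(task_disruption(0.35))  # high (the paradoxical subthreshold case)
print(task_disruption(0.8))   # low
```

The middle case is the paradoxical finding: a subthreshold task-irrelevant stimulus disrupts performance more than a stronger, suprathreshold one.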

MT+: human middle temporal visual area
LPFC: lateral prefrontal cortex

cited by: