Winners of Neukom Graduate Fellowships have been announced for the 2012-2013 academic year. Fellowships will provide a full year of funding, including stipend and benefits, to Ph.D. students engaged in faculty-advised research in the development of novel computational techniques as well as the application of computational methods to problems in the Sciences, Social Sciences, Humanities, and the Arts.
The 2012-2013 winners are:
Self-regulation allows people to make plans, choose from alternatives, control impulses, inhibit unwanted thoughts, and regulate social behavior. Although humans have an impressive capacity for self-regulation, failures are common and people lose control of their behavior in a wide variety of circumstances (Heatherton & Wagner, 2011). Such failures are an important cause of several contemporary societal and health problems: obesity, addiction, poor financial decisions, sexual infidelity, and so on. Conversely, those who are better able to self-regulate demonstrate improved relationships, increased job success, and better mental health (Tangney et al., 2004) and are less at risk for developing addiction problems or engaging in risky sexual behavior (Quinn & Fromme, 2010).
An understanding of the circumstances under which people fail at self-regulation—as well as the brain mechanisms associated with those failures—can provide valuable insights into how people regulate and control their thoughts, behaviors, and emotions. Although psychologists have made considerable progress in identifying individual differences in the capacity for self-control (Mischel et al., 2010) as well as the contextual circumstances that support or disrupt self-regulation, researchers are only beginning to examine basic mechanisms and their underlying neural substrates (see Heatherton, 2011). Psychologists have developed complex theories and models of self-regulation, although most of these theories fail to consider underlying neurobiological functions or structures. Cognitive neuroscience research suggests that successful self-regulation depends on top-down control from the prefrontal cortex over subcortical regions involved in reward and emotion. In order to achieve progress in understanding the underlying neural mechanisms of self-regulation, however, researchers need to begin to consider the complex interactions across networks of brain regions and how such interactions may give rise to individual differences in the ability to self-regulate. To accomplish this goal, computational approaches, such as those described here, should be applied to determine the most relevant features from the massive MRI and behavioral datasets researchers currently acquire.
The guiding hypothesis of this research is that individual differences in cue-reactivity to food images and in the integrity of the fronto-parietal network are associated with long-term success or failure in self-regulation. To our knowledge, this is the first multi-modal imaging study of this type in dieters. In summary, this study seeks to determine neural correlates of self-regulation by applying cutting-edge computational approaches to brain imaging data of individuals starting a diet or fitness program, potentially providing an important step towards understanding how neural activity and connectivity subserve the ability to initiate weight loss and, in a broader context, to self-regulate.
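The feature-selection step mentioned above can be illustrated with a minimal, hypothetical sketch. Everything here is invented for illustration: the simulated participants, the region-level features, and the simple correlation-based ranking are stand-ins, not the project's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-in for an imaging dataset: 200 participants by 500
# region-level features (e.g., cue-reactivity betas, connectivity strengths).
n_subjects, n_features = 200, 500
X = rng.standard_normal((n_subjects, n_features))

# Simulated behavioral outcome (e.g., weight change), made to depend on a
# handful of "true" features so the ranking has something to recover.
true_idx = np.array([3, 42, 7])
y = X[:, true_idx] @ np.array([0.9, -0.8, 0.7]) + 0.3 * rng.standard_normal(n_subjects)

# Rank features by absolute Pearson correlation with the outcome.
Xc = X - X.mean(axis=0)
yc = y - y.mean()
r = (Xc.T @ yc) / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc))
ranking = np.argsort(-np.abs(r))

# The planted features should rise to the top of the ranking.
print(ranking[:10])
```

Univariate screening like this is only a first pass; multivariate methods would be needed to capture the network-level interactions the proposal emphasizes.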
Louis-David Lord, Paul Expert, Jeremy F. Huckins and Federico E. Turkheimer (2013). Cerebral energy metabolism and the brain's functional network architecture: an integrative review. Journal of Cerebral Blood Flow & Metabolism, advance online publication, 12 June 2013; doi:10.1038/jcbfm.2013.94
Object localization is a fundamental function of the human perceptual system, one that has been studied mostly in the visual domain. One challenge is that, because of movements of the eyes or of the objects and the delays in computing location, this localization function needs to be predictive. Several visual illusions reveal this predictive dependency between motion and object localization: for example, when a flashed target is presented close to a moving texture, the perceived position of the target is shifted in the direction of the motion following the flash, revealing a prediction for where the target would be if it had moved along with the background. However, we also use cues from other modalities, such as sound and touch, to localize objects around us.
In this project, we explore predictive localization in the auditory domain as well as interactions between visual motion and auditory localization. These studies examine whether rules of object localization are similar across sensory modalities.
Music is universal. This fact has led many researchers to wonder whether music itself has an adaptive function or whether it is merely evolutionary "cheesecake"—a pleasurable byproduct of adaptive non-musical processes (Pinker, 1997). There is strong evidence that musical understanding, like language, is subserved by statistical learning processes (Huron, 2006; Patel, 2008). That is, our enjoyment of music, our sense of anticipation, tension, and cathartic release, depends on our ability to predict the way the music unfolds, and on whether or not it conforms to our predictions. We suggest that music not only depends on prediction but plays an active role in fine-tuning the predictive processes in the listener. Thus, one evolutionary function of experiencing music, and other temporal arts, may be training and refining the capacity for statistical learning. Here we propose a three-part study to investigate whether music tunes the brain to statistical structure.
Our previous work demonstrated that music and movement share a dynamic structure (Sievers et al., 2013). This led to the conception of a more general theory that music provides a "cognitive sandbox": an off-line mental space for the exploration and learning of statistical relationships that exist in the physical and social world. This study provides the first specific test of this theory by examining whether the statistical structure embedded in music is learned over time, whether that learning can be indexed by neural population coding, and whether that learning transfers across domains.
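The statistical-learning account above can be made concrete with a toy sketch. The melodies and note names below are invented, and a first-order transition model is only one simple stand-in for the kind of statistical learning the proposal has in mind:

```python
from collections import Counter, defaultdict

# Toy corpus of melodies as note-name sequences (hypothetical data).
melodies = [
    ["C", "D", "E", "C", "D", "E", "F", "E", "D", "C"],
    ["C", "D", "E", "F", "G", "F", "E", "D", "C"],
    ["E", "F", "G", "E", "D", "C", "D", "E"],
]

# Learn first-order transition probabilities P(next | current) by counting
# adjacent note pairs across the corpus.
counts = defaultdict(Counter)
for mel in melodies:
    for cur, nxt in zip(mel, mel[1:]):
        counts[cur][nxt] += 1

def prob(cur, nxt):
    total = sum(counts[cur].values())
    return counts[cur][nxt] / total if total else 0.0

# A continuation the corpus makes expected gets high probability; an
# unheard one gets low probability, modeling surprise.
print(prob("C", "D"), prob("C", "G"))  # → 1.0 0.0
```

A listener's sense of expectation and surprise would then correspond to high- and low-probability continuations under the learned model; the proposed study asks whether such learned structure transfers beyond music.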
Sievers B., Polansky L., Casey M., Wheatley T. (2013). Music and Movement Share a Dynamic Structure that Supports Universal Expressions of Emotion. Proceedings of the National Academy of Sciences. doi:10.1073/pnas.1209023110
Last Updated: 3/4/14