Research Achievements

日髙 聡太

ヒダカ ソウタ  (Souta Hidaka)

Basic Information

Affiliation
Professor, Department of Psychology, Faculty of Human Sciences, Sophia University
Degree
Ph.D. in Literature (Tohoku University)

Researcher Number
40581161
ORCID ID
https://orcid.org/0000-0001-6727-5322
J-GLOBAL ID
201101066235007859
researchmap Member ID
B000000665

External Links

Committee Memberships

 7

Papers

 59
  • Na Chen, Souta Hidaka, Naomi Ishii, Makoto Wada
    Frontiers in Psychiatry 15 September 5, 2024
    Introduction: Various genetic mutations have been implicated in autism spectrum disorder (ASD). Some candidate genes for ASD are known to be related to signal transduction and may be involved in hand development as well as neurodevelopment. Therefore, although subtle, anatomical variations in hand configurations may be observed in individuals with ASD. However, except for research on the finger ratio, which has been suggested to be related to prenatal sex hormone exposure, only a few studies have been conducted. Given the spectrum characteristics of ASD, we explored whether hand configurations are associated with ASD-related traits in the general population. Methods: Photographs of the dorsal surface of each hand were obtained, and the distances between the metacarpophalangeal joints and finger lengths were measured. The Autism Spectrum Quotient, Empathy Quotient, and Systemizing Quotient were used to evaluate ASD-related traits. Results: We found a significant positive correlation between the aspect ratio of the right hand and the Systemizing Quotient score: individuals with a larger width relative to the finger length showed more systemizing traits. Discussion: These findings suggest that gene polymorphisms or prenatal sex hormone exposure may underlie the relationship between systemizing traits and hand configurations.
  • Souta Hidaka, Raffaele Tucciarelli, Salma Yusuf, Fabiana Memmolo, Sampath Rajapakse, Elena Azañón, Matthew R. Longo
    Journal of Experimental Psychology: Human Perception and Performance August 15, 2024  Peer-reviewed, Lead author, Corresponding author
  • Kyuto Uno, Souta Hidaka
    Psychonomic Bulletin & Review January 3, 2024  Peer-reviewed
  • 日高聡太, 川越 敏和, 浅井 暢子, 寺本 渉
    心理学研究 December 2023  Peer-reviewed, Lead author, Corresponding author
  • Souta Hidaka, Miyu Takeshima, Toshikazu Kawagoe
    i-Perception 14(6) 1-14 November 2023  Peer-reviewed, Lead author, Corresponding author
  • Souta Hidaka, Na Chen, Naomi Ishii, Risa Iketani, Kirino Suzuki, Matthew R. Longo, Makoto Wada
    Autism Research 16(9) 1750-1764 July 6, 2023  Peer-reviewed, Lead author, Corresponding author
    Abstract People with autism spectrum disorder (ASD) or higher levels of autistic traits have atypical characteristics in sensory processing. Atypicalities have been reported for proprioceptive judgments, which are tightly related to internal bodily representations underlying position sense. However, no research has directly investigated whether self‐bodily representations are different in individuals with ASD. Implicit hand maps, estimated based on participants' proprioceptive sensations without sight of their hand, are known to be distorted such that the shape is stretched along the medio‐lateral hand axis even for neurotypical participants. Here, with the view of ASD as falling on a continuous distribution among the general population, we explored differences in implicit body representations along with autistic traits by focusing on relationships between autistic traits and the magnitudes of the distortions in implicit hand maps (N ~ 100). We estimated the magnitudes of distortions in implicit hand maps both for fingers and hand surfaces on the dorsal and palmar sides of the hand. Autistic traits were measured by questionnaires (Autism Spectrum [AQ] and Empathy/Systemizing [EQ‐SQ] Quotients). The distortions in implicit hand maps were replicated in our experimental situations. However, there were no significant relationships between autistic traits and the magnitudes of the distortions as well as within‐individual variabilities in the maps and localization performances. Consistent results were observed from comparisons between IQ‐matched samples of people with and without a diagnosis of ASD. Our findings suggest that there exist perceptual and neural processes for implicit body representations underlying position sense consistent across levels of autistic traits.
  • Souta Hidaka, Mizuho Gotoh, Shinya Yamamoto, Makoto Wada
    Scientific Reports 13(1) April 11, 2023  Peer-reviewed, Lead author, Corresponding author
    Abstract The number of clinical diagnoses of autism spectrum disorder (ASD) is increasing annually. Interestingly, the human body temperature has also been reported to gradually decrease over the decades. An imbalance in the activation of the excitatory and inhibitory neurons is assumed to be involved in the pathogenesis of ASD. Neurophysiological evidence showed that brain activity decreases as cortical temperature increases, suggesting that an increase in brain temperature enhances the inhibitory neural mechanisms. Behavioral characteristics specific to clinical ASD were observed to be moderated when people with the diagnoses had a fever. To explore the possible relationship between ASD and body temperature in the general population, we conducted a survey study using a large population-based sample (N ~ 2000, in the age groups 20s to 70s). Through two surveys, multiple regression analyses did not show significant relationships between axillary temperatures and autistic traits measured by questionnaires (Autism Spectrum (AQ) and Empathy/Systemizing Quotients), controlling for covariates of age and self-reported circadian rhythms. Conversely, we consistently observed a negative relationship between AQ and age. People with higher AQ scores tended to have stronger eveningness. Our findings contribute to the understanding of age-related malleability and the irregularity of circadian rhythms related to autistic traits.
  • Souta Hidaka, Raffaele Tucciarelli, Elena Azañón, Matthew R Longo
    Journal of experimental psychology. Human perception and performance 48(12) 1427-1438 December 2022  Peer-reviewed, Lead author, Corresponding author
    Orientation information contributes substantially to our tactile perception, such as feeling an object's shape on the skin. For vision, a perceptual adaptation aftereffect (tilt aftereffect; TAE), which is well explained by neural orientation selectivity, has been used to reveal fundamental perceptual properties of orientation processing. Neural orientation selectivity has been reported in somatosensory cortices. However, little research has investigated the perceptual characteristics of the tactile TAE. The aim of the current study was to provide the first demonstration of a tactile TAE on the hand and investigate the perceptual nature of tactile TAE on the hand surface. We used a 2-point stimulation with minimal input for orientation. We found clear TAEs on the hand surface: Adaptation induced shifts in subjective vertical sensation toward the orientation opposite to the adapted orientation. Further, adaptation aftereffects were purely based on orientation processing given that the effects transferred between different lengths across adaptor and test stimuli and type of stimuli. Finally, adaptation aftereffects were anchored to the hand: tactile TAE occurred independently of hand rotation and transferred from palm to dorsum sides of the hand, while the effects did not transfer between hands. Our findings demonstrate the existence of hand-centered perceptual processing for basic tactile orientation information. (PsycInfo Database Record (c) 2022 APA, all rights reserved).
  • Yosuke Suzuishi, Souta Hidaka
    i-Perception 13(1) 204166952110592-204166952110592 January 2022  Peer-reviewed
    Vision of the body without task cues enhances tactile discrimination performance. This effect has been investigated only with static visual information, although our body usually moves, and dynamic visual and bodily information provides sensations of ownership (SoO) and agency (SoA) over body parts. We investigated whether vision of body movements could enhance tactile discrimination performance. Participants observed white dots without any textural information showing lateral hand movements (dynamic condition) or static hands (static condition). For participants experiencing the dynamic condition first, it induced a lower tactile discrimination threshold, as well as a stronger SoO and SoA, compared to the static condition. For participants observing the static condition first, the magnitudes of the enhancement effect in the dynamic condition were positively correlated between the tactile discrimination and SoO/SoA measures. The enhancement effect of the dynamic visual information was not observed when the hand shape was not maintained in the scrambled white-dot images. Our results suggest that dynamic visual information without task cues can enhance tactile discrimination performance through feelings of SoO and SoA, but only when it maintains bodily information.
  • Souta Hidaka, Kyoshiro Sasaki, Toshikazu Kawagoe, Nobuko Asai, Wataru Teramoto
    Scientific Reports 11(1) 8651-8651 December 2021  Peer-reviewed, Lead author, Corresponding author
    Abstract: Our bodily sensation is a fundamental cue for our self-consciousness. Whereas experimental studies have uncovered characteristics of bodily sensation, these studies investigated bodily sensations by manipulating them to be apart from one's own body and to be assigned to external, body-like objects. In order to capture our bodily sensation as it is, this questionnaire survey study explored the characteristics of bodily sensation using a large population-based sample (N = 580, comprising 20s to 70s age groups) without experimental manipulations. We focused on the sensations of ownership, the feeling of having a body part as one's own, and agency, the feeling of controlling a body part by oneself, in multiple body parts (the eyes, ears, hands, legs, nose, and mouth). The ownership and agency sensations were positively related to each other in each body part. Interestingly, the agency sensation of the hands and legs had a positive relationship with the ownership sensations of the other body parts. We also found that the 60s age group had a unique internal configuration, assessed by the similarity of rating scores, of the body parts for each bodily sensation. Our findings revealed the existence of unique characteristics for bodily sensations in a natural state.
  • Souta Hidaka, Luigi Tamè, Matthew R Longo
    Cognition 209 104569-104569 December 31, 2020  Peer-reviewed, Lead author, Corresponding author
    Perceptual completion is a fundamental perceptual function serving to maintain robust perception against noise. For example, we can perceive a vivid experience of motion even for the discrete inputs across time and space (apparent motion: AM). In vision, stimuli irrelevant to AM perception are suppressed to maintain smooth AM perception along the AM trajectory where no physical inputs are applied. We investigated whether such perceptual masking induced by perceptual completion of dynamic inputs is general across sensory modalities by focusing on touch. Participants tried to detect a vibro-tactile target stimulus presented along the trajectory of AM induced by two other tactile stimuli on the forearm. In a control condition, the inducing stimuli were applied simultaneously, resulting in no motion percept. Tactile target detection was impaired with tactile AM. Our findings support the notion that the perceptual masking induced by perceptual completion mechanism of AM is a general function rather than a sensory specific effect.
  • Ayako Yaguchi, Souta Hidaka
    Multisensory research 34(5) 1-16 December 8, 2020  Peer-reviewed
    Autism spectrum disorder (ASD) is characterized by atypical social communication and restricted and repetitive behaviors; such traits are continuously distributed across nonclinical and clinical populations. Recently, relationships between ASD traits and low-level multisensory processing have been investigated, because atypical sensory reactivity has been regarded as a diagnostic criterion of ASD. Studies regarding an audiovisual illusion (the double-flash illusion) reported that social communication difficulties are related to temporal aspects of audiovisual integration. This study investigated whether similar relationships exist in another audiovisual illusion (the stream-bounce effect). In this illusion, two visual objects move toward each other, coincide, and pass each other, and the presentation of a transient sound at their coincidence induces a dominant perception that they bounce away from each other. Typically developing adults were recruited to perform experimental trials involving the stream-bounce effect. We measured their ASD traits using the Autism-Spectrum Quotient. The total quotient score was not related to any behavioral measurements of the effect. In contrast, for participants with higher difficulty in communication, the greatest magnitude of the stream-bounce effect occurred when the presentation timing of the sound tended to follow the visual coincidence. Participants with higher difficulty in imagination also showed the greatest magnitude of the effect when the presentation timing of the sound preceded that of the visual coincidence. Our findings regarding the stream-bounce effect, along with previous findings regarding the double-flash illusion, suggest that atypical temporal audiovisual integration is uniquely related to ASD sub-traits, especially in social communication.
  • Yosuke Suzuishi, Souta Hidaka, Scinob Kuroki
    Scientific reports 10(1) 13929-13929 August 18, 2020  Peer-reviewed
    We perceive the roughness of an object through our eyes and hands. Many crossmodal studies have reported that there is no clear visuo-tactile interaction in roughness perception using static visual cues. One exception is that the visual observation of task-irrelevant hand movements, not the texture of task-relevant objects, can enhance the performance of tactile roughness discrimination. Our study investigated whether task-irrelevant visual motion without either object roughness or bodily cues can influence tactile roughness perception. Participants were asked to touch abrasive papers while moving their hand laterally and viewing moving or static sine wave gratings without being able to see their hand, and to estimate the roughness magnitude of the tactile stimuli. Moving gratings with a low spatial frequency induced smoother roughness perceptions than static visual stimuli when the visual grating moved in the direction opposite the hand movements. The effects of visual motion did not appear when the visual stimuli had a high spatial frequency or when the participants touched the tactile stimuli passively. These results indicate that simple task-irrelevant visual movement without object roughness or bodily cues can modulate tactile roughness perception with active body movements in a spatial-frequency-selective manner.
  • Souta Hidaka, Luigi Tamè, Antonio Zafarana, Matthew R. Longo
    Cortex 128 124-131 July 2020  Peer-reviewed, Lead author, Corresponding author
    Spatial distortions in touch have been investigated since the 19th century. For example, two touches applied to the hand dorsum feel farther apart when aligned with the mediolateral axis (i.e., across the hand) than when aligned with the proximodistal axis (along the hand). Stimulations to our sensory receptors are usually dynamic, where spatial and temporal inputs closely interact to establish our percept. For example, physically bigger tactile stimuli are judged to last longer than smaller stimuli. Given such links between space and time in touch, we investigated whether there is a tactile anisotropy in temporal perception analogous to the anisotropy described above. In this case, the perceived duration between the onset of two touches should be larger when they are aligned with the mediolateral than with the proximodistal axis of the hand dorsum. To test this hypothesis, we asked participants to judge which of two tactile temporal sequences, having the same spatial separation along and across the dorsum, felt longer. A clear anisotropy of the temporal perception was observed: temporal intervals across the hand were perceived as longer than those along the hand. Consistent with the spatial anisotropy, the temporal anisotropy did not appear on the palm side of the hand, indicating that the temporal anisotropy was based on perceptual processes rather than top-down modulations such as attentional or decisional/response biases. Contrary to our predictions, however, we found no correlation between the magnitudes of the temporal and spatial anisotropies. Our results demonstrated a novel type of temporal illusion in touch, which is strikingly similar in nature to the previously reported spatial anisotropy. Thus, qualitatively similar distorted somatosensory representations appear to underlie both temporal and spatial processing of touch.
  • Souta Hidaka, Raffaele Tucciarelli, Elena Azañón, Matthew R Longo
    Acta psychologica 208 103090-103090 May 30, 2020  Peer-reviewed, Lead author, Corresponding author
    Recent studies have demonstrated that mental representations of the hand dorsum are distorted even for healthy participants. Perceptual hand maps estimated by pointing to specific landmarks (e.g., knuckles and tips of fingers) are stretched and shrunk along the medio-lateral and the proximo-distal axes, respectively. Similarly, tactile distance perception between two touches is longer along the medio-lateral axis than the proximo-distal axis. The congruency of the two types of distortions suggests that common perceptual and neural representations may be involved in these processes. Prolonged stimulation by two simultaneous touches at a particular distance can bias subsequent perception of tactile distances (e.g., adaptation to a long distance causes shorter stimuli to be perceived as even shorter). This tactile distance adaptation aftereffect has been suggested to occur based on the modulation of perceptual and neural responses at low somatosensory processing stages. The current study investigated whether tactile distance adaptation aftereffects also affect the pattern of distortions in the perceptual hand maps. Participants localized locations on the hand dorsum cued by tactile stimulations (Experiment 1) or visually presented landmarks on a hand silhouette (Experiment 2). Each trial was preceded by adaptation to either a small (2 cm) or large (4 cm) tactile distance. We found clear tactile distance aftereffects. However, no changes were observed in the distorted pattern of the perceptual hand maps following adaptation to a tactile distance. Our results showed that the internal body representations involved in perceptual distortions may be distinct between tactile distance perception and the perceptual hand maps underlying position sense.
  • Ayako Yaguchi, Souta Hidaka
    Perception 49(4) 405-421 April 2, 2020  Peer-reviewed
    Autism spectrum disorder (ASD) refers to neurodevelopmental disorders characterized by symptoms such as social deficits and restricted interests and behavior. Several studies have investigated specific sensory processing in relation to ASD traits. However, findings appear to be inconsistent and inconclusive because of variation in ASD traits among participants and differences in the tasks adopted. In this study, we investigated relationships between sensory thresholds in the visual, auditory, and tactile modalities and various ASD traits to account for individual variability of traits in typically developing adults using the same experimental tasks. We estimated detection and discrimination thresholds for brightness, sound pressure, and vibrotactile stimulus strength. We also estimated the degree of ASD traits in each participant with a questionnaire. We found that higher tactile detection and visual discrimination thresholds were related to ASD traits in difficulty of communication. A lower tactile discrimination threshold and a higher visual detection threshold were also related to the ASD trait of strong focus of attention. These findings suggest the existence of unique relationships between particular low-level sensory processing and specific ASD traits, indicating that irregularities in sensory processing may underlie variation in ASD traits.
  • Hidaka S, Suzuishi Y, Ide M, Wada M
    Scientific reports 8(1) 17018-17018 November 2018  Peer-reviewed, Lead author, Corresponding author
  • Souta Hidaka, Yosuke Suzuishi, Norimichi Kitagawa
    Perception 47(10-11) 1070-1080 October 2018  Peer-reviewed
  • Sugita Y, Hidaka S, Teramoto W
    Scientific reports 8(1) 13396-13396 September 6, 2018  Peer-reviewed
  • Ayako Yaguchi, Souta Hidaka
    Multisensory Research 31(6) 523-536 2018  Peer-reviewed
    Autism spectrum disorder (ASD) is a neurodevelopmental disorder characterized by deficits in social communication and interaction, and restricted interests and behavior patterns. These characteristics are considered as a continuous distribution in the general population. People with ASD show atypical temporal processing in multisensory integration. Regarding the flash-beep illusion, which refers to how a single flash can be illusorily perceived as multiple flashes when multiple auditory beeps are concurrently presented, some studies reported that people with ASD have a wider temporal binding window and greater integration than typically developed people; others found the opposite or inconsistent tendencies. Here, we investigated the relationships between the manner of the flash-beep illusion and the various dimensions of ASD traits by estimating the degree of typically developed participants' ASD traits, including five subscales, using the Autism-Spectrum Quotient. We found that stronger ASD traits of communication and social skill were associated with a wider and a narrower temporal binding window, respectively. These results suggest that specific ASD traits are differently involved in the particular temporal binding processes of audiovisual integration.
  • Hidaka Souta, Yaguchi Ayako
    MULTISENSORY RESEARCH 31(8) 729-751 2018  Peer-reviewed
  • Teramoto, W., Hidaka, S., Sugita, Y.
    Spatial Biases in Perception and Cognition 2018
  • Souta Hidaka, Satomi Higuchi, Wataru Teramoto, Yoichi Sugita
    ACTA PSYCHOLOGICA 178 66-72 July 2017  Peer-reviewed
    Studies of crossmodal interactions in motion perception have reported activation in several brain areas, including those related to motion processing and/or sensory association, in response to multimodal (e.g., visual and auditory) stimuli that were both in motion. Recent studies have demonstrated that sounds can trigger illusory visual apparent motion to static visual stimuli (sound-induced visual motion: SIVM): A visual stimulus blinking at a fixed location is perceived to be moving laterally when an alternating left-right sound is also present. Here, we investigated brain activity related to the perception of SIVM using a 7 T functional magnetic resonance imaging technique. Specifically, we focused on the patterns of neural activities in SIVM and visually induced visual apparent motion (VIVM). We observed shared activations in the middle occipital area (V5/hMT), which is thought to be involved in visual motion processing, for SIVM and VIVM. Moreover, as compared to VIVM, SIVM resulted in greater activation in the superior temporal area and dominant functional connectivity between the V5/hMT area and the areas related to auditory and crossmodal motion processing. These findings indicate that similar but partially different neural mechanisms could be involved in auditory-induced and visually-induced motion perception, and that neural signals in auditory, visual, and crossmodal motion processing areas closely and directly interact in the perception of SIVM.
  • Masakazu Ide, Souta Hidaka, Hanako Ikeda, Makoto Wada
    SCIENTIFIC REPORTS 6 37301-37301 November 2016  Peer-reviewed
    Crossmodal studies have demonstrated inhibitory as well as facilitatory neural effects in higher sensory association and primary sensory cortices. A recent human behavioral study reported touch-induced visual perceptual suppression (TIVS). Here, we introduced an experimental setting in which TIVS could occur and investigated brain activities underlying visuo-tactile interactions using a functional magnetic resonance imaging technique. While the suppressive effect of touch on vision was only found for half of the participants who could maintain their baseline performance above chance level (i.e., TIVS was not well replicated here), we focused on individual differences in the effect of touch on vision. This effect could be suppressive or facilitatory, and the neuronal basis of these differences was analyzed. We found larger inhibitory responses in the anterior part of the right visual cortex (V1, V2) with higher TIVS magnitude when visuo-tactile stimuli were presented as spatially congruent. Activations in the right anterior superior temporal region, including the secondary somatosensory cortical area, were more strongly related to those in the visual cortex (V1, V2) with higher TIVS magnitude. These results indicate that inhibitory neural modulations from somatosensory to visual cortices and the resulting inhibitory neural responses in the visual cortex could be involved in TIVS.
  • Tatsuya Sato, Hazime Mizoguchi, Ayumu Arakawa, Souta Hidaka, Miki Takasuna, Yasuo Nishikawa
    JAPANESE PSYCHOLOGICAL RESEARCH 58 110-128 June 2016  Peer-reviewed
    Only a few Japanese psychologists have been interested in the history of psychology. The historiography, or the methodology of historical description, for the history of psychology has been ignored. Moreover, chairs for professors, academic journals, academic meetings, and an archive of the history of psychology remain to be established. In this paper, the history of the "history of psychology" in Japan is explored. This academic sub-discipline has emerged only during the past few decades, and a theory-driven historiography covering the history of psychology was initiated in Japan only at the end of the 20th century. Activities on the history of Japanese psychology can be divided into three phases: (a) the traditional history (translations and introduction of foreign knowledge and celebration of history); (b) the transition phase (the movement towards social criticism and archival research from within the discipline); and (c) the history based on the historiography. Lastly, recent trends in the history of psychology in Japan are examined and discussed.
  • 池田 華子, 田中 智明, 日高 聡太, 石山 智弘, 宮崎 弦太
    認知科学 23(2) 101-117 2016
    In relation to the recent development of ultra-high-definition (4K) imaging, which has quadruple the number of pixels of high-definition (HD) imaging, it has been reported that observers' subjective impressions differ between these image formats. The present study examined how differences in resolution (4K and HD imaging) influence subjective impressions of movies in association with movie contents (natural/artificial objects) and fields of view (wide/medium/narrow) (Experiment 1). We also investigated the effects of the quantity of motion on subjective impressions of movies at different image resolutions with a frame rate (59.94 fps) higher than that in the previous study (23.98 fps) (Experiment 2). We found that 4K movies, as compared to HD movies, induced stronger impressions regarding evaluation and comfort, especially when they were presented with natural scenes and/or a larger field of view. It was also shown that 4K movies with the higher frame rate induced stronger impressions regarding desirability and comfort regardless of motion quantities, contrary to the previous finding that 4K movies with larger quantities of motion gave observers lower impressions regarding desirability and comfort than HD movies. These results demonstrate that differences in image resolution can modulate subjective impressions of movies in accordance with differences in movie contents, fields of view, and frame rate. Moreover, the current findings suggest that there exist some desirable conditions under which ultra-high-definition imaging can effectively enhance observers' subjective impressions of movies.
  • Hidaka Souta, Teramoto Wataru, Sugita Yoichi
    FRONTIERS IN INTEGRATIVE NEUROSCIENCE 9 62-62 December 22, 2015  Peer-reviewed
  • Souta Hidaka, Masakazu Ide
    SCIENTIFIC REPORTS 5 10483-10483 May 2015  Peer-reviewed
    In a single modality, the percept of an input (e.g., voices of neighbors) is often suppressed by another (e.g., the sound of a car horn nearby) due to close interactions of neural responses to these inputs. Recent studies have also suggested that close interactions of neural responses could occur even across sensory modalities, especially for audio-visual interactions. However, direct behavioral evidence regarding the audio-visual perceptual suppression effect has not been reported in a study with humans. Here, we investigated whether sound could have a suppressive effect on visual perception. We found that white noise bursts presented through headphones degraded visual orientation discrimination performance. This auditory suppression effect on visual perception frequently occurred when these inputs were presented in a spatially and temporally consistent manner. These results indicate that the perceptual suppression effect could occur across auditory and visual modalities based on close and direct neural interactions among those sensory inputs.
  • 池田 華子, 田中 智明, 石山 智弘, 日高 聡太, 宮崎 弦太
    日本感性工学会論文誌 14(3) 369-379 2015
    Ultra-high definition (4K) imaging allows us to achieve considerably higher image quality than does high definition (HD) imaging. The present study examined how 4K and HD imaging could influence subjective impressions of movies differently, in association with the quantities of motion and fields of view of these movies. We found that stronger impressions regarding comfort and impact were evoked for 4K movies with smaller quantities of motion and a medium field of view. Stronger perceptions of impact occurred for HD movies with larger quantities of motion and a larger field of view. HD movies also gave a stronger impression regarding dynamics regardless of motion quantities. Additionally, HD movies down-converted from 4K movies tended to induce higher impressions regarding evaluation and comfort in some situations. These results suggest that subjective impressions of movies are influenced by differences in image resolution, as well as by interactions between imaging types and characteristics of movie contents.
  • Souta Hidaka, Kazumasa Shimoda
    MULTISENSORY RESEARCH 27(3-4) 189-205 2014  Peer-reviewed
    It has been reported that color can affect the judgment of taste. For example, a dark red color enhances the subjective intensity of sweetness. However, the underlying mechanisms of the effect of color on taste have not been fully investigated; in particular, it remains unclear whether the effect is based on cognitive/decisional or perceptual processes. Here, we investigated the effect of color on sweetness judgments using a taste adaptation method. A sweet solution whose color was subjectively congruent with sweetness was judged as sweeter than an uncolored sweet solution both before and after adaptation to an uncolored sweet solution. In contrast, subjective judgment of sweetness for uncolored sweet solutions did not differ between the conditions following adaptation to a colored sweet solution and following adaptation to an uncolored one. Color affected sweetness judgment when the target solution was colored, but the colored sweet solution did not modulate the magnitude of taste adaptation. Therefore, it is concluded that the effect of color on the judgment of taste would occur mainly in cognitive/decisional domains.
  • Masakazu Ide, Souta Hidaka
    SCIENTIFIC REPORTS 3 3453 December 2013  Peer-reviewed
    An input (e.g., airplane takeoff sound) to a sensory modality can suppress the percept of another input (e.g., talking voices of neighbors) of the same modality. This perceptual suppression effect is evidence that neural responses to different inputs closely interact with each other in the brain. While recent studies suggest that close interactions also occur across sensory modalities, a crossmodal perceptual suppression effect has not yet been reported. Here, we demonstrate that tactile stimulation can suppress the percept of visual stimuli: Visual orientation discrimination performance was degraded when a tactile vibration was applied to the index finger of the observer's hand. We also demonstrated that this tactile suppression effect on visual perception occurred primarily when the tactile and visual information were spatially and temporally consistent. The current findings indicate that neural signals can closely and directly interact with each other, sufficiently to induce the perceptual suppression effect, even across sensory modalities.
  • Junichi Takahashi, Souta Hidaka, Wataru Teramoto, Jiro Gyoba
    PSYCHOLOGICAL RESEARCH-PSYCHOLOGISCHE FORSCHUNG 77(6) 687-697 November 2013  Peer-reviewed
    Pattern redundancy is a key concept for representing the amount of internal mental load (encoding efficiency) needed for pattern perception/recognition. The present study investigated how pattern redundancy influences encoding and memory processes in the visual system using a rapid serial visual presentation (RSVP) paradigm. With RSVP, it is well known that participants often fail to detect repetitions of words (repetition blindness, RB). We used this phenomenon as an index of the encoding and storage of visual patterns. In three experiments, we presented patterns with higher and lower redundancy, as defined by Garner's equivalent set size (ESS). The results showed that RB occurred more frequently for higher redundancy patterns when the temporal distance between the targets was less than 500 ms; this tendency was reversed with longer temporal distances of over 500 ms. Our results suggest that pattern redundancy modulates both the early encoding and subsequent memory processes of a representation.
  • Souta Hidaka, Wataru Teramoto, Mirjam Keetels, Jean Vroomen
    EXPERIMENTAL BRAIN RESEARCH 231(1) 117-126 November 2013  Peer-reviewed
    The brain tends to associate specific features of stimuli across sensory modalities. The pitch of a sound is, for example, associated with spatial elevation such that higher-pitched sounds are felt as being "up" in space and lower-pitched sounds as being "down." Here we investigated whether changes in the pitch of sounds could be as effective for visual motion perception as changes in the location of sounds. We demonstrated that only sounds that alternate in up/down location induced illusory vertical motion of a static visual stimulus, while sounds that alternate in higher/lower pitch did not induce this illusion. The pitch of a sound did not even modulate the visual motion perception induced by sounds alternating in up/down location. Interestingly, though, sounds alternating in higher/lower pitch could become a driver for visual motion if they were paired in a previous exposure phase with vertical visual apparent motion. Thus, only after prolonged exposure did the pitch of a sound become an inducer of upper/lower visual motion. This occurred even if the pitch and location of the sounds were paired in an incongruent fashion during exposure. These findings indicate that pitch-space correspondence is not strong enough to drive or modulate visual motion perception. However, associative exposure could increase the saliency of pitch-space relationships, after which the pitch could induce visual motion perception by itself.
  • Wataru Teramoto, Maori Kobayashi, Souta Hidaka, Yoichi Sugita
    EXPERIMENTAL BRAIN RESEARCH 229(1) 97-102 August 2013  Peer-reviewed
    Visual motion aftereffects can occur contingent on arbitrary sounds. Two circles, placed side by side, were alternately presented, and the onsets were accompanied by tone bursts of high and low frequencies, respectively. After a few minutes of exposure to the visual apparent motion with the tones, a circle blinking at a fixed location was perceived as a lateral motion in the same direction as the previously exposed apparent motion (Teramoto et al. in PLoS One 5:e12255, 2010). In the present study, we attempted to reverse this contingency (pitch aftereffects contingent on visual information). Results showed that after prolonged exposure to the audio-visual stimuli, the apparent visual motion systematically affected the perceived pitch of the auditory stimuli. When the leftward apparent visual motion was paired with the high-low-frequency sequence during the adaptation phase, a test tone sequence was more frequently perceived as a high-low-pitch sequence when the leftward apparent visual motion was presented and vice versa. Furthermore, the effect was specific for the exposed visual field and did not transfer to the other side, thus ruling out an explanation in terms of simple response bias. These results suggest that new audiovisual associations can be established within a short time, and visual information processing and auditory processing can mutually influence each other.
  • Masakazu Ide, Souta Hidaka
    EXPERIMENTAL BRAIN RESEARCH 228(1) 43-50 July 2013  Peer-reviewed
    Perceptual systems can distinguish among a variety of inputs in the temporal domain, including even different sensory inputs. This process has been investigated mainly by using a temporal task (temporal order judgment: TOJ). For example, studies have reported that the estimated critical limits (just noticeable difference: JND) of the TOJ between a visual stimulus and a tactile stimulus (visuo-tactile TOJ, e.g., flashes and vibrations) fall within a certain temporal range. Recent studies have also suggested that the visual presentation of a hand image could modulate visuo-tactile integration in the temporal domain, but these studies did not thoroughly examine such effects by using temporal tasks. Here, we investigated the effect of the visual presentation of a hand image on visuo-tactile TOJ. In our experiments, a visual stimulus was presented on the index finger of a hand image and a tactile stimulus was presented on the index finger of a participant's hand. We found that the JND of the visuo-tactile TOJ became larger when a forward hand image was presented than when inverted hand or arrow images were presented. However, this effect was not observed for the TOJ between an auditory stimulus and a visual stimulus. Thus, the visual presentation of a hand image whose angle corresponds to that of one's own hand could selectively degrade visuo-tactile TOJ. This finding indicates that visual hand images implicitly enhance the internal proximity between the visual and tactile stimuli and make them difficult to distinguish from each other in the temporal domain.
  • Hidaka S, Nagai M
    Frontiers in psychology 4 196 April 19, 2013  Peer-reviewed
  • Akio Honda, Hiroshi Shibata, Souta Hidaka, Jiro Gyoba, Yukio Iwaya, Yôiti Suzuki
    i-Perception 4(4) 253-264 2013  Peer-reviewed
    We investigated the effects of listeners' head movements and proprioceptive feedback during sound localization practice on the subsequent accuracy of sound localization performance. The effects were examined under both restricted and unrestricted head movement conditions in the practice stage. In both cases, the participants were divided into two groups: a feedback group performed a sound localization drill with accurate proprioceptive feedback, while a control group conducted it without the feedback. Results showed that (1) sound localization practice, while allowing for free head movement, led to improvement in sound localization performance and decreased actual angular errors along the horizontal plane, and that (2) proprioceptive feedback during practice decreased actual angular errors in the vertical plane. Our findings suggest that unrestricted head movement and proprioceptive feedback during sound localization training enhance perceptual motor learning by enabling listeners to use variable auditory cues and proprioceptive information. © 2013 Honda et al.
  • Souta Hidaka, Hiroshi Shibata, Michiyo Kurihara, Akihiro Tanaka, Akitsugu Konno, Suguru Maruyama, Jiro Gyoba, Hiroko Hagiwara, Masatoshi Koizumi
    NEUROSCIENCE RESEARCH 73(1) 73-79 May 2012  Peer-reviewed
    We investigated brain activity in 3-5-year-old preschoolers as they listened to connected speech stimuli in Japanese (first language), English (second language), and Chinese (a rarely exposed, foreign language) using near-infrared spectroscopy. Unlike the younger preschoolers who had been exposed to English for almost 1 year, brain activity in the bilateral frontal regions of the older preschoolers who had been exposed to English for almost 2 years was higher for Japanese and English speech stimuli than for Chinese. This tendency seemed to be similar to that observed in adults who had learned English for some years. These results indicate that exposure to a second language affects brain activity to language stimuli among preschoolers. (c) 2012 Elsevier Ireland Ltd and the Japan Neuroscience Society. All rights reserved.
  • Maori Kobayashi, Wataru Teramoto, Souta Hidaka, Yoichi Sugita
    PLOS ONE 7(5) e36803 May 2012  Peer-reviewed
    Background: One possible strategy to evaluate whether signals in different modalities originate from a common external event or object is to form associations between inputs from different senses. This strategy would be quite effective because signals in different modalities from a common external event would then be aligned spatially and temporally. Indeed, it has been demonstrated that after adaptation to visual apparent motion paired with alternating auditory tones, the tones begin to trigger illusory motion perception to a static visual stimulus, where the perceived direction of visual lateral motion depends on the order in which the tones are replayed. The mechanisms underlying this phenomenon remain unclear. One important approach to understanding the mechanisms is to examine whether the effect has some selectivity in auditory processing. However, it has not yet been determined whether this aftereffect can be transferred across sound frequencies and between ears. Methodology/Principal Findings: Two circles placed side by side were presented in alternation, producing apparent motion perception, and each onset was accompanied by a tone burst of a specific and unique frequency. After exposure to this visual apparent motion with tones for a few minutes, the tones became drivers for illusory motion perception. However, the aftereffect was observed only when the adapter and test tones were presented at the same frequency and to the same ear. Conclusions/Significance: These findings suggest that the auditory processing underlying the establishment of novel audiovisual associations is selective, potentially but not necessarily indicating that this processing occurs at an early stage.
  • Souta Hidaka, Wataru Teramoto, Masayoshi Nagai
    VISION RESEARCH 59 25-33 April 2012  Peer-reviewed
    Detection performance is impaired for a visual target presented in an apparent motion (AM) trajectory, and this AM interference weakens when orientation information is inconsistent between the target and AM stimuli. These indicate that the target is perceptually suppressed by internal object representations of AM stimuli established along the AM trajectory. Here, we showed that transient sounds presented together with AM stimuli could enhance the magnitude of AM interference. Furthermore, this auditory effect attenuated when frequencies of the sounds were inconsistent during AM. We also confirmed that the sounds wholly elevated the magnitude of AM interference irrespective of the inconsistency in orientation information between the target and AM stimuli when the saliency of the sounds was maintained. These results suggest that sounds can contribute to the robust establishment and spatiotemporal maintenance of the internal object representation of an AM stimulus. (C) 2012 Elsevier Ltd. All rights reserved.
  • Maori Kobayashi, Wataru Teramoto, Souta Hidaka, Yoichi Sugita
    SCIENTIFIC REPORTS 2 365 April 2012  Peer-reviewed
    In cross-modal interactions, top-down controls such as attention and the explicit identification of cross-modal inputs have been assumed to play crucial roles in optimization. Here we show the establishment of cross-modal associations without such top-down controls. The onsets of two circles producing apparent motion perception were accompanied by indiscriminable sounds consisting of six identical and one unique sound frequencies. After adaptation to the visual apparent motion with the sounds, the sounds acquired a driving effect for illusory visual apparent motion perception. Moreover, pure tones at each unique frequency of the sounds acquired the same effect after the adaptation, indicating that the difference between the indiscriminable sounds was implicitly coded. We further confirmed that the aftereffect did not transfer between eyes. These results suggest that the brain establishes new neural representations between sound frequency and visual motion without clear identification of the specific relationship between cross-modal stimuli in early perceptual processing stages.
  • Wataru Teramoto, Souta Hidaka, Yoichi Sugita, Shuichi Sakamoto, Jiro Gyoba, Yukio Iwaya, Yoiti Suzuki
    JOURNAL OF VISION 12(3) 2012  Peer-reviewed
    Auditory temporal or semantic information often modulates visual motion events. However, the effects of auditory spatial information on visual motion perception were reported to be absent or of smaller size at perceptual level. This could be caused by a superiority of vision over hearing in reliability of motion information. Here, we manipulated the retinal eccentricity of visual motion and challenged the previous findings. Visual apparent motion stimuli were presented in conjunction with a sound delivered alternately from two horizontally or vertically aligned loudspeakers; the direction of visual apparent motion was always perpendicular to the direction in which the sound alternated. We found that the perceived direction of visual motion could be consistent with the direction in which the sound alternated or lay between this direction and that of actual visual motion. The deviation of the perceived direction of motion from the actual direction was more likely to occur at larger retinal eccentricities. These findings suggest that the auditory and visual modalities can mutually influence one another in motion processing so that the brain obtains the best estimates of external events.
  • Akio Honda, Hiroshi Shibata, Souta Hidaka, Jiro Gyoba, Yukio Iwaya, Yôiti Suzuki
    i-Perception 2(8) 865-865 October 2011
  • Souta Hidaka, Wataru Teramoto, Maori Kobayashi, Yoichi Sugita
    BMC NEUROSCIENCE 12 44 May 2011  Peer-reviewed
    Background: After a prolonged exposure to a paired presentation of different types of signals (e.g., color and motion), one of the signals (color) becomes a driver for the other signal (motion). This phenomenon, which is known as contingent motion aftereffect, indicates that the brain can establish new neural representations even in the adult's brain. However, contingent motion aftereffect has been reported only in the visual or auditory domain. Here, we demonstrate that a visual motion aftereffect can be contingent on a specific sound. Results: Dynamic random dots moving in an alternating right or left direction were presented to the participants. Each direction of motion was accompanied by an auditory tone of a unique and specific frequency. After a 3-minute exposure, the tones began to exert a marked influence on the visual motion perception, and the percentage of dots required to trigger motion perception systematically changed depending on the tones. Furthermore, this effect lasted for at least 2 days. Conclusions: These results indicate that a new neural representation can be rapidly established between the auditory and visual modalities.
  • Souta Hidaka, Wataru Teramoto, Yoichi Sugita, Yuko Manaka, Shuichi Sakamoto, Yoiti Suzuki
    PLOS ONE 6(3) e17499 March 2011  Peer-reviewed
    Background: Vision provides the most salient information with regard to the stimulus motion. However, it has recently been demonstrated that static visual stimuli are perceived as moving laterally by alternating left-right sound sources. The underlying mechanism of this phenomenon remains unclear; it has not yet been determined whether auditory motion signals, rather than auditory positional signals, can directly contribute to visual motion perception. Methodology/Principal Findings: Static visual flashes were presented at retinal locations outside the fovea together with a lateral auditory motion provided by a virtual stereo noise source smoothly shifting in the horizontal plane. The flash appeared to move by means of the auditory motion when the spatiotemporal position of the flashes was in the middle of the auditory motion trajectory. Furthermore, the lateral auditory motion altered visual motion perception in a global motion display where different localized motion signals of multiple visual stimuli were combined to produce a coherent visual motion perception. Conclusions/Significance: These findings suggest there exist direct interactions between auditory and visual motion signals, and that there might be common neural substrates for auditory and visual motion processing.
  • Souta Hidaka, Masayoshi Nagai, Allison B. Sekuler, Patrick J. Bennett, Jiro Gyoba
    JOURNAL OF VISION 11(10) 2011  Peer-reviewed
    Letter discrimination performance is degraded when a letter is presented within an apparent motion (AM) trajectory of a spot. This finding suggests that the internal representation of AM stimuli can perceptually interact with other stimuli. In this study, we demonstrated that AM interference could also occur for pattern detection. We found that target (Gabor patch) detection performance was degraded within an AM trajectory. Further, this AM interference weakened when the differences in orientation between the AM stimuli and target became greater. We also revealed that AM interference occurred for the target with spatiotemporally intermediate orientations of the inducers that changed their orientation during AM. In contrast, the differences in phase among the stimuli did not affect the occurrence of AM interference. These findings suggest that AM stimuli and their internal representations affect lower visual processes involved in detecting a pattern in the AM trajectory and that the internal object representation of an AM stimulus selectively reflects and maintains the stimulus attribute.
  • Wataru Teramoto, Souta Hidaka, Jiro Gyoba, Yoiti Suzuki
    ATTENTION PERCEPTION & PSYCHOPHYSICS 72(8) 2215-2226 November 2010  Peer-reviewed
    In representational momentum (RM), the final position of a moving target is mislocalized in the direction of motion. Here, the effect of a concurrent sound on visual RM was demonstrated. A visual stimulus moved horizontally and disappeared at unpredictable positions. A complex tone without any motion cues was presented continuously from the beginning of the visual motion. As compared with a silent condition, the RM magnitude increased when the sound lasted longer than and decreased when it did not last as long as the visual motion. However, the RM was unchanged when a brief complex tone was presented before or after the target disappeared (Experiment 2) or when the onset of the long-lasting sound was not synchronized with that of the visual motion (Experiments 3 and 4). These findings suggest that visual motion representation can be modulated by a sound if the visual motion information is firmly associated with the auditory information.
  • 寺本渉, 吉田和博, 日高聡太, 浅井暢子, 行場次朗, 坂本修一, 岩谷幸雄, 鈴木陽一
    日本バーチャルリアリティ学会論文誌 in press (3) 483-486 October 2010  Peer-reviewed
    For virtual reality systems, the enhancement of a sense of presence (a subjective experience of being in one place even when one is physically situated in another) has been the most important issue. Both theoretically and empirically, the sense of presence has been found to relate dominantly to background components contained in a scene. In contrast, the reality or virtuality that can be assumed to link essentially to foreground components in a scene has not been investigated in detail. The present study defined the latter type of sense as vraisemblance (verisimilitude), and made an exploratory investigation into the spatio-temporal characteristics responsible for higher vraisemblance by using, as audio-visual stimuli, a scene containing a Shishi-odoshi (a traditional Japanese bamboo fountain) in a Japanese garden. In Experiment 1, the effects of the size of the field of view and the sound pressure level of the background were investigated. Higher vraisemblance was observed with the medium field of view and the original background sound pressure level, whereas a higher sense of presence was observed with the larger field of view and the larger background sound. In Experiment 2, the effect of temporal asynchrony between the foreground audio-visual stimuli produced by the Shishi-odoshi was investigated. The results show that the temporal window for the audio-visual stimuli necessary for high vraisemblance was different from that for high presence. These findings suggest that the sense of vraisemblance can be distinguished from the sense of presence, and is deeply involved in the foreground-based aesthetic impression of a scene.
  • Souta Hidaka, Wataru Teramoto, Jiro Gyoba, Yoiti Suzuki
    VISION RESEARCH 50(20) 2093-2099 September 2010  Peer-reviewed
    An abrupt change in a visual attribute (size) of apparently moving visual stimuli extends the time the changed stimulus is visible even after its physical termination (visible persistence). In this study, we show that this elongation of visible persistence is enhanced by an abrupt change in an attribute (frequency) of the sounds presented along with the size-changed, apparently moving visual stimuli. This auditory effect disappears when the sounds are not associated with the visual stimuli. These results suggest that an auditory attribute change can contribute to the establishment of a new object representation and that object-level audio-visual interactions can occur in motion perception. (C) 2010 Elsevier Ltd. All rights reserved.

MISC

 29
  • 日高 聡太, 三枝 千尋
    基礎心理学研究 41(2) 144-145 March 31, 2023
  • 日髙 聡太, 浅野 倫子
    日本心理学会大会発表論文集 86 ITL-001-ITL-001 2022
    The purpose of perceptual and cognitive processing can be regarded as transforming information received from the external world into representations that are useful to us. By combining and using information from multiple senses, perceptual and cognitive processing can construct reliable and robust representations. This talk reviews the speaker's experimental psychological research on multisensory processing. In addition to interactions in which one sense influences another, it also touches on commonalities, that is, similar processing characteristics shared across multiple senses, and discusses the operating principles of human perceptual and cognitive processing.
  • Souta Hidaka, Masakazu Ide
    INTERNATIONAL JOURNAL OF PSYCHOLOGY 51 69-70 July 2016
  • 池田 華子, 田中 智明, 日高 聡太
    電子情報通信学会技術研究報告 = IEICE technical report : 信学技報 114(347) 1-6 December 1, 2014
    This study examined how differences in video resolution (4K vs. HD), subject matter (natural vs. artificial objects), and shooting angle of view (wide, medium, narrow) affect observers' subjective impressions. High-resolution (4K) videos were rated as more preferable, sharper, and easier to view. These effects were particularly strong for high-resolution videos when the angle of view was wide or when natural scenery was presented. These results suggest that video resolution influences viewers' subjective impressions depending on the subject being filmed.
  • 池田 華子, 田中 智明, 日高 聡太
    映像情報メディア学会技術報告 = ITE technical report 38(48) 1-6 December 2014

Books and Other Publications

 6
  • 日本視覚学会 (Contribution: chapter author; Sections: 第V章「多感覚認知」・第5.2節「視覚から聴覚への作用」)
    朝倉書店 November 2022 (ISBN: 9784254102949)
  • 日高聡太, 北川智利 (Contribution: chapter author; Section: 第11章「感覚間相互作用」)
    コロナ社 April 2021 (ISBN: 9784339013658)
  • Teramoto, W., Hidaka, S., Sugita, Y. (Contribution: chapter author; Section: Auditory bias in visual motion perception)
    Cambridge University Press 2018
  • 日髙 聡太 (Contribution: chapter author; Section: 第1章「原理・歴史」第3節「19世紀後半における近代心理学の成立」)
    誠信書房 2014
  • 日髙 聡太 (Contribution: chapter author; Section: 第2章「感覚・知覚心理学」・第6節「神経生理学的理論」)
    朝倉書店 2012

Presentations

 19

Teaching Experience (Courses)

 20

Research Projects (Joint Research, Competitive Funding, etc.)

 14