Motion aftereffects in vision, audition, and touch, and their crossmodal interactions

Introduction
Illusions have long been used to gain insight into how our perceptual systems construct the world around us. Visual illusions are the most commonly studied, but illusions in other sensory modalities, such as audition and touch, can give us insight not only into those modalities but also into more general mechanisms in the brain. One such illusion is the motion aftereffect (MAE), which has been used extensively to study visual motion perception. The visual MAE (vMAE) is a result of motion adaptation and follows prolonged exposure to a continuously moving visual stimulus: a subsequently presented static stimulus gives rise to an illusory perception of motion in the opposite direction (for reviews, see Mather et al., 1998, 2008). While the visual MAE has been studied extensively, less attention has been given to its tactile and auditory analogues. Comparing these effects across sensory modalities can teach us about the generalizability of the brain mechanisms underlying motion perception.
Our brains are constantly bombarded with sensory signals from vision, audition, and touch; we rarely interact with our environment using only one sense. This has led to research reexamining sensory perception from a crossmodal perspective (Thesen et al., 2004; Vibell et al., 2017; Chang et al., 2022). A set of studies (Kitagawa and Ichihara, 2002; Konkle et al., 2009; Maeda et al., 2004) have focused on crossmodal motion aftereffects (i.e., vision-audition, vision-touch, and vice versa), in which adaptation to stimuli in one sensory modality can affect perception in another. Crossmodal motion aftereffects are particularly interesting because they can teach us how adaptation in one sensory modality carries over to perception in another, which in turn informs us about the underlying cortical organization. By combining these studies with neuroimaging, we can identify the processing areas where this occurs and learn which brain areas are truly crossmodal, which have spatially interleaved sensory organizations, and which merely receive crossmodal influences, perhaps via backpropagation from higher-order areas. This systematic review discusses this literature with a particular emphasis on human neuroimaging and on crossmodal interactions involving the less commonly studied tactile modality. It aims to describe the research on visual, auditory, and tactile MAEs, their interactions, and their cortical mechanisms.

Brief review of visual MAEs
The vMAE occurs when prolonged exposure to a continuously moving stimulus in one direction makes a subsequently viewed stationary stimulus appear to move in the opposite direction (Wohlgemuth, 1911). For example, stationary rocks appear to float upward after continuous exposure to water flowing downward (Addams, 1834). It is a robust illusion that can appear with horizontal, vertical, and spiral motions and has been observed across a wide range of observers.
In a typical vMAE experiment, participants adapt to a moving stimulus for a set period of time, followed by a stationary test stimulus. Participants are asked to report the duration for which they perceive the MAE in the test stimulus. The vMAE duration is approximately the square root of the adapting duration when adaptation exceeds 60 s (Hershenson, 1989). For example, with an adapting duration of 80 s, the MAE would last approximately 9 s. Some studies have reported that the motion aftereffect can be "stored" by closing the eyes after exposure to the moving stimulus or by inserting a blank screen between the adaptation and test stimuli (see Mather et al., 2008).
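As a worked illustration of this square-root rule of thumb, a minimal Python sketch is given below; the function name and the rounding are our own, not part of Hershenson's (1989) report.

```python
import math

def approximate_mae_duration(adapt_duration_s: float) -> float:
    """Rough vMAE duration (s) predicted by the square-root relation described
    above; the relation was reported for adapting durations above 60 s."""
    return math.sqrt(adapt_duration_s)

print(round(approximate_mae_duration(80), 1))  # prints 8.9, i.e. roughly the 9 s cited above
```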
Several theoretical accounts have been put forth to explain the vMAE. A simple neural explanation remains one of the most prominent. Sutherland (1961) proposed that the MAE results from adaptation of direction-selective neurons: after exposure to unidirectional movement, a static stimulus produces less firing in the cells tuned to that movement, so the balance of activity favors the opposite direction and the stimulus appears to move that way. This view became widely accepted following discoveries of adaptation effects in non-human animals such as rabbits, cats, and primates (Anstis et al., 1998). While the core principle of this model is still accepted, advances in experimental techniques have required changes to the theoretical explanations of the vMAE. For example, more recent findings demonstrate that the vMAE results from multiple neural adaptations at several visual cortical sites (Mather et al., 2008). For a history of motion perception research, see Burr and Thompson (2011) and Nishida (2011); for summaries specific to visual motion aftereffects, see Mather et al. (1998, 2008).
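To make this opponent-process intuition concrete, the minimal sketch below (illustrative numbers and function names of our own, not a model from the cited work) shows how reduced responsiveness in one direction-selective population yields an opposite-direction signal for a static test stimulus.

```python
def unit_response(drive: float, gain: float, baseline: float = 10.0) -> float:
    """Firing rate of a direction-selective unit: baseline plus gained stimulus drive."""
    return baseline + gain * drive

def opponent_signal(gain_right: float, gain_left: float, drive: float = 5.0) -> float:
    """Opponent read-out for a static test that drives both units equally.
    Positive values correspond to a rightward percept, negative to leftward."""
    return unit_response(drive, gain_right) - unit_response(drive, gain_left)

print(opponent_signal(gain_right=1.0, gain_left=1.0))  # 0.0: no adaptation, no illusory motion
# Prolonged rightward motion lowers the responsiveness of the rightward-tuned unit...
print(opponent_signal(gain_right=0.6, gain_left=1.0))  # -2.0: the static test appears to drift left
```

Although highly simplified, this captures the core idea that the aftereffect reflects an imbalance between adapted and unadapted direction-selective populations rather than any motion in the stimulus itself.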

Brief review of auditory MAEs
Adaptation to a moving sound source can cause noticeable changes in the spatial localization of subsequent sound stimuli. This phenomenon, called the auditory motion aftereffect (aMAE), is believed to be based on neural mechanisms of selective motion sensitivity. It is similar to the vMAE, but the auditory aftereffect is specifically characterized by its spatial and frequency selectivity as well as by the optimal motion velocity at which the effect is maximal (Andreeva, 2015). Dong et al. (2000) found robust and reliable aMAEs in all participants after they listened to a sound source that was shifted to the right by a robot arm. A subsequently presented stationary sound was perceived to move to the left, and the magnitude of the effect tended to increase with adapting velocity up to the highest velocity tested (200 m/s). Ehrenstein (1978) conducted a similar study in which he rotated the sound clockwise and counterclockwise around the subject and observed similar effects. He noted that no countermovement occurred, as in the waterfall illusion; rather, there was merely a shift in the localization of stationary stimuli. Grantham (1997, 1998) found that the occurrence of the aMAE depended on the velocity and frequency of the sound, with higher frequencies and slower speeds leading to a reduced or non-existent effect.

Review of tactile MAEs
While the vMAE requires participants to look at dynamic and static stimuli projected to the same location on the retina, and the aMAE requires listening to sounds moving through specific locations in space, the tMAE requires participants to feel these stimuli on their skin. Therefore, it operates slightly differently than its visual and auditory counterparts (see Vibell et al., 2023). After the adaptation stimulus is removed, participants are likely to perceive subsequently presented tactile stimuli at that skin location as moving in the opposite direction, whereas the vMAE can be displaced by shifting one's gaze. However, Badde and Heed (2023) suggest that the tMAE is not always bound to a specific location but can transfer to the other hand if the hands are crossed during adaptation. Unlike for the vMAE, the literature on the tMAE disagrees about the robustness of this motion illusion.
While some researchers have reliably induced the tMAE (Hollins and Favorov, 1994; Thalman, 1922; Konkle et al., 2009; Kuroki et al., 2012; Watanabe et al., 2007), others have not (Hazlewood, 1971; Lerner and Craig, 2002; Wohlgemuth, 1911). Watanabe and colleagues (2007) attribute these varying findings to the different types of tactile receptors and to participants adapting one type of receptor while being tested on another, e.g., adapting fast-adapting (FA) mechanoreceptors while testing slowly adapting (SA) ones.
Another aspect that differs slightly in tactile motion perception is perceived duration. Tomassini et al. (2011) showed that the perceived duration of a stimulus is strongly influenced by speed: faster-moving stimuli appear to last longer. They also showed that visual stimuli were perceived to move faster than tactile stimuli. Even when visual and tactile stimuli were matched for speed, this effect remained, albeit slightly less strongly. This difference in speed perception between vision and touch may also play a role when interpreting the MAE.

Wohlgemuth (1911) was the first to attempt to induce a tMAE, although he failed to find an effect. Thalman (1922) continued to explore the tMAE in a series of 10 experiments using a knotted cotton cord as a stimulus. Nine of the experiments were performed on the forearm, while one was tested on the calf of the leg. He reliably induced the tMAE in all four participants on both the forearm and the calf. For unknown reasons, the tMAE was not studied for approximately 50 years after Thalman's demonstration, until Hazlewood (1971) investigated its occurrence using a moving corrugated belt on participants' fingertips and forearms while having them report whether they experienced illusory movement. The results from preliminary explorations and two main experiments revealed that only 2 out of 40 participants reported the effect. Hazlewood (1971) suggested that the tMAE is very slight and is induced differently than the vMAE, if it occurs at all.

On the other hand, Hollins and Favorov (1994) reported robust tMAEs using a rotating metal drum. Participants placed their hand on the rotating drum for 120 s and then moved it to a stationary drum. Their five participants experienced mostly negative tMAEs (i.e., in the direction opposite to the adaptation stimulus) in the majority of trials. In addition, their results showed that the strength and duration of the tMAE increased with the duration of the adaptation phase (between 3 s and 120 s). Lerner and Craig (2002) tried to replicate Hollins and Favorov's (1994) experiment using a similar drum. Half of their participants reported no tMAE. Moreover, the reported tMAEs were mostly in the positive direction (i.e., the same direction as adaptation), which is not the direction one would expect based on vMAEs. Lerner and Craig (2002) performed a separate experiment using an OPTACON (Telesensory Systems Inc., Palo Alto, CA), a reading device for the blind, which was sequentially activated to create tactile motion perception. Again, approximately half of the participants reported a tMAE. Although more participants experienced a negative tMAE, the reported direction was not statistically significant. These results suggest that the tMAE is not as robust a phenomenon as previously reported vMAEs and aMAEs.
However, Watanabe et al. (2007) argue that Lerner and Craig (2002) used non-optimal combinations that targeted different receptor types in the adaptation and testing stages, which contributed to their discrepant findings. They set out to evaluate the tMAE, since the existence of the aftereffect had not been conclusively demonstrated. They used three pins vibrating at 30 Hz on the tip of the finger to test tMAEs. After an adaptation period of 10 × 10 s, they used dynamic test stimuli (with varying inter-stimulus intervals) to observe tactile vertical-motion, long-distance-motion, and circular-motion aftereffects, and evaluated points of subjective equality (PSEs; a hypothetical sketch of this procedure is given below). Their results showed a clear tMAE in the opposite direction for all individuals. This aftereffect persisted at longer distances and when motion was circular on the fingertip. The researchers suggested that the discrepant earlier findings might be related to fast and slow mechanoreceptors, with other experiments adapting fast receptors but testing slow ones. That is, to elicit a reliable tMAE, stimuli must be chosen so that the same mechanoreceptors are stimulated in the adaptation and test phases.

Planetta and Servos (2008) investigated the direction, duration, frequency, and vividness of the tMAE at varying adapting speeds. Similar to Lerner and Craig (2002), participants reported a tMAE in only approximately 40% of trials. The direction of the reported tMAEs was positive (19.8%), negative (12.8%), or other (7.8%). Their results showed that duration, frequency, and vividness increased with adapting speed (e.g., as the speed increased, the duration of the tMAE increased). In another study, Planetta and Servos (2010) investigated which type(s) of mechanoreceptors are involved in the tMAE by comparing different sites of stimulation, namely the hand, the cheek, and the forearm. A tMAE was most often reported after stimulation of the hand, not the cheek or forearm. The researchers suggest that the fast-adapting type I receptors and the hair follicles are most likely involved in the tMAE.

Kuroki et al. (2012) summarized tMAEs and demonstrated how adaptation in one direction alters the perceived direction of subsequent motion on the fingertip (i.e., Watanabe et al., 2007) and with crossed fingers (Kuroki et al., 2012). Kuroki et al. (2012) expanded upon the work of Watanabe and colleagues (2007) by shifting finger postures to test for the MAE. They argued that their findings reflect early, sensory-specific stages of processing rather than high-level supra-modal motion processing. When participants adapted with crossed fingers, the MAE occurred in the same direction as when fingers were not crossed. There was no MAE if one hand underwent the adaptation phase and the other was tested. Kuroki et al. (2012) found that the direction of the tMAE was determined by the environmental direction, meaning that the perceived direction changes with body posture. Moreover, they concluded that stimulation of peripheral receptors is essential for the occurrence of the aftereffect. These findings are important for future work aiming to induce tMAEs optimally.
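Several of the studies above quantify aftereffects via points of subjective equality. The sketch below is a hypothetical illustration of how a PSE can be estimated (the data, parameter values, and function names are invented for illustration and are not taken from Watanabe et al., 2007): a psychometric function is fitted to the proportion of responses in one direction, the PSE is the level at which both responses are equally likely, and a PSE shifted away from zero after adaptation indicates an aftereffect.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Hypothetical data: signed test-motion strength (negative = leftward) and the
# proportion of "rightward" responses measured after adaptation to leftward motion.
motion_strength = np.array([-1.0, -0.5, -0.25, 0.0, 0.25, 0.5, 1.0])
p_rightward = np.array([0.02, 0.08, 0.20, 0.62, 0.85, 0.95, 0.99])

def psychometric(x, pse, sigma):
    """Cumulative Gaussian giving the probability of a 'rightward' response."""
    return norm.cdf(x, loc=pse, scale=sigma)

(pse, sigma), _ = curve_fit(psychometric, motion_strength, p_rightward, p0=[0.0, 0.3])
# A negative PSE here means a physically leftward-moving test is judged as
# stationary, i.e. a static test would appear to drift rightward (a negative MAE).
print(f"PSE = {pse:.2f}, sigma = {sigma:.2f}")
```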
The tMAEs considered above (i.e., those analogous to the waterfall illusion) are not the only type of tMAE; there are also speed and distance aftereffects. The tactile speed aftereffect occurs after an adaptation phase of exposure to a moving stimulus: a dynamic test stimulus is then perceived as moving more slowly than the same test stimulus presented to a non-adapted hand (McIntyre et al., 2012, 2016a, 2016b). McIntyre et al. (2012) examined this speed aftereffect and found that it was independent of the direction of the adapting stimulus relative to the test direction. As tactile receptive fields in S1 are direction sensitive, this independence from direction suggests that adaptation of these neurons cannot be the cause of the aftereffect. Additionally, adaptation in the preferred direction should cause stronger adaptation and, as a consequence, a direction-sensitive aftereffect. As this was clearly not the case in their study, the researchers conclude that this aftereffect must be of central origin.
The main conclusion to draw from the above-mentioned studies is that tactile motion aftereffects are harder to induce than visual motion aftereffects (such as the waterfall illusion). The unreliable results in some studies may be due to the different types of receptors underlying the tactile system. Thus, it is important to adapt and test the same receptor types and to take this into consideration when evaluating studies on tMAEs.

Motion processing in the brain
Motion is processed in a series of nodes in the brain that both differ and overlap between the sensory modalities. The sensory receptors informing the primary cortices have different properties that provide different bases for early perceptual processing. For example, touch is an exploratory sense and relies on motion between an object and the skin surface for haptic exploration. This involves not only tactile receptors but also skin stretch, the texture of the object, the type of skin (glabrous vs. non-glabrous), and the integration of motion signals from several fingers. The idiosyncrasies of each sensory system shape its subsequent processing in the brain.
The visual system shows direction tuning in the primary visual cortex (V1) and integrates object components of increasing complexity as processing progresses along the pathway (Burr and Thompson, 2011; Pack and Bensmaia, 2015). Motion is processed along the visual motion pathway, which includes V1, the middle temporal area (MT), the medial superior temporal area (MST; Movshon and Newsome, 1996; Orban, 2008; Gilaie-Dotan, 2016), and the ventral intraparietal area (VIP). Areas MT and MST combine visual motion signals, such as those generated during an observer's movement through the environment, with eye-movement and vestibular signals. This helps determine the path along which the observer is moving. More complex movement, e.g., combinations of moving gratings, is processed further along the pathway (Tomassini et al., 2011).
Analogous pathways are found for auditory and tactile motion perception. Many neurons in the primary auditory cortex (A1) are tuned for direction (Ahissar et al., 1992); in about a third of these neurons, this includes rightward and leftward motion. As in vision, auditory processing splits into 'what' and 'where' streams (Ahveninen et al., 2006). From there, auditory motion signals connect to areas MT and VIP (Singh-Curry and Husain, 2009). Tactile motion is processed in S1, where the majority of cutaneous neurons are direction and orientation selective. This direction tuning applies to many types of object components, such as bars, edges, rotating cylinders, and blunt probes (see Pei and Bensmaia, 2014). S1 consists of areas 3a, 3b, 1, and 2. Areas 3b and 1 are primarily responsible for the analysis of information from mechanoreceptors in the skin and are responsive to motion, though area 3b responds to motion in all directions. Watanabe et al. (2007) speculate that tactile MAEs arise beyond area 3b, as its receptive fields are not large enough.
While many studies have examined tactile motion processing, little attention has been given to the neural basis of the tMAE specifically. The difference between tactile motion and the tMAE is that the tMAE is an illusion that may arise after adaptation to continuous motion in one direction. Planetta and Servos (2012) appear to provide the first and only evidence of the neuroanatomical basis of the tactile motion aftereffect. Using fMRI, they found a sustained BOLD response in the contralateral postcentral gyrus when participants perceived the tMAE compared to when they did not perceive the aftereffect. Moreover, no hMT+/V5 activation was observed in their study. If hMT+/V5 is indeed a multisensory motion processing area, we would expect to find activation in this region for the tMAE, since it is well established that hMT+/V5 is involved in vMAEs and aMAEs (Anstis et al., 1998; Mather et al., 2008). The methodology of Planetta and Servos (2012) might not have allowed for hMT+/V5 activation to occur.
The neural processes underlying the aMAE seem to involve the human planum temporale (hPT), which supports auditory motion direction and localization in much the same way that hMT+/V5 does for vision (Battal et al., 2019), while processes that involve both audition and vision activate hMT+/V5. In early deaf individuals, visual motion gives rise to activation in the hPT, while hMT+/V5 shows reduced activation compared to hearing people (Benetti et al., 2021). Similar findings were observed in an fMRI study in which blindfolded people listened to moving and stationary sounds, with moving sounds activating hMT+/V5 (Poirier et al., 2005). These features, together with the existence of intersensory motion adaptation effects, indicate a common nature of the auditory and visual motion aftereffects and suggest that similar mechanisms may underlie motion adaptation in different sensory modalities.
In the brain, it is well established that the middle temporal complex (hMT+/V5 or MT+), which includes areas MT and MST, is a visual motion processing area and underlies the visual MAE (Anstis, Verstraten and Mather, 1998; Mather et al., 2008). Britten et al. (1992) found that a relatively small set of neurons in area MT can account for performance on perceptual motion discrimination tasks. Traditionally, hMT+/V5 was thought of as a unisensory visual motion processing area. However, recent research, driven by neuroimaging, argues that hMT+/V5 is instead a multisensory motion processing area. In fact, research has shown that many brain areas originally thought to be unisensory are actually multisensory (Calvert et al., 2004). Several studies have shown hMT+/V5 activation in response to auditory motion (Calvert, 2001; Gurtubay-Antolin et al., 2021; Lewis et al., 2000), but not every study has shown activation of this area (Smith et al., 2004, 2007). The results of tactile motion studies are also mixed, with some reports of hMT+/V5 activation (Blake et al., 2004; Francis et al., 2001; Hagen et al., 2001, 2002; Johansen-Berg, 2001; Ricciardi et al., 2007; Sani et al., 2010; Summers et al., 2009), reports of no activation (Bremmer et al., 2001; Planetta and Servos, 2012), and reports of hMT+/V5 deactivation (Nakashita et al., 2008). Nevertheless, recent tactile motion studies argue that hMT+/V5 is indeed a multisensory motion processing area (Calvert et al., 2004; Pei et al., 2011, 2014).
There is considerable evidence that hMT+/V5 is crossmodal. Crossmodal interactions are interesting because they shed further light on the neural mechanisms underlying the interaction or adaptation. Thus, if an interaction is spatially selective, we can conclude that the neural mechanisms of that region are spatially selective. Likewise, if motion aftereffects work across the senses, we can conclude that the neural mechanisms encoding motion are likely crossmodal. Using fMRI, Summers et al. (2009) found brain areas common to motion processing in the visual and tactile modalities (i.e., activation in hMT+/V5 and in the intraparietal area of the posterior parietal cortex). In another fMRI study, Van Kemenade et al. (2014) investigated whether hMT+/V5 contains direction-specific information for visual and tactile moving stimuli. Their results indicate that hMT+/V5 contains information about the direction of a moving stimulus in both the tactile and visual modalities. Bremmer et al. (2001) presented visual, auditory, and tactile moving stimuli and observed three brain areas associated with multimodal processing: the intraparietal area, the ventral premotor cortex, and the lateral inferior postcentral cortex. These findings have been corroborated by studies using transcranial magnetic stimulation (TMS) confirming that the functional organization of hMT+/V5 is not solely visual in nature (Amemiya et al., 2017; Ricciardi et al., 2011; Basso et al., 2012). These experiments further support the idea that hMT+/V5 encodes motion information from different senses. However, that does not tell us whether it is truly crossmodal. Animal studies have shown that the sensory information in hMT+/V5 is organized into spatially interleaved (or separated) neuronal populations that encode unisensory motion information. In contrast, neurons in VIP are multisensory and thus encode information from multiple senses (Poremba et al., 2003).
Plasticity between sensory regions further informs us about the nature of multisensory areas. We know from studies of patients with damage to a sensory system that there can be substantial cortical reorganization (e.g., Finney et al., 2001; Lomber et al., 2010; Neville and Bavelier, 2002). Following impairment of a sensory system, enhanced perceptual performance ensues in the remaining systems, with one or more modalities compensating for the damaged one. This cortical reorganization can be so substantial that we observe enhanced auditory spatial localization in the blind or activation of A1 by visual stimuli in the deaf. These findings highlight the plasticity of cortical organization and show that many of the cortical areas discussed in this review can shift after damage to the sensory systems (see also Benetti and Collignon, 2022; Fine and Park, 2023; Lomber, Meredith and Kral, 2010).
In sum, the involvement of parietal areas such as the postcentral gyrus and parietal operculum in tactile information processing is well documented in the literature (Gardner and Kandel, 2000). The current debate is whether hMT+/V5 is a unisensory visual motion processing area or a multisensory motion processing area. It is possible that backpropagation and timing may influence the activation seen in these studies. Overall, various neuroimaging studies have found activation in hMT+/V5 in response to tactile motion processing as well as auditory motion processing, further supporting the notion that hMT+/V5 is indeed a multisensory area.

Crossmodal integration and MAEs
Our senses do not exist in a vacuum: when we are presented with an object, we receive countless sensory signals from multiple modalities synchronously. To perceive the world around us, our senses are integrated with each other so that we extract the most useful information. While most studies have examined MAEs in individual modalities, researchers have more recently studied the MAE from a crossmodal perspective (e.g., Alais et al., 2018; Berger and Ehrsson, 2016; Gori et al., 2011; Konkle et al., 2009). One of the biggest questions in this area is whether MAEs transfer between modalities. If they do in one direction, is the same also true for the opposite direction? For example, if visual motion can affect the perception of tactile motion, does tactile motion affect the perception of visual motion? Also of interest has been the degree to which sensory modalities influence each other.
A growing body of evidence suggests that the response of a neuron to a target stimulus is proportional to the hit probability, reflecting some form of Bayesian weighting. Hillis et al. (2002) suggested that information is combined using optimal cue integration: the brain combines simultaneous sensory streams by weighting each in inverse proportion to its variance, giving greater weight to more reliable sensory input (Clark and Yuille, 1990); a minimal numerical sketch of this weighting is given below. This has also been shown neurophysiologically by Fetsch et al. (2013).

This section covers various crossmodal motion aftereffects and the directional relationships between several modalities. This is particularly interesting because an illusion will reflect the properties of the brain area(s) that support that specific level of representation. If aftereffects are spatially selective, we can conclude that the neural mechanisms underlying motion perception are spatially selective. If aftereffects work across the senses, then the neural mechanisms encoding motion likely process information from different senses. One might conclude that only crossmodal aftereffects provide conclusive evidence that motion direction is encoded by multisensory neurons, given that integration effects might occur at decisional stages and that increased BOLD activity does not necessarily point towards the involvement of multisensory neurons.
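The inverse-variance weighting described above can be made concrete with a minimal sketch; the function and values below are illustrative assumptions, not taken from the cited studies.

```python
def combine_cues(est_a: float, var_a: float, est_b: float, var_b: float):
    """Reliability-weighted (maximum-likelihood) combination of two independent Gaussian cues."""
    w_a = (1 / var_a) / (1 / var_a + 1 / var_b)   # a cue's weight grows as its variance shrinks
    combined_estimate = w_a * est_a + (1 - w_a) * est_b
    combined_variance = 1 / (1 / var_a + 1 / var_b)  # never larger than either cue's variance
    return combined_estimate, combined_variance

# Example: a reliable visual speed estimate (10, variance 1) and a noisier
# tactile estimate (14, variance 4); the combined estimate is pulled toward
# the more reliable visual cue and is more precise than either cue alone.
print(combine_cues(10.0, 1.0, 14.0, 4.0))  # approximately (10.8, 0.8)
```

This is the sense in which "greater weight to more reliable sensory input" is meant above.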

Audio-visual integration
Multiple studies support crossmodal aftereffects between audition and vision, but the theories that underlie these processes vary. Alais et al. (2018) investigated auditory and visual motion using speed discrimination and argue that a common mechanism processes motion regardless of the input (visual or auditory). Berger and Ehrsson (2016) suggest that visual and auditory motion perception rely on shared neural representations, consistent with a maximum-likelihood decision strategy. Lewis et al. (2000) investigated the neural substrates of auditory and visual motion processing using fMRI; participants performed separate visual and auditory motion discrimination tasks. Their results showed overlapping activation in parts of the parietal cortex, leading them to conclude that certain cortical regions (e.g., the intraparietal sulcus) are involved in multisensory integration (Lewis et al., 2000). Although theories of the underlying mechanisms of crossmodal aftereffects vary, the field has moved towards consistent findings of crossmodal aftereffects between vision and audition in both directions.

Audio-visual MAEs
In the auditory and visual domains, research investigating whether crossmodal MAEs can be obtained has produced mixed findings. Kitagawa and Ichihara (2002) found that visual motion could alter the subsequent perception of auditory stimuli; however, auditory motion adaptation did not produce a vMAE. This finding contradicts those of Maeda et al. (2004) and Hedger and colleagues (2013), who found that auditory stimuli and music, respectively, can induce vMAEs in the vertical plane. Berger and Ehrsson (2016) also provide clear evidence that apparent auditory motion can elicit a visual motion aftereffect in the horizontal plane. In addition, Tregillus et al. (2016) used a complex, naturalistic auditory stimulus (including motion, inter-aural time differences, inter-aural level differences, and Doppler shifts) to examine crossmodal adaptation of motion aftereffects from the auditory to the visual domain. Their results showed a consistent trend of crossmodal aftereffects on visual motion after auditory motion adaptation, and they suggest that more complex stimuli might be more likely to induce crossmodal aftereffects. Additionally, Hidaka et al. (2011a) studied contingent motion aftereffects across the visual and auditory domains; their results indicate that auditory stimuli affected visual motion perception. Another study examined the sound-induced visual motion illusion and found a direct interaction between auditory and visual motion signals (Hidaka et al., 2011b): auditory motion stimuli can induce visual motion perception that is consistent with the auditory motion. The aforementioned studies used various methodologies to test crossmodal MAEs, which may contribute to the mixed conclusions about whether crossmodal MAEs are possible in a given direction (e.g., from auditory stimuli to visual motion perception). Combined, these studies suggest that auditory and visual motion adaptation can produce crossmodal aftereffects between these modalities.
Some researchers have investigated crossmodal aftereffects between vision and audition in clinical populations, namely individuals born with dense bilateral cataracts who later regained sight, and congenitally deaf cochlear implant users (Guerreiro et al., 2016, and Fengler et al., 2018, respectively). Guerreiro et al. (2016) showed that cataract-reversal individuals were as likely as normally sighted and visually impaired individuals to show an aMAE following visual motion adaptation. However, cataract-reversal individuals showed a significant vMAE following auditory motion adaptation relative to normally sighted and visually impaired individuals. In addition, the results of Fengler et al. (2018) demonstrate that control subjects and experienced cochlear implant users both experienced crossmodal motion aftereffects. While previous research has been equivocal about crossmodal aftereffects between vision and audition, these studies provide evidence for them even in clinical populations.

Visuo-tactile integration
Visual and tactile motion adaptation aftereffects are spatially selective. Interestingly, both can be sensitive to external coordinates (though the tactile version relies on default posture information), suggesting that motion adaptation affects regions beyond the primary sensory cortices (vision: Turi and Burr, 2012; touch: Badde and Heed, 2023). Turi and Burr (2012) suggested that there are two types of vMAE, the traditional and the positional MAE. The traditional MAE is retinotopic and operates at an earlier level, while the positional MAE is spatiotopic, acts later, and takes color and luminance into account.
Researchers have also investigated whether visual motion can affect the perception of tactile motion, and vice versa. Some studies provide evidence of a unidirectional relationship (i.e., vMAEs affect tactile motion perception; Bensmaia et al., 2006; Craig, 2006), while others provide evidence of a bidirectional relationship between the two modalities, in that tMAEs also affect visual perception (Gori et al., 2011; Gray and Tan, 2002; Guo et al., 2013; Konkle et al., 2009). Interestingly, visual and tactile motion signals interact even in the absence of awareness (Hense et al., 2019), indicating that this is a pre-attentive phenomenon.
In a series of five experiments, Bensmaia et al. (2006) investigated the tactile perception of speed and the effect that the visual perception of motion has on it. They presented participants with visual and tactile drifting sinusoids under different conditions, such as direction of motion and changes in the temporal frequency of the visual grating. When the visual and tactile gratings drifted in the same direction simultaneously, the visual stimuli changed the perceived speed of the tactile stimuli, and these changes varied with the temporal frequency of the visual stimuli. However, when the visual stimuli moved in the opposite direction, the effect was at least reduced. They argue that their results show a visual-tactile interaction operating at a perceptual stage and not as a later response bias. Craig (2006) also provides evidence that visual motion interferes with tactile motion perception. He investigated the effect of visual apparent motion on judgments of the direction of tactile apparent motion. Similar to Bensmaia and colleagues (2006), Craig (2006) found that when visual motion is presented simultaneously in the direction opposite to tactile motion, accuracy in judging the direction of tactile motion is reduced. While Craig (2006) discussed this reduction in relation to a congruency effect, the study also provided evidence of an interaction between visual and tactile motion perception.
Dynamic tactile information has also been shown to reorient visual attention and vice versa, i.e., visual motion can reorient tactile attention, as shown by Gray and Tan (2002). While this study did not investigate motion aftereffects per se, its results provide evidence that our sensory systems can dynamically update crossmodal links between modalities for moving objects (Gray and Tan, 2002).

Visuo-tactile MAEs
Konkle et al. (2009) examined the crossmodal transfer of the MAE, that is, whether it can transfer from vision to touch and vice versa: more specifically, whether adaptation to visual motion can induce a tMAE and whether adaptation to tactile motion can induce a vMAE. The first experiment consisted of 10 s of a drifting visual grating followed by a tactile test stimulus (i.e., tactile sweeps). The second experiment consisted of tactile motion sweeps (30 ms/row for 6 rows, followed by a 150 ms gap; 33.3 Hz vibration frequency) followed by flickering visual gratings. Their results suggest that visual and tactile motion processing can each be adapted by the other modality. Additionally, the researchers suggest that the hMT+/V5 complex is a potential neural site underlying these crossmodal interactions in motion processing, because the transfer of aftereffects implies partially overlapping neural substrates. Thus, visual and tactile motion processing are not isolated from each other (Konkle et al., 2009).
Overall, these studies provide evidence that visual motion stimuli can affect tactile perception, although the nature of crossmodal information processing between the visual and tactile modalities is not completely clear. A few studies have examined the level of processing at which vision and touch interact (Gori et al., 2011; Guo et al., 2013; Nakashita et al., 2008). For example, Guo et al. (2013) showed that integration between vision and touch has two stages: combination and integration. Additionally, some studies have suggested that the integration of vision and touch occurs at early sensory levels rather than at a higher cognitive level (Bensmaia et al., 2006; Gori et al., 2011). Gori et al. (2011) used two different psychophysical techniques, summation and facilitation, to study the processes that mediate crossmodal visual-tactile interactions. Their results suggest that the two forms of motion, visual and tactile, are processed by similar mechanisms that interact with each other. Moreover, they propose that visual and tactile motion signals interact at two levels: a relatively low, direction-specific sensory level and a higher level where multisensory interactions occur. Konkle and Moore (2009) proposed the adaptive processing hypothesis, the idea that "any area circuit that processes a stimulus is changed by that stimulus and that these dynamics are a functional property of the areas throughout the system" (p. 480).
Future research should continue to investigate crossmodal information processing between vision and touch to better understand how these two modalities integrate with each other in motion perception.

Audio-tactile integration
Soto-Faraco et al. (2004) examined the effect of congruency on audio-tactile interactions in the perception of apparent motion streams. They found a bidirectional relationship between audition and touch for crossmodal integration of dynamic capture information. In particular, their findings suggest that the perceived direction of apparent motion (auditory or tactile) can be modulated by concurrent, task-irrelevant apparent motion in the other modality. Perceived auditory motion is strongly modulated by concurrent tactile motion, whereas perceived tactile motion is modulated to a lesser extent by concurrent auditory motion.
In addition, the results of Spence et al. (1998) demonstrate extensive crossmodal links between vision, audition, and touch in exogenous covert orienting. There appears to be a bidirectional relationship between vision and touch and between audition and touch. However, they found only an effect of audition on vision and not an effect of vision on audition.

Audio-tactile MAEs
To our knowledge, at the time of writing no research has been published on crossmodal motion aftereffects operating from audition to touch or from touch to audition. However, researchers have investigated the integration of these two sensory modalities under different conditions. For example, studies have shown a bidirectional relationship between audition and touch in the crossmodal dynamic capture effect (Soto-Faraco et al., 2004) and in crossmodal spatial attention (e.g., exogenous covert orienting; Spence et al., 1998).
Although crossmodal motion aftereffects between audition and touch have not been investigated, previous studies of crossmodal attention and crossmodal dynamic capture can shape expectations about such aftereffects. For example, it would be reasonable to suggest that there is a bidirectional relationship between the two modalities. Thus, there may be a bidirectional relationship between vision, audition, and touch in crossmodal motion aftereffects, though this should be established empirically. If researchers could demonstrate a crossmodal auditory-tactile motion aftereffect, it could be argued that, like vision and touch, auditory and tactile motion perception rely on partially overlapping neural substrates that operate in a similar manner.

Conclusion
The majority of research on MAEs has been performed in the visual and auditory domains. However, previous research in these modalities has informed experimenters on how to explore the tactile analogue. The first debate in the literature concerns whether the tMAE is as robust an illusion as its counterparts; recent research has aimed to identify the specific parameters that induce the effect based on the mechanoreceptors at play (Watanabe et al., 2007). Additionally, the vast majority of the neuroimaging research focuses on tactile motion perception rather than the tMAE itself. In fact, the one study that examined the tMAE using fMRI (Planetta and Servos, 2012) did not find activation in hMT+/V5, in contrast to most of the tactile motion perception literature, which does find hMT+/V5 activation. Thus, the second debate arises: is hMT+/V5 a multisensory brain area for motion processing and, if so, is it truly multisensory or does it merely contain spatially interleaved sensory representations in close proximity? Research on crossmodal MAEs is important for investigating these types of questions. Future research should detail which areas are independent, which are truly multisensory, and what type of processing they perform. Importantly, the spatial and temporal boundaries for each sensory modality should be explored and mapped, particularly in the context of how aftereffects transfer crossmodally. By mapping these effects between the senses, we can gain a better understanding of how general cortical mechanisms operate, an understanding that likely transfers to other cortical functions.

Declaration of competing interest
None.