Alpha Oscillations Create the Illusion of Time

ABSTRACT

Recent neuroscience experiments have brought inconsistent findings to light about the influence of neural activity in the alpha-frequency band (at ≈10 Hz) on the temporal dynamics of visual perception. Whereas strong alpha effects were found when perception was based more on endogenous factors, there were null effects for alpha when perception relied more on objective physical parameters. In this Perspective, I open up a new view on neural alpha activity that resolves some important aspects of this controversy by interpreting alpha not as temporal processing of sensory inputs per se but above all as the observer's internal processing dynamics, their so-called perception sets. Perception sets reflect internally stored knowledge about how to organize and build up perceptual processes. They result from previous sensory experiences, are under top–down control to support goal-directed behavior, and are rooted in pre-established neural networks that communicate through alpha-frequency channels. I present three example cases from the recent neuroscience literature that show an influence of alpha-driven perception sets on the observer's visual-temporal resolution, object processing, and processing of behaviorally relevant image content. Because alpha-driven perception sets can structure perception from its high-level aspects, like categories, down to its basic building blocks, like objects and time samples, they may have a fundamental impact on our conscious experience of the sensory world, including our perception of time itself.


INTRODUCTION
The concept of an inherent temporal structure underlying visual processing has a longstanding history in experimental psychology, going back at least to early psychophysical investigations on "moments in perception" (e.g., Shallice, 1964; Stroud, 1956; for a review, see VanRullen & Koch, 2003). According to this view, the visual system operates in discrete steps over time and parses the continuous flow of visual signals into distinct temporal fragments. Indeed, discretization makes much sense from a computational standpoint, because it reduces complexity and enables cascade-like information transfer. In recent years, neuroscience research has seen a resurgence of this perspective, because of ample empirical evidence from electrophysiology that connects the typical time frames of "perceptual moments" (≈100–200 msec) to brain oscillations (for reviews, see Herzog, Kammer, & Scharnowski, 2016; VanRullen, 2016). By now, it has become increasingly clear that multiple neural oscillations on different time scales, especially in the theta- (at ≈5 Hz) and alpha-frequency bands (at ≈10 Hz), are involved in the temporal organization of visual perception and attention. The main focus of this Perspective lies on alpha oscillations, which are typically associated with the functional inhibition of brain processes (Haegens, Nácher, Luna, Romo, & Jensen, 2011; for a review, see Klimesch, Sauseng, & Hanslmayr, 2007). In terms of temporal dynamics, alpha oscillations are thought to set time frames for occipital cortex excitability via the pulsed release from inhibition, such that visual inputs are segmented into discrete samples (VanRullen, Reddy, & Koch, 2005; for a review, see Mazaheri & Jensen, 2010). In this view, alpha oscillations over occipital areas may function as the frame rate of visual perception and define its temporal resolution (Cecere, Rees, & Romei, 2015; Samaha & Postle, 2015). By contrast, theta oscillations are hypothesized to temporally coordinate covert visual attention, in
a similar way and on a similar time scale compared with overt, oculomotor exploration via saccadic eye movements (Otero-Millan, Troncoso, Macknik, Serrano-Pedraza, & Martinez-Conde, 2008). For example, covert attention samples with theta-frequency dynamics between different spatial locations (Landau & Fries, 2012), between different objects (Fiebelkorn, Saalmann, & Kastner, 2013), or for apparent motion (Ronconi, Oosterhof, Bonmassar, & Melcher, 2017). In line with these findings, an emerging theoretical view for theta oscillations identifies their functional role in alternating covert attention states between sensory (sampling) and motor (shifting) processes in parietal cortex (Fiebelkorn & Kastner, 2019). Theta and alpha oscillations may also be temporally coordinated relative to each other, to support the sampling (at alpha) and switching (at theta) between multiple objects (Jia, Liu, Fang, & Luo, 2017). In summary, theta oscillations seem to operate sequentially over longer time scales (≈200 msec) and across space, whereas alpha oscillations are thought to structure visual processing in sampling windows over faster time scales (≈100 msec) and at the same spatial location.
A common characteristic of previous explanations for the functional role of brain oscillations is their reductionist, technical flavor, depicting visual-temporal processing akin to a cinematographic system. In the case of theta, visual attention is typically viewed as a surveillance camera, which samples and shifts the processing focus across spatial locations, to cope with exploitation–exploration problems. In the case of alpha, vision should theoretically work like a video camera, which takes distinct sensory snapshots and temporally integrates them into coherent percepts. As highlighted by a recent controversy about this topic, however, these reductionist-technical metaphors for visual-temporal processing might not be the complete picture, at least in the case of alpha. In fact, the conception of an alpha-driven, video-camera-like discretization device for sensory inputs embedded in the visual system has been severely challenged by recent inconsistent evidence. In the following, I will first briefly review the core aspects of this debate and then lay out arguments for a new view on alpha and visual-temporal processing. This will involve a selective review focusing on recent findings that specifically emphasize the tight relationship between alpha and top-down control over temporal perception. In summary, this Perspective aims to turn away from the simple reductionist-technical view, which may help to resolve some important related aspects of the recent controversial findings in the field, but may also open new vistas on how we consciously experience the flow of time itself.

THE STATE OF THE ART AND THE CONTROVERSY ABOUT ALPHA AS VISUAL-TEMPORAL RESOLUTION
On the neural level, temporally discrete processing could result from the intermittent and synchronous firing of excitatory and inhibitory neural ensembles, which gives rise to oscillations in their mean local field potential. Current interpretations of discrete temporal perception focus on the integration processes between this ongoing discretization machinery, driven by transient neural network activity, and input signals from downstream sensory neurons. They are framed around the idea that oscillatory cycles in the brain constitute discrete event boundaries for input integration (which is in line with earlier theories; Varela, Toro, Roy John, & Schwartz, 1981; Kristofferson, 1967). In short: Two signals falling within the same cycle are integrated into one event, whereas temporal segregation occurs when they fall into two different cycles. Let us call this perspective in the following the "alpha as visual-temporal resolution" hypothesis. Temporal integration processes are typically investigated in fusion, integration, or masking paradigms by experimentally manipulating the temporal relationships between (typically two) successive sensory stimuli (Di Lollo, 1980). In more recent years, neuroscience research has combined similar temporal integration paradigms with the analysis of the phase and frequency of neural oscillations (for a review, see Lundqvist & Wutz, 2022). The underlying rationale was that faster oscillatory frequencies reflect a higher visual-temporal resolution, whereas slower frequencies indicate a lower resolution. In particular, two recent studies have been very influential in pushing this view.
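The cycle-boundary logic of this hypothesis can be made concrete in a few lines of code. The sketch below is a deliberately simplified toy model, not any of the cited authors' actual analysis: it assumes that cycle boundaries tile time at a fixed period from a fixed starting phase, and all event times and frequencies are invented for illustration.

```python
def same_cycle(t1_ms, t2_ms, freq_hz, phase_ms=0.0):
    """True if both event times fall into the same oscillatory cycle.

    Simplifying assumption: cycles tile time with period 1000/freq_hz msec,
    starting at phase_ms.
    """
    period_ms = 1000.0 / freq_hz
    cycle1 = int((t1_ms - phase_ms) // period_ms)
    cycle2 = int((t2_ms - phase_ms) // period_ms)
    return cycle1 == cycle2

# Two flashes at 60 and 110 msec (a 50-msec ISI):
print(same_cycle(60, 110, freq_hz=10))  # False -> segregated at 10 Hz (100-msec cycles)
print(same_cycle(60, 110, freq_hz=8))   # True  -> integrated at 8 Hz (125-msec cycles)
```

On this toy picture, a faster alpha rhythm shortens the cycle and thereby sharpens temporal resolution, which is precisely the rationale behind correlating individual alpha frequency with fusion thresholds.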
The first prime example is a study by Samaha and Postle (2015), in which the participants viewed, on different trials, either two brief visual flashes with varying ISIs between them or one visual flash of identical overall duration. The participants' task was to indicate whether they had perceived one or two flashes per trial. They observed the expected pattern for such two-flash fusion experiments: With short ISIs, the two flashes were predominantly seen as one single flash, whereas with longer ISIs, the proportion of two-flash responses reached almost perfect discrimination performance. Moreover, this work showed clear correlational evidence that individual participants with finer visual-temporal resolution, as indicated by lower two-flash fusion thresholds, generate faster alpha oscillations over occipital EEG electrodes. A second important observation was reported by Cecere and colleagues (2015). Here, the authors measured the temporal resolution of the visual system indirectly through illusory cross-modal integration in the sound-induced flash illusion. This phenomenon describes the illusory percept of a second visual flash when two auditory sounds are presented within a circumscribed temporal interval together with actually just one single visual flash. Although the involved functional mechanisms of this illusion are far from fully resolved, the underlying assumption is that cross-modal interactions are only observed for sounds presented within the temporal frame in which a visual percept is established. In the study, the proportion of two-flash responses declined monotonically with increasing intersound interval, such that short ISIs reliably induced the illusory percept of two flashes and long ISIs favored accurate visual perception of a single flash. Furthermore, there was a strong correlation between the participants' inflection points of their psychometric functions and the simultaneously recorded alpha peak frequency over occipital EEG channels. As in the study by
Samaha and Postle (2015) above, this finding suggests that individuals with finer visual-temporal resolution, as indicated by lower inflection points, generate faster alpha oscillations. Moreover, this study went one step further and established a causal link by applying transcranial alternating current stimulation at each participant's individual alpha frequency and at ±2 Hz, to actively influence the frequency of the participants' ongoing alpha oscillations during the task. Consistent with their hypothesis, they found that speeding up versus slowing down alpha frequencies via transcranial alternating current stimulation resulted in lower versus greater inflection points, respectively. In summary, these findings indicate a causal role of the frequency of occipital alpha oscillations in the temporal resolution of (audio-)visual processing.
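The inflection point that these studies correlated with alpha frequency is simply the midpoint of a sigmoid fitted to the two-flash report rates. A minimal sketch, with entirely hypothetical data and a crude grid search in place of the maximum-likelihood fitting procedures actually used in such work:

```python
import math

# Hypothetical ISIs (msec) and proportions of "two flashes" reports.
isi = [20, 40, 60, 80, 100, 120, 140]
p_two = [0.02, 0.08, 0.25, 0.55, 0.80, 0.93, 0.98]

def logistic(x, mu, slope):
    """Logistic psychometric function; mu is the inflection point (p = 0.5)."""
    return 1.0 / (1.0 + math.exp(-(x - mu) / slope))

def fit_inflection(xs, ys):
    """Least-squares grid search over inflection point and slope; returns mu."""
    best_err, best_mu = float("inf"), None
    for mu in range(min(xs), max(xs) + 1):
        for slope in (5, 10, 15, 20, 25):
            err = sum((logistic(x, mu, slope) - y) ** 2 for x, y in zip(xs, ys))
            if err < best_err:
                best_err, best_mu = err, mu
    return best_mu

print(fit_inflection(isi, p_two))  # lands between 70 and 80 msec for these made-up data
```

Under the "alpha as visual-temporal resolution" hypothesis, speeding up alpha via stimulation should shift this mu downward (finer resolution), and slowing alpha should shift it upward.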
These hallmark findings for the "alpha as visual-temporal resolution" hypothesis were recently challenged by a new work by Buergers and Noppeney (2022). In this extensive, multiday EEG study, the authors used thorough psychophysics in a signal-detection framework, to distinguish between the effects of the observers' perceptual sensitivity and their response criterion for one- versus two-flash reports across different sensory contexts. The participants viewed all possible audiovisual stimulus combinations with one or two visual flashes together with either zero, one, or two auditory sounds in a randomly intermixed trial design. Akin to the above-mentioned work, the participants' task on each trial was to indicate whether they had perceived one or two flashes. Importantly, this included the very same stimulus conditions as in the two-flash fusion experiment by Samaha and Postle (2015), which mirrors the zero-sound condition here, and as in the sound-induced flash illusion reported by Cecere and colleagues (2015), which effectively boils down here to the one-flash-with-two-sounds condition. Thus, this work not only enables a direct comparison across the different sensory contexts but can also be seen as a replication attempt of the previously reported strong relationship between peak alpha frequency and visual-temporal resolution in two-flash fusion experiments. However, the main findings by Buergers and Noppeney (2022) were not as could be expected based on the previous work: They report null findings for individual peak alpha frequency both on sensitivity and criterion across all sensory contexts.
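The sensitivity-criterion distinction at the heart of this study comes from standard equal-variance signal detection theory. A hedged sketch with invented hit and false-alarm rates (the study's actual numbers and modeling are more involved):

```python
from statistics import NormalDist

def sdt_measures(hit_rate, fa_rate):
    """Equal-variance Gaussian SDT: sensitivity d' = z(H) - z(FA),
    response criterion c = -(z(H) + z(FA)) / 2."""
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -(z(hit_rate) + z(fa_rate)) / 2.0
    return d_prime, criterion

# Invented example: "two flashes" reported on 85% of true two-flash trials
# (hits) and on 20% of one-flash trials (false alarms).
d_prime, criterion = sdt_measures(0.85, 0.20)
print(round(d_prime, 2), round(criterion, 2))  # 1.88 -0.1
```

Separating the two quantities is what makes the challenge so pointed: in Buergers and Noppeney (2022), neither perceptual resolution (d') nor reporting bias (c) tracked peak alpha frequency.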

A NEW VIEW ON ALPHA: PERCEPTION SETS
What might be possible reasons for these apparent empirical inconsistencies about the influence of peak alpha frequency on the visual-temporal resolution? In my eyes, the critical distinction for comparing and interpreting these two seemingly contradictory sets of results consists in the experimental settings of presenting one versus many sensory contexts across different trials. As outlined above, in the two-flash fusion experiment by Samaha and Postle (2015), there was only one sensory context in the visual modality, with either two brief flashes or one flash, which, however, was matched to the two-flash trials in total physical duration. Likewise, the alpha effects for the sound-induced flash illusion were found when one flash was presented together with two sounds over and over again for the entire course of the experiment (Cecere et al., 2015). Conceivably, such repetitive experimental settings with identical or highly ambiguous stimulus conditions are crucial for observing alpha effects on the visual-temporal resolution, but also on perception more generally. Under those conditions, the perceptual reality depends more strongly on endogenous, top-down processes than on objective physical parameters. By contrast, Buergers and Noppeney (2022) showed a different sensory context with either zero, one, or two sounds on each trial in a fully random trial design. As also noted by the authors themselves, their design may have altered the participants' perceptual state on each trial, because they were repeatedly confronted with different stimuli and responses, often including veridical one- and two-flash percepts. Moreover, the randomly intermixed stimulus presentation may have prevented the building up of expectations and predictive processes about upcoming perceptual experiences. In other words, it may have hindered the participants' minds from taking over their perceptual reality. It would be very interesting to know whether the participants were able to generate strong alpha
activity in absolute terms under these conditions. Taking the seemingly contradictory findings together suggests that the recruitment of alpha oscillations, including the modulation of alpha frequency, for temporally resolving visual stimuli reflects much more the observers' endogenous processing dynamics than the discrete processing of sensory inputs. This view concurs well with previous interpretations of alpha activity as a marker of different brain states for externally versus internally oriented processing modes (for a review, see Hanslmayr, Gross, Klimesch, & Shapiro, 2011). Several lines of evidence point in this direction: First, the well-known strong alpha amplitudes under resting-state conditions can be seen as anecdotal evidence, especially when the eyes are shut, that is, when the sensory reality is completely blocked from having an impact on neural function and only internal processes play a role. A second line of evidence comes from TMS studies, which suggest that alpha oscillations play a key role in regulating cortical excitability to external inputs (Romei, Gross, & Thut, 2010) and in internally generated phosphene perception (Dugué, Marque, & VanRullen, 2011; Romei et al., 2008). A third paramount example is spatial cueing, in which less alpha power commonly indexes external spatial attention toward the attended hemifield, whereas greater power is associated with the ignored hemifield and more internally oriented processes (Haegens, Händel, & Jensen, 2011; Sauseng et al., 2005). Fourth, in visual detection and discrimination experiments, significant portions of trial-by-trial variability can be explained by differences in alpha activity, that is, by stronger prestimulus alpha power and phase coupling with poorer performance (Van Dijk, Schoffelen, Oostenveld, & Jensen, 2008; Hanslmayr et al., 2007), as well as by alpha phase opposition immediately before detected versus missed stimuli (Busch, Dubois, & VanRullen, 2009; Mathewson, Gratton, Fabiani,
Beck, & Ro, 2009). This may indicate stronger internal workings in the mind's eye and less processing of external inputs at times of strong alpha activity or at certain phases within an alpha cycle. Along those lines, it is interesting to note that the phase opposition effects only reliably predict performance on trials with high alpha power (Mathewson et al., 2009), which indicates that the two measures may not be independent. Moreover, recent evidence shows that their impact on perceptual performance may involve distinct neural populations, that is, mainly occipital and inferior-temporal (IT) areas for power and additionally prefrontal regions for phase (Zazio, Ruhnau, Weisz, & Wutz, 2022). In summary, these findings suggest that strong alpha activity indicates a stronger focus on internal, top-down processes, which come into play when external stimulations are subtle and less variable. Alpha dynamics may indicate what I will in the following refer to as the perceiver's internal "perception set" (in analogy to the familiar notion of a mind-set). Perception sets arise from previous sensory experiences building internally stored perceptual knowledge, which is mounted in neural networks that reverberate through alpha-frequency channels. They may reflect top-down strategies or biases projected onto sensory inputs, to support goal-directed behavior and build up coherent percepts. There is ample empirical evidence, including some of my own work, that alpha dynamics reflect such top-down projections. To illustrate this point, I will briefly summarize three examples that show top-down influences by alpha on the visual-temporal resolution, on object processing, and on image content.

CASE 1: ALPHA AS PERCEPTION SET FOR VISUAL-TEMPORAL RESOLUTION
First, and most tightly related to the relationship between alpha frequency and the visual-temporal resolution, alpha frequency can be top-down modulated by the observer according to the task demands for temporal segregation versus integration of visual stimuli (Wutz, Melcher, & Samaha, 2018). In this study, the participants viewed two brief display frames, which were separated by a short ISI and each of which contained annuli at predefined locations. In separate blocks, the participants' task was to identify either a missing or an odd element's location. The two display frames had to be temporally integrated over the brief ISI to locate the missing element. Conversely, to locate the odd element, the two display frames had to be kept separate over time. Importantly, the visual stimuli were identical on each trial, and the ISI between the display frames was preset for each individual participant to match their performance in either task at a given psychometric threshold. Thus, any effects between the two tasks cannot be attributed to sensory processing differences or to more domain-general differences, like difficulty or attention demands. The concurrently recorded magnetoencephalography (MEG) signals showed that the participants' occipital and IT alpha frequency just before and during stimulus processing slowed down for the temporal integration task and sped up when temporal segregation was required. In line with the "alpha as visual-temporal resolution" hypothesis, these findings indicate a strong relationship between peak alpha frequency and whether we temporally combine visual samples into a simultaneous percept versus perceive them as successive, separate instances over time. Beyond that, they reveal the endogenous top-down control of the observer over their individual alpha frequency according to, and timed to, visual task demands. In this view, alpha dynamics are not primarily a hallmark of the discretization of sensory stimuli per se but
rather of our own perception set for how we process the temporal flow of sensory inputs on the millisecond scale. We can either process it input by input and each time rapidly build percepts, or we can stay receptive for a while longer before we form a final percept, to emphasize temporal continuity.

CASE 2: ALPHA AS PERCEPTION SET FOR OBJECT PROCESSING
As a second example, the strategic modulation of alpha dynamics by top-down task goals was also observed over longer time scales, on the level of object processing (Wutz, Zazio, & Weisz, 2020). This is an important point, because objects are a core building block of visual cognition at an intermediate processing level between low-level sensory stimuli and high-level conscious experiences (Kahneman, Treisman, & Gibbs, 1992). In this MEG study, the participants viewed multiple, randomly moving objects among distractors over several seconds. After a memory delay period, they had to indicate either the final location of one object from a predefined pool of target objects (i.e., a partial report procedure) or the mean position of all target objects (i.e., the geometrical centroid). This required the participants to process the dynamic scene either object by object or to summarize the objects as a group. As in the study above, the stimuli, responses, and also the demand to keep track of multiple spatial locations over time were held constant between the two conditions. In addition, the two conditions were again run in separate experimental blocks. During the tracking of the moving objects and during the memory delay, there was increased alpha-band power packed into brief oscillatory activity bursts in inferior parietal cortex. These activity bursts were diagnostic for the performed task (local objects vs.
global groups) and the number of processed objects. There were more alpha-burst occurrences on object- versus group-level processing trials and below/at versus above the typical capacity limit (at ≈4 objects). Moreover, greater alpha-bursting activity was also observed in a dual-task condition, in which the required task was cued after the trial, and it extended into the subsequent memory delay. This suggests that the effects reflect internal processing dynamics, which are free from sensory influences and are strategically tied to the required object-level processing. The altered alpha dynamics seem to reflect the observer's endogenous perception set for processing dynamic scenes as multiple local objects. Conversely, alpha bursting broke down when object-level processing was above capacity or when it was not required for the task because of grouping strategies. Furthermore, the findings suggest that perception sets can structure visual-temporal processing over several seconds.

CASE 3: ALPHA AS PERCEPTION SET FOR IMAGE CONTENT
The third example study, by Rassi, Wutz, Müller-Voggel, and Weisz (2019), illustrates very vividly the interpretation of alpha dynamics in terms of perception sets reverberating in pre-established semantic pathways. This study went beyond previous evidence for an influence of alpha on temporal or object processing, that is, on whether an object is seen at a given time. Instead, it showed the impact of prestimulus alpha fluctuations on the content of perception, that is, on whether one or another object is seen at a given time. The authors used the well-known Rubin vase-face illusion, in which a bistable visual image is perceived either as a vase in the image center or as two faces directed toward each other. This perceptually ambiguous image was briefly presented during the MEG recording, and the participants' task was to report their perceived interpretation on each trial. Again, the physical stimuli were identical on each trial, and the perceptual reports were stochastic over trials; that is, there was no time-varying bias to report one or the other percept. Multivariate pattern analysis showed robust decoding of the perceptual report (vase vs.
face) in primary visual cortex (V1) and in the fusiform face area (FFA) after image onset. Most importantly, however, before stimulus onset, there was stronger connectivity in the alpha band between V1 and FFA when the participants subsequently reported having seen a face. Granger causality analysis revealed stronger feedback information flow from FFA to V1. The strength of this prestimulus feedback connectivity in the alpha band predicted not only the category of the upcoming percept but also the strength of the poststimulus neural activity associated with that percept. In a follow-up study, similar results were found under conditions of binocular rivalry (Rassi, Wutz, Peatfield, & Weisz, 2022). In summary, this body of work illustrates that alpha reflects internal processing dynamics in pre-established neural networks, or, in other words, stored perception sets, which in turn can strongly bias the moment-by-moment content of our visual experiences.

DISCUSSION
The recent inconsistent findings about the impact of alpha frequency in two-flash fusion experiments have sparked a fundamental debate about the "alpha as visual-temporal resolution" hypothesis. In this article, I have laid out a possible interpretation of the current data that aims to bring the seemingly contradictory findings together. Whereas strong alpha effects were observed in blocked experimental designs with repetitive, ambiguous stimuli and constant sensory contexts (Cecere et al., 2015; Samaha & Postle, 2015), there was no reliable impact when perceptual experiences were more often veridical and when different sensory contexts alternated randomly over the trials (Buergers & Noppeney, 2022). By and large, these data suggest that alpha influences perception more strongly when it depends more on endogenous, top-down processes than on the objective physical truth. From there, a new perspective emerged, in which alpha dynamics are not primarily associated with the discretization of sensory inputs per se but rather reflect the observer's internal processing dynamics: their so-called perception sets. Perception sets represent internally stored knowledge about how to build up percepts, which is formed from previous sensory experiences and can be strategically applied to support goal-directed behaviors. They are rooted in pre-established neural networks, which communicate through alpha-frequency channels.
This new perspective builds on previous findings that relate alpha activity primarily to internal, top-down processing modes (see the mini-review above), which immediately raises the question of how these internal processes might work. I have outlined three example cases for alpha-driven perception sets in recent neuroscience experiments, to illustrate that they function as a building plan or a structural design for percepts, over which the observer has top-down control. Consistent with the "alpha as visual-temporal resolution" hypothesis, the first example case shows the strategic modulation of alpha frequency in V1 and IT areas, to regulate temporal frames for input sampling and integration (Wutz et al., 2018). This top-down control over alpha frequency and phase may not only be used to alter the observer's visual-temporal resolution, but may also have a fundamental impact on temporal prediction (Samaha, Bauer, Cimaroli, & Postle, 2015) and the perception of causality (Cravo, Santos, Reyes, Caetano, & Claessens, 2015). In the second example case, alpha-bursting activity in inferior parietal cortex was associated with the intentional switching between object and group levels during the attentive tracking of objects over time (Wutz et al., 2020). In line with recent evidence for a strong role of alpha in feature binding (Zhang, Zhang, Cai, Luo, & Fang, 2019), this finding indicates that alpha may also reflect goal-directed influence on the assembly of objects within and across time frames and the building up of complex, dynamic scenes over extended time periods. Finally, the third example case reveals alpha-timed connectivity pathways between V1 and FFA, employed for resolving perceptual conflicts in favor of certain, behaviorally relevant image categories, like faces (Rassi et al., 2019, 2022). These findings demonstrate the top-down impact of alpha activity on inscribing content into percepts and confirm its important role for perceptual inference
and decision-making in cases of perceptual ambiguity (Shen, Han, Chen, & Chen, 2019; Sherman, Kanai, Seth, & VanRullen, 2016). Overall, this view fits well with recent work from the developmental psychology literature, which revealed spectral-temporal differences for category decoding between adults and 6- to 8-month-old infants, possibly reflecting different, age-related perception sets for certain image categories. For adults, category decoding emerged rapidly in the alpha band, whereas equivalent category signals in infants peaked more slowly and involved primarily the theta band (Xie et al., 2022). The slower theta-based category signals seen in infants may be our innate rhythm for learning new associations and thus more in tune with the natural way we explore our visual reality and make sense of it (Landau & Fries, 2012). Conversely, the ultra-rapid categorization skills seen previously in adult humans (VanRullen & Thorpe, 2001) may be acquired through the course of neural development and then make use of already established semantic networks, which communicate through alpha-frequency channels and incorporate learned, top-down knowledge.
In terms of neural mechanisms, top-down influences on perception have been previously related to cross-frequency interactions between nested alpha-gamma oscillations (Jensen, Gips, Bergmann, & Bonnefond, 2014; Jensen, Bonnefond, & VanRullen, 2012). In this view, feature-selective neurons in visual cortex may code for bottom-up inputs with gamma activity (or spiking) and enable the detection of change and motion on the millisecond scale. Conversely, distributed neural ensembles oscillating at alpha frequencies may top-down modulate local activity and convey time frames that already contain a blueprint for coherent perceptual experiences. Alpha-gamma interactions may also involve cross-laminar interactions, in which top-down influences from deep layers regulate excitability and sensory processing in the superficial and input layers (Van Kerkoerle et al., 2014; Spaak, Bonnefond, Maier, Leopold, & Jensen, 2012). Another possibility may be traveling waves between occipital and parietal cortex, through which higher-level, top-down priors iteratively and recursively match with and refine lower-level sensory processing (Zhigalov & Jensen, 2023; Alamia & VanRullen, 2019). Such perceptual echoes may shape ongoing perception through expectations and predictive processes resulting from internally stored knowledge. Of course, the influence of subcortical connections, in particular of thalamic contributions, cannot be ruled out, because of the inherent limitations of magneto-/electroencephalography-based source localizations.
The idea of pre-established, endogenous factors influencing ongoing perception is not new. For example, the perception sets discussed here bear close resemblance to earlier descriptions of "visual routines" (Roelfsema, Lamme, & Spekreijse, 2000; Ullman, 1984). This concept describes specialized sequences, put together from more basic elemental operations and implemented in the visual system, to analyze incoming sensory signals in a fast, efficient, and predefined manner. Typical examples are visual search, texture segregation, or contour grouping. What is new here is that the identified perception sets are differentiated enough to engage in high-level visual processes, like categories, but also in more intermediate- to low-level aspects, like objects and time samples. Thus, similar mechanisms may organize perceptual experiences in favor of one or the other image category, process dynamic scenes as multiple local objects or as a grouped swarm, as well as combine time samples into a simultaneous percept or separate them to give rise to the perception of temporal succession. This is because perception sets reflect the reverberation of basic cognitive processes retrieved from a stored, complex knowledge-based system. It is knowledge-based because it provides the observers with the ability to selectively retrieve and replay stored information that represents the meaning of sensory inputs. This view concurs with previous pioneering theories (Klimesch, 2012), which interpret alpha activity as the temporal structure of semantically orienting, or "knowing what is seen" with respect to its time, space, and context. In this sense, perception sets may represent a direct gateway to conscious experiences, because recent influential theories identify the key attribute of consciousness-supporting systems in global and differentiated (i.e., nonstereotypical) connectivity, which represents integrated knowledge (Tononi, 2004).
As a final note, perception sets seem to influence perception down to its very basic building blocks, including its temporal organization. The reason for this may be that the brain generates its own temporal structure, which is largely organized by oscillations (Buzsáki & Draguhn, 2004). Because perception sets are embedded in brain networks that reverberate through alpha-frequency channels, their processing dynamics may in turn influence the way we perceive the temporal aspects of our reality. Because our subjective perception of the temporal duration of events correlates strongly with the amount of information that we can retrieve from them (Wutz, Shukla, Bapi, & Melcher, 2015), this may alter the quantity and the quality of our conscious experiences themselves (Van Wassenhove, 2017). The work presented here suggests that we can top-down regulate the timing and speed of neural and perceptual processes relative to the arrival of sensory inputs, according to behavioral goals. Consequently, the perception of temporally successive events may not result from a clock-like, cinematographic discretization device; instead, it may be a logical corollary of a top-down controlled, selective temporal attention mechanism that functions much like its spatial counterpart by selecting, individuating, and integrating discrete visual units. Parsing the continuous flow of sensory signals into distinct, successive fragments of time may thus be a top-down strategy of the visual system to prevent information overflow and construct physically coherent perceptions. For example, it would be very confusing to perceive a moving object at two different spatial locations simultaneously. Conversely, certain meditation or mindfulness practices may use this top-down control on the temporal organization of perception to remain receptive, yet attentive, over longer time periods, to integrate more detailed, convergent inputs and understand more complex relationships. Thus, our perception of time may ultimately depend on our perspective, or our perception set, for it, which in turn is governed by the temporal dynamics of our neural processes (for a similar argument, see Rovelli, 2019). In this view, space and time may not be the "pure forms of … sensation," as Kant (1781) put it; instead, the internally generated activity of our mind may play a key role. Or, in short: "mind over matter."
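The notion that alpha cycles could act as discrete integration windows, within which inputs are fused and across which they are segregated, can be captured by a deliberately simplistic sketch. The function name, the fixed ~100 msec period, and the phase parameter are my own illustrative assumptions, not a model from the text:

```python
# Toy sketch: if alpha cycles set discrete temporal windows, two inputs
# falling within one ~100 ms window are integrated into a single percept,
# while inputs in different windows are perceived as successive events.
# All names and parameters here are illustrative assumptions.

def perceived_events(stimulus_times_ms, alpha_period_ms=100.0, phase_ms=0.0):
    """Count distinct percepts by binning stimulus times into alpha cycles."""
    windows = {int((t - phase_ms) // alpha_period_ms) for t in stimulus_times_ms}
    return len(windows)

# Two flashes 40 ms apart fall into the same window -> one fused percept.
print(perceived_events([30, 70]))   # 1
# Two flashes 120 ms apart span two windows -> two successive percepts.
print(perceived_events([30, 150]))  # 2
```

On this toy account, shifting the window phase or shortening the period (i.e., a faster alpha frequency) changes whether two inputs are fused or segregated, which mirrors the idea that top-down adjustments of alpha dynamics could regulate visual-temporal resolution.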

CONCLUSION
Alpha dynamics in the brain may reflect not the temporal resolution of discrete visual processing per se, but rather our own idiosyncratic projections onto the visual world, which may alter our perception of time itself.
Corresponding author: Andreas Wutz, Department of Psychology and Centre for Cognitive Neuroscience, University of Salzburg, Salzburg, Austria, or via e-mail: andreas.wutz@plus.ac.at.