Spherical arena reveals optokinetic response tuning to stimulus location, size and frequency across the entire visual field of larval zebrafish

Many animals have large visual fields, allowing them to observe almost any stimulus in their surround. The underlying sensory circuits have evolved to sample those visual field regions most informative to the animal. These regions can vary between different visually mediated behaviours, such as stabilisation and hunting behaviour. Despite this, relatively small displays are often used in vision neuroscience, making it difficult to study the tuning of the visual system to specific visual field locations. To overcome these technical limitations and reveal the organisation of motion circuits with respect


Introduction
The layout of the retina and the visual system as a whole evolved to serve the specific behavioural tasks animals need to perform in order to survive in their respective habitats. A well-known example is the position of the eyes in the head, which varies between hunting animals (frontal eyes) and animals that frequently need to avoid predation (lateral eyes) (Cronin, 2014). Hunting animals keep prey within particular visual field regions to maximise behavioural performance (Bianco, Kampff, & Engert, 2011; Hoy, Yavorska, Wehr, & Niell, 2016; Smolka, Zeil, & Hemmi, 2011; Yoshimatsu, Schröder, Berens, & Baden, 2019; Zhang, Kim, Sanes, & Meister, 2012). To avoid predation, however, it is useful to observe a large proportion of visual space, especially those regions in which predators are most likely to occur (Smolka et al., 2011; Zhang et al., 2012). The ecological significance of visual stimuli thus depends on their location within the visual field, and this dependence is paralleled by non-uniform processing channels across the retina. This non-uniformity manifests as an area centralis or a fovea in many species: a region of heightened photoreceptor density in the central retina that serves to increase visual performance in the corresponding visual field regions. Photoreceptor densities put a direct physical limit on performance parameters such as spatial resolution (Haug, Biehlmaier, Mueller, & Neuhauss, 2010; Merigan & Katz, 1990). In addition to these restrictions mediated by the peripheral sensory circuitry, an animal's use of certain visual field regions is also affected by behaviour-specific neural pathways and orientation behaviour. The resulting combination of retinal and extra-retinal anisotropies affects behavioural performance in different tasks, such as feeding and stabilisation behaviour, depending on visual field location (Baden et al., 2013; Bianco et al., 2011; Hoy et al., 2016; Murasugi & Howard, 1989; Shimizu et al., 2010; Yang & Guo, 2013; Zimmermann et al., 2018).
Investigating behavioural performance limits and non-uniformities can offer insights into the processing capabilities and ecological adaptations of vertebrate brains, especially if they can be studied and quantitatively understood at each processing step. The larval zebrafish is a promising organism for such an endeavour, since its brain is small and a wide array of experimental techniques is available (Baier & Scott, 2009; McLean & Fetcho, 2011). Zebrafish are lateral-eyed animals and have a large visual field, which covers 163° per eye (Easter & Nicola, 1996). Their retina contains four different cone photoreceptor types (Nawrocki, Bremiller, Streisinger, & Kaplan, 1985), each distributed differently across the retina. UV photoreceptors are densest in the ventrotemporal retina (area temporalis ventralis), whereas the red, green and blue photoreceptors cover more central retinal regions (Zimmermann et al., 2018).
Although zebrafish larvae perform a wide range of visually mediated behaviours, ranging from prey capture (Trivedi & Bollmann, 2013) and escape behaviour (Heap, Vanwalleghem, Thompson, Favre-Bulle, & Scott, 2018) to stabilisation behaviour (Kubo et al., 2014; Orger, Kampff, Severi, Bollmann, & Engert, 2008), the importance of stimulus location within the visual field is still not well understood in most cases (but see Bianco et al., 2011, for prey capture). During visually mediated stabilisation behaviours, such as optokinetic and optomotor responses, animals move their eyes and bodies, respectively, in order to stabilise the retinal image and/or the body position relative to the visual surround. The optokinetic response (OKR) consists of reflexively executed stereotypical eye movements, in which phases of stimulus "tracking" (slow phases) are interrupted by quick phases, in which eye position is reset by a saccade in the direction opposite to stimulus motion. In humans, optokinetic responses are strongest in the central visual field (Howard & Ohmi, 1984). Furthermore, lower visual field locations of the stimulus evoke a stronger OKR than upper visual field locations, which likely represents an adaptation to the rich optic flow information available from structures on the ground in the natural environments of primates (Hafed & Chen, 2016; Murasugi & Howard, 1989).
In zebrafish larvae, the OKR has been used extensively to assess visual function in genetically altered animals (Brockerhoff et al., 1995; Muto et al., 2005; Neuhauss et al., 1999), and OKR tuning to the velocity, frequency, and contrast of grating stimuli has been measured (Clark, 1981; Cohen, Matsuo, & Raphan, 1977; Huang & Neuhauss, 2008; Rinner, Rick, & Neuhauss, 2005). While zebrafish can distinguish rotational from translational optic flow to evoke appropriate optokinetic and optomotor responses (Kubo et al., 2014; Naumann et al., 2016; Wang, Hinz, Haikala, Reiff, & Arrenberg, 2019), it is still unclear which regions of the visual field zebrafish preferentially observe during these behaviours. The aquatic lifestyle, in combination with the preferred swimming depths (Lindsey, Smith, & Croll, 2010), might cause the lower visual field to contain less relevant information than it does for terrestrial animals. This in turn might have resulted in behavioural biases towards other, more informative, visual field regions. A corresponding systematic behavioural quantification in zebrafish, which would relate OKR behaviour to naturally occurring motion statistics and to the underlying neuronal representations in the retina and retino-recipient brain structures, has so far been prevented by technical limitations. Specifically, little is known about (i) the dependence of OKR gain on stimulus location, (ii) its dependence on stimulus size, (iii) possible interactions between stimulus location, size and frequency, (iv) putative asymmetries between the left and right hemispheres of the visual field, and (v) the relationship between a putative dependence of OKR on stimulus location and zebrafish retinal architecture.
In other species with large visual fields, such as Drosophila, full-surround stimulation setups have been used successfully (Kim, Rouault, Druckmann, & Jayaraman, 2017; Maisak et al., 2013; Reiser & Dickinson, 2008), but to date, none has been used for zebrafish. This is at least partly due to their aquatic environment and the associated difficulties regarding the refraction of stimulus light at the air-water interface. Such distortions of shape can be partially compensated by pre-emptively altering the shape of the stimulus. With regular computer screens or video projection, however, the resulting luminance profiles remain anisotropic, potentially biasing the response towards brighter locations. Additionally, most stimulus arenas cannot easily be combined with the recording of neural activity, e.g., via calcium imaging, as stimulus light and calcium fluorescence overlap in both the spectral and time domains. These challenges must be overcome to enable full-field visual stimulation in zebrafish neurophysiology experiments.
Here, we present a novel visual stimulus arena for aquatic animals, which covers almost the entire surround of the animal, and use it to characterise the anisotropy of the zebrafish OKR across different visual field locations, as well as its tuning to stimulus size, spatial frequency, and left-side versus right-side stimulus locations. We find that the OKR is mostly symmetric across both eyes and driven most strongly by lateral stimulus locations. These stimulus locations approximately correspond to a retinal region of increased photoreceptor density. By rotating the experimental setup and/or the animal, our control experiments revealed that additional extra-retinal determinants of OKR drive exist as well. Our characterisation of OKR drive across the visual field will help inform bottom-up models of the vertebrate neural pathways underlying optokinetic and other visual behaviour. To avoid stimulus aberrations at the air-to-water interface, we designed a nearly spherical glass bulb containing fish and medium. With this design, stimulus light from the surrounding arena is virtually unrefracted (light rays are orthogonal to the air-to-water interface) and reaches the eyes of the zebrafish larva in a straight line. Thus, no geometric corrections are required during stimulus design, and stimulus luminance is expected to be nearly isotropic across the visual field. We additionally designed the setup to minimise visual obstruction, and developed a new embedding technique to immobilise the larva at the tip of a narrow glass triangle (see Methods). In almost all possible positions, fish can thus perceive stimuli without interference. The distance between most adjacent LED pairs is smaller than the photoreceptor spacing in the larval retina (Haug et al., 2010; Tappeiner et al., 2012), resulting in good spatial resolution across the majority of the spherical arena surface (see detailed discussion in S1 Text). As flat square LED tiles cannot be perfectly arranged on a spherical surface, however, small triangular gaps are unavoidable. More importantly, several gaps in LED coverage resulting from structural elements of the arena were restricted mainly to the back, top, and bottom of the animal. The "keel" behind and in front of the fish supports the horizontal "ribs", and the circular openings at the top and bottom accommodate the optical path for eye tracking or scanning microscopy.

Stimulus position dependence of the optokinetic response
Horizontally moving vertical bars reliably elicit the OKR in zebrafish larvae (Beck, Gilland, Tank, & Baker, 2004). We used a stimulus which rotated clockwise and counter-clockwise with a sinusoidal velocity pattern (velocity amplitude 12.5 degrees/s, frequency of the velocity envelope 0.1 Hz, spatial frequency 0.06 cycles/degree; Fig 3a). OKR performance was calculated by measuring the amplitude of the resulting OKR slow-phase eye movements after the saccades had been removed (Fig 3b-d, Methods). The OKR gain then corresponds to the speed of the slow-phase eye movements divided by the speed of the stimulus (which is equivalent to the ratio of the eye position and stimulus position amplitudes). To quantify position tuning, we cropped the presented gratings (Fig 3a) to a disk-shaped area of constant size, centred on one of 38 nearly equidistant positions in the visual field (Fig 4a). The distribution of positions was symmetric between the left and right, upper and lower, as well as front and rear hemispheres, with some stimuli falling right on the edge between two hemispheres. Because permanent asymmetries in the stimulus arena or its surroundings could affect OKR gain, we repeated our experiments in a second group of larvae after rotating the arena by 180 degrees (S1d-e Fig), then matched the resulting pairs of OKR gains during data analysis (see Methods). Any remaining asymmetries in the OKR distributions should result from biological lateralisation.
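The gain computation described above can be sketched as follows. This is a minimal illustration on an idealised, saccade-free eye trace; the sampling rate, recording duration, and the "true" gain of 0.3 are invented for the example, and the actual analysis pipeline may differ.

```python
import numpy as np

rate = 100.0                                      # assumed sampling rate (Hz)
t = np.arange(0, 30, 1 / rate)                    # 30 s of recording
stim_vel = 12.5 * np.sin(2 * np.pi * 0.1 * t)     # velocity amplitude 12.5 deg/s,
                                                  # 0.1 Hz velocity envelope
stim_pos = np.cumsum(stim_vel) / rate             # integrate velocity to position

true_gain = 0.3
eye_pos = true_gain * stim_pos                    # idealised slow-phase eye trace

# OKR gain: slow-phase eye speed over stimulus speed, which for a sinusoidal
# stimulus equals the ratio of eye and stimulus position amplitudes
gain = np.ptp(eye_pos) / np.ptp(stim_pos)
```

For a real recording, the saccades would first be detected and removed before the amplitude ratio is taken.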
To overcome our spatially discrete sampling, we then fit the data with a symmetric bimodal function comprised of two Gaussian-like two-dimensional distributions on the stimulus sphere surface (see Methods), to determine the locations of highest OKR gain evoked by ipsilateral and contralateral stimuli, respectively. We observed significantly higher OKR gains in response to nearly lateral stimuli, and lower gains across the rest of the visual field (Fig 4b-d). OKR was strongest for stimuli near an azimuth of 80.3 degrees and an elevation of 6.1 degrees on the left side (in body-centred coordinates), and near -77.0 and -2.0 degrees on the right side: slightly rostral of the lateral meridian, and very close to the equator. Note that due to the fast stimulus speeds, the absolute slow-phase eye velocities were high, while the OKR gain was relatively low. We chose such high stimulus speeds in order to minimise the experimental recording time needed to obtain reliable OKR measurements for each visual field location.
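For illustration, a single-lobe version of such a fit can be run with `scipy.optimize.curve_fit`. The actual model is bimodal (one lobe per hemisphere) and its exact functional form is given in the Methods; the Gaussian-of-angular-distance form, lobe width, and synthetic data below are assumptions made for this sketch.

```python
import numpy as np
from scipy.optimize import curve_fit

def ang_dist_deg(az, el, az0, el0):
    # great-circle distance (degrees) between points on the stimulus sphere
    az, el, az0, el0 = np.radians(az), np.radians(el), np.radians(az0), np.radians(el0)
    cosd = np.sin(el) * np.sin(el0) + np.cos(el) * np.cos(el0) * np.cos(az - az0)
    return np.degrees(np.arccos(np.clip(cosd, -1.0, 1.0)))

def gain_model(X, amp, az0, el0, sigma, base):
    # one Gaussian-like lobe as a function of angular distance from the peak
    az, el = X
    d = ang_dist_deg(az, el, az0, el0)
    return base + amp * np.exp(-d**2 / (2 * sigma**2))

# synthetic left-hemisphere gains peaking near [80 deg, 6 deg], as in the text
rng = np.random.default_rng(0)
az = rng.uniform(0, 180, 38)
el = rng.uniform(-80, 80, 38)
gains = gain_model((az, el), 0.25, 80, 6, 40, 0.05) + rng.normal(0, 0.005, 38)

popt, _ = curve_fit(gain_model, (az, el), gains, p0=[0.2, 60, 0, 30, 0.0])
az_peak, el_peak = popt[1], popt[2]
```

Working on angular distance rather than raw azimuth/elevation differences keeps the fit well-behaved near the poles of the sphere.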
As our stimulus arena is not completely covered by LEDs (Fig 1c, Fig 1d), some areas remain permanently dark. These could interfere with the perception of stimuli presented on adjacent LEDs. This is especially relevant as LED coverage is almost perfect for some stimulus positions (near the equator), whereas the size of the triangular holes increases at others (towards the poles). We thus performed control experiments comparing the OKR gain evoked by a stimulus in a densely covered part of the arena to the OKR gain evoked by the same stimulus in the presence of additional dark triangular patches (S1a Fig). We found no significant difference in OKR gain at the p < 0.05 level (S1c Fig, left). Additionally, we performed another series of control experiments using a dark shape mimicking the dark structural elements, namely the front "keel" of the arena (S1b Fig). Again, we found no significant difference in OKR gain (S1c Fig, right), and thus ruled out that our position dependence data were corrupted by incomplete LED coverage. Since the eyes were moving freely in our experiments, the range of eye positions during OKR, the so-called beating field (Schaerer & Kirschfeld, 2000), could have changed with stimulus position. Instead, we found that animals maintained similar median horizontal eye positions (e.g., left eye: -83.7±1.8 degrees, right eye: 80.3±1.9 degrees, average median ± standard deviation of medians, n=7 fish, S2a Fig) even for the most peripheral stimulus positions.
A priori, it is unclear whether the sampling preference originates from the peculiarities of the sensory periphery in the eye or from the behavioural relevance inferred by central brain processing. The former would prioritise stimuli based on their position relative to the eye and, by extension, their representation on specific parts of the retina. The latter would prioritise stimuli based on their position relative to the environment, such as a predator approaching from the water surface. To distinguish these two possible effects in the context of OKR, as well as to reveal any stimulus asymmetries accidentally introduced during the experiment, we performed control experiments with larvae embedded upside-down (i.e., with their dorsum towards the lower pole of the arena). Unexpectedly, the elevation of highest OKR gains relative to the eye changed from slightly above to slightly below the equator of the visual field when comparing upright to inverted fish (S1h-k Fig): when upright, the azimuths and body-centred elevations of the peaks of the best fit were -67.8° and 8.4° for the left eye, as well as 73.1° and 6.2° for the right eye; when inverted, they were -88.8° and -1.2° for the left eye, as well as 80.0° and -12.2° for the right eye. These numbers were obtained from the gains of those eyes to which any given stimulus was directly visible. Because the set of visual stimuli presented to inverted fish stemmed from an earlier stimulus protocol with less even sampling of the visual field, a slight scaling of azimuths and elevations is expected. The consistent sign change of the elevation, however, is not. We performed a permutation test in which embedding-direction labels were randomly swapped while stimulus-location labels were maintained, and the Gaussian-type fit was then repeated on each permuted dataset. This test confirmed that fish preferred upward (in environmental coordinates) rather than dorsalward elevations (p < 0.05, S5 Code).
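The logic of such a label-swapping permutation test can be sketched as follows. Here the test statistic is simply the difference in mean peak elevation between groups rather than a full refit, and the group sizes and effect sizes are synthetic; the actual test (S5 Code) repeats the Gaussian-type fit on each permuted dataset.

```python
import numpy as np

rng = np.random.default_rng(1)
upright = rng.normal(7, 3, 10)     # body-centred peak elevations, upright fish
inverted = rng.normal(-7, 3, 10)   # same quantity for inverted fish

observed = upright.mean() - inverted.mean()
pooled = np.concatenate([upright, inverted])

n_perm = 10_000
hits = 0
for _ in range(n_perm):
    rng.shuffle(pooled)                            # random embedding-label swap
    diff = pooled[:10].mean() - pooled[10:].mean()
    hits += diff >= observed
p = (hits + 1) / (n_perm + 1)                      # one-sided permutation p-value
```

The `+1` terms implement the standard correction that prevents a permutation p-value of exactly zero.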
A simple potential explanation for this result would have been an adjustment by the fish of its vertical resting eye position between the upright and inverted body positions. However, time-lapse frontal microscopy images (S3 Fig, Methods) ruled this out, since for both upside-up and upside-down embedding the eyes were inclined by an average of about 4 degrees towards the dorsum (3.5±1.0° for the left eye, 4.9±0.8° for the right eye, mean ± s.e.m.). We also tested the influence of the camera and infrared light (840 nm) positions (Fig 4), which in either case should have been invisible to the fish (Shcherbakov et al., 2013), and found that they could not explain the observed differences. As the body-centred preferred location in upside-down embedded fish flipped from slightly dorsal to slightly ventral (S1j Fig), and thus remained virtually unchanged in environmental coordinates, optokinetic stimulus location preference appears to be related to the behavioural relevance of these stimulus positions, and cannot merely be caused by retinal feedforward circuitry.

Yoking of the non-stimulated eye
Almost all stimuli were presented monocularly, that is, in a position visible to only one of the two laterally located eyes. Without exception, zebrafish larvae responded with yoked movements of both the stimulated and the unstimulated eye. To rule out reflections of stimuli within the arena, we performed a series of experiments in which the unstimulated side of the glass bulb had been covered with a matte, black sheet of plastic. Reflections at the glass-air interface would otherwise cause monocular stimuli (that should only be visible to the ipsilateral eye) to also be seen by the contralateral eye. Yoking indices (YI; an index of 1 indicates completely monocular eye movements, an index of 0 perfectly conjugate eye movements, i.e., full yoking) were significantly different between the regular monocular setup (YI≈0.7) and the control setup with the black surface on the side of the unstimulated eye (YI≈0.2), confirming that yoking indices had been affected by reflections (S4 Fig). This suggests a crucial role for sharp reflections of the stimulus pattern at the glass-to-air or water-to-air interface (Arrenberg et al., unpublished) in our spherical setup and other commonly used stimulus arenas. We performed additional control experiments using a previously described setup (F. A. Dehmelt, A. von Daranyi, C. Leyden, & A. B. Arrenberg, 2018) with four flat LCD screens for stimulus presentation in a different room. In these experiments, stimuli were presented monocularly or binocularly, and the unstimulated eye was either (i) shielded with a blank, white shield placed directly in front of the displays, (ii) shielded with a matte, black sheet of aluminium foil placed inside the Petri dish (a control for possible reflections on the Petri dish wall), or (iii) stimulated with a stationary grating. This experiment showed that yoking was much reduced (YI≈0.3) if the non-stimulated eye saw a stationary grating (iii) instead of the white or black shields (i-ii, YI≈0.1) or a binocular motion stimulus (YI≈0) (S5 Fig, p<0.05).
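One plausible way to compute such an index, consistent with the stated endpoints (1 for fully monocular, 0 for perfectly conjugate movements), is a simple amplitude contrast between the two eyes; the paper's exact formula is given in the Methods and may differ from this sketch.

```python
import numpy as np

def yoking_index(stim_eye, unstim_eye):
    """Hypothetical yoking index: 1 = fully monocular, 0 = perfectly conjugate.

    Computed as a contrast ratio of slow-phase amplitudes; this merely matches
    the endpoints described in the text, not necessarily the published formula.
    """
    a_s = np.ptp(stim_eye)       # slow-phase amplitude, stimulated eye
    a_u = np.ptp(unstim_eye)     # slow-phase amplitude, unstimulated eye
    return (a_s - a_u) / (a_s + a_u)

t = np.linspace(0, 10, 1000)
stim = 5 * np.sin(2 * np.pi * 0.1 * t)         # stimulated-eye position trace

yi_conjugate = yoking_index(stim, stim)        # both eyes identical -> 0.0
yi_partial = yoking_index(stim, 0.2 * stim)    # weak yoking -> about 0.67
```

A contrast ratio of this form is bounded in [0, 1] whenever the unstimulated eye moves no more than the stimulated one.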

Spatial asymmetries
As multiple previous studies reported left-right asymmetries in zebrafish visuomotor processing and behaviour other than OKR (Andrew et al., 2009; Facchin, Argenton, & Bisazza, 2009; Sovrano & Andrew, 2006; Watkins, Miklósi, & Andrew, 2004), we computed an asymmetry index B (Methods) to reveal whether zebrafish OKR is lateralised in individuals or across the population. We did not observe a general asymmetry between the responses of the left and right eyes. Rather, our data are consistent with three distinct sources of asymmetry: individual bias towards one eye, shared bias across individuals, and asymmetries induced by the environment (including the experimental setup and stimulus arena). Through multivariate linear regression, we fit a linear model of asymmetries to our data (Methods), which combined data from fish embedded upside-up (Fig 3). Our results show that OKR behaviour is mostly symmetric across both eyes, with individual fish oftentimes having a dominant eye due to a seemingly random bias towards one eye (lateralisation) across fish. Some of the observed asymmetries are consistent with external factors. Therefore, the OKR gains presented in Fig 4 have been corrected in order to present only biologically meaningful differences (Methods).

Spatial frequency dependence of the optokinetic response
We investigated the spatial frequency tuning of OKR behaviour across visual field positions by presenting 7 different spatial frequencies of the basic stimulus, each cropped to a planar angle of 40 degrees, at different visual field locations. Because we held the temporal frequency constant, stimulus velocity decreased whenever spatial frequency increased. These 7 disk-shaped stimuli were presented while centred on one of 6 possible locations in different parts of the visual field, with 3 locations on each hemisphere: one near the location of highest OKR gain as determined in our experiments on position dependence (Fig 4), one in a nasal location, and one in a lower temporal location. In total, we thus presented 42 distinct types of stimuli (Table 3). For each stimulus location and eye, the highest OKR gain was observed at a spatial frequency of 0.03 to 0.05 cycles/degree (Fig 5). We did not observe any strong modulation of frequency dependence by stimulus location.
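The velocity-frequency trade-off mentioned above follows directly from holding temporal frequency constant, since angular velocity equals temporal frequency divided by spatial frequency. The base stimulus parameters (12.5 deg/s at 0.06 cycles/degree) imply a temporal frequency of 0.75 cycles/s; the list of tested spatial frequencies below is a hypothetical placeholder, not the set actually used (which is given in Table 3).

```python
# velocity [deg/s] = temporal frequency [cyc/s] / spatial frequency [cyc/deg]
temporal_freq = 12.5 * 0.06          # 0.75 cycles/s, from the base stimulus
spatial_freqs = [0.0075, 0.015, 0.03, 0.06, 0.12, 0.24, 0.48]  # hypothetical
velocities = [temporal_freq / sf for sf in spatial_freqs]
```

Doubling the spatial frequency thus halves the stimulus velocity, which is worth keeping in mind when interpreting the tuning curves.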

Size dependence of the optokinetic response
It is unclear to what extent small stimuli are effective in driving the OKR. We therefore employed a stimulus protocol with 7 OKR stimuli, each covering a different area of the sphere. Spatial and temporal frequencies were not altered, so bars appeared with the same width and velocity profile in all cases. These 7 disk-shaped stimuli were presented while centred on one of 6 possible locations, identical to those used to study frequency dependence, again yielding 42 unique stimuli. Stimulus sizes were chosen at logarithmic intervals, ranging from stimuli almost as small as the spatial resolution of the zebrafish retina to stimuli covering the entire arena. In line with many other psychophysical processes, OKR gain increased sigmoidally with the logarithm of stimulus size (Fig 6). Weak OKR behaviour was already observable in response to very small stimulus diameters (e.g., 10.4°, corresponding to 0.8 % of the surround), and performance reached half-maximum at a stimulus size of roughly 120° (a quarter of the entire surrounding space). As was the case for spatial frequency dependence, we did not observe any strong modulation of size dependence by stimulus location, although OKR gains of the left eye appeared more dependent on stimulus location than those of the right eye.
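A sigmoid-in-log-size relationship of this kind can be fit as sketched below. The parameterisation (a logistic in log10 diameter, with the half-maximum size as an explicit parameter), the synthetic gain values, and the stimulus diameters are assumptions for illustration; only the reported half-maximum near 120° is taken from the text.

```python
import numpy as np
from scipy.optimize import curve_fit

def log_sigmoid(size_deg, top, half_max_deg, slope):
    # OKR gain rises sigmoidally with the logarithm of stimulus diameter;
    # half_max_deg is the diameter at which gain reaches half its maximum
    x = np.log10(size_deg)
    return top / (1 + np.exp(-(x - np.log10(half_max_deg)) / slope))

# synthetic gains with half-maximum near 120 deg, as reported; all invented
sizes = np.array([10.4, 20.0, 40.0, 80.0, 120.0, 240.0, 360.0])
rng = np.random.default_rng(0)
gains = log_sigmoid(sizes, 0.30, 120.0, 0.15) + rng.normal(0, 0.005, sizes.size)

popt, _ = curve_fit(log_sigmoid, sizes, gains, p0=[0.3, 100.0, 0.2])
```

Making the half-maximum size a fit parameter, rather than deriving it afterwards, lets its uncertainty be read directly off the covariance matrix returned by `curve_fit`.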

Optokinetic response gain covaries with retinal density of long-wave sensitive photoreceptors
We hypothesised that the non-uniform distribution of OKR gain is related to the surface density of photoreceptors, and investigated this using data from a recent study (Zimmermann et al., 2018). For comparison, density maps in retinal coordinates, rather than body coordinates, are shown in S7 Fig. To register our OKR gain data onto the photoreceptor density maps, we took the average eye position into account, which was located horizontally at -84.8±6.2 degrees azimuth for the left and 80.1±6.5 degrees for the right eye (mean±st.dev., n=7 fish), and vertically at 3.5±3.2 degrees elevation for the left and 4.9±2.7 degrees for the right eye (n=10 fish). For green, blue and especially red receptors, the stimulus centred on the position of maximum OKR gain, as inferred from our oculomotor experiments (Fig 4b, Fig 4d), covers a region of near-maximum photoreceptor density (white ring in Fig 6). For ultraviolet receptors, there is no strong correlation between photoreceptor density and OKR gain.

Discussion
The spherical arena introduced here covers a large proportion of the surround and therefore lends itself to many other investigations of zebrafish and of other species with limited visual acuity. In comparison to other feasible technical solutions, such as video projection setups, our spherical LED array provides homogeneous light and contrast across the entire stimulation area. Stimulus design thereby becomes much easier, since stimulus warping and conditioning become unnecessary. When combined with calcium imaging in a scanning microscope, the use of LED arrays provides the additional advantage that the visual stimulus can be controlled with high temporal precision, fast enough to interlace visual stimuli and line scans.
Despite the common notion that the OKR is a whole-field gaze stabilisation behaviour, our results show that it can be driven effectively by moving stimuli that cover only small parts of the spherical surface (with half-maximum OKR gain at around 25 % of the surface). Our experiment on spatial frequency dependence further demonstrates that the spatial frequency tuning of the OKR is similar across retinal locations. We suggest two plausible explanations: (1) existing photoreceptor density differences are compensated for centrally, in the visual brain areas mediating the OKR, or (2) photoreceptor density is simply not the limiting factor for OKR performance in this frequency range.
Previous reports indicated that the zebrafish visual system is lateralised, with the left eye preferentially assessing novel stimuli and the right eye being associated with decisions to respond (Miklosi & Andrew, 1999; Sovrano & Andrew, 2006). We therefore investigated whether there are consistent behavioural asymmetries for the OKR, and observed almost no consistent inter-individual asymmetries in OKR between the left and right hemispheres of the visual field, other than those induced by external conditions. Individual fish, however, showed a wide and continuous range of biases towards either hemisphere.
We measured OKR gain in larvae at 5-7 days post fertilisation (dpf), whereas our data on photoreceptor densities correspond to slightly older, 7-8 dpf larvae. Owing to their rapid development, zebrafish undergo noticeable morphological changes on this timescale, but the zebrafish retina itself is known to be well developed by 5 dpf (Avanesov & Malicki, 2010), and stable OKR behaviour is exhibited from then on. Crucially, we did not observe a salient age-dependent spatial shift of maximum OKR gain between our 5 dpf and 7 dpf larvae (data not shown).
The qualitative match between red cone photoreceptor densities and the beating field surrounding the stimulus position driving the highest OKR gains may provide a mechanistic bottom-up explanation of the gradual differences associated with OKR. The correspondence of red photoreceptor density with the visual field map of OKR gain is consistent with the fact that our LEDs emit light with peak power at 568 nm, which should have activated the red cones most strongly. Our data are also in agreement with observations in other species that OKR drive is strongest when the moving stimulus covers the central visual field (Howard & Ohmi, 1984; Murasugi & Howard, 1989; Shimizu et al., 2010). In a simplistic, additive view of visual processing, larger numbers of receptors would be triggered by incident light, gradually leading to stronger activation of retinal ganglion cells and downstream circuits, eventually driving the extraocular eye muscles towards higher amplitudes. Instead, or in addition, the increased resolution offered by denser distributions of photoreceptors could help reduce sensory uncertainty (and increase visual acuity). It is unclear, however, how higher uncertainty would lead to consistently lower OKR gains instead of repeated switching between periods of higher and lower gains. If sensory uncertainty were indeed crucial to OKR tuning, presenting blurred or otherwise deteriorated stimuli should reduce OKR gain more strongly in disfavoured locations than in favoured locations. It is also possible that the correlations between OKR gain and photoreceptor density are entirely coincidental, as our spatial frequency tuning results for different stimulus locations had implied. Genetic zebrafish variants with altered photoreceptor distributions would thus be a valuable tool for further studies.
The pronounced increase in OKR gain for nearly lateral stimulus locations raises questions regarding the top-down behavioural significance of these directions in the natural habitat of larval zebrafish. While reduced OKR gains near the limits of the visual field might be expected, we show that gains are also reduced in the frontal binocular area, as well as in upper and lower visual field locations. Interestingly, when animals were mounted upside-down, they still preferred stimulus locations just above the equator of the environment. This result cannot be explained by shifted vertical resting eye positions in the inverted animal, which we have measured. Instead, it could potentially be explained by multimodal integration, where body orientation influences the preferred OKR stimulus locations via the vestibular system (Lafortune, Ireland, & Jell, 1990; Pettorossi, Ferraresi, Botti, Panichi, & Barmack, 2011; Zolotilina, Eremina, & Orlov, 1995). Furthermore, it seems possible that the unequal distribution of OKR gains across the visual field is related to the optic flow statistics that naturally occur in the habitats of larval zebrafish (Arunachalam, Raja, Vijayakumar, Malaiammal, & Mayden, 2013; Engeszer, Patterson, Rao, & Parichy, 2007; Parichy, 2015; Spence, Gerlach, Lawrence, & Smith, 2008; Zimmermann et al., 2018). For another stabilisation behaviour in zebrafish, the optomotor response (Orger et al., 2008), we have recently shown that the underlying circuits prefer stimulus locations in the lower temporal visual field to drive forward optomotor swimming (Wang, Hinz, Zhang, Thiele, & Arrenberg, preprint 2019). The optokinetic and the optomotor response are therefore preferentially driven by different regions of the visual field, suggesting that they occur in response to different types of optic flow in natural habitats. Both responses (OKR, OMR) are thought to be mediated by the pretectum (Kubo et al., 2014; Naumann et al., 2016),
and we therefore hypothesize that circuits mediating OKR and OMR segregate within the pretectum and form neuronal ensembles with mostly different receptive field centre locations.Future studies on pretectal visual feature extraction in the context of naturalistic stimulus statistics are needed in order to establish a more complete picture of the visual pathways and computations underlying zebrafish OKR, OMR and other visually mediated behaviours.

Animal experiments
Animal experiments were performed under licenses granted by local government authorities (Regierungspräsidium Tübingen) in accordance with German federal law and Baden-Württemberg state law. Approval of this license followed consultation of both in-house animal welfare officers and an external ethics board appointed by the local government. We used mitfa-/- animals (5-7 dpf) for the experiments, because this strain lacks the skin pigmentation that could interfere with eye tracking.

Coordinate systems and conventions
To remain consistent with the conventions adopted to describe stimuli and eye positions in previous publications, we adopted an East-North-Up (ENU) geographic coordinate system. In this system, all positions are relative to the fish itself, and are expressed as azimuth (horizontal angle, with positive values to the right of the fish), elevation (vertical angle, with positive values above the fish), and radius (distance to the fish). The point directly in front of the fish (at the rostrum) is located at [0°, 0°] azimuth and elevation. Azimuth angles cover the range [-180°, 180°] and elevation angles [-90°, 90°]. The azimuth sign is opposite to the conventional mathematical annotation of angles when looking top-down onto the fish. For a detailed description of the coordinate systems used and for transformations between Cartesian and geographic coordinates, please consult the supplementary material (S1 Text).
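The conversions referred to above can be sketched as follows. The Cartesian axis assignment (x out of the rostrum, y to the right of the fish, z up) is an assumption chosen so that azimuth is positive to the right and elevation positive above the fish, matching the conventions stated here; the paper's own axis definitions are in S1 Text.

```python
import numpy as np

def geographic_to_cartesian(azimuth_deg, elevation_deg, radius=1.0):
    """Fish-centred azimuth/elevation/radius to Cartesian coordinates.

    Assumed axes: x points out of the rostrum, y to the right of the fish,
    z up, so that [0 deg, 0 deg] lies directly in front of the fish.
    """
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    x = radius * np.cos(el) * np.cos(az)
    y = radius * np.cos(el) * np.sin(az)
    z = radius * np.sin(el)
    return x, y, z

def cartesian_to_geographic(x, y, z):
    """Inverse transform back to azimuth, elevation (degrees) and radius."""
    r = np.sqrt(x**2 + y**2 + z**2)
    elevation = np.degrees(np.arcsin(z / r))
    azimuth = np.degrees(np.arctan2(y, x))
    return azimuth, elevation, r

# round trip through the reported left-side peak location [80.3 deg, 6.1 deg]
az, el, r = cartesian_to_geographic(*geographic_to_cartesian(80.3, 6.1))
```

Using `arctan2` rather than `arctan` keeps the recovered azimuth in the full [-180°, 180°] range.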

Design of the spherical arena
Geometric design of the arena. The overall layout of the spherical arena was optimised to contain nearly the maximum number of LED tiles our hardware controllers can drive (232 out of a possible 240), arranged with minimal gaps in between. Care was also taken to leave sufficient gaps near the top and bottom poles to insert the optical pathway used to illuminate and record fish behaviour. A further 8 LED tiles could be included as optional covers for the top and bottom poles, bringing the total number to 240 out of 240 possible. A detailed walkthrough of the mathematical planning is found in the supplementary material (S1 Text).
Arena elements. The arena consists of a 3D-printed structural scaffold and green LED tiles (Kingbright TA08-81CGKWA, 20x20 mm each, peak power at 568 nm) hot-glued to the scaffold and connected by cable to a set of circuit boards with hardware controllers (Fig 2d).
Electronics and circuit design. To control the LEDs, we used circuit board designs and C controller code provided by Alexander Borst (MPI of Neurobiology, Martinsried) and Väinö Haikala and Dierk Reiff (University of Freiburg) (Joesch, Plett, Borst, & Reiff, 2008). Any custom circuit board design and code could be substituted for these, and alternative solutions exist, e.g., in Drosophila vision research (Suver, Huda, Iwasaki, Safarik, & Dickinson, 2016). At the front end, these electronics control the 8x8 LED matrices, which are multiplexed in time to allow control of individual LEDs with just 8 input and 8 output pins.
Optical pathway, illumination and video recording. A high-power infrared LED was placed outside the stimulus arena; its light was diffused by a sheet of milk glass and then guided towards the fish through the top hole of the arena (Fig 2b, Fig 2d). Non-absorbed IR light exits through the bottom hole, where it is focused onto an IR-sensitive camera. Between the arena and the proximal lens, a neutral density filter (NE13B, Thorlabs, ND 1.3) was inserted half-way (off-axis) into the optical pathway using an optical filter slider (CFH2/M, Thorlabs, positioned about 5 cm from the camera CCD chip) to improve image contrast (oblique detection). We used an 840 nm, 125 degree IR emitter (Roschwege Star-IR840-01-00-00, procured via Conrad Electronic GmbH as item 491118-62) in a custom casing, lenses LB1309 and LB1374 and mirror PF20-03-P01 (all Thorlabs GmbH), and an IR-sensitive camera (DMK23U618, The Imaging Source GmbH). Approximate distances between elements are 14.5 cm (IR source to first lens), 12 cm (first lens to centre of glass bulb), 22 cm (bulb centre to mirror centre), 8.5 cm (mirror centre to second lens), and 28.5 cm (second lens to camera objective).
Fish mounting device. Larvae were mounted inside a custom-built glass bulb (Fig 2c). Its nearly spherical shape minimises reflection and refraction at the glass surface. It was filled with E3 solution, so no liquid-to-air boundary distorted visual stimuli. Through an opening on one side, we inserted a glass rod, on the tip of which we immobilised the larva in agarose gel (see description of the embedding procedure below). The fish was mounted such that the head protruded beyond the tip of the narrow triangular glass stage, which ensured that visual stimuli were virtually unobstructed by the glass triangle on their way to the eyes (Fig 2c). The entire glass structure was held at the centre of the spherical arena by metal parts attached to the arena scaffold itself (Fig 2i). Care was taken to remove air bubbles and completely fill the glass bulb with E3 medium.
Computer-assisted design and 3D printing. To arrange the square LED tiles across a nearly spherical surface, we 3D-printed a structural scaffold or "skeleton", consisting of a reinforced prime-meridian major circle ("keel") and several lighter minor circles of latitude (Fig 2e). The available hardware controllers allow for up to 240 LED matrices in parallel, so we chose the exact size of the scaffold (106.5 mm in radius) to hold as many of these as possible while minimising gaps in between. As individual LEDs are arranged in a rectangular pattern on each of the flat LED tiles, and stimuli are defined by true meridians (arcs from pole to pole, or straight vertical lines in Mercator projection), pixelation of the stimulus is inevitable, and stimulus edges become increasingly stair-shaped near the poles. Because of the poor visual acuity of zebrafish larvae (see S1 Text), this should not affect OKR behaviour. Our design further includes two holes necessary for behavioural recordings and two-photon imaging, located at the North and South poles of the sphere. We placed the largest elements of the structural scaffold behind the zebrafish (Fig 2). Given the ~160° azimuth coverage per eye in combination with a slight eye convergence at rest, this minimises the loss of useful stimulation area.
We printed all structures out of polylactide (PLA) filament using an Ultimaker 2 printer (Ultimaker B.V.).Parts were assembled using a hot glue gun.

Visual field coverage
We can estimate the fraction of the visual field effectively covered by LEDs based on a projection of the LED tiles onto a unit sphere. The area $A$ of the surface segment delimited by projecting the edges of a single tile towards the sphere centre is given by

$A = \int_{-\lambda}^{+\lambda} \int_{-\lambda}^{+\lambda} \frac{1}{\sqrt{1 - u_\alpha^2 - u_\beta^2}} \, du_\alpha \, du_\beta$

where $u_\alpha$ and $u_\beta$ are coordinates along the Cartesian unit vectors spanning the tile itself, and $(\pm\lambda, \pm\lambda)$ are the Cartesian positions of the four corners of another rectangle. This smaller rectangle is the straight projection of the sphere segment onto the tile, with

$\lambda = \sin(\tan^{-1}(D / 2R_S))$

where $R_S = 106.5\,\mathrm{mm}$ is the sphere radius and $D = 21\,\mathrm{mm}$ is the edge length of a tile. Summing over the number of tiles included in the arena, the equations above estimate the total coverage of the sphere by its square LED tiles at around 66.5% of the surface area. Under this strict estimate, the small gaps in between LED arrays are counted as not covered, even though we successfully demonstrated that they are small enough not to affect OKR performance, likely due to the low visual acuity of zebrafish larvae. A more meaningful estimate of coverage takes these results into account (S2 Text), and reveals that stimuli presented with our LEDs effectively cover 85.6% of all possible directions. In core parts of the visual field, coverage exceeds 90%.
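The estimate above can be reproduced numerically. The sketch below assumes every tile is tangent to the sphere and treats the edge length D as a single effective value; depending on how inter-tile gaps and the physical stand-off of the tiles are handled, the resulting fraction will deviate by a few percentage points from the strict 66.5% figure, so treat it as an order-of-magnitude check rather than a reproduction of the published number.

```python
import math

def tile_solid_angle(edge_mm, radius_mm, n=200):
    """Solid angle (steradians) subtended by one flat square tile of edge
    `edge_mm`, assumed tangent to a sphere of radius `radius_mm`. Numerically
    integrates the surface-area element dA = du dv / sqrt(1 - u^2 - v^2)
    over the square (-lambda, lambda)^2 by the midpoint rule."""
    lam = math.sin(math.atan(edge_mm / (2.0 * radius_mm)))
    h = 2.0 * lam / n
    area = 0.0
    for i in range(n):
        u = -lam + (i + 0.5) * h
        for j in range(n):
            v = -lam + (j + 0.5) * h
            area += h * h / math.sqrt(1.0 - u * u - v * v)
    return area

def arena_coverage(n_tiles=232, edge_mm=21.0, radius_mm=106.5):
    """Fraction of the full sphere (4*pi steradians) covered by the tiles."""
    return n_tiles * tile_solid_angle(edge_mm, radius_mm) / (4.0 * math.pi)
```

With the parameters quoted in the text, `arena_coverage()` lands in the rough vicinity of two thirds of the sphere, consistent with the strict estimate.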

Stimulus design
We designed visual stimuli, transformed them to geographic coordinates, and mapped them onto the physical positions of each individual LED with custom MATLAB software. We have made this code freely available under a Creative Commons BY-NC-SA 4.0 license (S1 Code). The mapped stimulus was then uploaded to the hardware controllers using custom-built C code originally developed by Väinö Haikala.
We chose to present stimuli centred on 38 different locations distributed nearly equidistantly across the spherical arena, as well as symmetrically between the left and right, upper and lower, and front and rear hemispheres (Fig 4a). These positions were determined numerically. First, we populated one eighth of the sphere surface: we placed one stimulus centre at a fixed location at the intersection of the equator and the most lateral meridian (90 degrees azimuth, 0 degrees elevation), constrained two more stimulus centres to move along this lateral meridian (90 degrees azimuth, initially random positive elevation), constrained another stimulus centre to move along the equator (initially random positive azimuth, 0 degrees elevation), and allowed three more stimulus centres to move freely across the surface of this eighth of the sphere (initially random positive azimuth and elevation), for a total of 7 positions. Second, we placed additional stimulus centres onto all 31 positions that were mirror-symmetric to the initial 7, with mirror planes placed between the six hemispheres listed above. We then simulated interactions between all 38 stimulus centres akin to electromagnetic repulsion, until a stable pattern emerged. The resulting coordinate values were rounded for convenience (S2 Code).
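The repulsion step can be sketched as follows. This is a simplified, unconstrained version: the published procedure (S2 Code) additionally pins some centres to the equator or the lateral meridian and enforces mirror symmetry across the six hemispheres, which this sketch omits, and all function names are our own.

```python
import math
import random

def normalize(v):
    """Project a 3-vector back onto the unit sphere."""
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]

def repel_on_sphere(n_points, steps=2000, step_size=0.01, seed=1):
    """Spread points near-equidistantly on the unit sphere by simulated
    pairwise repulsion, akin to electromagnetic (Coulomb-like) forces."""
    rng = random.Random(seed)
    pts = [normalize([rng.gauss(0, 1) for _ in range(3)])
           for _ in range(n_points)]
    for _ in range(steps):
        forces = [[0.0, 0.0, 0.0] for _ in range(n_points)]
        for i in range(n_points):
            for j in range(i + 1, n_points):
                d = [pts[i][k] - pts[j][k] for k in range(3)]
                r2 = sum(c * c for c in d) + 1e-9
                f = [c / r2 ** 1.5 for c in d]  # 1/r^2 force along d
                for k in range(3):
                    forces[i][k] += f[k]
                    forces[j][k] -= f[k]
        # move each point along its net force, then re-project onto the sphere
        pts = [normalize([pts[i][k] + step_size * forces[i][k]
                          for k in range(3)]) for i in range(n_points)]
    return pts
```

For 6 unconstrained points this relaxation approaches the vertices of an octahedron, i.e. a near-equidistant arrangement.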

Embedding procedure
To immobilise fish on the glass tip inside the sphere, we developed a novel embedding method. A cast of the glass triangle (and of the glass rod on which it is mounted) was made by placing it inside a Petri dish, which was then filled with a heated 2% agarose solution. After the agarose had cooled down and polymerised, agarose within a few millimetres of the tip of the glass triangle was manually removed, before removing the triangle itself. The resulting cast was stored in a refrigerator and then used to hold the glass triangle during all subsequent embedding procedures, limiting the freedom of movement of the larva to be embedded. The triangle was stored separately at room temperature. Before each embedding, we coated the glass triangle with polylysine and dried it overnight in an incubator at 29 degrees Celsius to increase the subsequent adhesion of agarose. We then returned the glass triangle into its cast, and constructed a tight, 2 mm high circular barrier around its tip using pieces of congealed agarose. A larva was picked up with as little water as possible using a glass pipette and very briefly placed inside 1 ml of 1.6% low-melting agarose solution at 37 degrees Celsius. Using the same pipette, the larva was then transferred onto the glass triangle along with the entire agarose. After the larva had been placed a few millimetres away from the tip of the glass triangle, the orientation of the animal could be manipulated with custom-made platinum wire tools without touching its body, as previously described (Arrenberg, 2016). Before the agarose congealed, swimming motions of the animal were exploited to guide it towards the tip and ensure an upright posture. The final position of the fish was chosen such that its eyes were aligned with the axis of the glass rod, its body was upright without any rotation, and its head protruded forward from the tip of the glass triangle, maximising the fraction of its field of view unobstructed by glass elements. The agarose was left to congeal, and the Petri dish was filled with E3 solution. The freshly congealed agarose surrounding the glass triangle was then removed using additional, flattened platinum wire tools, once again separating the glass triangle from the cast. Using the same tools, we finally cut triangular holes into the remaining agarose to completely free both eyes. To ensure free movement of both eyes, we confirmed the presence of large and even optokinetic eye movements using a striped paper drum before the experiment.
We then picked up the glass triangle by the glass rod attached to it, cut off any remaining agarose detritus, and placed it inside the E3-filled glass bulb. No air remained in the bulb, and no pieces of detritus were introduced into it, as air bubbles and detritus would accumulate near the top and bottom of the bulb, respectively, interfering with the optical pathway and reducing image quality.

Data analysis
Video images of behaving zebrafish larvae were processed in real time using a precursor of the ZebEyeTrack software (F. A. Dehmelt et al., 2018), available from www.zebeyetrack.com. The resulting traces of angular eye position were combined with analogue output signals from the hardware controllers of the spherical arena to match eye movements to the various stimulus phases. This was achieved using custom-built MATLAB software, which is freely available under a Creative Commons BY-NC-SA 4.0 license (S3 Code).
The data were then analysed further by detecting and removing saccades and fitting a piecewise sinusoidal function to the eye position traces. The parameters of the fit were then compared to those of the equally sinusoidally changing angular position of the stimulus. For each fish, eye, and stimulus phase, the ratio between the amplitude of the fit to eye position and the amplitude of stimulus position yields one value of the gain of the optokinetic response.
For each interval between two subsequent saccades, or inter-saccade interval (ISI), the fit function to the eye position data is defined by

$f(t \in \mathrm{ISI}_k) = -c_1 \cos(c_2 t + c_3) + c_{k+3}$

Here, $t$ are the time stamps of data points falling within the $k$-th ISI; $c_1$, $c_2$ and $c_3$ are the amplitude, frequency and phase shift of oscillation across all ISIs; and $c_{k+3}$ is a different constant offset within each ISI, which corrects for the eye position offsets brought about by each saccade. The best-fit value $c_1$ was taken as an approximation of the amplitude $a_E$ of eye movement, $a_E \approx c_1$. The process of cropping saccades from the raw data and fitting a sinusoid to the remaining raw data is demonstrated in Fig 3. The OKR gain $g$ is a common measure of visuomotor function. It is defined as the ratio between the amplitude $a_E$ of eye movement and the amplitude $a_S$ of the visual stimulus evoking eye movement,

$g = a_E / a_S$

In other words, OKR gain indicates the degree to which zebrafish larvae track a given visual stimulus. For each eye, a single gain value per stimulus phase is computed. While a value of 1 would indicate a "perfect" match between eye movement and stimulus motion, zebrafish larvae at 5 dpf often exhibit much lower OKR gains (Rinner et al., 2005). The highest gains are obtained for very slowly moving stimuli, but in our experiments we chose higher stimulus velocities. Although these velocities are only tracked with small gains, the absolute velocities of the eyes are high, which allowed us to collect data with high signal-to-noise levels and reduce the required recording time.
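A least-squares version of this piecewise fit can be sketched as below. For simplicity, the frequency $c_2$ is fixed to the known stimulus frequency rather than fitted, and the remaining coefficients (a shared sinusoid plus one offset per ISI) are obtained from the normal equations; the function names are our own, not from the published analysis code.

```python
import math

def gauss_solve(a, b):
    """Solve the linear system a x = b by Gauss-Jordan elimination
    with partial pivoting (small systems only)."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(n):
            if r != col and m[col][col] != 0.0:
                f = m[r][col] / m[col][col]
                m[r] = [mr - f * mc for mr, mc in zip(m[r], m[col])]
    return [m[i][n] / m[i][i] for i in range(n)]

def okr_gain(times, eye_pos, segments, stim_freq_hz, stim_amplitude):
    """Estimate OKR gain from saccade-cropped eye traces by fitting
    f(t in ISI_k) = A*cos(w t) + B*sin(w t) + c_k, i.e. a shared sinusoid
    (equivalent to -c1*cos(c2 t + c3)) plus one offset per inter-saccade
    interval. `segments` assigns each sample to its ISI index; the stimulus
    frequency w is assumed known here rather than fitted."""
    w = 2.0 * math.pi * stim_freq_hz
    n_seg = max(segments) + 1
    cols = 2 + n_seg  # cos, sin, and one offset column per ISI
    ata = [[0.0] * cols for _ in range(cols)]
    aty = [0.0] * cols
    for t, y, s in zip(times, eye_pos, segments):
        row = [math.cos(w * t), math.sin(w * t)] + [0.0] * n_seg
        row[2 + s] = 1.0
        for i in range(cols):
            aty[i] += row[i] * y
            for j in range(cols):
                ata[i][j] += row[i] * row[j]
    x = gauss_solve(ata, aty)
    eye_amplitude = math.hypot(x[0], x[1])  # a_E, approx. c1
    return eye_amplitude / stim_amplitude   # g = a_E / a_S
```

On a synthetic trace with a 3-degree eye oscillation tracking a 10-degree stimulus, this recovers a gain of 0.3 regardless of the per-ISI offsets.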
To rule out asymmetries induced by the arena itself or by its surroundings, we recorded two sets of stimulus-position-dependence data, one with the arena in its original configuration, and another with the arena rotated by 180 degrees (S1h-i Fig). Each set contained data from multiple larvae, with at least 2 separate presentations of each stimulus position. For each stimulus position, and separately for both sets of data, we computed the median OKR gain across fish and stimulus repetitions. We then averaged between the two datasets, yielding a single OKR gain value per stimulus position. As asymmetries are less crucial when studying stimulus frequency and size (Fig 5), we did not repeat those experiments with a rotated arena, and thus omitted this final averaging step of the analysis.
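The averaging scheme can be summarised in a few lines; the dictionary layout and position labels below are illustrative, not taken from the published analysis code (S3 Code).

```python
import statistics

def position_tuning(gains_original, gains_rotated):
    """Combine the two arena orientations: per stimulus position, take the
    median OKR gain across fish and repetitions within each dataset, then
    average the two medians to cancel arena-induced asymmetries.
    Both arguments map position labels (e.g. 'D1') to lists of
    single-trial gains; the labels are illustrative."""
    combined = {}
    for pos in gains_original:
        m_orig = statistics.median(gains_original[pos])
        m_rot = statistics.median(gains_rotated[pos])
        combined[pos] = 0.5 * (m_orig + m_rot)
    return combined
```

Using the median within each dataset makes the per-position estimate robust to occasional outlier trials before the two configurations are averaged.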
Von Mises-Fisher fits to data
Based on the assumption that OKR position tuning could be normally distributed with respect to each angle, OKR gain would be approximated by a two-dimensional, circular von Mises-Fisher function centred on the preferred stimulus location. Because the eyes are yoked, the OKR gain of one eye will be high around its own preferred position, as well as around the preferred position of the contralateral eye. To account for this, we fit the sum of two independent von Mises-Fisher functions to our OKR gain data:

$g(\xi) = C_1 e^{\kappa_1 \mu_1 \cdot \xi} + C_2 e^{\kappa_2 \mu_2 \cdot \xi} + C_3$

Here, $\xi$ is the Cartesian coordinate vector of a point on the sphere surface, and corresponds to the geographic coordinates azimuth $\alpha$ and elevation $\beta$; the remaining parameters ($\mu_i$, $\kappa_i$, $C_i$) are described below. The parameter $\varphi$ is 1 for the default arena setup, and -1 during control experiments with a horizontally flipped arena setup. To determine $b_1$, $b_2$ and $b_3$, we fit this system of equations by multivariate linear regression to the experimentally observed bias indices. The system is initially underdetermined, as it contains $n + 2$ coefficients for $n$ fish observed. However, if we assume that individual biases average out across the population, we can determine the population-wide coefficients $b_1$ and $b_2$ by setting aside the individual $b_{3,k}$ for a first regression. The individual coefficients $b_{3,k}$ can then be obtained from the residuals of that first regression.
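Once fitted, the two-lobe von Mises-Fisher model can be evaluated at any direction on the sphere. The sketch below takes the sum-of-two-lobes form at face value and uses our own function name; in practice the parameters would come from a numerical fit to the measured gains.

```python
import math

def vmf_sum(xi, mu1, mu2, kappa1, kappa2, c1, c2, c3):
    """Evaluate g(xi) = C1*exp(k1*mu1.xi) + C2*exp(k2*mu2.xi) + C3 at the
    unit direction vector xi: one von Mises-Fisher lobe per eye's preferred
    stimulus location, plus a constant offset."""
    dot1 = sum(a * b for a, b in zip(mu1, xi))
    dot2 = sum(a * b for a, b in zip(mu2, xi))
    return c1 * math.exp(kappa1 * dot1) + c2 * math.exp(kappa2 * dot2) + c3
```

At the centre of one lobe, the model value is dominated by that lobe's amplitude times exp(kappa), with a small contribution from the contralateral lobe.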

Figures and tables
Data from n=7 fish for frequency dependence and another n=7 fish for size dependence.

Tables

Table 1. Stimulus parameters (whole-field and hemispheres). These stimuli consisted of a horizontally moving grating, either covering the entire visual field or cropped to one of the 6 principal hemispheres (front, rear, upper, lower, left, right). The stimulus mask is determined by the azimuth $\alpha$ (degrees) and elevation $\beta$ (degrees) of its centre, as well as its size, given by the angle $\delta$ (degrees) it spans. The moving grating is characterised by its spatial frequency SF (cycles/degree), temporal frequency TF (cycles/sec), peak velocity v (deg/sec), and oscillation period T (sec).

LED arena allows presentation of stimuli across the visual field
By combining 3D printing with electronic solutions developed in Drosophila vision research, we constructed a spherical stimulus arena containing 14,848 individual LEDs covering over 90% of the visual field of zebrafish larvae (Fig 1, Fig 2). Using infrared illumination via an optical pathway coupled into the sphere (Fig 2c, Fig 2d), we tracked eye movements of larval zebrafish during presentation of visual stimuli (Florian A. Dehmelt, Adam von Daranyi, Claire Leyden, & Aristides B. Arrenberg, 2018).
(Fig 4), upside-down (S1d Fig, S1h Fig) and data obtained with the arena rotated relative to the fish (S1e Fig, S4i Fig). Regression coefficients for external causes of asymmetry were similar to or smaller than those for biological causes (S6a Fig), and individual biases from fish to fish were broadly and symmetrically distributed from left to right (mean coefficient $3.7 \times 10^{-4} \pm 120.0 \times 10^{-4}$ st. dev., n = 15), so that no evidence was found for a strong and consistent lateralisation of OKR behaviour across animals (S6b Fig).
on photoreceptor densities in explanted eye cups of 7-8 dpf zebrafish larvae. As shown in Fig 6b, ultraviolet receptor density exhibits a clear peak in the upper frontal part of the visual field, whereas red, green and blue receptors (Fig 6a) are most concentrated across a wider region near the intersection of the equator and the lateral meridian, with a bias towards the upper visual field (in body coordinates).
Further arena elements include the 8x8 individual LEDs contained in each tile (Fig 2a); a nearly spherical glass bulb filled with water, into which the immobilised larva is inserted (Fig 2g, middle); a metal rotation mount attached to the scaffold "keel" of the arena (Fig 2g, right), holding the glass bulb in place and allowing corrections of pitch and roll angles; the optical pathway with an infrared light source to illuminate the fish from below (Fig 2e); and a USB camera for video recording of the transmission image (Fig 2d).
Cartesian coordinate vectors pointing to the centres of the two distributions, and $\kappa_1$ and $\kappa_2$ express their respective concentrations, or narrowness. The parameters $\mu_i$, $\kappa_i$, the amplitudes $C_1$, $C_2$ and the offset $C_3$ are fit numerically.

Yoking index, asymmetry and mathematical modelling
To quantify asymmetries in gain between the left and right, stimulated and unstimulated eyes, we introduce the yoking index

$Y = (g_L - g_R) / (g_L + g_R)$

where $g_L$ and $g_R$ are the OKR gains of the left eye and right eye, measured during the same stimulus phase. Depending on the stimulus phase, only the left eye, only the right eye or both eyes may have been stimulated. If the yoking index is positive, the left eye responded more strongly than the right eye; if it is negative, the amplitude of right eye movement was larger. An index of zero indicates "perfect yoking", i.e. identical amplitudes for both eyes. In addition, we define a "bias" index to capture innate or induced asymmetries between responses to stimuli presented within the left or right hemisphere of the visual field,

$B = (m_L - m_R) / (m_L + m_R)$

where $m_L$ and $m_R$ are the medians of OKR gains after pooling across either all left-side or all right-side stimulus types (D1-D19 and D20-D38, respectively). Several sources of asymmetry contribute to $B$: (1) arena- or environment-related differences in stimulus perception, constant across individuals; (2) a biologically encoded preference for one of the two eyes, constant across individuals; (3) inter-individual differences between the eyes, constant across stimulus phases for each individual; (4) other sources of variability unaccounted for, and approximated as a noise term $\eta$. We hypothesise that the overall asymmetry observed for each larva $k$ is given by a simple linear combination of these contributions,

$B_k = \varphi b_1 + b_2 + b_{3,k} + \eta_k$
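A compact sketch of these definitions and of the two-step regression follows. The indices are written as normalised differences, which is our reading of the sign conventions described in the text (positive when the left side dominates, zero when both sides match), and the function names are our own.

```python
import statistics

def yoking_index(g_left, g_right):
    """Y = (gL - gR) / (gL + gR): positive when the left eye responds
    more strongly, zero for perfect yoking."""
    return (g_left - g_right) / (g_left + g_right)

def bias_index(left_gains, right_gains):
    """B = (mL - mR) / (mL + mR), from median gains pooled over all
    left-side vs. all right-side stimulus types."""
    m_l = statistics.median(left_gains)
    m_r = statistics.median(right_gains)
    return (m_l - m_r) / (m_l + m_r)

def fit_bias_model(biases, phis):
    """Two-step fit of B_k = phi*b1 + b2 + b3_k + noise, assuming the
    individual biases b3_k average out across the population: first a
    least-squares fit of [b1, b2] with regressors [phi, 1], then each
    fish's residual is assigned to its individual term b3_k."""
    n = len(biases)
    spp = sum(p * p for p in phis)
    sp = sum(phis)
    spb = sum(p * b for p, b in zip(phis, biases))
    sb = sum(biases)
    det = spp * n - sp * sp
    b1 = (spb * n - sp * sb) / det   # arena-related term, flips with phi
    b2 = (spp * sb - sp * spb) / det  # population-wide eye preference
    b3 = [b - (phi * b1 + b2) for b, phi in zip(biases, phis)]
    return b1, b2, b3
```

With equal numbers of fish measured at phi = +1 (default arena) and phi = -1 (flipped arena), the arena-related and biological population terms separate cleanly.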

Figure 1 .
Figure 1. Presenting visual stimuli across the visual field. (a) When presented with a horizontally moving stimulus pattern, zebrafish larvae exhibit optokinetic response (OKR) behaviour, in which eye movements track stimulus motion to minimise retinal slip. Its slow phase is interrupted by intermittent saccades, and even if only one eye is stimulated (solid arrow), the contralateral eye is indirectly yoked to move along (dashed arrow). (b) Often, experiments on visuomotor behaviour such as OKR sample only a small part of the visual field, whether horizontally or vertically. As different spatial directions may carry different behavioural importance, an ideal stimulation setup should cover all or most of the animal's visual field. For zebrafish larvae, this visual field can be represented by an almost complete unit sphere. (c) We arranged 232 LED tiles with 64 LEDs each across a spherical arena, such that 14,848 LEDs (green dots) covered nearly the entire visual field. (d) The same individual positions, shown in geographic coordinates. Each circle represents a single LED. Each cohesive group of eight-by-eight circles corresponds to the 64 LEDs contained in a single tile. (e) To identify LED and stimulus locations, we use Up-East-North geographic coordinates: azimuth $\alpha$ describes the horizontal angle, which is zero in front of the animal and, when seen from above, increases for rightward positions. Elevation $\beta$ refers to the vertical angle, which is zero throughout the plane containing the animal, and positive above. (f) The spherical arena is covered in flat square tiles carrying 64 green LEDs each. (g) Its structural backbone is made of a 3D-printed keel and ribs. Left and right hemispheres were constructed as separate units. (h) Across 85-90% of the visual field, we can then present horizontally moving bar patterns of different location, frequency and size to evoke OKR.

Figure 2 .
Figure 2. A spherical LED arena to present visual stimuli across the visual field. (a) LED tiles are arranged in ribbons parallel to the equator, and glued in between structural ribs. Gaps at the top and bottom poles of the sphere allow coupling in an optical pathway for infrared illumination and subsequent video recording of eye movements. (b) Optical pathway for eye movement tracking. (c) To minimise obstruction and refraction, the zebrafish larva is immobilised on the tip of a glass triangle (left) using agarose, which is then inserted into the centre of a spherical glass bulb (middle). This bulb is then mounted into a metal holder (right) and thus placed at the centre of the sphere. (d) Image of the two hemispheres and the camera setup. One hemisphere is mounted on a rail to allow opening and closing the arena.

Figure 3 .
Figure 3. OKR gain is inferred from a piecewise fit to the slow phase of tracked eye movements. (a) We present a single pattern of horizontally moving bars to evoke OKR (left). Its velocity follows a sinusoidal time course, repeating every 10 seconds for a total of 100 seconds per stimulus phase (right). (b) OKR gain is the amplitude of eye movement (grey trace) relative to the amplitude of the sinusoidal stimulus (green trace). The OKR gain is often well below 1, e.g. for high stimulus velocities as used here.

Figure 4 .
Figure 4. OKR gain depends on stimulus location. (a) The stimulus is cropped to a disk-shaped area 40 degrees in diameter, centred on one of 38 nearly equidistant locations across the entire visual field (left), to yield 38 individual stimuli (right). (b-d) Dots reveal the location of stimulus centres D1-D38. Their colour indicates the average OKR gain across individuals and trials, corrected for external asymmetries. Surface colour of the sphere displays the best von Mises-Fisher fit to the discretely sampled OKR data. Top row: OKR gain of the left eye (b), right eye (d), and the merged data including only direct stimulation of either eye (c), shown from an oblique, rostrodorsal angle. Bottom row: same, but shown directly from the front. OKR gain is significantly higher for lateral stimulus locations and lower across the rest of the visual field. The spatial distribution of OKR gains is well explained by the bimodal sum of two von Mises-Fisher distributions. (e) Mercator projections of OKR gain data shown in panels (b-d). White and grey outlines indicate the area covered by each stimulus type. Numbers indicate average gain values for stimuli centred on this location. Red dots show mean eye position during stimulation. Dashed outline and white shading on panels (b, d, e) indicate indirect stimulation via yoking, i.e., stimuli not directly visible to either the left or right eye. Data from n=7 fish for the original configuration and n=5 fish for the rotated arena.

Figure 5.
OKR gain depends on stimulus size and frequency. (a) Patterns with 7 different frequencies were cropped to disks of a single size. These disks were placed in 6 different locations for a total of 42 stimuli. cpd: cycles per degree. (b) Patterns with identical spatial frequencies were cropped to disks of 7 different sizes. These disks were also placed in 6 different locations for another set of 42 stimuli. Degrees indicate planar angles subtended by the stimulus outline, so 360° corresponds to whole-field stimulation. (a, b) Displaying the entire actual pattern at the size of this figure would make the individual bars hard to distinguish. We therefore show only a zoomed-in version of the patterns, covering 45 out of 360 degrees azimuth. (c) Coloured dots indicate the 6 locations on which stimuli from (a) and (b) were centred, shown from above (top), from the front (middle), and from an oblique angle (bottom). (d) OKR gain is unimodally tuned across a wide range of spatial frequencies (measured in cycles per degree). (e) OKR gain increases sigmoidally as the area covered by the visual stimulus increases logarithmically (a stimulus size of 1 corresponds to 100% of the spherical surface). (d-e) Colours correspond to the location of stimulus centres shown in (c). There is no consistent dependence of either frequency tuning or size tuning on stimulus location.

Figure 6 .
Figure 6. Maximum OKR gain is consistent with high photoreceptor densities in the retina. Contour lines show retinal photoreceptor density determined by optical measurements of explanted eye cups of 7-8 dpf zebrafish larvae, at increments of 10% of maximum density. Data are shown in visual space coordinates relative to the body axis, i.e., 90° azimuth and 0° elevation corresponds to a perfectly lateral direction. To highlight densely covered regions, densities from half-maximum to maximum are additionally shown in shades of colour. Solid circles indicate the location of maximum OKR gain inferred from experiments of type D in 5-7 dpf larvae (Fig 4). White outlines indicate the area that would be covered by a 40° disk-shaped stimulus centred on this location when the eye is in its resting position. As the eyes move within their beating field during OKR, the actual, non-stationary retinal coverage extends further rostrally and caudally. (a) For red, green, and blue photoreceptors, high densities coincide with high OKR gains. (b) For ultraviolet receptors, there is no clear relationship to the OKR gain. (c) For reference, the summed total density of all receptor types combined. We did not observe a significant shift in the position dependence of maximum OKR gain between groups of larvae at 5, 6 or 7 dpf, consistent with the notion that retinal development is far advanced and the circuits governing OKR behaviour are stable at this developmental stage.
-A21, but with positive azimuth $\alpha$ (right hemisphere).

S3 Table. Stimulus parameters (control experiments). These stimuli consisted of a horizontally moving grating displayed on four flat, rectangular stimulus screens surrounding the larva. One pair of screens displayed stimuli visible to the left eye only, and the other pair displayed stimuli to the right eye only. Results shown in S5 Fig.

Table 2 .
Stimulus parameters (position dependence). These stimuli consisted of a horizontally moving grating, cropped with a disk-shaped stimulus mask, and presented in one of 38 different locations across the visual field. Parameters as in Table 1. Results shown in Fig 4.

Table 3 .
Stimulus parameters (frequency dependence). These stimuli consisted of a horizontally moving grating, cropped with a disk-shaped stimulus mask. At each location, 7 different spatial frequencies and thus velocities were used, while temporal frequency was held constant. Parameters and units as in Table 1. Results shown in Fig 5.
same as V1-V7, but with azimuth $\alpha = -110$ and elevation $\beta = -15$
F15-F21: same as V1-V7, but with azimuth $\alpha = -28$ and elevation $\beta = 15$
F22-F42: same as A1-A21, but with positive azimuth $\alpha$ (right hemisphere)

Table 4 .
Stimulus parameters (size dependence). These stimuli consisted of a horizontally moving grating, cropped with a disk-shaped stimulus mask. At each location, disks with 7 different, logarithmically spaced areas were shown. Parameters and units as in Table 1. Results shown in Fig 5.