Multi-regional circuits underlying visually guided decision-making in Drosophila

Visually guided decision-making requires integration of information from distributed brain areas, necessitating a brain-wide approach to examine its neural mechanisms. New tools in Drosophila melanogaster enable circuits spanning the brain to be charted with single cell-type resolution. Here, we highlight recent advances uncovering the computations and circuits that transform and integrate visual information across the brain to make behavioral choices. Visual information flows from the optic lobes to three primary central brain regions: a sensorimotor mapping area and two 'higher' centers for memory or spatial orientation. Rapid decision-making during predator evasion emerges from the spike timing dynamics in parallel sensorimotor cascades. Goal-directed decisions may occur through memory, navigation and valence processing in the central complex and mushroom bodies.


Introduction
To survive and reproduce, an animal must respond to its surroundings by acquiring and processing sensory information, then selecting relevant behavioral outputs. Which behavior it chooses depends on immediate sensory cues, its current state, and learned associations that convey the value and valence of its options. In highly visual animals such as ourselves or the fruit fly, Drosophila melanogaster, visual information acts through multiple pathways distributed across the brain to influence all these decision factors. How this information is brought together to generate a single unified action choice remains an open question. In some cases, decision-making is not obviously separable from sensory processing or motor planning [1,2], and higher-order processing from different regions may also subsequently converge with sensorimotor pathways to reach a consensus [3]. Thus, a multi-region approach is necessary to fully describe decision mechanisms. In this review, we examine progress in anatomical and functional mapping of fly visual processing over the last five years, and its implications for how behavioral decisions emerge from distributed circuits.
Newly available tools recommend D. melanogaster as a model system for brain-wide interrogation of circuits underlying decision processes. The fly uses a tractably small brain (~100,000 neurons) to make sophisticated visually guided behavioral choices, including choosing which remembered landmarks to navigate towards [4,5], deciding how to surmount gaps and other terrain obstacles [6-8], choosing whether to fight or court other flies [9-11], evaluating egg-laying substrate [12], and selecting a strategy to escape predators [13]. New transgenic libraries [14-16] target visual and central brain neurons with single cell-type resolution [17-23], allowing experimenters to activate, silence, or monitor neural activity in identified neurons of head-fixed flies behaving in virtual reality environments [24,25] and in high-throughput behavior screens [26-29]. Perhaps most importantly for discovering novel circuits throughout the brain, the largest connectivity mapping of a neural network to date has now been completed for much of the fly central brain, the 'hemibrain connectome' [30]. This complements other high-resolution mapping efforts across the whole brain [31] or in individual brain regions [18,20,22,23,31-38]. New imaging techniques also allow experimenters to monitor neuronal activity in nearly the whole brain [39], in the nerve cord [40], or in multiple specific populations simultaneously [41]. Using this wealth of tools, it is increasingly practical to understand how information is transformed across successive synaptic layers and to dissect where choice arises during information processing.
Here, we first review visual pathways through the fly brain as revealed by recent connectomics and single neuron analyses. We next highlight a particular class of neuron at the interface between early visual processing and central brain circuits, offering insight into the representational space in which behavioral choices operate. We then provide an example of how a critical survival decision emerges from successive transformation of information from photoreceptors all the way to muscles. Finally, we conclude that visually informed decisions, even for rapid responses, emerge from the details of processes distributed both vertically within a sensorimotor cascade and horizontally across parallel information processing pathways, supporting the use of model systems with brainwide access to study them.

Visual information flow in the brain
To understand how the brain utilizes visual information to form decisions, we must first understand the pathways carrying visual information in the brain. Visual processing begins in the four neuropil layers of the fly optic lobe (lamina, medulla, lobula, and lobula plate) [42,43]. Here, visual information from photoreceptors is divided into separate channels (e.g. ON, OFF, color), and local comparisons between neighboring pixels extract contrast and motion information [44]. Motor information also provides feedback to most of these layers [45-48]. Visual projection neurons then carry information out of the optic lobe to three distinct regions of the central brain: the lateral cerebral ganglia (PLP/PVLP/PS, see Figure 1), the central complex (CX), and the mushroom bodies (MB).

Lateral cerebral ganglia
The lateral areas of the central brain, including the posterior lateral and ventrolateral protocerebra (PLP/PVLP) and posterior slope (PS), receive direct input from the lobula and lobula plate. These areas are implicated as loci for sensorimotor decision-making because they are sites of multisensory convergence [49] and output to many descending neurons [22,50]. The best-known visual input to this region is a set of 60 tangential cells that encode ethologically relevant wide-field visual movement patterns, especially the optic flow associated with self-motion (well studied in other flies [51]). In addition, a distinct class of 20 columnar neuron types encoding small-field visual features also provides input to this region ([20], see next section). In support of the interpretation of the PLP/PVLP/PS as a sensorimotor mapping area, both cell classes have members that synapse directly onto descending neurons. Wide-field tangential cells that detect self-rotation converge onto descending neurons for steering during flight [52,53], and small-field columnar neuron types that detect looming synapse onto descending neurons that mediate escape takeoff [22,54]. Although these direct connections mediate rapid visual responses, the action selected is not always the same and may vary with context such as locomotor state [55] (see also the sensor-to-motor section below). Also, most visual projection neurons to this area do not contact descending neurons directly, suggesting further processing of their visual information before actions are selected.

Central complex
In contrast, visual information reaches the CX indirectly, via one to two intervening brain regions [20,56,57,58]. CX pathways lend additional flexibility to visual responses and help guide goal-directed behaviors. The CX plays a role in orientation [7], navigation [59] and visual memory [5,26] by maintaining an internal compass that integrates visual input and the fly's own angular motion [60,61]. Recent work suggests that persistent activity in a subset of ellipsoid body neurons in the CX allows maintenance of heading direction for up to 30 s without visual input [60], and that plasticity in CX ring neurons, through alteration of synaptic weights in recurrent circuits [62,63], enables the fly to maintain a stable ongoing sense of direction even as the visual world changes. Furthermore, accumulation of nitric oxide at ring neuron output sites [64] allows for a 4-second working memory for visual objects in the environment. The fly uses this compass and working memory in goal-directed navigation to maintain a heading relative to a landmark [65] or to preferentially choose to fly towards a previously cued stripe amongst alternatives [41,66]. Recent work elucidates the elegant mechanism by which a difference between an internally generated goal direction and the current heading could be used to drive quantitatively matched turning and walking actions [65,67], but it remains unclear how the fly chooses such a 'goal' and how this is represented in the brain.
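The compass logic described above can be caricatured in a few lines: a heading estimate that integrates the fly's own angular velocity (path integration) and, when a landmark is visible, is pulled toward its bearing. This is a deliberately minimal sketch of the computation, not the ring attractor circuit itself; the function names and the correction gain are illustrative assumptions.

```python
import math

def wrap(a):
    """Wrap an angle to (-pi, pi]."""
    return (a + math.pi) % (2 * math.pi) - math.pi

def update_heading(heading, angular_velocity, dt, visual_cue=None, gain=0.3):
    """One integration step of a toy compass estimate.

    The estimate rotates with the fly's own angular velocity; when a
    visual landmark bearing is available it is nudged toward that cue.
    Without visual input the estimate persists, but accumulates drift
    from any bias in the velocity signal.
    """
    heading = wrap(heading + angular_velocity * dt)
    if visual_cue is not None:
        heading = wrap(heading + gain * wrap(visual_cue - heading))
    return heading

# Fly turns at 90 deg/s for 1 s in darkness: the estimate tracks the turn.
h = 0.0
for _ in range(100):
    h = update_heading(h, math.radians(90), 0.01)
print(round(math.degrees(h)))  # 90
```

The same update rule illustrates why plasticity matters: the `gain` term is where visual corrections enter, and tuning it (or the mapping from cue to bearing) is what the recurrent-circuit plasticity described above would accomplish.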

Mushroom bodies
A third major pathway out of the optic lobes leads from the medulla, lobula, and accessory medulla directly [34,68,69] or through the PLP [69] to a higher-order area called the mushroom bodies (MBs). The MBs are well-studied for their role in learning and memory, particularly for olfactory input [70]. Anatomical studies suggest the MBs may also perform specific multisensory integrations, as input from visual, olfactory and gustatory processing converge in different combinations on the three MB lobes [34]. The MB may play a key role in decision processes through the assignment of valence (i.e. positive or negative quality) to sensory cues. The architecture by which it encodes valence is increasingly being elucidated [18,71,72] and has distinct similarities with information encoding in vertebrates and leeches [73,74]. However, the role of the MBs in visual processing specifically remains unclear. The direct pathways from the optic lobes to this area bypass visual feature-extracting areas and instead transmit less-processed color and brightness information directly to the MBs. The MBs can use this information to form visual memories that direct behavioral choices. For example, two medulla inputs to the MB, together with intrinsic MB γd Kenyon cells, are required for visual memory in chromatic and intensity discrimination tasks [68], and memory in a visual cuing task requires dopamine signaling in the posterior α/β Kenyon cells [75]. Furthermore, the MBs and CX are functionally linked: visual novelty choice requires the ring neurons of the CX, as well as MB neurons [76].

Multi-regional circuits underlying visually guided decision-making in Drosophila Cheong, Siwanowicz and Card 79

Figure 1. Multi-regional connectivity in the visual decision-making pathways of the brain. Early visual processing occurs in the optic lobe (OL), which consists of the lamina (not shown), medulla (ME), lobula (LO), and lobula plate (LOP) [42,43]. From the OL, visual information is carried in parallel to three primary regions of the central brain: (1) directly to the 'lateral cerebral ganglia' (posterior lateral and ventrolateral protocerebra, PLP/PVLP, and posterior slope, PS) from the LO and LOP [20,51]; (2) indirectly to the central complex (CX) by the anterior visual pathway (ME to anterior optic tubercle, AOTU, and bulb, BU) [56,106], via LO-to-AOTU projecting LC10 neurons [20,57], or via LO/LOP/AOTU projections to the lateral accessory lobe (LAL), which has outputs and inputs with the CX [58]; (3) to the mushroom bodies (MB) from the ME, LO, PLP and accessory medulla (AME) [34,68,69]. The PLP/PVLP, PS, superior medial protocerebrum (SMP), and LAL further output to motor control and coordination centers in different layers of the VNC: intermediate tectulum (IntTct) and lower tectulum (LTct) are intermediate layers implicated in wing and leg coordination, the segmental upper tectulum (UTct) neuropils control the wings, neck, and halteres, and leg neuropils (LegNp) control the legs [22]. Brain anatomical regions named as in Ref. [107]; VNC regions named as in Ref. [108]. (d) Confocal image of fly compound eye (autofluorescence and F-actin) and optic lobe (mAb nc82). The compound eye comprises an array of individual lenses that each collect light from a different point in space. Image courtesy of Michael Reiser. (e)-(k) Example neuron types from the visual pathways of the brain. (e) Lobula columnar type 10a (LC10a) neurons that project from LO to AOTU (OL0019B driver, image courtesy of Aljoscha Nern) [20]. (f) Lobula plate/lobula columnar type 2 (LPLC2) neurons that project from LO/LOP to PVLP (OL0047B driver, image courtesy of Aljoscha Nern) [88]. (g) LO projection neuron (LO PN) that projects from LO to MB (R72D07 driver, image courtesy of Jinzhi Li and Sophie Caron) [69]. (h) Inner and outer ring neurons of the CX that form part of a ring attractor network encoding heading direction (image courtesy of Tanya Wolff and Gerald Rubin). (i) γd Kenyon cells (γd KC) that participate in encoding memory and valence in the MB (MB028B driver, image courtesy of Yoshinori Aso) [18]. (j) Giant Fiber (GF, aka DNp01) descending neuron that projects from PVLP to the VNC (SS02299 driver). (k) DNa01 descending neuron that projects from LAL, PS and the subesophageal zone (SEZ) to the VNC (SS63715 driver); (j) and (k) images courtesy of Shigehiro Namiki [22]. (e), (f), (h), and (i) were obtained by MultiColor FlpOut, (g) by photoactivation; (j) and (k) are bilaterally symmetrical single neurons in sparse split-GAL4 driver lines. Images are counterstained for brain neuropils (grey, mAb nc82). To better show neuron morphology, the following adjustments were made on maximum intensity projections: (e)-(k) adjusted for brightness and contrast; (g), green channel shows only a selected region of interest in the image stack containing the LO PN, and the brightness of the grey channel is nonlinearly adjusted. In addition, panels (e)-(g), (k) are mirrored horizontally.

Descending pathways
Unlike neurons within the lateral cerebral ganglia regions, intrinsic CX and MB neurons do not connect directly to descending neurons [22]; instead, output neurons from these regions project to the PS, lateral accessory lobe (LAL) or superior medial protocerebrum (SMP), where many descending neuron dendrites reside. The three parallel visual streams thus converge in the PS, where the CX and MB may provide some top-down regulation of rapid visual responses; for example, pontine cells in the fan-shaped body of the CX modulate the intensity of the optomotor response [77]. Recent work also identifies a small number of 'atypical' MB output neurons, a subset of which synapse directly onto descending neurons just outside the main MB neuropil [78]. These neurons' dendritic fields project both within and outside of the MB, potentially allowing them to integrate MB valence information with information from other pathways. Descending pathways in Drosophila have recently been described at cellular resolution [22,79]. Descending neurons are a bottleneck population of fewer than 500 bilaterally paired neurons that carry information from the central brain to central pattern-generating circuits and motoneurons in the ventral nerve cord. These neurons are generally divided by whether they terminate in nerve cord layers for flight/wing control, leg control, or intermediate layers thought to house microcircuits for coordinating legs and wings. All three types of descending neurons have dendrites in the lateral cerebral ganglia (PLP/PVLP/PS), with a graded organization along the anterior-posterior axis, such that along this axis descending neurons change from those projecting to intermediate, then leg, and finally to wing control layers [22]. SMP descending neurons project to the intermediate nerve cord layers, and the few identified LAL descending neurons reach wing, leg and intermediate layers.
The majority of descending neurons controlling leg movements arise from the subesophageal zone, but little is known about pathways from the primary visual processing areas to this neuropil. Pathways through the CX thus have limited direct access to both wing and leg motor areas, presumably to guide steering in walking and flight, but may have a further influence on flight actions through integration with direct pathways in the PS.
Together, these visual, central, and descending pathways (Table 1) provide a first sketch of the central-nervous-system-wide circuits that mediate behavioral output from visual input (Figure 1b,c; example neuron types in Figure 1e-k). This sketch suggests that independent decision-making processes may occur in parallel in PLP/PVLP/PS and CX circuits, because each takes an independent set of visual features and processes it into motor neuron output in as few as 1-2 synapses (not accounting for recurrent circuit motifs). In contrast, the MBs are farther removed from VNC motoneurons (most outputs besides a few 'atypical' outputs have to pass through at least the SMP and interneurons in the VNC intermediate layer) and so likely influence decision processes by providing input to other decision-making areas. The nerve cord motor centers themselves may also be action selection areas, as they are capable of semi-autonomous motor output [80], and potentially mediate the behavioral state-dependence of optogenetically activated actions for some descending neurons [81].

80 Whole-brain interactions between neural circuits

Table 1. Function of Drosophila brain areas in visually guided decision-making.

This sketch highlights the shortest paths from input to output based on current knowledge. These paths may underlie rapid innate responses and ongoing navigational choices. Undoubtedly, more visual pathways will be revealed as more is learned about visual decision-making in the brain. Future work should especially focus on examining interconnectivity between and within brain areas.
In particular, the neuroanatomy and functional circuits of the PS/PLP/PVLP (which is underdescribed compared to other areas) should be examined for any recurrent circuit architectures that could enable longer time-scale processing in this direct pathway, and pathways connecting the MB and CX should be explored for possible representations of abstract quantities relevant to decision making, such as predictions or valuation of outcomes.
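As a toy illustration of what 'shortest paths from input to output' means in practice, one can count synaptic hops through a region-level graph. The graph below is a hypothetical distillation of the pathways sketched above; the edges and region names are simplified assumptions for illustration, not connectome data.

```python
from collections import deque

# Hypothetical region-level connectivity distilled from the pathways
# described in the text; illustrative only, not a complete connectome.
edges = {
    "optic_lobe": ["LC_neurons", "AOTU"],
    "LC_neurons": ["descending_neurons"],        # direct sensorimotor route
    "AOTU": ["BU"], "BU": ["EB"], "EB": ["FB"],  # anterior visual pathway to CX
    "FB": ["LAL"], "LAL": ["descending_neurons"],
    "descending_neurons": ["motor_neurons"],
    "motor_neurons": [],
}

def hops(graph, src, dst):
    """Breadth-first search for the minimum number of synaptic hops."""
    seen, frontier = {src}, deque([(src, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if node == dst:
            return depth
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, depth + 1))
    return None  # no path

print(hops(edges, "optic_lobe", "motor_neurons"))  # 3 (the direct LC route)
```

On this toy graph the direct lateral-ganglia route wins (3 hops) while the CX route takes 7, mirroring the text's point that rapid responses likely ride the shortest paths while CX/MB pathways add flexibility at the cost of extra synaptic layers.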

Circuits for visual feature extraction
To uncover the neural basis of decision processes, it is critical to determine how the brain represents the surrounding world. Sensory processing plays a vital role by emphasizing the information most relevant to the organism (i.e. its salience). These representations are the substrates upon which decision-making processes work, and in some cases those processes are not clearly separate from the sensory processing itself. What features of the world are represented in the brain, and how are they built? These features must be extracted and weighed amid the plethora of visual information bombarding the retina.
In flies, feature-detecting neurons responsive to object orientation have been found in the ellipsoid body of ring neurons in the CX [82]. These bear resemblance to vertebrate 'simple cells' [83], which are also sensitive to object orientation and have similarly organized receptive fields. Though the spatial resolution of these cells is coarse, they are hypothesized to be sufficient for flies to detect and fixate objects of interest during navigation. Thus, these abstract, oriented shapes may form the basis of navigational choices in the CX.
Recent work has also investigated the lobula columnar (LC) neuron types that provide input to the PLP and PVLP as putative higher-order feature detectors. Each type is a population of 50-100 neurons whose dendrites tile retinotopic space in the lobula, but whose axons bundle and ramify in small distinct compartments in the PLP and PVLP called optic glomeruli [20,33]. Based on their anatomy, they were hypothesized to be higher-order feature extractors because their glomerular termini lack obvious spatial structure [20,84]. This has now been confirmed for several subtypes: for example, LC11 for small object recognition [85], LC4, 6 and 16, and lobula plate/lobula columnar type 2 (LPLC2) for looming detection [20,86,87,88], LC10 for detecting small, fly-sized objects [11], LC15 for moving bars, and LC12 for discrete objects of any size [89]. In contrast to the flexible feature detectors in the CX [82], the LC neurons studied to date encode motion-based features tuned to shapes and movements that resemble salient natural objects in the fly's environment, such as a looming predator or passing fly.
Of the myriad LC types, LPLC2's neural mechanism of feature extraction is particularly well understood and exemplifies how a feature critical to directing behavior may be derived from sensory input. This neuron type is tuned to detect looming, the outward expansion on the retina that occurs when an object approaches on a collision course (see also next section and Figure 2). LPLC2 receive information from the motion-processing pathway, yet their microcircuitry selects for looming stimuli while being unresponsive to other motion [88]. Specifically, LPLC2 cells arborize in the lobula plate retinotopic map and receive input from motion-direction-selective T4 and T5 neurons [88]. Each LPLC2 cell has dendritic projections in all four layers of the lobula plate, where each layer corresponds to motion in a single cardinal direction [88]. Individual dendritic branches extend in the preferred direction of said layer, giving the neuron an overall cross-shaped dendritic tree suited for detecting outward motion [88]. To prevent responses to inward motion, the LPLC2 cells also receive inhibitory inputs from lobula plate intrinsic (LPi) neurons [88]. These cells arborize in one layer of the lobula plate and provide inhibitory input to the layer corresponding to the opposite direction of motion, thus preventing inward and other non-expanding motion from triggering a response [90].
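The opponent-motion logic of this wiring can be captured in a minimal sketch: each dendritic quadrant sums motion in its preferred (outward) direction, while LPi-like opponency subtracts motion in the opposite (inward) direction. The function below is a caricature of that logic under stated assumptions (scalar motion signals per quadrant, a single inhibition weight), not a model of the actual cell.

```python
def lplc2_response(up, down, left, right, inhibition=1.0):
    """Toy opponent-motion readout inspired by LPLC2 wiring.

    Each argument is the motion signal in one quadrant of the dendritic
    field, with positive values meaning motion outward from the
    receptive-field center. Outward motion excites; inward motion
    inhibits (LPi-like opponency), so only radially expanding patterns
    drive a strong response.
    """
    excite = lambda s: max(s, 0.0)    # rectified outward motion
    inhibit = lambda s: max(-s, 0.0)  # rectified inward motion
    drive = sum(excite(s) - inhibition * inhibit(s)
                for s in (up, down, left, right))
    return max(drive, 0.0)

print(lplc2_response(1, 1, 1, 1))    # expansion (loom): 4.0
print(lplc2_response(1, -1, 1, -1))  # translational motion: 0.0
```

Note how the inhibition term is what distinguishes looming from wide-field translation: without it, a pattern sweeping across the eye would excite two quadrants and still trigger a response.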

Sensor-to-motor transformation underlying a survival decision
Putting the visual-motor connectome together with functional studies, it is now possible to describe the stages of information processing that lead to a behavioral decision from photoreceptor to muscle. How do these come together to enable decision-making?
A predator attack demands a quick decision from the prey. For animals such as mice and insects, the choices are similar: freeze, flee, or hide. Flies choosing to flee have a further choice of which of two motor programs to use for takeoff: a 'long mode' that coordinates both wings and legs leading to steady flight, or a 'short mode' to jump with legs only, leading to tumbling flight from which the fly must recover [91-93]. A short mode takeoff significantly increases survival during the fastest predator attacks, but makes no survival difference during slower ones [93]. The faster the predator attack, the higher the percentage of flies that choose the short mode [93]. Thus, the fly evaluates an ongoing looming stimulus to choose an escape strategy accordingly. How do flies make this choice?
In Figure 2, we describe how the early visual system successively transforms visual input to light on/off changes, contrast, and motion, then finally composes those elements to represent ethologically relevant higher-order stimuli such as looming, which are exported to the PVLP of the central brain. In the PVLP, two LC types sensitive to looming (LPLC2 and LC4) synapse directly onto a handful of descending neurons. The Giant Fiber (GF) is one of these. The GF receives input in the brain from the visual system and from Johnston's Organ mechanosensory cells in the antennae, and terminates in the nerve cord, synapsing directly on leg and indirectly (via one interneuron) on wing motor neurons [94-96] (Figure 2a,b, Video S1). A single spike of the GF drives a short mode escape. However, the GF is largely reluctant to spike, being selective towards fast-expanding looming stimuli with large angular size [93,97].
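The looming variables at stake follow from simple geometry: an object of radius r approaching at constant speed v subtends a retinal angle that grows slowly for most of the approach and then expands explosively just before impact. A small worked example (the centimeter-scale values are arbitrary illustrations, not measured fly data):

```python
import math

def loom_angle_deg(radius, speed, t_to_collision):
    """Full angular size (in degrees) subtended by an object of the
    given radius approaching at constant speed, evaluated
    t_to_collision seconds before impact (standard looming geometry).
    """
    return 2 * math.degrees(math.atan(radius / (speed * t_to_collision)))

# A 1 cm object approaching at 0.5 m/s: angular size vs time to impact.
for t in (0.5, 0.2, 0.1, 0.05):
    print(f"{t:5} s to impact: {loom_angle_deg(0.01, 0.5, t):5.1f} deg")
```

The nonlinear blow-up near impact is why angular size and angular expansion velocity carry complementary information about attack kinematics, the two quantities the LPLC2 and LC4 channels report.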
Figure 2. Sensor-to-motor transformation during a looming escape decision. (a) Example synaptic chain within the feed-forward network connecting photoreceptors to leg and wing muscles that is activated during a looming-evoked takeoff decision. A schematic of a portion of this network is shown in (b). Neurons in (a) reconstructed from electron microscopy volumes of the brain [31], ventral nerve cord [109], and light microscopy images [22]: Mi1, LPLC2, Giant Fiber (GF), tergotrochanteral motoneuron (TTMn), peripherally synapsing interneuron (PSI), dorsal longitudinal motoneurons (DLMn). The initial two cell classes in the synaptic chain, photoreceptors and lamina neurons, lie outside these volumes and are not pictured. TTM and DLMs rendered artistically. Optic lobe connectivity in (b) based on Ref. [44]. All cell types shown are bilaterally symmetrical. (c) Algorithmic summary of information transformation in the feedforward sensorimotor circuit depicted in (a) and (b) during a predator attack. The predator casts a looming image on the fly's eye, driving photoreceptors to respond to the temporal sequence of local changes in luminance. Lamina neurons respond to either light increments or decrements, forming ON and OFF channels. Medulla neurons next provide temporal adaptation to report only immediate changes in contrast, eliminating noise for downstream motion computations [110], while a parallel channel reports absolute luminance to aid later perceptual interpretation [111]. T4 (ON pathway) and T5 (OFF pathway) neurons receive luminance change information in the medulla, where separation in the dendritic location of synapses from input neurons in spatially neighboring columns allows T4/T5 neurons to compute motion direction [38]. Four types of T4/T5 neuron exist, each detecting movement in a single cardinal direction. T4/T5 terminals create a spatial-feature map for motion direction in the LOP: two LOP dimensions encode retinotopic spatial location, and the third contains four layers, each innervated by T4/T5 cells with a different preferred direction of motion. LPLC2 dendrites within the LOP then 'read out' looming motion, and their response encodes looming size. Some medulla neurons also input to LC4 neurons, which respond to fast motions with signals proportional to looming angular velocity, that is, the instantaneous change in the retinal angle subtended by the looming object. In the PVLP of the central brain, LPLC2 and LC4 neurons converge onto the GF, a large-diameter descending neuron. The GF integrates the looming size and velocity information conveyed by these visual projection inputs and fires a single spike if brought to threshold. The occurrence and timing of the GF spike relative to activity in other descending neuron pathways determines whether a fly will take off and the type of takeoff enacted to flee from the predator (see Figure 3).

The GF's highly selective response derives from its ability to weigh two features of the looming stimulus, represented in the parallel input it receives from LPLC2 and LC4 [54,87,88] (Figure 2b). Electrophysiological experiments show that LPLC2 encodes the angular size of looming stimuli, while LC4 encodes looming expansion velocity [54] (Figure 2c). Here, LPLC2's response increases with the angular size of the looming stimulus, then declines at larger sizes [88], whereas LC4's response increases directly with looming expansion velocity [87]. The GF's overall response can thus be modeled as a weighted sum of both LPLC2 and LC4 responses [54].
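The weighted-sum description can be sketched directly in code. The tuning curves, weights, and spike threshold below are illustrative stand-ins (a non-monotonic size channel for LPLC2, a velocity-proportional channel for LC4), not fitted parameters from the electrophysiology:

```python
def lplc2_size_response(angle_deg, peak=60.0):
    """Toy size tuning: rises with angular size, declines past a peak
    (the non-monotonic profile reported for LPLC2). Peak is assumed."""
    if angle_deg <= peak:
        return max(0.0, angle_deg / peak)
    return max(0.0, 2.0 - angle_deg / peak)

def lc4_velocity_response(ang_vel_deg_s, scale=500.0):
    """Toy velocity coding: response grows with angular velocity,
    saturating at 1.0. Scale is assumed."""
    return min(ang_vel_deg_s / scale, 1.0)

def gf_drive(angle_deg, ang_vel_deg_s, w_size=0.5, w_vel=0.5):
    """GF input modeled as a weighted sum of the two looming channels."""
    return (w_size * lplc2_size_response(angle_deg)
            + w_vel * lc4_velocity_response(ang_vel_deg_s))

THRESHOLD = 0.6  # hypothetical spike threshold

print(gf_drive(40, 100) > THRESHOLD)  # slow loom: False, GF stays silent
print(gf_drive(40, 450) > THRESHOLD)  # fast loom: True, GF spikes
```

Under this sketch, the same angular size can either leave the GF silent or drive it past threshold depending on expansion velocity, which is exactly the selectivity for fast attacks described in the text.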
Hence, two types of salient information, looming size and speed, are integrated to determine whether, and when, the GF spikes. The timing of the GF spike thus depends on the kinematics of the looming predator. A fast attack will activate both LPLC2 and LC4 channels, driving the GF to spike sooner; a slow attack will primarily activate LPLC2 and may not drive the GF to threshold. The selection between short mode and long mode escape is then determined by the relative timing of spiking responses to looming stimuli between the GF and other, as yet unidentified, descending neuron(s) that mediate long mode escape (Figure 3). Critically, the long mode descending pathway has a lower threshold of activation in response to looming, compared to the GF. GF activation can also interrupt long mode escape to force a short mode takeoff [93]. The dynamic activity in the GF and other descending neurons, as determined by their particular synaptic relationship with looming feature inputs, thus determines which takeoff is enacted. The result is that the fly 'chooses' short mode escapes more often when attacks are fast, but long mode escapes otherwise.
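This timing-based selection can likewise be caricatured as a race between two thresholds. The thresholds, time base, and 'wing raise' duration below are invented for illustration; only the logic (a lower long-mode threshold, with a GF spike interrupting an incomplete long-mode sequence to force a jump) follows the account above:

```python
def escape_mode(loom_drive, gf_threshold=0.8, long_threshold=0.4,
                wing_raise_steps=4):
    """Toy race between the GF (high threshold, short mode) and a
    lower-threshold long-mode descending pathway.

    loom_drive: sequence of stimulus drive values over time steps. The
    long-mode pathway starts raising the wings when it crosses its
    (lower) threshold; if the GF reaches its (higher) threshold before
    wing raising completes, it interrupts and forces a short-mode jump.
    """
    long_start = None
    for t, drive in enumerate(loom_drive):
        if long_start is None and drive >= long_threshold:
            long_start = t  # long-mode sequence begins
        if drive >= gf_threshold:
            wings_ready = (long_start is not None
                           and t - long_start >= wing_raise_steps)
            return "long" if wings_ready else "short"
    return "long" if long_start is not None else "none"

slow = [i * 0.05 for i in range(30)]  # gradual expansion
fast = [i * 0.25 for i in range(30)]  # rapid expansion
print(escape_mode(slow))  # long
print(escape_mode(fast))  # short
```

A slowly ramping stimulus gives the long-mode pathway time to finish its wing-raising sequence before the GF fires; a steep ramp brings the GF to threshold first, reproducing the bias toward short-mode jumps during fast attacks.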
Recent evidence also suggests additional behavioral flexibility in response to looming stimuli, mediated by other descending pathways. When exposed to repeated looming stimuli from which they cannot escape, flies may adopt a strategy of freezing or running away depending on their behavioral state, mediated by the descending neuron DNp09 [98]. Flies may also walk backwards away from looming, a behavior mediated by the moonwalker

Current Opinion in Neurobiology
Figure 3. Parallel descending pathways mediate the choice of escape response. In response to a looming stimulus, which resembles an approaching predator, a fly may choose between a 'short mode' or 'long mode' escape takeoff. The fly evaluates two distinct parameters of the incoming looming stimulus via input from LC4 and LPLC2 visual projection neurons, which synapse onto the Giant Fiber (GF) and other descending neurons (DNs). (a) The long mode is chosen in response to slower predator attacks, which primarily activate LPLC2 (looming angular size channel) but not LC4 (looming angular velocity channel). Activation of only LPLC2 fails to stimulate spiking in the GF, which has a high threshold of activation ('Act. thresh.'). Other DNs then drive long mode escape. In this mode, flies extend their wings before taking off and produce steady initial flight. (b) In response to fast predator attacks, both LC4 and LPLC2 are activated, triggering a spike in the GF that rapidly drives a short mode takeoff. In this mode, flies jump without prior coordination of the wings, resulting in a tumbling initial flight path.
descending neurons and LC16 neurons [86]. Furthermore, looming stimuli presented from the side during flight elicit sharp banking turns in flies, mediated by the descending neuron AX [53]. In some cases, the same looming cue can elicit different behaviors depending on the state of the animal. For example, if looming is presented from the front when the fly is standing, it takes off, whereas if it is flying, the fly extends its legs in a landing response. Two descending neurons (DNp07 and DNp10) that mediate the landing response were found to act as gates, based on motor feedback, that link the landing response to looming only during flight [55]. Such modulation of behavioral output circuits by contextual information is likely a key theme underlying behavioral adaptivity across many animals, and has been observed in multiple visuomotor pathways [46-48,55,82,99,100]. In addition to motor feedback, the action choices of interacting sensorimotor circuits may also be influenced by behavioral or internal state (for a recent review of motivational states in Drosophila, see Ref. [101]), proprioceptive feedback, or other sensory information. Indeed, electrophysiological data suggest additional input to the GF neurons [54], which could play this role. Further details remain to be explored.
While our description above of the sensorimotor cascade underlying decision-making in fly escape behavior may seem to cast this process as an automatic sensory response, such survival decisions have inherent tradeoffs associated with them (e.g. opportunity costs of leaving versus death) and can also be described (at least qualitatively) as economic decisions [102]. In fact, fly escape decisions most closely resemble the theoretical Pavlovian valuation system [103] in which the animal assigns value to a small set of innate behaviors upon which selection processes subsequently act. Given this perspective, where might the values for the different outcomes be physically represented? One possibility is in the activation thresholds of the GF and parallel descending neuron pathways. Future work should examine whether internal states or prior experience can modify these thresholds and how other looming response behaviors are valued and compared.
A brain-wide architecture for decision-making in the fly
The ability to chart circuits deep within the Drosophila brain gives us a window into the central processing underlying visually guided behavioral decisions in the fly. But where amongst these regions are decisions made? Current knowledge paints the following picture of brain organization for visual decision-making in the fly. Visual signals are broadcast in parallel to both sensorimotor association areas (lateral cerebral ganglia) and 'higher' processing centers (CX, MB). Each channel comprises a different kind of information: projections to the sensorimotor mapping area are matched filters for salient ethological motions the fly is likely to observe when a specific motor response is required (e.g. optic flow → steering, looming → escape); projections to the CX carry spatial information about objects; projections to the MB are less processed, but include color and brightness information that may mark salient objects flies need to remember. Interestingly, the distinction between the sensorimotor versus 'higher' CX/MB pathways has parallels to visual processing systems in primates: the primate subcortical visual system, particularly the Superior Colliculus-Pulvinar pathway, acts as a 'rapid detector/first responder' system to salient ethological cues [104], while the inferior temporal cortex of the ventral visual pathway participates in flexible object recognition [105].
Decisions arise from the processing of information across brain regions, both vertically through the sensor-to-motor hierarchy and horizontally across the different visual parallel channels. For example, the decision to carry out short or long mode escape is embedded in the relative spike-timing details across multiple descending neuron pathways that together integrate information from many areas. In at least one of these, the Giant Fiber, the spike timing is determined by multiple layers of vertical sensorimotor processing that span the optic lobes to the PVLP. More complex choices may require further processing between the MB and CX to select targets/goals, then coordinate how to move or orient towards or away from them. Could the MB assign valence information to visual cues, which is then integrated with egocentric spatial information from the CX to determine such goals?
The MBs also bring multisensory information together in specific compartments -could this allow for the 'binding' of disparate sensory information into single multisensory 'objects' or 'scenes' which the memory system can then associate with a valence? Further work is needed to assess the circuits and behavioral role of visual and multisensory information in the MBs as well as its interaction with the CX. Overall, work in the fly demonstrates that visual decision-making processes are distributed among many brain regions and circuits, and in some cases are an emergent property arising from the interaction of parallel circuits. Powerful Drosophila tools are increasingly allowing us to understand and model decision-making mechanisms, providing inspiration for exploration of decision processes in other species, and perhaps eventually impacting our design of man-made autonomous systems.

Conflict of interest statement
Nothing declared.