
Noise-enhanced coding in phasic neuron spike trains

  • Cheng Ly ,

    CLy@vcu.edu

    Affiliation Department of Statistical Sciences and Operations Research, Virginia Commonwealth University, Richmond, VA 23284, United States of America

  • Brent Doiron

    Affiliations Department of Mathematics, University of Pittsburgh, Pittsburgh, PA 15260, United States of America, Center for the Neural Basis of Cognition, Pittsburgh, PA 15213, United States of America

Abstract

The stochastic nature of neuronal response has led to conjectures about the impact of input fluctuations on neural coding. For the most part, low pass membrane integration and spike threshold dynamics have been the primary features assumed in the transfer from synaptic input to output spiking. Phasic neurons are a common, but understudied, neuron class that are characterized by a subthreshold negative feedback that suppresses spike train responses to low frequency signals. Past work has shown that when a low frequency signal is accompanied by moderate intensity broadband noise, phasic neuron spike trains are well locked to the signal. We extend these results with a simple, reduced model of phasic activity that demonstrates that a non-Markovian spike train structure caused by the negative feedback produces noise-enhanced coding. Further, this enhancement is sensitive to the timescales, as opposed to the intensity, of a driving signal. Reduced hazard function models show that noise-enhanced phasic codes are both novel and separate from classical stochastic resonance reported in non-phasic neurons. The general features of our theory suggest that noise-enhanced codes in excitable systems with subthreshold negative feedback are a particularly rich framework to study.

Introduction

The neural transfer of synaptic inputs to output spike trains often involves cellular mechanisms with complex filtering properties. A majority of studies have focused on neurons with Class I excitability that show low frequency stimulus selectivity, or neurons with Class II excitability that have resonant properties [1–6]. However, recordings from neurons in the auditory brainstem [7–10], spinal cord [5, 11], and cerebellum [12] show a firing response preference for high frequency signals over low ones. Neurons with high pass selectivity are often labeled phasic, and are related to Class III excitability [4, 5, 13]. Phasic neural responses have received considerably less attention than Class I or Class II excitability and present a novel challenge in uncovering how fluctuations and signal combine to determine responses.

A common functional interpretation of phasic neuron responses is that they filter out low frequency inputs, which are presumably either distractor inputs, input noise, or are to be coded by another class of neurons. However, the responses of excitable systems are very nonlinear, and a high pass characterization obtained with a specific input structure may not generalize to other inputs. A classic example of this intuition is noise-enhanced coding, where moderate broadband fluctuations increase the information transfer of a weak signal [14–20]. Most models of excitable dynamics are one-dimensional, with artificial spike-reset mechanics erasing any memory of membrane dynamics prior to a spike [14–17]. For this reason these spike trains are guaranteed to have renewal statistics, greatly easing the analyses of noise-enhanced coding [15, 17]. One-dimensional neural dynamics, while nonlinear, nevertheless show baseline filter properties that are low-pass, in contrast to the high pass response of phasic neurons. In most models of phasic neurons, this high pass nature is achieved by a subthreshold negative feedback process, often from a low threshold K+ channel dynamic [5, 7, 10, 21–23]. Thus phasic neurons have dynamics that operate in a minimum of two dimensions, and noise induced spiking dynamics with simple renewal statistics are not expected. Rather, the non-Markovian structure of stochastic responses in higher dimensional systems makes the analysis of noise-enhanced coding especially challenging [24–27]. Noise-enhanced coding in phasic neuron models has received recent attention [9, 10, 23, 28]; however, a sound theoretical framework that highlights the key differences between codes by phasic and non-phasic neurons has not been proposed.

We analyzed a simplified Fitzhugh-Nagumo neuron model with a state dependent timescale in the negative feedback recovery variable, capturing many of the core dynamic features of phasic neurons with a two-dimensional model. While the deterministic spike response of this model cannot code for low frequency signals (no matter the amplitude of the signal), the stochastic response shows a clear phase locking to these signals. We use a phenomenological model to expose the underlying mechanisms as a non-Markovian stochastic process that is distinct from classic stochastic resonance phenomena, yet generalizable to a broad class of excitable systems having subthreshold negative feedback.

Results

Fitzhugh-Nagumo phasic model

Consider the augmented Fitzhugh-Nagumo (FN) model (i.e., Class III excitability [4]) receiving white noise input that mimics background activity: (1) (2) Here v models the membrane potential of a neuron, and w models a refractory variable responsible for action potential repolarization. In this model the w-nullcline is vertical (Fig 1a, bottom) so that the rest state v = 0 is always stable for any constant input I(t) = I. This is in contrast to the standard Fitzhugh-Nagumo model, where a slanted w-nullcline causes the rest state to lose stability via a subcritical Hopf bifurcation when I is increased. We are interested in the response of the model to time varying input, where I(t) is a dynamic signal and ξ(t) is a Gaussian noise source of intensity D obeying 〈ξ(t)〉 = 0 and 〈ξ(t)ξ(t′)〉 = δ(t − t′). The variables τv and ε−1 represent the timescales of v and w, respectively.

Fig 1. Phasic spiking with dynamic negative feedback.

a) Typical trajectories in the phase plane for the ε-model (ε = 0.03, constant) and b) for the ε(v)-model. D = 0.03 in both a) & b). c) Comparison of the power spectrum of v(t) with D = 0.03 for both models. d) Comparison of the firing rates ν as a function of D. Throughout we define ν = 〈y(t)〉 where y(t) = ∑i δ(t − ti) and ti is the time of the ith trajectory crossing of w = wc from below (wc = 0.14 for the ε(v)-model and wc = 0.15 for the ε-model).

https://doi.org/10.1371/journal.pone.0176963.g001

Before we consider the response to dynamic input we note that real phasic neurons have a strong subthreshold negative feedback (often a voltage gated K+ channel), whose timescale determines the high pass property of the spike train response [7–10]. In most models this is captured by an auxiliary variable [9, 10, 21] that often adds more dimensions to the underlying model. However, in our model we introduce dynamics to the timescale of the negative feedback w by letting ε−1 depend on v. We refer to our dynamic timescale model as the ε(v)-model, as opposed to the constant timescale ε-model. Note that Lundstrom et al. [28] also introduced a voltage dependent time-scale ε(v) in a 2D system based on a reduced Hodgkin-Huxley model developed by Rush & Rinzel [29] (see Discussion). In general we take τv ≪ ε(v)−1 for all v so that action potential dynamics can occur; however, the ε(v)-model has a faster w timescale (larger ε(v)) when the model is not engaged in a spike (cf. Fig 1a and 1b, top). This accentuates circular paths in the (v, w) phase-plane near rest (v, w) = (0, 0) (cf. Fig 1a and 1b; bottom). This circular flow is reminiscent of mixed-mode oscillations, a widespread phenomenon in many scientific disciplines [30–33] characterized by distinct oscillations of various amplitudes. The subthreshold oscillatory dynamics give an overall increased rhythmicity to the ε(v)-model, as compared to the constant ε-model (Fig 1c). However, an important distinction between the two models is that the ε(v)-model’s spike train firing rate ν has a higher sensitivity to changes in the noise intensity D (Fig 1d). Taken together, many of the known features of phasic neurons are captured by a two-dimensional FN model with a modification of the time-scale of the recovery variable (the ε(v)-model).

We next compare the spike train coding of low frequency sinusoidal inputs I(t) = A sin(2πφt) (φ ≪ 1) for both the ε(v)- and ε-models (Fig 2a). When D = 0 and φ is sufficiently small, neither model produces a spike, even for very large A (Fig 2c). However, for signals in the presence of noise (D > 0) both phasic models fire, and the increased sensitivity to D of the ε(v)-model persists when A > 0. To measure the tendency for spike responses to track dynamic inputs we define the vector strength r ∈ [0, 1] as: (3) Here p(ψ) is the phase density (probability density of the phase of the sinusoid at the time of a spike) and T = 1/φ. Vector strength is a common measure of the degree of phase-locking to a periodic signal [34–37]. Higher values of r indicate better phase locking, and the ε(v)-model shows a much larger r over a range of D than the ε-model (see Fig 2b). A signal is well encoded if the spike responses are both phase-locked (high r) and dense in time (high firing rate ν), and we thus consider the vector strength weighted by firing rate ν: (4) The large firing rate and phase locking in the ε(v)-model combine to dramatically increase the overall q for a range of D values compared to the ε-model (Fig 2d). Thus, the inclusion of a v-dependent timescale of the negative feedback increases the sensitivity of our phasic model to noisy fluctuations. For moderate D the enhanced sensitivity does not impede, but rather enhances, the encoding of a slow signal by the spike train response.
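
As a concrete illustration, the following minimal Python sketch estimates r from spike times using the standard circular-statistics form (equivalent to Eq (3) when p(ψ) is the empirical phase density), and takes q as the product ν·r, which is our reading of Eq (4) since that equation is not legible in this version; the function names and example data are illustrative only.

```python
import numpy as np

def vector_strength(spike_times, phi):
    """Vector strength r in [0, 1] of spikes relative to sin(2*pi*phi*t),
    computed with the standard resultant-vector (circular statistics) form."""
    spike_times = np.asarray(spike_times, dtype=float)
    if spike_times.size == 0:
        return 0.0
    return np.abs(np.mean(np.exp(2j * np.pi * phi * spike_times)))

def coding_measure(spike_times, phi, duration):
    """q = firing rate * vector strength (assumed form of Eq (4))."""
    nu = len(spike_times) / duration        # mean firing rate over the recording
    return nu * vector_strength(spike_times, phi)

# Example: spikes jittered around one phase of a phi = 0.005 sinusoid
rng = np.random.default_rng(1)
t_spk = (np.arange(20) + 0.25) / 0.005 + rng.normal(scale=5.0, size=20)
print(vector_strength(t_spk, 0.005), coding_measure(t_spk, 0.005, t_spk.max()))
```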

Fig 2. Phasic Fitzhugh-Nagumo model responses lock to slow sinusoidal input.

a) Model responses to low frequency sinusoidal signals (φ = 0.005 with A = 0.01); v(t) superimposed with I(t) = A sin(2πφt) for the ε(v)-model. bi) The phase density (i.e., phase of the sinusoidal input at a spike), p(ψ), in the ε(v)-model with two different noise levels (corresponding to red asterisks in bii)). bii) The vector strength, r (see Eq (3)), as a function of noise level D. The ε(v)-model response is better locked to sinusoidal input than that from the constant ε-model. The strength of the noise has to be large enough to elicit a response with slow sinusoidal input. c) Firing rates are higher with the ε(v) modification and more sensitive to noise than when ε is constant. d) The q-values are larger for a wider range of noise values D in the ε(v)-model because the firing rate is higher (c) and the response is better locked to the input.

https://doi.org/10.1371/journal.pone.0176963.g002

The sensitivity of q to the v dependence of ε shows that subtle changes in subthreshold integration can have a large impact on spike train coding. To measure this we compute 〈ξ(t)|tspike = 0〉 for t < 0, often termed the spike triggered average [38]. The spike response of the ε(v)-model shows a preference for a strong hyperpolarizing (negative) input that quickly depolarizes near the spike time (Fig 3a, black curve), in agreement with in vitro experiments of phasic neurons [7, 8] (also see Fig 6 in [11] for recent experimental verification of this phenomenon). While the ε-model shows a similar selectivity (Fig 3a, blue curve), the effect is diluted compared to that shown by the ε(v)-model.
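
For reference, a minimal sketch of how such a spike-triggered average can be estimated from a simulated noise trace and spike times; this is a standard estimator, and the names below are ours rather than the authors' code.

```python
import numpy as np

def spike_triggered_average(xi, spike_idx, window, dt):
    """Estimate <xi(t) | t_spike = 0> for t in [-window, 0) by averaging the
    noise trace xi over the interval preceding each spike index."""
    n_win = int(window / dt)
    segments = [xi[i - n_win:i] for i in spike_idx if i >= n_win]
    sta = np.mean(segments, axis=0)
    t = -dt * np.arange(n_win, 0, -1)     # times relative to the spike at t = 0
    return t, sta
```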

Fig 3. Low dimensional FN model motivates reduced phenomenological model.

a) 〈ξ(t)|ts = 0〉 with I(t) = 0 for ν = 0.0045 (D = 0.03 for the ε(v)-model and D = 0.069 for the ε-model), scaled to have the same maximum [8]. This demonstrates that the neuron prefers input that is hyperpolarizing, quickly followed by depolarizing input. b) The trajectory in the (v, w) phase plane in gray, starting at (v, w) = (−0.2, 0) with I(t) = D = 0. The minimum distance to the separatrix (gray-dashed) in the v direction is ΔU. c) The evolution of ΔU in (b) (solid black) and the initial ΔU (gray) are plotted, showing the benefit of hyperpolarizing input. The scaled ΔUR(t) used in the reduced model (dotted black) is shown for comparison.

https://doi.org/10.1371/journal.pone.0176963.g003

To understand how ε(v) shapes the spike triggered average we first considered the relaxation dynamics of the deterministic trajectory (v(t), w(t)) from an initial hyperpolarized v (Fig 3b). The trajectory shows that the distance to the separatrix in the v (horizontal) direction is often smaller than when at rest (Fig 3c, black curve). The increased rotational flow near the rest state for the ε(v)-model transiently lowers the effective ‘threshold’ for spike initiation, an attribute well-known in several neural models [4, 28]. This gives an intuitive understanding of why the combination of hyperpolarization followed by depolarization evokes spike responses. This selectivity of spike dynamics has yet to be incorporated into a theory of noise-enhanced coding. Note that these dynamics are only somewhat related to post-inhibitory rebound or anode break excitation, whereby a spike can be initiated by a quick removal of inhibition [29]. Here the inhibition is transient, in contrast to post-inhibitory rebound, which requires sustained inhibition.

We next propose a reduced framework that builds on past models of stochastic resonance [14–19] and exposes how noise enhances coding to a larger degree in phasic than non-phasic neurons.

Phenomenological model of phasic neuron coding

Consider a phenomenological model of the spiking activity, where the v dynamics are captured by a one-dimensional potential well U(v, t): (5) where ′ denotes differentiation with respect to v. For exposition purposes, we ignore the external sinusoidal signal (for the inclusion of a sinusoidal signal, see the Materials and Methods section). In what follows we take v = 0 as a stable state and neglect the local dynamics about v = 0. We consider only large, infrequent fluctuations that force v to exceed one of two threshold values: vR > 0 (right of v = 0) and vL < 0 (left of v = 0). When v crosses vR a spike occurs and v is reset to 0. The energy required for this to occur is ΔUR = U(vR) − U(0); we refer to ΔUR as the right barrier. The second threshold vL, with energy barrier ΔUL = U(vL) − U(0), models the activation of the subthreshold negative feedback that is recruited by a hyperpolarizing fluctuation in the ε(v) FN model (see Fig 3b). We incorporate this feedback as a transient dynamic to the spike threshold. Specifically, if v crosses vL at time t = 0 then the right energy barrier has a damped oscillatory dynamic ΔUR(t) for t > 0 to capture the distance to threshold in the ε(v) FN model (Fig 3c, solid black curve). In the phenomenological model, we let ΔUR(t) = vR − 1.4 sin(0.8π(t + 0.15))/e^(0.8(t+0.25)) (Fig 3c, dotted curve) to qualitatively capture the distance to the separatrix in the ε(v) FN model (compare the curves in Fig 3c). In effect, a sufficiently large hyperpolarization will impact the likelihood that a future depolarization causes a spike threshold crossing. This effect is reminiscent of mixed-mode oscillations, where small subthreshold oscillations facilitate large suprathreshold oscillations [30–33]. From these assumptions we can build a theory for spike dynamics based on ideas from renewal theory [39, 40], an approach that has been successfully used to study a variety of systems [41], including non-phasic neural models [18, 19, 28, 42–44].
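
For concreteness, the moving right barrier stated above can be written as a small helper function; a minimal Python sketch, with the baseline value vR = 1.5 taken from the hazard parameters given below (using it as the resting barrier height is our assumption).

```python
import numpy as np

V_R = 1.5   # baseline right-barrier height; value taken from the hazard example below

def delta_U_R(t):
    """Right energy barrier at time t > 0 after a left-barrier crossing:
    Delta U_R(t) = v_R - 1.4 sin(0.8*pi*(t + 0.15)) * exp(-0.8*(t + 0.25))."""
    t = np.asarray(t, dtype=float)
    return V_R - 1.4 * np.sin(0.8 * np.pi * (t + 0.15)) * np.exp(-0.8 * (t + 0.25))
```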

Let TL ∈ [0, ∞) denote the random time that v crosses the left barrier vL, ignoring crossings of vR for the moment. In the model without a time-varying input current, the rate of crossing the left barrier, termed the hazard function HL, is constant in time. We assume that HL increases with the noise level D and decreases with the barrier height ΔUL. In many models a decent approximation is given by the Arrhenius law for small noise D ≪ 1: HL ≈ γe^(−ΔUL/D) [28, 41, 42, 44]. Assuming this renewal framework for left barrier crossing we may compute other related quantities of interest, namely the survivor function SL(t) and the probability distribution of the inter-crossing intervals fL(t). For small dt we have the following relations: (6)
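
The explicit form of Eq (6) is not legible in this version; under standard renewal theory it presumably links the hazard, survivor function, and inter-crossing interval density as follows (a reconstruction from standard theory, not a quote of the original):

```latex
% presumed content of Eq (6): hazard, survivor function, and inter-crossing density
S_L(t) = \Pr(T_L > t) = \exp\!\left(-\int_0^t H_L(s)\,ds\right), \qquad
f_L(t) = H_L(t)\, S_L(t),
% which for a constant hazard reduce to
S_L(t) = e^{-H_L t}, \qquad f_L(t) = H_L\, e^{-H_L t}.
```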

Ignoring the left barrier for now, consider the times TR ∈ [0, ∞) at which v crosses the moving right barrier. If ΔUR(t) varies slowly (i.e., as in the ε(v) FN model when subthreshold), the right barrier crossings are well-approximated by an inhomogeneous Poisson process with time-varying hazard function HR(t). We assume further that the only temporal dynamics of HR arise via the moving barrier ΔUR(t). The associated hazard, survivor, and inter-crossing interval distribution are given by (7) To more accurately capture the sensitivity of the full FN models to D we set H(ΔU, D) := 5e^(−3ΔU^1.5/D), with ΔUL = 0.9 and vR = 1.5. Our theory is not sensitive to this choice, and the subsequent derivations in the Materials and Methods section are carried out with a general H(ΔU, D).
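
A minimal numerical sketch of these quantities follows, reusing the delta_U_R helper above and approximating the survivor integrals on a uniform time grid; the grid and the noise value below are illustrative choices, not values from the paper.

```python
import numpy as np

def hazard(dU, D):
    """H(dU, D) = 5 exp(-3 dU^1.5 / D), the escape-rate form stated in the text."""
    return 5.0 * np.exp(-3.0 * np.asarray(dU) ** 1.5 / D)

def survivor_and_density(H_t, dt):
    """S(t) = exp(-int_0^t H(s) ds) and f(t) = H(t) S(t) on a uniform grid."""
    S = np.exp(-np.cumsum(H_t) * dt)
    return S, H_t * S

D, dU_L, dt = 0.1, 0.9, 0.01
t = np.arange(0.0, 50.0, dt)
H_L = np.full_like(t, hazard(dU_L, D))     # constant left-barrier hazard
H_R = hazard(delta_U_R(t), D)              # hazard of the moving right barrier
S_L, f_L = survivor_and_density(H_L, dt)
S_R, f_R = survivor_and_density(H_R, dt)
```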

Combining both left and right barriers, the crossing statistics are non-Markovian, presenting analytical difficulties that are addressed in the Materials and Methods section. We analytically obtain an equation for the probability density function of the time between spikes, T: f(t)dt = Pr(t ≤ T ≤ t + dt | T ∉ (0, t)) (see Eq (24)), yielding the firing rate: (8) The equation for the phase density p(ψ) (Eq (23)) and subsequently the vector strength r (Eq (25)) are also derived in the Materials and Methods section.

This phenomenological model enables the parsing of how the components (negative feedback, moving threshold, etc.) can lead to enhanced encoding of slow signals. We shed light on these dynamics by comparing three different models (Fig 4):

  1. The Phasic model we have just described where left barrier crossing resets the moving height ΔUR(t).
  2. Dynamic right barrier ΔUR(t) after a spike reset, yet without a left barrier to reset ΔUR(t) (i.e., JL → 0 so only the first term appears in Eq (11)). We refer to this model as the Right moving boundary model.
  3. A non-phasic model with no left barrier, and ΔUR(t) height only changes with sinusoidal input, as previously studied [14–19]. We refer to this last model as the Classic model.

thumbnail
Fig 4. Phenomenological model dissects core mechanisms of noise-enhanced coding.

a) In the reduced system, leftward exits reset the right barrier, effectively lowering it; the right barrier evolves according to the dotted curve ΔU(t) in Fig 3c. b) The firing rate dependence on D for the non-phasic model (classic, with constant right boundary and no left boundary), a model with a dynamic spike threshold only, and the full phasic model (see Eqs (8) and (24)). c) The vector strength r computed by the analytic formulas for all 3 models (Eq (25)). d) The noise-enhanced encoding q of a sinusoidal input for the three models. Throughout we use the hazard H(ΔU, D) specified in the text to capture the sensitivity of ν to noise (cf. Figs 1d and 4b), and φ = 0.1.

https://doi.org/10.1371/journal.pone.0176963.g004

Over a large range of D, the firing rate ν and the noise sensitivity dν/dD of the phasic model (1) are larger than those of the other two models (Fig 4b; here A ≪ 1 so the rates are the same with and without sinusoidal input). Thus, lowering of ΔUR(t) by hyperpolarization promotes spike discharge and increases firing rate sensitivity to noise, as observed in the FN model (Fig 2c). More importantly, the reduced phasic model (1) encodes low frequency sinusoidal signals with higher q values than the other models (Fig 4d), as also observed in the FN model (Fig 2d). We remark that for small D, model (2) is very different from the phasic model (1) since ν is low and multiple left barrier resets between spikes are likely; however, for large D the model responses approach one another. In total, our reduced framework shows how the subthreshold negative feedback in phasic systems promotes and enhances coding of slow signals with background noise.

Discussion

In deterministic frameworks phasic neurons are high pass systems, with a lack of response to sufficiently low frequency inputs of even large amplitude. However, we have shown that in the presence of noise the coding properties of phasic systems are rich, allowing phasic systems to easily code for low frequency inputs. Although others have demonstrated computationally that background fluctuations can lead to better coding of slow signals [9, 10, 28], we capture the essence of this effect with a 2D planar model which subsequently motivates a reduced phenomenological description of the dynamics. Our analysis specifically enables parsing of the various pieces (i.e., removal of negative feedback, threshold dynamics) that contribute to the observed effect. Despite its simplicity, our reduced point process description shows how the subthreshold negative feedback inherent in phasic neuron activity is an important feature that increases the overall capacity for noise to enhance the coding of slow signals. Our theoretical analysis shows how noise-enhanced coding in phasic systems is mechanistically separate from classic stochastic resonance [14–19], and is a novel form of noise-enhanced coding.

Introducing a voltage-dependent time constant for the inactivation variable has been beneficial in explaining other electrophysiological recordings. Lundstrom et al. [28] used an energy barrier model to understand how noise can cause more firing in Class III neurons and other kinds of neurons in a variety of cortical areas. Specifically, they explained how ε(v) slows the subthreshold voltage dynamics, which can lead to an increase in firing rate with noise. They primarily focused on non-phasic neurons, with the analysis centered on type A (insensitivity to noise with large mean current) and type B+ (sensitivity to noise but repetitive firing as mean current increases). As such they did not dissect the role of noise as specifically as we did, whereby negative fluctuations remove the slow negative feedback and prime the neuron for spiking (see the spike-triggered average of the ε(v) phasic neuron). Recently, Ratté et al. [11] provided detailed analyses of various response dynamics in different types of neurons using phase plane analyses in models and with in vitro experiments. They also found similarly behaving phasic-type neurons (‘differentiators’) in the spinal cord of rodents that required hyperpolarization followed by depolarization to initiate spikes.

How fluctuations drive subthreshold nonlinear dynamics is a potentially fruitful avenue for mixed mode oscillator systems. However, this has been understudied because the intrinsic dynamics are often complicated. Work by Gai, Rinzel and colleagues [9, 10, 23] demonstrated a similar effect of noise enhancing the ability of Class III neurons to encode low frequency signals. They used a combination of in vitro experiments and computational modeling (a 4D Hodgkin-Huxley model), but did not precisely dissect the underlying details of how noise removes the negative feedback (i.e., a fast potassium current) to promote spiking, the way we have in the two-dimensional model that inspired our phenomenological model. Our work shows how, even in a simple framework, the impact of noise on coding in these rich systems can be dramatic.

Although we have only considered a single sinusoidal amplitude and frequency for exposition purposes, our results hold for other parameter values as long as these two parameters are relatively small. Examination of the various quantities in this paper with other parameter values yields the same conclusions (also see Supporting information, Fig A in S1 Text, Fig B in S1 Text, and Fig C in S1 Text). Determining the exact critical point where our results no longer hold (e.g., increasing the frequency and/or amplitude) is difficult because the analysis of the phenomenological model relies on these assumptions. Such investigations are beyond the scope of this study but may yield interesting results.

A better understanding of how phasic neurons encode signals is important in the context of coincidence detection [23, 45], which is known to be important, for example, in the auditory system; indeed, some refer to phasic neurons as ‘coincidence-detector’ neurons [45].

Our work highlights the impact of negative feedback in the subthreshold membrane dynamics on spike discharge. Our phenomenological model shows a non-Markovian character of the membrane dynamics; however, this memory is not carried over after spike discharge. As a result we would expect the inter-spike interval sequence to be renewal (i.e., successive intervals are independent of one another). This should be contrasted with neuronal models with spike-driven slow adaptation [46], where the slow timescale of the negative feedback produces negative serial inter-spike interval correlations, making the spike train process formally non-renewal [47]. This non-renewal character also increases information transfer about static and dynamic signals in these systems [25, 47]. Phenomenological models of these spike dynamics involve time-dependent spike thresholds, which are similar to the time-dependent right barrier in our model [25, 47]. A topic of further study would be to combine these two models so that sub-threshold and supra-threshold negative feedback interact.

Materials and methods

The next three sections contain the derivation of the probability density of the time between spikes, f(t)dt = Pr(t ≤ T ≤ t + dt | T ∉ (0, t)), in the reduced phenomenological model, in the following order for exposition purposes:

  • Without sinusoidal input (Eq (12))
  • With slow sinusoidal input that resets after a spike (Eq (19)); the sinusoidal input starts with the same phase, i.e., endogenous input
  • Finally, with slow sinusoidal input that does not reset after a spike (Eq (24)), i.e., exogenous input

Also, the equation for the phase density p(ψ) and thus the vector strength r are derived for the reduced phenomenological model (see in Eqs (23) and (25), respectively).

Phenomenological model without external input

We are interested in the spiking statistics of the phenomenological model with both a left and right barrier (see Fig 4a). Recall the definitions of the associated hazard, survivor, and density of event times in Eqs (6) and (7). Without loss of generality assume that a spike event has happened at time t = 0. Let T ∈ [0, ∞) be the random time of the neuron’s next spike. We are interested in calculating the inter-spike interval density for T: f(t) = Pr(t ≤ T ≤ t + dt | T ∉ (0, t))/dt. The probability that TR (crossing the Right boundary first after spike reset) and TL (crossing the Left boundary first after spike reset) occur simultaneously goes to 0 as dt → 0, so the probability per unit time of crossing the right barrier (firing) without first crossing the left barrier is equal to SL(t)fR(t). That is, the membrane potential dynamics must survive the left barrier up to time t (with probability SL(t)), then cross the right barrier (with probability density fR(t)). We remark that these are independent events in this formulation because we ignore small amplitude fluctuations and only consider large fluctuations that cause barrier crossings. Similarly, the probability of crossing the left barrier without ever crossing the right threshold is SR(t)fL(t). For convenience, label these quantities JR(t) and JL(t), respectively. (9) (10) As stated earlier, the neuron can fire without ever crossing the left barrier, which has probability (per unit time) JR(t). However, it can also fire by first crossing the left barrier at time t′ < t exactly once, and then crossing the right barrier at time t. The probability (per unit time) of this occurring is: because the times of these events are independent random variables. Continuing in this fashion, we see that the neuron can fire by crossing the left barrier n (n = 0, 1, 2, …) times before it crosses the right barrier, which has probability (per unit time) equal to an n-fold convolution: Summing over n = 0, 1, 2, … gives all the different ways the neuron can spike at t (schematically shown in Fig 4a), yielding: (11) Notice that the terms in the square brackets above are exactly f(t′): (12) Taking the Fourier transform and applying the convolution theorem to this equation, we see that the characteristic function is: (13) from which we can obtain the ISI density, assuming the transform is uniquely invertible: (14) A similar derivation for the transition probabilities in a two-state non-Markovian system has been proposed ([26]; Eqs (14)–(32)); however, we have coupled the crossing of one barrier to the dynamics of a second.
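
Although Eqs (11)–(14) are not reproduced legibly here, the construction above implies that the transform of the ISI density is the transform of JR divided by one minus the transform of JL (the geometric sum of n-fold convolutions). A minimal numerical sketch of that relation using the FFT follows; the grid choices, and the reuse of the quantities from the earlier hazard sketch, are illustrative assumptions.

```python
import numpy as np

def isi_density_fft(J_R, J_L, dt):
    """Our reading of Eqs (11)-(14): f(t) = sum_n (J_L^{*n} * J_R)(t), so the
    transform of f is J_R_hat / (1 - J_L_hat).  The FFT stands in for the
    continuous transform; the grid must be long enough for f to decay."""
    JR_hat = np.fft.rfft(J_R) * dt
    JL_hat = np.fft.rfft(J_L) * dt
    f = np.fft.irfft(JR_hat / (1.0 - JL_hat), n=len(J_R)) / dt
    return np.clip(f, 0.0, None)              # clip small negative numerical ripple

# Illustrative use with the quantities from the hazard sketch above (Eqs (9)-(10)):
dt = 0.01
J_R = S_L * f_R                               # fire before any left-barrier crossing
J_L = S_R * f_L                               # left-barrier crossing before firing
f = isi_density_fft(J_R, J_L, dt)
t = dt * np.arange(len(f))
nu = 1.0 / np.trapz(t * f, t)                 # firing rate as reciprocal of the mean ISI
```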

Phenomenological model with sinusoidal input

Assume the system (Eq (5)) now has a weak sinusoidal (A ≪ vR) input with slow frequency φ ≪ 1: (15)

With slow frequencies, the dynamics are well-approximated by the quasi-static limit (i.e., v(t) takes its steady-state value, which slowly changes in time). Again, we consider the reduced phenomenological model discussed in the last section. The quasi-static approximation is thus equivalent to modulating the left and right barriers sinusoidally with an amplitude A. Here the left barrier changes in time as ΔUL + A sin(2πφt), and the right barrier is ΔUR(t) − A sin(2πφt).
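
A small sketch of the modulated hazards implied by this approximation, reusing the hazard and delta_U_R helpers above; the sign convention for the left barrier follows our reading of the (partly garbled) expression, and t is measured from the last reset of ΔUR.

```python
import numpy as np

def modulated_hazards(t, A, phi, D, dU_L=0.9):
    """Quasi-static hazards under I(t) = A sin(2*pi*phi*t): depolarizing input
    lowers the right barrier and (in our reading) raises the left one."""
    s = A * np.sin(2.0 * np.pi * phi * t)
    H_L = hazard(dU_L + s, D)              # time-varying left-barrier hazard
    H_R = hazard(delta_U_R(t) - s, D)      # time-varying right-barrier hazard
    return H_L, H_R
```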

The formulas developed in Eqs (6)–(14) for the ISI density f(t) can be generalized for Eq (15), except that now both the left and right hazard functions vary in time (16) The probabilities (per unit time) of exiting through the left and right barriers are now: (17)

The Fourier transform of the ISI density is (recall Eqs (11) and (12)): (18) from which we can obtain the ISI density if the system is renewal (i.e., here we assume I(t) is reset to A sin(2πφt) after every spike): (19)

Theory for the phase density and vector strength

The last section showed the derivation of the ISI density in the quasi-static approximation assuming the signal is endogenous (i.e., it always restarts as I(t) = A sin(2πφt) after a spike). However, physiological signals are exogenous, so that after spike reset there are different initial phases ψ0 ∈ [0, T) at which the signal starts: I(t) = A sin(2πφ(t + ψ0)), adding further complications for analytic characterization. The subsequent theory for the probability density of the phase at a spike, p(ψ) (i.e., the cycle histogram), is based on the work of Shimokawa et al. [48].

We can calculate a family of ISI densities conditioned on a starting phase ψ0: f(t|ψ0) with I(t) = A sin(2πφ(t + ψ0)). The probability density function of firing at phase ψ given that the starting phase is ψ0 is (see Eq (19)) (20) and it satisfies a normalization condition for all ψ0. Let us define an operator that maps the probability density of initial phases after the (n − 1)th spike, pn−1(ψ0), to the probability density of phases at the nth spike, pn(ψ): (21) (22) The phase density p(ψ) is the fixed point of this map [48]: (23)
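
A minimal sketch of this fixed-point computation: wrap each conditioned ISI density onto one period to obtain the kernel of Eqs (21)–(22), then iterate until the phase density stops changing. The discretization, the wrapping construction, and all names below are our own illustrative choices.

```python
import numpy as np

def phase_kernel(f_cond, t_grid, n_psi, T):
    """W[j, i]: density of spiking at phase psi_j given starting phase psi_i,
    built by wrapping the conditioned ISI density f(t | psi_i) (row i of
    f_cond, sampled on t_grid) onto one period [0, T)."""
    dt = t_grid[1] - t_grid[0]
    dpsi = T / n_psi
    psi_grid = dpsi * np.arange(n_psi)
    W = np.zeros((n_psi, n_psi))
    for i, psi0 in enumerate(psi_grid):
        bins = np.floor(((psi0 + t_grid) % T) / dpsi).astype(int) % n_psi
        np.add.at(W[:, i], bins, f_cond[i] * dt)   # accumulate wrapped probability mass
    return W / dpsi, psi_grid                      # convert mass per bin to a density

def stationary_phase_density(W, dpsi, n_iter=200):
    """Fixed point of p_n(psi) = int W(psi | psi0) p_{n-1}(psi0) dpsi0 (Eq (23))."""
    n = W.shape[0]
    p = np.ones(n) / (n * dpsi)                    # start from the uniform phase density
    for _ in range(n_iter):
        p = W @ p * dpsi
        p /= p.sum() * dpsi                        # renormalize against numerical drift
    return p
```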

The ISI density is also given by: (24) The firing rate ν is then obtained from this ISI density.

With the phase density p(ψ), we can calculate the vector strength (recall Eq (3)): (25)

Although the expressions for the density of spike times f(t) and the vector strength r are theoretically exact, they are quite difficult to compute numerically because the discretizations of both t and ω necessarily have finite mesh sizes, introducing numerical errors with each integral (transform). We thus adjusted the mesh sizes and domains for each noise value D and ensured the numerical accuracy of Eqs (24) and (25) by checking that the results matched Monte Carlo simulations of the phenomenological system with the prescribed hazard functions (Eqs (6) and (16)).

Monte Carlo simulations

The results of the Monte Carlo simulations for the Fitzhugh-Nagumo model (Eqs (1) and (2)) were noisy. These simulations consisted of 20,000 uncoupled cells run for a total of 100 s of biological time. We used an Euler-Maruyama scheme [49] with a time step of 0.01 ms. The firing statistics (ν and p(ψ)) were calculated by averaging over time and over the uncoupled cells. For illustration purposes, we smoothed the simulated curves in Figs 1c and 2bi with the freely available function fastsmooth (see [50]) using a pseudo-Gaussian method, because of the small time steps. Note that the curves in Fig 2bii and 2d were not smoothed (i.e., the raw values are plotted).
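
For orientation, a minimal Euler-Maruyama sketch of such a simulation is given below. Because Eqs (1)–(2) are not reproduced legibly in this version, the right-hand sides (drift_v, drift_w), the placement of the noise term, and the parameters are placeholders rather than the authors' code.

```python
import numpy as np

def simulate_phasic_fn(drift_v, drift_w, tau_v, D, I_of_t, T_total, dt,
                       n_cells, w_c=0.14, seed=0):
    """Euler-Maruyama integration of n_cells uncoupled cells.
    drift_v(v, w, I) and drift_w(v, w) stand in for the right-hand sides of
    Eqs (1)-(2); spikes are counted as upward crossings of w through w_c,
    following the convention used in Fig 1."""
    rng = np.random.default_rng(seed)
    v = np.zeros(n_cells)
    w = np.zeros(n_cells)
    spike_times = [[] for _ in range(n_cells)]
    for k in range(int(T_total / dt)):
        t = k * dt
        noise = np.sqrt(2.0 * D * dt) * rng.standard_normal(n_cells)
        dv = (drift_v(v, w, I_of_t(t)) * dt + noise) / tau_v   # noise placement is an assumption
        dw = drift_w(v, w) * dt
        v, w_new = v + dv, w + dw
        for cell in np.flatnonzero((w < w_c) & (w_new >= w_c)):  # upward crossing of w_c
            spike_times[cell].append(t)
        w = w_new
    return spike_times
```

With 20,000 cells, a 0.01 ms time step, and 100 s of simulated time, as reported above, the spike times can then be pooled across cells and time to estimate ν and p(ψ).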

Computer code is available at http://doi.org/10.21974/V4159N.

Supporting information

S1 Text. Supplemental figures for “noise-enhanced coding in phasic neuron spike trains”.

Fig A, Comparing various STAs of Fitzhugh-Nagumo model with different noise levels. Fig B, The STAs of Fitzhugh-Nagumo model with sinusoidal input. Fig C, Raw STA of Fitzhugh-Nagumo with no sinusoidal input.

https://doi.org/10.1371/journal.pone.0176963.s001

(PDF)

Acknowledgments

We thank J. Rinzel, Y. Gai, and R. Rosenbaum for useful discussions.

Author Contributions

  1. Conceptualization: BD.
  2. Funding acquisition: CL BD.
  3. Methodology: CL BD.
  4. Project administration: BD.
  5. Software: CL.
  6. Supervision: BD.
  7. Validation: CL.
  8. Visualization: BD CL.
  9. Writing – original draft: BD CL.
  10. Writing – review & editing: BD CL.

References

  1. Rinzel J, Ermentrout GB. Analysis of Neural Excitability and Oscillations. In: Segev I, editor. Methods in Neuronal Modeling: From Synapses to Networks. MIT Press; 1989. p. 135–169.
  2. Ermentrout GB. Type I membranes, phase-resetting curves, and synchrony. Neural Computation. 1996;8:979–1001.
  3. St-Hilaire M, Longtin A. Comparison of coding capabilities of type I and type II neurons. Journal of Computational Neuroscience. 2004;16(3):299–313. pmid:15114051
  4. Izhikevich EM. Dynamical systems in neuroscience: The geometry of excitability and bursting. The MIT Press; 2007.
  5. Prescott SA, De Koninck Y, Sejnowski TJ. Biophysical basis for three distinct dynamical mechanisms of action potential initiation. PLoS Computational Biology. 2008;4:e1000198. pmid:18846205
  6. Barreiro AK, Shea-Brown E, Thilo EL. Time scales of spike-train correlation for neural oscillators with common drive. Physical Review E. 2010;81(1):011916.
  7. Svirskis G, Kotak V, Sanes DH, Rinzel J. Enhancement of signal-to-noise ratio and phase locking for small inputs by a low-threshold outward current in auditory neurons. The Journal of Neuroscience. 2002;22:11019–11025. pmid:12486197
  8. Slee SJ, Higgs MH, Fairhall AL, Spain WJ. Two-dimensional time coding in the auditory brainstem. The Journal of Neuroscience. 2005;25:9978–9988. pmid:16251446
  9. Gai Y, Doiron B, Kotak V, Rinzel J. Noise-gated encoding of slow inputs by auditory brainstem neurons with a low-threshold K+ current. Journal of Neurophysiology. 2009;102:3447–3460.
  10. Gai Y, Doiron B, Rinzel J. Slope-based stochastic resonance: how noise enables phasic neurons to encode slow signals. PLoS Computational Biology. 2010;6:e1000825. pmid:20585612
  11. Ratté S, Lankarany M, Rho YA, Patterson A, Prescott SA. Subthreshold membrane currents confer distinct tuning properties that enable neurons to encode the integral or derivative of their input. Frontiers in Cellular Neuroscience. 2015;8:452. pmid:25620913
  12. Mapelli J, Gandolfi D, D’Angelo E. High-pass filtering and dynamic gain regulation enhance vertical bursts transmission along the mossy fiber pathway of cerebellum. Frontiers in Cellular Neuroscience. 2010;4. pmid:20577586
  13. Hodgkin A. The local electric changes associated with repetitive action in a non-medullated axon. Journal of Physiology. 1948;107(2):165. pmid:16991796
  14. Gammaitoni L, Hänggi P, Jung P, Marchesoni F. Stochastic resonance. Reviews of Modern Physics. 1998;70:223–287.
  15. McNamara B, Wiesenfeld K. Theory of stochastic resonance. Physical Review A. 1989;39:4854–4869.
  16. Jung P, Hänggi P. Amplification of small signals via stochastic resonance. Physical Review A. 1991;44:8032–8042.
  17. Wiesenfeld K, Moss F. Stochastic resonance and the benefits of noise: from ice ages to crayfish and SQUIDs. Nature. 1995;373:33. pmid:7800036
  18. Longtin A. Stochastic resonance in neuron models. Journal of Statistical Physics. 1993;70(1):309–327.
  19. Lindner B, Garcia-Ojalvo J, Neiman A, Schimansky-Geier L. Effects of noise in excitable systems. Physics Reports. 2004;392(6):321–424.
  20. McDonnell MD, Ward LM. The benefits of noise in neural systems: bridging theory and experiment. Nature Reviews Neuroscience. 2011;12(7):415–426. pmid:21685932
  21. Rothman JS, Manis PB. The roles potassium currents play in regulating the electrical activity of ventral cochlear nucleus neurons. Journal of Neurophysiology. 2003;89(6):3097. pmid:12783953
  22. Lundstrom BN, Hong S, Higgs MH, Fairhall AL. Two computational regimes of a single-compartment neuron separated by a planar boundary in conductance space. Neural Computation. 2008;20(5):1239–1260. pmid:18194104
  23. Gai Y, Kotak VC, Sanes DH, Rinzel J. On the localization of complex sounds: temporal encoding based on input-slope coincidence detection of envelopes. Journal of Neurophysiology. 2014;112(4):802–813. pmid:24848460
  24. Goychuk I, Hänggi P. Non-Markovian stochastic resonance. Physical Review Letters. 2003;91:70601.
  25. Chacron MJ, Lindner B, Longtin A. Noise shaping by interval correlations increases information transfer. Physical Review Letters. 2004;92:80601.
  26. Goychuk I, Hänggi P. Theory of non-Markovian stochastic resonance. Physical Review E. 2004;69:21104.
  27. Lindner B, Chacron MJ, Longtin A. Integrate-and-fire neurons with threshold noise: A tractable model of how interspike interval correlations affect neuronal signal transmission. Physical Review E. 2005;72(2):021911.
  28. Lundstrom BN, Famulare M, Sorensen LB, Spain WJ, Fairhall AL. Sensitivity of firing rate to input fluctuations depends on time scale separation between fast and slow variables in single neurons. Journal of Computational Neuroscience. 2009;27:277–290. pmid:19353260
  29. Rush ME, Rinzel J. The potassium A-current, low firing rates and rebound excitation in Hodgkin-Huxley models. Bulletin of Mathematical Biology. 1995;57(6):899–929. pmid:8528161
  30. Hudson J, Hart M, Marinko D. An experimental study of multiple peak periodic and nonperiodic oscillations in the Belousov–Zhabotinskii reaction. Journal of Chemical Physics. 1979;71:1601.
  31. Vanag VK, Yang L, Dolnik M, Zhabotinsky AM, Epstein IR. Oscillatory cluster patterns in a homogeneous chemical system with global feedback. Nature. 2000;406(6794):389–391. pmid:10935631
  32. Vo T, Bertram R, Tabak J, Wechselberger M. Mixed mode oscillations as a mechanism for pseudo-plateau bursting. Journal of Computational Neuroscience. 2010;28:443–458. pmid:20186476
  33. Desroches M, Guckenheimer J, Krauskopf B, Kuehn C, Osinga HM, Wechselberger M. Mixed-mode oscillations with multiple time scales. SIAM Review. 2012;54(2):211–288.
  34. Gerstner W, Kempter R, van Hemmen JL, Wagner H. A neuronal learning rule for sub-millisecond temporal coding. Nature. 1996;383:76–78. pmid:8779718
  35. Burkitt A, Clark G. Synchronization of the neural response to noisy periodic synaptic input. Neural Computation. 2001;13:2639–2672. pmid:11705405
  36. Chacron MJ, Maler L, Bastian J. Electroreceptor neuron dynamics shape information transmission. Nature Neuroscience. 2005;8:673–678. pmid:15806098
  37. Gao X, Wehr M. A coding transformation for temporally structured sounds within auditory cortical neurons. Neuron. 2015;86(1):292–303. pmid:25819614
  38. Dayan P, Abbott LF. Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Cambridge, MA: MIT Press; 2001.
  39. Cox DR. Renewal Theory. New York: Wiley; 1962.
  40. Cox DR, Lewis PAW. The Statistical Analysis of Series of Events. London: Methuen; 1966.
  41. Kramers H. Physica (Utrecht). 1940;7:284.
  42. Plesser HE, Gerstner W. Noise in integrate-and-fire neurons: from stochastic input to escape rates. Neural Computation. 2000;12:367–384. pmid:10636947
  43. DeVille REL, Vanden-Eijnden E, Muratov CB. Two distinct mechanisms of coherence in randomly perturbed dynamical systems. Physical Review E. 2005;72(3):031105.
  44. DeVille REL, Muratov CB, Vanden-Eijnden E. Non-meanfield deterministic limits in chemical reaction kinetics. Journal of Chemical Physics. 2006;124:231102. pmid:16821899
  45. Chen Y, Zhang H, Wang H, Yu L, Chen Y. The role of coincidence-detector neurons in the reliability and precision of subthreshold signal detection in noise. PLoS ONE. 2013;8(2):e56822. pmid:23418604
  46. Benda J, Maler L, Longtin A. Linear versus nonlinear signal transmission in neuron models with adaptation currents or dynamic thresholds. Journal of Neurophysiology. 2010;104(5):2806–2820. pmid:21045213
  47. Chacron MJ, Longtin A, Maler L. Negative interspike interval correlations increase the neuronal capacity for encoding time-dependent stimuli. Journal of Neuroscience. 2001;21(14):5328–5343. pmid:11438609
  48. Shimokawa T, Pakdaman K, Sato S. Time-scale matching in the response of a leaky integrate-and-fire neuron model to periodic stimulus with additive noise. Physical Review E. 1999;59:3427–3443.
  49. Kloeden PE, Platen E, Schurz H. Numerical solution of SDE through computer experiments. Springer Science & Business Media; 2012.
  50. O’Haver T. Fast Smoothing Function. MATLAB Central File Exchange. 2008; http://www.mathworks.com/matlabcentral/fileexchange/19998.