Detection In Neuronal Communications with Finite Channel State

Abolfazl Amiri^a, Sadaf Salehkalaibar^a, Behrouz Maham^b

^a Department of ECE, College of Engineering, University of Tehran, Iran
^b Department of EE, School of Engineering, Nazarbayev University, Astana, Kazakhstan

Abstract

Nano-networks are key to developing future nano-machines, and the use of molecules to transmit information in such networks is the subject of molecular communications. The future of nano-networks depends on the development of molecular communications. Neuro-synaptic communication, which models data transmission in the body's nervous system, is an important example of molecular communication. In this paper, we introduce a comprehensive channel model for the neuro-synaptic communication channel. Our model incorporates the effect of the different channel states and spike rates that exist in the body's nervous system. We propose a finite state Markov channel scheme for the communication system. This scheme has a two-dimensional state space determined by the neural firing rate and the number of available carrier resources. Next, we suggest an M-ary signaling scheme for the synaptic area that models synaptic multi-site activity, in which several neuronal endpoints participate in the neurotransmitter release process. Moreover, we obtain a closed-form solution for the detector at the destination neuron. Finally, we evaluate the decision rules of the detector with simulations.

Keywords: Molecular communications, Neuro-synaptic communications, Finite state Markov channel, MAP detector.

Email addresses: [email protected] (Abolfazl Amiri), [email protected] (Sadaf Salehkalaibar), [email protected] (Behrouz Maham)

Preprint submitted to Nano Communication Networks

May 7, 2017

1. Introduction

Molecular communication is a bio-inspired paradigm that appears in many natural phenomena, and it has received considerable attention in the research community as a means of understanding nano-scale communication [1]. Molecular communication is a way to interconnect very small machines, called nano-machines, which are capable of performing simple tasks such as sensing, actuating, and computing. In order to accomplish complex tasks with these machines, we need to connect them into a nano-network. Such a communication system has a very small transmission range and low power consumption. Some parts of this system resemble a conventional communication system; for example, calcium-ion and hormonal signaling are used in this channel for message delivery. Molecules are the carriers of this system and deliver messages by propagating toward their destination. Among all the molecular communication systems in nature, neuro-synaptic communication plays a crucial role in the human nervous system. It uses electrical impulses and the diffusion of neurotransmitters for data transmission through the body's neural network. In this model, the neurons form the communication channel transferring spike pulses between the brain and the other organs. The spikes, called "action potentials", are rapid changes in the membrane potential. The action potential (AP) is a short-lasting electrical impulse; its time scale is about 1 ms with a voltage fluctuation of 100 mV, and thus it can be modeled by a delta function [2], [3]. Each part of the neuron is responsible for a specific task in the body's nervous system, and each of these parts can be modeled as a component of a communication system. An overall schematic of a single neuron is shown in Fig. 1. The soma, i.e., the body of the neuron cell containing the nucleus, is responsible for generating electrical pulses according to the received nervous stimulus.
The axon is the tubular part of the cell and conveys the electrical pulses; it can be modeled as a wire in a traditional communication system. Synapses are located between transmitting neuron pairs, right after the tail of the transmitting neuron's axon. Once an electrical pulse arrives at the end of the axon, the neurotransmitters, i.e., the neural signal carriers, spread toward the next neuron according to the number of release-ready vesicles. This spreading process is modeled as diffusive data transfer [4]. At the receiving neuron, a dendrite is responsible for catching these neurotransmitters, which propagate in the

Fig. 1: Schematic of a neuron cell

synapse. Before release, they are stored within little sacs called vesicles in the presynaptic terminal. Receptors on the dendrite capture these neurotransmitters and let ionic fluxes flow inside the cell, increasing the electrical potential; thus, the dendrite can be considered a physical antenna of the receiving neuron. Once this potential reaches a threshold, the soma fires an AP. This is the overall communication system in which messages are sent cell by cell. Neuron cells are crucial connectors in every neural system, especially in the visual system. One group of neurons in the nervous system that works in a multi-rate manner is the dorsal Lateral Geniculate Nucleus (LGN) of the thalamus. These neurons are the primary relay neurons by which visual information from the retina reaches the cortex [5]. The retina encodes the visual data with two firing rates: tonic and burst modes. In the tonic mode, compared to the burst mode, pulses are generated at a lower rate with a larger inter-spike interval, whereas a burst is a series of action potentials fired in rapid succession. Tonic firing occurs under the normal physiological condition. One hypothesis states that bursts help signal the detection of objects to the cortex [6], while the tonic mode serves to encode the objects' details [5]. We consider the LGN as an example of a multi-rate operating cell. Most of the previous works on the neuro-synaptic communication channel focus on signal estimation (see, e.g., [7] and [8]). Authors in [9] and

[10] introduce new schemes for neurological therapy and drug delivery using nano-networks. As another example, [11] proposes equalization methods to compensate for the effect of nervous disorders in the human body. In this work, similar to [2], [12], and [13], we consider the signal detection paradigm in the neuro-synaptic channel, where the objective is to find a detector analogous to the yes-no decision scheme used in psychophysics. That is, the detector decides whether a spike was generated at the input (a "yes" decision) or not (a "no" decision). The objective of this detector is to reduce the probability of error in bit detection. Furthermore, on the presynaptic side of the synaptic area, small fluid-filled sacs of neurotransmitters called vesicles are located. According to the number of released vesicles, we assign a state to the channel; thus, we consider a finite state Markov channel (FSMC) in this communication system. An FSMC [14] has a set of different states with fixed transition probabilities between them. In each state, the properties of the communication channel change, and the performance of the system is influenced by the properties of each state, e.g., the probability of error at the receiver side may change. In this paper, we first introduce a comprehensive channel model for neuro-synaptic communication which incorporates the effect of the different channel states and spike rates. Channel states and spike rates both exist in the body's nervous system, especially in the retinal ganglion neurons. Although we use retinal cells as a case study, the features of this communication channel can be applied to other neuron cells. This extended model is based on a previous simple channel model described in [13]. Similar to [4], we use the vesicle usage as the definition of the state space.
Compared to [4], we formulate the properties of the states in a comprehensive manner; that is, we derive the transition and stationary probabilities of the states inside the state space. It is worth mentioning that we consider a point-to-point communication model in our analyses. In this model, a single transmitter-receiver pair with a synaptic area between them is engaged in the message transfer. Due to this modeling, we do not consider interference from other neurons. Moreover, we introduce an M-alphabet channel input instead of the binary alphabet used in [4]. In communication theory [15], M-alphabet signaling is used when there is a need to increase the data transmission rate; this is done by choosing symbols from a set of non-binary symbols, i.e., an M-alphabet set. In addition, we consider the effect of the


Fig. 2: Proposed system model for the neuro-synaptic communication channel

saturation in the postsynaptic receptors for this communication system. This effect is a result of multi-vesicular release, i.e., more than one vesicle being released at the presynaptic terminal, which is discussed in the next sections. Next, we find the detector using the maximum a posteriori (MAP) criterion [15] at the postsynaptic terminal, which receives the data at the destination neuron in the cerebral cortex. The MAP criterion is widely used in the communication literature (see, e.g., [16], [17] and [18]) to find decision rules at the receiver that determine which symbol was sent by the transmitter. We derive a closed-form solution for the binary decision regions of the detector, which decide whether the transmitting neuron has been stimulated or not. Furthermore, the probability of error at the receiving neuron is investigated. Finally, the simulation results show that the probability of error of the decision system decreases when the number of release-ready vesicles increases. In addition, the probability of single vesicle release influences the performance of the detector: as this probability increases, the bit error rate decreases as well. The paper is organized as follows: in Section 2, we describe our system model and explain the features of this communication channel. In Section 3, the signal detection scheme is derived for the proposed system model. Section 4 shows the simulation results. The paper ends with concluding remarks in Section 5.

2. System Model for Synaptic Markov Channel

Our proposed system model is illustrated in Fig. 2. Moreover, a list of notations and definitions used in this paper is given in Table 1.

The first block in Fig. 2 is the Linear Non-linear Poisson (LNP) filtering block [7]. This block generates spike trains in response to its input m(t), a random electrical stimulation from the body's nervous system, in the form of I(t) = Σ_n δ(t − t_n), where δ(.) is the Dirac delta function and the t_n are the occurrence times of spike generation. The spike train I(t) is a doubly stochastic Poisson process with a time-varying firing rate [19]. In our system model, we assume that the system operates at two rates, namely r ∈ {r1, r2}, where r1 and r2 stand for the rates of the tonic and burst modes of neural encoding, respectively. As mentioned before, the burst mode runs at a faster rate, i.e., r1 < r2. The vesicle release block in Fig. 2 models the release of vesicles at the presynaptic terminal; a detailed illustration of the vesicle release process is shown in Fig. 3. As can be seen in Fig. 3, neurotransmitters diffuse into the synaptic area from a single release site. When a spike reaches the end of the axon, a release may occur, depending on the number of release-ready vesicles (vesicles filled with neurotransmitters). In this process, a vesicle merges with the boundary of the neuron and spreads its neurotransmitters into the synaptic cleft. As mentioned earlier, we assume a multi-vesicular release scheme; thus, the vesicle release block consists of M release sites. Each of these sites is modeled by a Z-channel, because without an impulse the probability of releasing a vesicle is very small and can be neglected (see [7]). This model differs from [4], where only one site is considered in the vesicle release block. We suppose that there are M sites on the presynaptic side (see Fig. 2) and that each site releases at most one vesicle. Regarding these M release sites, there is a set with M + 1 elements in which each element gives the number of released vesicles.
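The doubly stochastic Poisson input described above can be approximated per time bin by a Bernoulli draw. The following is a minimal sketch (not the paper's code; the rates, duration, and bin width are illustrative assumptions):

```python
import numpy as np

def poisson_spike_train(rate_hz, duration_s, dt_s, seed=None):
    """Occurrence times t_n of an (approximately) Poisson spike train:
    each bin of width dt_s holds a spike w.p. rate_hz * dt_s (assumed << 1)."""
    rng = np.random.default_rng(seed)
    n_bins = int(duration_s / dt_s)
    spikes = rng.random(n_bins) < rate_hz * dt_s   # Bernoulli approximation
    return np.nonzero(spikes)[0] * dt_s            # times t_n of I(t)

# Tonic vs. burst: the burst mode fires faster, r1 < r2 (rates are illustrative).
t_tonic = poisson_spike_train(25.0, 10.0, 1e-3, seed=0)   # r1 = 25 Hz (assumed)
t_burst = poisson_spike_train(50.0, 10.0, 1e-3, seed=0)   # r2 = 50 Hz (assumed)
```

Switching between the two calls as the hidden rate process r(t) changes yields the multi-rate spike train assumed by the model.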
When an impulse reaches the presynaptic side, each site releases either one vesicle or nothing. Aggregating these released vesicles, we have {0, 1, · · · , M} vesicles on the presynaptic side. In this set, the element 0 stands for the case with no release at any site (no incoming impulse), the element 1 stands for a release at a single site and no release at the remaining sites, and so on. As mentioned above, Fig. 3 illustrates a single vesicle release site in the presynaptic area. In our model, there are M release sites, each of which goes through the release operation independently. We assume that the vesicles of different sites reach the receptors of the receiving neuron synchronously. This assumption is valid because the vesicle sites are similar and close to each other, and they are stimulated simultaneously. Due to the temporal summation property


Fig. 3: A schematic of vesicular release of a single site in the presynaptic area.

of the dendrites [7], this synchronized reception results in a change of the potential at the receiving neuron. In the following, we refer to this model as the "M-ary model". The M-ary model has been proposed in [20] from the neuroscience perspective; however, it has not been studied from the detection theory point of view. In order to study the vesicle release procedure, we define the process a(t) = (m(t), r(t)), which is a continuous-time Markov chain. This process is defined on the discrete state space S = {0, 1, · · · , M} × {r1, r2}, where m(t) ∈ {0, 1, ..., M} is the number of release-ready vesicles and r(t) ∈ {r1, r2} represents the presynaptic rate. We represent the elements of the state space, s^{ν1,ν2} ∈ S, as s^{ν1,ν2} = (m_{ν1}, r_{ν2}), where ν1 ∈ {0, 1, ..., M} and ν2 ∈ {1, 2}, and we define c = (ν1, ν2) to represent the pair ν1 and ν2. Now, we define the infinitesimal generator matrix A of a(t), which has 2(M + 1) × 2(M + 1) elements. This matrix describes the rate at which the continuous-time Markov chain a(t) moves between states. The off-diagonal elements of A are instantaneous state transition rates given by

$$A_{c',c} = \lim_{h \to 0} \frac{1}{h} \Pr\big(a(t+h) = s_c \,\big|\, a(t) = s_{c'}\big), \qquad c' \neq c. \quad (1)$$


Table 1: The list of notations and definitions used in this paper.

t_n : The occurrence time of spikes
r : The state transition rate
x(t) : The state dynamic variable
M : Size of the vesicle pool
τ_r : The recovery time constant for vesicles
r_rel : The release rate of vesicles from the presynaptic terminal
τ_s1, τ_s2 : The rate transition time constants of burst and tonic modes
r_s1, r_s2 : The instantaneous rates of switching between burst and tonic modes
ρ : Single vesicle release probability
s_n^{m,α} : The state with m vesicles and rate α in the n-th time slot
a(t) : Continuous-time Markov process
A : The infinitesimal generator matrix of a(t)
W_n : The number of released vesicles in response to the n-th spike
q_n : The variable gain in the n-th spike
λ : The saturation parameter in postsynaptic receptors
p_z : The distribution of the random variable z
R_0 : Synaptic response of activation of all postsynaptic receptors
v, θ : The order and the rate parameters of the Gamma distribution
τ_e : Time constant of postsynaptic receptors
π : The stationary state distribution vector of a(t)
P_π : The state transition matrix
E_h : The energy of the EPSP wave

Diagonal elements of A are chosen such that the sum of the elements of each row equals zero, i.e., A_{c,c} = −Σ_{k≠c} A_{c,k}. In order to find the elements of A, we have to calculate the different state transition rates that a(t) undergoes. There are several types of state transition for this process. The first is the vesicle recovery procedure, in which the number of release-ready vesicles at the presynaptic terminal is increased. The second is the release process, where a vesicle merges with the cell boundary and spreads its neurotransmitters into the synapse. The third is the change of the instantaneous rate of spike generation. Finally, if none of the above transitions occurs, a(t) remains in its previous state. The recovery rate, r_rec, is the rate at which vesicles are recovered at the presynaptic terminal and is given by

$$r_{rec} \triangleq \lim_{h \to 0} \frac{1}{h} \Pr\big(a(t+h) = (m+1, r) \,\big|\, a(t) = (m, r)\big) = \frac{M - m}{\tau_r}, \quad (2)$$

where τ_r is the recovery time constant and, analogous to [21], is fixed for any kind of neuron. This process increments the value of m(t) by one vesicle. The release rate, r^k_rel, is the rate at which k vesicles are released from the presynaptic terminal. This rate is defined as follows:

$$r^{k}_{rel} \triangleq \lim_{h \to 0} \frac{1}{h} \Pr\big(a(t+h) = (m-k, r) \,\big|\, a(t) = (m, r)\big) = r \, \frac{m!}{k!\,(m-k)!} \, \rho^{k} (1-\rho)^{m-k}, \quad k \neq 0, \quad (3)$$

where m ∈ {0, · · · , M − 1}, k ∈ {0, · · · , m}, r ∈ {r1, r2}, and ρ is the single vesicle release probability. It is worth mentioning that, for a constant release rate, the probability of the vesicle release has a binomial distribution [7]. That is, in order to find the release rate for k vesicles at a specific firing rate (e.g., tonic or burst), we have a binomial experiment with k releases, each with probability ρ, and m − k non-releases, each with probability 1 − ρ. The instantaneous rates, r_s1 and r_s2, at which the value of r(t) switches are given by

$$r_{s1} \triangleq \lim_{h \to 0} \frac{1}{h} \Pr\big(a(t+h) = (m, r_2) \,\big|\, a(t) = (m, r_1)\big) = \frac{1}{\tau_1}, \quad (4)$$

$$r_{s2} \triangleq \lim_{h \to 0} \frac{1}{h} \Pr\big(a(t+h) = (m, r_1) \,\big|\, a(t) = (m, r_2)\big) = \frac{1}{\tau_2}, \quad (5)$$


Fig. 4: Markov chain model of the vesicle release process

where τ_1 and τ_2 are the time constants for rate transition in the burst and tonic modes, respectively. Thus, using the expressions given in (2)-(5), the off-diagonal elements of A can be filled in. The Markov chain diagram describing the continuous process a(t) is illustrated in Fig. 4. The vesicle release process depends only on the spike arrivals; the recovery and rate switching processes are independent of I(t). Now, we can model the output of the vesicle release block as d(t) = Σ_n W_n δ(t − t_n), where W_n is the number of released vesicles in response to the n-th spike. After the release of the neurotransmitters into the synapse, they undergo a diffusional propagation to the destination neuron and reach the receptors on the dendrites of the secondary neuron. The variable gain block in Fig. 2 represents the effects of the saturation of the postsynaptic terminal, R, and the amplitude distortion, q, of the received signal, which we discuss in the following. The release of more than one vesicle, i.e., multi-vesicular

release, results in the saturation of the postsynaptic receptors. We denote this saturation effect by the saturation parameter λ. We also denote the synaptic response of activation of all postsynaptic receptors by R_0. The corresponding response to the release of k vesicles is given by [20]

$$R = R_0 \left[ 1 - (1 - \lambda)^{k} \right]. \quad (6)$$

The amplitude distortion of the signal is the trial-to-trial variability in the amplitude of the postsynaptic responses observed at the receiving neuron. This effect adds a variable gain q to the amplitude of the signal. The probability density function (PDF) of q fits the Gamma distribution [19] and is given by

$$f(q; \theta, v) = \frac{q^{v-1} \exp\!\left(-\frac{q}{\theta}\right)}{\theta^{v} \Gamma(v)}, \quad (7)$$

where v and θ are the order and the rate parameter of the distribution, respectively, and Γ(.) is the Gamma function. The excitatory postsynaptic potential (EPSP) shape block in Fig. 2 generates electrical signals in response to the received neurotransmitters. This function can be modeled with an alpha function [19], represented by

$$h(t) = \frac{t}{\tau_e} \exp\!\left(1 - \frac{t}{\tau_e}\right) u(t), \quad (8)$$

where u(t) is the unit step function and τ_e represents the response time of the receptors. The last source of randomness in this communication system is the synaptic noise, caused by vesicle releases and ionic fluxes at the postsynaptic terminal. It is represented by z(t) in Fig. 2 and modeled as white Gaussian noise with zero mean and spectral density N_0/2. We refer to the distribution of z(t) as p_z(z). The received signal at the postsynaptic terminal can be represented as

$$y(t) = \sum_{n} R_n q_n W_n h(t - t_n) + z(t), \quad (9)$$

where q_n is the amplitude of the quantal random gain and R_n is the saturation gain in the n-th time slot.



Fig. 5: Detection rule based on the number of received vesicles

3. Signal Detection in the Postsynaptic Terminal of the Neuro-Synaptic Markov Channel

In this section, we describe the signal detection method at the receiving neurons. Without loss of generality, we consider the system in a discrete-time model. Thus, we quantize time into slots, each of duration ζ. In order to ensure that there is at most one spike in each time slot, the value of ζ is chosen according to the refractory property of neurons [22]. With this quantization, the information at discrete time slot n is I_n = 1 when there is a spike in the n-th time slot and I_n = 0 when there is no spike. When I_n = 0, no vesicle is released; in this case, there is a probability of the vesicle recovery process, whose rate is defined in (2). In addition, the refractory period of the neurons is such that there is no inter-symbol interference (ISI); that is, more than one spike cannot occur inside a single time slot. In order to maximize the signal-to-noise ratio (SNR) at the receiver side (the postsynaptic terminal), it is desirable to use a filter matched to the pulse generating function h(t) and sample its output at time intervals ζ [15]. Hence, the output of the matched filter is

$$w_n = k E_h q_n R_0 \left[1 - (1 - \lambda)^{k}\right] + z'_n, \qquad k = 0, 1, \cdots, M, \quad (10)$$

where E_h is the energy of the pulse h(t) in a slot of duration ζ and k is the number of released vesicles. Furthermore, z'_n, which is the sampled version of the filtered noise z'(t) = z(t) ∗ h(−t), is a Gaussian random variable with zero mean and variance E_h N_0/2. To model the detection rule at the receiver side, we start with a bit generation at the source neuron cell. Suppose that the LNP

filtering block (see Fig. 2) generates the bit I_n = 1 with probability p and I_n = 0 with probability 1 − p. In order to demodulate the received signal at the postsynaptic side, we have to estimate the number k of vesicles released from the presynaptic terminal. Similar to [20], we assume that for k = 0 the message is I_n = 0, and otherwise I_n = 1. A schematic description of this method is shown in Fig. 5. Here, we use the MAP rule as in [15]. The corresponding decision region should have the following property:

$$D_0 = \{ w_n \in \mathbb{R} : (1-p) P[w_n | I_n = 0] > p P[w_n | I_n = 1] \}, \quad (11)$$

where D_0 is the decision region for bit 0. The left-hand side of the region in (11) can be expressed as follows:

$$
\begin{aligned}
P(w_n | I_n = 0) &\overset{(a)}{=} \sum_{\substack{i,j \in \{0,\cdots,M\} \\ \alpha,\beta \in \{1,2\}}} P(w_n | I_n = 0, s_n = s^{j+i,\beta}, s_{n-1} = s^{j,\alpha}) \Pr(s_n = s^{j+i,\beta} | s_{n-1} = s^{j,\alpha}) \Pr(s_{n-1} = s^{j,\alpha}) \\
&\overset{(b)}{=} \sum_{\substack{j \in \{0,\cdots,M\} \\ \alpha,\beta \in \{1,2\}}} \sum_{i \in \{0,1\}} P(w_n | I_n = 0, s_n = s^{j+i,\beta}, s_{n-1} = s^{j,\alpha}) \Pr(s_n = s^{j+i,\beta} | s_{n-1} = s^{j,\alpha}) \Pr(s_{n-1} = s^{j,\alpha}) \\
&= \sum_{\substack{j \in \{0,\cdots,M\} \\ \alpha,\beta \in \{1,2\}}} \Big[ P(w_n | I_n = 0, s_n = s^{j,\beta}, s_{n-1} = s^{j,\alpha}) \Pr(s_n = s^{j,\beta} | s_{n-1} = s^{j,\alpha}) \Pr(s_{n-1} = s^{j,\alpha}) \\
&\qquad + P(w_n | I_n = 0, s_n = s^{j+1,\beta}, s_{n-1} = s^{j,\alpha}) \Pr(s_n = s^{j+1,\beta} | s_{n-1} = s^{j,\alpha}) \Pr(s_{n-1} = s^{j,\alpha}) \Big],
\end{aligned} \quad (12)
$$

where in (a) we rewrite the probability P[w_n | I_n = 0] in terms of the current and previous states, and (b) follows because when I_n = 0 no vesicle is released; thus i takes values in the set {0, 1}, where i = 0 represents no change in the state (the number of vesicles remains unchanged) and i = 1 denotes the vesicle recovery process mentioned above. Then, we apply the summation over i. We simplify (12) to obtain the following (see Appendix A for the details):

$$
P(w_n | I_n = 0) = p_z(w_n) \sum_{j \in \{0,\cdots,M\}} \sum_{l \in \{1,2\}} \Pr(s_{n-1} = s^{j,l}) \Big[ \Pr(s_n = s^{j,l} | s_{n-1} = s^{j,l}) + \Pr(s_n = s^{j,(\{1,2\}-\{l\})} | s_{n-1} = s^{j,l}) + \Pr(s_n = s^{j+1,l} | s_{n-1} = s^{j,l}) \Big]. \quad (13)
$$

In order to simplify the terms of (13), we must calculate the probability of being in a state and also the probability of a state transition. To find the stationary state probabilities, we use the infinitesimal generator matrix discussed in (1) and solve the equations

$$\pi A = 0, \quad \text{and} \quad \pi \mathbf{1} = 1, \quad (14)$$

where π is the stationary state distribution vector with elements π_i = Pr(s = s^i), and the second equation is an additional constraint ensuring that the probabilities sum to 1. Note that 1 is a 2(M + 1) × 1 vector with unit elements. Solving the above equations simultaneously gives the stationary state distribution vector π. For the state transition probabilities, we use another property of the infinitesimal generator matrix, namely

$$P_{\pi}(t) = \exp(t A^{T}) P(0), \quad (15)$$

where P_π(t) is the state transition matrix with elements

$$p_{n,\pi_{ij}} = \Pr(s_n = s^i | s_{n-1} = s^j), \quad (16)$$

and P(0) is the initial probability matrix; we choose the identity matrix for P(0) in our calculations. The probabilities in (13) can be found from the quantities calculated above. Now, we continue the simplification of the terms in (11). The right-hand side of (11) can be expressed as follows:

$$
\begin{aligned}
P(w_n | I_n = 1) &= \sum_{\substack{i,j \in \{0,\cdots,M\} \\ \alpha,\beta \in \{1,2\}}} P(w_n | I_n = 1, s_n = s^{i,\beta}, s_{n-1} = s^{j,\alpha}) \Pr(s_n = s^{i,\beta} | s_{n-1} = s^{j,\alpha}) \Pr(s_{n-1} = s^{j,\alpha}) \\
&\overset{(a)}{=} \sum_{\substack{i,j \in \{0,\cdots,M\} \\ \alpha \in \{1,2\}}} P(w_n | I_n = 1, s_n = s^{i,\alpha}, s_{n-1} = s^{j,\alpha}) \Pr(s_n = s^{i,\alpha} | s_{n-1} = s^{j,\alpha}) \Pr(s_{n-1} = s^{j,\alpha}) \\
&= \sum_{k > 0} P(w_n | I_n = 1, s_n = s^{j-k,\alpha}, s_{n-1} = s^{j,\alpha}) \sum_{\substack{j \in \{0,\cdots,M\} \\ \alpha \in \{1,2\}}} \Pr(s_n = s^{j-k,\alpha} | s_{n-1} = s^{j,\alpha}) \Pr(s_{n-1} = s^{j,\alpha}),
\end{aligned} \quad (17)
$$

where, similar to (12), we condition the probability P(w_n | I_n = 1) on the current and previous states. In this case, I_n = 1 (i.e., an impulse is generated in the transmitting neuron) and, based on the state of the system, vesicles are released at the presynaptic terminal. In (a), according to Fig. 4 and the release process discussed in (3), the rate does not change; therefore, we set α = β in the summation. Furthermore, since a release occurs in this case, we set i = j − k, where k ≠ 0 is the number of released vesicles. Following steps similar to (A.1), we substitute the conditional probability with its equivalent Gaussian noise distribution. In other words, for k ≠ 0, the output of the matched filter is given by w_n = k E_h q_n R_0 [1 − (1 − λ)^k] + z'_n. Thus, the simplification of (17) yields the following (see Appendix B):

$$P(w_n | I_n = 1) = \int_{0}^{\infty} f(q_n; \theta, v) \sum_{k=1}^{M} p_z\big(w_n - k q_n E_h R_0 [1 - (1 - \lambda)^{k}]\big) \phi(k) \, dq_n, \quad (18)$$

where

$$\phi(k) \triangleq \sum_{\substack{j \in \{0,\cdots,M\} \\ \alpha \in \{1,2\}}} \Pr(s_n = s^{j-k,\alpha} | s_{n-1} = s^{j,\alpha}) \Pr(s_{n-1} = s^{j,\alpha}). \quad (19)$$

Each term in the summation in (18) is a Gaussian random variable with mean k q_n E_h R_0 [1 − (1 − λ)^k] and variance φ²(k) N_0 E_h / 2, as explained earlier. We also define

$$\Phi \triangleq \sqrt{\sum_{k=1}^{M} \phi^{2}(k)}, \quad (20)$$

and

$$\mu \triangleq R_0 \sum_{k=1}^{M} k \left[1 - (1 - \lambda)^{k}\right] = R_0 (1 - \lambda) \frac{1 - (1 - \lambda)^{M} (1 + M \lambda)}{\lambda^{2}}. \quad (21)$$

Thus, the sum term in (18) can be written as

$$\sum_{k=1}^{M} p_z\big(w_n - k q_n E_h R_0 [1 - (1 - \lambda)^{k}]\big) \phi(k) = \Phi \, p_z(w_n - \mu q_n E_h), \quad (22)$$

which is simplified to the following (see Appendix C):

$$P(w_n | I_n = 1) = p_z(w_n) \, \Phi \, F_v(w_n), \quad (23)$$

where

$$F_v(w_n) \triangleq \int_{0}^{\infty} f(q_n; \theta, v) \exp\!\Big(\frac{2 w_n q_n \mu - q_n^{2} E_h \mu^{2}}{N_0}\Big) dq_n = \Big(\frac{2 \mu^{2} E_h \theta^{2}}{N_0}\Big)^{-\frac{v}{2}} e^{\frac{\beta}{2}} \Lambda_{-v}\big(\sqrt{2\beta}\big), \quad (24)$$

$$\beta \triangleq \frac{(2 w_n \theta \mu - N_0)^{2}}{4 N_0 E_h \theta^{2} \mu^{2}}, \quad (25)$$

and Λ_v(.) is the parabolic cylinder function of order v [23, Eq. 9.240]. Finally, to determine the decision region D_0 mentioned in (11), we combine (13) and (23) as follows:

$$D_0 = \Big\{ w_n \in \mathbb{R} : (1-p) \, \frac{p \Phi}{1-p} \, w_{th} \, p_z(w_n) > p \, p_z(w_n) \, \Phi \, F_v(w_n) \Big\}, \quad (26)$$

where

$$
w_{th} \triangleq \frac{1-p}{\Phi p} \sum_{j \in \{0,\cdots,M\}} \sum_{l \in \{1,2\}} \Pr(s_{n-1} = s^{j,l}) \Big[ \Pr(s_n = s^{j,l} | s_{n-1} = s^{j,l}) + \Pr(s_n = s^{j,(\{1,2\}-\{l\})} | s_{n-1} = s^{j,l}) + \Pr(s_n = s^{j+1,l} | s_{n-1} = s^{j,l}) \Big]. \quad (27)
$$
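The stationary probabilities Pr(s_{n−1} = s^{j,l}) and the one-step transition probabilities entering (27) can be computed numerically from the generator matrix A assembled out of the rates (2)-(5). The sketch below is illustrative, not the paper's code: the binomial coefficient in (3) follows the binomial-experiment description, the parameter values are assumed, and a plain Taylor series stands in for the matrix exponential of (15), which is adequate only for small t·||A||:

```python
import numpy as np
from math import comb

def generator_matrix(M, tau_r, rho, rates, taus):
    """Generator A of a(t) = (m(t), r(t)) on the 2(M+1) states
    {0..M} x {r1, r2}; off-diagonal rates follow eqs. (2)-(5)."""
    n = 2 * (M + 1)
    idx = lambda m, v: v * (M + 1) + m           # flatten (m, rate index v)
    A = np.zeros((n, n))
    for v, (r, tau_s) in enumerate(zip(rates, taus)):
        for m in range(M + 1):
            i = idx(m, v)
            if m < M:                            # recovery, eq. (2)
                A[i, idx(m + 1, v)] += (M - m) / tau_r
            for k in range(1, m + 1):            # release of k vesicles, eq. (3)
                A[i, idx(m - k, v)] += r * comb(m, k) * rho**k * (1 - rho)**(m - k)
            A[i, idx(m, 1 - v)] += 1.0 / tau_s   # rate switch, eqs. (4)-(5)
    A[np.diag_indices(n)] = -A.sum(axis=1)       # rows sum to zero
    return A

def stationary_distribution(A):
    """Solve pi A = 0 with pi 1 = 1, eq. (14)."""
    n = A.shape[0]
    B = np.vstack([A.T, np.ones((1, n))])        # append normalization row
    b = np.zeros(n + 1); b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(B, b, rcond=None)
    return pi

def transition_matrix(A, t, terms=200):
    """P(t) = exp(t A^T) P(0), with P(0) = I, eq. (15), via Taylor series."""
    P = term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ (t * A.T) / k
        P = P + term
    return P
```

Evaluating `transition_matrix(A, zeta)` for the slot duration ζ gives the Pr(s_n | s_{n−1}) terms, and `stationary_distribution(A)` gives the Pr(s_{n−1}) terms of (27).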

Fig. 6: An example illustrating the decision regions
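Since (24) involves a parabolic cylinder function, a direct numerical evaluation of F_v(w_n) is often more convenient in practice. The following sketch (an illustration with assumed quadrature range and grid, not the paper's code) integrates the Gamma density of (7) against the exponential kernel of (24) and applies the resulting threshold test:

```python
import math
import numpy as np

def Fv_numeric(wn, theta, v, mu, Eh, N0, q_max=50.0, n=20001):
    """F_v(w_n) of eq. (24) by trapezoidal quadrature over the gain q_n."""
    q = np.linspace(1e-9, q_max * theta, n)
    pdf = q**(v - 1) * np.exp(-q / theta) / (theta**v * math.gamma(v))  # eq. (7)
    kernel = np.exp((2 * wn * q * mu - q**2 * Eh * mu**2) / N0)
    f = pdf * kernel
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(q)))

def decide(wn, w_th, theta, v, mu, Eh, N0):
    """MAP decision: declare D1 (spike present) iff F_v(w_n) > w_th."""
    return int(Fv_numeric(wn, theta, v, mu, Eh, N0) > w_th)
```

F_v is increasing in w_n, so this test behaves as a monotone threshold on the matched-filter output.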

Thus, for the case in which the cortex decides that no spike was generated in the transmitting neuron cell, the decision region becomes

$$D_0 = \{ w_n \in \mathbb{R} : F_v(w_n) < w_{th} \}. \quad (28)$$

The procedure for D_1 is very similar and yields

$$D_1 = \{ w_n \in \mathbb{R} : F_v(w_n) > w_{th} \}. \quad (29)$$

Therefore, we have derived a closed-form solution for the decision regions in the cortex. The parameter v in the amplitude distribution f(q; θ, v) models the variability of q [13]. In the worst-case scenario, i.e., v = 1, the Gamma distribution reduces to the exponential distribution with a large variance, and it can be used to find bounds for the detection system. The detection rule for this case is as follows:

$$w_{th} \underset{D_0}{\overset{D_1}{\gtrless}} \sqrt{\frac{N_0 \pi}{2 E_h \theta^{2} \mu^{2}}} \, \exp(\beta) \, Q\big(\sqrt{\beta}\big), \quad (30)$$

where Q(t) = (1/√(2π)) ∫_t^∞ exp(−u²/2) du is the Q-function, or normal probability tail integral. Fig. 6 illustrates an example of the decision regions D_0 and D_1 for a given threshold value w_th = 0.35. The decision system first calculates the


Fig. 7: Probability of error versus the number of release-ready vesicles for different single vesicle release probabilities ρ. The parameters are chosen as SNR = 20 dB, τ_e = 2 ms, τ_r = 800 ms, τ_1 = r_1^{−1} = 40 ms, τ_2 = r_2^{−1} = 20 ms, λ = 0.1, p = 0.5, θ = 1 and v = 1.

threshold value given in (27), and then, using (30), finds the corresponding regions. Finally, the error probability of our proposed decision system is given by

$$P_e = p \int_{F_v(w_n) < w_{th}} P[w_n | I_n = 1] \, dw_n + (1-p) \int_{F_v(w_n) > w_{th}} P[w_n | I_n = 0] \, dw_n. \quad (31)$$

There is no closed-form solution for this error probability. In other words, considering the limits of the integrals in (31), we have to solve the nonlinear equation F_v(w_n) ≶ w_th, which has no analytical solution. Thus, we solve it numerically in the next section and thereby calculate the probability of error P_e.

4. Simulation Results

In this section, we simulate the Markov channel in the neuro-synaptic communication model described in Fig. 2 with the Monte-Carlo method, using 10^6 transmitted spikes, and analyze its performance. In order to map the neural system onto a digital communication system, we use the bit "1" for spike occurrence and the bit "0" for spike absence in each time slot. To
There is no closed form solution for this error probability calculation. In other words, considering the limits of the integrals in (31), we have to solve a nonlinear equation Fv (wn ) ≶ wth which has no analytical solution. Thus, we try to solve it numerically in the next section and by that, we can calculate the probability of error Pe . 4. Simulation Results In this section, we simulate the Markov channel in the neuro-synaptic communication model described in Fig. 2 with the Monte-Carlo method with 106 transmitted spikes and analyze its performance. In order to map the neural system into a digital communication system, we use the bit “1” as spike occurrence and the bit “0” for spike absence in each time slot. To 18


Fig. 8: Probability of error versus SNR for different single vesicle release probabilities ρ. The parameters are the same as in the previous figure, except that M = 3.

implement a realistic stimulation of the nervous system, we use a binary Poisson random number generator. In Fig. 7, we study the probability of error, P_e, versus the number of release-ready vesicles M. The parameters for this simulation are chosen similarly to [24]. This figure plots P_e for different values of the single vesicle release probability ρ. As can be seen, increasing the number of vesicles in the presynaptic terminal (M) reduces the error probability at the postsynaptic terminal (i.e., in the cerebral cortex). A larger M increases the number of neurotransmitters in the synaptic area, so a signal has a higher chance of being transferred across the synapse. In other words, when there is a small number of release-ready vesicles in the presynaptic terminal, most of the incoming spikes lose the potential to trigger vesicle release, and the signal transmission fails. Moreover, the error probability also decreases as the single vesicle release probability (ρ) increases. This follows from the fact that with a large ρ, the number of neurotransmitters in the synaptic area grows; these neurotransmitters convey a larger number of impulses to the postsynaptic side, which improves the detection of the multi-rate data by the neurons in the cortex.
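A simplified Monte-Carlo sketch of this experiment is given below. It is an illustration only and deliberately simplifies the paper's setup: the full Markov state dynamics are replaced by an i.i.d. Binomial(M, ρ) release per "1" bit, a fixed amplitude threshold stands in for the F_v(w_n) test, and SNR is taken as E_h/N_0; all of these are assumptions:

```python
import numpy as np

def simulate_ber(M, rho, snr_db, n_bits=100_000, lam=0.1, p=0.5,
                 theta=1.0, v=1.0, R0=1.0, Eh=1.0, seed=0):
    """Simplified Monte-Carlo bit error rate: each '1' bit releases
    K ~ Binomial(M, rho) vesicles and yields the matched-filter
    statistic of eq. (10); a fixed (hypothetical) amplitude threshold
    replaces the F_v(w_n) test of eqs. (28)-(29)."""
    rng = np.random.default_rng(seed)
    bits = rng.random(n_bits) < p                        # I_n, spike w.p. p
    K = np.where(bits, rng.binomial(M, rho, n_bits), 0)  # released vesicles
    q = rng.gamma(v, theta, n_bits)                      # quantal gain, eq. (7)
    signal = K * Eh * q * R0 * (1.0 - (1.0 - lam) ** K)  # eq. (10), noiseless
    N0 = Eh / 10.0 ** (snr_db / 10.0)                    # SNR = Eh/N0 (assumed)
    wn = signal + rng.normal(0.0, np.sqrt(Eh * N0 / 2.0), n_bits)
    thr = 0.5 * Eh * theta * v * R0 * lam                # hypothetical threshold
    return float(np.mean((wn > thr) != bits))
```

Even under these simplifications, the qualitative trend of Fig. 7 should reappear: the error rate falls as M (or ρ) grows, since the chance of a spike producing no release shrinks.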


Fig. 9: The effect of saturation parameter λ on the probability of error for different single vesicle release probabilities.

Fig. 8 illustrates the bit error rate of the decision system versus SNR for different single vesicle release probabilities ρ. When ρ is 0.1 (i.e., the probability of neurotransmitter diffusion in the synaptic area between the cells is very small), the density of neurotransmitters in the synaptic area is small; thus, increasing the SNR barely affects the probability of error at the cortex side. Furthermore, as in a conventional communication system with additive Gaussian noise, increasing the SNR, i.e., strengthening the desired signal against a fixed noise power, lowers the probability of error.

In Fig. 9, we study the impact of the saturation parameter λ on the probability of error in the detection system for different single vesicle release probabilities. Consider the formula for the signal at the input of the detector in (10): the input is a function of [1 − (1 − λ)^k]. As λ goes to 0, the detector observes pure noise and thus decides on the transmitted bit at random; in this case, the error probability is 0.5. On the other hand, as λ goes to 1 (and especially for λ > 0.3), the receptors at the receiver become saturated. This saturation closes the ionic gates at the postsynaptic terminal, so neurotransmitters cannot bind to these gates to activate the ionic flow inside the receiving neuron. Thus, in this regime the parameter ρ has only a slight effect on the performance of the system.
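The behavior at the two extremes of λ can be checked with a small numerical sketch. Only the amplitude factor [1 − (1 − λ)^k] comes from the model; the Gaussian Q-function error estimate and the parameter values below are illustrative assumptions.

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def pe_estimate(lam, k=3, amp=2.0, sigma=1.0):
    """Rough error estimate for a binary threshold detector whose signal
    amplitude scales with the saturation factor [1 - (1 - lam)^k]."""
    signal = amp * (1.0 - (1.0 - lam) ** k)
    return q_func(signal / (2.0 * sigma))  # distance to the midpoint threshold

# lam -> 0: the factor vanishes, the detector sees pure noise and Pe -> 0.5;
# past lam ~ 0.3 the factor saturates, so Pe changes only slowly with lam.
pe_at_0 = pe_estimate(0.0)
pe_mid = pe_estimate(0.3)
pe_high = pe_estimate(0.9)
```

The flattening of the curve for large λ mirrors the receptor-saturation effect described above: once the amplitude factor is close to 1, further increases in λ (or ρ) buy almost nothing.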

5. Conclusion

In this paper, we introduced a comprehensive neuro-synaptic channel model that incorporates the effects of spike train rates and channel states. The proposed Markov channel captures these effects in its two-dimensional state space, and we derived the full properties of this state space. Furthermore, we included the impact of receptor saturation at the postsynaptic side in our model. Moreover, we derived a closed-form solution for the detector at the postsynaptic side and the probability of error of the decision system. Finally, we simulated the proposed system model in various scenarios and analyzed the results.

The proposed system model can be extended in several directions. From an information-theoretic view, one can determine rate bounds for the Markovian neural channel. Alternatively, considering a multi-input neural structure would yield a more realistic system model. Advances in neuro-synaptic communications would benefit both artificial nano-networks and therapeutic methods: new system models can serve as structures for man-made neural networks, and by modeling different diseases of the body nervous system, one can propose engineering techniques for therapies.

Appendix A. Simplification of the term (12)

\begin{align}
&\sum_{\substack{j \in \{0,\dots,M\},\\ \alpha,\beta \in \{1,2\}}} \Big[ P\big(w_n \mid I_n = 0, s_n = s^{j,\beta}, s_{n-1} = s^{j,\alpha}\big) \Pr\big(s_n = s^{j,\beta} \mid s_{n-1} = s^{j,\alpha}\big) \Pr\big(s_{n-1} = s^{j,\alpha}\big) \notag\\
&\qquad + P\big(w_n \mid I_n = 0, s_n = s^{j+1,\beta}, s_{n-1} = s^{j,\alpha}\big) \Pr\big(s_n = s^{j+1,\beta} \mid s_{n-1} = s^{j,\alpha}\big) \Pr\big(s_{n-1} = s^{j,\alpha}\big) \Big] \notag\\
&\overset{(a)}{=} \sum_{\substack{j \in \{0,\dots,M\},\\ \alpha,\beta \in \{1,2\}}} p_z(w_n) \Big[ \Pr\big(s_n = s^{j,\beta} \mid s_{n-1} = s^{j,\alpha}\big) \Pr\big(s_{n-1} = s^{j,\alpha}\big) + \Pr\big(s_n = s^{j+1,\beta} \mid s_{n-1} = s^{j,\alpha}\big) \Pr\big(s_{n-1} = s^{j,\alpha}\big) \Big] \notag\\
&= p_z(w_n) \sum_{\substack{j \in \{0,\dots,M\},\\ \alpha \in \{1,2\}}} \Pr\big(s_{n-1} = s^{j,\alpha}\big) \sum_{\beta \in \{1,2\}} \Big[ \Pr\big(s_n = s^{j,\beta} \mid s_{n-1} = s^{j,\alpha}\big) + \Pr\big(s_n = s^{j+1,\beta} \mid s_{n-1} = s^{j,\alpha}\big) \Big] \notag\\
&\overset{(b)}{=} p_z(w_n) \sum_{j \in \{0,\dots,M\}} \sum_{\alpha \in \{1,2\}} \Pr\big(s_{n-1} = s^{j,\alpha}\big) \Big[ \Pr\big(s_n = s^{j,1} \mid s_{n-1} = s^{j,\alpha}\big) + \Pr\big(s_n = s^{j,2} \mid s_{n-1} = s^{j,\alpha}\big) \notag\\
&\qquad + \Pr\big(s_n = s^{j+1,1} \mid s_{n-1} = s^{j,\alpha}\big) + \Pr\big(s_n = s^{j+1,2} \mid s_{n-1} = s^{j,\alpha}\big) \Big] \notag\\
&\overset{(c)}{=} p_z(w_n) \sum_{j \in \{0,\dots,M\}} \Big\{ \Pr\big(s_{n-1} = s^{j,1}\big) \big[ \Pr\big(s_n = s^{j,1} \mid s_{n-1} = s^{j,1}\big) + \Pr\big(s_n = s^{j,2} \mid s_{n-1} = s^{j,1}\big) \notag\\
&\qquad + \Pr\big(s_n = s^{j+1,1} \mid s_{n-1} = s^{j,1}\big) + \Pr\big(s_n = s^{j+1,2} \mid s_{n-1} = s^{j,1}\big) \big] \notag\\
&\qquad + \Pr\big(s_{n-1} = s^{j,2}\big) \big[ \Pr\big(s_n = s^{j,1} \mid s_{n-1} = s^{j,2}\big) + \Pr\big(s_n = s^{j,2} \mid s_{n-1} = s^{j,2}\big) \notag\\
&\qquad + \Pr\big(s_n = s^{j+1,1} \mid s_{n-1} = s^{j,2}\big) + \Pr\big(s_n = s^{j+1,2} \mid s_{n-1} = s^{j,2}\big) \big] \Big\} \tag{A.1}
\end{align}

where (a) follows because, when there is no release in the presynaptic terminal, substituting $k = 0$ in the output of the matched filter, $w_n = k E_h q_n R_0 [1 - (1-\lambda)^k] + z'_n$, gives $w_n = z'_n$; thus, we replace the conditional probability of $w_n$ with its equivalent Gaussian noise density $p_z(w_n)$. Steps (b) and (c) follow by carrying out the summations over $\beta$ and $\alpha$, respectively. The summation now contains eight terms, each representing a transition of the Markov process $a(t)$. According to Fig. 4, in the rate transition process represented in (4) and (5), the vesicles are not recovered and the number of vesicles does not change. Thus, the terms $\Pr(s_n = s^{j+1,2} \mid s_{n-1} = s^{j,1})$ and $\Pr(s_n = s^{j+1,1} \mid s_{n-1} = s^{j,2})$ are equal to zero. Therefore, (A.1) becomes

\begin{align}
&p_z(w_n) \sum_{j \in \{0,\dots,M\}} \Big\{ \Pr\big(s_{n-1} = s^{j,1}\big) \big[ \Pr\big(s_n = s^{j,1} \mid s_{n-1} = s^{j,1}\big) + \Pr\big(s_n = s^{j,2} \mid s_{n-1} = s^{j,1}\big) + \Pr\big(s_n = s^{j+1,1} \mid s_{n-1} = s^{j,1}\big) \big] \notag\\
&\qquad + \Pr\big(s_{n-1} = s^{j,2}\big) \big[ \Pr\big(s_n = s^{j,1} \mid s_{n-1} = s^{j,2}\big) + \Pr\big(s_n = s^{j,2} \mid s_{n-1} = s^{j,2}\big) + \Pr\big(s_n = s^{j+1,2} \mid s_{n-1} = s^{j,2}\big) \big] \Big\} \notag\\
&= p_z(w_n) \sum_{j \in \{0,\dots,M\}} \sum_{l \in \{1,2\}} \Pr\big(s_{n-1} = s^{j,l}\big) \Big[ \Pr\big(s_n = s^{j,l} \mid s_{n-1} = s^{j,l}\big) \notag\\
&\qquad + \Pr\big(s_n = s^{j,(\{1,2\}\setminus\{l\})} \mid s_{n-1} = s^{j,l}\big) + \Pr\big(s_n = s^{j+1,l} \mid s_{n-1} = s^{j,l}\big) \Big].
\end{align}
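As a sanity check, the reduction from eight transition terms to three per state can be verified numerically on a randomly generated chain. The state encoding, the uniform prior, and the random transition probabilities below are illustrative assumptions; only the forbidden-transition constraint comes from the model.

```python
import random

random.seed(0)
M = 3  # number of vesicle levels (illustrative)

# States are pairs (j, l) with j in 0..M (vesicle count) and l in {1, 2} (rate state).
states = [(j, l) for j in range(M + 1) for l in (1, 2)]

# Random transition probabilities honouring the model constraint that a rate
# switch cannot coincide with a vesicle recovery:
# Pr(s^{j+1,2} | s^{j,1}) = Pr(s^{j+1,1} | s^{j,2}) = 0.
P = {}
for (j, l) in states:
    targets = [(j, 1), (j, 2)]
    if j < M:
        targets.append((j + 1, l))  # recovery keeps the rate state l
    weights = [random.random() for _ in targets]
    z = sum(weights)
    for t, w in zip(targets, weights):
        P[((j, l), t)] = w / z

prior = {s: 1.0 / len(states) for s in states}  # uniform Pr(s_{n-1})

# Eight-term sum over alpha, beta in {1, 2}; forbidden transitions contribute 0.
eight = sum(prior[(j, a)] * P.get(((j, a), t), 0.0)
            for j in range(M + 1) for a in (1, 2)
            for t in [(j, 1), (j, 2), (j + 1, 1), (j + 1, 2)])

# Compact three-term sum from the end of Appendix A.
three = sum(prior[(j, l)] * (P.get(((j, l), (j, l)), 0.0)
                             + P.get(((j, l), (j, 3 - l)), 0.0)
                             + P.get(((j, l), (j + 1, l)), 0.0))
            for j in range(M + 1) for l in (1, 2))
```

Both sums agree because the two dropped terms are exactly the transitions that the chain forbids.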

Appendix B. Simplification of the term (17)

\begin{align}
&\sum_{k>0} P\big(w_n \mid I_n = 1, s_n = s^{j-k,\alpha}, s_{n-1} = s^{j,\alpha}\big) \sum_{\substack{j \in \{0,\dots,M\}\\ \alpha \in \{1,2\}}} \Pr\big(s_n = s^{j-k,\alpha} \mid s_{n-1} = s^{j,\alpha}\big) \Pr\big(s_{n-1} = s^{j,\alpha}\big) \notag\\
&= \sum_{k=1}^{M} \mathbb{E}_{q_n}\Big[ p_z\big(w_n - k q_n E_h R_0 [1 - (1-\lambda)^k]\big) \Big] \sum_{\substack{j \in \{0,\dots,M\}\\ \alpha \in \{1,2\}}} \Pr\big(s_n = s^{j-k,\alpha} \mid s_{n-1} = s^{j,\alpha}\big) \Pr\big(s_{n-1} = s^{j,\alpha}\big) \notag\\
&\overset{(a)}{=} \sum_{k=1}^{M} \int_0^{\infty} f(q_n;\theta,v)\, p_z\big(w_n - k q_n E_h R_0 [1 - (1-\lambda)^k]\big)\, dq_n \sum_{\substack{j \in \{0,\dots,M\}\\ \alpha \in \{1,2\}}} \Pr\big(s_n = s^{j-k,\alpha} \mid s_{n-1} = s^{j,\alpha}\big) \Pr\big(s_{n-1} = s^{j,\alpha}\big) \notag\\
&\overset{(b)}{=} \int_0^{\infty} f(q_n;\theta,v) \sum_{k=1}^{M} p_z\big(w_n - k q_n E_h R_0 [1 - (1-\lambda)^k]\big)\, \phi(k)\, dq_n. \tag{B.1}
\end{align}

Here, we replaced the first conditional probability with its equivalent Gaussian distribution and took the expectation over the random variable $q_n$. We also set the upper limit of the summation over $k$ to $M$, the maximum number of vesicle resources in the presynaptic terminal. In (a), we replaced the expectation over $q_n$ by the integral against the distribution of $q_n$. In (b), we used the definition (19).

Appendix C. Simplification of the term (22)



\begin{align}
P(w_n \mid I_n = 1) &= \int_0^{\infty} f(q_n;\theta,v) \sum_{k=1}^{M} p_z\big(w_n - k q_n E_h R_0 [1 - (1-\lambda)^k]\big)\, \phi(k)\, dq_n \notag\\
&= \int_0^{\infty} f(q_n;\theta,v)\, \Phi\, p_z(w_n - \mu E_h q_n)\, dq_n \notag\\
&= \Phi \int_0^{\infty} f(q_n;\theta,v) \frac{1}{\sqrt{\pi E_h N_0}} \exp\!\left( -\frac{w_n^2 + q_n^2 E_h^2 \mu^2 - 2 w_n q_n \mu E_h}{N_0 E_h} \right) dq_n \notag\\
&= \frac{\exp\!\left( -\frac{w_n^2}{E_h N_0} \right)}{\sqrt{\pi E_h N_0}}\, \Phi \int_0^{\infty} f(q_n;\theta,v) \exp\!\left( -\frac{q_n^2 E_h^2 \mu^2 - 2 w_n q_n \mu E_h}{N_0 E_h} \right) dq_n \notag\\
&= p_z(w_n)\, \Phi \int_0^{\infty} f(q_n;\theta,v) \exp\!\left( -\frac{q_n^2 E_h^2 \mu^2 - 2 w_n q_n \mu E_h}{N_0 E_h} \right) dq_n \notag\\
&\overset{(a)}{=} p_z(w_n)\, \Phi\, F_v(w_n), \tag{C.1}
\end{align}

where in (a) we used the definition in (24).

References

[1] I. F. Akyildiz, F. Brunetti, C. Blázquez, Nanonetworks: A new communication paradigm, Computer Networks 52 (12) (Aug. 2008) 2260–2279.

[2] B. Maham, A communication theoretic analysis of synaptic channels under axonal noise, IEEE Communications Letters 19 (11) (Nov. 2015) 1901–1904.

[3] J. P. Meeks, S. Mennerick, Action potential initiation and propagation in CA3 pyramidal axons, Journal of Neurophysiology 97 (5) (2007) 3460–3472.

[4] E. Balevi, O. B. Akan, A physical channel model for nanoscale neuro-spike communications, IEEE Transactions on Communications 61 (3) (Mar. 2013) 1178–1187.

[5] P. Reinagel, D. Godwin, S. M. Sherman, C. Koch, Encoding of visual information by LGN bursts, Journal of Neurophysiology 81 (5) (1999) 2558–2569.

[6] J. Feng, Computational Neuroscience: A Comprehensive Approach, CRC Press, 2003.

[7] D. Malak, O. B. Akan, A communication theoretical analysis of synaptic multiple-access channel in hippocampal-cortical neurons, IEEE Transactions on Communications 61 (6) (June 2013) 2457–2467.

[8] F. Gabbiani, C. Koch, Coding of time-varying signals in spike trains of integrate-and-fire neurons with random threshold, Neural Computation 8 (1) (1996) 44–66. doi:10.1162/neco.1996.8.1.44.

[9] L. Galluccio, S. Palazzo, G. E. Santagati, Modeling signal propagation in nanomachine-to-neuron communications, Nano Communication Networks 2 (4) (2011) 213–222.

[10] L. Galluccio, S. Palazzo, G. E. Santagati, Characterization of molecular communications among implantable biomedical neuro-inspired nanodevices, Nano Communication Networks 4 (2) (2013) 53–64.

[11] A. Amiri, B. Maham, Inter-symbol interference analysis in neuro-synaptic communications, in: 2016 8th International Symposium on Telecommunications (IST), 2016, pp. 478–483.

[12] A. Amiri, B. Maham, S. Salehkalaibar, Inter-neuron interference analysis in neuro-synaptic communications, IEEE Communications Letters 21 (4) (2017) 737–740.

[13] A. Manwani, C. Koch, Detecting and estimating signals over noisy and unreliable synapses: Information-theoretic analysis, Neural Computation 13 (1) (2001) 1–33.

[14] A. J. Goldsmith, P. P. Varaiya, Capacity, mutual information, and coding for finite-state Markov channels, IEEE Transactions on Information Theory 42 (3) (1996) 868–886.

[15] J. Proakis, M. Salehi, Digital Communications, 5th edition, McGraw-Hill Education, 2007.

[16] J. L. Gauvain, C.-H. Lee, Maximum a posteriori estimation for multivariate Gaussian mixture observations of Markov chains, IEEE Transactions on Speech and Audio Processing 2 (2) (1994) 291–298.

[17] G. Z. Dai, J. M. Mendel, Maximum a posteriori estimation of multichannel Bernoulli-Gaussian sequences, IEEE Transactions on Information Theory 35 (1) (1989) 181–183.

[18] G. Sparacino, C. Tombolato, C. Cobelli, Maximum-likelihood versus maximum a posteriori parameter estimation of physiological system models: The C-peptide impulse response case study, IEEE Transactions on Biomedical Engineering 47 (6) (2000) 801–811.

[19] P. Dayan, L. F. Abbott, Theoretical Neuroscience, MIT Press, Cambridge, MA, 2001.

[20] V. Matveev, X.-J. Wang, Implications of all-or-none synaptic transmission and short-term depression beyond vesicle depletion: A computational study, Journal of Neuroscience 20 (4) (Feb. 2000) 1575–1588.

[21] S. Reich, R. Rosenbaum, The impact of short term synaptic depression and stochastic vesicle dynamics on neuronal variability, Journal of Computational Neuroscience 35 (1) (2013) 39–53.

[22] H. W. Heiss, Human physiology, Clinical Cardiology 6 (9) (Sep. 1983) A43–A44.

[23] D. Zwillinger, A. Jeffrey, Table of Integrals, Series, and Products, 7th edition, Academic Press, 2007.

[24] R. Rosenbaum, J. Rubin, B. Doiron, Short term synaptic depression imposes a frequency dependent filter on synaptic information transfer, PLoS Computational Biology 8 (2012) e1002557.
