This paper examines, in a nerve cell simulated on a digital computer, the information-processing properties established at the synaptic level. To clarify these properties, information theory is applied under the view that information is carried not simply by the mean frequency or rate but by the detailed sequence of spikes themselves. The information H(Y) of the response distribution measures the input-output frequency transfer characteristic, and the average mutual information I(X1,2 : Y) measures the pattern-sensitive characteristic of successive impulse intervals. With these measures, the following results were obtained:
a) When the presynaptic interval form varies from exponential to gamma of order 10, H(Y) becomes larger in the order gamma of order 10, gamma of order 5, gamma of order 3, gamma of order 2, and exponential. b) The average mutual information I(X1,2 : Y) is closely related to the EPSP size and to the presynaptic mean frequency and interval form.
In the relation between the EPSP size and the presynaptic mean frequency at which I(X1,2 : Y) is maximum, the EPSP size decreases as the presynaptic mean frequency increases. This tendency is common to all presynaptic interval forms.
At all EPSP sizes A0 = 0-1.0, the input mean frequency at which I(X1,2 : Y) is maximum becomes higher in the order exponential, gamma of order 2, gamma of order 3, gamma of order 5, and gamma of order 10.
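The presynaptic interval forms compared above can be sketched numerically. The following is a minimal illustrative sketch, not the paper's simulation: it draws gamma-distributed interspike intervals of orders 1 (exponential), 2, 3, 5, and 10 at a common mean interval, and estimates the entropy of the binned interval histogram as a crude discrete stand-in for an information measure such as H(Y). All function names, bin counts, and parameter values are assumptions chosen for illustration.

```python
import numpy as np

def gamma_intervals(order, mean_interval, n, rng):
    # Gamma-distributed interspike intervals of a given order (shape
    # parameter); order=1 reduces to the exponential (Poisson) case.
    # Scale is chosen so the mean interval is the same for every order.
    return rng.gamma(shape=order, scale=mean_interval / order, size=n)

def interval_entropy(intervals, n_bins=32):
    # Entropy (bits) of the interval histogram -- a crude discrete
    # stand-in for the entropy of a spike-interval distribution.
    counts, _ = np.histogram(intervals, bins=n_bins)
    p = counts / counts.sum()
    p = p[p > 0]  # drop empty bins before taking the log
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
for order in (1, 2, 3, 5, 10):
    iv = gamma_intervals(order, mean_interval=10.0, n=20000, rng=rng)
    print(f"gamma order {order:2d}: H ~ {interval_entropy(iv):.3f} bits")
```

Because higher-order gamma intervals are more regular (lower coefficient of variation) at the same mean rate, the histogram entropy differs systematically with the order, which is the kind of form dependence the abstract describes.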