Modeling chemical synapses and wrong brain maps in Devlog #9

Published:

These guys are bullshitting. Though I like the map.

But anyway, the GeForce RTX 40 Series has already reached 76 billion transistors, so this video card can be compared to a human brain with its 86 billion neurons. A human with a slightly damaged brain, something like too much alcohol at college parties.

This is very interesting: https://news.mit.edu/2022/dendrites-help-neurons-perform-0217

  • The researchers found that within a single neuron, different types of dendrites receive input from distinct parts of the brain, and process it in different ways.

  • apical oblique dendrites

  • tuft dendrites (input from the thalamus)

  • basal dendrites (demonstrate supralinearity; a toy sketch follows this list)

  • neurons are not limited to producing a single neurotransmitter, and there is considerable heterogeneity in neurotransmitter synthesis within the nervous system
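
To make the supralinearity point concrete, here is a toy Python sketch. The sigmoidal branch nonlinearity and its parameters are my own assumptions, not the paper's model: two inputs landing on the same basal branch evoke more than twice the response of a single input, while the soma sums separate branches linearly.

```python
import numpy as np

def branch(x, gain=8.0, thresh=1.2):
    """Hypothetical sigmoidal dendritic nonlinearity (assumed shape)."""
    return 1.0 / (1.0 + np.exp(-gain * (x - thresh)))

one   = branch(1.0)        # one input on a branch
same  = branch(2.0)        # two inputs on the SAME branch
split = 2 * branch(1.0)    # the same two inputs on two different branches

print(f"one: {one:.3f}  same branch: {same:.3f}  split: {split:.3f}")
# same branch >> 2 * one, i.e. supralinear summation within a branch
```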

If only an electrical signal, i.e. K⁺, Ca²⁺, Cl⁻, or Na⁺ ions, reaches the axon terminal, then what determines which neurotransmitter the neuron synthesizes?

  • The availability of precursor molecules and enzymes

What can cause changes in the electric potential across the cell membrane?

  • Neurotransmitters released by other neurons can bind to specific receptors on the cell membrane, triggering changes in ion channel activity.
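
A minimal simulation sketch of that mechanism, with illustrative constants rather than measured values: a pulse of transmitter opens a synaptic conductance, which pulls the membrane potential from the resting level toward the channel's reversal potential.

```python
# Toy membrane-potential sketch: a transmitter pulse opens a synaptic
# conductance g_syn, which pulls the voltage V away from the leak
# (resting) potential toward the channel's reversal potential.
# Constants are illustrative; membrane capacitance is folded in.

DT, T_END = 0.1, 50.0          # time step and duration, ms
E_LEAK, E_SYN = -70.0, 0.0     # resting and excitatory reversal, mV
G_LEAK, TAU_SYN = 0.05, 5.0    # leak conductance, synaptic decay (ms)

V, g_syn = E_LEAK, 0.0
for i in range(int(T_END / DT)):
    t = i * DT
    if abs(t - 10.0) < DT / 2:      # transmitter released at t = 10 ms
        g_syn += 0.3                # receptor binding opens ion channels
    g_syn -= DT * g_syn / TAU_SYN   # channels close again over ~5 ms
    V += DT * (-G_LEAK * (V - E_LEAK) - g_syn * (V - E_SYN))
    if i % 100 == 0:
        print(f"t = {t:4.1f} ms   V = {V:6.1f} mV")
```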

How can the process of thinking be explained by neuronal activity?

  • Thinking often involves the integration of information from different brain regions. Neuronal activity can synchronize and coordinate across multiple regions, allowing for the integration of various cognitive processes, such as perception, memory, attention, and decision-making.

Neural facilitation, also known as paired-pulse facilitation (PPF) https://en.wikipedia.org/wiki/Neural_facilitation - when impulses arrive at short, regular intervals, residual Ca²⁺ builds up in the presynaptic terminal, increasing neurotransmitter release and keeping the postsynaptic neuron firing at a steady rate. Consistency is key.
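
A rough Python sketch of that, loosely in the spirit of the Tsodyks-Markram short-term plasticity model; every constant here is an assumption for illustration.

```python
import numpy as np

# Paired-pulse facilitation toy model. Each presynaptic spike leaves
# residual Ca2+ in the terminal; release probability rises with that
# residue and relaxes back to baseline between spikes.

TAU_F = 100.0   # facilitation decay time constant, ms (assumed)
U0    = 0.2     # baseline release probability (assumed)
DU    = 0.15    # extra release probability added per spike (assumed)

def release_amplitudes(spike_times_ms):
    """Relative synaptic response evoked by each spike in a train."""
    u, last_t, out = U0, None, []
    for t in spike_times_ms:
        if last_t is not None:
            # facilitation decays exponentially during the interval
            u = U0 + (u - U0) * np.exp(-(t - last_t) / TAU_F)
        out.append(round(u, 3))
        u = min(1.0, u + DU)   # the spike itself adds residual Ca2+
        last_t = t
    return out

# Regular 20 ms intervals: each response is a bit larger than the last,
# settling into a steady, elevated level. Consistency is key.
print(release_amplitudes([0, 20, 40, 60, 80]))
```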

Post-tetanic potentiation (PTP) - after a high-frequency burst (a tetanus), the synapse stays strengthened for seconds to minutes, so later spikes evoke larger postsynaptic potentials. Some type of expectation that the previous experience is gonna repeat.
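
And a similar toy sketch for PTP, with made-up time constants: a 100 Hz tetanus leaves behind a slowly decaying potentiation that still boosts a test pulse tens of seconds later.

```python
import numpy as np

# Post-tetanic potentiation toy model: a 1 s burst at 100 Hz builds a
# potentiation that decays slowly (tens of seconds), so test pulses
# long after the burst still get an enhanced response. All numbers
# are illustrative, not measured.

TAU_P = 30_000.0   # PTP decay time constant, ms (assumed)
DP    = 0.02       # potentiation contributed per tetanus spike (assumed)

tetanus = np.arange(0.0, 1000.0, 10.0)    # 100 Hz for 1 second
tests   = [2_000.0, 10_000.0, 60_000.0]   # test pulses after the burst

def response_gain(t):
    """Response relative to baseline (1.0) at time t, in ms."""
    dt = t - tetanus
    return 1.0 + DP * np.exp(-dt[dt > 0] / TAU_P).sum()

for t in tests:
    print(f"test at {t/1000:4.0f} s after start: {response_gain(t):.2f}x baseline")
```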

Differences from artificial neural networks

  1. Fully connected feed-forward networks with dropout are a very silly artificial model for explaining time encoding and inhibitory effects. In my view, the network should have fixed connections with immediate effect on several layers ahead, but not to every neuron in the next layer (a hard-wired implementation of dropout). And cycles. Backpropagation should not be a separate step during training: feedback is part of the forward step (inference) too. In that case it is not strictly feed-forward, it is a closed loop with feedback. It is like a recurrent network where the prediction comes back as input, but in an RNN such an architecture usually causes instability during training. Here, however, there is no gradient descent, because there are no weights. The cost function we will introduce later. A toy sketch of this idea follows below.
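Here is that toy sketch in Python; the size, sparsity, threshold, and sign convention are all arbitrary choices of mine. The connectivity is fixed and sparse (the "hard dropout"), the graph is unrestricted in direction so skip connections and cycles appear naturally, there are no trainable weights, and the feedback is simply part of every forward tick.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 32          # number of neurons (arbitrary)
P_EDGE = 0.2    # probability that a fixed connection exists (arbitrary)
THRESH = 1      # summed input a neuron needs in order to fire (arbitrary)

# Connections are +1 (excitatory) or -1 (inhibitory), in any direction,
# so skip connections and feedback cycles appear naturally.
C = (rng.random((N, N)) < P_EDGE) * rng.choice([-1, 1], size=(N, N))
np.fill_diagonal(C, 0)   # no self-loops

def tick(state):
    """One forward step; recurrent feedback is included by construction."""
    return (C @ state >= THRESH).astype(int)

state = rng.integers(0, 2, size=N)   # random initial activity
for step in range(10):
    state = tick(state)
    print(f"tick {step}: {state.sum()} neurons active")
```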

A neuron technically can inhibit and excite at the same time, but it's usually a sign of schizophrenia.
