Logarithmic nature of the brain 💡
- Published 5 Jun 2024
- Shortform link:
shortform.com/artem
My name is Artem, I'm a computational neuroscience student and researcher.
In this video we will talk about the fundamental role of the lognormal distribution in neuroscience. First, we will derive it through the Central Limit Theorem, and then explore how it supports brain operations on many scales - from cells to perception.
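The multiplicative derivation previewed in the description (the log of a product is a sum of logs, and the CLT makes that sum near-normal) can be checked numerically. This is a minimal sketch; the factor count, sample count, and uniform factor range are arbitrary choices of mine:

```python
import numpy as np

rng = np.random.default_rng(0)

# A product of many independent positive factors: taking logs turns the
# product into a sum, and the CLT makes that sum approximately normal,
# so the product itself is approximately lognormal.
n_factors, n_samples = 200, 20_000
factors = rng.uniform(0.5, 1.5, size=(n_samples, n_factors))
products = factors.prod(axis=1)

logs = np.log(products)
skewness = ((logs - logs.mean()) ** 3).mean() / logs.std() ** 3
print(f"skewness of log(product): {skewness:.3f}")  # near 0 => near-normal logs
```

Near-zero skewness of the logs is the tell-tale sign: if the logs are close to normal, the product is close to lognormal.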
REFERENCES:
1. Buzsáki, G. & Mizuseki, K. The log-dynamic brain: how skewed distributions affect network operations. Nat Rev Neurosci 15, 264-278 (2014).
2. Ikegaya, Y. et al. Interpyramid Spike Transmission Stabilizes the Sparseness of Recurrent Network Activity. Cerebral Cortex 23, 293-304 (2013).
3. Loewenstein, Y., Kuras, A. & Rumpel, S. Multiplicative Dynamics Underlie the Emergence of the Log-Normal Distribution of Spine Sizes in the Neocortex In Vivo. Journal of Neuroscience 31, 9481-9488 (2011).
4. Morales-Gregorio, A., van Meegen, A. & van Albada, S. J. Ubiquitous lognormal distribution of neuron densities across mammalian cerebral cortex. bioRxiv, biorxiv.org/lookup/doi/10.1101... (2022) doi:10.1101/2022.03.17.480842.
OUTLINE:
00:00 Introduction
01:15 What is Normal distribution
03:03 Central Limit Theorem
04:23 Normality in biology
05:46 Derivation of lognormal distribution
10:20 Division of labour in the brain
12:20 Generalizer and specialist neurons
13:37 How lognormality arises
15:19 Conclusion
16:00 Shortform: sponsor message
16:54 Outro
CREDITS:
Icons by biorender.com/
Mathematical animations were created using Manim CE python library - www.manim.community/
Join Shortform for awesome book guides and get 5 days of unlimited access! shortform.com/artem
@Artem Kirsanov the text at 15:03 doesn't seem to correspond to the biorxiv paper you have linked in the description 😅
Log-normal distributions are closely related to pink noise (power ∝ 1/frequency), since d(log x)/dx = 1/x. This is said to be a hallmark of self-organization. It shows up everywhere you have fractal symmetry: brains, turbulence, finance, weather, even migration patterns.
The everything
yep I was totally thinking of Quantitative Linguistics the moment log-normal distribution cropped up
d(ln x)/dx = 1/x
@@Simonadas04 some people prefer log to denote ln, since log base e is more common in pure mathematics.
@@luker.6967 i see
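For what it's worth, the identity this thread is circling (the derivative of the natural log is 1/x, whichever symbol you use for it) is easy to confirm with a finite-difference check:

```python
import math

# Central-difference approximation of d/dx ln(x), compared against 1/x.
def deriv(f, x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2 * h)

for x in (0.5, 1.0, 3.0, 10.0):
    assert abs(deriv(math.log, x) - 1.0 / x) < 1e-6
print("d/dx ln(x) == 1/x checks out numerically")
```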
Great video!
Super happy to see my paper about neuron densities made it into this video 15:12 :)
wow, the UA-cam algorithm is crazy
As an electrical engineer who has also always been extremely interested in neuroscience, stumbling across this channel is pure gold.
Thank you for all the effort you put into your videos, Artem. You're doing a great job taking complex topics and making them easy to visualize and to understand.
In case you're looking for topic suggestions for future videos, I have a few:
1. curriculum you would follow if you had to start from scratch and wanted to teach yourself neuroscience (computational or, if you prefer, a different concentration)
2. sources of information neuroscientists should follow in order to stay current with the research in the field (e.g. journals, labs, companies, people, etc)
3. list of open problems in neuroscience
Thank you again for your videos. Keep up the great work. 😊
Thank you for wonderful suggestions!
Right now, I'm actually preparing the script for a video about getting started with computational neuroscience! So stay tuned ;)
@@ArtemKirsanov Thank you. I look forward to it. 🙂
@@ArtemKirsanov Can you clarify how exactly normal distributions arise eventually even when you have wildly extreme and different values? Is it basically just evening out?
@@leif1075 pretty much! look at height; there's a wide variance, and in any town you can find a tiny person and a giant. But overall, most people are average height, and these outliers are rare. Hence normal
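The "evening out" intuition above can be made concrete: start from a strongly skewed variable and watch the skew shrink as you sum more copies. The exponential distribution and the sample sizes here are my own choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Sample skewness: third standardized moment (0 for a normal distribution).
def sample_skew(x):
    return ((x - x.mean()) ** 3).mean() / x.std() ** 3

# Exponential draws are strongly skewed (theoretical skewness 2);
# sums of 100 of them are much closer to symmetric (2 / sqrt(100) = 0.2).
single = rng.exponential(size=100_000)
summed = rng.exponential(size=(100_000, 100)).sum(axis=1)

print(f"skew of one draw:     {sample_skew(single):.2f}")  # around 2
print(f"skew of sum of 100:   {sample_skew(summed):.2f}")  # around 0.2
```

The outliers don't disappear, they just get diluted: each extreme draw is averaged against many ordinary ones, which is exactly the CLT at work.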
Excellent video, Artem! I enjoy a lot watching your videos, they are incredibly well done and explained. I particularly liked the ones involving topology.
Your videos are fantastic for anyone interested in neuroscience!
I never studied it in depth but it's fascinating and I'm discovering it
This channel is the most beautiful thing that has happened in my life this week, maybe even this month. Thank you for your effort, greetings from Mexico!
This was one of the most mindblowing videos I've seen in a while. Such amazing content Artem!
Wow you are such an effective communicator!!! Your insights were very clear and easy to understand
I am happy I didn't skip this video, and now I know another great channel for math and science
thank you Artem
great quality, and topics I am interested in
The best explanation of logarithms I've ever seen. How surprising that a neuroscience UA-cam video managed to describe the concept and its application so much more completely than any of the math classes I've ever taken. Well done!
Your videos are fantastic. What an excellent communicator!
I studied statistics for biologists at university some 25 years ago.
Your explanations are logical and intuitive. Good job Artem.
I can’t believe this valuable information is available on YT for free!! I just finished my A-level studies and am keen on biology and neuroscience, so I loved getting to see a computational perspective on the brain. Makes me wonder where else log-normal distributions can be seen in the body, or what other mathematical models can be deduced in biological systems.
Keep up!
one word, EXCELLENT!!! So happy to watch this.
This is definitely one of my favourite channels now. Up there with 3B1B. You explain things really well, and the topics you cover are just my cup of tea.
Your videos are such high quality thanks for the efforts.
Thanks! This is fascinating and useful information. You presented it so clearly, and the visuals were top notch. Really appreciate this work.
i love your channel and the way you explain everything with such detail!
On a completely unrelated note, the lognormal distribution also pops up in the field of mathematical finance! In particular, it is used to model the stock prices in the Black-Scholes model.
Wow, cool info! Thanks for sharing
The wikipedia page on log-normal distribution has some examples too:
- city sizes
- number of citations of journal articles and patents
- surgery durations
- length of hair, nails, or teeth
- length of chess games
- length of comments in forums, etc.
It's an interesting read.
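A minimal sketch of the Black-Scholes connection mentioned above: simulate geometric Brownian motion (a price built from many small multiplicative shocks) and check that the terminal log-returns have the moments the lognormal model predicts. All parameter values are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Geometric Brownian motion, the stock model behind Black-Scholes:
# S_T = S_0 * exp((mu - sigma^2/2)*T + sigma*sqrt(T)*Z), Z ~ N(0, 1),
# so terminal prices are lognormal. Simulating step by step shows the
# same thing: many small multiplicative shocks compound into a lognormal.
s0, mu, sigma, T, n_steps, n_paths = 100.0, 0.05, 0.2, 1.0, 252, 20_000
dt = T / n_steps
log_shocks = rng.normal((mu - 0.5 * sigma**2) * dt,
                        sigma * np.sqrt(dt),
                        size=(n_paths, n_steps))
s_T = s0 * np.exp(log_shocks.sum(axis=1))

log_returns = np.log(s_T / s0)
print(f"mean log-return: {log_returns.mean():.3f}")  # near mu - sigma^2/2 = 0.03
print(f"std  log-return: {log_returns.std():.3f}")   # near sigma = 0.20
```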
Most stimulating content in ages! 👍🖖🤘
This is exactly what I was looking for today!! How strange that I found this exact topic here. Thank you :)
Terrific video, Artem. Mind-blowing: not only the production values, but in particular the highly engaging content. Thank you for sharing it with us. Fantastic ❤
Brilliant, Artem! And fascinating.
Wow man amazing videos, I wanna do research as a computational neuroscientist and your content is really what I was looking for!
Dude that’s so cool! I’m a first year grad student at OSU looking to research Nuclear Theory! And I’ve been watching your videos since late high school, I’d love to see a series on QFT!
This channel is a hidden gem. +1 subscriber
Impressive ... neuroscience is a powerful topic
once wrote a spiking neural net with around a million neurons
some neurons would fire almost every iteration, some every 10 iteration, and some would average once every thousands.
didn't bother to plot the distribution but that could have been fun.
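The plot the commenter skipped is easy to mock up. This is purely hypothetical (the lognormal parameters below are made up, not taken from their network), but it reproduces the same flavor of spread: a few neurons firing almost every iteration, many firing once in thousands:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical per-neuron firing probabilities per iteration, drawn from
# a lognormal and capped at 1 spike/iteration. Parameters are invented.
n_neurons = 1_000_000
rates = rng.lognormal(mean=-4.0, sigma=2.0, size=n_neurons)
rates = np.clip(rates, None, 1.0)

frequent = (rates > 0.9).mean()   # fire nearly every iteration
rare = (rates < 1e-3).mean()      # fire about once per thousands
print(f"frequent: {frequent:.1%}, rare: {rare:.1%}")
```

Both tails coexist in one smooth distribution, which is exactly the skewed-but-continuous picture the video describes.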
Thanks for high quality content! I love the use of palmistry as an example of a random process at 3:42 :)
Thank you for a great video! Very interesting topic and very nice of you to show the article to make people more likely to actually look it up for themselves. 😀👍
Truly amazing topics, thank you !
It's interesting because I thought the video would be about how the brain perceives information logarithmically, but it actually shows it's physically built logarithmically as well.
Thank you Artem ❤
Absolutely amazing video! Subscribed.
Your videos have good dynamics and didactics, and the editing is very harmonious; it's really impressive that you don't have 1 million subscribers. One more subscriber from Brazil 🇧🇷
Thanks for the clear explanation, great video
Beautiful, thank you!
This lecture is wow...thanks
Cool. Thanks for sharing.
Outstanding!
This video is fantastic
Wow! An amazing video! Thank you very much Artem. You have a new subscriber from Argentina 🇦🇷
Super cool video Artem! Keep up!
Thanks for the informative video
Very productive vid. It inspires me to be productive as well.
Great animations and explanations. However, as a fellow scientist and learner, I wish you had presented the central limit theorem and the derivation of the log-normal distribution in their full mathematical glory. I feel that half the power of Manim is in its ability to concisely represent both the graphical and textual aspects of mathematics; to avoid one of them is to kneecap the platform. As a learner, it is essential that I build associations between the graphical and textual representations. I think you did this better in your video on wavelets!
Anyway, thank you so much for taking the time to create these videos. I am sure that they will make a lasting contribution to the field of computational neuroscience and inspire students for years to come.
Wonderful content on a most interesting topic.
really interesting video for someone researching NNs, thanks!
Very good. Thank you.
Really nice explanation, thanks!
Great video!
I would love to see a follow-up video on neuronal avalanches and the critical brain hypothesis. A nice review on the topic that you might find useful is "Being critical of criticality in the brain" by Beggs and Timme (2012).
Thank you! I will definitely look into it!
@@ArtemKirsanov Thank you for sharing Artem. I hope you can respond to my message about how to deal with scientific papers and dealing with math when you can. Thanks very much.
Interestingly, I used to work in solar physics where avalanches are also a commonly used model for how solar flares occur
That was very well done
That was amazing! Great work, Artem - love your videos :-)
Insightful video. 👍 Keep going.
Great video, really interesting!
High quality content here!
Excellent video, keep the great content coming!
Thanks!
Awesome episode
A very interesting video. Thank you very much
Wonderful explanation of gaussian distribution
Freaking fascinating. I imagine these properties would transfer to neural network design as well!
great video analysis
Excellent video, much appreciated!
fascinating and engaging!
I studied advanced mathematics in my last semester but never understood the concepts of random variables and distributions with this much clarity.
Amazing video with great explanation.
I just want to probe the parts of my brain where the picture and sounds form so I can record my dreams and then play them back like a movie.
i have a heavy background in audio production, and i figured this made a lot of sense given the logarithmic nature of how we perceive sound. it’s cool to see that this is just inherent to our brains in general
Nice video! Great pacing
New to this channel and finding this very intriguing!
It seems to even parallel the patterns in how we actually think on the macro level.
Are you familiar with cognitive scientist and UA-camr John Vervaeke?
I bet you two could have a wonderful conversation that both audiences would enjoy!
He speaks much more about the big picture of cognition, yet so much of it involves these similar patterns with a split in extending out vs honing in.
Loved the video and the presentation :) Can biasing the weights of an artificial neural network toward such a log normal distribution provide any advantage?
Exactly what I was wondering!
Awesome video
I've been interested in brain science since I was a kid. This is definitely understandable to a 10 year old kid. Well done! More content please!! And you should have more subscribers!!
damn you must be hella smart for a 10 yr old
great effort
Interesting video!
Why are guys like this so under-subscribed? Wish you success
Amazing video!
Great video!
Lognormal is the central limit theorem for RVs that combine in a multiplicative fashion (as long as the tails aren't too heavy).
This is a very nice connection between logarithmic perception and biological features of humans. I wonder if there is an analogous explanation of the 70-30 rule?
Amazing video. Thank you.
I would guess that one, or a mix, of the following two is at the (physical) root of this:
1) Either this is due to the "serial nature" of things, e.g. they are connected in series and hence physically embody a multiplication: one neuron firing triggers the next one, which triggers the next one, etc. Since it's a multiplication, it becomes lognormal.
2) Alternatively, it could be because of the hierarchical structure of the network (brain). You mentioned there is a spectrum of general (higher-level) vs specialist (lower-level) neurons, and since they are organized hierarchically, there is again this serialness, since a higher-level neuron might be triggered by a lower-level one.
Helpful content, with a good lowering of entry barrier for someone uninitiated. I learned a lot. A small but important point: sum of independent random variables is not normally distributed, but mean of independent random variables is normally distributed.
Technically you're right, since the limit of the sum of the random variables diverges. However, I don't think stressing that point helps with conceptual understanding, since in practice all sums are finite, and then the sum approximately resembles the SHAPE of a normal distribution. Once you normalize it, which is what taking the mean does, you obtain a probability distribution.
Technically it's not the mean but mean times sqrt(n) where n is the number of samples taken to get the mean and under the limit that n is large. Otherwise the mean would just be a point as n becomes very large.
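The precise statement behind this thread: the raw sum diverges and the raw mean collapses to a point, but the standardized sum (S_n - n·mu) / (sigma·sqrt(n)) converges to N(0, 1). A quick numerical check with uniform variables (the distribution and sample sizes are my choices):

```python
import numpy as np

rng = np.random.default_rng(4)

# Standardized sums of uniform(0, 1) variables: subtract the mean n*mu,
# divide by sigma*sqrt(n), and the result should look like N(0, 1).
n, trials = 400, 20_000
mu, sigma = 0.5, np.sqrt(1.0 / 12.0)      # uniform(0, 1) moments
sums = rng.uniform(size=(trials, n)).sum(axis=1)
z = (sums - n * mu) / (sigma * np.sqrt(n))

print(f"mean: {z.mean():.2f}, std: {z.std():.2f}")  # close to 0 and 1
```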
Awesome! Can I ask how do you create these fantastic animations? Thanks!
The shape certainly makes some intuitive sense. Extremely low firing rates are more likely to be mistaken for random noise, so a neuron wants to be above that limit. However, it doesn't want to be too far above it, because firing is energy-intensive and the brain is already a calorie-hungry organ. At the same time, if information is encoded partially in the firing rate, then utilizing only a small subsection of possible firing rates is not information-efficient, so neurons that need to be heard more often would be incentivized to use less-utilized firing rates, as there is less noise in those channels. I don't know whether that explanation would necessarily result in a log-normal distribution as opposed to a low-median normal distribution, but it is interesting to see roughly the shape I was thinking of emerge at the end.
Wow so awesome, thanks a lot 🤙🏻🤙🏻
Whenever there is log-normal behaviour, we can think of the connected and combined behaviour of things, namely an evolutionary step. The order of importance of things can only stabilize in a logarithmic fashion.
Brain is the most complex and fundamental part of our body - Brain
3:21 this video so far is more helpful than the statistics course i took
So glad I'm subscribed. Great video! Would love to see a video on neural field models in the future.
I'm quite impressed how you present all the information, very concise and clear
Cool video.
I first discovered this when my psych professor explained that we experience loudness not additively but logarithmically.
i may of course be wrong, but i do not think that is related.
Rather, the receptor cells themselves lose sensitivity with higher input. i don’t know anything about how those cells work, but it may possibly be due to a limited store of chemicals, as it is with visual receptors.
What it means is that for the things we measure to be lognormal, we are assuming an additive linearity, when there likely exists a more natural measure of the thing in a multiplicative non-linearity, e.g., we are ignoring the fact that the thing is self-interacting, or grows from itself.
Very interesting, thank you for the video. Now I would like to know: could you explain why, when we measure EEG, we measure signals with power up to 50 Hz, notably with an alpha peak at 10 Hz, given that, as you said, the vast majority of neurons fire at around 1 Hz?
I would suppose it's constructed by the summation of neurons with different phases, but it would be interesting to dig a little deeper!
Maybe they are more interested in the minority
Very nice!
An explanation of why the distribution of firing rates in the cortex is log-normal can be found in Roxin, Alex, et al. "On the distribution of firing rates in networks of cortical neurons." Journal of Neuroscience 31.45 (2011): 16217-16226.
Thanks!
learned a lot, ty
You deserve a subscribe
Nice!