No advanced presentation software, no Python or R and the simplicity that you've used to illustrate the idea is fantastic. Thank you for the clarity.
I haven't seen in any of his videos requesting for like and subscribe. His work pretty much gets that!! Hats off to you man!
This video really allowed me to connect some dots here - this concept is also why the assumptions in linear regression are so concerned with the residuals (which was counterintuitive to me when I first learned about it - "why make assumptions about the random component?")... but now it makes sense - if you create a model and are only left with white noise in the residuals, you can say that your model doesn't omit some pattern in the data and thus it captures all the key information you can get from your data. Thanks!
This is by far the greatest explanation of white noise available on the Internet!
was about to cry over my time series hw then i found these vids. feeling a lil better now thank you
You are a blessing to us all, thank you mightily for doing such a wonderful, selfless service.
Wow, thank you
After three weeks of trying to understand this... all I needed was this video! Thank you!
This is really helpful, and the relationship between white noise and AR models is very well explained once you watch the video!!!! Thank you so much.
Thank you very much for making and uploading this very useful video!
Thank you for these amazing videos. May I request, if possible, that these Time Series videos be put under one playlist, so there is an order that can be followed to understand the concepts? Thank you once again for these simple explanations - amazing.
Thank you! The playlist should be ordered now.
now it is done.
@@ritvikmath thank you
This is an amazing lecture for understanding the white noise concept. I appreciate your kind and detailed explanation. Will definitely recommend it to peers.
You cannot imagine how long it took for others to explain it, and they still ended up with something incomprehensible! Thank you.
You explain really well these concepts. They actually sink in, thanks a lot!
Every one of your videos is so clear. Straight to the point! Thanks
Your videos are extremely helpful, appreciate your work!
You're extremely brilliant. Thank you so much. From Vietnam
Thank you! 😃
You're an amazing teacher! Thank you for the content!
great channel, great videos, thank you so much!
Thank you for this amazing explanation!
You're very welcome!
Amazing, I wish you were my teacher instead of my actual one
real teachers are on youtube xD
Thank you for keeping it simple and on point :)
You have done a great job! Thanks man!
Looking forward to seeing more econometrics topics.
More to come!
Amazing as always, thank you.
Very helpful videos for a person studying one night before a quiz
Great video, it has been very helpful to me in understanding these concepts. Thanks a bunch
Good job for sure. So easy to understand
Glad it was helpful!
You could also define white noise as a signal whose power spectral density is uniformly distributed across all frequencies.
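To illustrate that definition (my own quick numpy sketch, not from the video, assuming Gaussian white noise): the periodogram of a simulated white noise series is roughly flat, hovering around the variance at every frequency.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096
x = rng.normal(0.0, 1.0, n)  # Gaussian white noise: mean 0, variance 1

# One-sided periodogram: |DFT|^2 / n. For white noise, every frequency
# bin has expected value equal to the variance, i.e. the spectrum is flat.
psd = np.abs(np.fft.rfft(x)) ** 2 / n

# The average level should be close to the variance (1.0),
# and the low-frequency half should look like the high-frequency half.
print(psd.mean())
print(psd[: len(psd) // 2].mean(), psd[len(psd) // 2 :].mean())
```

Any single bin is noisy (roughly a scaled chi-squared with 2 degrees of freedom), but averaged over bins the spectrum sits near the variance, with no preferred frequency band.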
This is really helpful and easy to understand!
Your concepts are very clear. Rare gem. Followed you. Please keep posting... happy to pay for any courses you sell
My most precious 7:35 - it summed up what took 20 minutes at school
Amaaaazing! Thank you very much.
Very nice explanation. I'm much clearer now on what's going on with white noise. Keep it up! *thumbs up*
@Yunjae Cho nope
Your videos are amazing!!!
Can you suggest a book, or rather books, for econometrics from which I can learn it from scratch?
Thanks... Very helpful in my PhD.
It's really helpful.... Thank you Sir....😊👍
Thank you!
Hmm... you have a good skill for making complex concepts easily understandable; I wonder if you can do the same when going into more detail. Excellent job, man, keep posting more please.
First time someone has used words to explain something in statistics, and not only symbols. Maths is not difficult; the only problem is that it's often taught for only one kind of mind...
I feel that WN is kind of like an identity element for the class of random variables. Do we still have WN if we square it, or exponentiate it some n times? I don't think closure under addition works, since -WN is also white noise, and that will add to 0 (and 0 everywhere fails the 3rd property), while scaling seems to be fine. But if we just exclude additive inverses, everything seems to work. I'm trying to get at this idea of an Identity Class of, like, a field, where it's closed under all the operations of that field (but open to the field itself, of course), and a moment, which is an associated measure (or set of measures) (that allows neg. vol.s) defining a mapping to that field that, for each member of the Identity Class, is 0. And then linear combinations of the identity class and the field span the superclass (random variables in this case).
Great explanation! But why can't a white noise series be used for prediction when it satisfies all the conditions of a stationary time series?
Hello, great video, keep doing amazing work. I have one question: generally, we want our time series to be stationary to make better predictions, right? In this video, you said that white noise is a stationary time series, but we cannot make any predictions from it, which is a little counterintuitive. Can you clarify this part?
Thank you
Hi, thank you for the kind words! And a great question. Just because a time series is stationary doesn't necessarily mean it is useful for making predictions about its future values. Put another way, stationarity is a necessary condition for using our ARMA-family models, but is not sufficient to guarantee that these models will succeed. White noise is a stationary process which has no predictive power.
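One way to see this reply's point (my own numpy sketch, not from the video): fit a lag-1 coefficient to simulated white noise by least squares. The coefficient comes out essentially zero, so the past gives you nothing to predict with, even though the series is stationary.

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.normal(0.0, 1.0, 10_000)  # simulated white noise

# Least-squares fit of x[t] ≈ phi * x[t-1] (a lag-1, AR(1)-style fit).
# For white noise the best phi is essentially 0: yesterday's value
# carries no information about today's, despite stationarity.
phi = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])
print(phi)
```

The fitted phi lands within sampling noise of zero (roughly ±1/√n), so the "best" forecast collapses to the mean.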
@@ritvikmath Thank you for your quick response. Generally, we want stationarity; otherwise the results of your regression will be considered 'spurious'. In my opinion, a counterexample in which white noise will have predictive power is the following.
Suppose a stock follows a stochastic process (white noise) with μ = 0 and constant variance σ². We know that the price fluctuates within some interval (lowerb, upperb), so if the price equals lowerb we buy, and if the price equals upperb we sell. Therefore we make a pretty good prediction and also make money :D
Can you tell me where my thinking process is invalid?
BRILLIANT!!!! THANK YOU
Really great explanation!
But in my opinion, to be perfect, you should have talked a little about IID noise, Gaussian white noise, and all that.
I think people usually get confused, thinking that white noise implies a Gaussian distribution. It's worth emphasizing that white noise only implies that the correlation is zero. So IID noise is always white noise (assuming the mean is also 0; independence implies uncorrelatedness), but the opposite is not true (uncorrelated does not imply independent) unless the distribution is Gaussian.
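A quick numpy sketch of that last point (my own illustration, assuming a standard normal X): X and Y = X² are uncorrelated, yet Y is completely determined by X, so they are anything but independent.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 100_000)
y = x ** 2  # fully determined by x, so clearly NOT independent of x

# Uncorrelated: Cov(X, X^2) = E[X^3] = 0 for a symmetric distribution,
# so the sample correlation is near zero...
r_xy = np.corrcoef(x, y)[0, 1]

# ...but the dependence is easy to expose: X^2 and Y match perfectly.
r_x2y = np.corrcoef(x ** 2, y)[0, 1]
print(r_xy, r_x2y)
```

Zero correlation only rules out a *linear* relationship; it says nothing about nonlinear dependence unless the joint distribution is Gaussian.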
Hope there will be a book with an explanation as good as his.
haha thanks!
Great video. I have a question about the mean check (whether the mean is constant over time). In the previous video, you mentioned that we need to compare the mean over two periods of time to see if it's the same, or nearly so, in both periods.
Based on this understanding: what are the criteria for defining those two time periods? Because at 1:55, in the first graph, if I take only the negative-Y region of the curve, the mean will be nonzero, while if I take an entire revolution of the curve, the mean will be 0.
This doesn't satisfy the criteria. I'm thinking along the wrong lines somewhere but not able to discern where.
Thanks.
you are fantastic
I see hope for my econometrics course
Great video! Could you please do a video on pink noise and Brownian noise?
Amazing! Thx for this!!
Very helpful. Thank you.
Hey, I've got a question about sound in general and its effect on other sound.
Let's say you're sitting in a quiet room playing bongos or something, and the sound of the bongos can be heard in a room sharing a common wall. Okay?.. For some reason, your friend (who happens to be in the same room you're in) decides to turn on a fan. A big, loud industrial blower. It's not really annoying, but it is pretty loud. Loud enough to drown out the sound of your bongos.
Does it affect the sound of the bongos heard in the room next to yours? If so, how?
If the fan produces a noise that really isn't noticed by the people in the other room, but it drowns out the sound of the bongos in your room, does it affect what the neighbors hear?
I know that's the same question twice. But I feel I need to ask it in a way I would understand if someone were to ask me that question.
I think this is more of a frame-of-reference question and not so much a statistical-noise question. I would guess the answer is yes and no. The sound waves produced by the blower clash with the sound waves from the bongos in the room you're in. Conversely, they are muffled by doors, insulation, etc., in the bongo room, so there is less of an effect in the other room.
To attempt an answer with another question: if you and a friend are having a conversation, does that conversation become drowned out by another pair of people having a conversation on the opposite side of the world?
Could you please explain how there is correlation at 2:10?
According to the definition,
r = Σ[(Xi − Xbar)(Yi − Ybar)] / √( Σ(Xi − Xbar)² · Σ(Yi − Ybar)² ).
So if it is a sine wave, the values there (the points you showed) would be -1, -1, -1, -1..., and Xbar (the mean) is -1. Doesn't the numerator become 0, and hence r become 0? Please help!
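For what it's worth, here's a small numpy sketch (my own, not from the video): the correlation in question is between the series and a lagged copy of itself, computed across *all* time points, not just the few points at one spot on the curve - and for a sine wave it is strongly nonzero.

```python
import numpy as np

t = np.arange(500)
x = np.sin(2 * np.pi * t / 50)  # sine wave with period 50

def lag_corr(series, k):
    # Correlation between the series and a k-step-lagged copy of itself,
    # computed over all overlapping time points (not at one instant).
    return np.corrcoef(series[:-k], series[k:])[0, 1]

print(lag_corr(x, 1))   # near +1: neighboring points move together
print(lag_corr(x, 25))  # near -1: points half a period apart are opposite
```

Restricting to a flat stretch of the wave makes both series nearly constant, which is why the single-spot calculation degenerates; over the whole series the lagged relationship is clear.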
The mean can also take values other than 0, as long as it stays constant.
Hmm, I've learned white noise as having mean 0 because it means you aren't consistently predicting too high or too low in your model. For stationarity however, the mean can be different from zero.
For the sin wave graph, why is the standard deviation constant with time? Thank you in advance.
The oscillation remains the same throughout time. See the second figure: there is a sudden increase in the values and then a dip. This is not the case with a sine/periodic wave.
Hi ritvikmath, your explanation is very nice.
Please recommend the best Time Series books, and share any PowerPoint slides you have prepared, for better theoretical understanding.
You're a genius. Your channel should be government funded! Any way to make a donation? Great idea to add Python demos to these concepts. Thanks again
Thanks. Was very helpful.
Can you point me towards a resource that has a numerical example checking for ACF?
Is white noise important in the Holt-Winters model?
Is "the correlation between lags is zero" the same as "there is no seasonality"? As in the assumptions of stationarity.
So white noise is about 5% "unpredictable" and is therefore considered unpredictable. My question is: WHEN is it actually unpredictable in an example? When does this 5% show up as unpredictable? Thank you.
What is the difference between white noise residuals and stationary residuals?
It is apparent that every one of your subscribers, and then some, have watched this video
Can white noise have a non-zero but constant mean while satisfying the other two assumptions? Thanks.
Awesome explanation
Very good, my friend 🌹🌹
Great tutorial! Thank you! May I ask what method would be good for capturing the increase in volatility in the second example?
Global vs. local tests: the standard deviation would be different at different local points and also wouldn't match the overall global value.
Thanks for sharing!
just perfect!
Really straightforward, thanks!
Thank you very much! To the point
thanks ! happy to help :)
If white noise is stationary, doesn't that mean we can capture and model it? And isn't a stationary signal predictable?
So, does that mean I can add 1 to each data point in white noise to violate the first criterion, in order to have a predictable time series???
I loved it!!! Can you please make a PDF of those cards? Many could use it for revision...
Thank you, hoping for more videos
Thanks! good video...
wao
Kindly explain why you concluded the mean to be zero in graph 1 and graph 2.
How do you estimate the mean from a time series graph?
Just to confirm, in the visual test you did, the graphs are graphs of only the residuals of the model, right? So it's the time series of the residuals? And the other method is to plot the ACF of the residuals to see if the time series of the residuals is correlated with any of the lags of the original time series?
thank you, very good!
Hello :)
Very good explanation. I am about to write a seminar paper about stationarity, and your video was a big help in getting into the topic. Do you have any recommendations for sources I should look up?
I would really appreciate a hint ;)
thank you sooooooooo much 💜
Does white noise have a normal distribution??
But how do you detect the white noise?
mean could be constant as well
subbed
Are white noise and IID noise the same?
nicely explained
THAAANK YOUUUUU
welcome!
Hi, your explanations are really good. However, your videos are scattered. Is it possible to provide the entire explanation of a particular topic, or make it exam-centered? It would be really helpful.
The time series playlist should be organized now!
This is perfect!
iid ?
Could you give me the definition in three sentences?
Pleazzze.
cool !!!
Why is it good to have proved that a model's residuals are white noise?
Did you get the answer to your question? Because I need to understand too - please share with us.
If you figure out a model for the time series and your error is white noise, then you are essentially done. White noise has pretty much no information content. It's just TV static. :))
Is the sum of 2 white noises also a white noise?
You should put together complete classes, with exercises..
Not clear about the mean, sir.
can you clarify your question? maybe I can help
What's a lag? 😭
This noise is kinda racist.