Probability Lecture 1: Events, probabilities & elementary combinatorics - 1st Year Student Lecture

  • Published 6 Jan 2024
  • The First Year Probability lectures are for Oxford students of Mathematics, Computer Science and joint degree courses between Mathematics and Statistics and Mathematics and Philosophy.
    Lecture 1 (the first of six we are showing) takes the intuitive notion of randomness (and perceived randomness) in the real world and introduces mathematical models in which events that may be observed are captured as sets of possible outcomes to which we can assign probabilities.
    In situations where the number of possible outcomes is finite and all outcomes are equally likely, this naturally leads to counting problems. We introduce and use factorials and binomial coefficients to count numbers of distinct arrangements of a finite number of objects. General settings and formulas are illustrated by examples (a short illustrative sketch follows this description).
    You can watch many other student lectures via our main Student Lectures playlist (also check out specific student lectures playlists): • Oxford Mathematics Stu...
    All first and second year lectures are followed by tutorials where students meet their tutor to go through the lecture and associated problem sheet and to talk and think more about the maths. Third and fourth year lectures are followed by classes.
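
The description above introduces factorials and binomial coefficients for counting arrangements; the short Python sketch below (my own, not part of the course materials) illustrates those counts and how they turn into probabilities when all outcomes are equally likely.

```python
# Minimal sketch (my own, not from the lecture): counting arrangements and selections.
from itertools import combinations, permutations
from math import comb, factorial

objects = ["a", "b", "c", "d"]  # 4 distinct objects, chosen purely for illustration

# n! counts the distinct orderings of n objects: 4! = 24.
assert factorial(4) == len(list(permutations(objects))) == 24

# The binomial coefficient C(n, k) counts unordered selections: C(4, 2) = 6.
assert comb(4, 2) == len(list(combinations(objects, 2))) == 6

# With finitely many equally likely outcomes, a probability is a ratio of counts,
# e.g. the chance that a uniformly random ordering starts with "a" is 3!/4! = 1/4.
print(factorial(3) / factorial(4))  # 0.25
```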

COMMENTS • 50

  • @user20517 13 days ago +3

    Writing on the whiteboard makes learning and understanding math easier than just quickly reading it from PowerPoint slides, although it takes more effort from the lecturer and the student to write. Thank you

  • @edbertkwesi4931 4 months ago +16

    There is an 80% chance of rain. What an amazing way to begin.

  • @jorgejoaquindelgadogomez4745 2 months ago +3

    Matthias is a fantastic professor!

  • @barryparsons6092 2 months ago +3

    At 11:35, wonderful; the concept of "ink and think".

  • @kwiky5643 1 month ago

    Learning from the best, thanks 💪

  • @mgamgam7642 4 months ago +2

    we love matthias 🙏

  • @ZaibiDesigner 3 months ago

    Hey sir, I just watched your video and I must say that it was really informative and well made. I was wondering if I could help you edit your videos and also create highly engaging thumbnails, which would help your videos reach a wider audience.

  • @matiippolito5625 4 months ago +2

    I love maths and sometimes I don't, but Oxford lecturers do know how to explain things clearly. Humans have a hard time processing probability; it's not in our nature to be able to comprehend these types of concepts 😂.

  • @wlmarvin 4 months ago +1

    aaaaaannnnddd... u got a first ✌️😎

  • @Hermes1548 4 months ago

    ‘Is weather random? Lots of philosophical questions here’ (min. 1:41). That was precisely my reason to click on the opportunity to hear this lecture. Popper (1934, 1959, The Logic of Scientific Discovery) has some piercing critiques of Bayesian-probabilistic scientific methodology.

    • @kishou 4 months ago +1

      Please cite your primary source by Thomas Bayes (e.g. the Doctrine of Chances, etc.).

    • @kavorka8855 4 months ago

      modern AI wouldn't work without Bayesian probability.

    • @Hermes1548 4 months ago +1

      I’m sure what you say is true, but my remark was not on the practical side but on the philosophical side. As Popper says in The Two Fundamental Problems of the Theory of Knowledge, corroboration cannot be based on probability. I have read a philosopher deciding a question of fact via Bayesian probability, which cannot prove confirmation of a fact (event, theory). Footnote 11 of Popper’s 1978 Introduction to the work mentioned above reads:
      At the time of writing The Two Fundamental Problems, and for many years after that, I did not move significantly beyond the following intuitive insights: (1) Newton’s theory is exceedingly well corroborated. (2) Einstein’s theory is at least equally well corroborated. (3) Newton’s and Einstein’s theories largely agree with each other; nevertheless, they are logically inconsistent with each other because, as for instance in the case of strongly eccentric planetary orbits, they lead to conflicting predictions. (4) Therefore, corroboration cannot be a probability (in the sense of the calculus of probabilities). Unfortunately, until recently I have neglected to think through the intuitively very plausible point (4) in detail and to prove it through points (1), (2) and (3). But the proof is simple. If corroboration were a probability, then the corroboration of “Either Newton or Einstein” would be equal to the sum of the two corroborations, for the two logically exclude each other. But as both are exceedingly well corroborated, they would both have to have a greater probability than ½ (½ would mean: no corroboration). Thus, their sum would be greater than 1, which is impossible. It follows that corroboration cannot be a probability. These thoughts may be generalised: they lead to a proof that the probability of even the best corroborated universal laws is equal to zero. Peter Havas (“Four-Dimensional Formulations of Newtonian Mechanics and their Relation to the Special and the General Theory of Relativity”, Reviews of Modern Physics 36 (1964), pp. 938 ff.) has shown that Newton’s theory may be rendered in a form that is very similar to Einstein’s theory, with a constant k that in Einstein’s case becomes k = c (c is the velocity of light) and in Newton’s case k = ∞. But then there will be more mutually exclusive theories with c ≤ k ≤ ∞ that are denumerable, all of them being at least as well corroborated as Newton’s theory. (We avoid randomly distributed a priori probabilities.) In any case, from this set of theories one can select denumerable sets; for example, theories with k = c; k = 2c; …; k = nc; …; k = ∞. Since any two different theories in this infinite sequence are logically inconsistent with each other, the sum of their probabilities cannot be greater than 1. From this it follows that Newton’s exceedingly well-corroborated theory with k = ∞ has a vanishing probability. (Therefore, the degree of corroboration cannot be a probability in the sense of the calculus of probabilities.) It would be interesting to hear what the theoreticians of induction - such as the Bayesians, for instance, who identify the degree of corroboration (or the “degree of rational belief”) with a degree of probability - would have to say about this simple refutation of their theory.
      That was all. @@kavorka8855
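
For readers skimming the long quotation, the footnote's central arithmetic can be restated compactly; the notation below is mine, not Popper's, and the assumptions (mutual exclusivity, corroboration above one half) are the ones stated in the quote.

```latex
% Compact restatement of the arithmetic in the quoted footnote (notation mine, not Popper's).
% Suppose the degree of corroboration C were a probability, and let N and E denote
% Newton's and Einstein's theories, taken as mutually exclusive and each corroborated
% to a degree greater than 1/2. Then
\[
  C(N \lor E) \;=\; C(N) + C(E) \;>\; \tfrac{1}{2} + \tfrac{1}{2} \;=\; 1 ,
\]
% contradicting the fact that no probability exceeds 1; hence, under these assumptions,
% corroboration cannot be a probability in the sense of the calculus of probabilities.
```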

    • @paromita_ghosh 1 month ago

      Why deny science?

    • @paromita_ghosh 1 month ago

      Wtf

  • @GhulamNabiDar 24 days ago

    I want to get the boards used here... Beautiful lecture...

  • @yorha2b278 2 months ago

    Obviously he knows his subject very very well.

    • @unidentified5390 2 months ago

      He's reading off the paper, I think.

    • @yorha2b278 2 months ago

      @@unidentified5390 For the blind, yes.

  • @lakshyamath 4 months ago

    I just tried to come up with a deduction as follows:
    Consider the statement, "80% chance of rain on this Friday". We perceive the following:
    a. the 80% chance is estimated statistically from data (i.e. it is a statistical estimate).
    b. an estimation is performed in a deterministic way, because it involves a computation technique.
    c. so, a statistical estimation is carried out through measurement (or observation) using a computation technique.
    d. furthermore, a measurement involves randomness due to computational approximation and the measuring unit.
    Let me know if the above points have any discrepancies.

  • @michaelkoch6863 2 months ago

    Nice run, but the basics are simple.

  • @subhadipsarkar7692 4 months ago

  • @AzCode-ux3uv 8 days ago

    Hi, how can I love maths? How can I love a difficult thing like this? It takes time to dive in, but most of the time I actually don't understand. Sometimes I get an excited feeling when I understand a maths definition. I don't know how to love it or how to deal with it. How can I explore and dive into maths with a positive feeling most of the time?

  • @miguelgonzalezperez4832 4 months ago

    Excellent!

  • @ansmunir5277 4 months ago +3

    Can you please upload the full lecture series?

    • @OxfordMathematics 4 months ago +6

      We'll be showing six lectures which are a self-contained set (though part of a bigger lecture series of 16).

    • @ansmunir5277 4 months ago +3

      @OxfordMathematics That's great, but why not all 16 lectures? If you upload all the lectures, it will help us understand the subject completely, because all those students who cannot get to Oxford still want to learn from the top university. Thank you.

    • @OxfordMathematics 4 months ago +2

      @@ansmunir5277 We want to give an introduction for people to then go away and find out more. We also have to consider our own students. There are some full courses on the channel, including the Introduction to University Mathematics course.

    • @ansmunir5277 4 months ago +1

      Okay. Thank you. @@OxfordMathematics

    • @srinivasadusumilli7881 4 months ago +1

      Thank you Oxford for these lectures @@OxfordMathematics

  • @vansf3433 4 months ago

    If you write the set difference as A\B instead of A - B, it can also be read as conditional probability, which means that the notation is ambiguous.
    Statistics is subjective, because the same set of data can be interpreted in different ways. How it is interpreted depends on each individual's purposes.

    • @HabibuMukhandi 21 hours ago

      Isn't conditional probability notated A|B, not A\B?
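
To make the distinction raised in this thread concrete, here is a small Python sketch; the die example, the event choices and the prob helper are my own inventions for illustration, not from the lecture. A \ B denotes an event (a set of outcomes), while P(A | B) denotes a number.

```python
# Small sketch contrasting the two notations in this thread (die example is mine):
# A \ B is an event (a set of outcomes), whereas P(A | B) is a number.
from fractions import Fraction

omega = set(range(1, 7))  # sample space: outcomes of one fair die roll
A = {2, 4, 6}             # event "the roll is even"
B = {4, 5, 6}             # event "the roll is at least 4"

def prob(event):
    """Probability of an event when all outcomes are equally likely."""
    return Fraction(len(event), len(omega))

set_difference = A - B                # A \ B = {2}: outcomes in A but not in B
conditional = prob(A & B) / prob(B)   # P(A | B) = P(A ∩ B) / P(B) = (2/6) / (3/6)

print(set_difference)        # {2}
print(prob(set_difference))  # 1/6
print(conditional)           # 2/3
```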

  • @mitsunam7001 4 months ago

    I think someone's brain is not braining right now..... 🤯 🧠

  • @davidbock2863 3 months ago +1

    I'm not sure, but I think that's John Malkovich...

  • @ibrahimcamur34 4 months ago

    Please add Turkish subtitles.

  • @burnere633 4 months ago +3

    Who is the lecturer?

  • @mehradmoini20 4 months ago

    Thanks so much for sharing these valuable lectures. I have two questions:
    1. Will you also kindly share the future lectures of this course on probability?
    2. Would you please also share which textbooks or references are being used for this course, or alternatively, is there a link to the course material one can use?

  • @vult07 4 months ago +3

    Why are they using a whiteboard?

    • @donaldhenderson5039 1 month ago

      A visual reference for everybody, and something to attach sub-points to.

  • @user-hm2gb6pm6b 1 day ago

    College statistics sir

  • @Deviruchee 18 days ago

    I've got an 80% chance of getting an ad on this video.

  • @aryan_71 4 months ago +6

    Any Indians here who are watching this ❤

  • @user-hm2gb6pm6b 1 day ago

    Return
    Umbrellas
    Maths
    Debts
    Mortgage
    Scrap
    The data collected by your camera has recorded missing things of .......of my house so the probability of retrieval is impossible
    Random chance of watching the same rain ......is ............peculiar ......models .......
    How the models appeared in the lobby of university .....how they got funds ????
    AM

  • @painpeace3619 1 month ago

    Too much writing.

    • @sunnysideup4863 19 days ago

      You’ll fail no matter what. I feel bad for u