How to Construct Every Randomness | Transformations of Random Variables, Random Number Generators

  • Published 19 Dec 2024

COMMENTS • 56

  • @Jacob.Cornejo
    @Jacob.Cornejo 8 months ago +6

    wow this makes integration by substitution so clear

  • @jedediahjehoshaphat
    @jedediahjehoshaphat 8 months ago +25

    Very insightful, just finished my Advanced Measure Theory paper in university. Wasn't expecting to find applications here, but it surely supplemented my knowledge.

  • @neoMushroom
    @neoMushroom 8 months ago +6

    wow i finally understand what transformation of rv geometrically means because of this video

  • @emanuellandeholm5657
    @emanuellandeholm5657 8 months ago +12

    There is a method in image processing called histogram equalization, which is basically taking an image and processing it such that its histogram becomes more uniform. This can be useful for discarding things like shadows from projections when doing feature detection, as well as a way to salvage overexposed images where the histogram is digitally clipped.
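    A minimal sketch of that idea (editor's illustration, not from the video; assumes NumPy and an 8-bit grayscale image): mapping each pixel through the empirical CDF of the intensities is the discrete analogue of the probability integral transform, which is what flattens the histogram.

      import numpy as np

      def equalize(img):
          # img: uint8 array; histogram over the 256 possible intensity values
          hist, _ = np.histogram(img, bins=256, range=(0, 256))
          cdf = hist.cumsum() / hist.sum()          # empirical CDF of intensities
          return (cdf[img] * 255).astype(np.uint8)  # push every pixel through the CDF

      # Toy check: a dark, low-contrast image gets spread across the full range
      dark = np.clip(np.random.normal(40, 10, (64, 64)), 0, 255).astype(np.uint8)
      print(dark.min(), dark.max(), "->", equalize(dark).min(), equalize(dark).max())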

  • @cartatowegs5080
    @cartatowegs5080 8 months ago +94

    Guess my sleep is delayed by 26 minutes

  • @blacklistnr1
    @blacklistnr1 8 months ago +5

    2:38 Really solid intro! As a side note: one probability approach I'd like to see more often is setting bounds for classes of the input space.
    e.g. the ball reaching a point 500m away from the goal? put some bounds on the initial kick energy + wind resistance + time of flight and you get 0%
    or another class: the ball reaching the top left corner: it needs one of -> which if you diff backwards -> you get of input ranges -> which covers of the input space, so it's now a question of how commonly/easily those initial conditions happen
    This way you get to progressively shape the actual distribution, even if you don't know it (as opposed to the usual "simplify and it's just a model")
    P.S. I just like visual math videos, I don't do math professionally

  • @Kapomafioso
    @Kapomafioso 8 months ago +1

    My favorite formula of random variable transformation is (from any dimension to any dimension)
    f_Y(y1, ..., yn) = integral dx1 ... dxm f_X(x1, ..., xm) delta(f1(x1, ..., xm)) delta(f2(x1, ..., xm)) ... delta(fn(x1, ..., xm))
    where f1, ..., fn encode the functional relationship between x1, ..., xm and y1, ..., yn.
    This can go from 1->1 random variable. Or 2->1. Usually n
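    A minimal numerical sketch of the 2 -> 1 case of this formula (editor's illustration, assuming NumPy): for Y = X1 + X2 with independent Uniform(0, 1) inputs, integrating out delta(y - x1 - x2) leaves the convolution of the two densities, which should reproduce the triangular density on [0, 2].

      import numpy as np

      def f_uniform(x):
          # density of Uniform(0, 1)
          return np.where((x >= 0.0) & (x <= 1.0), 1.0, 0.0)

      xs = np.linspace(-1.0, 3.0, 4001)   # integration grid for the remaining variable
      dx = xs[1] - xs[0]

      def f_Y(y):
          # f_Y(y) = integral f_X1(x) * f_X2(y - x) dx  (delta already integrated out)
          return np.sum(f_uniform(xs) * f_uniform(y - xs)) * dx

      for y in (0.25, 0.5, 1.0, 1.5):
          exact = y if y <= 1.0 else 2.0 - y   # triangular density on [0, 2]
          print(f"y={y:.2f}  numeric={f_Y(y):.4f}  exact={exact:.4f}")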

  • @MsSlash89
    @MsSlash89 8 months ago +6

    The thumbnail tricked me! As an algebraist, the word "Rng" made me believe there was some algebraic structure underneath; I came out disappointed, but also happy to have learned something new!

  • @lees4416
    @lees4416 8 months ago +8

    Little Prince distribution. Sounds good, actually

  • @rugbybeef
    @rugbybeef 8 months ago +10

  • @intrepiddt
    @intrepiddt 8 months ago +3

    Great explanations - thank you!

  • @fardinahsan2069
    @fardinahsan2069 8 months ago +6

    N(100,15), the IQ curve, we meet again

  • @AnythingGoesCodes
    @AnythingGoesCodes 3 months ago +1

    10:02 Does that mean P(-1)=0.5 ?

  • @artmowo2779
    @artmowo2779 8 months ago +8

    good illustration, thanks!

  • @blacklistnr1
    @blacklistnr1 8 months ago +8

    4:42 "times the indicator function from a to b" So this is how mathematicians do ifs :))

  • @coltonmartin864
    @coltonmartin864 3 months ago

    At 14:00, where was the function f_X(x) pulled from? What does this function refer to?

  • @pegrat
    @pegrat 8 months ago +4

    Nice hat in the thumbnail

  • @Vincent-kl9jy
    @Vincent-kl9jy 8 months ago +3

    I would love you to talk about Fokker-Planck Equations in a future video

    • @grayjphys
      @grayjphys 8 months ago

      my thoughts too, especially if things like this could be useful in solving them

  • @APaleDot
    @APaleDot 8 months ago +3

    9:00
    It should technically be called an "affine" transformation, not "linear"

    • @HaramGuys
      @HaramGuys 8 months ago +1

      All about context.
      Not everything is in linear algebra language, and in the context of probability theory, linear is more common.
      Piecewise linear manifolds, linearization of differential equations: all of these concepts are technically affine maps, but no one calls them affine.

  • @thenationalist8845
    @thenationalist8845 8 months ago +4

    Very interesting 🤓

  • @Ivan_1791
    @Ivan_1791 8 months ago +2

    How are all your videos so great?

  • @grayjphys
    @grayjphys 8 months ago +1

    I'm wondering if transformations like this could be useful in solving nonlinear odes

  • @minecrafting_il
    @minecrafting_il 8 months ago +3

    That is just what I wanted!

  • @mehdimabed4125
    @mehdimabed4125 8 months ago +1

    Very cool video! Actually, I'm struggling to derive a formula for the CDF (or PDF) of the product of two random variables, and to explore some sort of algebra of random variables (I know there is a book with this name, but I found nothing really satisfying for the product of two random variables...); by taking the log maybe?
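    A hedged sketch toward this question (editor's illustration, assuming NumPy, not from the video): for independent X and Y, the product Z = XY has density f_Z(z) = integral f_X(x) f_Y(z/x) / |x| dx, which comes from the same change-of-variables machinery; for two Uniform(0, 1) factors the exact answer is -ln(z) on (0, 1), so a quick numerical check is possible.

      import numpy as np

      xs = np.linspace(1e-6, 1.0, 200_000)   # support of X, avoiding x = 0
      dx = xs[1] - xs[0]

      def f_Z(z):
          # f_Y(z / x) for Y ~ Uniform(0, 1) is the indicator 0 <= z/x <= 1
          inside = (z / xs >= 0.0) & (z / xs <= 1.0)
          return np.sum(inside / xs) * dx

      for z in (0.1, 0.5, 0.9):
          print(f"z={z}  numeric={f_Z(z):.4f}  exact={-np.log(z):.4f}")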

  • @kasiphia
    @kasiphia 8 months ago +4

    Interesting.

  • @boium.
    @boium. 8 months ago +2

    4:10 it's not an integral from -∞ to the dummy variable, but to x. In this case, t is the dummy variable.

    • @98danielray
      @98danielray 8 months ago

      dummy variable as in the argument of the function. that was pretty understandable.

  • @bluekim9771
    @bluekim9771 8 months ago

    1. If we know the function Y = g(X), then we can calculate f_Y(y) from f_X(x).
    2. We can generate numbers with an algorithm (linear congruential generator) or by a natural phenomenon.
    So if x is generated by a phenomenon ->
    the distribution of x, which is f_X(x), will be made ->
    but we want the distribution to be f_Y(y) ->
    then we have to find a function g where Y = g(X)?
    Is that how we can make a generator for any probability distribution?
    And why is this related to inverse integrals?
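    Editor's sketch of the standard construction this question is circling (inverse transform sampling, assuming NumPy): take raw uniform numbers from whatever generator you have, then push them through the inverse CDF of the distribution you want; the inverse CDF is where the "inverse integrals" come in, since the CDF is the integral of the density.

      import numpy as np

      # inverse transform sampling: U ~ Uniform(0, 1), X = F_inverse(U)
      # target here: Exponential(rate), whose inverse CDF is -ln(1 - u) / rate
      rng = np.random.default_rng(0)     # stand-in for any uniform generator
      u = rng.random(100_000)
      rate = 2.0
      x = -np.log(1.0 - u) / rate

      print(x.mean())   # should be close to 1 / rate = 0.5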

  • @peeepeeepooopooo
    @peeepeeepooopooo 8 months ago +3

    BASED

  • @friggy1899
    @friggy1899 8 months ago +2

    Fangraphs sighting!!!!

  • @frba9053
    @frba9053 8 months ago +1

    Good to know

  • @lih3391
    @lih3391 8 months ago +2

    ❤ awesome

  • @gmdFrame
    @gmdFrame 8 months ago +1

    You're so cool!!!

  • @johncorn7905
    @johncorn7905 8 months ago +9

    Ok but what is the probability i can get a gf

    • @speye
      @speye 8 months ago +4

      non-zero 😊

    • @montadermajed9456
      @montadermajed9456 8 months ago

      Mathematically: 50%

    • @johncorn7905
      @johncorn7905 8 months ago

      @@montadermajed9456 what makes you say that

    • @johncorn7905
      @johncorn7905 8 months ago

      @@speye i appreciate the confidence

    • @montadermajed9456
      @montadermajed9456 8 months ago

      @@johncorn7905 i have absolutely no idea

  • @DonQuiGoddelaManCHAD
    @DonQuiGoddelaManCHAD 8 months ago +3

    why did you make the thumbnail a hat

    • @HaramGuys
      @HaramGuys 8 months ago +6

      "I showed the grown ups my masterpiece, and I asked them if my drawing scared them. They answered:'why be scared of a hat?' My drawing was not a picture of a hat. It was a picture of a boa constrictor digesting an elephant." - Antoine de Saint-Exupéry, The Little Prince

    • @MouhibBayounes
      @MouhibBayounes 8 months ago +2

      Great story, especially if you know French. @@HaramGuys

  • @abdulrhmanaun
    @abdulrhmanaun 7 months ago

    ❤❤

  • @zhonyss
    @zhonyss 8 months ago +2

    I think I'm stupid

  • @MatthisDayer
    @MatthisDayer 8 months ago +1

    Mersenne Twister kinda sucks

    • @MatthisDayer
      @MatthisDayer 8 months ago +1

      it's needlessly overcomplicated
      it's not random at all in the lower bits
      it can get stuck producing only zeroes for millions of iterations
      it's hard to seed properly
      it needs so much memory that it doesn't fit in registers
      it's kinda slow
      it adds unnecessary binary size to an application using it
      you really don't need equidistribution in 623 dimensions; 4 is enough for any computation that lasts less than a human lifetime
      look for xoroshiro128 or xoshiro256 for much better alternatives.
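      For reference, a minimal pure-Python sketch of xoshiro256** (editor's illustration of the family recommended above, following the published Blackman/Vigna algorithm; a real seeder would expand one seed with splitmix64 rather than the hard-coded state used here):

        MASK = (1 << 64) - 1

        def rotl(x, k):
            return ((x << k) | (x >> (64 - k))) & MASK

        class Xoshiro256StarStar:
            def __init__(self, state):
                # state: four 64-bit words, not all zero (illustrative fixed seed below)
                self.s = [w & MASK for w in state]

            def next_u64(self):
                s = self.s
                result = (rotl((s[1] * 5) & MASK, 7) * 9) & MASK
                t = (s[1] << 17) & MASK
                s[2] ^= s[0]
                s[3] ^= s[1]
                s[1] ^= s[2]
                s[0] ^= s[3]
                s[2] ^= t
                s[3] = rotl(s[3], 45)
                return result

            def next_float(self):
                # top 53 bits -> uniform double in [0, 1)
                return (self.next_u64() >> 11) * (1.0 / (1 << 53))

        rng = Xoshiro256StarStar([0x9E3779B97F4A7C15, 0xBF58476D1CE4E5B9,
                                  0x94D049BB133111EB, 0x2545F4914F6CDD1D])
        print([round(rng.next_float(), 4) for _ in range(3)])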