Gaussian Process Summer School

Videos

GPSS2024 Using GP emulation in cardiovascular modelling
88 views · 1 month ago

GPSS2024 Emulating cohorts of cardiac digital twins using Gaussian Processes
96 views · 1 month ago

GPSS2024 Multi-task Gaussian processes
174 views · 2 months ago

GPSS2024 Bayesian Optimization and Beyond
203 views · 2 months ago

GPSS2024 Biological applications of Gaussian process modelling
83 views · 2 months ago

GPSS2024 From geostatistics to graphs: Gaussian processes in practice
161 views · 2 months ago

GPSS2024 Gaussian processes and non-Gaussian likelihoods
196 views · 2 months ago

GPSS2024: A first introduction to Gaussian processes
967 views · 2 months ago

GPSS2024: Adjoint aided inference of Gaussian process driven differential equations
145 views · 2 months ago

GPSS2024: A second introduction to Gaussian processes
362 views · 2 months ago

Robust Empirical Bayes for Gaussian Processes
392 views · 1 year ago

Bayesian Optimization and Beyond
731 views · 1 year ago

A conversation with Neil Lawrence
274 views · 1 year ago

Computationally efficient GPs
404 views · 1 year ago

Latent space geometry
864 views · 1 year ago

Deep generative modelling aiding GPs and spatial statistics
408 views · 1 year ago

Gaussian Process Emulators for Cardiac Digital Twins: Enabling Scalable Patient-Specific Modeling
320 views · 1 year ago

Modelling London’s Air Quality Using Spatio-Temporal Gaussian Processes
266 views · 1 year ago

Identifying Dynamic Systems for Digital Twins of Engineering Assets
150 views · 1 year ago

Bayesian Surrogate Modelling of Computer Experiments using Gaussian Processes
225 views · 1 year ago

Multi-task Gaussian processes
1.2K views · 1 year ago

Introduction to Gaussian Processes
1.5K views · 1 year ago

A second introduction to Gaussian Processes
1.6K views · 1 year ago

Opensource tools at Secondmind
522 views · 3 years ago

GPSS 2021: Introduction to GPs
8K views · 3 years ago

GPSS2019 - Invariances in Gaussian processes and how to learn them
750 views · 5 years ago

GPSS2019 - Active Multi-Information Source Bayesian Quadrature
599 views · 5 years ago

GPSS2019 - Constraining Gaussian Processes by Variational Fourier Features
714 views · 5 years ago

GPSS2019 - State Space Methods for temporal GPs
1.4K views · 5 years ago

COMMENTS

  • @user-nq4ct9xf7y · 1 month ago

    Would have been good to see a clear presentation of the posterior mean calculation and the NLL calculation *without* using inducing points.

  • @mekersemito · 5 months ago

    Can anyone say something about the kernel k(x, x')? What do x and x' mean here? I thought they were two inputs of a random variable that gives a value, but I saw something like a vector form, k(x, x') = x^T x'. Does it mean that x is the observed points and x' is the point for prediction?

  • @charlesity · 5 months ago

    Arguably the best presentation on this subject.

  • @mahdibahrampouri6627 · 5 months ago

    Such a great presentation. I wish I could see the rest of it.

  • @bencavus · 7 months ago

    Thank you!

  • @be1tube · 9 months ago

    I loved learning that a diagonal noise term can help with ill-conditioned matrix inversions.

  • @joe_hoeller_chicago · 10 months ago

    Great video on causality!

  • @WahranRai · 11 months ago

    If you were targeting an international audience, you should articulate and slow down to make it easier to understand!

  • @matej6418 · 1 year ago

    Can your model deal with exogenous control variables? These are often denoted u(t) in the literature.

  • @bryanshi3774 · 1 year ago

    Very good introduction to GPs.

  • @pariseuselain1759 · 1 year ago

    Where are the slides, please? 🥲

  • @franard4547 · 1 year ago

    I really appreciate this. I've been studying GPs with a lot of confusion, and this shed some light on it for me.

  • @GreenFlyter · 1 year ago

    Thanks!!!

  • @jonathancangelosi2439 · 1 year ago

    I appreciate how thorough this video was. Many tutorials on GPs tend to handwave a lot of the mathematical details of Gaussians and use sloppy notation (which is a huge problem with machine learning education in general, in my opinion).

  • @l.yans47 · 2 years ago

    20:32

  • @khanwaqar7703 · 2 years ago

    It's amazing. May I contact this professor with some questions?

  • @rudolfreiter5217 · 3 years ago

    Great talk! Thank you

  • @hosseinrezaie7958 · 3 years ago

    Very, very nice, thanks Dr!

  • @charilaosmylonas5046 · 3 years ago

    Amazing and insightful presentation! Thanks for publicly sharing this!

  • @origamitraveler7425 · 3 years ago

    Very important topic, thanks for the talk

  • @origamitraveler7425 · 3 years ago

    Woah! Great introduction!

  • @origamitraveler7425 · 3 years ago

    Great talk! The first 30 minutes really helped ease into the topic.

  • @eduardomedina5081 · 3 years ago

    Nice explanation! Very useful for my thesis :)

  • @AAAE2013 · 3 years ago

    Thanks for the nice explanation.

  • @AAAE2013 · 3 years ago

    Thanks for the nice explanation.

  • @rohannuttall2577 · 3 years ago

    Starts at 4:54

  • @pattiknuth4822 · 3 years ago

    Would be nice if they picked a speaker who could speak English properly. Very hard to understand (and I'm a native English speaker).

    • @prashantmdgl9 · 3 years ago

      I don't find anything wrong with the diction of the speaker. It seems you haven't worked in an international environment.

    • @diegoacostacoden8704 · 3 years ago

      Haha, he doesn't know what he's saying.

    • @ceskale · 1 year ago

      It's more about the quality of the audio; the English is pretty good.

  • @AntifachoOi · 3 years ago

    A really good complement to Rasmussen & Williams' Gaussian Processes for Machine Learning, Chapter 3, which is quite involved.

  • @microndiamondjenkins566 · 3 years ago

    I don't see the slides on the website; the speaker says they are there...

  • @mohsenvazirizade6334 · 3 years ago

    Thank you so much for this wonderful video. In most of your figures, you have about 10 different colors moving along the x-axis. On each slice (a vertical line at x = x_j, let's say j is 5) we have 10 points in 10 different colors that are normally distributed, while being correlated, through a kernel, with the 10 points at x_i (i = 0, 1, ..., 5, ..., n), and let's say n is 20. My question is: how do you generate these points? In total we have 10 (colors) * 20 (n) = 200 points which have to satisfy two conditions: 1) being normally distributed on each slice, and 2) following the correlation given by the kernel. Thank you.

    • @shankyxyz · 3 months ago

      The lines are sampled independently: each line is one draw from a multivariate normal distribution whose dimension equals the number of points on the line. So the lines have no relation to each other (a short code sketch of this sampling procedure is included after the comments).

  • @yanhongzhao3141 · 3 years ago

    The Boris Johnson comment is gold :))))

  • @hossanatwino · 4 years ago

    Thank you for these classes, very helpful - and probably to COVID for enabling them to go up on YouTube :-)

  • @miguelbatista9493 · 4 years ago

    Great talk. Not easy to find structured material on this.

  • @MCPEStuff · 4 years ago

    Cool!

  • @sourabmangrulkar9105 · 4 years ago

    Great lecture. Thank you :)

  • @klingonxxxxxx · 4 years ago

    My poor English + the teacher's unclear speaking + fast UK English = I'm not able to enjoy the lesson :(

  • @nicolassoncini2266 · 4 years ago

    A great introduction to GPs, it's concise and very visual. Kudos to you Dr Wilkinson! Thanks for uploading these, I hope to attend one someday :D

  • @barath_ · 5 years ago

    Hi, are the slides available?
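
To illustrate the sample-path question in the @mohsenvazirizade6334 thread above: each colored curve in those figures is one independent draw from a multivariate normal whose covariance matrix is built by evaluating the kernel on the grid of x locations. The sketch below is only an illustrative assumption (the RBF kernel, lengthscale, grid, and jitter value are not taken from any of the talks).

```python
import numpy as np

# RBF (squared-exponential) kernel matrix between two sets of 1-D inputs.
def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    sqdist = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / lengthscale ** 2)

x = np.linspace(0.0, 5.0, 20)          # 20 input locations (the "slices")

# Covariance of the 20-dimensional Gaussian, plus a small diagonal "jitter"
# term so the matrix stays well conditioned (the trick mentioned in the
# comment about ill-conditioned matrix inversions).
K = rbf_kernel(x, x) + 1e-8 * np.eye(len(x))

# Draw 10 independent samples (the 10 colored curves): each row is one draw
# from N(0, K), so values at nearby x are correlated through K, while the
# different curves are independent of each other.
rng = np.random.default_rng(0)
samples = rng.multivariate_normal(np.zeros(len(x)), K, size=10)

print(samples.shape)                   # (10, 20): 10 curves at 20 points each
```

Plotting each row of samples against x reproduces the kind of figure discussed in that thread.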