CAMLab, ETH Zürich
ETH Zürich AISE: Applications of AI in Chemistry and Biology Part 2
↓↓↓ LECTURE OVERVIEW BELOW ↓↓↓
ETH Zürich AI in the Sciences and Engineering 2024
*Course Website* (links to slides and tutorials): www.camlab.ethz.ch/teaching/ai-in-the-sciences-and-engineering-2024.html
Guest lecturer: David Graber
▬ Lecture Content ▬▬▬
0:00 - Introduction
0:47 - Recap: proteins, AlphaFold and GNNs
4:42 - RFdiffusion for backbone design
16:59 - ProteinMPNN for sequence design
34:57 - Summary
▬ Course Overview ▬▬▬
Lecture 1: Course Introduction ua-cam.com/video/LkKvhvsf6jY/v-deo.html
Lecture 2: Introduction to Deep Learning Part 1 ua-cam.com/video/OXmLwCQA7F4/v-deo.html
Lecture 3: Introduction to Deep Learning Part 2 ua-cam.com/video/z3tQaNOwQqM/v-deo.html
Lecture 4: Importance of PDEs in Science ua-cam.com/video/UiZxDRBd0Q8/v-deo.html
Lecture 5: Physics-Informed Neural Networks - Introduction ua-cam.com/video/D-F7BYRhAkQ/v-deo.html
Lecture 6: Physics-Informed Neural Networks - Limitations and Extensions Part 1 ua-cam.com/video/S11QK8baGVI/v-deo.html
Lecture 7: Physics-Informed Neural Networks - Limitations and Extensions Part 2 ua-cam.com/video/NFtE1pyD5LA/v-deo.html
Lecture 8: Physics-Informed Neural Networks - Theory Part 1 ua-cam.com/video/AaChPylEH6U/v-deo.html
Lecture 9: Physics-Informed Neural Networks - Theory Part 2 ua-cam.com/video/FqdJ2Jx9MVc/v-deo.html
Lecture 10: Introduction to Operator Learning Part 1 ua-cam.com/video/yhHhMmiNl_g/v-deo.html
Lecture 11: Introduction to Operator Learning Part 2 ua-cam.com/video/lEUgPvDi5O8/v-deo.html
Lecture 12: Fourier Neural Operators ua-cam.com/video/b96wRdjH1Lg/v-deo.html
Lecture 13: Spectral Neural Operators and Deep Operator Networks ua-cam.com/video/BxklDO0TMlA/v-deo.html
Lecture 14: Convolutional Neural Operators ua-cam.com/video/5XaLKR08TwI/v-deo.html
Lecture 15: Time-Dependent Neural Operators ua-cam.com/video/u1KFcAvjyCI/v-deo.html
Lecture 16: Large-Scale Neural Operators ua-cam.com/video/FPXW9MxjV48/v-deo.html
Lecture 17: Attention as a Neural Operator ua-cam.com/video/wJSgLRiU7D4/v-deo.html
Lecture 18: Windowed Attention and Scaling Laws ua-cam.com/video/YtJhReM5bHY/v-deo.html
Lecture 19: Introduction to Hybrid Workflows Part 1 ua-cam.com/video/fJbt6VKYycA/v-deo.html
Lecture 20: Introduction to Hybrid Workflows Part 2 ua-cam.com/video/h8BH-6tjecc/v-deo.html
Lecture 21: Neural Differential Equations ua-cam.com/video/jnjYsm4NjhE/v-deo.html
Lecture 22: Introduction to Diffusion Models ua-cam.com/video/Tohlijxz3XQ/v-deo.html
Lecture 23: Introduction to JAX ua-cam.com/video/0JsPcm_Vl1g/v-deo.html
Lecture 24: Symbolic Regression and Model Discovery ua-cam.com/video/fe-PC4lw4yw/v-deo.html
Lecture 25: Applications of AI in Chemistry and Biology Part 1 ua-cam.com/video/Y3rvzsW8TVU/v-deo.html
Lecture 26: Applications of AI in Chemistry and Biology Part 2 ua-cam.com/video/dDvTA_MoO_4/v-deo.html
▬ Course Description ▬▬▬
AI is having a profound impact on science by accelerating discoveries across physics, chemistry, biology, and engineering. This course presents a highly topical selection of AI applications across these fields. Emphasis is placed on using AI, particularly deep learning, to understand systems modelled by PDEs, and key scientific machine learning concepts and themes are discussed.
▬ Course Learning Objectives ▬▬▬
- Aware of advanced applications of AI in the sciences and engineering
- Familiar with the design, implementation, and theory of these algorithms
- Understand the pros/cons of using AI and deep learning for science
- Understand key scientific machine learning concepts and themes
Views: 311

Videos

ETH Zürich AISE: Applications of AI in Chemistry and Biology Part 1
446 views · 5 months ago
↓↓↓ LECTURE OVERVIEW BELOW ↓↓↓ ETH Zürich AI in the Sciences and Engineering 2024 *Course Website* (links to slides and tutorials): www.camlab.ethz.ch/teaching/ai-in-the-sciences-and-engineering-2024.html Guest lecturer: David Graber ▬ Lecture Content ▬▬▬ 0:00 - Introduction 2:54 - Proteins and small molecules 9:52 - Engineering of protein and small molecules 15:03 - Text-based AI models 26:03 ...
ETH Zürich AISE: Symbolic Regression and Model Discovery
922 views · 5 months ago
↓↓↓ LECTURE OVERVIEW BELOW ↓↓↓ ETH Zürich AI in the Sciences and Engineering 2024 *Course Website* (links to slides and tutorials): www.camlab.ethz.ch/teaching/ai-in-the-sciences-and-engineering-2024.html Lecturers: Dr. Ben Moseley and Prof. Siddhartha Mishra ▬ Lecture Content ▬▬▬ 0:00 - Introduction 1:41 - Can AI discover the laws of physics? 5:52 - Model discovery 7:00 - Function discovery 8:...
ETH Zürich AISE: Introduction to JAX
664 views · 5 months ago
↓↓↓ LECTURE OVERVIEW BELOW ↓↓↓ ETH Zürich AI in the Sciences and Engineering 2024 *Course Website* (links to slides and tutorials): www.camlab.ethz.ch/teaching/ai-in-the-sciences-and-engineering-2024.html Lecturers: Dr. Ben Moseley and Prof. Siddhartha Mishra ▬ Lecture Content ▬▬▬ 0:00 - Introduction 2:02 - What is JAX? 3:08 - JAX in ML and scientific computing 5:34 - Accelerated array computat...
ETH Zürich AISE: Introduction to Diffusion Models
426 views · 5 months ago
↓↓↓ LECTURE OVERVIEW BELOW ↓↓↓ ETH Zürich AI in the Sciences and Engineering 2024 *Course Website* (links to slides and tutorials): www.camlab.ethz.ch/teaching/ai-in-the-sciences-and-engineering-2024.html Lecturers: Dr. Ben Moseley and Prof. Siddhartha Mishra ▬ Lecture Content ▬▬▬ 0:00 - Introduction 1:52 - What is diffusion? 4:48 - Forward model 7:54 - Reversing diffusion 12:40 - How does a di...
ETH Zürich AISE: Neural Differential Equations
434 views · 5 months ago
↓↓↓ LECTURE OVERVIEW BELOW ↓↓↓ ETH Zürich AI in the Sciences and Engineering 2024 *Course Website* (links to slides and tutorials): www.camlab.ethz.ch/teaching/ai-in-the-sciences-and-engineering-2024.html Lecturers: Dr. Ben Moseley and Prof. Siddhartha Mishra ▬ Lecture Content ▬▬▬ 0:00 - Recap: previous lecture 2:04 - Lotka-Volterra system 3:52 - Solving the ordinary differential equation (ODE)...
ETH Zürich AISE: Introduction to Hybrid Workflows Part 2
205 views · 5 months ago
↓↓↓ LECTURE OVERVIEW BELOW ↓↓↓ ETH Zürich AI in the Sciences and Engineering 2024 *Course Website* (links to slides and tutorials): www.camlab.ethz.ch/teaching/ai-in-the-sciences-and-engineering-2024.html Lecturers: Dr. Ben Moseley and Prof. Siddhartha Mishra ▬ Lecture Content ▬▬▬ 0:00 - Recap: autodiff 7:25 - Memory requirements of autodiff 10:31 - Autodiff in practice 13:36 - Recap: hybrid ap...
ETH Zürich AISE: Introduction to Hybrid Workflows Part 1
381 views · 5 months ago
↓↓↓ LECTURE OVERVIEW BELOW ↓↓↓ ETH Zürich AI in the Sciences and Engineering 2024 *Course Website* (links to slides and tutorials): www.camlab.ethz.ch/teaching/ai-in-the-sciences-and-engineering-2024.html Lecturers: Dr. Ben Moseley and Prof. Siddhartha Mishra ▬ Lecture Content ▬▬▬ 0:00 - Introduction 3:09 - Recap: physics-informed neural networks 5:31 - Recap: operator learning 8:06 - When to u...
ETH Zürich AISE: Windowed Attention and Scaling Laws
189 views · 5 months ago
↓↓↓ LECTURE OVERVIEW BELOW ↓↓↓ ETH Zürich AI in the Sciences and Engineering 2024 *Course Website* (links to slides and tutorials): www.camlab.ethz.ch/teaching/ai-in-the-sciences-and-engineering-2024.html Lecturers: Dr. Ben Moseley and Prof. Siddhartha Mishra ▬ Lecture Content ▬▬▬ 0:00 - Recap: previous lecture 3:20 - Windowed attention 5:37 - Windowed attention as a neural operator 7:15 - Shif...
ETH Zürich AISE: Attention as a Neural Operator
295 views · 5 months ago
↓↓↓ LECTURE OVERVIEW BELOW ↓↓↓ ETH Zürich AI in the Sciences and Engineering 2024 *Course Website* (links to slides and tutorials): www.camlab.ethz.ch/teaching/ai-in-the-sciences-and-engineering-2024.html Lecturers: Dr. Ben Moseley and Prof. Siddhartha Mishra ▬ Lecture Content ▬▬▬ 0:00 - Recap: previous lecture 1:39 - Recap: attention as a neural operator 6:52 - Computational complexity of atte...
ETH Zürich AISE: Large-Scale Neural Operators
315 views · 5 months ago
↓↓↓ LECTURE OVERVIEW BELOW ↓↓↓ ETH Zürich AI in the Sciences and Engineering 2024 *Course Website* (links to slides and tutorials): www.camlab.ethz.ch/teaching/ai-in-the-sciences-and-engineering-2024.html Lecturers: Dr. Ben Moseley and Prof. Siddhartha Mishra ▬ Lecture Content ▬▬▬ 0:00 - Recap: previous lecture 5:19 - Time conditioning in neural operators 12:28 - Training strategies 25:46 - Inf...
ETH Zürich AISE: Time-Dependent Neural Operators
307 views · 5 months ago
↓↓↓ LECTURE OVERVIEW BELOW ↓↓↓ ETH Zürich AI in the Sciences and Engineering 2024 *Course Website* (links to slides and tutorials): www.camlab.ethz.ch/teaching/ai-in-the-sciences-and-engineering-2024.html Lecturers: Dr. Ben Moseley and Prof. Siddhartha Mishra ▬ Lecture Content ▬▬▬ 0:00 - Recap: previous lecture 4:33 - Challenges with neural operators 13:45 - Time-dependent partial differential ...
ETH Zürich AISE: Convolutional Neural Operators
503 views · 5 months ago
↓↓↓ LECTURE OVERVIEW BELOW ↓↓↓ ETH Zürich AI in the Sciences and Engineering 2024 *Course Website* (links to slides and tutorials): www.camlab.ethz.ch/teaching/ai-in-the-sciences-and-engineering-2024.html Lecturers: Dr. Ben Moseley and Prof. Siddhartha Mishra ▬ Lecture Content ▬▬▬ 0:00 - Recap: previous lecture 1:26 - Designing a ReNO with convolutions 5:34 - Convolutional neural operators (CNO...
ETH Zürich AISE: Spectral Neural Operators and Deep Operator Networks
351 views · 5 months ago
↓↓↓ LECTURE OVERVIEW BELOW ↓↓↓ ETH Zürich AI in the Sciences and Engineering 2024 *Course Website* (links to slides and tutorials): www.camlab.ethz.ch/teaching/ai-in-the-sciences-and-engineering-2024.html Lecturers: Dr. Ben Moseley and Prof. Siddhartha Mishra ▬ Lecture Content ▬▬▬ 0:00 - Recap: previous lecture 6:33 - Designing representation equivalent neural operators (ReNOs) 13:08 - Spectral...
ETH Zürich AISE: Fourier Neural Operators
947 views · 5 months ago
↓↓↓ LECTURE OVERVIEW BELOW ↓↓↓ ETH Zürich AI in the Sciences and Engineering 2024 *Course Website* (links to slides and tutorials): www.camlab.ethz.ch/teaching/ai-in-the-sciences-and-engineering-2024.html Lecturers: Dr. Ben Moseley and Prof. Siddhartha Mishra ▬ Lecture Content ▬▬▬ 0:00 - Recap: previous lecture 3:43 - Recap: Representation equivalent neural operators (ReNOs) 13:48 - Recap: 1D R...
ETH Zürich AISE: Introduction to Operator Learning Part 2
530 views · 5 months ago
ETH Zürich AISE: Introduction to Operator Learning Part 1
700 views · 5 months ago
ETH Zürich AISE: Physics-Informed Neural Networks - Theory Part 2
502 views · 5 months ago
ETH Zürich AISE: Physics-Informed Neural Networks - Theory Part 1
636 views · 5 months ago
ETH Zürich AISE: Physics-Informed Neural Networks - Limitations and Extensions Part 2
723 views · 5 months ago
ETH Zürich AISE: Physics-Informed Neural Networks - Limitations and Extensions Part 1
945 views · 5 months ago
ETH Zürich AISE: Physics-Informed Neural Networks - Introduction
2.6K views · 5 months ago
ETH Zürich AISE: Importance of PDEs in Science
870 views · 5 months ago
ETH Zürich AISE: Introduction to Deep Learning Part 2
1.2K views · 5 months ago
ETH Zürich AISE: Introduction to Deep Learning Part 1
1.7K views · 5 months ago
ETH Zürich AISE: Course Introduction
5K views · 5 months ago
ETH Zürich DLSC: Deep Operator Networks
2.8K views · a year ago
ETH Zürich DLSC: Neural Operators
2.7K views · a year ago
ETH Zürich DLSC: Introduction to Operator Learning Part 2
2.2K views · a year ago
ETH Zürich DLSC: Fourier Neural Operators and Convolutional Neural Operators
3.4K views · a year ago

COMMENTS

  • @LordMichaelRahl
    @LordMichaelRahl 26 days ago

    Fantastic series of lectures.

  • @damienmaitre
    @damienmaitre a month ago

    Great video! Such a cool research area.

  • @Bravogiovanni
    @Bravogiovanni a month ago

    This was a really fantastic lecture.

  • @jiachenguo4018
    @jiachenguo4018 a month ago

    I have one question about the curse of dimensionality in PINNs. In the last example you showed, you sampled a lot of data to train the network. Isn't that the curse of dimensionality? If you only care about the 1D case, do we need to sample that amount of data? Thanks for the lecture.

  • @Bluecrystal420
    @Bluecrystal420 2 months ago

    Why is regression supervised?

  • @chidiokoene4591
    @chidiokoene4591 2 months ago

    Are there videos for the tutorials?

  • @chidiokoene4591
    @chidiokoene4591 2 months ago

    Is the notebook file available on GitHub?

  • @chidiokoene4591
    @chidiokoene4591 2 months ago

    I have a question about classification: what happens when y is a multiclass classification problem?

  • @chidiokoene4591
    @chidiokoene4591 2 months ago

    This is the most important and valuable playlist on my YouTube right now. Thanks for sharing.

  • @chidiokoene4591
    @chidiokoene4591 2 months ago

    Thank you so much for this lecture.

  • @RohanDekate-m7c
    @RohanDekate-m7c 2 months ago

    What is diag(sigma'(g))?

  • @RohanDekate-m7c
    @RohanDekate-m7c 2 months ago

    insightful

  • @rarelycomments
    @rarelycomments 4 months ago

    Thank you for putting these lectures online. They have been an invaluable resource for my study. Ben is a great teacher, I love how he focuses on establishing understanding of the concepts, links and implications rather than getting bogged down in mathematical details. And taking a moment to pause to make sure everyone understands key concepts. The questions to the class are great too. I like to pause the video and think about an answer. If I can't, often he'll say it's a non-obvious one which makes me feel much better! 😅Thanks again 👏

  • @tuo9433
    @tuo9433 5 months ago

    I wonder if PINNs can solve the three-body problem :D

  • @yangluo8317
    @yangluo8317 5 months ago

    Awesome lecture. Since I am new to PINNs, I just want to know: what if we do not have access to the PDE formulation?

  • @tamineabderrahmane248
    @tamineabderrahmane248 5 months ago

    Thank you for sharing scientific ML knowledge!

  • @afrahnajib1218
    @afrahnajib1218 5 months ago

    How can I register for this course in person? I am not a student at ETH.

    • @CAMLabETHZurich
      @CAMLabETHZurich 5 months ago

      Only ETH students can register for this course in person.

  • @Banedsyko
    @Banedsyko 5 months ago

    Related to what was talked about @5:37: are there any cases where you cannot find the true function no matter what you do (architecture, density, number of parameters, etc.), i.e. you cannot reach that point in the function space?

    • @CAMLabETHZurich
      @CAMLabETHZurich 5 months ago

      In theory, neural networks are universal approximators, i.e. there exist weight values with which a large enough network can closely approximate any function. In practice it can be very challenging to find these weight values because of optimisation and estimation error.
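A minimal sketch of this point, assuming PyTorch (the architecture, target function, and optimiser settings are illustrative assumptions, not taken from the course code): a small tanh network is trained on a 1D target it could in principle represent well, yet gradient descent typically leaves a small residual error, which is the optimisation error mentioned above.

```python
# Sketch: universal approximation says good weights *exist* for a wide enough
# network; whether gradient descent *finds* them is the optimisation question.
import torch

torch.manual_seed(0)
x = torch.linspace(-3, 3, 200).unsqueeze(1)   # training inputs
y = torch.sin(3 * x) + 0.3 * x**2             # illustrative 1D target function

model = torch.nn.Sequential(
    torch.nn.Linear(1, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(5000):
    opt.zero_grad()
    loss = torch.mean((model(x) - y) ** 2)    # mean-squared approximation error
    loss.backward()
    opt.step()

print(f"final training MSE: {loss.item():.2e}")  # small, but rarely exactly zero
```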

  • @standard_output
    @standard_output 6 months ago

    I'd be happy to sponsor a microphone for a better audio experience.

  • @tinytale5406
    @tinytale5406 6 months ago

    Hello. Can you please provide the ppts of these lectures?

    • @CAMLabETHZurich
      @CAMLabETHZurich 5 months ago

      All the slides are now available on our course webpage: camlab.ethz.ch/teaching/deep-learning-in-scientific-computing-2023.html

  • @dholgadom
    @dholgadom 6 months ago

    Excellent PINN classes with real application examples!!

  • @berkhacimolla6063
    @berkhacimolla6063 8 months ago

    Thanks a lot for sharing. I wish you all the best of luck.

  • @rudypieplenbosch6752
    @rudypieplenbosch6752 8 months ago

    Pretty bad lecture

  • @francesco3362
    @francesco3362 9 months ago

    Hi, fantastic lessons! Any chance to get the slides? Thanks!

    • @CAMLabETHZurich
      @CAMLabETHZurich 5 months ago

      All the slides are now available on our course webpage: camlab.ethz.ch/teaching/deep-learning-in-scientific-computing-2023.html

  • @thddnwls
    @thddnwls 10 months ago

    This is amazing!

  • @be_happy974
    @be_happy974 10 months ago

    It's great; it's hard to find videos that teach PINN code.

  • @ludasi9
    @ludasi9 10 months ago

    Thank you for the video. Could you please clarify why ReLU is not used as an activation function?

    • @juanmanuelgalindez5673
      @juanmanuelgalindez5673 9 months ago

      ReLU is not differentiable at the origin, which might be problematic.

    • @CAMLabETHZurich
      @CAMLabETHZurich 5 months ago

      Exactly: for many PDEs we want the solution to be continuously differentiable, and ReLU activations are not.
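A minimal sketch of why this matters, assuming PyTorch autograd (the networks and evaluation points are illustrative assumptions, not course code): a ReLU network is piecewise linear, so its second derivative is zero almost everywhere and any PDE residual containing second derivatives gives no useful training signal, whereas a tanh network has smooth, non-trivial second derivatives.

```python
# Sketch: second derivatives of a ReLU network vs a tanh network via autograd.
import torch

def second_derivative(net, x):
    x = x.clone().requires_grad_(True)
    u = net(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]   # first derivative
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]  # second derivative
    return d2u

x = torch.linspace(-1, 1, 5).unsqueeze(1)
relu_net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.ReLU(), torch.nn.Linear(32, 1))
tanh_net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))

print(second_derivative(relu_net, x).flatten())  # zeros almost everywhere
print(second_derivative(tanh_net, x).flatten())  # generally non-zero
```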

  • @prefachinho
    @prefachinho 10 months ago

    Is the Jupyter file of the harmonic oscillator demo available anywhere?

    • @CAMLabETHZurich
      @CAMLabETHZurich 5 months ago

      All code shown in the lectures is here: github.com/benmoseley/DLSC-2023

  • @asifrashid3212
    @asifrashid3212 11 months ago

    Great lectures! I believe this is the best resource out there for an enthusiast to get started with scientific ML. I was wondering if it would be possible to share the lecture slides by any chance?

    • @CAMLabETHZurich
      @CAMLabETHZurich 5 months ago

      All the slides are now available on our course webpage: camlab.ethz.ch/teaching/deep-learning-in-scientific-computing-2023.html

  • @PratikChoudhury-e5e
    @PratikChoudhury-e5e 11 months ago

    I was searching for some good resources to learn PINNs and came across these video lectures. Thanks for sharing these; I really liked the course content and the way of teaching, and I learnt a lot from this course. Thank you!

  • @SuparnoBhattacharyya
    @SuparnoBhattacharyya a year ago

    So good. I am literally binge-watching! Haha!

  • @jingzhao1615
    @jingzhao1615 a year ago

    Very helpful, thank you!

  • @NiccolòMaffezzoli
    @NiccolòMaffezzoli a year ago

    Can the slides be made available for private use? Thanks.

    • @CAMLabETHZurich
      @CAMLabETHZurich 5 months ago

      All the slides are now available on our course webpage: camlab.ethz.ch/teaching/deep-learning-in-scientific-computing-2023.html

  • @myownbasement
    @myownbasement a year ago

    The lecturer's way of teaching is very odd, a bit rude if I may say so. Thanks for the course.

  • @피클모아태산
    @피클모아태산 a year ago

    Great overview of PINN in 2023

  • @ezyxas
    @ezyxas a year ago

    Thank you for the great lecture. Question: I understand how periodic boundary conditions can work with discretized PINNs, but how are they supposed to work with fully connected networks (continuous PINNs that use autograd)?

  • @AdilIsmagambetov
    @AdilIsmagambetov a year ago

    Why is the physics loss 0?

    • @thisisharold9066
      @thisisharold9066 6 months ago

      I think that is because the undamped spring-mass system can be modelled with a *second-order homogeneous ordinary differential equation*, y'' + p(x)*y' + q(x)*y = 0. If you model a forced response, e.g. the charging response of a resistor-inductor-capacitor circuit with 3.3 volts as input, then the right-hand side will not be 0: y'' + p(x)*y' + q(x)*y = -3.3.
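A minimal sketch of how such a physics residual could be written, assuming PyTorch (the oscillator parameters, network, collocation points, and the constant 3.3 forcing are illustrative assumptions, not the course notebook; the course code is linked elsewhere in this thread): the same residual covers both the homogeneous, unforced oscillator and a forced variant by changing the right-hand side.

```python
# Sketch: PINN physics residual for an oscillator  m*u'' + mu*u' + k*u = f(t).
# For the free (unforced) oscillator f(t) = 0, so training drives the residual,
# and hence the physics loss, towards zero.
import torch

m, mu, k = 1.0, 0.4, 4.0                      # hypothetical mass, damping, stiffness

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def physics_loss(t, forcing=lambda t: torch.zeros_like(t)):
    t = t.clone().requires_grad_(True)
    u = net(t)
    u_t = torch.autograd.grad(u.sum(), t, create_graph=True)[0]
    u_tt = torch.autograd.grad(u_t.sum(), t, create_graph=True)[0]
    residual = m * u_tt + mu * u_t + k * u - forcing(t)
    return torch.mean(residual ** 2)

t_col = torch.rand(100, 1)                    # collocation points in [0, 1]
print(physics_loss(t_col))                                              # free oscillator
print(physics_loss(t_col, forcing=lambda t: 3.3 * torch.ones_like(t)))  # constant forcing
```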

  • @ramversingh7867
    @ramversingh7867 a year ago

    Awesome 👌. Thanks for sharing valuable knowledge about the topic.

  • @sharmilakarumuri6050
    @sharmilakarumuri6050 a year ago

    Can we get the lecture slides? It would be helpful.

    • @CAMLabETHZurich
      @CAMLabETHZurich 5 months ago

      All the slides are now available on our course webpage: camlab.ethz.ch/teaching/deep-learning-in-scientific-computing-2023.html

  • @afrahnajib1218
    @afrahnajib1218 a year ago

    There is only one like, which you may not care about. Please share more!

  • @Shivohamtfb
    @Shivohamtfb a year ago

    The tutorials are not organized properly in the GitHub repo. Kindly help with that.

    • @yuewang210
      @yuewang210 a year ago

      Where can we see the tutorials?

  • @juancolmenares6185
    @juancolmenares6185 a year ago

    Would it be possible to add the order of the tutorials in the Readme file? Or maybe to rename them with the order number? Thank you

  • @juancolmenares6185
    @juancolmenares6185 a year ago

    Thanks for sharing!

  • @afrahnajib1218
    @afrahnajib1218 a year ago

    Many thanks for uploading such a course, from those who cannot reach such a high quality of education 🤍

  • @bridgetohell792
    @bridgetohell792 a year ago

    We could really benefit from the tutorials, where there is code and implementation. Most videos are the professor writing some math symbols on screen. I must say that these videos are pretty boring, and it is like a recorded book rather than a proper lecture. For the next iterations of this course, please record the tutorials. I would rather watch those than this spoken math textbook.

    • @bug3308
      @bug3308 10 months ago

      What are you even talking about? First, that math is already a simplified version; if you don't believe me, read the original operator learning paper by Zongyi Li. Second, welcome to the world of machine learning, where the math is the most interesting part. Check out CS294 and CS285 from UC Berkeley and you will understand. I would say this lecture series is by far the best one on physics-informed ML.

    • @李佳礼
      @李佳礼 8 months ago

      @@bug3308 I believe it would be better if Prof. Siddhartha Mishra did not use handwriting. Some of the symbols are vague, which makes it difficult to understand. Also, if there were code, then everything would be clearer, especially how the dimensions of the input and output change.

    • @chaoslee-vc9yy
      @chaoslee-vc9yy 18 days ago

      Agree with you! The whole lecture series is great except for the parts by this Great Prof., hehe.

  • @rito_ghosh
    @rito_ghosh a year ago

    What a great topic, and such nice lectures! I gained much clarity!

  • @rito_ghosh
    @rito_ghosh a year ago

    Wonderful course. Can we please have the slides for each lecture?

    • @CAMLabETHZurich
      @CAMLabETHZurich 5 months ago

      All the slides are now available on our course webpage: camlab.ethz.ch/teaching/deep-learning-in-scientific-computing-2023.html

  • @akshays6272
    @akshays6272 a year ago

    Why is there no collocation loss term in the second example?

  • @uqyge
    @uqyge a year ago

    Great course.