Physics-Informed Neural Networks (PINNs) - An Introduction - Ben Moseley | Jousef Murad

  • Published 31 May 2023
  • 🌎 Website: jousefmurad.com
    Physics-informed neural networks (PINNs) offer a new and versatile approach for solving scientific problems by combining deep learning with known physical laws. Such networks are able to simulate physical systems, invert for their underlying parameters and even discover underlying physical laws themselves. In this introductory workshop and live coding session we will cover the basic definition of a PINN, their pros and cons compared to traditional scientific techniques and some of the state-of-the-art research in the field. (A minimal code sketch of the core PINN idea follows the description below.)
    👉 My main channel: @Jousef Murad
    ONLINE PRESENCE
    ================
    🌍 My website - jousefmurad.com/
    💌 My weekly science newsletter - jousef.substack.com/
    📸 Instagram - / jousefmrd
    🐦 Twitter - / jousefm2
    #physics
    #engineering
    #neuralnetwork
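
    For readers new to the idea, here is a minimal sketch (not the workshop's own code) of how a PINN is trained on the damped harmonic oscillator used later in the session: a small network maps t to u(t), and its loss combines a boundary term with the ODE residual computed by automatic differentiation. All names, network sizes and loss weights below are illustrative assumptions.

    import torch

    # toy problem: damped harmonic oscillator u'' + mu*u' + k*u = 0 with u(0)=1, u'(0)=0
    mu, k = 4.0, 400.0

    # small fully connected network mapping t -> u(t)
    pinn = torch.nn.Sequential(
        torch.nn.Linear(1, 32), torch.nn.Tanh(),
        torch.nn.Linear(32, 32), torch.nn.Tanh(),
        torch.nn.Linear(32, 1))

    t0 = torch.zeros(1, 1, requires_grad=True)                      # boundary point t=0
    t = torch.linspace(0, 1, 50).view(-1, 1).requires_grad_(True)   # collocation points
    opt = torch.optim.Adam(pinn.parameters(), lr=1e-3)

    for step in range(10000):
        opt.zero_grad()
        # boundary loss: enforce u(0)=1 and u'(0)=0
        u0 = pinn(t0)
        du0 = torch.autograd.grad(u0, t0, torch.ones_like(u0), create_graph=True)[0]
        loss_boundary = (u0 - 1).pow(2).sum() + du0.pow(2).sum()
        # physics loss: ODE residual evaluated at the collocation points
        u = pinn(t)
        dudt = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
        d2udt2 = torch.autograd.grad(dudt, t, torch.ones_like(dudt), create_graph=True)[0]
        loss_physics = torch.mean((d2udt2 + mu*dudt + k*u)**2)
        # weighted joint loss (the weight is an illustrative choice)
        (loss_boundary + 1e-4*loss_physics).backward()
        opt.step()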

COMMENTS • 30

  • @JousefLITE
    @JousefLITE  1 year ago +4

    🧠More material & talks here: community.sci-circle.com/checkout/community-member
    🌎 Science Courses: courses.jousefmurad.com/

  • @meetplace
    @meetplace 6 months ago +11

    +1 for Oxford PhD saying "timesing" instead of multiplying... respect! :D

  • @hreedishkakoty6771
    @hreedishkakoty6771 1 month ago +1

    At 14:30, it seems the external force will not act on u_nn; the external force will appear as a constant term in the physics loss function.
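
    A rough sketch of the point above, assuming the workshop's damped-oscillator setup: a constant external force F only shifts the ODE residual in the physics loss; it is not applied to the network output u_nn itself. The network, collocation points and coefficients below are illustrative stand-ins, not the workshop's exact code.

    import torch

    pinn = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
    t_physics = torch.linspace(0, 1, 60).view(-1, 1).requires_grad_(True)
    mu, k, F = 4.0, 400.0, 1.0  # damping, stiffness, hypothetical constant external force

    u = pinn(t_physics)  # u_nn(t)
    dudt = torch.autograd.grad(u, t_physics, torch.ones_like(u), create_graph=True)[0]
    d2udt2 = torch.autograd.grad(dudt, t_physics, torch.ones_like(dudt), create_graph=True)[0]
    # the forcing F enters only as a constant term in the residual
    residual = d2udt2 + mu*dudt + k*u - F
    physics_loss = torch.mean(residual**2)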

  • @abdulwaris8
    @abdulwaris8 5 months ago +2

    Thanks for sharing this recording from the workshop. Thanks, Ben!

  • @carriefu458
    @carriefu458 1 month ago

    I love all of the questions!! 🤓 Ben is a great teacher!

  • @ajaytaneja111
    @ajaytaneja111 10 months ago +6

    We are talking about a relatively simple oscillator problem. What about complex geometries, for which FEM methods are best suited today? I have been reading about physics-informed graph nets for complex geometries. Do you have any references for complex domains? Let's say I have a complex-shaped mechanical component subjected to pressure, for which I normally use FEM.

  •  7 months ago +1

    Nice lesson and clear presentation. Thank you!

  • @vegetablebake
    @vegetablebake 7 months ago +1

    A great introduction and massive thanks for sharing the knowledge!

  • @raju-bitter
    @raju-bitter 7 months ago +1

    Fantastic introduction, much appreciated!

  • @vitezslavstembera854
    @vitezslavstembera854 9 months ago +2

    Very nice and clear presentation.

  • @jyothish75
    @jyothish75 6 months ago +1

    Could you please provide the example code of the PINN? The link in the comments is not working.

  • @suleymanemirakin
    @suleymanemirakin 3 months ago

    Great work!

  • @WeiZhang-sj9sl
    @WeiZhang-sj9sl 8 months ago

    great work

  • @fkeyvan
    @fkeyvan 5 months ago

    nice tutorial. thank you.

  • @canxkoz
    @canxkoz 1 year ago +2

    Great video on this fascinating field. Thanks for sharing.

  • @muhammadsohaib681
    @muhammadsohaib681 1 year ago +3

    Thank you for such an informative lecture on PINN.

  • @mklu0611
    @mklu0611 7 months ago +1

    OMG, very cool video!!! The training performance is highly dependent on the "lambda" value; do you have any ideas about how to choose its value? Many thanks.

  • @shankyxyz
    @shankyxyz 8 months ago

    Similar question as some others: even when we are solving standard physics problems (electrostatics, heat transfer, etc.), forgetting the time domain, so only elliptic equations on complex CAD geometries, I am wondering what applications PINNs can be used for, as opposed to FEM. Maybe shape-optimization-type problems? Or inverse problems?

  • @AdrienLegendre
    @AdrienLegendre 2 months ago

    A possibly useful method would be to have the neural network identify the invariants of a Lie group for a differential equation. Another approach: compute all scalar quantities and have the neural network find the right combination of scalar quantities to form a Lagrangian for a physical system.
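
    A very rough sketch of the second idea, under the assumption of a single degree of freedom: parametrise the Lagrangian L(q, qdot) with a small network and recover the acceleration from the Euler-Lagrange equation via autograd, so the network could then be fitted to observed accelerations. Everything below (class name, sizes, the 1-DOF restriction) is an illustrative assumption, not something from the talk.

    import torch

    class LagrangianNN(torch.nn.Module):
        def __init__(self, hidden=32):
            super().__init__()
            self.net = torch.nn.Sequential(
                torch.nn.Linear(2, hidden), torch.nn.Tanh(),
                torch.nn.Linear(hidden, hidden), torch.nn.Tanh(),
                torch.nn.Linear(hidden, 1))
        def forward(self, q, qdot):
            return self.net(torch.cat([q, qdot], dim=-1))

    def predicted_acceleration(model, q, qdot):
        # Euler-Lagrange (1-DOF): d/dt(dL/dqdot) = dL/dq
        # => qddot = (dL/dq - d2L/(dqdot dq) * qdot) / (d2L/dqdot2)
        q = q.requires_grad_(True)
        qdot = qdot.requires_grad_(True)
        L = model(q, qdot).sum()
        dL_dq = torch.autograd.grad(L, q, create_graph=True)[0]
        dL_dqdot = torch.autograd.grad(L, qdot, create_graph=True)[0]
        d2L_dqdot2 = torch.autograd.grad(dL_dqdot.sum(), qdot, create_graph=True)[0]
        d2L_dqdot_dq = torch.autograd.grad(dL_dqdot.sum(), q, create_graph=True)[0]
        return (dL_dq - d2L_dqdot_dq * qdot) / d2L_dqdot2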

  • @cunningham.s_law
    @cunningham.s_law 6 months ago +1

    I wonder if this gives better results for the PDEs used in option pricing.

  • @tanuavi98
    @tanuavi98 3 months ago +1

    Where can I get the code link?

  • @user-lt4zd9zj2h
    @user-lt4zd9zj2h 6 months ago

    Well done. The trend information is also very important, and it can be described by a partial differential equation. I think the parameters of the partial differential equation could also be made parameters of the PINN itself.
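
    A minimal sketch of that idea, often framed as an inverse problem: an unknown equation coefficient is declared as an extra learnable parameter and optimised jointly with the network weights, using a data loss on observations plus the physics residual. The synthetic data, coefficient values and loss weight below are illustrative assumptions.

    import torch

    # synthetic "measurements" of a damped oscillator with unknown damping coefficient mu
    t_obs = torch.linspace(0, 1, 40).view(-1, 1)
    u_obs = torch.exp(-2*t_obs) * torch.cos(20*t_obs)

    pinn = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
    mu = torch.nn.Parameter(torch.tensor(1.0))   # unknown equation coefficient, trained jointly
    k = 404.0
    t_physics = torch.linspace(0, 1, 60).view(-1, 1).requires_grad_(True)
    opt = torch.optim.Adam(list(pinn.parameters()) + [mu], lr=1e-3)

    for step in range(10000):
        opt.zero_grad()
        loss_data = torch.mean((pinn(t_obs) - u_obs)**2)           # fit the observations
        u = pinn(t_physics)
        dudt = torch.autograd.grad(u, t_physics, torch.ones_like(u), create_graph=True)[0]
        d2udt2 = torch.autograd.grad(dudt, t_physics, torch.ones_like(dudt), create_graph=True)[0]
        loss_physics = torch.mean((d2udt2 + mu*dudt + k*u)**2)     # residual with learnable mu
        (loss_data + 1e-4*loss_physics).backward()
        opt.step()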

  • @rupeshvinaykya4202
    @rupeshvinaykya4202 8 months ago +9

    Thanks for the PINN talk; is the code available?

    • @aakashs1806
      @aakashs1806 1 month ago

      I think MIT developed something related to this; not sure whether it is open source.

  • @sadeghmirzaei9330
    @sadeghmirzaei9330 10 months ago

    Great 👍

  • @ihmejakki2731
    @ihmejakki2731 4 months ago

    Very nice lesson! I'm stuck on Task 3 though; I can't get the network to converge for w0=80. Here's the code, in case anyone can spot what I'm missing:
    torch.manual_seed(123)
    # define a neural network to train
    pinn = FCN(1,1,32,3)
    # define additional a,b learnable parameters in the ansatz
    # TODO: write code here
    a = torch.nn.Parameter(torch.zeros(1, requires_grad=True))
    b = torch.nn.Parameter(torch.zeros(1, requires_grad=True))
    # define boundary points, for the boundary loss
    t_boundary = torch.tensor(0.).view(-1,1).requires_grad_(True)
    # define training points over the entire domain, for the physics loss
    t_physics = torch.linspace(0,1,60).view(-1,1).requires_grad_(True)
    # train the PINN
    d, w0 = 2, 80  # note w0 is higher!
    mu, k = 2*d, w0**2
    t_test = torch.linspace(0,1,300).view(-1,1)
    u_exact = exact_solution(d, w0, t_test)
    # add a,b to the optimiser
    # TODO: write code here
    optimiser = torch.optim.Adam(list(pinn.parameters())+[a]+[b], lr=1e-3)
    for i in range(15001):
        optimiser.zero_grad()
        # compute each term of the PINN loss function above
        # using the following hyperparameters:
        lambda1, lambda2 = 1e-1, 1e-4
        # compute boundary loss
        # TODO: write code here (change to ansatz formulation)
        u = pinn(t_boundary)*torch.sin(a*t_boundary+b)
        loss1 = (torch.squeeze(u) - 1)**2
        dudt = torch.autograd.grad(u, t_boundary, torch.ones_like(u), create_graph=True)[0]
        loss2 = (torch.squeeze(dudt) - 0)**2
        # compute physics loss
        # TODO: write code here (change to ansatz formulation)
        u = pinn(t_physics)*torch.sin(a*t_physics+b)
        dudt = torch.autograd.grad(u, t_physics, torch.ones_like(u), create_graph=True)[0]
        d2udt2 = torch.autograd.grad(dudt, t_physics, torch.ones_like(dudt), create_graph=True)[0]
        loss3 = torch.mean((d2udt2 + mu*dudt + k*u)**2)
        # backpropagate joint loss, take optimiser step
        # TODO: write code here
        loss = loss1 + lambda1*loss2 + lambda2*loss3
        loss.backward()
        optimiser.step()
        # plot the result as training progresses
        if i % 5000 == 0:
            #print(u.abs().mean().item(), dudt.abs().mean().item(), d2udt2.abs().mean().item())
            u = (pinn(t_test)*torch.sin(a*t_test+b)).detach()
            plt.figure(figsize=(6,2.5))
            plt.scatter(t_physics.detach()[:,0],
                        torch.zeros_like(t_physics)[:,0], s=20, lw=0, color="tab:green", alpha=0.6)
            plt.scatter(t_boundary.detach()[:,0],
                        torch.zeros_like(t_boundary)[:,0], s=20, lw=0, color="tab:red", alpha=0.6)
            plt.plot(t_test[:,0], u_exact[:,0], label="Exact solution", color="tab:grey", alpha=0.6)
            plt.plot(t_test[:,0], u[:,0], label="PINN solution", color="tab:green")
            plt.title(f"Training step {i}")
            plt.legend()
            plt.show()

  • @TerragonCFD
    @TerragonCFD 8 months ago +1

    I've been a beginner in PyTorch and OpenFOAM for the last few years, but today I learned that my "dream" is called a "PINN" 🙂