Point Cloud Classification - Keras Code Examples

  • Published 13 Jun 2024
  • This video walks through the Keras Code Example implementation of Point Cloud Classification. I had a tough time understanding what motivates the T-Net blocks, but if you're interested, the paper is linked below. I hope this tutorial still provides a decent example of what point clouds are and how to load them into a Keras workflow. Thanks for watching, and please check out the rest of the Keras Code Example playlist!
    Content Links:
    Point Cloud Classification - Keras Code Examples: keras.io/examples/vision/poin...
    PointNet (Paper): arxiv.org/pdf/1612.00593.pdf
    ModelNet (Dataset Project Page): modelnet.cs.princeton.edu/
    Point Clouds (Wikipedia): en.wikipedia.org/wiki/Point_c...
    Trimesh library: trimsh.org/
    3D Printing Stack Exchange: 3dprinting.stackexchange.com/....
    GeeksforGeeks raster vs vector graphics: www.geeksforgeeks.org/vector-...
    PyTorch Geometric: pytorch-geometric.readthedocs...
    Thanks for watching!
    Point Cloud Chapters
    0:00 Beginning
    0:44 What are Point Clouds?
    4:14 Download and Visualize Dataset
    6:25 Point Cloud Data Augmentation
    7:50 Model Architecture
    15:50 Training and Evaluating the Model
    18:08 Summary
  • Science & Technology
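The download-and-visualize step described above can be sketched with the trimesh library linked in the description. This is a minimal, hedged sketch, not the video's exact code; the file path is hypothetical, and `trimesh.load`, `mesh.show`, and `mesh.sample` are real trimesh APIs:

```python
# Sketch: load one ModelNet mesh with trimesh and turn it into a fixed-size
# point cloud, roughly as in the Keras example. The file path is hypothetical.
import numpy as np
import trimesh

mesh = trimesh.load("ModelNet10/chair/train/chair_0001.off")
mesh.show()                 # optional: opens an interactive 3D viewer

points = mesh.sample(2048)  # sample 2048 points from the mesh surface
points = np.asarray(points, dtype="float32")
print(points.shape)         # (2048, 3)
```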

COMMENTS • 14

  • @basithAA
    @basithAA 2 years ago

    thanks for the knowledge

  • @arigato39000
    @arigato39000 3 years ago

    thank you from japan

  • @zddroy1025
    @zddroy1025 2 years ago +1

    Thank you for your video! I wonder where we could access the notebook.

  • @CristianGarcia
    @CristianGarcia 3 years ago +3

    Thanks for the video! I am not an expert in this, but the basic intuition is that point clouds are sets, so there is no natural way of ordering them, i.e. you can NOT consistently identify which is point 0, 1, 2... n between samples. On the other hand, grid convolutions assume a very precise local ordering, so I believe the regularization applied in PointNet is trying to learn an ordering in the first half of the network and then use it in the second half.
    I believe GNNs and Transformers are much better at this task than CNNs since they naturally operate on sets. Things like the SE(3) Transformer even (try to) encode 3D rotational symmetries into the architecture.
    A good data augmentation for this task is 3D rotation, so the network can learn to be invariant to it, just as CNNs do with rotated images.

    • @connorshorten6311
      @connorshorten6311 3 years ago +1

      Hey Cristian, thanks for this information! I agree, it definitely seems like the Transformer would be better suited for this problem! I'll check out the SE(3) Transformer; I'm still very new to point cloud research, haha! It's interesting to see data augmentation in the geometric DL space; 3D rotations, as in neural radiance fields, could be interesting for 2D image data as well!

    • @stnmtambat9374
      @stnmtambat9374 3 years ago +1

      @@connorshorten6311 check out the "Point Transformer" published in 2020
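Cristian's point about point clouds being unordered sets can be demonstrated with a tiny sketch: a symmetric aggregation such as an element-wise max over the point axis (the kind of global pooling PointNet uses) produces the same feature vector for any ordering of the input. This toy `max_pool` is illustrative, not code from the example:

```python
# Sketch: PointNet's key trick is a symmetric (permutation-invariant)
# aggregation over points. An element-wise max over the point axis yields
# the same global feature for any ordering of the input set.
import random

def max_pool(points):
    """Element-wise max over a list of equal-length point tuples."""
    return tuple(max(p[i] for p in points) for i in range(len(points[0])))

cloud = [(0.1, 0.5, -0.2), (0.9, -0.3, 0.4), (-0.5, 0.2, 0.8)]
shuffled = cloud[:]
random.shuffle(shuffled)
assert max_pool(cloud) == max_pool(shuffled)  # order does not matter
```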

  • @blackeagleff
    @blackeagleff 2 years ago

    Hi, could you make a video on PointNet++ or newer networks (SalsaNext, SPVNAS) for 3D semantic segmentation with lidar point clouds? I have my own point cloud data captured with a Velodyne lidar, and I want to know how to use one of these networks to run semantic segmentation on my own data, thank you!

  • @alexsteiner6103
    @alexsteiner6103 1 year ago

    How can I save the model to a .h5 file?
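For what it's worth, plain Keras supports this directly. A minimal sketch, assuming `model` is the trained PointNet from the Keras example; note that the example defines a custom `OrthogonalRegularizer`, which must be passed back when loading:

```python
# Save architecture + weights to HDF5 (the ".h5" suffix selects that format;
# newer Keras versions prefer model.save("pointnet.keras")).
model.save("pointnet.h5")

# Reload later; custom objects from the example must be supplied by name.
from tensorflow import keras

restored = keras.models.load_model(
    "pointnet.h5",
    custom_objects={"OrthogonalRegularizer": OrthogonalRegularizer},
)
```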

  • @axo4579
    @axo4579 7 months ago

    Why is the validation loss so high and random?

  • @ahmadatta66
    @ahmadatta66 2 years ago

    Why is the validation loss so high?

  • @mehermanoj45
    @mehermanoj45 3 years ago +2

    😀

  • @ankurkumarsaha4622
    @ankurkumarsaha4622 2 years ago

    Why only 2048 points?
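One practical reason: every mesh has a different number of vertices, so each one is reduced to a fixed-size point set so that batches share one tensor shape, (N, 2048, 3). A stdlib-only sketch of that sampling step; the function name and padding strategy are illustrative, not from the example code:

```python
# Sketch: reduce every mesh to a fixed-size point set so all samples share
# one input shape. Names here are illustrative, not from the Keras example.
import random

def sample_points(vertices, n=2048, seed=0):
    rng = random.Random(seed)
    if len(vertices) >= n:
        return rng.sample(vertices, n)                  # without replacement
    return [rng.choice(vertices) for _ in range(n)]     # small mesh: resample

verts = [(float(i), 0.0, 0.0) for i in range(5000)]
assert len(sample_points(verts)) == 2048
```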

  • @jessar82
    @jessar82 1 year ago +2

    But how can you explain and review work you did not understand? Did you check the validation accuracy? Did you plot the loss? Just take a moment to plot the model's training and validation loss, at least! Mate, this Keras example is basically a fake replication of the original paper. The model overfits from start to finish, and the results are just random.
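The loss curves this comment asks for are cheap to produce from the `History` object that `model.fit` returns. A sketch, assuming `model`, `train_dataset`, and `test_dataset` from the Keras example:

```python
# Sketch: plot training vs. validation loss to check for overfitting.
# history.history holds per-epoch metrics under "loss" and "val_loss".
import matplotlib.pyplot as plt

history = model.fit(train_dataset, epochs=20, validation_data=test_dataset)

plt.plot(history.history["loss"], label="train loss")
plt.plot(history.history["val_loss"], label="val loss")
plt.xlabel("epoch")
plt.ylabel("loss")
plt.legend()
plt.show()
```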