[FairMOT] Multiple Object Tracking and Mapping Coordinates to a Map with Two Different Cameras

  • Published 18 Feb 2021

COMMENTS • 8

  • @mubashirwaheed474 · 1 day ago

    This is good, but I thought you applied the object detection and tracking at the same time, not separately.

  • @user-dg8zj2nb7v · 2 months ago +1

    Hello!
    How did you implement the tracking on the 2D map so that a person keeps the same ID even when he/she moves from one camera to another?

    • @user-cz3tw4eu9h · 2 months ago +1

      Hello, and thank you for watching.
      The full project files are available at the following link: github.com/jyoonlee/GCU_WifiSensing/tree/master
      For this video, we ran object detection with the open-source FairMOT and then added code that draws each person's trajectory with the cv2 library.
      For the mapping, we applied the open-source mapping code to the 2D map our team created; a rough sketch of that step is included below.
      I'm sharing the links to the open-source projects we referenced :)
      FairMOT: github.com/ifzhang/FairMOT
      mapping: medium.com/hal24k-techblog/how-to-track-objects-in-the-real-world-with-tensorflow-sort-and-opencv-a64d9564ccb1
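
      As a rough sketch of that mapping/drawing step (not the project's actual code): the snippet below projects a tracked box's bottom-center point from camera-image coordinates onto a top-down map with an OpenCV homography and draws the accumulated trajectory with cv2. The reference-point coordinates, the map file name floor_map.png, and the update() helper are placeholders you would replace with your own values.

      import cv2
      import numpy as np

      # Four reference points in the camera image and the matching points on the
      # 2D map image (placeholder values; measure them for your own setup).
      cam_pts = np.float32([[512, 710], [1210, 690], [1035, 315], [360, 320]])
      map_pts = np.float32([[100, 600], [500, 600], [500, 100], [100, 100]])
      H = cv2.getPerspectiveTransform(cam_pts, map_pts)   # 3x3 homography

      def to_map(xy):
          # Project one (x, y) camera-image point into map coordinates.
          p = cv2.perspectiveTransform(np.float32([[xy]]), H)
          return tuple(int(v) for v in p[0, 0])

      map_img = cv2.imread("floor_map.png")   # the team's 2D map (placeholder name)
      trajectories = {}                       # track_id -> list of map points

      def update(track_id, tlwh):
          # tlwh = (x, y, w, h) box from the tracker; its bottom-center is the foot point.
          x, y, w, h = tlwh
          trajectories.setdefault(track_id, []).append(to_map((x + w / 2, y + h)))
          pts = np.array(trajectories[track_id], np.int32).reshape(-1, 1, 2)
          cv2.polylines(map_img, [pts], False, (0, 0, 255), 2)                 # trajectory line
          cv2.circle(map_img, trajectories[track_id][-1], 4, (255, 0, 0), -1)  # current position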

  • @abdurrahmankhan6444 · 2 years ago +2

    Can you please link the repo or any material that provides insight into the multiple-camera setup?

    • @user-cz3tw4eu9h · 2 months ago

      Sorry for the late response. The full project files are available at the following link: github.com/jyoonlee/GCU_WifiSensing/tree/master

  • @dasollee8469 · 2 years ago +1

    How did you apply FairMOT to the two cameras?

    • @user-cz3tw4eu9h · 2 years ago

      Yes, hello!
      For the video you see here, we first filmed the target environment with two cameras,
      and then ran object detection on each video separately with FairMOT.
      After that, we mapped the resulting coordinates onto the full map shown at the bottom : ) (a rough sketch of this flow is below)
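
      To make the two-camera flow concrete, here is a rough Python sketch under stated assumptions (this is not our actual code): run_fairmot(frame) is a hypothetical wrapper around FairMOT's tracker that returns (track_id, (x, y, w, h)) pairs, the video file names are placeholders, and the per-camera homographies would be estimated from reference points as in the earlier snippet. Because the two trackers run independently, IDs are only kept unique here by prefixing the camera name.

      import cv2
      import numpy as np

      # Placeholder per-camera homographies; replace with ones estimated from
      # reference points (e.g. via cv2.getPerspectiveTransform).
      H_cam1 = np.eye(3, dtype=np.float32)
      H_cam2 = np.eye(3, dtype=np.float32)

      cameras = {
          "cam1": {"video": "cam1.mp4", "H": H_cam1},
          "cam2": {"video": "cam2.mp4", "H": H_cam2},
      }
      map_tracks = {}   # e.g. "cam1_3" -> list of map points

      def run_fairmot(frame):
          # Hypothetical stand-in for FairMOT's tracker (e.g. JDETracker in the repo);
          # it should return a list of (track_id, (x, y, w, h)) for the given frame.
          return []

      def project(H, xy):
          # Apply a 3x3 homography to a single (x, y) point.
          p = cv2.perspectiveTransform(np.float32([[xy]]), H)
          return tuple(int(v) for v in p[0, 0])

      for name, cfg in cameras.items():
          cap = cv2.VideoCapture(cfg["video"])
          while True:
              ok, frame = cap.read()
              if not ok:
                  break
              for track_id, (x, y, w, h) in run_fairmot(frame):
                  foot = (x + w / 2, y + h)                 # bottom-center of the box
                  key = f"{name}_{track_id}"                # IDs stay per-camera
                  map_tracks.setdefault(key, []).append(project(cfg["H"], foot))
          cap.release()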

    • @user-cz3tw4eu9h · 1 year ago

      @@unistable We recorded the video first, and then generated the trajectories with the code (FairMOT).