Spectacular AI
Aerial navigation without GPS
This demonstration used a MEMS IMU (costing below $5), mounted inside the cockpit to reduce vibration noise from the engine, and a consumer-grade barometer. Performance in this test was mostly limited by insufficient noise isolation in the camera. The SDK can also use magnetometer data, but that was not required here.
The software in this test consists of two main components:
1. Visual-Inertial Odometry (VIO), which is already available in our commercial SDK
2. A Visual Positioning System (VPS) component, which uses the aerial-imagery-based reference map
This component is not part of our standard SDK. If you are interested in early access to the VIO+VPS solution, send us a message at www.spectacularai.com/#contact
#sensorfusion #computervision #vio #navigation
Views: 662
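For intuition, the VIO+VPS combination described above can be sketched as a loosely-coupled fusion loop: VIO dead-reckons between frames, and an absolute VPS fix from the aerial reference map periodically pulls the estimate back toward the map. This is only an illustrative toy; the function, data, and blending weight below are invented for the sketch and are not part of the Spectacular AI SDK:

```python
import numpy as np

def fuse_vio_vps(vio_deltas, vps_fixes, vps_weight=0.2):
    """Toy loosely-coupled fusion: integrate relative VIO motion and
    blend in an absolute VPS position fix whenever one is available."""
    pos = np.zeros(2)
    track = []
    for i, delta in enumerate(vio_deltas):
        pos = pos + delta                          # dead-reckon with VIO
        if i in vps_fixes:                         # absolute map-based fix
            fix = np.asarray(vps_fixes[i], dtype=float)
            pos = (1.0 - vps_weight) * pos + vps_weight * fix
        track.append(pos.copy())
    return np.array(track)

# Straight flight with a 1 cm/step lateral VIO bias; a VPS fix every
# 10 steps keeps the cross-track error bounded instead of growing.
deltas = np.tile([1.0, 0.01], (100, 1))
fixes = {k: (k + 1.0, 0.0) for k in range(9, 100, 10)}
track = fuse_vio_vps(deltas, fixes)
```

Without the fixes, the lateral error would grow without bound; with them, it stays bounded, which is the qualitative behavior a VIO+VPS system aims for.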

Videos

GPS-free drone navigation with less than 1% drift!
1.6K views · 7 months ago
Spectacular AI SDK VIO tracking with two different camera devices: 1) Spectacular AI reference HW design (v1) with two global shutter stereo cameras. 2) RTK-VINS prototype device with monocular fisheye rolling shutter camera setup. Both cases also fuse barometric altitude data with VIO. GPS is only shown for reference and no loop closures are used. The total accumulated drift is below 0.6% in b...
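For readers unfamiliar with the metric, "drift below 0.6%" refers to relative drift: the final position error divided by the total trajectory length. A small sketch of how such a number is computed (the trajectory data here is made up for illustration):

```python
import numpy as np

def relative_drift_percent(estimate, ground_truth):
    """Endpoint position error as a percentage of path length."""
    est = np.asarray(estimate, dtype=float)
    gt = np.asarray(ground_truth, dtype=float)
    # Total distance travelled along the ground-truth trajectory
    path_length = np.sum(np.linalg.norm(np.diff(gt, axis=0), axis=1))
    # How far the estimated endpoint ended up from the true endpoint
    endpoint_error = np.linalg.norm(est[-1] - gt[-1])
    return 100.0 * endpoint_error / path_length

# A 1 km straight ground-truth path; the estimate ends 5 m off course.
gt = np.stack([np.linspace(0.0, 1000.0, 101), np.zeros(101)], axis=1)
est = gt.copy()
est[-1, 1] += 5.0
print(relative_drift_percent(est, gt))  # 0.5 (% drift)
```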
Gaussian Splatting on the Move - v2
1K views · 7 months ago
Our method enables crisp Gaussian Splatting reconstructions from blurry and wobbly smartphone captures. Motion blur and rolling shutter compensation for 3DGS using VIO IMU data, pose refinement, and a differentiable image formation model. Video for the paper: arxiv.org/abs/2403.13327 Code: github.com/SpectacularAI/3dgs-deblur Project page: spectacularai.github.io/3dgs-deblur/ #gaussiansplatting...
GPS free drone navigation with VIO
2K views · 8 months ago
This video demonstrates GPS-free navigation using the Spectacular AI SDK and a standalone reference hardware design. The payload on the drone is an example of a high-accuracy VIO system built using low-cost, mostly off-the-shelf, components. It can track the position and orientation of the drone in real-time using only consumer-grade cameras and IMU, running on embedded hardware, in this case, ...
Spectacular AI on Orbbec cameras
3.4K views · 11 months ago
Spectacular AI SDK now supports Orbbec Astra 2 and Femto Mega (the official drop-in replacement for Azure Kinect DK) out-of-the-box, without any extra configuration! The SDK is available FREE for non-commercial use; you can download it from github.com/SpectacularAI/sdk Link to documentation (Orbbec wrapper): spectacularai.github.io/docs/sdk/wrappers/orbbec.html #gaussiansplatting #slam #computer...
Gaussian Splatting reconstructions with Android & iPhone
3.1K views · 1 year ago
Android app: play.google.com/store/apps/details?id=com.spectacularai.rec iOS app: apps.apple.com/us/app/spectacular-rec/id6473188128 Step by step instructions: github.com/SpectacularAI/sdk-examples/tree/main/python/mapping Example gallery: www.spectacularai.com/mapping#gallery #computervision #slam #nerf #gaussiansplatting
Spectacular Rec for Gaussian Splatting and NeRF reconstructions (iPhone)
4K views · 1 year ago
Here are all the links you need to get started with your reconstructions! Spectacular Rec for iPhones: apps.apple.com/us/app/spectacular-rec/id6473188128 Spectacular AI SDK mapping scripts: github.com/SpectacularAI/sdk-examples/tree/main/python/mapping Nerfstudio: docs.nerf.studio/ #computervision #slam #nerf #gaussiansplatting
Gaussian Splatting and NeRFs on OAK-D (+ RealSense & iPhone)
2.7K views · 1 year ago
The latest Spectacular AI Mapping API version allows training Gaussian Splatting and NeRF reconstructions from data recorded on any device supported by the Spectacular AI SDK. The mapping and post-processing stages are fast, robust and totally COLMAP-free! The final training phase is powered by Nerfstudio. See github.com/SpectacularAI/sdk-examples/tree/main/python/mapping for instructions. Comi...
Spectacular AI with RS-LiDAR-M1 from RoboSense
720 views · 1 year ago
Real-time tracking and 3D reconstruction of the surrounding environment using the Spectacular AI SDK with data from a stereo camera, an IMU, and a solid-state lidar (RS-LiDAR-M1 from RoboSense). An RGB camera is used for visualization. No GPS required #lidar #computervision #slam
Gaussian Splatting and NeRFs with Spectacular AI Mapping API
5K views · 1 year ago
Demonstration of Gaussian Splatting and various Neural Radiance Fields trained on data processed using the Spectacular AI Mapping API (no COLMAP). All data in this video was recorded with Azure Kinect, registered using the Spectacular AI SDK (Mapping API), and finally trained using either Gaussian Splatting (Taichi implementation github.com/wanmeihuali/taichi_3d_gaussian_splatting), Nerfacto (Nerfs...
Real-time VISLAM on board a drone with Raspberry Pi
895 views · 1 year ago
Spectacular AI is collaborating with Tampere University to record a new benchmark dataset for low-cost embedded VISLAM. See tutvision.github.io/TampereDroneDataset/ for more information. Two different devices were used for data recording: Device 1 is built exclusively from off-the-shelf parts (Raspberry Pi 4 + OAK-D Pro W). Device 2 is built from lower-level components: R-Pi 4 monocular InnoMaker ...
Simulated visual data along EuRoC trajectory
252 views · 1 year ago
Simulated visual data along the EuRoC "v2-03-difficult" trajectory using different camera models. This data "matches" the actual IMU data in the original dataset and can be used to study the effects of different visual properties of the scene and camera on visual-inertial SLAM.
MPU6050 vs. SCHA634 - Testing the effect of IMU quality on VISLAM accuracy
4.3K views · 2 years ago
We built a test device with two different IMUs to study how the quality of the IMU affects the accuracy of inside-out tracking (VISLAM) in the Spectacular AI SDK. The device has two global-shutter camera sensors (OV9281) with wide-angle lenses (ArduCam B0223), and two inertial-measurement units (IMUs): * TDK/InvenSense MPU6050, a popular and inexpensive MEMS IMU * Murata SCHA634-D03, a high-qua...
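The qualitative conclusion of this comparison can be reproduced in miniature: integrating gyroscope white noise shows how heading drift scales with IMU quality. The noise densities below are illustrative round numbers, not the actual datasheet values of the MPU6050 or SCHA634:

```python
import numpy as np

def heading_drift_deg(noise_density_dps, duration_s=3600.0, rate_hz=100, seed=0):
    """Integrate white gyro noise to get an angle-random-walk heading error.

    noise_density_dps: gyro white-noise density in deg/s/sqrt(Hz).
    The expected 1-sigma drift after time T is roughly noise_density * sqrt(T).
    """
    rng = np.random.default_rng(seed)
    n = int(duration_s * rate_hz)
    dt = 1.0 / rate_hz
    # Per-sample std for a discrete-time white-noise sequence at rate_hz
    samples = rng.normal(0.0, noise_density_dps * np.sqrt(rate_hz), n)
    return abs(np.sum(samples * dt))  # accumulated heading error in degrees

# Same random seed, 10x difference in noise density -> 10x the heading drift.
cheap_imu = heading_drift_deg(0.01)    # consumer-grade-like noise level
better_imu = heading_drift_deg(0.001)  # 10x quieter gyro
```

Pure gyro noise is only one of the error sources in VISLAM (bias instability and scale-factor errors also matter), but it illustrates why a quieter IMU translates into lower tracking drift.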
Spectacular AI SDK + NeRF
1.7K views · 2 years ago
Spectacular AI SDK mapping API outputs can be fed into various NeRF frameworks. Here is a demonstration with NVIDIA Instant NeRF that produces impressive 3D reconstructions in seconds. These examples do not need a separate (slow and fragile) COLMAP step; the mapping API outputs can be used directly as inputs for the NeRF. www.spectacularai.com/ #deeplearning #computervision #realsense #kinect
New SLAM postprocessing for large-scale visual mapping
1.2K views · 2 years ago
Demonstrating Spectacular AI SDK 1.4 and the new SLAM postprocessing for large-scale visual mapping www.spectacularai.com
Large-scale real-time mapping example using Azure Kinect
4.3K views · 2 years ago
Real-time mapping with OAK-D and RealSense
9K views · 2 years ago
Fast texturized reconstruction demo
1.5K views · 2 years ago
RGBD-Inertial mapping
1.9K views · 2 years ago
SLAM and relocalization tests on OAK-D & RealSense
9K views · 2 years ago
HybVIO: Pushing the limits of real-time visual-inertial odometry
7K views · 3 years ago
Spectacular AI SDK AR demo
1.3K views · 3 years ago
City-scale GNSS-VIO Augmented Reality
1.8K views · 3 years ago
Spatial AI demonstration
957 views · 3 years ago
Low-power real-time VIO (with RealSense comparison)
5K views · 3 years ago
Comparison to ARCore, ARKit and RealSense
3.8K views · 3 years ago
GPS-aided visual-inertial odometry on the OAK-D
4.1K views · 3 years ago
GNSS-VIO in bad weather - high-velocity tunnel
358 views · 3 years ago
GNSS-VIO in a tunnel (using ZED 2)
1.3K views · 3 years ago
Spatial AI in a WW1 fortification
683 views · 3 years ago

COMMENTS

  • @andreasthegreat8509 · 19 days ago

    I would like to have the API as just a SLAM program on Windows.

  • @slevinshafel9395 · 2 months ago

    Wow, this is so advanced. OK, not so powerful for indoors, tunnels, or caves, but close. With that accuracy, electronic warfare is more advanced than I thought. Good job. In terms of energy, it costs more, right?

  • @Orbbec3d · 2 months ago

    Big thanks for using our camera and making it look amazing!

  • @brdane · 3 months ago

    This is the type of accuracy test I have been looking for for these low-cost gyroscope/accelerometer ICs. This is the only test someone really needs to see before deciding on a purchase.

  • @sumitsarkar4517 · 3 months ago

    How do you correct the error relative to GPS at 0:40, and especially at 1:37?

  • @WenhangDONG · 4 months ago

    @SpectacularAI thank you for your contribution! I followed your steps, but after running "sai-cli process MY_INPUT_PATH MY_OUTPUT_PATH" I got an error: "mapping failed: no output generated". I tried three different videos from Spectacular Rec; my iPhone is a 15 Pro. I don't know what happened, could you please tell me?

  • @filmproducerpro · 5 months ago

    Hi! Please tell me, can the program work with the "Creality Scan Ferret Pro" 3D scanner to scan a room?

  • @bearyzhang · 6 months ago

    It is so similar to RTAB-Map.

  • @VorpalForceField · 7 months ago

    Impressive ...!!! Thank You for sharing .. Cheers :)

  • @pleabargain · 7 months ago

    Okay. It's cool but put your jargon into layman's terms. What is the significance of this flight path? You told the drone to fly around and not crash?

    • @김도녕-m3b · 4 months ago

      It's drone localization using vision; they show a comparison of red (GPS, ground truth) and blue (visual-inertial odometry, prediction) on real data.

  • @TheBarthew · 7 months ago

    Love it! Did you test it on a high altitude mission?

  • @khangletruong8196 · 8 months ago

    Very interesting! I wonder if there would be any chance to test/evaluate this with your SDK on our drone?

    • @SpectacularAI · 8 months ago

      Generally yes, as a commercial NRE/pilot project. Contact us at www.spectacularai.com/#contact for more details. Please include a high-level description of your hardware and use case.

  • @sudarshanpoudyal5089 · 8 months ago

    Will it be supported in the open-source SDK?

  • @malaysiastreetview · 8 months ago

    👍

  • @gaussiansplatsss · 9 months ago

    What iPhone did you use here?

  • @王彪钓 · 9 months ago

    Astra 2. It has the best 3D restoration capability, especially at corners.

  • @DongPham-kj9rb · 9 months ago

    Have you achieved colors for OAK cameras?!

  • @DongPham-kj9rb · 9 months ago

    Thank you

  • @gusstplt · 9 months ago

    I tried to use SpectacularAI, but after recording data from an Android smartphone, when I launch "sai-process" I get, on two different computers (one on Windows, a laptop, and one on Linux, a GPU server): SpectacularAI ERROR: x:441 Abandon (core dumped). I don't know what to do.

  • @uzaiftalpur · 10 months ago

    @SpectacularAI how can I process without an OAK-D? Is there any other way to use such devices?

  • @Mark001986 · 10 months ago

    Very nice! Is the lidar used for SLAM tracking and bundle adjustment?

  • @GroFilms · 11 months ago

    LOVE this presentation- awesome!!

  • @alessandro_valli · 11 months ago

    Which one would you recommend for indoor use, Astra 2 or Femto? I am interested in having the highest resolution possible. Thank you!

    • @SpectacularAI · 11 months ago

      Femto. It has the best depth sensor for indoor use

    • @michelesacco1638 · 10 months ago

      @@SpectacularAI Yes, but it can't see black surfaces.

    • @alessandro_valli · 9 months ago

      @@michelesacco1638 Really?

    • @Orbbec3d · 2 months ago

      Yes!!! Femto Mega!

  • @전종택-v3b · 11 months ago

    I thought these were the same, but they are a little different. If someone just wants 6-axis motion, it doesn't matter, does it? For VR it doesn't matter either. But if someone wants further improvements, this is a good thing. Anyway, this video is interesting ;)

  • @kamalakrishnan9427 · 11 months ago

    The link provided just directs us to this video, which tells us to install the requirements.txt. There is no requirements.txt in the GitHub repo. Help. @SpectacularAI

  • @Deadnature · 11 months ago

    Can this be done on a Mac Studio M2?

  • @engfernandolsf · 11 months ago

    How much is this equipment? Is it a budget solution for scanning a house or small company into a 3D model?

    • @Orbbec3d · 2 months ago

      Hello, thank you for your interest! You can consult the testimonials of bloggers such as Spectacular AI; the current price of the Mega is $695.

  • @OlgaLight13 · 11 months ago

    I can't seem to find the requirements.txt file, and when I do pip install spectacularAI it doesn't work. Please help!

    • @SpectacularAI · 11 months ago

      The most up-to-date instructions can be found here spectacularai.github.io/docs/sdk/tools/nerf.html (you should "pip install spectacularAI[full]")

    • @kamalakrishnan9427 · 11 months ago

      No, the link provided just directs us to this video, which tells us to install the requirements.txt. There is no requirements.txt in the GitHub repo. Help. @@SpectacularAI

    • @SpectacularAI · 11 months ago

      You're correct that the page links to this video, which has some outdated info. You can skip that part of the docs (no need to follow the full video at that point) and follow the rest of the instructions on the page. The requirements.txt has been removed. The new command you should run is "pip install spectacularAI[full]" (notice the "[full]"). You may have to uninstall (pip uninstall spectacularAI) first if you have an older version.

  • @naurk · 1 year ago

    VRAM required?

    • @SpectacularAI · 1 year ago

      The 3DGS models in this video were trained on an NVIDIA GeForce RTX 3080 Ti, which has 12 GB of VRAM.

  • @wrillywonka1320 · 1 year ago

    But we can't use Gaussian splats in any video-editing software. They are awesome but pretty much useless.

    • @SpectacularAI · 1 year ago

      Splats (or efficiently training them) are a new technology that did not exist a few months ago. The amount of software that supports them in some form will probably increase quite a lot in 2024, but you are right that the support is currently rather limited.

    • @cekuhnen · 9 months ago

      They are pretty useful because you can relight them and don’t need meshes and textures to render.

    • @wrillywonka1320 · 9 months ago

      @@cekuhnen That's true. Honestly, I love how detailed they are compared to meshes and how easy they are on computing power, but for someone who uses DaVinci and Blender there is really no way to utilize this great tech.

    • @cekuhnen · 9 months ago

      @@wrillywonka1320 yeah it has limited use only

  • @zyang056 · 1 year ago

    Is the iOS app source available for customization?

  • @SpectacularAI · 1 year ago

    iOS app and tutorial video released! youtube.com/watch?v=d77u-E96VVw

  • @prarthanahegde9819 · 1 year ago

    Hi, can we implement real-time mapping just like shown in the video with just a RealSense D455 camera? I don't have the OAK-D.

  • @JReinhoud · 1 year ago

    Please share settings of the Kinect and RTAB-Map. And did you use standalone RTAB-Map or ROS with RTAB-Map?

  • @JINLAI-c5r · 1 year ago

    I tried, but it shows "SpectacularAI WARN: VIO may be running too slow, data is being input too fast, or IMU samples are missing / time-offset from frames. (buffer size 10)". How do I solve this?

  • @jakesnake5534 · 1 year ago

    Nice

  • @tbuk8350 · 1 year ago

    Dang, ARCore is rock solid the whole time lol

  • @bernat4289 · 1 year ago

    Very impressive. What kind of hardware configuration is used to integrate RTK data?

  • @yiboliang8338 · 1 year ago

    Cool. Now I am just going to buy a $40 second-hand screen-cracked Sharp R5G with a ToF sensor as the depth camera and the computing board for my project to run ARCore.

  • @metalthower · 1 year ago

    I would like to do this with a trail mapping bike - outdoors. Is this possible with an OAK-D-Pro?

  • @seble_pikachu3732 · 1 year ago

    It's really interesting to compare the results you can get in a VISLAM with a 2 euro IMU and a 150 euro IMU. Depending on your means and desired performance, you can easily choose one IMU or the other thanks to this comparison.

  • @kimsanmaro · 1 year ago

    Can I get more information about the GPS/ZED fusion? Or any GitHub repo, anything about the fusion? Please help, I would like more detail...

  • @liangzijian4452 · 1 year ago

    Great work! 👍I've been following SpectacularAI's sdk for a long time and have tried to apply oak's examples to my drones. But the results would drift badly. The camera I'm using is also oak-d-pro-w, do I need imu calibration, or some other action to achieve the results in the video?

    • @SpectacularAI · 1 year ago

      This has been improved significantly in the latest SDK versions. Depending on your hardware and, e.g., the level of vibration noise in the drone, the performance may also benefit from use-case-specific parameter tuning, which is available as a commercial service.

  • @seble_pikachu3732 · 1 year ago

    Was the Raspberry Pi Pico MCU just there to interface with the camera? Or was there an algorithm on it? Thanks and congratulations on the results!

    • @SpectacularAI · 1 year ago

      The MCU currently reads the IMU and triggers the camera, which also causes the IMU and camera timestamps to be accurately synchronized in the same monotonic clock. It is not strictly necessary to use one but it is a technically convenient choice.
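Because the IMU and the camera share the MCU's monotonic clock, as described in the reply above, associating the two data streams reduces to a nearest-timestamp lookup. A minimal illustrative sketch (the data layout and rates below are hypothetical, not the actual device firmware):

```python
import bisect

def nearest_imu_index(imu_timestamps, frame_timestamp):
    """Return the index of the IMU sample closest in time to a camera frame.

    Valid because both streams are stamped with the same monotonic clock;
    imu_timestamps must be sorted in ascending order.
    """
    i = bisect.bisect_left(imu_timestamps, frame_timestamp)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(imu_timestamps)]
    return min(candidates, key=lambda j: abs(imu_timestamps[j] - frame_timestamp))

# IMU sampled at 200 Hz (5 ms period); a camera frame arrives at t = 0.1234 s.
imu_t = [k * 0.005 for k in range(100)]
print(nearest_imu_index(imu_t, 0.1234))  # 25 (the sample at t = 0.125 s)
```

Without a shared clock, the same lookup would first require estimating and compensating a time offset between the streams, which is exactly what hardware triggering avoids.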

  • @IzotopShurup · 1 year ago

    Did you add support for armhf?

  • @imignap · 2 years ago

    Very cool! Fusing the optical data with the IMU to dead reckon. Wish that Murata was cheaper... Just wondering, why no 3D magnetometer?

  • @matspg · 2 years ago

    Thanks for posting this - I have a question: in the first test, the light grey is ground truth. The red and the blue tracks - are those from the IMU *alone*, or are they from SLAM with the corresponding IMU? (If they're from the IMU alone, that's incredible to me... way better than I'd expect.) Thanks!

    • @SpectacularAI · 2 years ago

      They are computed using visual-inertial SLAM, where the image data is the same for both tracks but the IMU is different

  • @sudarshanpoudyal5089 · 2 years ago

    Can you make the hardware setup open source? I am currently unable to find a low-cost visual-inertial sensor setup.