LUCID Vision Labs
Powering a Machine Vision Camera
This is a quick beginner's video showing the different ways to power a machine vision camera.
0:00 Intro
0:13 PoE using Network Interface Card
0:55 PoE using a Network Switch
1:39 PoE using a PoE Injector
2:05 Using External Power via the GPIO port
2:43 PoE Standards that LUCID cameras support
Industrial Machine Vision Cameras thinklucid.com
© LUCID Vision Labs
#MachineVisionCamera #IndustrialCamera #LUCIDVisionLabs #MachineVision
Views: 144

Videos

Triton2 EVS: Event-Based Machine Vision Camera
1.9K views · 2 months ago
LUCID's Triton2 EVS camera features the Sony / Prophesee 0.9MP IMX636 and 0.3MP IMX637. These cameras provide event data instead of frame-based data. For event-based sensors, each pixel intelligently activates itself depending on the contrast change (movement) it detects. This enables the acquisition of only essential motion information, continuously. There is no framerate anymore. Triton2 EVS...
Wafer Inspection using Triton2 SWIR - 2.5GigE camera with 5.2MP Sony IMX992 SenSWIR Sensor
2.3K views · 3 months ago
This video showcases LUCID Vision Labs' Triton2 - 2.5 GigE SWIR camera featuring Sony's 5.2 MP Sony IMX992 SenSWIR™ sensor. Watch how we inspect different wafers and see how this second-generation SWIR sensor compares to Sony's first-generation SenSWIR sensor. Triton2 SWIR with 5.2MP IMX992 thinklucid.com/product/triton2-swir-5-2mp-model-imx992/ Triton2 SWIR with 3.2MP IMX993 thinklucid.com/pro...
Time of Flight 3D Camera Comparison: Helios2 Wide vs Helios2 point cloud
199 views · 3 months ago
Here is a quick side by side comparison showing the difference in area that can be 3D imaged by Helios2 Wide (FoV: 108° x 78°) vs Helios2 (FoV: 69° x 51°) Helios2 Wide: thinklucid.com/product/helios2-wide-time-of-flight-tof-ip67-3d-camera/ Learn more about LUCID's 3D Cameras at thinklucid.com/helios-time-of-flight-tof-camera/
IEEE-1588 PTP (Precision Time Protocol) JupyterLab Notebook for LUCID Cameras
901 views · 4 months ago
Download PTP Notebook: thinklucid.com/jupyterlab-resource-center/ PTP and Action Commands: support.thinklucid.com/app-note-multi-camera-synchronization-using-ptp-and-scheduled-action-commands/ Intro to PTP: support.thinklucid.com/knowledgebase/precision-time-protocol-ptp/ PTPSync and Bandwidth Sharing in Multi-Camera Systems support.thinklucid.com/app-note-bandwidth-sharing-in-multi-camera-syst...
Visionary Machine's 3D Spatial Imaging System using LUCID Vision Labs Triton IP67 Industrial Cameras
799 views · 6 months ago
The collaboration between Visionary Machines and LUCID exemplifies how innovative sensor technologies can revolutionize machine perception, providing solutions that surpass the limitations of existing modalities. Pandion™’s real-time 3D perception emerges as a game-changer in spatial awareness, setting new standards for efficiency and safety in various industries. Visionary Machines - visionary...
3D Time-of-Flight (ToF) Cameras: Helios2 3D Cameras Compared
2.5K views · 6 months ago
In this video we give an overview of the similarities and differences of the Helios2 Time-of-Flight (ToF) 3D cameras. The Helios2 3D cameras are perfect for automated assembly, bin picking, inspection, mobile robotics, position and location identification, and recognition and object classifications. 0:00 Helios2 Models 0:31 Model Similarities 1:41 Distance Modes 1:56 Filtering Controls and Exam...
AI Controlled Autonomous, Electric Passenger Ferry Uses Triton HDR Cameras
735 views · 7 months ago
Read the full case study at thinklucid.com/case-studies/autonomous-passenger-ferry-uses-triton-hdr-cameras/ Zeabuz, in collaboration with the ferry operator Torghatten, successfully launched the world’s first commercial autonomous ferry in Stockholm in the summer of 2023. Zeabuz employs a comprehensive suite of sensors, including cameras, LiDAR, Radar, and AIS. These sensors provide input to ob...
SWIR Camera Coffee Bean Inspection - Tutorial: SWIR Camera, Lens, Lights
2.1K views · 10 months ago
Standard Camera vs Triton HDR Camera with AltaView Tone Mapping
649 views · 11 months ago
Photonics Media Webinar: RDMA for High-Speed Cameras. Optimal Image Transfer Over 10GigE
531 views · 1 year ago
LUCID Vision Labs at CVPR 2023
778 views · 1 year ago
MVPro Media's Matt Williams speaks to LUCID's Torsten Wiesinger (EMEA Region)
340 views · 1 year ago
HDR Tone Mapping on the Camera: LUCID's AltaView On-Camera Engine
1.4K views · 1 year ago
Sneak Peek: RDMA for 10GigE Cameras
1.3K views · 1 year ago
Roboshin - Multi-stack pick up gripper and arm using Helios2
887 views · 1 year ago
LUCID SWIR and UV Cameras - inVISION Webinar: Spectral Imaging Presentation
705 views · 1 year ago
Conventional Camera vs HDR camera (Triton HDR with Sony 5.4MP IMX490 CMOS)
1.5K views · 1 year ago
Sony SWIR Sensors & TEC Cooling: Atlas SWIR and Triton SWIR Cameras
1.6K views · 1 year ago
2.5 Gigabit Ethernet (2.5GigE, 2.5GbE) Triton2 Industrial Camera
2K views · 1 year ago
2.5GigE, Event-Based, 10GigE with RDMA, 25GigE Industrial Cameras Coming 2023
791 views · 1 year ago
VISION 2022 Presentation: Advantages of JupyterLab for Machine Vision
401 views · 1 year ago
Lens Mounts Compared: Phoenix Machine Vision Camera
1.3K views · 2 years ago
TensorFlow Object Detection Jupyter Notebook - JupyterLab in ArenaView Tutorial
3.8K views · 2 years ago
Atlas10 (10GigE Machine Vision Camera) - Sony 47MP IMX492 Rolling Shutter Sensor Overview
2.4K views · 2 years ago
HDR Imaging for Automotive Sensing Applications - Sony IMX490 CMOS Sensor
3.5K views · 2 years ago
Jupyterlab for Machine Vision Cameras in LUCID ArenaView (with Barcode Reader Example)
3K views · 2 years ago
LUCID Triton Edge - AMD Xilinx MPSoC Embedded Vision Camera - On-Demand Webinar
735 views · 2 years ago
Atlas 5GigE (5GBASE-T) Machine Vision Camera
12K views · 2 years ago
TCP for 10GigE Machine Vision Cameras: Reliable Image Transfer for the Atlas10
970 views · 2 years ago

COMMENTS

  • @OccamBurton-m6t
    @OccamBurton-m6t 4 days ago

    Greenholt Extension

  • @unclesamautos
    @unclesamautos 4 days ago

    I keep saying, configure these cameras for microscopes and grab market share. Use the IMX585.

  • @ferminsalcedojr.2122
    @ferminsalcedojr.2122 22 days ago

    😊

  • @ConsultingjoeOnline
    @ConsultingjoeOnline 28 days ago

    Very cool. Thanks for the info!

  • @7039jonas
    @7039jonas 1 month ago

    Hi, what is the name of the rods and accessories you are using for holding the light in place? 😅

  • @user-fy2ho1us5w
    @user-fy2ho1us5w 1 month ago

    May I know the cost

  • @dileepk4024
    @dileepk4024 1 month ago

    Very informative video! What's the type of wafer you used? Where can I possibly get this from? Thanks!

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 1 month ago

      You can find all sorts of different sizes of wafers online, etched or unetched, from places such as eBay or Alibaba.

  • @likithgannarapu8454
    @likithgannarapu8454 2 months ago

    Hi, I am using a 2-camera setup with Lucid Triton064S cameras. To implement PTPSync I used the code provided in the JupyterLab-Resource-Center, but I am facing the following two issues when setting AcquisitionStartMode and PTPSyncFrameRate:

    Error 1:
    c.device.nodemap.get_node('AcquisitionStartMode').value = "PTPSync"
    ValueError: 'AcquisitionStartMode' node does not exist in this nodemap (some suggestions): ['AcquisitionStart', 'AcquisitionMode', 'AcquisitionStop', 'AcquisitionFrameRate', 'AcquisitionFrameCount', 'AcquisitionControl', 'AcquisitionLineRate', 'AcquisitionBurstFrameCount', 'ActionUnconditionalMode', 'DecimationVerticalMode', 'AcquisitionFrameRateEnable', 'DecimationHorizontalMode', 'BinningHorizontalMode', 'ActionGroupKey', 'LineActivationVoltage', 'SequencerMode']

    Error 2:
    c.device.nodemap.get_node('PTPSyncFrameRate').value = FRAME_RATE
    ValueError: 'PTPSyncFrameRate' node does not exist in this nodemap (some suggestions): ['TransmissionFrameRate', 'AcquisitionFrameRate', 'AcquisitionFrameRateEnable', 'GevPAUSEFrameReception', 'FirmwareUpdate', 'PixelDynamicRangeMin', 'PixelDynamicRangeMax', 'GevSCPSDoNotFragment', 'BalanceRatio', 'TransferPause', 'AwbStatsFrameCount', 'TriggerLatency', 'TransferStatus', 'SerialBaudRate', 'PtpOffsetFromMaster', 'GevPAUSEFrameTransmission']

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 2 months ago

      Hi! Please email this information to support(at)thinklucid.com
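
      For readers hitting the same errors: node availability varies by camera model and firmware, and (as the tracebacks above show) the nodemap raises ValueError for unknown node names. A hypothetical defensive helper, reusing the `nodemap.get_node` pattern from the comment:

```python
def set_node_if_present(nodemap, name, value):
    """Set a GenICam node's value if the camera's nodemap exposes it.

    Hypothetical helper: PTPSync-related nodes such as
    'AcquisitionStartMode' are not present on every model/firmware,
    and get_node() raises ValueError for unknown node names.
    """
    try:
        node = nodemap.get_node(name)
    except ValueError:
        return False  # node absent on this camera; caller can fall back
    node.value = value
    return True
```

      Callers can then fall back to free-running acquisition when the helper returns False instead of crashing on the ValueError.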

  • @haves_
    @haves_ 2 months ago

    We got predator dinosaur vision before GTA 6.

  • @johnwick9935
    @johnwick9935 2 months ago

    4 minutes of information. 👍🏼

  • @t.hartig4570
    @t.hartig4570 2 months ago

    Don't get it... Couldn't you do this processing with software?

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 2 months ago

      No, because then you're back to processing image frames from a standard camera. Event-based vision doesn't send image frames and doesn't have an FPS. The data stream dynamically changes based on the number of pixels detecting brightness changes.

    • @shApYT
      @shApYT 2 months ago

      What you see on the monitor is what you get without processing. You get motion information for free.
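
      The frame-free stream described in these replies is often visualized by accumulating events into an image. A minimal numpy sketch, assuming events arrive as (x, y, polarity) tuples (a real EVS stream also carries a timestamp per event):

```python
import numpy as np

def events_to_frame(events, width, height):
    """Accumulate an event stream into a signed image for display.

    events: iterable of (x, y, polarity) with polarity in {+1, -1},
    a simplified stand-in for the (x, y, t, p) stream an EVS emits.
    """
    frame = np.zeros((height, width), dtype=np.int32)
    for x, y, p in events:
        frame[y, x] += p  # brighter where ON events dominate
    return frame
```

      Rendering such an accumulation over a short time window is essentially what the live viewer shows.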

  • @liltnt380
    @liltnt380 2 months ago

    You guys are dope!🎉😮

  • @tegar070990
    @tegar070990 2 months ago

    do you have example code real-time point cloud stream?

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 2 months ago

      Example codes can be accessed by downloading our Arena SDK from our website.

  • @MrinaliniSingh-ov9xi
    @MrinaliniSingh-ov9xi 3 months ago

    Hi, can you comment on the Triton2 12.3 MP model (IMX304) camera?

  • @sonunaku7715
    @sonunaku7715 3 months ago

    Please make a video on integrating it with Python using the SDK.

  • @deplorablesecuritydevices
    @deplorablesecuritydevices 3 months ago

    These comparison videos are invaluable, keep it up!

  • @bernsbuenaobra473
    @bernsbuenaobra473 5 months ago

    My interest is the detection of bruising in fresh fruit. The spectral band of the camera includes 900nm, and custom lighting in these bands will greatly matter in our fresh fruit work. This could be a possibility for fresh-fruit outgoing inspection. We need to contact the supplier in the ASEAN region, if any, and enter into a proof of concept. Is there a demo you can show using fresh fruit?

  • @Sun_Rise4
    @Sun_Rise4 7 months ago

    Is it possible to upload a video on how to calculate distance using Python for the Helios2?

  • @AndiMi1972
    @AndiMi1972 7 months ago

    Hi, the code thing at the end of the clip looks like what we are looking for. Can it be set up so that the full-frame view with the camera image is displayed by firing up the computer, or by opening a link from the desktop? We are looking for a camera view on a flatscreen as a window replacement.

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 7 months ago

      Yes, this is possible. You can set the program's window size to specific dimensions or full screen. The code at the end is a JupyterLab Notebook for our camera viewer software (ArenaView). Our full Arena SDK also provides other APIs to create custom camera software for your application.

  • @GopalDas-qs7kc
    @GopalDas-qs7kc 7 months ago

    Thank you. could you please show an example of combining RGB camera with Helios?

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 7 months ago

      Here are two 3D models with RGB+3D sketchfab.com/3d-models/assorted-objects-helios2triton-cameras-73d4ef94789c41e8a99bf40fdbb7b72b sketchfab.com/3d-models/laundry-detergent-bottles-helios2triton-750769440e8c47edb18995ca1f008feb

  • @GopalDas-qs7kc
    @GopalDas-qs7kc 7 months ago

    Do you have an example of showing the overlaying of color camera with the TOF camera? I appreciate your input

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 7 months ago

      Here are two 3D models with RGB+3D sketchfab.com/3d-models/assorted-objects-helios2triton-cameras-73d4ef94789c41e8a99bf40fdbb7b72b sketchfab.com/3d-models/laundry-detergent-bottles-helios2triton-750769440e8c47edb18995ca1f008feb

  • @trondJohansen-yb4eh
    @trondJohansen-yb4eh 7 months ago

    I need a ToF camera below 850nm. Why can't I find any product for my application 😔

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 7 months ago

      That's because any VCSELs below 850nm will conflict with many types of indoor lighting and ambient light. 850nm is used because it is outside the visible spectrum. ToF cameras have to filter out ambient light, and a lower wavelength would end up getting filtered out too.

  • @gesmar
    @gesmar 7 months ago

    What's the price of the camera? I can't find the information anywhere.

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 7 months ago

      Atlas SWIR 1.3MP with TEC: $14,500
      Atlas SWIR 0.3MP with TEC: $9,950
      Triton SWIR 1.3MP without TEC: $10,950
      Triton SWIR 0.3MP without TEC: $6,950
      (all in USD)

    • @bg7293
      @bg7293 5 months ago

      You don't have to buy a camera as expensive as $10,000; every camera has some infrared sensitivity. You just need to remove the infrared-cut filter and put in an infrared-pass filter.

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 5 months ago

      @@bg7293 This is only a little bit true. If you remove the IR filter on a standard color sensor, you get increased sensitivity from 700nm to ~900nm, with sensitivity falling off to near 0% QE around 1000nm. These SWIR cameras provide excellent sensitivity far above that, up to 1650nm.

  • @user-eq6bb7ct8d
    @user-eq6bb7ct8d 8 months ago

    Great video. Love your products, and good to see you diving into embedded systems. BTW, could you share the slides with us? Thanks!

  • @Richard_Broom_Photography
    @Richard_Broom_Photography 9 months ago

    Very useful. I would like to hear more about Lucid cameras.

  • @SireComplexity
    @SireComplexity 9 months ago

    Which assembly frame did you use?

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 9 months ago

      Check out Bosch Rexroth aluminum profiles

  • @tg5190
    @tg5190 9 months ago

    Finally.... I no longer have to drink plastic coffee....

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 9 months ago

      Fun fact! Did you know that the global coffee beans economy was valued at around $31.93 billion in 2022 and is expected to grow at a CAGR of 6.8% from 2023 to 2029, reaching a value of $50.61 billion? Detecting foreign substances in coffee beans is a serious matter! There are lots of opportunities for plastic bits (and other bad guys) to fall into bean harvests and SWIR cameras are used to detect and remove them!

  • @tg5190
    @tg5190 9 months ago

    Can this detect a jelly belly in a bowl of raisins? I've been trying to figure this one out for years and finally it looks like I might just be able to do that.

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 9 months ago

      There is a good chance it could! SWIR cameras are really good at detecting the water content in small objects. If the water content differs between the jelly bellies and the raisins, there is a good chance that one would appear darker (or lighter) than the other, irrespective of jelly belly color.

  • @Fatherblahaj
    @Fatherblahaj 9 months ago

    I’m never ever drinking coffee again

  • @tonmoysarker3447
    @tonmoysarker3447 10 months ago

    Very informative.

  • @bra1nsen
    @bra1nsen 10 months ago

    Costs?

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 10 months ago

      USD $6,950.00 (Triton 0.3MP SWIR) up to USD $14,500.00 (Atlas 1.3MP SWIR)

  • @olvintful
    @olvintful 11 months ago

    Is a 10GigE network card compatible with 5GigE cameras? What speed will be maintained?

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 11 months ago

      It depends; some 10GigE cards may not support NBASE-T speeds (5 & 2.5GigE). If it does support NBASE-T speeds, the 5GigE camera will run at 5GigE speeds (~600MB/s). Please check your 10GigE card's datasheet or website for more info.
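
      The ~600MB/s figure is straightforward arithmetic: 5 Gbit/s is 625 MB/s raw, minus protocol overhead. A sketch, where the 0.96 efficiency factor is an assumed allowance for Ethernet/GVSP overhead (actual overhead depends on packet size; jumbo frames help):

```python
def usable_throughput_mb_s(link_rate_gbps, efficiency=0.96):
    """Approximate usable image bandwidth of an Ethernet link in MB/s.

    efficiency is an assumed protocol-overhead factor, not a measured
    value; 1 MB = 1e6 bytes here.
    """
    return link_rate_gbps * 1e9 / 8 * efficiency / 1e6

# 5GigE lands near the ~600 MB/s quoted above; 2.5GigE near 300 MB/s
```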

  • @TimofeyUvarov
    @TimofeyUvarov 11 months ago

    Good job with the 490. Good dynamic range gain between the two systems, though the different white balance makes it harder to focus on the actual improvement. It would be interesting if you compared it with a same-generation sensor from onsemi, such as the ar0323.

  • @a.a.patrick
    @a.a.patrick 11 months ago

    Hi, nice camera collection. Please, do the cameras support or provide raw uncompressed pixel-by-pixel data? I need a camera, but I need to access its raw pixel data. Also, which of these best fits the range from mid-IR (1000nm) to near-UV (300nm)?

  • @user-wt6sl2rw2h
    @user-wt6sl2rw2h 1 year ago

    You mentioned corrupted frames/lines at 1:50. I have a very strong PC with an Intel i9 12k, 64GB RAM, and a 1GigE Cognex 4K camera, and in my high-speed application I sometimes get corrupted images. On the next image there is a part of the old image. I run with only one camera, and tried 4 different PCs and 2 different cameras. Using the Basler GigE Vision driver. In the Pylon software I can see it when using the Windows Performance filter. Jumbo packets 9K... tried everything but can't locate what causes the errors. No matter the image size, no matter the FPS (10 or 150), it still happens. Help pls

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 1 year ago

      Sorry to hear about your issues. Are you sure the Basler GigE driver is the proper driver for your Cognex cameras? If you feel you've done everything correctly, the only thing you didn't mention testing is your cables. Were you using the same cables when you tried different computers? Poor-quality cables can absolutely cause frame corruption and are often overlooked as a source of issues.

    • @user-wt6sl2rw2h
      @user-wt6sl2rw2h 1 year ago

      @@LUCIDVisionLabs Hi, thanks for the suggestions. Yes, I've tested it with different cables. I've found the probable issue, but cannot solve it. I am using the Halcon library with C# code. When I run my application (WinForms), it starts corrupting frames when I try to move my WinForms window or somehow interact with the PC or Windows (probable CPU thread starvation?). When I run the same code only in Halcon in HDevelop, it does not corrupt frames; all is OK. But the PC has top hardware and I still cannot figure out what is causing those corruptions. If the CPU freezes, why are frames corrupted, with image lines populated from a different image? Strange. Thanks for your assistance

  • @granatapfel6661
    @granatapfel6661 1 year ago

    Tf1?

  • @prasad94
    @prasad94 1 year ago

    can this camera work with raspberry pi?

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 1 year ago

      Yes! We have tried a total of 4 cameras on the Raspberry Pi, (1 via the Ethernet connection + 3 more cameras with an Ethernet adapter on a USB2 port). The 3 cameras using the Ethernet-to-USB2 adapter have limited bandwidth however.

    • @zaidparkar8810
      @zaidparkar8810 10 months ago

      @@LUCIDVisionLabs Is this running the Ubuntu for ARM version of ArenaView? Do you have any more resources for running this on a RPi?

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 10 months ago

      @@zaidparkar8810 Unfortunately we don't have an ArenaView GUI yet for Ubuntu. We just have our SDK (APIs) for ARM Ubuntu 18.04/20.04/22.04, 64-bit

  • @PaulJurczak
    @PaulJurczak 1 year ago

    0:13 The distortion looks strange. Assume he is bouncing the ball at 0.5 m amplitude, 2 bounces per second. At the apex, the ball is not moving faster than about 2 m/s. During 62.5 us exposure, the ball travels 0.125 mm, which is practically perfectly still given the pixel size at this target distance. Why the distortion, then? Image Accumulation is off, so I'm assuming that each single frame produces a distinct point cloud.

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 1 year ago

      This is because 1 point cloud frame is derived from 4 phases (4 micro frames). If the object is moving too fast and changing positions for each micro-frame it will become distorted. More info here: thinklucid.com/tech-briefs/sony-depthsense-how-it-works/

    • @PaulJurczak
      @PaulJurczak 1 year ago

      @@LUCIDVisionLabs That page states: "It is only necessary to have the 0° and 90° micro-frames to calculate depth.". Do you provide 2 micro frames mode?

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 1 year ago

      @@PaulJurczak Unfortunately we don't. Our Normal mode uses all 4 micro-frames (2 for distance calculation, and the other 2 to refine that calculation). In High-Speed mode we use only 1 micro-frame (provides much less distortion, higher FPS, but at the expense of accuracy and distance range)

    • @PaulJurczak
      @PaulJurczak 1 year ago

      @@LUCIDVisionLabs What is the total time of acquiring 4 micro-frames at 62.5 us exposure?

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 1 year ago

      The total time for the 4 micro-frames, ignoring the idle period (grey part) is about 12.4ms, or 12400us with 62us exposure. The readout + reset time (blue and green parts) is just under 3ms for each of the 4 phases. (This comment is referring to the 1 frame diagram in the Sony DepthSense article.)
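
      The four-phase scheme discussed in this thread follows textbook continuous-wave ToF demodulation. An illustrative sketch, not LUCID's exact pipeline (per the reply above, Normal mode uses the extra two micro-frames to refine the result):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def itof_depth(a0, a90, a180, a270, f_mod_hz):
    """Depth from four phase-offset micro-frame measurements.

    Textbook CW-ToF demodulation (illustrative):
        phase = atan2(a270 - a90, a0 - a180)
        depth = c * phase / (4 * pi * f_mod)
    The unambiguous range is c / (2 * f_mod).
    """
    phase = math.atan2(a270 - a90, a0 - a180) % (2 * math.pi)
    return C * phase / (4 * math.pi * f_mod_hz)
```

      Because each micro-frame samples the scene at a different moment, an object that moves between the four samples corrupts the recovered phase, which is the distortion discussed at the top of this thread.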

  • @galyehuda2
    @galyehuda2 1 year ago

    Great video as always! Can you estimate how much power the tone mapping requires?

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 1 year ago

      Hi Gal, the TDR054S with AltaView has a power consumption of 3.8W via PoE and 3.2W via GPIO. So it takes a tad more than our Triton camera average, which is around 3.5W via PoE and 3.1W via GPIO.

  • @etiennebolduc838
    @etiennebolduc838 1 year ago

    Thanks for this video I feel like I'm an expert now

  • @unclesamautos
    @unclesamautos 1 year ago

    Can I live stream with OBS and get a similar result to the guy welding?

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 1 year ago

      It is possible. My understanding is that OBS will stream whatever is on your monitor, full screen or windowed. So you could build a camera viewer program that's doing HDR processing using our Arena SDK APIs, then launch that program and then use OBS to stream that program.

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 1 year ago

      FYI we don't support DirectShow.

  • @randomTake48
    @randomTake48 1 year ago

    Awesome explanation

  • @pulkitsharmapremiumvideos9252

    What about SWIR skin penetration?

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 1 year ago

      SWIR light can penetrate into deeper levels of skin tissue.

  • @pulkitsharmapremiumvideos9252

    But it's not going through the apples at longer wavelengths!

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 1 year ago

      Shortwave infrared light does not go completely through the apples.

    • @pulkitsharmapremiumvideos9252
      @pulkitsharmapremiumvideos9252 1 year ago

      @@LUCIDVisionLabs But many studies have reported SWIR light going through bone (skull) and reaching the brain for therapy applications. So it should penetrate apples also.

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 1 year ago

      @@pulkitsharmapremiumvideos9252 We don't sell SWIR lighting, so I can't comment on which SWIR lights were used in those tests. LUCID's Atlas and Triton SWIR cameras do not emit SWIR light; they are only sensitive to it. So it is important to do your own testing with different lighting and optics with our camera to maximize the performance of your specific application.

  • @zombieregime
    @zombieregime 1 year ago

    Literally everything EXCEPT how an indirect ToF image sensor works..... Direct single-point ToF and static point cloud depth sensing make sense to me, but this real-time ToF across an entire image frame without capturing frames in the picosecond range, I'm completely lost. I thought LUCID could help... but I guess I was wrong... [walks away, sad, in the rain]

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 1 year ago

      There is a delay between the light sending and returning and the camera calculating the point clouds. The delay is around 10 - 12ms.

  • @amandaye1891
    @amandaye1891 1 year ago

    Very detailed information, thanks !

  • @OliverBatchelor
    @OliverBatchelor 2 years ago

    The camera doesn't do tone mapping onboard I assume?!

    • @LUCIDVisionLabs
      @LUCIDVisionLabs 2 years ago

      Correct, at the moment the camera can send 24-bit RAW image data to the PC. Then you can apply your tone mapping algorithm running on the host PC.
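
      Host-side processing of that 24-bit RAW stream can be as simple as a global tone curve. A minimal numpy sketch using the Reinhard operator; the operator and the 0.18 mid-grey key are illustrative choices, not the camera vendor's algorithm:

```python
import numpy as np

def tonemap_reinhard(raw, bit_depth=24, key=0.18):
    """Map linear HDR intensities (e.g. 24-bit RAW) to 8-bit for display.

    Global Reinhard curve x / (x + key): compresses highlights while
    keeping shadows roughly linear. Illustrative only.
    """
    x = raw.astype(np.float64) / (2 ** bit_depth - 1)  # normalize to [0, 1]
    y = x / (x + key)
    # rescale so full-scale input maps exactly to 255
    return np.round(y * (1.0 + key) * 255).astype(np.uint8)
```

      A production pipeline would typically add local contrast handling, but this shows where the 24-bit-to-8-bit decision lives once the data reaches the PC.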

    • @OliverBatchelor
      @OliverBatchelor 2 years ago

      @@LUCIDVisionLabs It's very cool. We're doing 3D reconstruction scanning along orchards/vineyards with pretty high optical flow, so I think the rolling shutter is a problem. Do you know how fast the readout is top-to-bottom? (Does that have a proper name)?

  • @sakirkandemir9657
    @sakirkandemir9657 2 years ago

    ''' THANKS'' SUPERRRRRRR....

  • @sakirkandemir9657
    @sakirkandemir9657 2 years ago

    ''' THANKS..........!

  • @letsbeadult
    @letsbeadult 2 years ago

    Too difficult to get a quote, went elsewhere