Enjoy the new episode! Also! Subscribe and stay tuned for May 25 where I tell the story about the Microsoft Zune! I already ordered 3 and can't wait to show them to you. 🔔 P.S. Subtitles are now available!
i hear if you lock 3 Zunes in a small room together, the resulting levels of digital depression cause at least 2 of the Zunes to go on a murder\suicide spree.
I still have a functioning Zune HD and break it out when I'm feeling nostalgic. I used to still add songs on it up until about 2019. Not newer songs, per se, just new to me 😅 I got it because my job at the time allowed us to listen to music when working, so long as it didn't have a camera attached and that was possible on the Zune. While everyone else was trying to hide their phones, I could proudly (yes, proudly! 😂) display my new Zune and not have to worry about anything.
One of the best jobs I've ever held was working at Lytro, developing technical explainer content when we had transitioned to volumetric video capture for VR distribution. Our final camera just prior to Google's acquiring us was an impressive spherical rig with ~90 individual cameras. Looked like something from the Marix. Good work on your YT video. Reach out if you want hear more.
I had the original, nearly bought an Illum, but the company had refused to release an SDK or software for Linux, which put them behind the back burner for me. By the time an SDK was released, it was far too late for the company.
@@spvillano I wrote that python manual :) based on engineering input, and yeah, the project was approved well after plans had been laid to pull the plug on consumer light field projects.
I remember digging footage on Vimeo for the volumetric video camera. It was crazy. I remember the quantity of data to process and store was long blowing
What's your take on company's mistakes in marketing? I think that they just missed their main customer. Low resolution + autofocus = perfect for non-staged videos! In cinema you know where to focus - it is written in director's notes or wherever. You always know where the point of interest is, so you have no need to refocus afterwards. But in wedding recordings, for instance, there is a lot of improvisation. And you would spend some time in post-proccesing anyway. But if operator failed to focus at the right time, than no amount of processing will help. Also, not needing to adjust aperture while filming is a good bonus too.
I loved my Lytros too, but awesome is an awfully strong word for them. They were well made, had some nifty features. They would have been great cameras, if the image quality wasn't hot garbage.
The camera I'm most disappointed because it failed in the market was the Light L16. There's so much to love about that camera. The only reason I didn't buy one is because it didn't take video clips. So I was holding out for one that did.
same here apparently theres a new update one of them was recording but its in beta i saw sum people testing it out but i cant decide if i want the illum or the L16
Before lytro, there was no prosumer level light field technology. Now, it's a whole industry with various technological offshoots. Sadly, that happens sometimes, but that's quite the achievement, and frankly, the entire point of the investment in the first place.
yup, the author of this vid was ignorant enough to NOT research into the actual reasons of Lytro closure and just substituted it with his speculations, he called the proof-of-concept demo model a "product" and criticized it as if it was sold in bulks and as if it was sold for profit, he got no clue of why largest sensors available back then could not produce sufficient light-field resolution, he never played with Illum, he neither ever watched Immerge videos in VR. He also has no clue those "400GB per 1 second" were processed by the Cinema in real-time out of the box...
@@manofsan It does, Corridor Crew did a whole video on modern light field technology in production, NeRFs are the evolution of the concept. It works better than 3D scanning for many uses, because it also captures reflections correctly, which 3D scanning can’t.
I'm happy to have played a role in the making of this episode! Lytro was a pretty interesting story to research, and I'm glad I was given the chance to check it out. :) Also, as always, excellent work Ken!
I just took a computer vision course as part of my graduate degree. The amount of image transformation to stitch all of the sub images together, that quickly, is impressive.
It's really a niche use, though. AF has gotten significantly smarter in that time, machine learning will likely improve that even more overtime. In order to get that particular effect, you need a much more complicated sensor array than what you need for a regular camera. The main market for it would be people who wound up using cellphones and the other areas are photographers t that can afford cameras with better AF and high enough frame rates that being off a couple millimeters isn't an issue. Even Foveon that had a much more practical basis for its technology went bankrupt years ago. In their case, stacking photosites would have been a real benefit to the field. Unfortunately, Foveon was never able to figure out how to get the noise levels low enough to get enough of an advantage over conventional sensors to survive.
As a keen hobbyist photographer I remember it was my non-photographer friends who got most excited about Lytro. Post-Focus is a nifty gimmick, but for any semi-serious photographer, deciding on how to place focus is part of basic composition and developing the skill-set to do that is part of the fun. If Post-Focus could have been built into existing camera systems as an additional feature, great, but buying into a new camera system where all the features are kind of handicapped to permit Post-Focus felt too much like a solution in search of a problem.
Totally agree with you, but I think street photographers, lomo and pinhole/toy camera enthusiasts would have loved it if it had some better specs, price, and ergonomics. I can totally see the benefit of having a compact camera that you can use to snap a pic discreetly and quickly without fiddling with settings and worry about the details later. I do believe that they could have succeeded if they found their niche. I feel like they were always targeting the wrong demographics and in the end, they never made a product that made any one group satisfied.
Yup, Ren Ng: "Autofocus sucks." Every photographer: "Use manual focus." Ren Ng: "I suck at manual focus." Every photographer: "Not my problem." Ren Ng: "Buy my camera!" Every photographer: "...why should I? Ah, I got it: you want to make money so you can hire me..."
The design concept of first Lumos camera was great as well as idea of "always properly focused picture". PS. I'd personally prefer variable exposure over focus for less blinding by the light or real HDR
A more interesting thing than just replacing your focus puller would be if the camera can do things no regular camera can do, like seamlessly focus on multiple arbitrary planes at the same time. You can kind of do that with split focus lenses, but they are a pain and often leave artifacts in the image, and limit your possible shot compositions severely. Imagine being able to have two subjects who are far apart both in shallow focus at the same time in the one shot.
@@tylisirn Indeed... but that is the whole intent of "light field imaging". The light field image is essentially the same image focused, not on arbitrary multiple planes, but in a computational sense "focused simultaneously on ALL planes". It is the computational equivalent of storing a two dimensional image in a 3d cube with the third dimension of the data stored in the cube being the entire range of focal distances. Then, when you run the software, are are computationally selecting the "2d slice" that corresponds to the image at the focal distance that you want. This basically allows you to select both the plane of focus and the depth of field later from real stored information. You are always working from real stored image information and never "adding artificial computational blur". Unfortunately this requires storing a ridiculous amount of information for a single image. The catch is that, with modern electronics and computational capabilities, it's easier to store one image with a really wide depth of field. Then later you can select the parts that you want to be sharply in focus, add computational blur to the other parts, and end up with a good simulation of the part you want in sharp focus, and the other parts blurred just as if by a limited depth of field. (Basically the light field image can "do it for real" but now we can fake it so well, and so much more easily than doing it for real, that there is little incentive to accept the limitations, and pay the much higher cost, of doing it for real.)
The cinematic camera in the video is the wrong way to leverage this technology. It would actually be better to combine multiple cameras in locations around the studio pointed at the center, into a single light field camera. That way instead of being able to shift the image only across the diameter of the single lens, you can shift its viewpoint anywhere around the studio. In essence, the actors can just act out the scene, and the director can decide on camera position afterwards in post op. If he originally imagined it as an over the shoulder shot, but after seeing it he decides a shot showing both actors looking at each other would be better, he can do that without having to re-shoot the scene. (You could do it with a single camera, but that creates blind spots - areas of the background which were covered up by the foreground from the camera's POV. If you use multiple cameras scattered around the studio, they can cover for each others' blind spots.) Unfortunately, to pull this off with the way Lytro's light field camera works, you need to align all the cameras within a single wavelength of light. That's done in radio astronomy to create an interferometer. But radio waves are typically several meters in length making aligning two radio telescopes in different locations pretty easy. Even the black hole image (which was also captured with an interferometer) was created using images at 1mm wavelength. Visible light has a wavelength thousands of times smaller. Eventually we will get there. But our level of tech just isn't capable of this right now.
I remember the promotional demonstration video talked about how it could make chromakeys unnecessary by selecting the depth and deleting or replacing whatever you wanted on whatever depth you wanted. You could also relight scenes, adjust angles in post.
@@solandri69 I'm pretty sure you don't need an interferometric level of accuracy, a millimeter or so is good enough- the multiple plenoptic cameras can be merged via a Photosynth-like process (but each camera must be good quality). They may have been better off doing an array of Illum cameras rather than the monster Immerge sphere. More details at ua-cam.com/video/4qXE4sA-hLQ/v-deo.html
I was a macro photographer when this tech came out. There was massive potential for shooting deep field images correcting the age old issue of razor thin DoF for macro subjects. I spoke with Lytro development a few times on focus stacking techniques. Noting ever materialized.
I had a similar experience. I was into macro photography in 2012 and got one of the first models. I naively thought I’d be able to use it to do stack focusing. Nope. I returned it and got a full refund.
You know, that was the initial intent with this tech. Not to select one focus point but to have several you can merge. But the software side never caught up to make it usable and resolution was lacking. Manual focus stacking is tedious and cumbersome but in camera focus stacking is a simpler and cheaper way to do this and basically made it obsolete.
Now that's an interesting story, talk about abandoning the consumer. Neato though, really thought about grabbing one off eBay updated to the final firmware revision already, but it would be a toy and a paper weight.
I recently purchased an illum and it is extremely pleasant to use, and it has an extremely useful realtime preview mode and the absolute best screen interface I have ever used on a camera. however it does require that you think about focus in 3d which is a bit tough to wrap your head around It is very good for extreme closeups but it still has the cutout issue you mentioned in the video
Thanks Ken, great explanation. I worked in 3D imaging back in the ‘70’s, and did a lot of successful work with lenticular printing. The Lytro seemed to be clever, but offering the wrong product. The lens array tech could have been used to develop multi-angle 3D imaging, and by now could have been mature tech, in every phone and cinema and even tv. The “re-focusable” imaging always seemed a bit of a niche feature, and was doomed from the start. Like the 3D tech of the Nimslo camera, it worked, but that was not enough to make it a commercial success.
I worked at a camera shop when the second version came out. I showed it as an oddity. My boss sold one, the only one to a good customer at $1599. After the hype, it was sent back due to poor sales.
The Cinema Camera should have gone somewhere considering it's a camera immune operator error, meaning you can capture everything. Given that it needed datacenter equipment, a larger closed location would be good. I think *Sports* was the right use case for it. The cameras could a allow capture the player's position and where the ball really was at that moment, ultimate replay.
@@tomboss9940 The issue with starting there, is that good Directors want creative freedom and convenience. The Lytro is a little bit cumbersome making it hard to get the shots creatives want. TV however is a more fixed camera and high turnover. If they could develop a good TV workflow I could see it being used in a studio along side that LED wall technology they used for starwars TV shows, as the LED wall could be used for correct lighting on actors and props and the Lytro would allow for faster background removal than rotoscoping, allowing for more freedom to fix shots. It's one of those technologies like Disney's sodium vapor light prism technology, that should be more common in high budget productions.
Such an intriguing idea. I'm a photographer, and was really curious to play around with one of these. I've sadly never even seen one in the flesh. I really hope someone takes up this technology and develops it further.
It sort of reminds me of the Light L16: That camera was _way ahead of it's time:_ being a camera with 16 lenses and would take some of the best stunning photos into one. The _problem_ was that it cost way too much (about 2000 bucks) and the photos were on a different format (aka they weren't JPGs/PNGs like most photos), forcing one to import them into the software the camera was bundled with. The camera flopped, the company stopped support for it in 2019 and were dissolved in 2022, with its assets and former staff being taken by _John Deere_ of all things.
Light were also involved in the Nokia 9 Pureview, which used a similar concept - as per the main L16, they just couldn't really quite get it right. Computational photography can be a great thing, but it seems as though it is way too easy to get it wrong (and not realise before loads of $$$ are wasted)
John Deere is doing some really cool stuff with mass cameras integrated in to equipment using "AI" to automate tasks like pulling weeds etc. They had a really cool machine at CES that had 180 cameras that could intelligently pull weeds across a large field at 19mph. Actually a smart but unexpected acquisition of talent.
I wanted one of these so bad when they came out. I finally bought one about two years ago just to play with and it's nifty tech. Surprisingly it still works like new.
This is such an insane Star Trek-esque concept, and it actually worked. This video is challenging my most basic assumptions about what's possible in photography I'm sad it still ended up lost to the thorns of relatively early development
ב''ה, it's kind of in between. As I understand at least this product, you're basically just sacrificing sensor pixels to different focal points, but then possibly with some interpolation math that does better than just the literal N of focal points you've stacked up. Useful for some high-speed applications (industrial machine vision, maybe vehicular vision), sort of useful for action photography, sort of gimmicky unless you're in those once in a lifetime situations others described (wedding photography, sports and stunts etc.) If someone wants to lay out the math on how much more data you recover than just the sum of the different focal length images from different focal points on the microlenses, that'd be educational. But if you want something pretty solid state to resolve QR codes on an automated factory line or such, and have the resolution, it's certainly cool for such niches and perhaps more rugged than other forms of adaptive optics.
This was always a really interesting camera I was just waiting for them to be reasonably priced and to produce reasonable resolution images but that never happened.
I was selling cameras in an electronics retailer when 1MPs came out and we confidently told buyers that that was so for beyond what they'd ever want or need.
My coworker at the time, told me about it we being both photographers we were both very excited about the implications, but I know that I was turned off by the price
Your ability to talk us thru software interfaces real time with a device is your Super Power. I understand it quicker watching you than playing with it myself. Bravo. Sub’ed and note’s on.
I remember seeing these back on Indiegogo back in the mid to late 2010s. However, once the campaign was done, i never got a chance to buy one of these cool products, including the easy to setup camera portable projector that was from a different Indiegogo campaign. Nowadays I don't go to that website anymore as I use to, but it's nice to see what people made on that platform and unfortunately they have disappeared from the platform, only to be remembered in memory.
Thank you for the trip down memory lane! I'd forgotten about this company and had no idea they evolved past their first product! I'm not a photographer, but 'back in the day' the Lytro really captured my attention! I loved the idea of just snapping a pic and having all the information there to modify later. The compactness and handiness appealed to me just like those old flip cameras. But ultimately the price kept me from buying one. Looking back, their focus on the consumer market killed them. The resolution and size would be great for instagram, but the need to carry a 5" long rectangle with you, snap the photo, upload to a computer, manipulate, and then upload to Instagram later would never, ever win when you could just take a pic with your phone and upload it in seconds.
This was a great episode, as someone into photography this was more interesting to me than some of your other videos (not that they were bad!) This technology is very fascinating to me, wonder if anyone has used something like this successfully? Would love to see more innovation in cameras.
Watching you stand by the waterfront railing with the camera not strapped around your hand made my head explode. It was like watching a Hallmark movie, where the characters are taking pictures with what is supposed to be a $1,500.00-$3,500.00 Canon camera and lens, with no strap on the camera. To everyone who has a camera, make sure you have a strap and have it securely on you so you do not lose your investment.
I love this channel. Crazy Ken does an excellent job that hosting his channel. He's doing a great service I exposing all the scams out there. Thank you so much crazy Ken!!!!
I think the most underrated part of it is the interactive blur if you could do that with a better camera or part of a phone or action camera it would be amazing
I bought both cameras, when the price dropped. I think I payed less than $100 for the original, and about $300 for the pro. I soon realized that the image quality was disappointing at best. I loved the design, and UI. I still really enjoyed using them. They had a great idea, but they the image quality. You can't do that on a camera. I hope one day another company develops a light field camera and does it right, focusing on making the image. There is certainly a use for them. I think they would be great for action, and for drones.
Love the videos! Watching this one actually made me want to go out and buy it, not the camera but the Chicago deep dish pizza! It looked delicious and the camera looked interesting but as other failed camera products it just couldn’t compete in the market with better products that, although more expensive, made the cost justifiable. When this came out, I think more people would rather pay over time for a $700 iphone with a pretty decent camera which also can make phone calls and has a nice big touch screen.
It makes me so nostalgic, I was closely following Lytro at the time. Speaking of niche consumer electronics companies I was fan of, that miserably failed, I just remembered Neonode. They made touch screen phones, with screens and UI optimized for fingers. Their products featured gestures like swipes and scrolls… since 2004. I used some, and they were actually decent, and even had haptics.
@@ComputerClan It would be really interesting, as there aren't any modern videos of them. I own a Neonode n2, which was their last model they've ever made. The OS is actually just windows CE under the hood, but the shell is totally custom. To the user, it appears as a feature phone mostly, but it supports windows CE .cab apps. The screen features an IR grid to detect touched points; it might not be impressive now, but when these phones were new, and the market had resistive screens at best, it was amazing. Also, you "slide to unlock". If you ever get a hold of one, I can provide some themes and apps I backed up (and also the firmware, since it entirely resides on the miniSD)
Thanks for this video. You've inspired to dig this white (graphite gray) elephant from my necro-tech shelf. It still takes a charge and I'm all fired up about once again playing with this quirky, flawed device. 😁
Loved seeing this, great job! I loved the tech, but the image quality was just never acceptable for me. As a matter of fact, I have my blue Lytro (Original cam) on my desk right here in front of me. A beautiful and unique device that was ahead of its time!
Thank you for this video! We have a Lytro and I thought it was obsolete. Now i have the desktop software again and I am excited to pull out my Lytro and play with it again!
This is the closest the YT algorithm has ever gotten to reading my mind. I’ve been casually about this camera at least once a week for months and I’m not even a photographer. I just vaguely remembered it and occasionally wonder how it turned out. Thanks!
I remember back in 2004 during my Uni days, my photographer class needed us to buy a dlsr camera but it was so expensive back then. I remember high end cameras back then had 5 megapixels and that was pretty impressive. It's funny the Lytro 6 MP camera is considered low by 2012 standards. I would've loved to use that camera back in 2004-2005.
Have both the original Lytro and the Illum. I've been of the mind that had they couldn't have been a bit more open about the limitations of the camera. Instead of making the format and software closed, and kind of frustrating to use, they could have been upfront about "effective" megapixels and allow developers to find interesting creative uses of the tech. They used a lot of hocus-pocus bullshit (mega lens arrays whatever) when Lytro cameras were actually very well-engineered devices with state-of-the-art high megapixel sensors, but most of that interesting data was not made available to customers who might find something WAY more interesting to do with it than refocus, which any camera with optical focus can do anyway w/o throwing pixel data away.
you can use the software to rotate the focus plane in 3 dimensions. Results in some interesting 'tilt shift' type images etc. but more a novelty. More useful is adjusting the fstop and focus plane with high precision. I do wish they had opened up the format more etc.
@@carylittleford8980 it's nice that reverse engineering has been done. I think one potential application for this data is in AI, for instance training a neural net to use the light polarity data from a photograph with tools like ControlNet/Img2img to more accurately infer depth and lightning from photos, or who knows possibly completely relight scenes.
@@rbus You don't need to reverse-engineer anything. The Lytro's desktop app straight-up allows the export of depth map to get 3D images. And at medium range, those are quite accurate.
@@gabrielhmi The problem with the depth information is that it just isn't very good. It's splotchy. I can get way more accurate depth data by using AI (Automatic1111's depth addon is crazy good on almost any 2D image). What I'd like is the pixel polarity data straight off the camera, using it's internal table. This information could be used to train a neural net to relight a scene or perform much more accurate depth generation.
@@rbus What you are talking about isn't "depth information". The depth map the Lytro gives you is an actual depth map in m, with millimeter accuracy. It's a true measurement. What you get using AI isn't even remotely comparable : it looks good, but it's absolutely not accurate, nor is it a measurement : it juste decompose your image into a handful of depth planes. The main application for the lytro, and what their IP is currently used for is industrial parts inspection and 3D modelling.
I have one of those deep lodged in memories of reading a Popular Science/Mechanics article that reviewed a cellphone and they confidently claimed no one could ever justify more than 2MP cameras.
Imagine if this had been adopted to video! It could have had AMAZING VR applications with today’s eye-tracking. Imagine if you could record 3D light-field video. You could refocus your eyes one what’s close or what’s far naturally.
Yeah. Apparently the VR camera they made had playback allowing 1 meter of eye movement in 6 degress of freedom in VR. So ducking to look under a table, or see around a corner in a mirror were all easily to do in real time. Rather cool.
They actually used similar approach for 3d movies, like Avatar. But cost of a camera is five-digit value and they required several at one time. Also, somewhat similar tech is used by satellites. So.. not eally unique tech, a daring way to apply existing stuff.
I could see something like this being used for macro photography. Say, an archeologist documenting a newly discovered artifact before it's shipped out. I can also see it working for cinema; some directors are notorious for doing take after take, and perhaps some of that could be handled by futzing with the images in post.
I use the Illum for lots of 3D macro work. Being able to move the images focus or just close up the fstop till the while scene is in focus. Every shot has a depth map as well. So could be useful for archeologist's , record keepers etc.
A classic solution in search of a problem… To me, Lytro was a solution to a need that nobody had, with tradeoffs that nobody would accept. At the time I’d been testing and reviewing consumer digital cameras for over 10 years and my immediate reaction upon seeing it was “WTF are they thinking”. As it seems you experienced, the much-vaunted refocusing ability was very limited, and only really noticeable when you had objects in the very near foreground. As you noted, there were a lot of very smart people involved, so it always baffled me that no one called it out as uselessly impractical from the beginning. I think ultimately, all the smart people were so wrapped up in their technology that they never stopped to ask what the end-users might want.
For anyone completely obsessed with having the ability to "post-focus"... You might check out panasonic's lumix cameras. Even the ~$350 used G85 has a mode to do this... although this is accomplished with 1 very good sensor taking exposures rapidly as the focus plain sweeps from shallow to deep over fractions of a second (rather than an array of low quality sensors all taking exposures at once). Actually, impressed with the idea of mousing over an image on the web and seeing focus point change! Never occured to me! Very cool!
Step 1: Build a camera with 20 tiny laptop cameras inside. Step 2:put a slightly different lens on each of them. Step c:Have them all shoot at once but take turns sending data to the memory card. Should be less than a second for all of them. Step 4: Afterward choose the photo which is the least blurry. Ta-Dah! After-the-fact focusing!!!
I bought Lytro illum only back as a year ago, pretty much as bargain. I played with it, it seem complicated at first, sadly no manual. I did take few photos and I was amazed at the focusing after it has taken photographs. I understand and thought it been sold to Apple (That what I been told). I hope I get the other cameras soon.
The concept was cool (especially in the early 2000's) but with modern day AI-powered face/eye and subject detection, the times I've had out of focus shots I could basically count on one hand and I've shot hundreds of videos. Just check that the green box is on your eye and start shooting! I love how crazy the data rate was on their cinema product. Just go buy a suitcase full of hard drives
With a lighfield camera you can also change the fstop from 1 to 16 as well as tilt the focus plane in 3 dimensions. All done with optical maths for accurate bokeh, I find the AI solutions in all modern smartphones never look as they should.
@@carylittleford8980 yes but I can shoot several hundred photos in a gathering with friends while never learning a single thing about photography and ever having edit even a single photo using the smartphone + AI adjusting. This product is way too in between. I don’t know why they went direct to consumer and with such a “mid” product. Either aim mass market, and go wider and focus on just one thing, say a sliding dynamically adjustable 3D focus, or go more pro/semi-pro. Even better, just sell or lease to companies like Canon or Apple and they can worry about the consumer.
@@astronemir for me it's the quality of the 3D output for macro photography I use the Illum for. Nothing I can access can beat it's 3d effect and that's my target niche. The depthmap is makes has full parallax, very unique even to this day. It's basically a quick point and shoot compared to other 3d setups : ) I can export content as stills or really cool slow pan video for the latest VR setups, new gen 3d tvs just coming available and other emerging markets (Lume Pas 2) etc. It also looks strange as a camera and that alone, just being different, gets me extra attention and thus work. I still use my phone for happy snaps, but when the scene has a really great 3d composition I wish I had the Illum handy. The first gen camera, the little boxy one is pure novelty and the Illum does a lot more after the shot. Google has the Lytro software patents, so I guess they can use that to train an AI to do focus adjustment in a more optical maths way compared to the current gen. :)
Oh my god , i had seen some ads for this product everyone was thinking dslr will be outdated in no time for retail consumers , great you bought a video about it 😮😮
I would actually go in the opposite direction of your conclusion. The problem of lightfield imagery isn't that the technology wasn't capable, or that it was behind the times. The problem of lightfield photography is that current technology isn't advanced enough to make the technology practical. When you capture an image with a smartphone, you capture a single image that gets stored and compressed. When you do the same capture with a lightfield camera, you end up with an image information volume that is MUCH MUCH larger, because you add onto the image all of the lightfield data. For example, a lightfield image captured with a 12-megapixel camera at a 90-degree field of view and 100 light rays per pixel would be approximately 120 megabytes in size. A regular photographic image captured with the same camera at the same resolution would be only about 3 megabytes in size. To deal with such an extreme data volume, Lytro had only one choice: reduce the size of the captured image. A 1080 x 1080 (1.6 MP) image might have been way behind what other cameras were doing at the time, but with the smaller image, the data volume became much more manageable. I have a feeling lightfield photography will make a comeback as storage space continues to become less and less expensive. In time, I think Lytro will be seen as another company that had ideas that were simply way ahead of their time.
Yes, although I don't think that it will completely disappear, but I wouldn't expect to see it any more than what you see in Canon's 5D MKIV where there's a left and a right subpixel which can be used for slight adjustments to focus or to give the camera more information about noise on the sensor. (And by slight, I really do mean slight, the two subpixels are very close together and you don't get much room to adjust it)
The answer is much easier: The camera was a solution in search of a problem. The selling point was “you can finally refocus your image”, but no consumer needed to refocus their images, as their small phone sensor + lens system of that time could be considered hyperfocal focusing. When Lytro finally understood the real value, i.e. their cameras were actually capturing a 3D scene, it was too little, too late. And in any case, most people prefer resolution to 3D (with basically non-existent visualization option).
I was trying to write a devil's advocate comment about how tech startups aren't overvalued because free market, valuation multiples, growth rates and whatnot but I actually do believe they are overvalued. I'm not sure if these venture investments actually pan out on average. I suspect they don't and there's a survivorship bias kinda thing going on. The venture funds that invested in hugely successful companies got a huge payout and continued to exist using this terrible philosophy which resulted in future tech companies receiving similarly high valuations.
I appreciate your reviews because you are primarily a Mac guy and you actually pay attention to usability. I recently bought an Oxo pizza cutter with very high ratings on Amazon. While washing it, I dropped it a few inches in the sink and the welds in the handle came apart, exposing sharp metal. Did anyone actually try to wash the item they reviewed? The use case for kitchenware includes washing it hundreds of times, and dropping it in the sink will happen. Reviews aren't using something once and concluding that it seems to work. You need to use something in a realistic manner, and not treat it like a delicate flower.
The only time i heard about the Lytro in 2013, like a year after it came out. There was a little hype around it in the videographers/photographers community but nobody really used it, everybody was just reviewing it and never use it again. And now I'm watching this after 10 years.
I remember there was a social network for Lytro photographers and it was very cool. They used gifs to make the focal change part of the final art. But, at the end of the they, it was a photographer's toy, not a professional camera.
I have a red hot. I picked it up a few years after launch at a closeout store for about $50. It was a fun device to kick around with, and it had a lot of potential, but you nailed it when you pointed out the effect that smart phones had on the company. Also, I think a square format photo was just kind of off-putting for a lot of people. All that said, I wonder if this is a product that time will be kinder to. I feel like in a few more years, Lytro might become the new Lomo and experience a surge among hobbyists and people looking for retro novelty.
Listening to you talk about and explain all these different picture taking mechanics puts me in mind of Captain Disillusion and his explanations of how video/photo editing works. You both make these concepts easy to understand and are enjoyable to listen to.
The low resolution was a byproduct of needing the microlens array (they might have been doing it using a patterned sensor filter, but i digress). To do the refocusing trick it probably had a raw image sensor of 10+ mega pixels, which was closer to state of the art. Unfortunately when that sensor data is processed to focus it the resolution is much smaller. So it necessarily was many years out of date in resolution, and would always be unless they incorporated very expensive sensors, thus pricing themselves out of the market. I congratulate them for making it as far as they did with it…
I’ve used (and still have) a Lytro Illum camera purchased from Woot years ago at a discount. It’s genuinely cool tech except for the resulting low picture resolution. It was just a few years too late and as they say timing is everything. It’s still fun to dust it off and use every now and then.
This came out during the peak of my photography interest and I still think of it on occasion. I found the idea of the Lytro very novel and quite enticing. I passed on the first model due to it's form factor, resolution and price. When they released their second camera - I was very tempted but a low-resolution "bridge camera" for big-boy DLSR money was not appealing. I'm sure I was also turned off by the proprietary software but who could remember 🤷 It was cool to learn that they went on to make more stuff before fizzling out. As impractical as it was - That cinema camera looks pretty dope!
Thanks for this! I always wondered what happened after the Illum. This whole thing was such an interesting idea, held back by people just wanting the most pixels possible.
I was an early adopter of Lytro and love it! it's such a shame that it never really went anywhere. I used it for my stepdaughter's wedding and it was great, as someone with better things to be doing than playing photographer, to be able to just copiously snap pictures with less attention to composition or focusing carefully and be able to worry about that later. Would it ever replace a DSLR for serious photography? no, I don't think it ever would. But did it drastically improve the photo quality of someone who wanted to document an event without spending all their time worrying about camera settings? absolutely.
Personally, I think if you are just snapping away and not paying attention to composition / focus, then pretty much *any* camera from the last 10 years in an auto mode - especially Cell Phone cameras - will do the trick as well as a lytro, and with better resolution. I mean, it's really only "saving" you from watching if AF hits, but that's not actually a huge problem with most modern cameras. And on the other hand - if you do care about composition, there's not a lot of "worrying about camera settings" if you're at all familiar with your camera. I can dial in a shutter speed to freeze motion, and then let the camera take care of the rest of the settings, adding flash if I want to turn it on and set to TTL. Not a lot of actual settings to mess with.
it's still very cool that it had a *real* light field sensor and not the fake ones that modern phones have, the ability to ACTUALLY fix the focus in post is something that it's just not possible to do on a standard camera when said camera decides it wants to focus on the random bird in the corner rather than the person taking up most of the frame
Hilariously the closest thing to hearing about anything to do with light field cameras I've seen since these things failed is a subtle reference in The Expanse series. In season 3 of the show the crew is being filmed by a drone camera and the user interface says "Light field enabled". haha
Well done! Enjoyed watching this. The first two Lytro products (consumer, semi-pro), were high-priced within their categories and they produced lower-resolution images. The focus thing is cool, but it's difficult to compete if you only offer one unique feature, while making the core camera features (image quality, usability, lens selection, etc.) less capable than virtually every other competitor. I don't know the Lytro leadership team, but it seems like they got enamored with the focus feature and thought it would be sufficient to get camera people to switch. Especially in the semi-pro arena, those folks aren't going to abandon an investment in their current camera brand (Canon, Nikon, Fujifilm, etc.) unless you offer them something dramatically more compelling in every way. Plus, as you mentioned, cameras in phones happened, killing the consumer camera market.
Hah! Love the old-school iPhone in your background. Used to work tech support for Big Planet back then. Found your video on it and left a longer comment there.
Did you say "electric blue?" My inner Icehouse is flaring!!! I just freeze Every time you see through me And it's all over you Electric blue On my knees Help me, baby Tell me, what can I do? Electric blue
I picked up one of these from woot for $50 in 2015. Forgot about it until I saw it with the rest of my camera gear a couple weeks ago. Everything you said about it was spot on. It was a neat idea, but I never used it much. Glad I didn't pay the original MSRP for it.
I think you are right about the timing. Around 2012 I was pondering whether to buy a new phone with a 'good camera' or keep my crappy phone and just get an actual good camera. Phones were advancing quickly at the time, along with their camera capabilities. At $500 I think the Lytro would have been way out of my range, especially given what the same amount could get me in a second hand DSLR at the time. Had that price been the same 6 years earlier, when the nice DSLRs were still over 2k, the Lytro would have seemed a much better value.
19:00 I disagree. That camera wasn’t what we needed in 2006 either. They failed because they didn’t really understand photographers or our needs. Nobody needs anything they made.
I absolutely loved the concept of the Lytro when it was announced, but I couldn't afford it. It's best to get the focus right from the start, but the very idea of being able to refocus afterwards is great, especially when considering that what is perfectly sharp on your 4-12" preview display, may be badly unfocused when viewed on your 43" monitor and a waste of money to get printed. Being able to refocus and re-expose photo's after the fact by recording all available light data feels like an absolute gamechangers for non-professionals.
And here i am, an owner of a hardly used Graphite Gen1. Simply the fact it only really worked for cirtain ways of shooting pictures, paired with the low resolution (I personly even thought the output of 1080x1080 was too high for what i personally was getting out of the camera). The camera also turned out to be something you can hardly resell, people couldn't grasp it.
Great analysis -TYVM. So sad that his amazing vision couldn't make it back then. Ren Ng apparently a visionary way ahead of the tech needed to make camera work.
I still occasionally use my Lytro Illum (which I got in late 2015 discounted to $300) and it's a fun toy, but not a general-purpose camera. I think I'll take it on my next kayak outing, it should do well for the on-the-fly photo ops of wildlife that have been frustrating with other cameras.
Very cool and unique! The design reminds me of a square version of the Apple iSight FireWire camera. It looks quite nice, but it seems like it would have worked even better if it had a manual focus ring/wheel instead of the touch sensitive surface. The display also looks like the iPod nano 6th gen, but at a lower resolution. If they took the display on the iPod and used them instead, it would look better while maintaining the same form factor. It’s still an interesting device, but the company might still be in business if they focused more on upgrades to this style of Lytro and not focusing on higher end pro markets. Great video!
I just bought one of these the other day from a thrift store for 6 dollars in Pink. For that price, its a super fun novelty camera with a funny/weird sidenote of photography history attached to it.
I never forgot about these, I always wanted to see one and wondered why it just seem to go nowhere? I guess having such an out of date aspect-ratio also killed it.
So basically it just shot long depth of field images and applied a filter to create a fake blur? Because they made it sound like it was an optical process.
Enjoy the new episode! Also! Subscribe and stay tuned for May 25 where I tell the story about the Microsoft Zune! I already ordered 3 and can't wait to show them to you. 🔔
P.S. Subtitles are now available!
i really like the zune actuallly
I know about Zune from its Windows XP theme.
Zune SDF checking in.
i hear if you lock 3 Zunes in a small room together, the resulting levels of digital depression cause at least 2 of the Zunes to go on a murder\suicide spree.
I still have a functioning Zune HD and break it out when I'm feeling nostalgic. I used to still add songs on it up until about 2019. Not newer songs, per se, just new to me 😅 I got it because my job at the time allowed us to listen to music when working, so long as it didn't have a camera attached and that was possible on the Zune. While everyone else was trying to hide their phones, I could proudly (yes, proudly! 😂) display my new Zune and not have to worry about anything.
One of the best jobs I've ever held was working at Lytro, developing technical explainer content when we had transitioned to volumetric video capture for VR distribution. Our final camera just prior to Google's acquiring us was an impressive spherical rig with ~90 individual cameras. Looked like something from the Marix. Good work on your YT video. Reach out if you want hear more.
I had the original, nearly bought an Illum, but the company had refused to release an SDK or software for Linux, which put them behind the back burner for me. By the time an SDK was released, it was far too late for the company.
@@spvillano I wrote that python manual :) based on engineering input, and yeah, the project was approved well after plans had been laid to pull the plug on consumer light field projects.
I remember digging footage on Vimeo for the volumetric video camera. It was crazy. I remember the quantity of data to process and store was long blowing
@@MrXuanB I created most the animations we used to show light field / volumetric video concepts. Need to get those up on YT.
What's your take on company's mistakes in marketing?
I think that they just missed their main customer.
Low resolution + autofocus = perfect for non-staged videos!
In cinema you know where to focus - it is written in director's notes or wherever. You always know where the point of interest is, so you have no need to refocus afterwards.
But in wedding recordings, for instance, there is a lot of improvisation. And you would spend some time in post-proccesing anyway. But if operator failed to focus at the right time, than no amount of processing will help. Also, not needing to adjust aperture while filming is a good bonus too.
I'm so sad about Lytro's tech being basically abandoned... it's awesome
Agreed. I was quite intrigued by this camera when it came out.
I loved my Lytros too, but awesome is an awfully strong word for them. They were well made, had some nifty features. They would have been great cameras, if the image quality wasn't hot garbage.
The iPhone can basically do the same thing, I think the iPhone 12 was the first one to adjust blur on a photo after it was taken.
It’ll be out of patent eventually, so maybe then.
Sounds great on paper. Pity about the image quality
The camera I'm most disappointed because it failed in the market was the Light L16. There's so much to love about that camera. The only reason I didn't buy one is because it didn't take video clips. So I was holding out for one that did.
same here apparently theres a new update one of them was recording but its in beta i saw sum people testing it out but i cant decide if i want the illum or the L16
I'm glad you mentioned L16 because I want to do an episode on that soon! Thanks for watching!
@@SvetlinKolev-j2jdon't buy them there overpriced novelty toys
Before lytro, there was no prosumer level light field technology. Now, it's a whole industry with various technological offshoots. Sadly, that happens sometimes, but that's quite the achievement, and frankly, the entire point of the investment in the first place.
yup, the author of this vid was ignorant enough to NOT research into the actual reasons of Lytro closure and just substituted it with his speculations, he called the proof-of-concept demo model a "product" and criticized it as if it was sold in bulks and as if it was sold for profit, he got no clue of why largest sensors available back then could not produce sufficient light-field resolution, he never played with Illum, he neither ever watched Immerge videos in VR. He also has no clue those "400GB per 1 second" were processed by the Cinema in real-time out of the box...
@@VlaD-tv8to - so does the technology continue to survive and develop, even without this company?
@@manofsan It does, Corridor Crew did a whole video on modern light field technology in production, NeRFs are the evolution of the concept. It works better than 3D scanning for many uses, because it also captures reflections correctly, which 3D scanning can’t.
I'm happy to have played a role in the making of this episode! Lytro was a pretty interesting story to research, and I'm glad I was given the chance to check it out. :)
Also, as always, excellent work Ken!
I just took a computer vision course as part of my graduate degree. The amount of image transformation to stitch all of the sub images together, that quickly, is impressive.
I love the concept of the tech, and it’s definitely a way that photography should be moving ahead. But the costs and the timing killed it.
Brilliant idea, just no good way to turn it into a marketable product.
Toning it down a la multi-cam solutions is the only thing that saved this idea from being poison.
It's really a niche use, though. AF has gotten significantly smarter in that time, machine learning will likely improve that even more overtime. In order to get that particular effect, you need a much more complicated sensor array than what you need for a regular camera.
The main market for it would be people who wound up using cellphones and the other areas are photographers t that can afford cameras with better AF and high enough frame rates that being off a couple millimeters isn't an issue.
Even Foveon that had a much more practical basis for its technology went bankrupt years ago. In their case, stacking photosites would have been a real benefit to the field. Unfortunately, Foveon was never able to figure out how to get the noise levels low enough to get enough of an advantage over conventional sensors to survive.
Great video. Point. My Nokia E7 , released 2011, had a 8mb pure view camera.
@@SmallSpoonBrigade AF has to do so much more work and ai is so much more processing on top of that. A light field camera would beat that.
As a keen hobbyist photographer I remember it was my non-photographer friends who got most excited about Lytro. Post-Focus is a nifty gimmick, but for any semi-serious photographer, deciding on how to place focus is part of basic composition and developing the skill-set to do that is part of the fun. If Post-Focus could have been built into existing camera systems as an additional feature, great, but buying into a new camera system where all the features are kind of handicapped to permit Post-Focus felt too much like a solution in search of a problem.
Its a solution to a problem that doesnt exist.
Totally agree with you, but I think street photographers, lomo and pinhole/toy camera enthusiasts would have loved it if it had some better specs, price, and ergonomics. I can totally see the benefit of having a compact camera that you can use to snap a pic discreetly and quickly without fiddling with settings and worry about the details later. I do believe that they could have succeeded if they found their niche. I feel like they were always targeting the wrong demographics and in the end, they never made a product that made any one group satisfied.
Yup,
Ren Ng: "Autofocus sucks."
Every photographer: "Use manual focus."
Ren Ng: "I suck at manual focus."
Every photographer: "Not my problem."
Ren Ng: "Buy my camera!"
Every photographer: "...why should I? Ah, I got it: you want to make money so you can hire me..."
@@klausstock8020 Nobody use manual focus.
That thing has huge potentiel, especially in a field were focus stacking is almost mandatory.
The design concept of first Lumos camera was great as well as idea of "always properly focused picture".
PS. I'd personally prefer variable exposure over focus for less blinding by the light or real HDR
The cinematic camera seems a great product, but I guess a competent focus puller is just more convenient. And much cheaper.
A more interesting thing than just replacing your focus puller would be if the camera can do things no regular camera can do, like seamlessly focus on multiple arbitrary planes at the same time. You can kind of do that with split focus lenses, but they are a pain and often leave artifacts in the image, and limit your possible shot compositions severely. Imagine being able to have two subjects who are far apart both in shallow focus at the same time in the one shot.
@@tylisirn Indeed... but that is the whole intent of "light field imaging". The light field image is essentially the same image focused, not on arbitrary multiple planes, but in a computational sense "focused simultaneously on ALL planes". It is the computational equivalent of storing a two dimensional image in a 3d cube with the third dimension of the data stored in the cube being the entire range of focal distances. Then, when you run the software, are are computationally selecting the "2d slice" that corresponds to the image at the focal distance that you want. This basically allows you to select both the plane of focus and the depth of field later from real stored information. You are always working from real stored image information and never "adding artificial computational blur". Unfortunately this requires storing a ridiculous amount of information for a single image. The catch is that, with modern electronics and computational capabilities, it's easier to store one image with a really wide depth of field. Then later you can select the parts that you want to be sharply in focus, add computational blur to the other parts, and end up with a good simulation of the part you want in sharp focus, and the other parts blurred just as if by a limited depth of field. (Basically the light field image can "do it for real" but now we can fake it so well, and so much more easily than doing it for real, that there is little incentive to accept the limitations, and pay the much higher cost, of doing it for real.)
The cinematic camera in the video is the wrong way to leverage this technology. It would actually be better to combine multiple cameras in locations around the studio pointed at the center, into a single light field camera. That way instead of being able to shift the image only across the diameter of the single lens, you can shift its viewpoint anywhere around the studio. In essence, the actors can just act out the scene, and the director can decide on camera position afterwards in post op. If he originally imagined it as an over the shoulder shot, but after seeing it he decides a shot showing both actors looking at each other would be better, he can do that without having to re-shoot the scene.
(You could do it with a single camera, but that creates blind spots - areas of the background which were covered up by the foreground from the camera's POV. If you use multiple cameras scattered around the studio, they can cover for each others' blind spots.)
Unfortunately, to pull this off with the way Lytro's light field camera works, you need to align all the cameras within a single wavelength of light. That's done in radio astronomy to create an interferometer. But radio waves are typically several meters in length making aligning two radio telescopes in different locations pretty easy. Even the black hole image (which was also captured with an interferometer) was created using images at 1mm wavelength. Visible light has a wavelength thousands of times smaller. Eventually we will get there. But our level of tech just isn't capable of this right now.
I remember the promotional demonstration video talked about how it could make chromakeys unnecessary by selecting the depth and deleting or replacing whatever you wanted on whatever depth you wanted. You could also relight scenes, adjust angles in post.
@@solandri69 I'm pretty sure you don't need an interferometric level of accuracy, a millimeter or so is good enough- the multiple plenoptic cameras can be merged via a Photosynth-like process (but each camera must be good quality). They may have been better off doing an array of Illum cameras rather than the monster Immerge sphere.
More details at ua-cam.com/video/4qXE4sA-hLQ/v-deo.html
I was a macro photographer when this tech came out. There was massive potential for shooting deep field images correcting the age old issue of razor thin DoF for macro subjects. I spoke with Lytro development a few times on focus stacking techniques. Noting ever materialized.
I had a similar experience. I was into macro photography in 2012 and got one of the first models. I naively thought I’d be able to use it to do stack focusing. Nope. I returned it and got a full refund.
You know, that was the initial intent with this tech. Not to select one focus point but to have several you can merge. But the software side never caught up to make it usable and resolution was lacking. Manual focus stacking is tedious and cumbersome but in camera focus stacking is a simpler and cheaper way to do this and basically made it obsolete.
I just saw my Lytro in storage the other day and wondered "WTH do I DO with this?" Can't wait to watch this!
How was it?
Did the video give you any ideas?
I got two of these. Found them for $50 each. Been looking for a quirky older digital camera and this fits the bill. Now you need to do the Light L16
Now that's an interesting story, talk about abandoning the consumer. Neato though, really thought about grabbing one off eBay updated to the final firmware revision already, but it would be a toy and a paper weight.
@@AJNpa80 it works ok despite the missing firmware. I can push android apps to it like Instagram and Twitter. But yeah it's just a quirky camera
I recently purchased an illum and it is extremely pleasant to use, and it has an extremely useful realtime preview mode and the absolute best screen interface I have ever used on a camera.
however it does require that you think about focus in 3d which is a bit tough to wrap your head around
It is very good for extreme closeups but it still has the cutout issue you mentioned in the video
Cool, I've been looking but they are rare as hen's teeth here in Australia.
Thanks Ken, great explanation. I worked in 3D imaging back in the ‘70’s, and did a lot of successful work with lenticular printing. The Lytro seemed to be clever, but offering the wrong product. The lens array tech could have been used to develop multi-angle 3D imaging, and by now could have been mature tech, in every phone and cinema and even tv. The “re-focusable” imaging always seemed a bit of a niche feature, and was doomed from the start. Like the 3D tech of the Nimslo camera, it worked, but that was not enough to make it a commercial success.
These episodes always are the highlight of an otherwise boring school day! Can’t wait to watch this one!
I worked at a camera shop when the second version came out. I showed it as an oddity. My boss sold one, the only one to a good customer at $1599. After the hype, it was sent back due to poor sales.
The Cinema Camera should have gone somewhere considering it's a camera immune operator error, meaning you can capture everything. Given that it needed datacenter equipment, a larger closed location would be good. I think *Sports* was the right use case for it. The cameras could a allow capture the player's position and where the ball really was at that moment, ultimate replay.
It was marketed and perfect for cgi heavy movies, where its 3d model would integrate perfectly with cgi 3d models.
@@tomboss9940 The issue with starting there, is that good Directors want creative freedom and convenience. The Lytro is a little bit cumbersome making it hard to get the shots creatives want. TV however is a more fixed camera and high turnover. If they could develop a good TV workflow I could see it being used in a studio along side that LED wall technology they used for starwars TV shows, as the LED wall could be used for correct lighting on actors and props and the Lytro would allow for faster background removal than rotoscoping, allowing for more freedom to fix shots.
It's one of those technologies like Disney's sodium vapor light prism technology, that should be more common in high budget productions.
Such an intriguing idea. I'm a photographer, and was really curious to play around with one of these. I've sadly never even seen one in the flesh. I really hope someone takes up this technology and develops it further.
It sort of reminds me of the Light L16: That camera was _way ahead of it's time:_ being a camera with 16 lenses and would take some of the best stunning photos into one. The _problem_ was that it cost way too much (about 2000 bucks) and the photos were on a different format (aka they weren't JPGs/PNGs like most photos), forcing one to import them into the software the camera was bundled with. The camera flopped, the company stopped support for it in 2019 and were dissolved in 2022, with its assets and former staff being taken by _John Deere_ of all things.
Light were also involved in the Nokia 9 Pureview, which used a similar concept - as per the main L16, they just couldn't really quite get it right. Computational photography can be a great thing, but it seems as though it is way too easy to get it wrong (and not realise before loads of $$$ are wasted)
John Deere is doing some really cool stuff with mass cameras integrated in to equipment using "AI" to automate tasks like pulling weeds etc. They had a really cool machine at CES that had 180 cameras that could intelligently pull weeds across a large field at 19mph. Actually a smart but unexpected acquisition of talent.
I wanted one of these so bad when they came out. I finally bought one about two years ago just to play with and it's nifty tech. Surprisingly it still works like new.
Having worked on the industrial design for Lytro, this is bringing back my old memories. Thanks for the video
don't put that awful design on your resume
@@justayoutuber1906don’t be a toxic four year old, looks bad on your resume.
This is such an insane Star Trek-esque concept, and it actually worked.
This video is challenging my most basic assumptions about what's possible in photography
I'm sad it still ended up lost to the thorns of relatively early development
ב''ה, it's kind of in between. As I understand at least this product, you're basically just sacrificing sensor pixels to different focal points, but then possibly with some interpolation math that does better than just the literal N of focal points you've stacked up. Useful for some high-speed applications (industrial machine vision, maybe vehicular vision), sort of useful for action photography, sort of gimmicky unless you're in those once in a lifetime situations others described (wedding photography, sports and stunts etc.)
If someone wants to lay out the math on how much more data you recover than just the sum of the different focal length images from different focal points on the microlenses, that'd be educational. But if you want something pretty solid state to resolve QR codes on an automated factory line or such, and have the resolution, it's certainly cool for such niches and perhaps more rugged than other forms of adaptive optics.
This was always a really interesting camera
I was just waiting for them to be reasonably priced and to produce reasonable resolution images but that never happened.
I was selling cameras in an electronics retailer when 1MPs came out and we confidently told buyers that that was so for beyond what they'd ever want or need.
My coworker at the time, told me about it we being both photographers we were both very excited about the implications, but I know that I was turned off by the price
I love macro photography
Fabulous video. Excellent production and editing. Thank you.
Your ability to talk us thru software interfaces real time with a device is your Super Power. I understand it quicker watching you than playing with it myself. Bravo. Sub’ed and note’s on.
Really enjoyed that. I wondered what happened to them. Your take is good: 2006 camera in 2012.
I remember seeing these back on Indiegogo back in the mid to late 2010s. However, once the campaign was done, i never got a chance to buy one of these cool products, including the easy to setup camera portable projector that was from a different Indiegogo campaign. Nowadays I don't go to that website anymore as I use to, but it's nice to see what people made on that platform and unfortunately they have disappeared from the platform, only to be remembered in memory.
Thank you for the trip down memory lane! I'd forgotten about this company and had no idea they evolved past their first product!
I'm not a photographer, but 'back in the day' the Lytro really captured my attention! I loved the idea of just snapping a pic and having all the information there to modify later. The compactness and handiness appealed to me just like those old flip cameras.
But ultimately the price kept me from buying one.
Looking back, their focus on the consumer market killed them. The resolution and size would be great for instagram, but the need to carry a 5" long rectangle with you, snap the photo, upload to a computer, manipulate, and then upload to Instagram later would never, ever win when you could just take a pic with your phone and upload it in seconds.
This was a great episode, as someone into photography this was more interesting to me than some of your other videos (not that they were bad!) This technology is very fascinating to me, wonder if anyone has used something like this successfully? Would love to see more innovation in cameras.
Watching you stand by the waterfront railing with the camera not strapped around your hand made my head explode.
It was like watching a Hallmark movie, where the characters are taking pictures with what is supposed to be a $1,500.00-$3,500.00 Canon camera and lens, with no strap on the camera.
To everyone who has a camera, make sure you have a strap and have it securely on you so you do not lose your investment.
I love this channel. Crazy Ken does an excellent job that hosting his channel. He's doing a great service I exposing all the scams out there. Thank you so much crazy Ken!!!!
I think the most underrated part of it is the interactive blur if you could do that with a better camera or part of a phone or action camera it would be amazing
Or not even that just haveing that as a patent that can be sold to others being after blue would be awsom
This is probably the next advancement of the Apple portrait mode after taking a photo
I bought both cameras, when the price dropped. I think I payed less than $100 for the original, and about $300 for the pro. I soon realized that the image quality was disappointing at best. I loved the design, and UI. I still really enjoyed using them. They had a great idea, but they the image quality. You can't do that on a camera. I hope one day another company develops a light field camera and does it right, focusing on making the image. There is certainly a use for them. I think they would be great for action, and for drones.
"payed"?
Love the videos! Watching this one actually made me want to go out and buy it, not the camera but the Chicago deep dish pizza! It looked delicious and the camera looked interesting but as other failed camera products it just couldn’t compete in the market with better products that, although more expensive, made the cost justifiable. When this came out, I think more people would rather pay over time for a $700 iphone with a pretty decent camera which also can make phone calls and has a nice big touch screen.
It makes me so nostalgic, I was closely following Lytro at the time.
Speaking of niche consumer electronics companies I was fan of, that miserably failed, I just remembered Neonode. They made touch screen phones, with screens and UI optimized for fingers. Their products featured gestures like swipes and scrolls… since 2004. I used some, and they were actually decent, and even had haptics.
Ooh, that sounds interesting. Perhaps I should research them…
@@ComputerClan It would be really interesting, as there aren't any modern videos of them. I own a Neonode n2, which was their last model they've ever made. The OS is actually just windows CE under the hood, but the shell is totally custom. To the user, it appears as a feature phone mostly, but it supports windows CE .cab apps. The screen features an IR grid to detect touched points; it might not be impressive now, but when these phones were new, and the market had resistive screens at best, it was amazing. Also, you "slide to unlock".
If you ever get a hold of one, I can provide some themes and apps I backed up (and also the firmware, since it entirely resides on the miniSD)
16:51 This is straight up _Deja Vu_ in real life. “No data storage system is big enough to record it.”
It's obvious how focused you are in making these videos. You deserve more exposure from the UA-cam algorithm.
I shutter at how oversaturated these puns are. But some of them are a flash of genius.
@@jimjam4082 nice 😂
Thanks for this video. You've inspired to dig this white (graphite gray) elephant from my necro-tech shelf. It still takes a charge and I'm all fired up about once again playing with this quirky, flawed device. 😁
Loved seeing this, great job! I loved the tech, but the image quality was just never acceptable for me. As a matter of fact, I have my blue Lytro (Original cam) on my desk right here in front of me. A beautiful and unique device that was ahead of its time!
Thank you for this video! We have a Lytro and I thought it was obsolete. Now i have the desktop software again and I am excited to pull out my Lytro and play with it again!
This is the closest the YT algorithm has ever gotten to reading my mind. I’ve been casually about this camera at least once a week for months and I’m not even a photographer. I just vaguely remembered it and occasionally wonder how it turned out. Thanks!
I remember back in 2004 during my Uni days, my photographer class needed us to buy a dlsr camera but it was so expensive back then. I remember high end cameras back then had 5 megapixels and that was pretty impressive.
It's funny the Lytro 6 MP camera is considered low by 2012 standards. I would've loved to use that camera back in 2004-2005.
Have both the original Lytro and the Illum. I've been of the mind that had they couldn't have been a bit more open about the limitations of the camera. Instead of making the format and software closed, and kind of frustrating to use, they could have been upfront about "effective" megapixels and allow developers to find interesting creative uses of the tech. They used a lot of hocus-pocus bullshit (mega lens arrays whatever) when Lytro cameras were actually very well-engineered devices with state-of-the-art high megapixel sensors, but most of that interesting data was not made available to customers who might find something WAY more interesting to do with it than refocus, which any camera with optical focus can do anyway w/o throwing pixel data away.
you can use the software to rotate the focus plane in 3 dimensions. Results in some interesting 'tilt shift' type images etc. but more a novelty. More useful is adjusting the fstop and focus plane with high precision. I do wish they had opened up the format more etc.
@@carylittleford8980 it's nice that reverse engineering has been done. I think one potential application for this data is in AI, for instance training a neural net to use the light polarity data from a photograph with tools like ControlNet/Img2img to more accurately infer depth and lightning from photos, or who knows possibly completely relight scenes.
@@rbus You don't need to reverse-engineer anything. The Lytro's desktop app straight-up allows the export of depth map to get 3D images. And at medium range, those are quite accurate.
@@gabrielhmi The problem with the depth information is that it just isn't very good. It's splotchy. I can get way more accurate depth data by using AI (Automatic1111's depth addon is crazy good on almost any 2D image). What I'd like is the pixel polarity data straight off the camera, using it's internal table. This information could be used to train a neural net to relight a scene or perform much more accurate depth generation.
@@rbus What you are talking about isn't "depth information". The depth map the Lytro gives you is an actual depth map in m, with millimeter accuracy. It's a true measurement. What you get using AI isn't even remotely comparable : it looks good, but it's absolutely not accurate, nor is it a measurement : it juste decompose your image into a handful of depth planes.
The main application for the lytro, and what their IP is currently used for is industrial parts inspection and 3D modelling.
I have one of those deep lodged in memories of reading a Popular Science/Mechanics article that reviewed a cellphone and they confidently claimed no one could ever justify more than 2MP cameras.
Imagine if this had been adopted to video! It could have had AMAZING VR applications with today’s eye-tracking. Imagine if you could record 3D light-field video. You could refocus your eyes one what’s close or what’s far naturally.
Yeah. Apparently the VR camera they made had playback allowing 1 meter of eye movement in 6 degress of freedom in VR. So ducking to look under a table, or see around a corner in a mirror were all easily to do in real time. Rather cool.
@@carylittleford8980 exactly.
They actually used similar approach for 3d movies, like Avatar. But cost of a camera is five-digit value and they required several at one time. Also, somewhat similar tech is used by satellites. So.. not eally unique tech, a daring way to apply existing stuff.
Still have my Lytro. Wished they grew bigger and made fresh new things. Would be cool to see what products they would be making in 2023.
I could see something like this being used for macro photography. Say, an archeologist documenting a newly discovered artifact before it's shipped out.
I can also see it working for cinema; some directors are notorious for doing take after take, and perhaps some of that could be handled by futzing with the images in post.
I use the Illum for lots of 3D macro work. Being able to move the images focus or just close up the fstop till the while scene is in focus. Every shot has a depth map as well. So could be useful for archeologist's , record keepers etc.
Very interesting. I was always curious about this camera and its history. Great video.
Thanks for watching!
A classic solution in search of a problem…
To me, Lytro was a solution to a need that nobody had, with tradeoffs that nobody would accept. At the time I’d been testing and reviewing consumer digital cameras for over 10 years and my immediate reaction upon seeing it was “WTF are they thinking”. As it seems you experienced, the much-vaunted refocusing ability was very limited, and only really noticeable when you had objects in the very near foreground.
As you noted, there were a lot of very smart people involved, so it always baffled me that no one called it out as uselessly impractical from the beginning.
I think ultimately, all the smart people were so wrapped up in their technology that they never stopped to ask what the end-users might want.
For anyone completely obsessed with having the ability to "post-focus"... You might check out panasonic's lumix cameras. Even the ~$350 used G85 has a mode to do this... although this is accomplished with 1 very good sensor taking exposures rapidly as the focus plain sweeps from shallow to deep over fractions of a second (rather than an array of low quality sensors all taking exposures at once).
Actually, impressed with the idea of mousing over an image on the web and seeing focus point change! Never occured to me! Very cool!
Dude, you are a great presenter. It is a pleasure to sit here listening to you (and watching, obvs.).
Step 1: Build a camera with 20 tiny laptop cameras inside. Step 2:put a slightly different lens on each of them. Step c:Have them all shoot at once
but take turns sending data to the memory card. Should be less than a second for all of them. Step 4: Afterward choose the photo which is
the least blurry. Ta-Dah! After-the-fact focusing!!!
I bought Lytro illum only back as a year ago, pretty much as bargain. I played with it, it seem complicated at first, sadly no manual. I did take few photos and I was amazed at the focusing after it has taken photographs. I understand and thought it been sold to Apple (That what I been told). I hope I get the other cameras soon.
Was sold to Google a few years back. Mostly for the software patents. :-/
Lytro sadly never ever arrived here in the Philippines! I heard of them in the online news, but it never arrived here.
The concept was cool (especially in the early 2000's) but with modern day AI-powered face/eye and subject detection, the times I've had out of focus shots I could basically count on one hand and I've shot hundreds of videos. Just check that the green box is on your eye and start shooting! I love how crazy the data rate was on their cinema product. Just go buy a suitcase full of hard drives
Subject detection is limited.. as you say it requires it to usually track on a person. That does not apply to so many different forms of photography.
With a lighfield camera you can also change the fstop from 1 to 16 as well as tilt the focus plane in 3 dimensions. All done with optical maths for accurate bokeh, I find the AI solutions in all modern smartphones never look as they should.
@@carylittleford8980 yes but I can shoot several hundred photos in a gathering with friends while never learning a single thing about photography and ever having edit even a single photo using the smartphone + AI adjusting.
This product is way too in between. I don’t know why they went direct to consumer and with such a “mid” product. Either aim mass market, and go wider and focus on just one thing, say a sliding dynamically adjustable 3D focus, or go more pro/semi-pro.
Even better, just sell or lease to companies like Canon or Apple and they can worry about the consumer.
@@astronemir for me it's the quality of the 3D output for macro photography I use the Illum for. Nothing I can access can beat it's 3d effect and that's my target niche. The depthmap is makes has full parallax, very unique even to this day. It's basically a quick point and shoot compared to other 3d setups : ) I can export content as stills or really cool slow pan video for the latest VR setups, new gen 3d tvs just coming available and other emerging markets (Lume Pas 2) etc. It also looks strange as a camera and that alone, just being different, gets me extra attention and thus work. I still use my phone for happy snaps, but when the scene has a really great 3d composition I wish I had the Illum handy. The first gen camera, the little boxy one is pure novelty and the Illum does a lot more after the shot. Google has the Lytro software patents, so I guess they can use that to train an AI to do focus adjustment in a more optical maths way compared to the current gen. :)
Oh my god , i had seen some ads for this product everyone was thinking dslr will be outdated in no time for retail consumers , great you bought a video about it 😮😮
I would actually go in the opposite direction of your conclusion. The problem of lightfield imagery isn't that the technology wasn't capable, or that it was behind the times. The problem of lightfield photography is that current technology isn't advanced enough to make the technology practical.
When you capture an image with a smartphone, you capture a single image that gets stored and compressed. When you do the same capture with a lightfield camera, you end up with an image information volume that is MUCH MUCH larger, because you add onto the image all of the lightfield data. For example, a lightfield image captured with a 12-megapixel camera at a 90-degree field of view and 100 light rays per pixel would be approximately 120 megabytes in size. A regular photographic image captured with the same camera at the same resolution would be only about 3 megabytes in size.
To deal with such an extreme data volume, Lytro had only one choice: reduce the size of the captured image. A 1080 x 1080 (1.6 MP) image might have been way behind what other cameras were doing at the time, but with the smaller image, the data volume became much more manageable.
I have a feeling lightfield photography will make a comeback as storage space continues to become less and less expensive. In time, I think Lytro will be seen as another company that had ideas that were simply way ahead of their time.
Yes, although I don't think that it will completely disappear, but I wouldn't expect to see it any more than what you see in Canon's 5D MKIV where there's a left and a right subpixel which can be used for slight adjustments to focus or to give the camera more information about noise on the sensor. (And by slight, I really do mean slight, the two subpixels are very close together and you don't get much room to adjust it)
The answer is much easier: The camera was a solution in search of a problem. The selling point was “you can finally refocus your image”, but no consumer needed to refocus their images, as their small phone sensor + lens system of that time could be considered hyperfocal focusing. When Lytro finally understood the real value, i.e. their cameras were actually capturing a 3D scene, it was too little, too late. And in any case, most people prefer resolution to 3D (with basically non-existent visualization option).
I was trying to write a devil's advocate comment about how tech startups aren't overvalued because free market, valuation multiples, growth rates and whatnot but I actually do believe they are overvalued. I'm not sure if these venture investments actually pan out on average. I suspect they don't and there's a survivorship bias kinda thing going on. The venture funds that invested in hugely successful companies got a huge payout and continued to exist using this terrible philosophy which resulted in future tech companies receiving similarly high valuations.
awesome shoutout for the HTC one M8!! I used that phone for many many years and it still works great to this day
“How can a company with that much money and well-known names fail?”
**Quibi shuffles feet**
I appreciate your reviews because you are primarily a Mac guy and you actually pay attention to usability. I recently bought an Oxo pizza cutter with very high ratings on Amazon. While washing it, I dropped it a few inches in the sink and the welds in the handle came apart, exposing sharp metal. Did anyone actually try to wash the item they reviewed? The use case for kitchenware includes washing it hundreds of times, and dropping it in the sink will happen. Reviews aren't using something once and concluding that it seems to work. You need to use something in a realistic manner, and not treat it like a delicate flower.
Anyone else watching this never hearing of Lytro?
The only time i heard about the Lytro in 2013, like a year after it came out. There was a little hype around it in the videographers/photographers community but nobody really used it, everybody was just reviewing it and never use it again. And now I'm watching this after 10 years.
I remember there was a social network for Lytro photographers and it was very cool. They used gifs to make the focal change part of the final art. But, at the end of the they, it was a photographer's toy, not a professional camera.
I love when we get a Computer Clan field trip 😁
I own two and love them. I used them for years. Still pull them out once and a while.
I have a red hot. I picked it up a few years after launch at a closeout store for about $50. It was a fun device to kick around with, and it had a lot of potential, but you nailed it when you pointed out the effect that smart phones had on the company. Also, I think a square format photo was just kind of off-putting for a lot of people. All that said, I wonder if this is a product that time will be kinder to. I feel like in a few more years, Lytro might become the new Lomo and experience a surge among hobbyists and people looking for retro novelty.
Listening to you talk about and explain all these different picture taking mechanics puts me in mind of Captain Disillusion and his explanations of how video/photo editing works. You both make these concepts easy to understand and are enjoyable to listen to.
The low resolution was a byproduct of needing the microlens array (they might have been doing it using a patterned sensor filter, but i digress). To do the refocusing trick it probably had a raw image sensor of 10+ mega pixels, which was closer to state of the art. Unfortunately when that sensor data is processed to focus it the resolution is much smaller. So it necessarily was many years out of date in resolution, and would always be unless they incorporated very expensive sensors, thus pricing themselves out of the market. I congratulate them for making it as far as they did with it…
Plus, 2012. The market for standalone cameras was not doing great then: Phones got good enough to really eat in to it.
I’ve used (and still have) a Lytro Illum camera purchased from Woot years ago at a discount. It’s genuinely cool tech except for the resulting low picture resolution. It was just a few years too late and as they say timing is everything.
It’s still fun to dust it off and use every now and then.
This came out during the peak of my photography interest and I still think of it on occasion. I found the idea of the Lytro very novel and quite enticing. I passed on the first model due to it's form factor, resolution and price. When they released their second camera - I was very tempted but a low-resolution "bridge camera" for big-boy DLSR money was not appealing. I'm sure I was also turned off by the proprietary software but who could remember 🤷
It was cool to learn that they went on to make more stuff before fizzling out. As impractical as it was - That cinema camera looks pretty dope!
Thanks for this! I always wondered what happened after the Illum. This whole thing was such an interesting idea, held back by people just wanting the most pixels possible.
I was an early adopter of Lytro and love it! it's such a shame that it never really went anywhere. I used it for my stepdaughter's wedding and it was great, as someone with better things to be doing than playing photographer, to be able to just copiously snap pictures with less attention to composition or focusing carefully and be able to worry about that later. Would it ever replace a DSLR for serious photography? no, I don't think it ever would. But did it drastically improve the photo quality of someone who wanted to document an event without spending all their time worrying about camera settings? absolutely.
Personally, I think if you are just snapping away and not paying attention to composition / focus, then pretty much *any* camera from the last 10 years in an auto mode - especially Cell Phone cameras - will do the trick as well as a lytro, and with better resolution. I mean, it's really only "saving" you from watching if AF hits, but that's not actually a huge problem with most modern cameras.
And on the other hand - if you do care about composition, there's not a lot of "worrying about camera settings" if you're at all familiar with your camera. I can dial in a shutter speed to freeze motion, and then let the camera take care of the rest of the settings, adding flash if I want to turn it on and set to TTL. Not a lot of actual settings to mess with.
it's still very cool that it had a *real* light field sensor and not the fake ones that modern phones have, the ability to ACTUALLY fix the focus in post is something that it's just not possible to do on a standard camera when said camera decides it wants to focus on the random bird in the corner rather than the person taking up most of the frame
The Holy Grail would be a light field screen that could be used as a virtual window. Imagine seeing Central Park from a basement.
Hilariously the closest thing to hearing about anything to do with light field cameras I've seen since these things failed is a subtle reference in The Expanse series. In season 3 of the show the crew is being filmed by a drone camera and the user interface says "Light field enabled". haha
Well done! Enjoyed watching this. The first two Lytro products (consumer, semi-pro), were high-priced within their categories and they produced lower-resolution images. The focus thing is cool, but it's difficult to compete if you only offer one unique feature, while making the core camera features (image quality, usability, lens selection, etc.) less capable than virtually every other competitor. I don't know the Lytro leadership team, but it seems like they got enamored with the focus feature and thought it would be sufficient to get camera people to switch. Especially in the semi-pro arena, those folks aren't going to abandon an investment in their current camera brand (Canon, Nikon, Fujifilm, etc.) unless you offer them something dramatically more compelling in every way. Plus, as you mentioned, cameras in phones happened, killing the consumer camera market.
Now that's a camera bump on the back of the screen lolol
Hah! Love the old-school iPhone in your background. Used to work tech support for Big Planet back then. Found your video on it and left a longer comment there.
OMG, seeing Lytro took me back a decade. Even then it wasn't actually a scam, just adjacent.
As someone who never knew about this I really like the design, physically.
Did you say "electric blue?"
My inner Icehouse is flaring!!!
I just freeze
Every time you see through me
And it's all over you
Electric blue
On my knees
Help me, baby
Tell me, what can I do?
Electric blue
Wow - this is one of the best produced and presented videos I have seen in a long while. Par excellence!
I picked up one of these from woot for $50 in 2015. Forgot about it until I saw it with the rest of my camera gear a couple weeks ago. Everything you said about it was spot on. It was a neat idea, but I never used it much. Glad I didn't pay the original MSRP for it.
I think you are right about the timing. Around 2012 I was pondering whether to buy a new phone with a 'good camera' or keep my crappy phone and just get an actual good camera. Phones were advancing quickly at the time, along with their camera capabilities. At $500 I think the Lytro would have been way out of my range, especially given what the same amount could get me in a second hand DSLR at the time. Had that price been the same 6 years earlier, when the nice DSLRs were still over 2k, the Lytro would have seemed a much better value.
I stumbled onto this channel yesterday. Great videos, muh dood. Keep up the good work!
19:00 I disagree.
That camera wasn’t what we needed in 2006 either. They failed because they didn’t really understand photographers or our needs. Nobody needs anything they made.
I miss my HTC 3D Phone Camera with 3D Screen!
I absolutely loved the concept of the Lytro when it was announced, but I couldn't afford it.
It's best to get the focus right from the start, but the very idea of being able to refocus afterwards is great, especially when considering that what is perfectly sharp on your 4-12" preview display, may be badly unfocused when viewed on your 43" monitor and a waste of money to get printed.
Being able to refocus and re-expose photo's after the fact by recording all available light data feels like an absolute gamechangers for non-professionals.
And here i am, an owner of a hardly used Graphite Gen1. Simply the fact it only really worked for cirtain ways of shooting pictures, paired with the low resolution (I personly even thought the output of 1080x1080 was too high for what i personally was getting out of the camera). The camera also turned out to be something you can hardly resell, people couldn't grasp it.
Great analysis -TYVM. So sad that his amazing vision couldn't make it back then. Ren Ng apparently a visionary way ahead of the tech needed to make camera work.
I still occasionally use my Lytro Illum (which I got in late 2015 discounted to $300) and it's a fun toy, but not a general-purpose camera. I think I'll take it on my next kayak outing, it should do well for the on-the-fly photo ops of wildlife that have been frustrating with other cameras.
Imagine if this was built today. My Sony 2.8 and 1.4 lenses start at $600. So if they made this for $500-800 with new tech, I would be on it.
Very cool and unique! The design reminds me of a square version of the Apple iSight FireWire camera. It looks quite nice, but it seems like it would have worked even better if it had a manual focus ring/wheel instead of the touch sensitive surface. The display also looks like the iPod nano 6th gen, but at a lower resolution. If they took the display on the iPod and used them instead, it would look better while maintaining the same form factor. It’s still an interesting device, but the company might still be in business if they focused more on upgrades to this style of Lytro and not focusing on higher end pro markets. Great video!
I just bought one of these the other day from a thrift store for 6 dollars in Pink. For that price, its a super fun novelty camera with a funny/weird sidenote of photography history attached to it.
I never forgot about these, I always wanted to see one and wondered why it just seem to go nowhere? I guess having such an out of date aspect-ratio also killed it.
So basically it just shot long depth of field images and applied a filter to create a fake blur? Because they made it sound like it was an optical process.