Nice one! The R1 is such a cool proof of concept. The company knew exactly what it was doing to release it in a fidget style gadget as opposed to an app. It has definitely grabbed my attention
The Rabbit R1's AI isn't "right on it" as you mentioned, but in the cloud, where it accesses its Large Action Model (LAM). The cloud placement also lets it provide a web-based portal ("rabbit hole") for integrating external services.
As a student, one thing I imagined was getting it to listen to long lectures and then summarise all the information, maybe even making a PDF with all the summarised content and a voice note, and saving that for me for later when I'm ready to study. That would literally shave hours off my day.
Perhaps Rabbit created the R1 device as a gimmick to generate hype that they can spin into their eventual AI app, which otherwise would have gone unnoticed 🤔
Ellis is gold. While you guys were discussing use cases for the Rabbit R1, I was trying to find one as well. I already have a pretty expensive phone that takes up enough space in my pocket as it is, and I'm someone who triple-checks my calendars, notes, messages, etc.
I'm curious whether the teaching concept could learn to use PC programs like AutoCAD. It would be an easy way to automate work if it can learn programs that aren't web based.
Just discovered this podcast a few weeks ago and I've been loving it ever since. Went back and watched some old episodes too. It's like The WAN Show, but if they actually talked about technology and stayed on topic.
It's interesting that tech people aren't "everything" kind of tech people; we all have our own domains. As someone whose domain is AI professionally, Rabbit is a hype product. It's not going to compete even with the apps on your modern phone in a few months' time, as AI has advanced SIGNIFICANTLY recently. Running an online, cloud-dependent device makes little sense when on-device generation (so your requests NEVER leave the device) is already possible for enthusiasts like myself and will soon come to the global market.
I'm also in the field of AI. I'm really curious about the architecture for the LAM. I would imagine that it is a transformer trained to predict masked app elements. This could embed the features for the elements in the context of the page and would seemingly be very useful in down-stream "action" tasks.
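To make the masked-element speculation above concrete: the data-preparation side of such an approach can be sketched in plain Python by serializing a page's UI elements into tokens and masking one at a time to produce (context, target) training pairs, in the spirit of masked language modeling. Everything here, the element schema included, is an invented illustration, not anything known about Rabbit's actual LAM.

```python
# Hypothetical sketch: turning a UI page into masked-prediction training
# examples. The element fields (role/text/action) are made up for
# illustration; this is not Rabbit's real pipeline.

def serialize_element(el):
    """Flatten one UI element into a single token string."""
    return f"<{el['role']} text={el['text']!r} action={el['action']}>"

def masked_examples(page):
    """Yield (context_tokens, target_token) pairs, masking each element once."""
    tokens = [serialize_element(el) for el in page]
    for i, target in enumerate(tokens):
        context = tokens[:i] + ["<MASK>"] + tokens[i + 1:]
        yield context, target

# A toy "page" with three interactive elements
page = [
    {"role": "button", "text": "Search flights", "action": "tap"},
    {"role": "field",  "text": "Destination",    "action": "type"},
    {"role": "button", "text": "Book",           "action": "tap"},
]

for context, target in masked_examples(page):
    print(target, "<-", " ".join(context))
```

A model trained on pairs like these would learn to predict what element belongs in a given slot of a page, which is one plausible route to the downstream "action" features the parent comment describes.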
What I like about this in comparison to your phone is that it is focused on doing "the thing" it is intended to do. No distractions for people like me who may start down a rabbit hole when on their phone. The talk mentioned how it would help cut down on phone time and all the entertainment distractions. And it's perfect for generations not interested in excess on the phone.
FYI about the plane thing: the error they got about cabin pressure, they would test the plane and it kept passing, and apparently a faulty sensor can sometimes trigger it, so they were not just ignoring it, though in hindsight maybe they should have checked some things. But because that was a hard-mounted plug on the side of the plane, since their configuration didn't need it as an extra emergency exit, I'm guessing they assumed too much that it was to spec. I think also the phone was locked, but it didn't have any passcode to get in, which is wild to me!
Oh my god Ellis has just made my Saturday morning. It is so funny hearing his commentary at the show and then your guys's reaction. I'm literally laughing as I'm typing this
One point I'd like to make about the Rabbit recording feature: it will break like macros do if the recording only looks at the actual visuals of the page. But if it does any sort of network request analysis, it could probably largely just map the user's clicks to the important network requests and learn how to use the API for any given app. If that is the case, I think you will see much less of this "macro breaking" problem than you seem to think, as APIs tend to be more stable over time and are progressively enhanced for UI features.
Setting aside the fact that private APIs change often, the main problem is that APIs rely on authorization that is not only created _from_ your credentials (auth flow) with short-lived tokens, but also tied to actions made in the app to prevent Cross-Site Request Forgery (CSRF). It would have to reverse engineer the app and API to understand the flows and side effects to be useful. Also, it would be pretty bad for security if it could read network calls, as those contain sensitive information that is not displayed on screen, such as passwords and authentication/authorization tokens. It would be a huge security liability.
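Whether the network-level idea survives the auth and CSRF objections in this thread is an open question, but the basic correlation step the parent comment imagines is easy to prototype: attribute to each recorded click the requests fired shortly after it, by timestamp. The event shapes below are invented for illustration; a real recorder would hook a browser or proxy.

```python
# Hypothetical sketch: correlating recorded UI clicks with the network
# requests they trigger, using timestamps. Event formats are made up;
# per the thread above, a real system would still face token/CSRF issues.

WINDOW = 2.0  # seconds: requests this soon after a click are attributed to it

def attribute_requests(clicks, requests):
    """Map each click target to the request URLs fired within WINDOW seconds."""
    mapping = {}
    for click in clicks:
        mapping[click["target"]] = [
            r["url"] for r in requests
            if click["t"] <= r["t"] <= click["t"] + WINDOW
        ]
    return mapping

# Simulated recording session
clicks = [
    {"t": 0.0, "target": "search_button"},
    {"t": 5.0, "target": "book_button"},
]
requests = [
    {"t": 0.3, "url": "/api/search?dest=TYO"},
    {"t": 5.2, "url": "/api/book"},
    {"t": 5.4, "url": "/api/telemetry"},
]

print(attribute_requests(clicks, requests))
```

Even this toy version shows the fragility: the `book_button` click also picks up an unrelated telemetry call, so a real system would need filtering heuristics on top of pure timing.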
47:30 The R1 just reminds me of those personal language translators. If it doesn't replace your phone, then it's an extra device you have to carry around. I'm sure at some point, Apple/Samsung will allow a similar function to the R1 with their respective 'action' button.
That's an interesting perspective, but it's important to keep an open mind when it comes to innovation and technology. While it's true that some gadgets may focus more on aesthetics than practicality, they can still have their own unique value. The founder of Rabbit may have a different vision and approach to creating products, and it's worth considering that their gadgets could bring joy or inspiration to certain individuals. It's always exciting to see different ideas and designs in the tech industry.
I do not know if the Rabbit device will be the final device but the concept is right. Why wouldn’t you want an actual assistant rather than a myriad of apps that you first have to load to complete a task? Isn’t that the promise of one aspect of AI? If this can work as advertised, this is a big deal and could challenge the phone we see today for your pocket space, relegating screens for larger devices. I am 70 years old and the reaction from several of you sounded like it should have been coming from me not young techies.
Wow~ It's a little assistant in your pocket that reminds you of your schedule when it's time to go, reads your texts out loud, and even more amazingly, knows which method to use to get the best results based on the situation. And I believe it can even predict the next outcome of an action through its vision perception of the object. It's a really, really amazing device.
The problem with this product is that people will need to figure out its main use case. But this company is definitely onto something with the Large Action Model.
Couldn’t stop laughing when Andrew asked Marques if he’s planning to buy the new Apple Vision Headset for every member of the studio. The look on Marques’ face was priceless!😂
I just can't see the R1 being relevant for long. A nice idea, but there is so much existing hardware that can do the same thing. It's only a matter of time before we see this in smartwatches, home assistants, etc. It will be interesting when all this tech and VR/AR headsets mature together; maybe one day we'll simply have it all via a contact lens, earbuds, or smart glasses.
Anything Teenage Engineering touches turns to gold. I'd say most people buy this because TE is involved. I hope they continue to be involved, because their design thinking is like no one else's.
These CES correspondent segments are COMEDY GOLD. Funniest thing I've seen so far in 2024. Anything that involves Nikola Tesla is based on strong science.
Humorous to me that David and Marques were talking about how Google Assistant is so close to being able to do all the things the Rabbit R1 can do... only for Google to lay off like 200 people on the Google Assistant team while stripping features from the service.
Former rabbit owner here. Rabbits make various sounds. They don't really have a main vocalization in the way cats meow or dogs bark, which is why they're considered quiet animals, but they do make sounds for different reasons. When a rabbit is feeling threatened it will make a grunting or snorting sound. When a rabbit is feeling comfortable and content it will make a purring-like sound by gnashing their teeth together. Essentially vibrating their teeth. When a rabbit is in extreme duress, like thinking it will die, it will scream. Baby bunnies will make squeaking sounds but that tends to go away pretty early.
Just some feedback for Andrew, the upspeak is crazy man. He literally talks like this? Like? Every single thing is a question? I _wish_ I could unhear it? But once you notice it it's kind of hard not to? It'd be ok if it were here or there but it is virtually every single sentence. 10:13 - 10:35 but really it's all throughout. 12:30 -12:40 literally 4 examples per 10 seconds lmfao this could be a podcast game I think the editors should try and count every single time Andrew does this with a sentence? Ellis was great and great podcast otherwise
34:07 I wish project LINDA took off because I remember back in 2012, Logitech had an app you could download that would turn your phone into a universal track pad. I remember using my iPhone 5 for that feature.
For me, I don't even care if Google can replicate the functionality of the Rabbit; I'll still get it. How often do we see cool gadgets like this?
There's no way Rabbit isn't acutely aware that Google can do exactly what they're doing. They want Google or someone else to buy their LAM. No one is ever going to buy the device. It's just a gimmick to get people interested in their software. Also at 1:05:27 David literally described a Turing machine haha. This is Alan Turing's future and it makes me so happy to see him getting the recognition he deserves.
Speaking as a paratrooper: I have a buddy who dropped his iPhone from his pocket upon exiting the bird at about 1,100 to 1,200 ft AGL. I suspect terminal velocity for an object of that mass and shape is reached well before impact. We later found his phone perfectly fine on the drop zone.
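For what it's worth, that hunch checks out on a napkin: a tumbling phone's terminal velocity, and the fall distance needed to approach it, can be estimated from the standard drag balance. The mass, area, and drag coefficient below are rough guesses for a typical smartphone, not measurements.

```python
import math

# Back-of-the-envelope check on the falling-phone story.
# All numbers are rough assumptions, not measured values.
m   = 0.2           # kg, typical phone mass
A   = 0.075 * 0.15  # m^2, face-on area of the phone
c_d = 1.2           # drag coefficient, flat-plate-ish while tumbling
rho = 1.2           # kg/m^3, air density near sea level
g   = 9.81          # m/s^2

# Terminal velocity: weight balances drag  =>  m*g = 0.5*rho*c_d*A*v^2
v_t = math.sqrt(2 * m * g / (rho * c_d * A))

# Characteristic distance over which v_t is approached: v_t^2 / (2g)
d = v_t**2 / (2 * g)

print(f"terminal velocity ~ {v_t:.0f} m/s ({v_t * 3.6:.0f} km/h)")
print(f"approached within a few times {d:.0f} m of fall")
```

With these assumptions the result is on the order of 15 m/s, reached within a few tens of meters, so a drop from ~350 m (1,100 to 1,200 ft) is indeed far more than enough to hit terminal velocity, and a tumbling phone lands far slower than intuition about a 350 m fall suggests.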
I have a blind friend. She and I brainstormed a bunch of ideas for the Rabbit R1. This device beats what smartphones are doing right now: reading medication labels and expiration dates, helping cross streets at intersections, interacting with smart devices (Alexa does some of this already, but there's so much more). My elderly parents can't use a smartphone; they have a tough time using Messenger and can't video chat with family unless the younger members of the family set it up. This would definitely simplify everything.
Those vibration plates are actually really helpful after whiplash. I was skeptical too, but it helped. I'm not sure the comparison to amethyst is fair; it really helps relax muscles, and short periods in 2-3 yoga poses feel like an hour in a session.
7:25 The fact that I actually got an ad from Mous saying that the phone that fell out of the plane used a Mous case, and then in the same video you talk about this and mention someone should do something like that, is amazing and a little concerning.
For me the R1 is exciting because of two things: no subscription fees for access to their model, and the teach mode. It is basically a model that learns interfaces. Can I teach it AutoCAD? GIS? Unity? Even better, can I use footage to teach it? And yes, it is "like macros", but far from it technologically. I also saw on their website that they want to have a store; they are doing that.
54:45 - It's not that the R1 is phone sized, it's that a phone is hand sized. The R1 is a 'hand'-held device, after all. I regard the R1 as an electronic notebook. However, instead of referring back to my notes to be reminded of which tasks I need to complete, each 'note' takes care of those tasks for me too.
Very cool project, but here are some tips I think would be good:
1. After giving the Rabbit a complicated task like the trip booking, it may take time, so I suggest adding an option to start a new task so we don't get stuck waiting for that one task to finish, kind of like tabs in Chrome, with a progress bar showing how much of each task is completed.
2. Another annoying problem I face is that the image recognition scans everything. In the fridge demonstration, it scans all the elements in the fridge, which is great, but in reality there is a lot of odd stuff in normal people's fridges, like beer or chocolate. I want an option to select a few items and get suggestions from just those: take a photo of the fridge, circle only the eggs and butter, and tell the Rabbit to give me a recipe from those ingredients.
3. It would be crazy helpful if it could detect a hand pointing at something and work out what it is pointing to, since humans point most at things they don't know. Imagine pointing at a car and asking the AI what model it is, or pointing at a fruit or plant you don't know the name of and asking what it is.
4. Another suggestion: show the actual process of the AI doing things. It would be way cooler to see it in action, opening apps and selecting options, rather than a loading screen. Like Jarvis.
5. Voice recognition, but not like talking to a robot. I can't say the entire trip plan in one go; voice recognition nowadays is mostly useless because humans need time to think, and there are filler words. Can you imagine writing a script on paper to read in one go without any fillers? Also, if you pause a bit too long between words, the AI takes that as the end and starts searching. So I suggest adding a recording option, so we can record what we need, take our time explaining, and then the AI can analyze the task in the recording. This would be super helpful.
I have more suggestions too, but these are the most crucial flaws I have noticed using various AI tools. ALL THE BEST FOR THE FUTURE
6:40 The reason nobody got hurt was that the plane was only 6 minutes past takeoff and still climbing. Everyone was still wearing their seatbelt, and it was only at 13,000 ft at the time of failure; cruising altitude is 30,000 ft.
The R1 seems very useful for the blind, the elderly, and maybe children to an extent. This may not be for everyone, but there are people who can benefit from it. We all age, so there will always be consumers for it.
My dog was very intrigued by the rabbit sound 😂
Most people don’t realize the reason dogs love squeaker toys is because it triggers their hunter instincts for small animals 😅
My cat too 😅
It's a real live squeaky toy!!
same mine went nuts searching for something
same
Ellis' Reports are incredibly funny. I genuinely cannot get enough of them!
I love how the podcast is getting more relaxed and more informal, been following it since the first weeks and it felt a bit more "strict" than it is today, it felt more like a news anchor dynamic, now there's a lot more jokes thrown around and the vibe is just better, feels more like some friends having fun and talking about tech
Yeah they seem a lot more relaxed now
I entirely agree. I have immense respect for what Marques has built over the years, but through the Podcast, I realized that he's sometimes a bit rigid or overly correct. I understand it's probably justified with all the responsibilities, etc., but with such a great team like his, I always wished he would relax and goof around more. The topics they discuss are obviously interesting, but personally, I think it's more about the characters, their personalities, and the way they talk about it. I truly enjoy the team discussing all sorts of things while also showing their human side. It's super fun.
I prefer the Lew Later Show, but this isn't a bad option.
That's what experience leads to.
@@DrRDWugh I miss Lew Later so much
Ellis’ coverage is the best coverage. I would legitimately watch an hour podcast of him interacting with the absurdity of CES.
We need a live feed 😂
OMG I WAS LITERALLY CRYING WITH LAUGHTER @@a.girl.has.no.name_
Any matter and its disappearance, after a complete analysis, a comprehensive conversation, this is a meaningful conversation that human thought has made on its past state, in which all efforts have been learned.
Word! :)
Ellis' report on the vibration board feels like a really well written skit. It just keeps getting better as it goes on
What part of it do you think wasn’t written?
Ellis has definitely missed his call as a storm reporter. His reports are amazing 😆
*david
This will get lost in comments, but you guys failed to mention how useful this is for older folks and the legally blind. I've been looking for something like this for my dad who is blind. Its perfect, you guys talk about this being an app. My dad can't even see the screen. It's an amazing creation
I know a blind person who uses an iPhone with VoiceOver, and he uses it normally like anyone else and can even write messages. But overall you are right, this could be of great help if it's as much of a no-brainer as they marketed it to be.
I don't get the point. That's why phones have Siri etc.; you don't need to see the phone. If this were just the LLM+API portion, it wouldn't be interesting, because all the other players will have something like this within the year. The interesting part is clearly the LAM, pretty much an AI macro machine, which nobody has right now.
@@biglevian But to set up an iPhone initially, you need to see the screen to start. I agree that it's helpful to the visually impaired and a good AI macro machine.
My dad is 65, not exactly great with technology, and Siri is a pretty terrible AI. You really have to be blind, or live with someone who is, to understand how trash it is. @@biglevian
@@comicaiadventures No no, I agree that the current players from Siri to Alexa are terrible right now. But there are strong indications that all of them plan major enhancements allowing a lot more control, recognition, and better app integration.
The live reporting from CES 😂 10/10
I want to see someone book an entire holiday on the Rabbit R1 like in the demo and then actually go on the holiday it books for them and make a UA-cam video about the experience!
Or try and book a holiday to the moon and see how rabbit responds or something obscure
or try booking a book, and reading it
see if the Rabbit could guide anyone to becoming a millionaire in the shortest time with its abilities
One of my core memories from childhood was being asked what sound a rabbit makes and being told I was wrong when I made squeaking noises. The first sentences of this episode sent me down memory lane.
Villain arc has commenced
my mom was once on a farm and was asked to "quickly and painlessly" kill a rabbit by grabbing its head and [vivid description redacted]. she was not strong enough, apparently. it went bad.
when i was a kid, i asked her 'what bunnies sound like', and she told me that story and described its very human-sounding screaming noises. So I became convinced that that was the default "hey, a rabbit is actually making a sound" noise, like, 'clearly a cowardly critter that runs and hides as a first defense, would only make audible noises when it is absolutely panicked. So rabbits sound like human terror. That is the noise they make.'
The sound in that recording is the one rabbits make when in extreme pain. Most of the time they are silent but they make different sounds if they're annoyed or if you manage to surprise one without also startling it.
The "too many taps" point: You have to not only consider the amount of taps it takes to do a task, but the fact that you have to navigate to the app on the phone AND know how to use the app well enough to do things quickly and easily. This could be an amazing accessibility tool. I can also see myself setting one up for a grandparent and signing them in to all their stuff so that life just works a little easier for them. I believe they'll have a web portal as well where you can do some parts of the setup. That might be even better for helping out someone who doesn't live with or near you.
Very on point. I have so many apps that it takes my concentration and two hands to flip through to the correct app. By just pushing a button, you avoid all that.
Rabbit should consider making a dash cam. Take the existing hardware they have announced and add a beefier camera plus video recording/cloud storage software. It could train on your driving habits and use its camera to alert you if it suspects you are falling asleep at the wheel, give more accurate turn-by-turn directions, and let you share more accurate pin locations with friends and family. This could be especially handy for older cars.
Dash mount
100%
You can do all of this with the hardware of your phone.
@@greentea5593 I totally agree. They don’t need to make a dash cam especially if the R1 is already capable of the features I suggested. But the product Rabbit is currently pitching is no more useful than the phone we already have.
@@joshunas7701 100%!
I'm struggling to understand why WVFRM suddenly bought into the hype a new company is creating. They've been better at filtering these things out in the past. I think they've even been dismissive of products that are much more realistic and potential than this.
Marques's face to the rabbit sound is epic. :)
my new go-to sticker for sure lmao
I’m excited about Rabbit’s teaching abilities where you could show it a UI process, like editing on Photoshop, and it can learn at the GUI level. Hopefully Apple will someday have this at the OS level.
Annual CES report should be a recurring segment in the show 😂
~ " Google, Call me an Uber home"
~ "OK, I'll call you 'an uber home'" from now on
This made me laugh so hard.
So the Alaska Airlines Plane, a Boeing 737 Max 9, was having a caution light for cabin air pressure. The pilots reported the issue but it was cleared to continue to fly, but was pulled from the Hawaii route. Initial reports said that there were some bolts that were not tightened all the way, then it was found out that some of those bolts may have not ever been installed.
Thanks… as a 737 pilot, I too wanted to make sure they knew the pilots and company did not ignore the cabin controller lights and were even more conservative than they needed to be with the ETOPS restriction.
Thanks guys, I want to fly with them in a few weeks and could use their pricing ;)
Yeah, it made it sound a lot like this was the fault of the plane's operators, when from what's known so far it seems much more like Boeing just did a poor job with the design, production, and quality control of its planes. Especially considering the 737 Max has already been grounded before, plus all the defects found with Boeing planes lately, that's the big story here: wtf is Boeing management doing? I was hoping to hear their perspective; as tech guys it might've been interesting, but that segment just seemed very poorly informed.
Making it out as if it was the operators' fault is very poor reporting and misinformation. Warning lights come on in every plane you fly; there are thousands of them, and they are always dealt with according to the maintenance directives and requirements, which may include noting it for the next minor service.
Rabbit could make this a choice between $200 single purchase device, or subscription based iPhone app (linkable to action button to remove “too many taps” limitation)
Love these Friday morning episodes. Helps me get through the silence of working remotely on Fridays
Nice episode. Quick thoughts re Rabbit R1:
1. They mention the upcoming rabbit store to share taught actions with others and ability to monetize
2. This tech is destined to be miniaturized into a single chip thats manufactured into other devices - look at what's happened with dedicated GPUs generally and Apple's M architecture
3. The device is v1 - it shouldn't be compared to mature consumer devices, it doesn't need to be perfect, have mass adoption, or even be a commercial success day 1
4. This product is an incredibly important step forward - it's made multi-modal AI portable. It proves it's possible with a fresh take using current AI tech and will motivate other companies and developers to respond
Yes, it is destined to be miniaturized, but it isn't unique hardware like a chip. They don't have an "R1 chip" that they can license to other manufacturers; it's a small one-purpose computer. This will be integrated functionality in the iPhone and the Apple Watch in the future, not a standalone product. (We went through that with carrying around a cell phone, camera, iPod, calendar, note pad, book, recording device, flashlight, etc. We don't want that.)
The R1 is a cool gadget, and the fact that it is a tactile product makes it interesting; many early adopters will want one purely for the collectability and to be first to something. But it will end up unused in a drawer very soon. No one actually wants another thing to bring along and charge every day. I don't think the investors agree that it doesn't have to be a financial or commercial success straight away. Because the truth is that the product is cool and the demo is neat, but multi-modal AI is something EVERYONE is working on. Apple, Microsoft, OpenAI, Samsung, Google: everyone is putting all their effort into making multi-modal AI. Google made a very similar demo of their version a few months ago, and we will see many more of these coming out during the year. Rabbit will not be able to compete with those companies. And IF the R1 is a commercial success, then maybe someone will buy it for the brand and some of the tech and integrate it into their product; otherwise it will be sherlocked within months.
The guy who is doing the CES coverage is a natural 😂
Nice one! The R1 is such a cool proof of concept. The company knew exactly what it was doing to release it in a fidget style gadget as opposed to an app. It has definitely grabbed my attention
Nobody will be talking about it in a year….
The "You get sucked out of a plane" quip about the Vision Pro was set up so perfectly by the first part of the episode 😂
The rabbit r1’s AI isn’t “right on it” as you mentioned but in the cloud, where it accesses its Large Action Model (LAM). The cloud placement also lets it provide a web-based portal (“rabbit hole”) for integrating external services.
Elis is a gem 😂
Man, I've been following this dude for over 10 years now. Definitely one of the nicest, most realistic people on UA-cam. Super smart guy.
More of Ellis reports !! Loved it 😂❤
As a student, one thing I imagined was getting it to listen to long lectures and then summarise all the information, maybe even making a PDF with all the summarised content and a voice note, and saving that for later when I'm ready to study. That would literally shave hours off my day.
Even better, give it to a friend so you don’t have to watch the lectures and do something useful in your free time
@@radwanakel604 this is the end of university man😂
I love that google assistant made a dad joke about calling you an uber 😂
Perhaps Rabbit created the R1 device as a gimmick to generate hype that they can spin into their eventual AI app, which otherwise would have gone unnoticed 🤔
This
Ellis’ report was informative and funny as hell 😂
Ellis is gold. While you guys were discussing use cases for the Rabbit R1, I was trying to find a use case for it as well, I already have a pretty expensive phone that takes up enough space in my pocket as it is and I'm someone who triple-checks my calendars, notes, messages, etc.
I'm curious whether the teaching concept could learn to use PC programs like AutoCAD. It would be an easy way to automate work if it can learn to use programs that aren't web based.
Finally, the weekend and the awesome waveform is here! Thanks, guys, and have a great weekend from the UK. Love the new car Marques
Matt!!!
Just discovered this podcast a few weeks ago and I've been loving it ever since. Went back and watched some old episodes too. It's like The WAN Show, but if they actually talked about technology and stayed on topic.
This is the best CES coverage I've seen in years!
I don't think the rabbit device will take off, but I like that companies are trying new things.
^ this comment will be laughed at in 8 years
Nah it won't. Best case, the company gets acquired by a bigger company
I'm willing to bet that it'll get acquired by a Chinese tech giant (probably not Baidu this time though... maybe Huawei)@@akinwunmioluwaseun3772
It's interesting that tech people aren't "everything" kinds of tech people; we all have our own domains. As someone professionally in the AI domain: Rabbit is a hype product. It's not going to compete even with the apps on your modern phone in a few months' time, as AI has advanced SIGNIFICANTLY recently. It's a cloud-dependent device at a time when on-device generation (so your requests NEVER leave the device) is already possible for enthusiasts like myself and will soon reach the global market.
Looking forward to the day we can use AI on our phones, the way that Rabbit promo video shows!
I'm also in the field of AI. I'm really curious about the architecture for the LAM. I would imagine that it is a transformer trained to predict masked app elements. This could embed the features for the elements in the context of the page and would seemingly be very useful in down-stream "action" tasks.
@@Crayphor function generating model, nothing new
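The masked-element speculation in this thread can be sketched at the data-prep level. This is a toy illustration only: the element names, the `role:label` serialization, and the helper functions are all invented for the example, and nothing here reflects Rabbit's actual (unpublished) LAM architecture.

```python
import random

# Toy sketch of a "masked app element" training objective, analogous to
# masked-token prediction: serialize a UI page into element tokens, hide
# one element, and have a model predict it from the surrounding context.

def serialize_page(elements):
    """Flatten UI elements into tokens like 'button:checkout'."""
    return [f"{role}:{label}" for role, label in elements]

def make_masked_example(tokens, rng):
    """Mask one token; return (input sequence, target token, position)."""
    i = rng.randrange(len(tokens))
    masked = tokens.copy()
    target = masked[i]
    masked[i] = "[MASK]"
    return masked, target, i

page = [("textbox", "search"), ("button", "search_go"),
        ("list", "results"), ("button", "checkout")]
tokens = serialize_page(page)
inp, target, pos = make_masked_example(tokens, random.Random(0))
print(inp, target, pos)
```

A transformer trained on millions of such (context, masked element) pairs would learn embeddings for UI elements in context, which is what would make them useful for downstream "action" tasks, as the comment suggests.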
What I like about this compared to your phone is that it is focused on doing the one thing it is intended to do. No distractions for people like me who may go down a rabbit hole when on their phone. The talk mentioned how it would help cut down on being on the phone and being pulled away by all the entertainment distractions. And it's perfect for generations not interested in excess on the phone.
FYI about the plane thing: for the error they got about cabin pressure, they would test the plane and it was passing, and apparently a faulty sensor can sometimes trigger it, so they weren't just ignoring it, though in hindsight maybe they should have checked some things. And because that was a hard-mounted plug on the side of the plane, since their configuration didn't need it as an extra emergency exit, I'm guessing they assumed too much that it was to spec. I think also the phone was locked but didn't have any passcode on it to get in, which is wild to me!
Oh my god Ellis has just made my Saturday morning. It is so funny hearing his commentary at the show and then your guys's reaction. I'm literally laughing as I'm typing this
Ellis at CES deserves an EMMY
Omg this episode is hilarious. Ellis is a legend and a perfect fit for CES as you guys say. Couldn't stop laughing 😂😂😂
The best CES coverage so far.
One point I'd like to make about the Rabbit recording feature: it will break like macros do if the recording only looks at the actual visuals of the page. But if it does any sort of network request analysis, it could probably map the user's clicks to the important network requests and learn how to use the API for any given app. If that's the case, I think you'll see much less of this "macro breaking" problem than you seem to think, as APIs tend to be more stable over time and are progressively enhanced for UI features.
Setting aside the fact that private APIs change often, the main problem is that API authorization is not only created _from_ your credentials (auth flow) with short-lived tokens, but is also tied to actions made in the app to prevent Cross-Site Request Forgery (CSRF). It would have to reverse engineer the app and API to understand the flows and side effects in order to be useful.
Also that would be pretty bad for security if it can read network calls as those also contain sensitive information that is not displayed on screen such as passwords, authentication and authorization tokens, etc. It would be a huge liability to security.
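The click-to-request mapping idea in this thread can be sketched roughly as follows. Everything here is illustrative: the event format, the `correlate` helper, and the example URL are invented, and, as the replies point out, real apps layer auth tokens and CSRF protections on top that would defeat this kind of naive replay.

```python
# Sketch: instead of replaying brittle UI clicks (macros), pair each
# recorded click with the network request it triggered, producing a
# request template that could be replayed later without the UI.

def correlate(clicks, requests, window=0.5):
    """Pair each click with the first request fired within `window` seconds after it."""
    actions = {}
    for click in clicks:
        for req in requests:
            if 0 <= req["t"] - click["t"] <= window:
                actions[click["target"]] = {"method": req["method"], "url": req["url"]}
                break
    return actions

clicks = [{"target": "order_button", "t": 1.00}]
requests = [{"t": 1.12, "method": "POST", "url": "https://example.com/api/orders"}]
print(correlate(clicks, requests))
# The 'order_button' click maps to the POST it triggered.
```

Even this crude timestamp correlation shows why the API-level approach is more stable than pixel-level macros, and also why it needs the auth/CSRF handling the replies describe before it could actually work.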
47:30 The R1 just reminds me of those personal language translators. If it doesn't replace your phone, then it's an extra device you have to carry around. I'm sure at some point, Apple/Samsung will allow a similar function to the R1 with their respective 'action' button.
I agree, Andrew. That truck was cool, with the back seat folding down giving you like two more feet. Awesome.
The founder of Rabbit has a track record of making aesthetically cool but useless gadgets. The career of someone rich's kid.
Yeah, the "doesn't matter if I lose a bunch of money, I still have millions to fall back on" vibe
how do you know?
Is it the guy talking on keynote? That's disappointing to hear.
That's an interesting perspective, but it's important to keep an open mind when it comes to innovation and technology. While it's true that some gadgets focus more on aesthetics than practicality, they can still have their own unique value. The founder of Rabbit may have a different vision and approach to creating products, and it's worth considering that their gadgets could bring joy or inspiration to certain individuals. It's always exciting to see different ideas and designs in the tech industry.
Can you use it as a phone to make calls?
Ellis is made for TV. Top-notch reporting.
Should have had a second mic for Ellis :D
I do not know if the Rabbit device will be the final device but the concept is right. Why wouldn’t you want an actual assistant rather than a myriad of apps that you first have to load to complete a task? Isn’t that the promise of one aspect of AI? If this can work as advertised, this is a big deal and could challenge the phone we see today for your pocket space, relegating screens for larger devices. I am 70 years old and the reaction from several of you sounded like it should have been coming from me not young techies.
It's always fun to see this podcast go derailed everytime 😂
Wow~ It's a little assistant in your pocket that reminds you of your schedule when it's time to go, reads your texts out loud, and even more amazingly, knows which method to use to get the best results based on the situation. And I believe it can even predict the next outcome of an action through its vision perception of the object. It's a really, really amazing device.
The problem with this product is that people will need to figure out its main use case. But this company is definitely onto something with the Large Action Model.
If you're wondering, the processor in the Rabbit R1 is the MediaTek Helio P35 (8x A53 cores)
Couldn’t stop laughing when Andrew asked Marques if he’s planning to buy the new Apple Vision Headset for every member of the studio. The look on Marques’ face was priceless!😂
Time stamp?
I love how you guys explain what CES is after the live videos. Had me dying! 😂😂😂😅
Awesome podcast. Loved Ellis at CES, and definitely so happy about Adam's news, congrats. Great podcast, enjoyed as usual. 💯✌️
9:55 Marques going fully country for just a second lolol
I just can't see the R1 being relevant for long. A nice idea, but there are so many existing mediums and hardware that can do the same thing. Only a matter of time before we see this in smart watches, home assistants, etc. It will be interesting when all this tech and VR/AR headsets mature together; maybe one day we'll simply have it all via a contact lens, earbuds, or smart glasses.
I’m an audio listener and I thought the CES clips were fake, but wanted to go back and watch after the fact. Hilarious that they were actually real!
Anything Teenage Engineering touches turns to gold. I'd say most people buy this because TE is involved. I hope they continue to be involved, because they design and think like no one else.
An Uber to Newark Airport really killed it this episode
Keep it up guys … thanks for such a great tech podcast 👌👌
Ellis at CES is gold; drop a full segment please
The Rabbit R1, sorry to say, would be best either bought out or turned into a phone 😅.
49:45 "We know google assistant is going to get better"
LMAO meanwhile they removed features from google assistant a couple of days ago
If the Rabbit were an app, you could just connect it to the Action Button on the iPhone and get the "no taps, just one click" experience.
These CES correspondent segments are COMEDY GOLD. Funniest thing I've seen so far in 2024. Anything that involves Nikola Tesla is based on strong science.
Humorous to me that David and Marques were talking about how Google Assistant is so close to being able to do all the things the Rabbit R1 can do... only for Google to lay off like 200 people from the Google Assistant team while stripping features from the service.
Former rabbit owner here. Rabbits make various sounds. They don't really have a main vocalization in the way cats meow or dogs bark, which is why they're considered quiet animals, but they do make sounds for different reasons.
When a rabbit is feeling threatened it will make a grunting or snorting sound.
When a rabbit is feeling comfortable and content it will make a purring-like sound by gnashing their teeth together. Essentially vibrating their teeth.
When a rabbit is in extreme duress, like thinking it will die, it will scream.
Baby bunnies will make squeaking sounds but that tends to go away pretty early.
I love the energy your team has. keep it up!
Just some feedback for Andrew, the upspeak is crazy man. He literally talks like this? Like? Every single thing is a question? I _wish_ I could unhear it? But once you notice it it's kind of hard not to? It'd be ok if it were here or there but it is virtually every single sentence. 10:13 - 10:35 but really it's all throughout. 12:30 -12:40 literally 4 examples per 10 seconds lmfao this could be a podcast game I think the editors should try and count every single time Andrew does this with a sentence?
Ellis was great and great podcast otherwise
David's laugh warms up my heart, love you David
34:07
I wish project LINDA took off because I remember back in 2012, Logitech had an app you could download that would turn your phone into a universal track pad. I remember using my iPhone 5 for that feature.
Here’s the obligatory “It survived the fall from the plane ‘coz it’s on airplane mode” joke. You’re welcome?
The time stamps are awesome! Thank you!
The rabbit is gonna be like the raspberry pi of ai assistants
somehow this has accidentally become one of my fav shows on youtube
For me, I don't even care that Google can replicate the functionality of the Rabbit, I'll still get the Rabbit. How often do we see cool gadgets like this?
There's no way Rabbit isn't acutely aware that Google can do exactly what they're doing. They want Google or someone else to buy their LAM. No one is ever going to buy the device. It's just a gimmick to get people interested in their software. Also at 1:05:27 David literally described a Turing machine haha. This is Alan Turing's future and it makes me so happy to see him getting the recognition he deserves.
I think the Rabbit commercial is a borderline fraud. I really think so.
Awesome finally we have real world Jarvis now
Speaking as a paratrooper, I have a buddy who dropped his iPhone from his pocket upon exiting the bird at about 1,100 to 1,200 ft AGL. I suspect terminal velocity for an object of that mass and shape is reached well before impact. We later found his phone perfectly fine on the drop zone.
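The paratrooper's point checks out with a back-of-the-envelope drag calculation. The mass, drag coefficient, and cross-sectional area below are assumed ballpark values for a tumbling phone, not measured ones.

```python
import math

# Terminal velocity: v_t = sqrt(2 m g / (rho * Cd * A)), where drag
# force equals weight. rho is sea-level air density in kg/m^3.

def terminal_velocity(mass_kg, cd, area_m2, rho=1.225, g=9.81):
    """Speed at which drag balances gravity for a falling object."""
    return math.sqrt(2 * mass_kg * g / (rho * cd * area_m2))

# Assumed values: ~200 g phone, Cd ~1.1 for a tumbling flat slab,
# average presented area ~0.011 m^2.
v = terminal_velocity(mass_kg=0.2, cd=1.1, area_m2=0.011)
print(f"~{v:.0f} m/s (~{v * 3.6:.0f} km/h)")
```

Under these assumptions a phone tops out somewhere around 15-20 m/s, far below the speed of a dense free-falling object, and that speed is reached within the first few hundred feet of the drop, which is why surviving a fall from altitude isn't as crazy as it sounds.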
I have a blind friend. She and I brainstormed a bunch of ideas that could be used with the Rabbit R1. This device destroys what smartphones are doing right now: read medication labels, expiration dates, help cross streets at intersections, interact with smart devices (Alexa is doing it already, but there's so much more). My elderly parents can't use a smartphone. They have a tough time using Messenger and can't video chat with family unless the young members of the family set it up. This would definitely simplify everything.
Those vibration plates are actually really helpful after whiplash. I was skeptical too, but it helped. I'm not sure the comparison to amethyst is fair, though. It really helps relax muscles, and short periods in 2-3 yoga poses feel like an hour in a session.
7:25 The fact that I actually got an ad from Mous saying the phone that fell out of the plane used a Mous case, and then in the same video you talk about this and mention someone should do something like that, is amazing and a little concerning.
😂😂I was ROFL throughout the video in reference to Dr Fuji at CES!! But also informed. Keep up the good work!
For me the R1 is exciting because of two things: no subscription fees for access to their model, and the teach feature. It is basically a model that learns interfaces. Can I teach it AutoCAD? GIS? Unity? Even better, can I use footage to teach it? And yes, it is "like macros", but far from it technologically. I also saw on their website that they want to have a store; they are doing that.
54:45 - It's not that the R1 is phone sized, it's that a phone is hand sized. The R1 is a 'hand'-held device, after all.
I regard the R1 as an electronic notebook. However, instead of referring back to my notes to be reminded of which tasks I need to complete, each 'note' takes care of those tasks for me too.
Very cool project, but here are some TIPS I think will be good:
1. After giving the Rabbit a complicated task like the trip booking, it may take time, so I suggest adding an option to initiate a new task so we don't get stuck waiting for that one task to finish, kinda like tabs in Chrome, with a progress bar that shows how much of each task is completed.
2. Another annoying problem I face is that image recognition scans everything. Like in the fridge demonstration, it scans all the elements in the fridge, which is great, but in reality there's a lot of weird stuff in normal people's fridges, like beer or chocolate. So I want an option to select a few items and get suggestions from just those, kinda like taking a photo of the fridge, circling only the eggs and butter, and telling the Rabbit to give a recipe from those ingredients.
3. It would be crazy helpful if it could detect a hand pointing at something and work out what it is pointing to, since humans point most at things they don't know. Imagine pointing at a car and asking the AI what model it is, or having it identify a fruit or plant you don't know the name of just by pointing at it.
4. Another suggestion is to show the actual process of the AI doing things. It would be way cooler to see it in action, opening apps and selecting options, rather than a loading screen. Like Jarvis.
5. Voice recognition, but not like a robot's. I can't say an entire trip plan in one go; voice recognition nowadays is mostly useless because humans need time to think and say things, and there are filler words. Can you imagine writing a script on paper just to read it out in one go without any fillers? Also, if you pause a bit too long between words, the AI takes that as the end and starts searching. So I suggest adding a recording option, so we can record what we need, take our time explaining, and then the AI can analyze the task in the recording. This would be super helpful.
I have more suggestions too, but these are the most crucial flaws I have noticed using various AI tools. ALL THE BEST FOR THE FUTURE
Full marks for everyone for keeping a straight face on that Doctor Fuji segment😂
The midgate bed is something that the Chevy Avalanche had. Great idea that I think more trucks should have.
I was expecting to find this mentioned a few times, I did a quick search (control+F) and this is the Only comment to mention the Avalanche.
I guess there isn’t much crossover between older truck/suv hybrids and tech podcast listeners haha.
Ellis made those CES reports hilarious 😂
“Sounds like their problem” is the highlight of today’s episode. ❤
6:40 The reason nobody got hurt was that the plane was only 6 minutes after takeoff and still climbing. Everyone was still wearing their seatbelt, and it was only at 13,000 ft at the time of failure, while cruising altitude is 30,000 ft.
Late Night with Conan O'Brien vibes from the 'Ellis Reports' segment 🤣😂
Ellis should post the whole thing coz damn, what else did he discover 🤣🤣 I'm locked in
Dr. Fuji was it? About that whole "universe is vibrating" bit. That was so great. I laughed so hard LMAOO
I like the point you made about trust with the R1. The device really appeals to me, but I simply can't believe it does stuff right.
The R1 seems very useful for the blind, elderly, and maybe children to an extent. This may not be for everyone, but it does have those who can benefit from it. We all age, so there will always be consumers for it.