Air Canada Required To Pay After AI Chatbot Gives Bad Advice

  • Published Feb 17, 2024
  • Patreon: / runkleofthebailey
    Locals: runkleofthebailey.locals.com/
    BTC address: bc1qdqzpz6ny6w35qyl2rnasshjm60jvwcjgllwcay
    All comments are for information only. Do not take anything as legal advice--if you have a legal issue, contact a lawyer directly so that you can receive advice tailored to your situation. All views expressed are solely those of the creator.

COMMENTS • 291

  • @worsethanjoerogan8061
    @worsethanjoerogan8061 3 місяці тому +215

    I feel like it would have cost Air Canada much less to give this guy his discounted fare than to pay lawyers to fight it in court. Guy's Grandma just died, maybe cut him a break...

    • @TKing2724
      @TKing2724 3 місяці тому +31

      It was small claims and the lawyers never actually had to go to court. The bad publicity is what should motivate them more than the small amount it cost to have their lawyer on retainer submit a letter to the court.

    • @spantigre3190
      @spantigre3190 3 місяці тому +10

      They're not saving on this case, but would staffing a customer service dept be more expensive than this lawsuit?

    • @camberweller
      @camberweller 3 місяці тому +25

      "Corporate cultures" exist. AC has for many decades become accustomed to -- shall we say -- not being customer-centered and not being held accountable, to either individual customers or regulators or elected governments. That has to become habit.

    • @elenabob4953
      @elenabob4953 3 місяці тому +15

      The good thing is that they have created a precedent.

    • @Kalysta
      @Kalysta 3 місяці тому +14

      Air Canada also losing in the court of public opinion on this one

  • @TheMorrogoth
    @TheMorrogoth 3 місяці тому +187

    So they decided to fight this in court - instead of just giving the guy his discount?...
    Man... That greed was real!

    • @elenabob4953
      @elenabob4953 3 місяці тому +29

      Because they bet that the guy wouldn't push it further, but in doing so they created a precedent that can be used in future lawsuits when an AI gives the wrong information.

    • @etherealessence
      @etherealessence 3 місяці тому

      Stupidity really. They allowed a precedent to be set.

    • @Ms.Pronounced_Name
      @Ms.Pronounced_Name 3 місяці тому

      @@elenabob4953 No court of appeals, so I don't think there's any binding precedent

    • @TheMorrogoth
      @TheMorrogoth 3 місяці тому +18

      @@elenabob4953 I mean... His case was pretty cut and dry... It was small claims and he didn't need to pay a lawyer (which they did)...
      You don't shit where you eat... Businesses shouldn't try and gouge their customers when anyone can see where the wrong is (which wouldn't have made this a public ordeal).

    • @shinpai_uwu
      @shinpai_uwu 3 місяці тому +1

      Nowhere but Canada!

  • @dinaanand6388
    @dinaanand6388 3 місяці тому +102

    Having a technical background with AI, I can say the issue of marketing and hype running ahead of engineering is a serious problem. Making companies liable, I think, is the only answer

    • @linus1703
      @linus1703 3 місяці тому +2

      Absolutely, the technology has these issues right now but there is no incentive for them to solve it if they aren't held accountable

    • @jarvy251
      @jarvy251 3 місяці тому +7

      I think a lot of execs are seeing "AI? You mean like a nonperson?" and thinking they can have the best of both worlds: an employee they don't have to pay, and can shove accountability onto. Pretty happy to see this theory fall flat on its face immediately.

  • @camberweller
    @camberweller 3 місяці тому +112

    Air Canada's defence should have been "we have never, ever been held accountable for taking people's money and then screwing them before, so why now?" ;)

    • @josie4401
      @josie4401 3 місяці тому +25

      "but your honor, i scam people out of their money all the time. how was i supposed to know there would be consequences this time?"

    • @etherealessence
      @etherealessence 3 місяці тому +5

      Facts

  • @michaelblacktree
    @michaelblacktree 3 місяці тому +11

    Ironically, it seems the AI chatbot had more humanity than the actual people in Air Canada. 🙄

  • @TimPeterson
    @TimPeterson 3 місяці тому +44

    The policy the chatbot came up with is actually a better policy than the official one, though. Trying to jump through all their hoops while trying to book a flight while your family member is in the hospital about to pass away is less than ideal.

    • @osmia
      @osmia 3 місяці тому

      +

  • @AkiSan0
    @AkiSan0 3 місяці тому +80

    This would have been so much cheaper for Air Canada if they had ACTUALLY been compassionate and said "sure, here is your discount"

    • @AkiSan0
      @AkiSan0 3 місяці тому +14

      I think corpos should get yearly training on the Streisand effect and the internet. The damage they will get just from this becoming public is thousands of times worse than the 0.36 cents...

    • @annm9589
      @annm9589 3 місяці тому +1

      I was a college student flying back from Champaign, Illinois to Alaska when a Washington volcano erupted and messed up the flights. I had an aunt in Chicago, but relied on their info to fly beyond Chicago and was grounded in Portland for 3 days. I kept the receipts for every meal I ate as I had little money, and it was “act of god” so they were not even putting ppl up in hotels. I slept in the airport because I couldn’t pay for hotels. When I got home I wrote United explaining that my aunt could have picked me up in Chicago. They gave me an $800 coupon towards any next flight. This was in the 80s. I fly United to this day because of that customer service. It goes a long way. And don’t people repeat stories about a company's service when treated well? Of course they do.

  • @danielweston9188
    @danielweston9188 3 місяці тому +36

    In 1975 a college roommate had to fly from Portland to Seattle for his grandfather's sudden funeral. We all raised cash for him to go; it was maybe $200. The ticket agent (on the phone) told him to ask the stewardess about filling out forms for a discount once on board. He did. When he got to the gate in Seattle he was met by an agent with an envelope of cash, who asked him if he had a ride to his destination (he did). They had confirmed the death by the newspaper notice, with the information radioed ahead. That's how you do this - I was impressed (Alaska Airlines).

    • @annm9589
      @annm9589 3 місяці тому +2

      Alaska Airlines always prided itself on good customer service. I’m sure they have obtained and retained a lot of customers by small acts like this. Thanks for sharing.

  • @daleannharsh8295
    @daleannharsh8295 3 місяці тому +28

    Far too often, I believe, these companies rely on the fact that most people won't follow up. I'm glad this passenger did.

  • @RN1441
    @RN1441 3 місяці тому +56

    I think it's totally rational that a company is liable for offers and representations made by the chatbots they use in the same way they would be if it was an employee doing it.

    • @irinam31
      @irinam31 3 місяці тому +10

      "Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives." So if a customer service agent would have said it, they still wouldn't have accepted the liability.

    • @nnelg8139
      @nnelg8139 3 місяці тому +12

      I think they should be held to an even stricter standard, because chatbots are not independent agents which could simply choose to go rogue in the way of a human. The best analogy I could think of is if you trained a dog to carry meals to customers, and the dog decides it wants to eat that food itself, it's your responsibility, not the dog's. Except chatbots are much, much dumber than dogs, so it's more like training an ant colony.

    • @Friendstoyourend
      @Friendstoyourend 3 місяці тому +2

      @@nnelg8139 Actually, ants are VERY intelligent.
      It'd be more like training a non-jumping spider.

  • @EarnestWilliamsGeofferic
    @EarnestWilliamsGeofferic 3 місяці тому +15

    The idea that you can't be held liable for information provided by your agent ... that basically rewrites the law about what an agent is. Preposterous argument. Honestly, that should get the attorneys censured.

    • @ThatRedhedd
      @ThatRedhedd 3 місяці тому +6

      Yeah, I was like, "Since _when_ ?" 🧐

    • @annm9589
      @annm9589 3 місяці тому +4

      Maybe a bot wrote the argument

  • @chrisp1601
    @chrisp1601 3 місяці тому +16

    The AI Bot writing better, more compassionate policies than humans.

  • @johngordy8071
    @johngordy8071 3 місяці тому +21

    WHY is ANYBODY surprised that Air Canada would behave in that manner? After all, the Universe revolves around their narcissistic butt-cheeks, doncha know?

  • @rancorslayer5228
    @rancorslayer5228 3 місяці тому +9

    Air Canada would rather pay a lawyer thousands than give this guy the $900 he’s owed. Not surprising considering I’ve never heard anything good about Air Canada.

  • @curteaton
    @curteaton 3 місяці тому +21

    It's terrible that, had the advice been given by a person, AC wouldn't have owed him the cash. The telephone companies do it all the time. The 'contractors' promise packages that don't exist, then the company won't honour the deal.

  • @ros8986
    @ros8986 3 місяці тому +15

    $800 expense to Moffat - $millions in bad publicity

  • @InvadrFae
    @InvadrFae 3 місяці тому +37

    I mean if it is their AI chat bot, and it gives bad information...yeah it makes sense that they would have to pay

  • @oxylepy2
    @oxylepy2 3 місяці тому +17

    Holding companies 100% liable for AI is the main current way that it can be kept in check. In this case punitive damages should probably be applied to show the severity of the company's negligence, but that's a whole other kettle of fish.

  • @matthewsecord7641
    @matthewsecord7641 3 місяці тому +8

    It's kind of egregious that Air Canada were just such jerks over this.

  • @wrldvw1836
    @wrldvw1836 3 місяці тому +11

    This was beyond stupid on Air Canada’s part.

  • @txjchacha1163
    @txjchacha1163 3 місяці тому +9

    I used to work in customer care. If there's a misrepresentation like that and the impact is over a set dollar amount, we just make it right. Costs less than a lawsuit.

  • @wildandliving1925
    @wildandliving1925 3 місяці тому +5

    It would've been nothing if they just worked the price out and refunded him....

  • @jeanetteinthisorn4955
    @jeanetteinthisorn4955 3 місяці тому +12

    Air Canada has such a monstrous reputation, this did nothing to tarnish it. 😂

  • @human0id
    @human0id 3 місяці тому +5

    As someone who has had the opportunity to request a bereavement rate, the fare is not offered at cost or at a deep discount anymore.
    It may be at a slight discount, but typically the main advantage is the ability to reschedule last minute for no extra charge and possibly free checked baggage. It also does not apply to economy fares, meaning you can often find a cheaper fare.

    • @annm9589
      @annm9589 3 місяці тому

      Yes we found this to be the case. It wasn’t much cheaper. We even shopped several airlines. It was discounted and they waived change flight fees

  • @PleasantSkulman
    @PleasantSkulman 3 місяці тому +30

    I wonder if a chatbot suggested these 'remarkable' defences...

    • @AnastasiaRoseDamrau
      @AnastasiaRoseDamrau 3 місяці тому +3

      Funny story. Facebook is currently showing me ads for an AI-powered plug-in for Microsoft Word that allegedly helps write legal documents.

    • @michaelblacktree
      @michaelblacktree 3 місяці тому

      LOL

    • @annm9589
      @annm9589 3 місяці тому +1

      I think there already was a case where AI “invented” cases to cite in the brief

    • @AnastasiaRoseDamrau
      @AnastasiaRoseDamrau 3 місяці тому

      @@annm9589 There was! In New York state. He said his subscription to Westlaw had gotten messed up and he thought ChatGPT was a research tool.

    • @PleasantSkulman
      @PleasantSkulman 3 місяці тому +1

      @@annm9589 Yeah I remember that. That case was airline related as well.

  • @edp8592
    @edp8592 3 місяці тому +15

    Interesting case. One point, if possible, for clarification: AC claimed that it "cannot be held liable for information provided by one of its agents, servants, or representatives". Does this mean that if one of their 'agents, servants or representatives' makes an error in favour of the customer, that error can be rescinded at a later point in time and the customer is subsequently liable? If so, then how do we, as customers wanting information, trust the information that is given to us?

    • @ThatRedhedd
      @ThatRedhedd 3 місяці тому +4

      That policy isn't enforceable. A company is _always_ liable for what their agents or representatives do. They are acting _AS_ the company itself. It's laughable that the company would even say such a thing!

  • @spicyvetmedgeek
    @spicyvetmedgeek 3 місяці тому +23

    I'm not shocked given this is Air Canada.

    • @jamesgorman5241
      @jamesgorman5241 3 місяці тому +4

      I don't live in Canada, but there seem to be a lot of people who dislike them. Are they really that bad at dealing with customers?

    • @DoritoBot9000
      @DoritoBot9000 3 місяці тому

      @@jamesgorman5241 They have legendarily bad customer service

  • @stevenrobinson6544
    @stevenrobinson6544 3 місяці тому +19

    I hate Chatbots

    • @phantomkate6
      @phantomkate6 3 місяці тому

      @@JK-gq5rl I am very intelligent! 😂

  • @MurphysLaw996
    @MurphysLaw996 3 місяці тому +14

    It’s funny how the legal language differs from common usage. Usually the word remarkable is part of a compliment in common usage. But in legal language, remarkable is used to point out invalid, dishonest or full on stupid arguments.

    • @KaiHenningsen
      @KaiHenningsen 3 місяці тому +8

      "This is remarkably stupid" is perfectly common usage. I don't think it's a compliment.

    • @MurphysLaw996
      @MurphysLaw996 3 місяці тому +2

      @@KaiHenningsen Yes, "remarkable" followed by an insult is common usage. But if you just say "your comment is remarkable", in common usage someone would think it's a good comment, while in legal usage it means it's either invalid, dishonest or stupid.

    • @phantomkate6
      @phantomkate6 3 місяці тому +2

      Hmm. Sometimes this seems like the case with legal language, though I can't call any examples to mind right now.
      I did not think of "remarkable" as an exclusively positive adjective and don't use it that way, myself. I think I see how someone could make that assumption, though.
      I just thought it meant noteworthy or extraordinary (maybe that's another word in the same boat), in either a good or bad way!
      ETA: The Cambridge dictionary seems to support my description, if that makes any difference.

    • @briant7265
      @briant7265 3 місяці тому +5

      Remarkable: Worthy of remark. By itself, it is neither positive nor negative.
      Runkle's point is that judges seldom find sound legal arguments to be "remarkable". Thus the judge's comment is certainly a "nice" way of calling the argument stupid or laughable. Gotta maintain decorum.
      Like if you're building a house, and the inspector shows up and says, "I've never seen it done this way before." He's not admiring your workmanship.

    • @ThatRedhedd
      @ThatRedhedd 3 місяці тому +2

      I've seen a judge use "incredible" when referring to a litigant's claims. Pretty sure it was the most recent smackdown by Judge Hunter Carroll to Johns Dropskids in the Maya Kowalski matter. 😆

  • @NameGame935
    @NameGame935 3 місяці тому +6

    Thank goodness Mr. Moffatt had that screen shot of the bot convo!!!

  • @Arglefaster
    @Arglefaster 3 місяці тому +10

    At about 5:20 or so I see
    "Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives...". Really? So, an Air Canada agent tells you something and you can't hold Air Canada to it?

    • @jimmyzhao2673
      @jimmyzhao2673 3 місяці тому +4

      Air Canada lawyers ought to look up the legal doctrine of "respondeat superior"

    • @phantomkate6
      @phantomkate6 3 місяці тому +1

      Remember Future Shop? That was their MO with their sales associates, who would tell customers absolutely anything to get them to buy a product!

    • @ThatRedhedd
      @ThatRedhedd 3 місяці тому +2

      That policy isn't enforceable. A company is _always_ liable for what their agents or representatives do. They are acting _AS_ the company itself. It's laughable that the company would even say such a thing!

  • @spantigre3190
    @spantigre3190 3 місяці тому +22

    I wonder if Air Canada will turn around and try to sue the AI chatbot.

    • @KaiHenningsen
      @KaiHenningsen 3 місяці тому +3

      Well, a legal entity responsible for the chatbot. Sure, depending on what their contract says, they might be able to do that. Nothing fundamentally wrong there. As long as what looks like an agent to a reasonable customer is the responsibility of the airline to the customer. The customer does not - and should not have to - care how they structure their business. That's their responsibility.

    • @kreglamirand2637
      @kreglamirand2637 3 місяці тому +6

      *Chatbot offers $200 voucher to Air Canada*

    • @klaykid117
      @klaykid117 3 місяці тому +1

      It really depends on the specifics of the purchasing contract of the chatbot. If the AI company is smart then it would probably say something like "we are not responsible for the chatbot providing misinformation. You should clearly label at the start of a conversation that the chat bot is an AI and might give false information"

    • @streitrat
      @streitrat 3 місяці тому

      They only started putting that disclaimer in _after_ a *lawyer* relied on it!

  • @michaelhart7569
    @michaelhart7569 3 місяці тому +13

    You would think they might instruct the chatbot to advise customers to also check the written terms, just to make sure, and direct them there. I certainly couldn't trust a chatbot in these matters.

  • @rcmrcm3370
    @rcmrcm3370 3 місяці тому +13

    Those coupons expire real fast too.

  • @elenabob4953
    @elenabob4953 3 місяці тому +7

    In Europe we have a thing that says that the customer should be informed correctly and completely regarding the product they are buying so hopefully we won't have such situations.

    • @DoritoBot9000
      @DoritoBot9000 3 місяці тому +1

      Consumer rights are barely existent in Canada compared to the EU 😔

  • @l.baughman1445
    @l.baughman1445 3 місяці тому +7

    This one case may represent only a smidge of money. Look at the long game: what is the aggregate loss to the flying public - customers who have all been misled and lost money but did not sue? This unfair profiteering by Air Canada has now been called out, saving others from its chatbot's failures. Consumers, beware always. Take screenshots, save receipts and emails, and read as much of the fine print as you can manage.

    • @annm9589
      @annm9589 3 місяці тому

      Yes, if he hadn't taken a screenshot he would be SOL

  • @kingjames4886
    @kingjames4886 3 місяці тому +4

    every step of this is them just trying to make things as difficult as possible so people won't bother... cuz that's how these companies operate.

  • @SpencerHHO
    @SpencerHHO 3 місяці тому +3

    This reminds me of when an eating disorder charity service decided to use a "therapist AI" chat bot that almost immediately went on to suggest dieting hacks to anorexic users talking about their body image issues.

  • @lizbird9628
    @lizbird9628 3 місяці тому +1

    The stress of court ... further damages during a period of grief 😢

  • @benjaminshropshire2900
    @benjaminshropshire2900 3 місяці тому +2

    "the chat-bot is a separate legal entity" sounds like sov-AI... now *that* would be a novel theory!

  • @IsaardP
    @IsaardP 3 місяці тому +3

    So I'm reading that a company can be held responsible for what their website says, chatbot or not.
    If a chatbot tells you there is a deal where $20 gets you flights for life, that's the same as Air Canada advertising it on the front page of their website.
    Same as a grocery store putting the wrong prices up, it doesn't matter if it was a 'mistake' - that's what you're advertising and you have to honor it.

    • @phantomkate6
      @phantomkate6 3 місяці тому

      Interesting. I recently encountered a Canadian company's website that offered free shipping, only to be told by customer service that they didn't offer free shipping 'anymore.'
      It certainly is very annoying, not that people would have the time or resources to pursue the smaller-value instances of this.

  • @bufordhighwater9872
    @bufordhighwater9872 3 місяці тому +4

    You'd think that if someone were to act in bad faith and manipulate a chatbot in some way for their own financial benefit, that individual would open themselves up to criminal charges of fraud or something.
    Also, you have to wonder why Air Canada wouldn't just reimburse the guy immediately. Anyone can see that reimbursing him immediately would have been the company's best option. It would have saved them time, money, and labor. Not only would they have avoided negative publicity, but they could have used it to promote excellent customer service for marketing.

    • @georgeorwell3501
      @georgeorwell3501 3 місяці тому

      Because it’s a communist company. Government screws EVERYTHING they touch.

  • @adifferentkennybaker
    @adifferentkennybaker 3 місяці тому +5

    This is more a story of lawyer incompetence. I would hate to be the law firm in this one. But it doesn't do much for the rest of us, because the lawyers won't forget the contract showing we can't sue next time.

    • @LynxSnowCat
      @LynxSnowCat 3 місяці тому +3

      I kinda feel like Air Canada's representative wasn't enthusiastic about trying to 'win' this argument.

    • @annm9589
      @annm9589 3 місяці тому +1

      That was my feeling. They didn't even include the contract they referenced.

    • @electrakate
      @electrakate 3 місяці тому +1

      The CRT does not allow lawyers (except in specific circumstances) - it is meant for lay people and is managed fully online. It is only for claims below $5k and is a simplified process. I can't imagine any in-house lawyer (or paralegal) would support defending such an action to the point of an adjudicator's decision. The optics of this are terrible given the subject matter of death and the fact that their own chatbot gave incorrect information. It sounds to me like someone inexperienced in dealing with legal matters must have handled this.

  • @jackielinde7568
    @jackielinde7568 3 місяці тому +8

    QUESTION: Because this was a ruling, what does the judge's opinion on the role of a chatbot within a company's website mean going forward? Is this considered case law since it was only small claims court?

  • @claire2088
    @claire2088 3 місяці тому +3

    I hope that companies think twice about treating people like this in the future - it's such a trivial amount of money to them as a company but a significant amount for an individual, especially as it's an emergency flight; it's not like you can add 'grandma dying in April' as an item to budget for at the start of the year. The argument that they're not responsible for the info they provide is ridiculous (and like you said, it's a reasonable thing to believe a company might structure something this way; a stressed-out person booking a last-minute flight to get to a funeral would probably be relieved that they can get the flight booked and worry about all the admin after).

  • @dotarsojat7725
    @dotarsojat7725 3 місяці тому

    What was lacking on the part of Air Canada, was ACTUAL INTELLIGENCE.

  • @jimmyzhao2673
    @jimmyzhao2673 3 місяці тому +2

    The sheer brilliance of the idiotic defense strategy.

  • @michaeldeleted
    @michaeldeleted 3 місяці тому +1

    Air Canada's chatbot is a freeman on the land.

  • @cttommy73
    @cttommy73 2 місяці тому +1

    Considering every company is going the way of the AI chatbot, yeah, if your chatbot provides the wrong info the ruling should be in favor of the customer.

  • @phred196
    @phred196 3 місяці тому +3

    In The Terminator didn't the apocalypse start when they tried to shut down a chatbot? Could this mean that Air Canada is responsible for the extermination of mankind? And if that is the case, what would the appropriate damages be for a settlement? Mankind v Skynet?

  • @gbalfour9618
    @gbalfour9618 3 місяці тому +2

    What the hell, as a company I’d be like ‘I am so sorry for it all sure let me see what I can do’ then just do it. Take down the chat bot and move on with my life. How did this even make it to court?

    • @TheLabRatCometh
      @TheLabRatCometh 3 місяці тому +1

      Telling some random grieving person to pound sand costs the company's representative nothing; taking down the chatbot is you publicly announcing that several fairly senior IT and PR managers did something stupid! Which path would you like to go down, and do you still want to be employed tomorrow?

  • @carlknibbs2849
    @carlknibbs2849 3 місяці тому +1

    They really need a massive fine, since they argued a stupid case in the hope that the guy would just give up.

  • @user-jm8ho2hy8g
    @user-jm8ho2hy8g 3 місяці тому +3

    I am really glad he sued because of the chatbot

  • @Snowdog070
    @Snowdog070 3 місяці тому +2

    Good to see a common person prevail over "The Man".

  • @brianwaayenberg3099
    @brianwaayenberg3099 3 місяці тому +3

    I mean. Why not make it 90 days to submit proof?
    I’m in an emergency and need to get moving MEOW!
    Wait while I fill out this Air Canada paperwork…..
    Come on

  • @tyrannosaurusimperator
    @tyrannosaurusimperator 3 місяці тому

    Air Canada has a better bereavement policy than most US airlines. When my grandfather died, the best policy we found was "no rebooking fees, but you're still getting bent over for booking last minute"

  • @ergosum5260
    @ergosum5260 3 місяці тому +2

    -and they want to be compassionate-
    ...and they want to _advertise_ compassion

  • @briancox2721
    @briancox2721 3 місяці тому +2

    AI is going to make agency law wild in the very near future.

  • @redslate
    @redslate 3 місяці тому

    Props to the Canadian Judicial System. This is the precedent we need to keep companies in check.
    Far too many businesses are rushing to roll out untested, infantile "AI" without recognizing the risks/consequences.

  • @Kalysta
    @Kalysta 3 місяці тому +1

    A chatbot is not a person! *Skynet has entered the chat*

  • @debasishraychawdhuri
    @debasishraychawdhuri 3 місяці тому +3

    "AI will replace all jobs" 😅 and make everything shittier.

    • @jimmyzhao2673
      @jimmyzhao2673 3 місяці тому +1

      Things will get so bad that people will beg for someone to save them, that's when Klaus Schwab & the WEF will swoop in to 'Build Back Better'

  • @bartsanders1553
    @bartsanders1553 3 місяці тому +2

    Tom Servo's Canada Song intensifies.

  • @Spartanixxx
    @Spartanixxx 3 місяці тому

    I'm shocked that Air Canada would do this; they totally don't have a history of anti-consumerism and screwing over anyone who uses their company.

  • @terrylazurko2476
    @terrylazurko2476 3 місяці тому +1

    How much did they pay the lawyer to argue this? So much more than the small amount that they paid him.
    This is corporate greed.

  • @molybdomancer195
    @molybdomancer195 3 місяці тому +2

    I thought companies are responsible for what employees or other “agents” say when acting on behalf of the company.

    • @ThatRedhedd
      @ThatRedhedd 3 місяці тому

      Their policy isn't enforceable. A company is _always_ liable for what their agents or representatives do. They are acting _AS_ the company itself. It's laughable that the company would even say such a thing!

    • @TheLabRatCometh
      @TheLabRatCometh 3 місяці тому

      Air Canada has an exception :)

  • @RichardDanielli
    @RichardDanielli 3 місяці тому

    Hammers can build walls and destroy them, it is nice to see a court that understands that the hammer is not at fault...

  • @sharihazlett3774
    @sharihazlett3774 3 місяці тому

    If the chatbot is giving this advice, Air Canada should give him the discount. It sure would have cost less just to give him the money. Considering the bad press, they should have given him the money.

  • @krab1791
    @krab1791 3 місяці тому +1

    At least in the USA, bereavement rates are rarely low. I have had to book flights for funerals several times and have never found a bereavement rate lower than a standard rate I can find by searching several airlines.
    In addition, bereavement rates generally come with a ton of restrictions.
    The offer of a coupon is ridiculous. After the treatment they gave him, it's unlikely he would choose to fly Air Canada in the near future. Any coupon would expire before he decided he wanted to fly Air Canada again. In addition to which, the coupon just generates more business for Air Canada.
    You have to ask yourself how many times this has happened for them to decide to spend the money on the lawsuit. Or Air Canada has decided not to go through the expense of fixing the chatbot and is trying to head off any future issues - this guy took a coupon, so you need to as well.

  • @justincase4812
    @justincase4812 3 місяці тому

    Humans will always be more deplorable than AI.

  • @nomanejane5766
    @nomanejane5766 3 місяці тому

    I can't believe Air Canada got a fancy lawyer and they still dropped the ball like this... amateur hour

  • @lauraweiss7875
    @lauraweiss7875 3 місяці тому +1

    I’m a long-time IP paralegal, and I’ve seen everyone get excited that AI was going to do every job for every company. I’ve been continually saying that litigation based on liability will put the brakes on the AI takeover. Not unlike self-driving cars, the buck stops where the liability lands.

  • @fayej6591
    @fayej6591 3 місяці тому

    Yay! I’m so glad you covered this!

  • @Ieezeca
    @Ieezeca 3 місяці тому +2

    Runkle, I'm always happy after your streams, then your outro music makes me melancholy to the core!!

    • @ThatRedhedd
      @ThatRedhedd 3 місяці тому +1

      I freaking _hate_ that outro music!

  • @franklinfamulski8638
    @franklinfamulski8638 3 місяці тому +1

    I think it speaks more to the unfairness of the policy, and that whether it's a person or a chatbot, either could be mistaken or confused by its lack of common sense.

  • @LyonByTheSea
    @LyonByTheSea 3 місяці тому

    It's amazing that companies that have a policy they could simply make good on would do this petty crap. Oh well, stupidity never takes a vacation. Thanks for the video.❤

  • @bostonbruinsfanboy
    @bostonbruinsfanboy 3 місяці тому +1

    Oh god, Air Canada 🤮

  • @juliemunoz2762
    @juliemunoz2762 3 місяці тому

    We consider corporations as people for legal purposes, so why wouldn't we consider AI the same? I could see this happening in the future.

  • @mdl8767
    @mdl8767 3 місяці тому +2

    Am I wrong? I thought a corporation is in fact liable for the representations of their agents, representatives, or servants.

    • @ThatRedhedd
      @ThatRedhedd 3 місяці тому

      Their policy isn't enforceable. A company is _always_ liable for what their agents or representatives do. They are acting _AS_ the company itself. It's laughable that the company would even say such a thing!

  • @xenialafleur
    @xenialafleur 3 місяці тому

    Go to any chatbot and ask it a question that has no answer. It will invariably make something up.

  • @barbarabunker2791
    @barbarabunker2791 3 місяці тому

    I was happy to see you addressing this case. I too would have sent it to you otherwise. Before this, I have never thought of those chat offers on websites as a “bot,” and it hadn’t registered to me as AI. I’m slowly (at 75) starting to “get it”! As always, wonderful breakdown, Ian! Barbara in Colorado

  • @jackiestowe6987
    @jackiestowe6987 3 місяці тому

    I must have been talking to a chatbot at the insurance company last month. They didn't want to pay for a particular medication. I had been on the exact medication for years. They wanted a letter from my doctor. I had to pay for it out of my pocket. The insurance company tried to get me to use a discount card. If I hadn't asked about being paid back, the insurance company wouldn't have reimbursed me had I used that discount card. Liars and scammers. I got reimbursed for all of it! But they were not happy that I had outsmarted them!

  • @tomhalla426
    @tomhalla426 3 місяці тому +1

    Eugene Volokh, of the Volokh Conspiracy website, has something of an ongoing theme of AI hallucinating convincing legal references that are as libelous as they are imaginary.

    • @marlbboro8091
      @marlbboro8091 3 місяці тому

      Remember the lawyer in NY who relied on ChatGPT, which hallucinated and made up cases? 🤦🏽‍♀️

    • @tomhalla426
      @tomhalla426 3 місяці тому

      @@marlbboro8091 Volokh regularly reports on lawyers who were sanctioned for using ChatGPT or other AIs.

  • @LibraInSeattle
    @LibraInSeattle 3 місяці тому

    I found this story really interesting and I love that the gentleman won this case against a big company like Air Canada. Chat bots are rarely if ever helpful. I find them to be just as frustrating as IVR systems. Just let me speak with a human being. Bots just provide links to the company website and you usually end up needing to contact a customer service representative or tech support eventually to get your issue resolved. I’m thinking of my experiences with Comcast, Geico and Verizon chatbots.
    I work in the medical insurance industry for a smaller locally owned, not for profit company in the medical billing department and we recently implemented a chat system for our customer service department but it’s run by our customer service team and the chat is answered by actual people. What a novel concept. If a company the size of my employer (around 3,250 employees) can seamlessly integrate a live chat system, why can’t a larger business? Maybe it’s the number of chats that they get? There has to be a better way than these useless bots that end up telling you to contact customer support in the end. At least that’s been my experience with chat bots.

  • @gblargg
    @gblargg 3 місяці тому

    They're just learning that AI bots make things up and present them confidently as correct.

  • @momof1576
    @momof1576 3 місяці тому

    More people need to start suing airlines and cruise lines for breach of fiduciary duty. This isn’t the first time they’ve pulled this you can be sure of that.

  • @TownGirl04
    @TownGirl04 3 місяці тому

    Glad they paid!
    Even sadder is that I wouldn't recognize a chatbot. I would think it's a real person talking to me.

  • @user-gh4lv2ub2j
    @user-gh4lv2ub2j 3 місяці тому +1

    I can trigger chatgpt to give medical advice.

  • @ChristopherDinh
    @ChristopherDinh 3 місяці тому

    Sounds like the law firm was sold a chatbot from the same company responsible for the airline's customer chatbot.

  • @FionaC1
    @FionaC1 3 місяці тому +1

    “Sophisticated litigant” = they get sued a lot? 😂

    • @electrakate
      @electrakate 3 місяці тому

      every large company gets sued a lot.

  • @Oblithian
    @Oblithian 3 місяці тому

    Chatbots were a bad idea 10 years ago; the fact that all these companies are trying to use them is going to bite them in the end.

  • @robertnessful
    @robertnessful 3 місяці тому +1

    Part of this is bad AI design. If the full web site has a bereavement travel page, the chatbot's database should have that loaded as the first thing it accessed when addressing a customer question about bereavement travel.
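
    A rough sketch of that idea in Python. Everything here (topic names, policy text, function names) is hypothetical and for illustration only, not Air Canada's actual system: the bot answers only by quoting the published policy, rather than letting a model improvise.

    ```python
    # Hypothetical sketch: keep the bot's answers pinned to published policy text.

    POLICY_PAGES = {
        # In a real deployment these entries would be loaded from the airline's
        # published policy pages, so the bot can only repeat what the site says.
        "bereavement": (
            "Bereavement fares must be requested before travel; "
            "refunds cannot be claimed retroactively after the flight."
        ),
        "baggage": "Checked baggage allowance depends on the fare purchased.",
    }

    def answer(question: str) -> str:
        """Answer from the published policy text verbatim, or decline."""
        q = question.lower()
        for topic, text in POLICY_PAGES.items():
            if topic in q:
                return (f"Per the published {topic} policy: {text} "
                        f"Please check the {topic} travel page for the full terms.")
        # Declining is safer than letting a language model improvise an answer.
        return "I don't have a published policy on that; please contact an agent."

    print(answer("Can I apply for a bereavement fare refund after my trip?"))
    ```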

  • @matityahubermanfalk3127
    @matityahubermanfalk3127 3 місяці тому +1

    Having the chatbot provide links to the relevant page on the website explicitly (rather than a summary where you might miss the link's existence) would be almost as useful but not have any of the risks of using an AI that might misunderstand things.
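
    A minimal sketch of that alternative, again with made-up topic names and placeholder URLs (not Air Canada's real help pages): the bot replies with explicit links rather than a generated summary.

    ```python
    # Hypothetical sketch: reply with links to the relevant pages, not a summary.

    HELP_LINKS = {
        "bereavement": "https://example.com/travel-info/bereavement-fares",
        "refund": "https://example.com/travel-info/refunds",
        "baggage": "https://example.com/travel-info/baggage",
    }

    def link_reply(question: str) -> str:
        """Return explicit links to matching help pages, or hand off to an agent."""
        q = question.lower()
        hits = [url for topic, url in HELP_LINKS.items() if topic in q]
        if hits:
            # Each link is on its own line so the customer cannot miss it.
            return "The full terms are on these pages:\n" + "\n".join(hits)
        return "I couldn't match that to a help page; please contact an agent."

    print(link_reply("How do bereavement fares and refunds work?"))
    ```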

  • @josoffat7649
    @josoffat7649 3 місяці тому

    Air Canada: Chatbot lives matter 🤦‍♂️

  • @vawlkus
    @vawlkus 3 місяці тому +2

    *sigh*
    Air Canada 😑

  • @nightowl8548
    @nightowl8548 3 місяці тому

    Sounds like Air Canada used the same bot to provide their defense 😂😂😂

  • @rreiter
    @rreiter 3 місяці тому +1

    The argument was as silly as a factory claiming its equipment is responsible for its own actions. Another takeaway is to make sure you always keep hardcopy or imagery as backup in case you need it. I've seen many "courtroom videos" in which people claim they spoke to someone or paid something, but can't recall who or when and kept no proof whatsoever.

    • @phantomkate6
      @phantomkate6 3 місяці тому

      Is it just me, or is this kind of problem becoming more commonplace? I'm starting to feel like I need to start an evidence folder the moment I become anyone's customer.

  • @whispersinthedark88
    @whispersinthedark88 3 місяці тому

    Maybe they had the chat-bot defend itself in court 😂

  • @amechealle5918
    @amechealle5918 3 місяці тому +1

    Always screenshot! Especially if there is any kind of money involved. Always!

  • @2A.Freedom
    @2A.Freedom 3 місяці тому

    Never even heard of a bereavement flight lol

  • @sillysad3198
    @sillysad3198 3 місяці тому

    without corporal punishments for CEOs it is a nothingburger