Zoom Sign Language Interpretation Mode Demo

  • Published Feb 6, 2025
  • See Zoom's new Sign Language mode in action. Watch how the meeting host sets it up, how viewers can activate the feature, and see some of the issues interpreters have to work around to partner effectively.
    For more information including a detailed how-to and deeper discussion of the issues raised in the video, visit: www.tealanguag...
    For a more accessible version with fewer zoom-ins that put the ASL interpreters out of view, watch: • Zoom Sign Language Int...
    Thank you to ASL interpreters Tailyn Kaster and Amber Tucker for their excellent work and insight.

COMMENTS • 29

  • @EDICaucus • 1 year ago +2

    Just wanted to say thank you for sharing this information. We had our first webinar yesterday and invested in having a British Sign Language interpreter, but it did not go well initially until we realised we hadn't started interpretation, and it got worse later when we looked at the recording and discovered that none of the BSL had been captured. We've learned the hard way that recordings will not capture the BSL without a workaround.
    We will run some experiments to do this better, as you suggest.
    In terms of the handoff, I hadn't realised what a challenge that could be. In our case yesterday, the interpreters were in the same room taking turns in front of the one camera. But I'm glad I'm now aware of the challenge when the interpreters are on two separate logins.

    • @MrEase123 • 1 year ago +1

      Glad this was useful. Feel free to check out the link in the description: there is an entire guide we wrote discussing some of these issues that may be of use! It may be slightly out of date in some respects. For example, I think some recordings do capture signed interpreters now. You may have to investigate that a little further, though.
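
The thread above describes the core recording gap: Zoom's native recording does not capture the interpreter's separate window. If the interpreter's video is captured separately (for example, a local recording of their feed), one possible post-production workaround is to composite it onto the main recording as a picture-in-picture overlay with ffmpeg. The sketch below assumes ffmpeg is installed and that both files start at the same moment; the file names and overlay placement are placeholders, and this is not necessarily the workaround described in the linked guide.

```python
# Sketch: overlay a separately captured interpreter recording onto the main
# Zoom recording as picture-in-picture. Assumes ffmpeg is on the PATH and that
# both files start at the same moment (trim them first if they do not).
# File names and the overlay size/position are placeholders, not anything
# produced by Zoom itself.
import subprocess

MAIN = "zoom_meeting_recording.mp4"      # Zoom's native recording (no interpreter window)
INTERPRETER = "interpreter_capture.mp4"  # separate capture of the interpreter's video
OUTPUT = "meeting_with_interpreter.mp4"

# Scale the interpreter feed to a third of its width and pin it to the
# bottom-right corner with a 20 px margin.
filter_graph = (
    "[1:v]scale=iw/3:-1[pip];"
    "[0:v][pip]overlay=main_w-overlay_w-20:main_h-overlay_h-20[out]"
)

subprocess.run(
    [
        "ffmpeg",
        "-i", MAIN,
        "-i", INTERPRETER,
        "-filter_complex", filter_graph,
        "-map", "[out]",      # composited video
        "-map", "0:a?",       # keep the main recording's audio track if present
        "-c:v", "libx264",
        "-c:a", "copy",
        OUTPUT,
    ],
    check=True,
)
```

The same filter graph works directly from the command line; the Python wrapper is only there to keep the command reproducible.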

  • @myoldblackdog • 2 years ago +2

    This is great! Every little feature update counts.

  • @4adinak • 2 years ago +7

    As an interpreter who has been in many Zoom calls, I can say there is no way the host will be interested in, or possibly even able to manage, turning "allow to speak" on and off every 20 minutes. Cool new feature, but it needs a few tweaks to be workable when working in a team. If working alone, great!

    • @aimeebenavides6086 • 2 years ago +2

      The host wouldn't need to do anything every 20 minutes - just do it once at the beginning of the meeting. It is not much different from current protocols where sign language interpreters are routinely added as co-hosts.

  • @mylesdb • 2 years ago +7

    A step in the right direction! We need to know more about how the interpreters are captured in Zoom recordings and live streams... It appears they will no longer be included in live streams with this feature, since the window is now activated and positioned by each user in the meeting. So this is a step backward, and the old method will need to be used for live streaming.

    • @MrEase123 • 2 years ago

      We have tested and verified that Zoom's native recording function does NOT capture Sign Language interpreters in their separate window. Our guide on the feature (link to free download in the description) mentions this and offers some workarounds. We haven't tested streaming yet, but I would be shocked if streaming captured the interpreters' window. In that case, you would have to stream via a program like OBS that allows you to share your screen. We've done this in the past to stream spoken language interpretation and captioning in multiple languages (also not captured natively within Zoom).

    • @CArch-nw8gw • 2 years ago

      @MrEase123 Enjoyed the video. The separate channel is not captured in the recording, but if you enable the "Ask to Talk" feature for the interpreter, their active camera is inserted into the Gallery View, which is recorded.
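
Earlier in this thread, streaming via OBS screen sharing is suggested as the way to get the interpreter window into a live stream. The sketch below shows the same idea at its most minimal: capture the desktop (with the Zoom meeting and the pulled-out sign language window arranged on it) and push it to an RTMP ingest URL. The ingest URL and stream key are placeholders, audio capture is omitted for brevity, and OBS remains the more practical tool for composing scenes.

```python
# Sketch: stream the whole desktop (with the Zoom meeting and the pulled-out
# sign language window arranged on screen) to an RTMP endpoint, mirroring the
# OBS screen-share approach described above. Assumes ffmpeg is installed.
# The ingest URL and stream key are placeholders; audio capture is omitted.
import platform
import subprocess

RTMP_URL = "rtmp://live.example.com/app/STREAM_KEY"  # placeholder ingest URL

if platform.system() == "Windows":
    capture = ["-f", "gdigrab", "-framerate", "30", "-i", "desktop"]
else:
    # Linux/X11 capture; macOS would use -f avfoundation instead.
    capture = ["-f", "x11grab", "-framerate", "30", "-i", ":0.0"]

subprocess.run(
    [
        "ffmpeg", *capture,
        "-c:v", "libx264", "-preset", "veryfast", "-pix_fmt", "yuv420p",
        "-g", "60",               # keyframe roughly every 2 seconds at 30 fps
        "-f", "flv", RTMP_URL,
    ],
    check=True,
)
```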

  • @jesscobar • 11 months ago

    Thank you for another fantastic and generous sharing of your hard-won knowledge, TEA Language Solutions colleagues! In my own limited experiments with this feature, I found that the interpreters never disappear from Gallery View unless a person has the option "Hide Non-video Participants" selected and the interpreters currently have their cameras off. You mention in this video and in the accompanying document that the only way to see an interpreter in Gallery View is for the host to either allow them to talk or make them a co-host. In my own experiments, that didn't seem to make any difference to their visibility or invisibility in Gallery View -- only hide or show non-video participants did. Am I missing something?

  • @madelinerios3582 • 2 years ago +4

    Thank you as always for keeping us up on technology. We oral language interpreters still often find it necessary to have WhatsApp, Skype, or the like to fully communicate with our partners. Are the sign language interpreters still doing something similar?

    • @MrEase123 • 2 years ago

      The way this feature is configured, sign language interpreters have to have a video backchannel to see each other on equal terms. (I discussed this at the end of the video: interpreter 2 can't watch interpreter 1 for a signal, because as soon as interpreter 1 turns on their camera, they automatically take over the "stage".)

    • @ASL4U • 2 years ago

      @MrEase123 Yes - ASL interpreters are usually in phone text, Zoom chat, and sometimes also on a phone voice call while working so they can coordinate... this new thing will be interesting - I'm not a fan.

  • @blueurpi • 2 years ago +1

    Thank you for creating this very interesting video. :)

  • @ambertucker8841 • 2 years ago

    Great to see they are thinking about accessibility, but I wonder if they consulted any Deaf or DeafBlind folks, interpreters, or Deaf interpreters about this. Since ASL is a different modality than spoken language, there isn't really a need for a separate channel for ASL interpreters the way there is for spoken language interpreters. I think there will be a benefit for folks who would like to have the interpreter in a separate window, which would reduce disruptions when there's screen sharing. There's a lot of setup required from all parties with the new feature, whereas currently making the interpreters co-hosts is all that's needed.

  • @watsonwuffer6939 • 2 years ago

    Yes, please add captions if possible!

  • @deborahcates9531 • 2 years ago +4

    As a sign language interpreter who works A LOT on Zoom, I can tell you that this feature is not very interpreter-friendly. I don’t think my Deaf colleagues are fans of it either. It is much easier to just make the interpreters co-hosts.

  • @jerilynhartley5991 • 2 years ago +1

    Hmm, it's good they are thinking about and implementing these things, but there are still tweaks to be made before this actually feels beneficial to users rather than just more clunkiness and hoops to jump through. I can imagine having to explain this each time we have a new Deaf or hearing client and host, and then all it does is create a pop-out, which might not even be helpful depending on the circumstances. Again, I'm grateful they are thinking of new ideas, but it seems like something I am disinclined to use. (Though maybe there could be some good application when using a CDI; that might be where its merit lies.)

  • @paullevenson2202 • 2 years ago +2

    You would also need feedback from the DeafBlind community (only those who can see with limited vision). My DB wife, for example, would demand a huge window so she could see a lot better, and she would also need a very large caption font. Please consider them as well. Thank you.

    • @aimeebenavides6086 • 2 years ago +2

      We found the feature to have pros and cons. One pro is that the window is resizable and is not affected when a presenter begins to share their screen. The other is that the interpreters appear in one window, so even when they trade off, there is no need to search for a separate window to find the new interpreter. The downside seems to fall more on the interpreters and how they manage taking turns. Additionally, there is a learning curve in selecting the option to see the pop-out window for sign language. We heartily concur that Zoom should listen to all feedback, both from sign language interpreters and especially from the Deaf community.

  • @sarahcansler2288 • 2 years ago

    Wondering about the pros/cons of using this feature vs giving the interpreters and the Deaf client multi-pin capabilities? Depending on the host to control anything regarding interpreters doesn't usually work well.

    • @MrEase123 • 2 years ago +1

      For a meeting with a lot of interpreters/deaf clients, it's easier for the host to turn on the sign language "channel" with two clicks instead of having to give 10 people multi-pin privileges one by one (though co-hosts can also do this). A big advantage of the sign language feature is that the window is very customizable, both in terms of size and location. A pinned window is more fixed in size and position, and it is especially problematic when content is being shared because that shrinks the video windows!
      So, I don't think there's a single universal answer to the question you pose. The best solution will really depend on variables like the number of participants and deaf clients and the meeting format (lecture or big discussion, shared content or just talking heads). Understanding how each option will play out for the host, interpreters, and deaf clients in that particular meeting is the key.
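
The trade-offs above are all managed in the Zoom client during the meeting. For hosts who schedule meetings programmatically, interpreters can in principle be assigned when the meeting is created via Zoom's REST API. The sketch below uses Zoom's documented POST /users/me/meetings endpoint, but the sign_language_interpretation settings block and its field names are an assumption modeled on the spoken-language language_interpretation setting, so verify them against the current API reference before relying on this.

```python
# Sketch: schedule a Zoom meeting with sign language interpreters assigned up
# front via the REST API. The endpoint and Bearer-token auth are Zoom's
# documented pattern, but the "sign_language_interpretation" settings block
# below is an ASSUMPTION modeled on the spoken-language
# "language_interpretation" setting; check the current API reference.
import requests

ACCESS_TOKEN = "YOUR_OAUTH_ACCESS_TOKEN"  # placeholder token

payload = {
    "topic": "All-hands with ASL interpretation",
    "type": 2,  # scheduled meeting
    "settings": {
        "sign_language_interpretation": {  # assumed field name
            "enable": True,
            "interpreters": [
                {"email": "interpreter1@example.com", "sign_language": "American"},
                {"email": "interpreter2@example.com", "sign_language": "American"},
            ],
        }
    },
}

response = requests.post(
    "https://api.zoom.us/v2/users/me/meetings",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()
print(response.json().get("join_url"))
```

Even with interpreters pre-assigned this way, the host would presumably still start sign language interpretation from the in-meeting controls, as shown in the video.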

  • @centrefordeafstudiestrinit4841 • 2 years ago

    Are there captions available for this lovely overview, by any chance? Thank you!

  • @melissafoster180 • 2 years ago +1

    I don't understand how the interpreters are able to feed/support each other when the "on" interpreter is onscreen and the team has their camera off.

    • @MrEase123 • 2 years ago

      There's always the option to have a video call going outside of Zoom (as some have already been doing). We did discover in a subsequent demo that the interpreters' version of the little window shown lets them toggle their view to see thumbnails of all interpreters whose cameras are on. This works while interpreter 1 is active: they can see their partner, who doesn't have to be visible in the main meeting. It doesn't work when interpreter 2 is on, though, because if interpreter 1 turns their camera on, they automatically bump interpreter 2 out of the window participants see.

  • @aleksandrrozentsvit2164 • 2 years ago +1

    Also, it would be nice to have a Deaf Interpreter feature.

    • @MrEase123 • 2 years ago

      A CDI could use the sign interpretation channel: the host can create a custom "Deaf Interpreter" channel that works the same as what you see here. The challenge is the relay from the ASL interpreter or team. Possibilities would be a separate video call, watching the ASL via the main meeting video, or watching the ASL in the "Sign interpreter" window (interpreters can toggle between seeing only the interpreter who is "on" and seeing all interpreters in the channel who have their cameras on). For the latter, the ASL interpreters and CDIs would have to be assigned to the same channel but managed so that the CDIs were the only ones who ever appeared to viewers in the window.

  • @properjob2311 • 2 years ago

    Surely this is all going to be done by AI in the future? No need for human interpretation.