I found the generated audio to be very, and I mean very, far left in what it stated. When I gave it simple text about agenda items for a city council meeting on affordable housing and a homeless day center, the audio went on and on about helping the homeless, lived experience, and so on, showing that Google's AI is still extremely biased. Other than that, it is amazing technology, but one that can be used for great evil.
Thank you for the comment. Yes, the models could be biased. So far I cannot find evidence of, for example, what training data was used to train any of these models on a specific topic. We need to be cautious and verify facts.