My only concern with organic adaptation is teaching users that these "optimizations" exist and that they happen at all. I think most users wouldn't expect an interface to adapt to their usage, and I can imagine them getting frustrated when elements change size or form after they've gotten used to seeing the same interface every time. On the other hand, I can also imagine scenarios where the user is pleasantly surprised.
I think we'll really see all of these ideas come to life with the Oculus Rift. As a developer myself, my biggest issue is getting access to eye-tracking tools, since most of the ones available today are either really expensive or still in development (e.g. Google Glass). Once the Oculus Rift comes out, I think we'll see many new forms of eye-tracking data (heat maps, eye-movement traces, etc.). The interesting thing will be how pre-existing analytics software integrates with these new technologies.
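Just to make the heat-map idea concrete, here's a minimal sketch of how raw gaze samples could be binned into one. It assumes the tracker reports (x, y) screen coordinates as pixel pairs; the gaze_heatmap function and its parameters are my own invention for illustration, not any particular SDK's API.

```python
import numpy as np

def gaze_heatmap(gaze_points, screen_w=1920, screen_h=1080, bins=(48, 27)):
    """Bin raw (x, y) gaze samples into a 2D attention heat map.

    gaze_points: iterable of (x, y) pixel coordinates, e.g. as a
    hypothetical eye-tracker SDK might report them.
    """
    xs = [p[0] for p in gaze_points]
    ys = [p[1] for p in gaze_points]
    # 2D histogram: each cell counts how often the gaze landed there
    heatmap, _, _ = np.histogram2d(
        xs, ys, bins=bins, range=[[0, screen_w], [0, screen_h]]
    )
    # Normalize so the hottest cell is 1.0, convenient for rendering overlays
    return heatmap / heatmap.max() if heatmap.max() > 0 else heatmap

# Example with a few made-up samples clustered near screen center
demo = [(960, 540), (955, 548), (970, 530), (100, 100)]
hm = gaze_heatmap(demo)
```

An analytics package could aggregate maps like this per page or per UI element, which is presumably where the integration questions come in.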
Great talk. Thanks for sharing.
"umm"