Usability/UX wishlists . . .
If this topic has been discussed elsewhere on AI, I haven't been able to find it. Please redirect me as necessary ;-)
Wouldn't it be nice if applications like FaceTime displayed your face (and your video-call partner's face) as if you were actually looking at each other, directly face-to-face?
I'm thinking of an iPhone/iPad implementation specifically, so that when I make a video call I'm not looking up my conversation partner's nose or seeing a bloated fish-eye view of them.
It may be difficult to implement a camera system that "looks through the middle of the iPhone/iPad/Mac screen" at a reasonable cost. On the other hand, given today's camera and software technology, it shouldn't be impossible to place four cheap, low-res cameras at the corners of the screen (rather than one at the top/middle) and use software to synthesize a "corrected" view of the person, as if they were looking directly into a standard (non-fish-eye) camera lens.
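To make the four-corner-camera idea concrete, here's a toy Python sketch of the simplest possible version: bilinearly blending the four corner frames to approximate a virtual camera at the center of the screen. This is purely illustrative and makes a big simplifying assumption (the corner views are already rectified and aligned); a real system would need depth estimation and per-pixel reprojection to handle parallax, not just weighted averaging. The function name and setup are mine, not from any actual product.

```python
import numpy as np

def virtual_center_view(corners, u=0.5, v=0.5):
    """Blend four corner-camera frames into a virtual view at
    normalized screen position (u, v); (0.5, 0.5) = screen center.

    Toy model: assumes the frames are already rectified/aligned,
    so the blend is simple bilinear weighting. Real gaze correction
    would need depth-based view synthesis to account for parallax.
    """
    tl, tr, bl, br = corners  # top-left, top-right, bottom-left, bottom-right
    w_tl = (1 - u) * (1 - v)
    w_tr = u * (1 - v)
    w_bl = (1 - u) * v
    w_br = u * v
    return w_tl * tl + w_tr * tr + w_bl * bl + w_br * br

# Tiny synthetic demo: four 2x2 single-channel "frames"
frames = [np.full((2, 2), val, dtype=float) for val in (0.0, 1.0, 2.0, 3.0)]
center = virtual_center_view(frames)  # equal 0.25 weights at the center
print(center)  # every pixel is (0 + 1 + 2 + 3) / 4 = 1.5
```

The interesting (and hard) part in practice is exactly what this sketch skips: warping each corner view to a common virtual viewpoint before blending, which is where the real software cleverness would have to live.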
Thoughts?