Hehe, the last chance to try to predict anything before it all becomes official at WWDC…
In my previous post on the iScreen, I already described the most important features of the product from my point of view. What I missed was the actual user interface. I don’t think it will be Siri. In my opinion, anthropomorphic user agents are only accepted by users in situations where a human being would also be conceivable. For example, I can use a Siri-like UI to schedule an appointment, to get advice on the best investment strategy, or to find a potential source of foodporn pictures around me. I can’t see Siri being useful when I need to add three long numbers, to turn a car to the left while driving on a highway, or to switch channels or change the TV volume.
To make my guess, I tried to analyse what has already been done with the iPhone. Question: how was selecting a music title implemented before the iPhone? You had an item list, one item was shown as selected, and you had two buttons, up and down, to change the selection, and OK to apply. What has changed? On the iPhone, you can now use the second most natural way for a human being to select an item, one directly derived from our anatomy and from the evolutionary history of our kind: touching. Note that the most natural way would be grabbing, but the technology to implement that didn’t exist yet. Recent announcements hint that there is a lot of research going on in this area; we will hear about touch displays that change their surface quite soon.
Alas, this can’t be applied to TV sets, because they stand too far from where I sit. Therefore I can see only two possibilities. The first is to move the TV set to within two feet, so that it can still be touched. Imagine a couch for two or three people, a coffee table in front of it, and a veeery long (CinemaScope format or wider) TV set standing on it. This would still allow for some social watching experience (one of the most important factors differentiating TV sets from big iPads). The UI would have to be designed so that it works no matter where you place your first touch. For example, on the first touch a flower of possible actions appears at that spot, and you work from there. This might sound and look a little weird, but compare it with your experience in a cinema (especially if you sit in one of the first five rows), where you also have to move your head to take in the whole action space. It isn’t that much different.
Another possibility is to leave the TV set where it is – relatively far away – and to use the third most natural human way of interaction: pointing. This is different from Kinect in that with Kinect you see an avatar of your palm. That reduces the ergonomics of the interaction, first because the virtual palm reacts to your movements with quite a big latency, so you have to get used to it, and second, and most important, because you stop feeling your own hand as the thing interacting with an object on the screen. After you establish initial contact with the virtual palm, one could turn off the light in the room so that you wouldn’t know where your hand actually is, and with Kinect this wouldn’t stop you from manipulating the virtual palm, because virtually, in your mind, you have replaced your actual palm with the virtual one.
Pointing would work differently. Imagine a situation where you are presented with several running videos on the screen and want to select one of them to watch full screen. You point at it, just as you would point at it while telling your friend “look, there, isn’t that cool”, and the TV set recognizes the gesture and reacts to it.
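Geometrically, this kind of selection is not complicated. Here is a minimal sketch of the idea, assuming a Kinect-like depth camera that reports 3D joint positions in metres, with the screen lying in the plane z = 0; all function and parameter names are hypothetical. The pointing direction is taken as the ray from the eye through the fingertip, extended until it crosses the screen plane:

```python
# Minimal sketch: selecting an on-screen video tile by pointing.
# Assumption: a depth camera gives 3D joint positions (x, y, z) in metres,
# and the screen lies in the plane z = 0. Names here are illustrative only.

def pointing_hit(eye, fingertip):
    """Extend the eye->fingertip ray until it crosses the screen plane z = 0.

    Returns the (x, y) point on the screen, or None if the ray is
    parallel to the screen or points away from it.
    """
    ex, ey, ez = eye
    fx, fy, fz = fingertip
    dz = fz - ez
    if dz == 0:
        return None          # ray parallel to the screen
    t = -ez / dz             # ray parameter where z reaches 0
    if t <= 0:
        return None          # pointing away from the screen
    return (ex + t * (fx - ex), ey + t * (fy - ey))

def pick_tile(hit, tiles):
    """Return the index of the tile containing the hit point, if any.

    tiles: list of (x_min, y_min, x_max, y_max) screen rectangles.
    """
    if hit is None:
        return None
    x, y = hit
    for i, (x0, y0, x1, y1) in enumerate(tiles):
        if x0 <= x <= x1 and y0 <= y <= y1:
            return i
    return None

# Viewer sitting 3 m from the screen, pointing slightly left and up;
# the hit lands about 0.6 m left of centre, 1.56 m above the floor line.
tiles = [(-1, 0.5, 0, 1.5), (0, 0.5, 1, 1.5),
         (-1, 1.5, 0, 2.5), (0, 1.5, 1, 2.5)]
hit = pointing_hit(eye=(0.0, 1.2, 3.0), fingertip=(-0.1, 1.26, 2.5))
selected = pick_tile(hit, tiles)
```

In practice one would smooth the joint positions over a few frames and require a short dwell time before confirming the selection, since raw depth-camera data is far too jittery for a single-frame hit test.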
I have no idea why this isn’t implemented with Kinect yet. Perhaps there are some technological limitations, or I’m too optimistic about the accuracy human beings can actually achieve when pointing at something as distant as 10 feet…
Anyway, let’s wait and see what happens at WWDC. And I want to repeat myself: if Apple doesn’t announce an iScreen (or something similar) this year, it might be their first step into decline. Microsoft and Google are already running at full speed.