Time for a new User Interface

Posted in Future Apple Hardware · edited January 2014
Like everybody else, I have seen MultiTouch on the iPhone and find it interesting. I have also read some of the threads on MT here and elsewhere. What I would like to see is a good discussion on not only this but other ideas thrown into the mix. Please, no pie-in-the-sky unrealistic crap. Here are my thoughts to seed the discussion.



First, I doubt MT would work well for typing out text. However, I really don't want to be typing out text on any kind of surface anyway. I would prefer to speak my text entry instead. I could see having the ability to call up a virtual keyboard if needed, but in theory speech-to-text is way better than typing.



So I envision a GUI using MT combined with voice recognition. I haven't seen anybody else bring up this scenario and I wonder what it would be like. Is voice recognition advanced to the point that it is usable? If it is, I think this is a scenario that would usher in a big change in computing.



Also, is it possible to include the iSight camera to add to the input? If so, what could it add? Lip reading, security, possible gestures, reading moods and emotions? What about yes or no head movements?



Also, what about a computer that talks to us instead of just popping up message boxes? This would be very intuitive and could help reduce screen clutter.



With all this in mind, here is what I find possible. A future computer uses a big 40- or 50-inch flat screen about arm's length away. The user wears a lightweight headset with a microphone. The headset has mini speakers right by the ears and uses Bluetooth for wireless operation. It has a small battery that runs all day. I plug it in at night; if I forget, I can have a spare battery ready. An iSight camera is built into the screen.



I walk up to the computer and it recognizes me. I say "login" and immediately I am logged in. If the computer can't identify me right away by sight and by voice, it asks me who I am, not in a popup window but in a voice (yeah, it could even use that HAL voice :-). I answer by speaking, not by typing. It runs voice identification and confirms my ID. If extra security is needed, I could always sign my signature using MT, type in a password, or even speak it.
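Just to make the idea concrete, here is a rough sketch of that identification fallback chain. Everything here is invented for illustration; the recognizer functions stand in for real biometric checks.

```python
# Hypothetical sketch of the tiered login described above.
# recognize_face / recognize_voice / ask_spoken_name stand in for real
# biometric and speech components; here they are just callables.

def login(recognize_face, recognize_voice, ask_spoken_name, extra_check=None):
    """Try passive identification first, then fall back to asking."""
    user = recognize_face()
    if user is None:
        # Couldn't identify by sight: ask aloud, then confirm by voice.
        claimed = ask_spoken_name()
        user = claimed if recognize_voice() == claimed else None
    if user is None:
        return None  # identification failed
    if extra_check is not None and not extra_check(user):
        return None  # signature / password / spoken passphrase failed
    return user
```

The point is the ordering: passive recognition first, a spoken challenge second, and an optional extra factor only when the situation demands it.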



After logging in, my computer greets me according to my preferences. It might tell me a daily joke, give me stock info, remind me of tasks to do, or speak a word of encouragement or inspiration. It could notify me of things of interest I have either flagged or that the computer knows I check frequently. If the computer were really good, it might even guess my temperament based on facial recognition and voice quality. It could tell when I am sad, happy, etc. and adjust its tone of voice accordingly. It is all up to one's preferences.



My iChat, email, calendar, and web apps opened immediately upon logging in, but they are in the background. They are always running and pop up at a moment's notice. In fact, they are part of the operating system. They put all those extra processors to work.



I ask the computer for a status report. Up pops a list of choices and the computer asks me what I would like to do. On the status report is a summary of new emails and messages, my daily schedule and to do list, news and pages, etc.



I then say "show me my mail." The computer shows me a more detailed list of my video messages, audio messages, and text messages. I then point to the ones I want to view/hear/see, or say the ones I want to act on, and the computer does it. I go through my mail and respond, delete, etc. as needed. When I need to type out a message, I just speak the text and then send the reply. The computer converts the speech to text and sends the message on. I can even add notes and keywords to some mail, and those get stored with the emails. The notes are not separate files but attached to the emails. Of course I can search these notes later if needed.
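The key detail here is that notes and keywords live with the message rather than in separate files, so searching them comes for free. A toy sketch of what that might look like (all names hypothetical):

```python
# Hypothetical sketch: annotations stored *with* each message rather than
# as separate files, so they can be searched later.

class Message:
    def __init__(self, subject, body):
        self.subject = subject
        self.body = body
        self.notes = []       # free-form annotations attached to the mail
        self.keywords = set()

    def annotate(self, note=None, keywords=()):
        if note:
            self.notes.append(note)
        self.keywords.update(keywords)

def search(mailbox, term):
    """Find messages whose attached notes or keywords mention `term`."""
    term = term.lower()
    return [m for m in mailbox
            if term in " ".join(m.notes).lower()
            or any(term in k.lower() for k in m.keywords)]
```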



When done with mail, I ask the computer to show me my schedule. I create appointments, to-dos, and alarms, all by voice. I just speak the time, the text, and any notes or special tasks. When done, I tell the computer to keep me informed and on schedule. It gives me reminders and such, however I have set up my preferences.



Next I look at my news and websites of interest. Navigation is easy with gestures: select from thumbnails, zoom in, close, or follow hyperlinks. I can quickly add notes and keywords to a page and tell the computer to archive it for future recall. According to my preferences, it can store either the link or the actual data too. That way I can recall pages from years ago if needed. I don't worry about broken links; I store what I want and it is saved. I can always override the default settings easily for a given page if needed.
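The link-versus-full-copy preference could be modeled as simply as this. A sketch with invented names, not any real API:

```python
# Hypothetical sketch of the archiving preference described above:
# store just the link, or the full page data too, with a per-page override.

def archive(store, url, data, default="full", override=None):
    """Save a page for future recall. 'link' keeps only the URL;
    'full' also keeps the data, so broken links don't matter."""
    mode = override or default
    store[url] = {"mode": mode, "data": data if mode == "full" else None}
    return mode
```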



I then take a few minutes to check out the latest music and video files. The computer tells me of media I might find interesting. My favorite artist is releasing a new single tomorrow and a movie I am interested in seeing is coming up for download in a week. I tell the computer to download them when available. When done downloading, the computer will automatically notify me.



That reminds me to check on my financial situation, so I ask the computer to pull up my checkbook ledger. It shows me what my balance is and what my cash flow looks like for the next week. I get a quick overview of my stocks, 401k, etc. I tell the computer to remind me to ask for a raise at the end of the month. This task gets added to my calendar.



I then take a quick look at my music library and select a tune to listen to from the on screen jukebox. I don't call up iTunes for this because this app is literally part of the operating system. The files track all the metadata and handle all the functions needed.



Same with iPhoto: it is part of the OS. The OS also does the same thing with video and DVD projects. Just as old OS X has documents, photos, videos, and music listed on the left of the Finder, my new Finder does too. But rather than view all my files with the Finder, I use an iPhoto- or iTunes-style interface. The interface fits the data type.



I also have a piles view for a quick overview of file and directory sizes. In fact, the old way of viewing files and folders in a finder is outdated. I have many special views depending on the data. This was a big improvement, since keeping track of all the photos, documents, music, and video files was getting to be a nightmare.



In fact, the use of applications has also kind of disappeared. Apple was finally able to fulfill the OpenDoc promise. People didn't realize that is why they developed all their apps. Most apps are now just part of the operating system. One doesn't open apps but instead focuses on the actual documents, data, and media files. When one opens the requested file, they don't even know an app is opening in the background. They just work with the file using heads-up displays and such specific to the file type.



One can always open legacy apps, but with the new GUI, the new apps work so much better. People don't miss MS Office and Adobe apps because most apps are old-fashioned. The newer computers use more memory and processing, but prices have come down. People wondered what the next killer app was going to be, but they were barking up the wrong tree. The killer app is the OS: an evolutionary combination of all the separate components. Apple was able to integrate all the pieces because they didn't have to worry about getting hauled into court for being a monopoly. Microsoft's hands were tied.



In fact, Microsoft was way ahead of its time. Remember Bob? He's back, in a way. And it works this time. But rather than a silly cartoon, it is a full 3D assistant. This has spawned a whole new industry. Who would have ever thought Playboy would be so into software? People get a new secretary each month, yet each new one has the memory of the old one. Some people even keep a few different ones around. But it is not just looks but personalities too. Each comes with unique parameters. Oh well, some things never change, do they?



Another cool thing about the future operating system is how it ties into broadband. Apple has a huge library online that a user's computer can access as if it were its own: for example, maps, atlases, dictionaries, encyclopedias, even most books you would find at a library or bookstore. In fact, libraries are doomed. Now authors just publish to Apple's library and people can buy a book for a couple dollars. When it first came out, who would have thought that iTunes would grow into the WalMart of the online world? One can buy music, movies, books, personalities, templates, stock photos, software, etc. all online.



And what is cool is that since Apple stores all this data, you do not have to. Your computer merely caches the latest files and keeps pointers to the related files. You do not worry about backing up this data because Apple does it for you. It sucks when you lose your internet connection, but that is rare since WiMax and other wireless broadband came out. In fact, laptops and handheld computers like the iPhone do not need to carry all this data because Apple knows what you own. So you don't have to sync your data much anymore. Just your private stuff.
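What is described here is essentially a local cache over a remote master copy: recently used files live on the machine, everything else is a pointer resolved over broadband. A minimal sketch, assuming a plain dictionary stands in for the online store:

```python
# Hypothetical sketch: the local machine caches recently used files and
# keeps pointers for the rest, fetching from the (assumed) online library
# on demand. Eviction only drops the local copy; the remote master survives.

class CachedLibrary:
    def __init__(self, remote, capacity=2):
        self.remote = remote        # dict standing in for the online store
        self.cache = {}             # local copies
        self.order = []             # least-recently-used first
        self.capacity = capacity

    def get(self, name):
        if name in self.cache:          # local hit: no network needed
            self.order.remove(name)
            self.order.append(name)
            return self.cache[name]
        data = self.remote[name]        # "pointer" resolved over broadband
        self.cache[name] = data
        self.order.append(name)
        if len(self.order) > self.capacity:
            evicted = self.order.pop(0)  # drop oldest local copy only
            del self.cache[evicted]
        return data
```

Because the master copy stays remote, losing a local copy loses nothing — which is exactly why the backup worry goes away.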



However, your home computer stores all that. When on the road, at the office, etc., you just access your home systems. Of course, now everybody has RAID storage systems for their data. Other places offer online backup for offsite backup and storage.



Apple also rents videos, books, and such. Apple even finally has a subscription model if you want. The amazing thing is that the subscription money actually goes to the actual authors and artists that people use. ASCAP and the other unions are now a thing of the past. In fact, most content creators get a check every month from Apple. When somebody buys a song, for example, the money automatically gets divided among the songwriter, the artist, and anybody else with a claim to a share of the pie. Music labels are a thing of the past. It is easy for anybody to distribute electronic media and find their target market.
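The automatic division of a sale among rights holders is just proportional arithmetic. A sketch in integer cents, with invented share weights:

```python
# Hypothetical sketch of the automatic split described above: each sale is
# divided among everyone with a claimed share, proportional to that share.

def split_payment(amount_cents, shares):
    """shares: {name: weight}. Returns whole-cent payouts; any remainder
    from rounding down goes to the largest shareholder."""
    total = sum(shares.values())
    payouts = {name: amount_cents * w // total for name, w in shares.items()}
    remainder = amount_cents - sum(payouts.values())
    if remainder:
        top = max(shares, key=shares.get)
        payouts[top] += remainder
    return payouts
```

For a 99-cent song with a 50/40/10 split, the songwriter, artist, and any other claimant each get their proportional cut, and rounding never loses a cent.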



Advertisers don't like this, of course. People are now used to watching television without commercials. Branded entertainment and product placement are now the methods of choice for marketing a product.



Well, that hopefully will get the wheels turning. I really think most of these ideas are not that far off in the future. I don't know for sure what Apple and others are doing but I try and connect the dots. I'm guessing we could see something like this in five years - maybe sooner.



Okay, give me some feedback. Is this scenario feasible and realistic? Is it good? What should be subtracted and what should be added?

Comments

  • Reply 1 of 5
    vinea · Posts: 5,585 · member
    This is too long for even me to comment on.



    So...I'll point you at work 20 years old:



    Apple's Knowledge navigator:



    http://www.digibarn.com/collections/...navigator.html



    And 15 years old:



    http://asktog.com/starfire/



    I will comment about voice recognition. At my first job as a summer intern, I was researching voice commands on an Apple ][. I managed to get it to recognize basic commands and such, but it was really buggy. Still, most folks thought usable voice recognition would be available within 5 years. Well, 25 years later, Dragon does work MUCH better than what we had back then, but it's not quite there yet and likely won't be for more than 5 years for general voice recognition.



    My office mate uses Dragon for dictating memos, and I gotta tell you, it's flipping annoying for anyone else in the room who's trying to think. Just imagine everyone talking on their cell phones near you while you're trying to work.



    If anyone is aware of similar forward looking UI designs (like Starfire and KN) made in the last 5 years or so I'd be very interested. Sun and Apple got out of that UI research business about...oh a decade ago AFAIK.



    Vinea
  • Reply 2 of 5
    wilco · Posts: 985 · member
    Quote:
    Originally Posted by visionary


    Like everybody else, I have seen MultiTouch on the iPhone and find it interesting. I also have read some of the threads on MT here and elsewhere. What I would like to see is a good discussion on not only this, but other ideas thrown into the mix. Please, no pie in the sky unrealistic crap. Here's my thoughts to seed the discussion.



    […]



    F.R.A.T.
  • Reply 3 of 5
    Wow, someone sure likes to typey type. I suppose you don't want to type on a small screen, considering yer post. I personally think that "voice messaging" is the future. Why type when you can record your voice for a simple message, send it via email or by phone to a mailbox, or as a simple memo, and have an app that can write out the message to text for import into any doc program? This could be used right now, but most people would ignore it because typing is more fun.
  • Reply 4 of 5
    Vinea, thanks for those links. I had never even heard of that research. I guess I should have guessed Microsoft's Bob was not the first. Kind of scary how similar the Apple video was to my vision. Or I guess I should say the other way around.



    It is really no secret what we all would like and need. I can see why so many people got out of the UI business: we didn't need research on what to do; what we needed to do was implement that vision.



    I think we have made progress in the last couple of decades. However, one needs the right amount of disk space, CPU power, and network bandwidth to hit the target. The question is whether the technology has caught up. I think it either has or is pretty close.



    I think the OpenDoc scenario is also an interesting possibility. Apple has low-end and high-end apps for many of the media functions. Apple could include the low end as part of the operating system, and people could buy the bigger upgrade package if they needed more pro-app power.



    The reason I bring this up is that Apple didn't release iLife '07 in January like they usually do. People think Apple is waiting due to features unique to Leopard. This makes sense. But what if Apple is including iLife as part of the OS itself? Just like Mail, iCal, and Safari are apps within the OS, why not iPhoto and iTunes?



    I find it redundant that I can look at my files in the Finder but also in the app designed for that media. Actually, Apple is disorganized with video. Some is in iTunes, some in iPhoto, and some only in the Finder. When I take videos with my digital photo camera, the video ends up in iPhoto. My downloads from the iTunes store end up in iTunes. However, other video files end up only in the Finder. Apple needs to expand iMovie not just to edit, but to manage all the video files on the computer.



    But even better would be to include these apps in the OS. The database management of music, videos, and photos becomes part of the Finder. Then the basic editing of music, photos, and videos becomes part of the OS.



    I have the same problem managing audio files, such as loops and samples. Anybody who has GarageBand, Soundtrack Pro, and Logic knows Apple is not consistent about where it stores its loops. It duplicates this data and takes up a lot of extra disk space. Apple needs to get better organized with this.



    iWeb also involves managing files, similar to what the Finder does. So too does iDVD. And QuickTime. The point is that all these files should not be managed in many different apps. They should all be managed by the Finder. Spotlight also works into this mix. Time for a paradigm shift.



    Nor should one have to start different apps based on the data type. Just as I can play video and audio files in the Finder, similar to what QuickTime does, why not merge everything together? This is what OpenDoc promised.



    This is what Apple was doing with Copland and their next-generation operating system that never came to be. However, the concept was valid and is desperately needed today. Apple just needed to break this huge leap into smaller steps. This is what they have done. I think it is now time to start integrating these pieces.



    I'm sick and tired of having to right-click on a file and choose the app with which to open it. I want to select the file and have the simple apps with editing functions already active in the background. If I want to edit, they jump to the foreground. If I have the pro apps installed on my machine, I can set my preferences to jump there instead, similar to what iPhoto does now: in iPhoto I can edit the photo in iPhoto or jump to an external app like Photoshop. So what I propose is the next logical evolution in operating systems.



    Again, I don't have any insider info on what Apple is doing. All I am doing is putting pieces together from the last ten or twenty years, particularly the last couple of years and seeing where things should go and where people like Apple wanted to go. I look at my needs and realize Apple probably sees exactly what I see. If we do good analysis, we don't need inside information to know what Apple is going to do. We can deduce it from what is past and current public knowledge.



    Some people might think these posts should be shorter. But there are people on these forums who have insight and knowledge. The links to the old Apple and Sun videos are proof of that. If you don't get this thread, then ignore it. However, if you have good intel, let's hear it.



    For example, 9/11 happened not because the US didn't have the necessary intel. They had enough data to figure everything out. What they needed to do was have the right hand know what the left hand knew. The US also knew the terrorists wanted to attack. They should have put 2 and 2 and 2 and 2 together and figured out 8. This is what these forums and sites can do.



    I would love for us to be able to accurately predict what Apple is going to do, not just by insider info - although that helps and we can never have enough - but by using the existing data in the public domain. In a way, we do what the CIA should have done before 9/11. We connect the dots and accurately predict what is going to happen.



    This empowers us to know when to buy and when to sell equipment. It also enables us to better understand what we should learn and where to spend our efforts. It sucks to develop an app only to have Apple release an OS with the functions built in.
  • Reply 5 of 5
    Here is (the old mockup of) my take on the iSight/gestures implementation


