Comments
Why the hell won't Apple just build a normal freaking mouse?
Why don't you just get one from the multitude of other mouse manufacturers? Apple has identified a niche, and it must be doing OK for them.
Too much clutter for a mouse; it sounds too Kensingtonish.
However, I could see this for the Magic Trackpad; it makes perfect sense.
The Mobile Mouse app works fine for me for now.
Totally... the Magic Mouse has too much going on already. I ditched mine as soon as the Magic Trackpad came out. You really need more space for multitouch than is possible on the back of a mouse.
It fits. Apple enjoys releasing mice that do nearly everything except the one thing they are supposed to do: fit your hand and make it easy to control a cursor on the screen.
One day physical input devices such as keyboards and mice will look and feel archaic.
Maybe for mice. But keyboards (in some form) will always exist. Language access is essential for any type of computing device. And I don't think voice recognition is going to be taking over any time soon.
Apple is going backwards in mouse usability. My three-year-old knows how to use a one-button mouse and a mouse with two physical buttons: I can tell him to click the left or right button. I can't tell him to click the right button when there are no buttons.
> Maybe for mice. But keyboards (in some form) will always exist. Language access is essential for any type of computing device. And I don't think voice recognition is going to be taking over any time soon.
Ah yes... my previous comment was phrased wrong. I don't think physical input devices will disappear but I think physical 'buttons' will. Some years from now the keyboard will be a 'touch' extension of the monitor, is what I meant. As for voice input... I have tried it but find it weird and difficult. It might be something one needs to become used to in order to find 'natural', but I believe it will be too disruptive in a communal working space to ever catch on. But who knows.
> ... I don't think voice recognition is going to be taking over any time soon.
Indeed.
Voice recognition is one of the longest-running boondoggles in tech. It's inherently impossible for a computer to "recognise" what you are saying, but we all pretend it's going to happen someday anyway.
The reason is that the computer would have to be conscious to perform that task, and, you guessed it, Artificial Intelligence, the idea that a computer will someday "think", is an even longer-running boondoggle.
Voice recognition will always be limited to the user shouting phrases and keywords at the computer and will never reach 100% accuracy. It's just not possible right now, and even if it becomes possible someday, that day is likely many decades away at the very least.
Is there a speed advantage to taking your eyes off the screen to look down at your touch-sensitive mouse, finding a virtual button, hitting it, then bringing your eyes back up to the screen? I mean, over simply moving your mouse to click an on-screen button.
Gestures I'm on board with, but this is a clear step backwards.
-Clive
> Ah yes... my previous comment was phrased wrong. I don't think physical input devices will disappear but I think physical 'buttons' will. Some years from now the keyboard will be a 'touch' extension of the monitor, is what I meant. As for voice input... I have tried it but find it weird and difficult. It might be something one needs to become used to in order to find 'natural', but I believe it will be too disruptive in a communal working space to ever catch on. But who knows.
This might be a good time to review how this works under the Star Trek model of computing. Yes, personal computers do exist, in people's homes, for example, but let's focus on how this works aboard the Enterprise-D. Are there mice? No. Are there keyboards with keys on them? No. Do they primarily use touch screens? No. Does everyone talk to the computer most of the time? No. (Unless they need to enable Self-Destruct, when it doesn't really matter that it's disruptive, because everyone is fighting for a spot in the escape pods anyway.)
The basic model is the virtual keyboard: a touch-sensitive control panel that adapts to its function, separate from the screen. It's difficult to say whether they have haptic feedback, but likely, since they often manipulate it without looking at it. This is where we are headed; the only question is how long it will take us to get there.
I think we are still several years away from this, and the thing holding it back is the development of haptic feedback interfaces. So we probably won't be there entirely in 5 years, but maybe in 10. In the meantime, we can enjoy our tricorders and tablets.
> This might be a good time to review how this works under the Star Trek model of computing. Yes, personal computers do exist, in people's homes, for example, but let's focus on how this works aboard the Enterprise-D. Are there mice? No. Are there keyboards with keys on them? No. Do they primarily use touch screens? No. Does everyone talk to the computer most of the time? No. (Unless they need to enable Self-Destruct, when it doesn't really matter that it's disruptive, because everyone is fighting for a spot in the escape pods anyway.)
> The basic model is the virtual keyboard: a touch-sensitive control panel that adapts to its function, separate from the screen. It's difficult to say whether they have haptic feedback, but likely, since they often manipulate it without looking at it. This is where we are headed; the only question is how long it will take us to get there.
> I think we are still several years away from this, and the thing holding it back is the development of haptic feedback interfaces. So we probably won't be there entirely in 5 years, but maybe in 10. In the meantime, we can enjoy our tricorders and tablets.
Ha ha... the 'Star Trek model'. Love it. To be fully 'there', I guess 10 years is in the ballpark, but I am not sure haptic feedback is the barrier. I think such feedback may be a little overrated and is more cultural than anything; the barrier is probably cost more than anything. But this is changing, and I can imagine a Magic Trackpad II with a software context-driven touch display surface as a fantastic input device. Then a full-size keyboard (no longer called a keyboard, of course). Being able to operate Photoshop, FCP, or similar using two hands on a dedicated touch surface, with gesture-invoked commands, will be truly revolutionary.
I prefer a wired mouse. The response and accuracy are much better than any wireless mouse I have used. I get inexplicable lost-connection messages periodically with the Apple Magic Mouse. I also find trying to invoke gestures really frustrating: if the mouse moves unexpectedly, the gesture doesn't work. I really don't need a mouse that takes two hands to operate.
> The basic model is the virtual keyboard: a touch-sensitive control panel that adapts to its function, separate from the screen. It's difficult to say whether they have haptic feedback, but likely, since they often manipulate it without looking at it. This is where we are headed; the only question is how long it will take us to get there.
> I think we are still several years away from this, and the thing holding it back is the development of haptic feedback interfaces. So we probably won't be there entirely in 5 years, but maybe in 10. In the meantime, we can enjoy our tricorders and tablets.
You know all those engineering drawings that they flash on the screen in those shows? Those won't be drawn with a virtual keyboard even in stardate 2050. The human hand will still need the equivalent of a mouse or other input device to draw. Controlling a vehicle or calling up database records, sure, but creating graphics will not be done on a virtual keyboard. I'm pretty sure of this, unless telekinesis becomes the new input method.
> This might be a good time to review how this works under the Star Trek model of computing. Yes, personal computers do exist, in people's homes, for example, but let's focus on how this works aboard the Enterprise-D. Are there mice? No. Are there keyboards with keys on them? No. Do they primarily use touch screens? No. Does everyone talk to the computer most of the time? No. (Unless they need to enable Self-Destruct, when it doesn't really matter that it's disruptive, because everyone is fighting for a spot in the escape pods anyway.)
> The basic model is the virtual keyboard: a touch-sensitive control panel that adapts to its function, separate from the screen. It's difficult to say whether they have haptic feedback, but likely, since they often manipulate it without looking at it. This is where we are headed; the only question is how long it will take us to get there.
> I think we are still several years away from this, and the thing holding it back is the development of haptic feedback interfaces. So we probably won't be there entirely in 5 years, but maybe in 10. In the meantime, we can enjoy our tricorders and tablets.
I like this. But what about those of us, like me, who are incredibly lazy? I want thought input and a visual cortex stimulator.
> You know all those engineering drawings that they flash on the screen in those shows? Those won't be drawn with a virtual keyboard even in stardate 2050. The human hand will still need the equivalent of a mouse or other input device to draw. Controlling a vehicle or calling up database records, sure, but creating graphics will not be done on a virtual keyboard. I'm pretty sure of this, unless telekinesis becomes the new input method.
Well, they never actually show us how those graphics are created. Presumably, graphics like the ship schematics were created elsewhere and loaded into the system. For ad hoc graphics that are displayed during meetings, for example, it's unclear what input method they use, although there's no reason the touch "control panels" can't double for this purpose, duplicating and extending what a Wacom tablet does today. Remember, this is the 24th century, so graphics software probably works much better with the human hand as a drawing instrument, straightening lines, rounding arcs, etc. However, they haven't completely abandoned "old media", such as paint and canvas, so they may still have "drawing pens" around. Even in the Star Trek model there are still some unknowns.
> I like this. But what about those of us, like me, who are incredibly lazy? I want thought input and a visual cortex stimulator.
That will come when you get your Google implant and become part of the collective.
> That will come when you get your Google implant and become part of the collective.
Oh great. Just what I need, ads cluttering my already cluttered brain.
Guess I better dial back my ideas a bit.
> Oh great. Just what I need, ads cluttering my already cluttered brain.
> Guess I better dial back my ideas a bit.
No, it'll be great, they will tell you what you want to do. There won't be any more mental clutter.
> No, it'll be great, they will tell you what you want to do. There won't be any more mental clutter.
Right, right. I was wondering how Google would monetize my brain, but of course they would skip that step and just control it outright. I think I'm up to speed now.
> No, it'll be great, they will tell you what you want to do. There won't be any more mental clutter.
Awesome. And while you do what you are supposed to be doing, you'll think you are sleeping, and you'll wake up all rested with all your chores done. By YOU! You'll be thanked and appreciated no end and told to go and play.