I was surprised by the number of comments on that site that tipped their hat to Apple for the look of Longhorn. But much like the HP/MS prototype computer looks like a Mac rip-off, just as Longhorn seems to borrow its feel from OSX, by the time Longhorn and that prototype machine are out, Apple will be light-years ahead of them again.
Do you really expect them to have everything worked out already? They'll get rid of the black corners before the 2005 release. Obviously there are going to be little problems like that... I mean, it's Alpha software. And it's MS on top of that!
Sure, I like the nice liquidy interface. I'll agree that things that give the interface "character" make it ultimately more pleasant, more approachable. Are we arguing that M$ doesn't meet Apple's aesthetic standards? Well, duh!
Comments
Then again, there are severe UI snafus in OSX... but I attribute that to its youth...
... or something...
In a way this is a good thing for Apple. Having the GPU take center stage might make Apple more immune to occasional CPU doldrums. Now they just need a way to put the GPU to work crunching more than just interface elements, mebbe encoding/decoding video and 3D (more involved than currently).
Originally posted by Matsu
And the practical applications of QE are? OH making the speed differential at least tolerable. hehehe...
The point is... who really cares about the practical applications if they're going to be so mundane? At least describe what practical applications the spinny rippling windows can be used for. Example: the genie effect shows you where the window goes when it's minimized... that's a practical application. The rippling window dragging in Longhorn does what?
Originally posted by 1337_5L4Xx0R
Microsoft reminds me of a newbie web designer, doing 'cool' stuff 'because they can', not because it adds functionality or makes sense in the context.
sad part is, they aren't new at all. ms has been around since about when apple started.
all that stuff does look pretty cool, but i wouldn't want my OS doing it. if i want to see something wavy and cool, i just drop some hallucinogens. an os that does that stuff is taking proc time and memory away from my programs and services. i don't even use apple's flourishes when i can avoid them; the genie effect was turned off as soon as i discovered i could. it's nice to see that microsoft is using some of its resources on making gui doodads, and not spending all of their money on global domination and conquest.
Exactly. If anyone can find a point to all this BS that's going to make an appearance in the next version of Windowze, then please, do enlighten me.
What utter manure --- only dumb consumers are going to buy this as a reason to upgrade. "Look, spinning windows as my screensaver!!"
Personally, I have had many a Windows fanatic drool over the OpenGL screensavers available for Mac OS X. They even went so far as to say that could be the one viable reason to upgrade -- OpenGL compatibility. So, in that light, I guess Windows people really do concern themselves with such mundane issues.
Oh, well. Let them game, go to idiotic websites, and blog to their hearts' desire. We'll produce the masterpieces.
(damn title length restrictions...)
Yes, that Longhorn stuff is all superfluous, but the only thing that really made QE "functional" was the massive CPU speed disadvantage in OSX that becomes painfully apparent without it -- its ability to compensate for a three-legged desktop CPU architecture desperately trying to run a modern OS.
One thing I'd like to see is double-sided windows. If Quartz Extreme is using the 3D capabilities of the card to display the UI, then it could be possible to have a window (more like a card) with a different view on each side of it. Clicking a tab could rotate said window on the Y axis and show completely different contents on the back side of it, more like flipping a page or notecard.
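The flip idea above is just a Y-axis rotation of the window's quad. A minimal sketch of the geometry in plain Python (illustrative only -- no real windowing API is assumed, and the names are made up):

```python
import math

def rotate_y(points, angle_deg):
    """Rotate 3D points around the Y axis.

    At 90 degrees a flat window quad is seen edge-on; past 90,
    its back side faces the viewer.
    """
    a = math.radians(angle_deg)
    cos_a, sin_a = math.cos(a), math.sin(a)
    return [(x * cos_a + z * sin_a, y, -x * sin_a + z * cos_a)
            for x, y, z in points]

# A unit "window" quad centred on the origin, lying in the XY plane.
quad = [(-1, -1, 0), (1, -1, 0), (1, 1, 0), (-1, 1, 0)]

halfway = rotate_y(quad, 90)   # edge-on: every x collapses to ~0
flipped = rotate_y(quad, 180)  # x is mirrored: back side faces the viewer
```

At 90 degrees the quad is edge-on, and past that the renderer would draw the "back" texture instead -- which is all a double-sided window really needs the card to do.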
Notice the black next to the rounded corners.
Originally posted by Luca Rescigno
Do you really expect them to have everything worked out already?
Did I say that?
Originally posted by JLL
Did I say that?
oops
Well anyway, they might get to the black corners, but it'll probably still look ugly and somewhat disjointed even when it's complete. MS just can't get everything to fit together the way Apple can.
Originally posted by Matsu
Eugene, you don't need oozing, melting windows just to show you where a window will dock, a simple outline frame will do.
Yes, but going by the original Mac HI goals, something that looks like a physical transformation of the window is a better idea. It extends the metaphor of GUI objects as real things. The zoom rect was a compromise in that regard.
Granted, "scale" and "suck" do much the same thing, so there's definitely some gee-whiz involved, but that's not such a bad thing. Small delights have their place in the user experience, even if they consume a few extra cycles for a tenth of a second here and there.
The "ripple effect" here, however, is purely gratuitous. My real windows wouldn't ripple unless there was a tornado parked outside my house. In fact, this reduces usability (if slightly), because it distorts the contents of the window while it's being moved.
Verdict: Cute, but dumb.
What interests me more about this whole movement to make the GPU earn its keep is that the CPU may become less important over the next few years. This is good for Apple, but not useless for M$ either, if they could deliver a baseline experience on even the cheapest CPUs, for example. We might say Apple needed QE a lot sooner than M$ did. But back to the interesting bit. Right now the GPU is helping out with the UI, DVD playback, and yadda yadda. But just as the 3D hardware sitting there on your graphics card was put to work for the benefit of a better 2D UI experience, could not some (platform-specific) implementations of whatever current API be utilized for the benefit of more meaningful work?
I once read someone on these boards describe a set-'em-up-and-knock-'em-down approach. Instead of the CPU doing the bulk of complex calculations, a larger chunk of the work could be prepared into bites the GPU is really good at working with. It is, after all, all math. So maybe rather than just paint what the CPU has directed, the GPU might do more "figuring," more "crunching," as it were. I dunno how the magic electrical bits would work; just thinking of it philosophically.
I think if QE is going anywhere more interesting for the legions of Photoshoppers, and video/3D pros, and mebbe even science, then that's where it might go. That would be useful in making overall performance less dependent on CPU performance, provided a huge CPU-memory-to-GPU-memory throughput, probably a mite more than current AGP buses provide? But again, iDunno, just thinking out loud.
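For what it's worth, the set-'em-up-and-knock-'em-down idea sketches easily in plain Python: the CPU side flattens irregular work into uniform batches, and a data-parallel "kernel" (standing in for the GPU here -- every name below is made up for illustration) hits each element the same way:

```python
# CPU side: irregular records are flattened into uniform batches,
# the shape a data-parallel processor is good at ("set 'em up").
def set_up(records, batch_size):
    flat = [value for record in records for value in record]
    return [flat[i:i + batch_size] for i in range(0, len(flat), batch_size)]

# "GPU" side: one simple operation applied to every element of a
# batch, with no per-element branching ("knock 'em down").
def knock_down(batch, kernel):
    return [kernel(x) for x in batch]

records = [[1, 2, 3], [4], [5, 6]]
batches = set_up(records, 4)
results = [knock_down(b, lambda x: x * x) for b in batches]
# results → [[1, 4, 9, 16], [25, 36]]
```

The point of the split is that the expensive part -- the kernel -- never sees the irregular structure, only flat arrays of numbers, which is exactly what graphics hardware is built to chew through.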
Originally posted by cowerd
Is it just me, or does MS really suck at demos?
Must be a really early alpha build, because there is no compositing at all. All objects are opaque. Oh well, they've got a couple of years to fix that.