Rip off or... convergence?
I just picked this up from an ongoing MacNN discussion. Many would think that this is a blatant and shameless rip-off of some Mac OS X original ideas and implementations. Others would say it is an inevitable convergence in desktop UI concepts between operating systems, now that the hardware has enough power to support such technologies. What do you AI fellows think?
Comments
While you can think of the same thing, it's very odd to implement it in a fashion that makes very little sense except as a copy of a previous idea.
It's like how "Jill of all trades" is a dead ringer for "Jack of all trades", because you could have stuck any name in there, in any combination of the English language, to refer to a "woman who is knowledgeable in many fields".
Most recently, the Zune vs. iPod is one more example: the Zune has a click wheel that looks like the iPod's, except it's not a click wheel, it's buttons arranged in a circle. Why couldn't they do a square, or an octagon, or a "Z"?
In XGL, you see some obvious references that serve quite little purpose except "I saw it on something, so I'll use it":
1. Cube effect
2. Exposé (even down to how it's highlighted and moves)
3. Dock, down to the bounce
I've never been an opponent of copycats, because freedom of ideas has always been a core process of improvement, and this is after all open source, so no money is being made even if there is a party that's hurt by this. As long as the ego doesn't get puffed up, it's just more kudos to Apple in the history books.
On the technology side, it's probably very different, as even ideas mean nothing without lots of man-hours of coding.
And the fundamental question? No question: Apple was criticized for bringing such powerful technology to a desktop GUI for no purpose (so people thought at the time). In the Aqua intro days, people were like "what a blatant waste of system resources". Apple proved them wrong (just like terminal vs. GUI). Now everyone has realized what you can do with a "blatant waste of system resources".
Back in the day, a lot of people said "The earth is round." It took a guy willing to stick his neck out and say it to get into the history books.
Linux especially has a great lack of innovation. Most of the programs are just free versions of commercial software, made so that people don't have to pay up: OpenOffice is a rip-off of Office, GIMP of Photoshop, Inkscape of Illustrator, etc.
I'm not saying that the idea of open source software is bad but it's quite clear to me why Linux has very little commercial backing.
I also find that when rip-off software is implemented in Linux, it is done badly. You can see how clumsy that Exposé rip-off looks, and most of the UI capabilities are unnecessary.
I honestly don't believe that the operating systems will ever converge. Linux has too much of this 'everything is free' attitude, and commercial businesses are turned off by that because they need to make money.
Windows has too many proprietary APIs, and Microsoft knows that's how they can maintain a stranglehold on the market, so that won't change. They are also long-time haters of Unix.
OS X is tied to the hardware it runs on.
The only way we will get a unified OS is by having a system that runs on all hardware, has commercial backing (mainly through standardization, not so much philosophy), and is Unix-based. OS X IMO comes closest, but Apple would need to decide whether potentially gaining a large portion of the 90%+ global software market is worth risking a portion of their 5% stake in the global hardware market.
I personally don't think they'd lose much, because these days their machines are far better designed than the competition's, but you never know.
The real world isn't the same as simulation. Theories and simulations are just that. They're great for getting a picture, but the power needed to run them anywhere close to accurately is astronomical; just look at weather forecasts.
There are so many things in a business that would kill Apple as a software/OS-only company, it's not funny.
Just the volume they would need to put out is astronomical. People don't talk about infrastructure in software, due to its unlimited-reproduction nature, but it's still there.
You would have to fire 70% of your staff, replace them all with programmers, expand the size of the company, all while maintaining a coherent product. Not to mention tripling your support, axing the Apple Store, and so many other things. Oh yeah, and borrowing heavily without showing profits for a good couple of years.
Can it be done? Yes, over the long term, with slow, stable growth. It's unrealistic otherwise. Growth without maturity means nothing.
I just picked this up from an ongoing MacNN discussion. Many would think that this is a blatant and shameless rip-off of some Mac OS X original ideas and implementations. Others would say it is an inevitable convergence in desktop UI concepts between operating systems, now that the hardware has enough power to support such technologies. What do you AI fellows think?
That's funny, I've never seen a freely rotatable desktop-cube before, or flexible UI widgets and windows with momentum, or three-dimensionally layered Matrix type in Mac OS X.
Can somebody tell me the .plist I enable these things in?
I just picked this up from an ongoing MacNN discussion. Many would think that this is a blatant and shameless rip-off of some Mac OS X original ideas and implementations. Others would say it is an inevitable convergence in desktop UI concepts between operating systems, now that the hardware has enough power to support such technologies. What do you AI fellows think?
I think the video of the OS was pretty cool. The UI was snappy as hell! Was it Linux? Damn! I agree some of the features were unnecessary, like the watery effect when moving windows around, but other than that it was pretty cool.
...and this is after all open source, so no money is being made even if there is a party that's hurt by this.
Not so sure on the money side. There are Linux distributions you pay for, and I know people who like to pay just to support development. The line is not so clear there.
It depends what you mean by convergence. Apple implements something and everyone else does the same thing; that does not suggest convergence to me.
OK, let me chime in. I think it is a little of both. I mean, clearly someone would think to leverage GPU power at the UI level (whatever the OS), knowing what GPUs are capable of these days. But here is Apple, pushing development in that direction for years now, while everyone else just watches. So there is already a solid ground of useful ideas, and implementations of older ideas, coming from Apple, that can serve (and apparently serves) as a basis for subsequent development. At some point the line is blurred (e.g., Fast User Switching and the upcoming Spaces: old ideas implemented in an original and user-friendly way).
I agree some of the features were unnecessary, like the watery effect when moving windows around, but other than that it was pretty cool.
It looks like a technology preview for now. But at some point decisions have to be made as to what is useful and needed, and what is not.
I think it's funny that even Linux is lapping Windows now...
LOL indeed!
The window wobble effect is overkill and not very practical IMHO. I like how it "magnetises" however.
What I really like is "Spaces" on Linux, using the Cube effect. You can drag and hold a window to another space in a very cool way. I even like it better than the Leopard method.
The window wobble effect is overkill and not very practical IMHO. I like how it "magnetises" however.
Actually it's a bad idea because it's flawed.
You're not giving it more space, you're giving it extended space.
The next problem is that a cube is limited to left and right. Top and bottom would cause so much trouble, due to the menus being there.
The third problem is the idea of the movable menu. If you move your menu to the left, how is that going to affect the cube rotation? It looks quite weird and flawed.
In this manner you have screwed up various programs that rely on those borders. Not every program plays nice.
Spaces is a more polished idea, as it enforces the idea of spaces: an environment.
It may sound old school, but it's actually a GUI rule: don't let users shoot themselves in the foot.