Originally Posted by antkm1
So I understand that different OS's use different languages or whatever to operate your computer and applications.
Here's a simplistic breakdown of how it works:
- The CPU in your computer/device executes very simple instructions (e.g. set a memory location to a numerical value, add or subtract two numbers, etc.). Every CPU family has its own instruction set.
- Because CPU instructions are so simple, it would be extremely tedious and time-consuming to create a large application with them.
- So software developers use "higher level" instructions (i.e. computer languages) to create applications. These higher-level instructions are then turned into those simple CPU instructions by a compiler or an interpreter. Most computer languages can be used on many OSes.
- Even so, these higher-level languages only let you do fairly basic things which are common across all computers, which is why they can be used on a number of different OSes. So developers also need to use "libraries" or "bundles" of other functionality created by the OS developers and shipped with their operating system (e.g. to draw a window on the screen, draw text on the screen, etc.). Every operating system has its own set of "libraries" or "bundles" for doing these types of things.
- Hence a program written for one operating system is incompatible with another: the "libraries" of functionality it uses to perform tasks are specific to the operating system it was written for.
There are a couple of more complex reasons why a piece of software written for one OS can't run on another (e.g. how the OS organizes memory), but that's the gist of it.
Originally Posted by antkm1
I've been a PC/Windows user most of my life but would love to switch to Mac or a more efficient alternative, mainly because I think Windows is plagued with problems. Now, perhaps that's because it's too open to all types of programs, or just isn't very good at managing the magnitude of programs that are written for Windows, I'm not sure. And I kind of realize that OS X is more stable because of the restrictions it places on the applications it accepts.
My educated guess at the reason why Mac OS X is generally more stable than Windows:
Apple didn't try to reinvent the wheel for everything and instead used existing, well-designed, time-proven technology for OS X (the Mach/BSD-based kernel, the UNIX security model, Apache, Samba, WebKit, etc). There's a history of philosophical differences between Microsoft/Bill Gates and academia/the UNIX developer community, so Microsoft tends to cut its own path for everything, often at the peril of ignoring the lessons of the past and/or not considering the future. They do this because they intend to lock users into their own (often buggy and ill-conceived) technologies, which is good for business, but not necessarily good for the end user or the future of technology.
Over time, the lack of foresight or care taken in designing certain technologies catches up with Microsoft, and they are forced to choose between abandoning the technology and coming up with a replacement (since they refuse to budge on their distaste for adopting open technologies), or spending an inordinate amount of time patching the holes and problems in their existing technology (as they usually do). Apple, on the other hand, tends to choose the first option more often than not: replacing older technology with open, proven alternatives and hiring the people who created them.
Originally Posted by antkm1
My question (or proposition) is: why can't all these OS's just come to a compromise and use a language that works for all software, so if you did want to switch you could simply install all your same programs without having to have a "Windows" version, a "Mac" version, and a "Linux" version?
As I pointed out above, it's not the "language" that's the problem, it's the set of "libraries" bundled with each OS. These are things the OS creators have spent a lot of time and money developing, and they are what differentiate their OS from others, so they aren't likely to give them away for the sake of compatibility (Linux aside, of course).
What Google is banking on is that if you make the entire OS a web browser, and make all the web technologies behave the same, you'll get some level of consistency (since nothing exists outside of the web browser). However, I'm still not convinced that forcing all applications to run in a web browser will let all types of applications work well.