High performance tools for OS X?
Is there any OS X software that lets you know if your system is running optimally and what the bottlenecks are? I remember seeing an app that checked pageouts to let you know if you needed more RAM, so something like that but more full-featured.
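From what I understand, an app like that mostly just reads the kernel's VM counters. As a rough, untested C sketch (my own guess, using the Mach host_statistics() call; the file name and output format are made up), it could be as simple as:

/* pageouts.c - rough sketch; build with: gcc -o pageouts pageouts.c */
#include <stdio.h>
#include <mach/mach.h>

int main(void)
{
    vm_statistics_data_t vmstat;
    mach_msg_type_number_t count = HOST_VM_INFO_COUNT;

    /* ask the kernel for its cumulative VM counters */
    if (host_statistics(mach_host_self(), HOST_VM_INFO,
                        (host_info_t)&vmstat, &count) != KERN_SUCCESS) {
        fprintf(stderr, "host_statistics failed\n");
        return 1;
    }

    /* a pageout count that climbs steadily under normal use is the
       classic "buy more RAM" sign */
    printf("pageins:  %u\n", vmstat.pageins);
    printf("pageouts: %u\n", vmstat.pageouts);
    return 0;
}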
Also, is there any way to optimize OS X itself? So many people ask me if there are ways to make it faster or diagnose speed issues, and I don't know of any tools to do that.
When a program runs slowly, is it a hard drive issue, OS X's memory allocation, or a RAM problem? If there were some software that could tell you where the bottleneck was, it would be most helpful, especially if it could monitor individual apps. I've used the developer tools and they sort of do that, but they only really give information that would help developers.
Another thing I was wondering about is replacing system components with highly optimized versions. Would it be possible to compile the open source parts of OS X and replace them, similar to how Linux users can compile their own kernel? Apple obviously makes OS X work well across their machine range, but possibly they don't optimize as much as they could, to ensure compatibility.
Has anyone ever heard of people who know about operating systems taking OS X apart and analyzing where it can be improved? I read somewhere that Apple was criticized for using a microkernel as opposed to a monolithic kernel, solely because of performance issues.
One thing I think needs some serious improvement is how OS X deals with certain layers. I mean, why are X11 apps and Java apps so slow? For me, on a 1.25GHz system, most apps running in those system layers are unusable, and I don't think that should be the case, not when Linux has no problem on even lower-end machines. Should I use a third-party Java VM, perhaps?
Comments
Well, running "top" without any options in the command line (/Applications/Utilities/Terminal[.app]) shows a whole lot of stuff. You'll have to read the man page for more information, as I don't use it for more than figuring out what's making the computer run hot and getting the fan going (PowerBook).
If you want something more colorful and pretty, there's Activity Monitor, also in the Utilities folder. Just checking now, it shows everything that top shows you, so you might as well use it.
I would say it's a bad idea (if possible at all) to try to compile OS X for your computer. If anything is possible, it would be with the Darwin source code, which you can download.
The microkernel issue is one I've heard about but am not really familiar with. I know that a microkernel relies on Kernel Extensions (KEXTs). This modularity is where the speed cost comes in, but it also allows speedier and more reliable development, as KEXTs can't screw up everything like they would if they were embedded right in the kernel. (I think.) Hopefully someone else can clarify or correct.
As far as Java and X11 go, I don't know what you mean. I've got a 1.25GHz G4 PowerBook with 512MB and everything runs fine.
Originally posted by macserverX
Well, running "top" without any options in the command line (/Applications/Utilities/Terminal[.app]) shows a whole lot of stuff. You'll have to read the man page for some more information as I don't use for more than to figure what's making the computer run hot and getting the fan going (PowerBook).
If you want something more colorful and pretty, there's Activity Monitor, also in the Utilities folder. Just checking now, it shows everything that top shows you, so you might as well use it.
Yeah, they give some useful info for sure, but what I'm looking for is a way to sample a running application and find out what it's doing and how much time it's taking. The Sample Process option in Activity Monitor doesn't seem to return any useful information.
For example, the sample could return information about how much time disk I/O is taking, or whether the process is addressing RAM and having problems with the VM cache.
The OpenGL Profiler is quite good in this respect because it returns which function calls were made, and you can sort them by the time they take. Again, though, that only really helps developers optimize their own code.
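The closest I've gotten at the API level is asking the kernel about one task at a time. Here's a rough, untested C sketch (my own; task_for_pid() may need root, and it won't show disk I/O, just memory and CPU time) of pulling basic stats for a single process:

/* taskstat.c - rough sketch; build with: gcc -o taskstat taskstat.c,
   then run: ./taskstat <pid> */
#include <stdio.h>
#include <stdlib.h>
#include <mach/mach.h>

int main(int argc, char *argv[])
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s pid\n", argv[0]);
        return 1;
    }

    /* get a Mach port for the target process (needs privileges) */
    mach_port_t task;
    if (task_for_pid(mach_task_self(), atoi(argv[1]), &task) != KERN_SUCCESS) {
        fprintf(stderr, "task_for_pid failed (try sudo)\n");
        return 1;
    }

    struct task_basic_info info;
    mach_msg_type_number_t count = TASK_BASIC_INFO_COUNT;
    if (task_info(task, TASK_BASIC_INFO,
                  (task_info_t)&info, &count) != KERN_SUCCESS) {
        fprintf(stderr, "task_info failed\n");
        return 1;
    }

    /* resident = actual RAM in use; a large virtual/resident gap on a
       busy machine means the VM system is doing a lot of work */
    printf("resident size: %u KB\n", (unsigned int)(info.resident_size / 1024));
    printf("virtual size:  %u KB\n", (unsigned int)(info.virtual_size / 1024));
    /* note: these times only cover the task's terminated threads;
       live threads would need thread_info() as well */
    printf("user time:     %d.%06ds\n", info.user_time.seconds, info.user_time.microseconds);
    printf("system time:   %d.%06ds\n", info.system_time.seconds, info.system_time.microseconds);
    return 0;
}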
Originally posted by macserverX
I would say it's a bad idea (if possible at all) to try to compile OS X for your computer. If anything is possible, it would be with the Darwin source code, which you can download.
Yeah, that's the kind of thing I meant. I know only parts of OS X are open source, but if the open source parts are critical system processes and could be optimized per processor, it might be very worthwhile. It would be like getting a processor upgrade.
Originally posted by macserverX
The microkernel issue is one I've heard about but am not really familiar with. I know that a microkernel relies on Kernel Extensions (KEXTs). This modularity is where the speed cost comes in, but it also allows speedier and more reliable development, as KEXTs can't screw up everything like they would if they were embedded right in the kernel. (I think.) Hopefully someone else can clarify or correct.
If kexts screw up, they take down the system like a monolithic kernel OS. I think that's why people criticized Apple's decision. But I've read something recently that suggests it isn't all that bad. Apparently, Apple designed it in such a way that certain components share address space to minimize the usual microkernel overhead. Plus, as you say, microkernels have other advantages like development speed, and they are probably far easier to maintain.
Originally posted by macserverX
As far as Java and X11 go, I don't know what you mean. I've got a 1.25GHz G4 PowerBook with 512MB and everything runs fine.
Well, it's just that any X11 app or Java app I run is much slower than native software. Gimp, Inkscape, OpenOffice, and Azureus all have performance issues for me. Apple updated Java recently and improved interface speed a lot, but it's still too slow. I don't see why X11 apps or Java apps should be slower than, say, running an OS 9 app under Classic, but for me they really are.
What X11 server do people use, btw? I use the one that came with the OS, but maybe XDarwin is better?
Originally posted by Marvin
If kexts screw up, they take down the system like a monolithic kernel OS.
Actually, no. That's precisely the difference between a monolithic kernel (e.g. Linux) and a microkernel: the kernel itself is tiny, and pretty much everything is handled in modules (kernel extensions). Kernel extensions can be dynamically loaded and unloaded, and when they crash, the kernel itself, in theory, will not.
I think that's why people criticized Apple's decision.
The downside to a microkernel is performance, or rather the lack thereof. Because of the messaging between its modules, a microkernel is slower than a monolithic kernel, which hosts most of its features internally.
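To make the messaging cost concrete, here's a rough, untested C sketch (self-messaging within one task, just to show the mechanics) of the mach_msg() round trip that microkernel components pay on every request:

/* machmsg.c - rough sketch; build with: gcc -o machmsg machmsg.c */
#include <stdio.h>
#include <string.h>
#include <mach/mach.h>

typedef struct {
    mach_msg_header_t header;
    int payload;
} msg_t;

int main(void)
{
    /* create a port we can both send to and receive from */
    mach_port_t port;
    mach_port_allocate(mach_task_self(), MACH_PORT_RIGHT_RECEIVE, &port);
    mach_port_insert_right(mach_task_self(), port, port, MACH_MSG_TYPE_MAKE_SEND);

    msg_t send;
    memset(&send, 0, sizeof(send));
    send.header.msgh_bits = MACH_MSGH_BITS(MACH_MSG_TYPE_COPY_SEND, 0);
    send.header.msgh_size = sizeof(send);
    send.header.msgh_remote_port = port;
    send.payload = 42;

    /* each mach_msg() is a trap into the kernel plus a copy; in a pure
       microkernel, drivers and servers talk to each other this way
       constantly, which is where the overhead comes from */
    mach_msg(&send.header, MACH_SEND_MSG, sizeof(send), 0,
             MACH_PORT_NULL, MACH_MSG_TIMEOUT_NONE, MACH_PORT_NULL);

    struct { msg_t msg; mach_msg_trailer_t trailer; } rcv;
    mach_msg(&rcv.msg.header, MACH_RCV_MSG, 0, sizeof(rcv),
             port, MACH_MSG_TIMEOUT_NONE, MACH_PORT_NULL);

    printf("received payload: %d\n", rcv.msg.payload);
    return 0;
}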