You don't have to be a SW developer to be able to think rationally. Anyone who thinks version number 10.43.657(b), IP address 192.168.0.1, or the date 2015.02.21 is impossible is simply an idiot or a troll.
Your universe is full of trolls, isn't it? Angry trolls under the bed.
BF asked a simple question. The answer is simple. The "dot" isn't a decimal place. You could make that point without recourse to nastiness.
1) No, he didn't ask a simple question. He asked an obvious question whose answer has been explained to him numerous times. He's been saying this crap for years now. Are you seriously saying I'm supposed to reply to a troll sincerely until the end of time simply because their trolling is simplistic? :rolleyes:
2) No, I can't answer him, because I recently (and finally) became aware of his trolling and blocked him as a result. Note, I responded to [@]SpamSandwich[/@] about the inability of others to think rationally. Again, you're saying I should explain for all eternity what is obvious to 10-year-olds: that a period can be used as a separator between various alphanumerics. Seriously?! Well then **** you, too. :no:
To be fair, BF never asks simple questions; there's always an anti-Apple innuendo or insult in the comment. Hence some of us rise to the bait for sport on a quiet day.
This is the right decision. There's got to be a balancing act between supporting older OS versions and making vital features compulsory. ARC has too many benefits to be ignored.
Any developer on Mac who doesn't want to update their apps can always sell direct.
Incorrect. This has been discussed ad nauseam, but the points in software versioning are separators, not a decimal data type. For example, the popular component developer Telerik includes the date as part of its point-versioning scheme: "2014.4.1209" for December 9; the next release could be "2015.1.0411" for April 11.
Besides, Apple really wants to go all Spinal Tap and go to 11 ;-); they relish the thought :-).
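That point is easy to demonstrate in code: a dotted version string is an ordered sequence of integer fields, not a decimal number. A minimal sketch in Python (the helper name and sample versions are just for illustration):

```python
def parse_version(s):
    """Split a dotted version string into a tuple of integer fields.

    Each dot is a field separator, not a decimal point, so 10.10 is a
    perfectly valid version and sorts after 10.9.
    """
    return tuple(int(field) for field in s.split("."))

# Tuples compare field-by-field, which matches how version numbers
# are actually ordered:
assert parse_version("10.10") > parse_version("10.9")
assert parse_version("2015.1.0411") > parse_version("2014.4.1209")
assert parse_version("10.43.657") == (10, 43, 657)
```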
(1) Some old technologies are specifically removed, it's true. But Apple maintains a list of widely used apps and tests OS changes against them. Still, there are always limits. Apps that use Cocoa and reasonably modern APIs continue to work for a long time. If Apple were to *never* remove old stuff, progress on new stuff would be hamstrung by the increased maintenance cost.
(2) Okay, so *roughly* a decade, anything updated post-Intel. I still play Unreal Tournament 2004 with friends, even on the latest unreleased OS X. Although honestly, apps that old get so little use overall. I know they're near and dear to the people who still use them, but in the grand scheme of things, they are probably a tiny minority.
Fortunately, these days you can run most PowerPC / Carbon stuff by running Snow Leopard Server in VMWare (it's pretty unfortunate if you don't have a developer account, though, since OS X Server is pretty expensive on its own). Still haven't found a good solution for Classic apps, though (SheepShaver is not very stable and fails on a lot of old games).
(3) I know the limitations; I've been using ARC since before it was publicly announced. ;-) When I said "unused" I didn't just mean that the symbol wasn't directly referenced in code. Also, much of the retain/release overhead has been mitigated by the greatly increased speed of calling those methods, thanks to runtime wizardry under the hood. In my experience, performance is generally about the same between ARC and the *best* MRR code, but most MRR code is not the best, and contains bugs like extra or missing retain/release calls.
Like I said, it's usually close enough to be a decent compromise. All those extra calls do have a cost, though. MRC code that didn't use autorelease pools would come out ahead pretty much every time, I'd expect.
As for missing retain/release calls, it's pretty simple to find those using Instruments, and the clang static analyzer catches a lot of that stuff as well. ARC is, in fact, based on the code behind the static analyzer, so it's pretty comprehensive and will catch most of the cases where ARC would have helped you.
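For readers who haven't done manual retain/release, here's a toy Python model of the bookkeeping involved. It's purely illustrative (nothing like the real Objective-C runtime); it just shows the invariant ARC automates, namely that every retain must be balanced by exactly one release, with over-releases being exactly the kind of bug the tools hunt for:

```python
class ManualRefObject:
    """Toy model of manual retain/release (MRR) bookkeeping.

    Illustration only, not the Objective-C runtime: it simply enforces
    the retain/release balancing that ARC automates.
    """
    def __init__(self):
        self.refcount = 1        # objects start owned by their creator
        self.deallocated = False

    def retain(self):
        if self.deallocated:
            raise RuntimeError("retain after dealloc (use-after-free)")
        self.refcount += 1

    def release(self):
        if self.deallocated:
            raise RuntimeError("release after dealloc (over-release)")
        self.refcount -= 1
        if self.refcount == 0:
            self.deallocated = True   # stand-in for dealloc

obj = ManualRefObject()
obj.retain()       # one extra owner...
obj.release()      # ...balanced here
obj.release()      # balances the creator's ownership; object is freed
assert obj.deallocated
```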
The search up the stack is done at compile time (preprocessing time).
That's not true. The compiler inserts a call to objc_autoreleaseReturnValue() at the method's return, and a call to objc_retainAutoreleasedReturnValue() at the call site. At runtime, the callee-side call checks whether the caller is about to invoke the caller-side call, and if both are present, they each skip what they'd normally do. The check has to happen at runtime because the same function or method may be called by either ARC or non-ARC code, and due to the dynamic nature of the Objective-C runtime, the compiler can't completely predict who may call the method (especially if it's framework code).
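Here's a toy Python model of that handshake. In the real runtime, the objc_autoreleaseReturnValue call emitted in the returning method inspects the caller's code to decide whether an objc_retainAutoreleasedReturnValue call follows immediately; in this sketch a simple flag stands in for that runtime check, so treat it as an illustration of the idea only:

```python
# Toy model of the ARC return-value handshake. A module-level flag
# stands in for the runtime's check of the caller's code; this is an
# illustration of the concept, not the actual mechanism.

events = []
handshake_armed = False   # "caller will retain immediately"

def callee_return(value):
    # objc_autoreleaseReturnValue analogue, emitted at the return.
    if handshake_armed:
        events.append("callee: skip autorelease (handshake)")
    else:
        events.append("callee: autorelease")
    return value

def arc_caller():
    # An ARC caller participates in the handshake, so both the
    # autorelease and the matching retain are skipped.
    global handshake_armed
    handshake_armed = True
    value = callee_return("obj")
    handshake_armed = False
    events.append("caller: skip retain (handshake)")
    return value

def mrr_caller():
    # A non-ARC caller doesn't participate, so the plain
    # autorelease path runs; this is why the check must be dynamic.
    return callee_return("obj")

arc_caller()
mrr_caller()
assert events == [
    "callee: skip autorelease (handshake)",
    "caller: skip retain (handshake)",
    "callee: autorelease",
]
```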
Retail price for the server version (which is the only version that works in a VM due to licensing issues; with 10.7 and up the license was modified to allow the standard version to run in a VM, but Rosetta was already removed in that version, so...) was $499.
Of course, it's no longer sold, so nowadays you'll be buying it used, for whatever price people decide to sell it for on eBay. So, probably less than $499, but assuredly more than the $30 that the standard Snow Leopard discs went for.
If you have a developer account, which costs $100 per year, you can download an install image for SL server for free. This is, honestly, probably the most economical option.
There are, of course, hacks like modifying plists in the system to make Snow Leopard Client look like Snow Leopard Server, which is usually enough to get the VMs to run it. This completely breaks Software Update, though, so you either won't get any security updates, or possibly you'll get the wrong ones and break things.
For that cost and effort it seems like it would be better to buy an old Mac that supports a newer version of Mac OS X with the $20 Mac App Store version of OS X Server.
OT: Is there a way to get either AFP or SMB to connect to a network share faster? It takes 20-22 seconds, which is far too long.
The point is to run old Mac software that requires Rosetta in a virtual machine, not to run an actual server. A newer version of OS X with the $20 App Store version of OS X Server can't do that.
It means apps will have a lower peak memory footprint (a lower "high-water mark"). Garbage collection lets unused memory build up for a while and then cleans it all up in one go, whereas ARC cleans up as it goes.
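CPython happens to make a convenient analogy here, since it mixes both strategies: reference counting frees most objects the instant they become unreachable (like ARC), while cyclic garbage waits for a batch pass of the cycle collector (like a tracing GC), which is exactly what raises the high-water mark. A small demonstration:

```python
import gc

freed = []

class Tracked:
    """Records its own name when it is finally deallocated."""
    def __init__(self, name):
        self.name = name
    def __del__(self):
        freed.append(self.name)

# Refcounted path: freed the moment the last reference goes away,
# like ARC's release-as-you-go behavior.
a = Tracked("acyclic")
del a
assert freed == ["acyclic"]

# Cyclic path: the refcount never reaches zero, so the object lingers
# until the collector's next batch pass -- the "high water" effect.
b = Tracked("cyclic")
b.self_ref = b
del b
assert "cyclic" not in freed   # still alive after del
gc.collect()                   # deferred batch cleanup
assert "cyclic" in freed
```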
I wonder if this means the garbage collector will be removed (as opposed to deprecated but still there) in OS X 10.11?
I'm not 100% sure what Apple does differently from Microsoft in this regard, but Microsoft ships multiple runtimes, one for each C/C++ compiler version it has ever released. This also tends to be true on Unix-like systems; however, the system compiler always compiles against the most recent runtime. So as long as developers keep developing with an older version of Xcode on an older version of OS X (e.g. 10.8), they can still compile software without updating it.
That said, Apple probably doesn't want buggy software in the App Store, so not using deprecated functionality is the least that can be required. This has repercussions for older libraries that some software uses.
At the end of the day, Apple can make any requirements it wants of developers. This is less of an issue than, say, "NO FLASH" on iOS 8, but it may make software less portable... assuming anyone other than Apple has actually ported Obj-C software beyond OS X.
11.0 will happen when there is a major shift in Mac OS X.
I believe Apple will create 11.0 as an A-series-only version of the OS, and continue with 10.11, 10.12, etc. for some time for those who prefer Intel. At some point, when the transition away from Intel is complete, the 10.x series will be stopped.
Almost everything in the OS X SDK is dynamically linked, so if either libauto or the GC-enabled versions of the frameworks go away, all GC apps will stop working, no matter what version of Xcode they were compiled with.
I believe Apple will create 11.0 as an A-series-only version of the OS, and continue with 10.11, 10.12, etc. for some time for those who prefer Intel. At some point, when the transition away from Intel is complete, the 10.x series will be stopped.
Oh lord, not this again. Look, I'm sure Apple has a prototype somewhere running on ARM, because they have to be prepared for contingencies. But for Apple to phase out Intel would require ARM to be at least equal to it in speed, and look, it's just not there yet, and probably won't be for a long time.
Where is your proof? Why can't a bespoke ARM SoC designed by Apple have the same or better performance as the currently shipping 1.4GHz Intel processor in the entry-level Mac mini?
Where's your proof that it can have the same or better performance as the 2.7GHz Xeon in the currently shipping top-end Mac Pro? Because that's what would be necessary to phase out the Intel processor.
1) I don't have to prove it, because I'm not making any absolute claims the way you are. You are the one saying it's not possible for any desktop system from Apple to run ARM and still be fast enough for the user's needs. Hence, the onus is on you.
2) You're clearly in over your head as you moved the argument from Mac OS X running on ARM to now exclusively referring to "same or better performance as the 2.7GHz Xeon in the currently shipping top-end Mac Pro."