Apple says Mac app makers must transition to ARC memory management by May

Comments

  • Reply 41 of 76
    solipsismy wrote: »
    You don't have to be a SW developer to be able to think rationally. Anyone who thinks version number 10.43.657(b), IP address 192.168.0.1, or the date 2015.02.21 is impossible is simply an idiot or a troll.

    Yes, indeed!
  • Reply 42 of 76
    asdasd Posts: 5,686 member
    solipsismy wrote: »
    You don't have to be a SW developer to be able to think rationally.

    BF asked a simple question. The answer is simple. The "dot" isn't a decimal place. You could make that point without recourse to nastiness.
  • Reply 43 of 76
    asdasd Posts: 5,686 member
    No, but only software developers do software development.

    Which is an answer to a question I didn't ask.
  • Reply 44 of 76
    asdasd wrote: »
    Which is an answer to a question I didn't ask.

    Try another.
  • Reply 45 of 76
    asdasd wrote: »
    Your universe is full of trolls, isn't it? Angry trolls under the bed.

    BF asked a simple question. The answer is simple. The "dot" isn't a decimal place. You could make that point without recourse to nastiness.

    1) No, he didn't ask a simple question. He asked an obvious question to which he's been given the answer numerous times. He's been saying this crap for years now. Are you fucking saying I'm supposed to reply to a fucking troll sincerely until the end of time simply because their trolling is simplistic? :rolleyes:

    2) No, I can't answer him, because I recently (and finally) became aware of his trolling and as a result blocked him. Note, I responded to @SpamSandwich about the inability of others to think rationally. Again, you're saying that I should continue for all eternity to explain what is obvious to 10-year-olds: that a period can be used as a separator between various alphanumerics. Seriously?! Well then **** you, too. :no:
  • Reply 46 of 76
    MacPro Posts: 19,728 member
    asdasd wrote: »
    Your universe is full of trolls, isn't it? Angry trolls under the bed.

    BF asked a simple question. The answer is simple. The "dot" isn't a decimal place. You could make that point without recourse to nastiness.

    To be fair, BF never asks simple questions; he always has an anti-Apple innuendo or insult in the comment. Hence some of us rise to the bait on a quiet day for sport. :D
  • Reply 47 of 76
    richl Posts: 2,213 member

    This is the right decision. There's got to be a balancing act between supporting older OS versions and making vital features compulsory. ARC has too many benefits to be ignored.

     

    Any developer on Mac who doesn't want to update their apps can always sell direct.

  • Reply 48 of 76
    foggyhillfoggyhill Posts: 4,767member
    Quote: Originally Posted by NolaMacGuy

    Incorrect. This has been discussed ad nauseam, but the points in software versioning represent separators, not a decimal data type. For example, popular component developer Telerik includes the date as part of their point versioning scheme: "2014.4.1209" for December 9; the next release could be "2015.1.0411" for April 11.

     

    Besides, Apple really wants to go all Spinal Tap and go to 11 ;-). They relish the thought :-).
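
    To make the "separator, not decimal" point concrete, here is a minimal sketch of my own (not from the thread) using Foundation's NSString API; the version strings are the ones from the quote above, and the field meanings (year / release / MMDD) are just Telerik's scheme as described there:

        #import <Foundation/Foundation.h>

        int main(void) {
            @autoreleasepool {
                // The dots are field separators, not a decimal point: split the
                // date-based version string into its year / release / MMDD parts.
                NSArray *parts = [@"2014.4.1209" componentsSeparatedByString:@"."];
                NSLog(@"year=%@ release=%@ date=%@", parts[0], parts[1], parts[2]);

                // Compare versions field by field (numerically), never as one float,
                // which is why 10.10 correctly sorts after 10.9.
                NSComparisonResult order = [@"10.9" compare:@"10.10" options:NSNumericSearch];
                NSLog(@"10.9 %s 10.10", order == NSOrderedAscending ? "<" : ">=");
            }
            return 0;
        }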

  • Reply 49 of 76



    Read the article. They suggested it 3 years ago.

  • Reply 50 of 76
    (1) Some old technologies are specifically removed, it's true. But Apple maintains a list of widely used apps and tests OS changes against them. Still, there are always limits. Apps that use Cocoa and reasonably modern APIs continue to work for a long time. If Apple were to *never* remove old stuff, progress on new stuff would be hamstrung due to the increased maintenance cost.

    (2) Okay, so *roughly* a decade, anything updated post-Intel. I still play Unreal Tournament 2004 with friends, even on the latest unreleased OS X. Although honestly, apps that old get so little use overall. I know they're near and dear to the people who still use them, but in the grand scheme of things, they are probably a tiny minority.
    Fortunately, these days you can run most PowerPC / Carbon stuff by running Snow Leopard Server in VMWare (it's pretty unfortunate if you don't have a developer account, though, since OS X Server is pretty expensive on its own). Still haven't found a good solution for Classic apps, though (SheepShaver is not very stable and fails on a lot of old games).
    (3) I know the limitations; I've been using ARC since before it was publicly announced. ;-) When I said "unused" I didn't just mean that the symbol wasn't directly referenced in code. Also, much of the retain/release overhead has been mitigated by the greatly increased speed of calling those methods, due to runtime wizardry under the hood. In my experience, performance is generally pretty much the same between ARC and the *best* MRR code, but most MRR code is not the best, and contains bugs like extra/missing retain/release calls.
    Like I said, it's usually close enough to be a decent compromise. All those extra calls do have a cost, though. MRC code that didn't use autorelease pools would come out ahead pretty much every time, I'd expect.

    As for missing retain/release calls, it's pretty simple to find those using Instruments, and the clang static analyzer catches a lot of that stuff as well. ARC is, in fact, based on the code behind the static analyzer, so really, it's pretty comprehensive, and the analyzer will catch most of the cases where ARC would have helped you.
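
    To make the analyzer point concrete, here is a hypothetical MRC snippet of my own (the class and method names are invented, and it's built with -fno-objc-arc) of the sort that clang --analyze flags; under ARC the mistake can't even be written:

        #import <Foundation/Foundation.h>

        @interface ReplyFormatter : NSObject
        - (NSString *)title;
        @end

        @implementation ReplyFormatter
        - (NSString *)title {
            // Returns a +1 object from a method whose name isn't in the
            // alloc/new/copy/mutableCopy family: the static analyzer reports
            // this as a potential leak / ownership-convention violation.
            NSString *s = [[NSString alloc] initWithFormat:@"Reply %d of %d", 50, 76];
            return s;
        }
        @end

        int main(void) {
            ReplyFormatter *f = [[ReplyFormatter alloc] init];
            NSLog(@"%@", [f title]);   // caller assumes +0 and never releases: the leak
            [f release];
            return 0;
        }
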
    asdasd wrote: »
    The searching up the stack is done at compile time (pre process time).
    That's not true. The compiler inserts a call to objc_autoreleaseReturnValue() at the method's return, and a call to objc_retainAutoreleasedReturnValue() at the call site. At runtime the former looks at the return address on the stack to see whether the latter immediately follows the call, and if both are present, they both skip what they'd normally do. The check has to happen at runtime, though, because the same function or method may be called by either ARC or non-ARC code, and due to the dynamic nature of the Objective-C runtime, the compiler can't completely predict who may call the method (especially if it's framework code).
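
    If it helps to picture it, here is a rough sketch of my own showing the shape of what the compiler emits. The two objc_* functions are real libobjc entry points, but declaring and calling them by hand (and the WidgetFactory class) is purely illustrative; you'd build this as MRC (-fno-objc-arc), since under ARC the compiler inserts these calls itself:

        #import <Foundation/Foundation.h>

        // Real libobjc functions, normally emitted by the compiler; declared here
        // only so this illustration links. Don't call them by hand in real code.
        extern id objc_autoreleaseReturnValue(id obj);
        extern id objc_retainAutoreleasedReturnValue(id obj);

        @interface WidgetFactory : NSObject
        - (id)makeWidget;
        @end

        @implementation WidgetFactory
        - (id)makeWidget {
            id widget = [[NSObject alloc] init];
            // Callee side: ARC ends a method that returns a +0 (autoreleased-by-
            // convention) object with this call instead of a plain autorelease.
            return objc_autoreleaseReturnValue(widget);
        }
        @end

        int main(void) {
            @autoreleasepool {
                WidgetFactory *factory = [[WidgetFactory alloc] init];
                // Caller side: ARC follows the message send with this call. At
                // runtime the callee checks whether the caller does this; if so,
                // the autorelease/retain pair is skipped (whether that optimization
                // actually fires in this hand-written version depends on the
                // emitted code).
                id widget = objc_retainAutoreleasedReturnValue([factory makeWidget]);
                NSLog(@"%@", widget);
                [widget release];   // balance the +1 we now own (MRC build)
                [factory release];
            }
            return 0;
        }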
  • Reply 51 of 76
    it's pretty unfortunate if you don't have a developer account, though, since OS X Server is pretty expensive on its own

    How much is it?
  • Reply 52 of 76
    solipsismy wrote: »
    How much is it?
    Retail price for the server version (which is the only version that works in a VM due to licensing issues; with 10.7 and up the license was modified to allow the standard version to run in a VM, but Rosetta was already removed in that version, so...) was $499.

    Of course, it's no longer sold, so nowadays you'll be buying it used, for whatever price people decide to sell it for on eBay. So, probably less than $499, but assuredly more than the $30 that the standard Snow Leopard discs went for.

    If you have a developer account, which costs $100 per year, you can download an install image for SL server for free. This is, honestly, probably the most economical option.

    There are, of course, hacks like modifying plists in the system to make Snow Leopard Client look like Snow Leopard Server, which is usually enough to get the VMs to run it. This completely breaks Software Update, though, so you either won't get any security updates, or possibly you'll get the wrong ones and break things.
  • Reply 53 of 76
    Retail price for the server version (which is the only version that works in a VM due to licensing issues; with 10.7 and up the license was modified to allow the standard version to run in a VM, but Rosetta was already removed in that version, so...) was $499.

    Of course, it's no longer sold, so nowadays you'll be buying it used, for whatever price people decide to sell it for on eBay. So, probably less than $499, but assuredly more than the $30 that the standard Snow Leopard discs went for.

    If you have a developer account, which costs $100 per year, you can download an install image for SL server for free. This is, honestly, probably the most economical option.

    There are, of course, hacks like modifying plists in the system to make Snow Leopard Client look like Snow Leopard Server, which is usually enough to get the VMs to run it. This completely breaks Software Update, though, so you either won't get any security updates, or possibly you'll get the wrong ones and break things.

    For that cost and effort it seems like it would be better to buy an old Mac that supports a newer version of Mac OS X with the $20 Mac App Store version of OS X Server.


    OT: Is there a way to get either AFP or SMB to connect to a network share faster? It takes 20-22 seconds, which is far too long.
  • Reply 54 of 76
    solipsismy wrote: »
    For that cost and effort it seems like it would be better to buy an old Mac that supports a newer version of Mac OS X with the $20 Mac App Store version of OS X Server.


    OT: Is there a way to get either AFP or SMB to connect to a network share faster? It takes 20-22 seconds, which is far too long.
    The point is to run old Mac software that requires Rosetta in a virtual machine, not to run an actual server. A newer version of OS X with the $20 App Store version of OS X Server can't do that.
  • Reply 55 of 76
    misa Posts: 827 member
    ascii wrote: »

    It means apps will use less maximum memory (lower "high water mark"). Garbage collection lets unused memory build up for a while and then cleans it all up in one go, whereas ARC cleans up as it goes. 
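
    A minimal sketch of my own of that "cleans up as it goes" behavior, using ARC with an explicit inner autorelease pool; the loop and file names are invented for illustration:

        #import <Foundation/Foundation.h>

        int main(void) {
            @autoreleasepool {
                for (int i = 0; i < 1000; i++) {
                    @autoreleasepool {
                        // Each pass's temporaries are released as soon as the inner
                        // pool drains, so the "high water mark" stays near one pass's
                        // worth of memory. A tracing GC would let the garbage pile up
                        // until a collection ran, then clean it all up in one go.
                        NSString *path = [NSString stringWithFormat:@"/tmp/file-%d.dat", i];
                        NSData *blob = [NSData dataWithContentsOfFile:path];
                        NSLog(@"%d: %lu bytes", i, (unsigned long)blob.length);
                    }
                }
            }
            return 0;
        }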

    I wonder if this means the garbage collector will be removed (as opposed to deprecated but still there) in OS X 10.11?

    I'm not 100% sure what Apple does differently from Microsoft in this regard, but Microsoft has multiple runtimes, one for each C/C++ compiler version they've ever released. With Unix-like systems this also tends to be true; however, the system compiler always compiles to the most recent runtime. So as long as developers still develop with an older version of Xcode on an older version of OS X (e.g. 10.8), they can still compile software without updating it.

    That said, Apple probably doesn't want buggy software in the App Store, so not using deprecated functionality is the least that can be required. This has repercussions for older libraries that some software uses.

    At the end of the day, Apple can make any requirements they want of developers. This is less of an issue than, say, "NO FLASH" on iOS 8, but may make software less portable... assuming anyone other than Apple has actually ported Obj-C software outside of OS X.
  • Reply 56 of 76

    11.0 will happen when there is a major shift in Mac OS X.

     

    I believe Apple will create 11.0 as an A-series-only version of the OS, and continue with 10.11, 10.12, etc. for some time, for those who prefer Intel. And at some point, when the transition away from Intel is complete, the 10.x series will be stopped.

  • Reply 57 of 76
    misa wrote: »
    I'm not 100% sure what Apple does differently from Microsoft in this regard, but Microsoft has multiple runtimes, one for each C/C++ compiler version they've ever released. With Unix-like systems this also tends to be true; however, the system compiler always compiles to the most recent runtime. So as long as developers still develop with an older version of Xcode on an older version of OS X (e.g. 10.8), they can still compile software without updating it.
    Almost everything in the OS X SDK is dynamically linked, so if either libauto or the GC-enabled versions of the frameworks go away, all GC apps will stop working, no matter what version of Xcode they were compiled with.
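
    A quick way to see that dynamic linking for yourself is a tiny sketch like this one of mine, using the Objective-C runtime's class_getImageName():

        #import <Foundation/Foundation.h>
        #import <objc/runtime.h>

        int main(void) {
            @autoreleasepool {
                // Framework classes live in shared images that the loader binds at
                // launch, not inside the app binary, so a change to those frameworks
                // (like dropping GC support) hits every app no matter which Xcode
                // built it.
                NSLog(@"NSArray is defined in: %s", class_getImageName([NSArray class]));
            }
            return 0;
        }
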
    misa wrote: »
    I believe Apple will create 11.0 as an A-series-only version of the OS, and continue with 10.11, 10.12, etc. for some time, for those who prefer Intel. And at some point, when the transition away from Intel is complete, the 10.x series will be stopped.
    Oh lord, not this again. Look, I'm sure Apple has a prototype somewhere running on ARM, because they have to be prepared for contingencies. But for Apple to phase out Intel would require ARM to be at least equal to it in speed, and look, it's just not there yet, and probably won't be for a long time.
  • Reply 58 of 76
    But for Apple to phase out Intel would require ARM to be at least equal to it in speed, and look, it's just not there yet, and probably won't be for a long time.

    Where is your proof? Why can't a bespoke ARM SoC designed by Apple have the same or better performance as the currently shipping 1.4GHz Intel processor in the entry-level Mac mini?
  • Reply 59 of 76
    solipsismy wrote: »
    Where is your proof? Why can't a bespoke ARM SoC designed by Apple have the same or better performance as the currently shipping 1.4GHz Intel processor in the entry-level Mac mini?
    Where's your proof that it can have the same or better performance as the 2.7GHz Xeon in the currently shipping top-end Mac Pro? Because that's what would be necessary to phase out the Intel processor.
  • Reply 60 of 76
    Where's your proof that it can have the same or better performance as the 2.7GHz Xeon in the currently shipping top-end Mac Pro? Because that's what would be necessary to phase out the Intel processor.

    1) I don't have to prove it, because I'm not making any absolute claims the way you are. You are the one saying it's not possible for any desktop system from Apple to run ARM and still be fast enough for the user's needs. Hence, the onus is on you.

    2) You're clearly in over your head as you moved the argument from Mac OS X running on ARM to now exclusively referring to "same or better performance as the 2.7GHz Xeon in the currently shipping top-end Mac Pro."