It's about the same as telling someone they have to stop jabbing themselves with a fork. Having coded an iOS app prior to ARC, I remember how painful it was and how careful you had to be.
It's only painful if you don't know how to do it. After I learned exactly what to do, it's just a pattern and one that I liked, and still like to have control over. I can choose when things are stored in memory and when to remove them. Maybe I'm just a little sick in the head, but I sort of like that level of control.
It's only painful if you don't know how to do it. After I learned exactly what to do, it's just a pattern and one that I liked, and still like to have control over. I can choose when things are stored in memory and when to remove them. Maybe I'm just a little sick in the head, but I sort of like that level of control.
Well, unless someone is programming an embedded system, a driver, or an OS, I don't think a programmer should have this kind of control. The OS should take care of that. I come from a time where we really had to take care of everything. I believe these days are better.
I have gone back and forth with a major developer - one who should know better and has deep pockets - about their rather mission-critical app crashing, with reports pointing to poorly implemented garbage collection. Hopefully this mandate will solve this problem.
Retain/release is a well-founded feature of ObjC/Cocoa going back to NeXTSTEP 3.3. It isn't garbage collection. ARC just automates it, and with blocks it does a lot more.
Is this related to the issue of self-installed, non-Apple SSDs in older Macs not supporting TRIM? OS X 10.10 removed the ability to use third-party TRIM utilities without a hack, as I recall.
You'd have thought that 10.10 would have logically been 11.0, but no, Apple had to go and upset the applecart.
Perhaps we'll see a return to logic this year with OS 10.11.0.
Oh look, my Mac's clock went from 10:09 a.m. to 10:10 a.m.; it didn't go to 11:00 a.m.! There goes another applecart! Gosh, it seems not all things progress in base-10 increments, who'd have thunk it, eh?
Hopefully you are too young to have been around to use the pre-decimal currency in the UK. Your applecart would have been screwed up big time. Try adding one penny to nine pence, or one shilling to nine shillings. If you were around, perhaps you were happy exchanging nine shillings plus another shilling for half a quid?
ARC inserts retain/release calls at compile time, so there is no searching the stack at runtime. ARC is generally more efficient than MRR code, because it can discard objects as soon as they are unused. Really tight MRR code can be as fast, but it's hard to get it optimal by hand, whereas with ARC the compiler can handle memory management and optimize the heck out of it.
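To make that concrete, here's roughly what the difference looks like for a simple setter (a minimal sketch; the Widget class, the _title ivar, and the setter are made up for illustration):

#import <Foundation/Foundation.h>

@interface Widget : NSObject {
    NSString *_title;
}
- (void)setTitle:(NSString *)title;
@end

@implementation Widget
- (void)setTitle:(NSString *)title {
    // Under MRR the programmer writes the ownership calls by hand:
    //     [title retain];      // take ownership of the new value first
    //     [_title release];    // then drop ownership of the old value
    //     _title = title;
    // Under ARC only the assignment appears in the source; the compiler
    // inserts the equivalent retain/release calls at compile time.
    _title = title;
}
@end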
It's ok :S
I'm a freak who really loved manually managing memory.
2) Baloney, Apple works hard to not break existing Mac apps;
That hasn't been true since Leopard came out. The new Apple is always removing things that older apps need to work, whether it's Classic (10.5), Rosetta (10.7), or 32-bit support and Carbon (almost assuredly 10.11 or 10.12). It irritates me, since it often kills old games that I enjoyed, but that's Apple's philosophy these days. Out with the old. Anyone who expects the GC, which was never even that widely used anyway and has had a clear migration path for ages, to stick around is deluding him/herself.
I have apps from over a decade ago that still work.
No you don't. A decade ago was February 2005. Apple didn't even announce the Intel switch until June 2005, and the first Intel-compatible Mac apps weren't released until well after that. Any Mac app from earlier than 2005 would have been PowerPC-only, which won't work in any OS X version later than 10.7, since that version removed the Rosetta environment. So unless you're referring to Java applets or apps written for some similar third-party VM and not linked directly against Apple's frameworks, or you're still running OS X 10.6.x, then no, you do not have Mac apps from over a decade ago that still work.
ARC inserts retain/release calls at compile time, so there is no searching the stack at runtime. ARC is generally more efficient than MRR code, because it can discard objects as soon as they are unused. Really tight MRR code can be as fast, but it's hard to get it optimal by hand, whereas with ARC the compiler can handle memory management and optimize the heck out of it.
In theory it can discard objects as soon as they are unused. In practice, it can't, because of things like this:
NSData *data = ...
const char *bytes = data.bytes;
NSUInteger len = data.length;
for (NSUInteger i = 0; i < len; i++) {
    DoSomethingWith(bytes[i]);
}
The 'data' variable is never used after the third line, so in theory the object could be released immediately after that line. However, the fact that 'bytes' refers to the internal storage of 'data' means that if that were done, the loop that follows might either process garbage data or crash. For this reason, in practice ARC will usually just insert the release when the object goes out of scope, rather than when it is no longer being referenced.
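For what it's worth, if you do need the object kept alive to the end of the scope while you use an interior pointer like that, Clang's ARC has an annotation for exactly this case; a minimal sketch, with CopySomeData() and DoSomethingWith() standing in as hypothetical helpers for the elided parts above:

#import <Foundation/Foundation.h>

extern NSData *CopySomeData(void);        // hypothetical source of the data
extern void DoSomethingWith(char byte);   // hypothetical per-byte work

void ProcessData(void) {
    // objc_precise_lifetime asks ARC to keep 'data' alive until the end of
    // this scope, so the interior 'bytes' pointer stays valid for the loop.
    __attribute__((objc_precise_lifetime)) NSData *data = CopySomeData();
    const char *bytes = data.bytes;
    NSUInteger len = data.length;
    for (NSUInteger i = 0; i < len; i++) {
        DoSomethingWith(bytes[i]);
    }
}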
The older GC mechanism did work the way you describe, which is part of why it was so unstable. In situations like the one above, it could release 'data' before the program was done with its bytes, and that led to all kinds of nasty stuff happening.
Where ARC can be more efficient than MRC is in the reduction of autoreleased objects. Many methods that would normally return an autoreleased object now check up the stack to see if the caller is also using ARC, and if it is, they just return the object without doing any retain/release/autorelease dance around it. However, this efficiency is offset by the fact that ARC inserts a lot of extraneous retains and releases that would not be present in MRC code, and also by the fact that, despite being more efficient than a retain and autorelease, that check up the stack has some performance cost of its own. In the final tally, ARC usually comes out slightly less performant than MRC code, but the difference is slight enough that it's a decent compromise, especially since the option of using MRC (and especially dropping down to straight C) is still there when it's needed.
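For the curious, that "check up the stack" is implemented as a handshake between a pair of Objective-C runtime calls that the compiler emits on both sides of the return. A rough sketch (the Widget class and factory method are made up; the objc_* names are the real runtime entry points, not something you call yourself):

#import <Foundation/Foundation.h>

@interface Widget : NSObject
+ (instancetype)widget;
@end

@implementation Widget
+ (instancetype)widget {
    // Under MRC you would write [[[Widget alloc] init] autorelease] here.
    // Under ARC the compiler emits objc_autoreleaseReturnValue(...) instead.
    return [[Widget alloc] init];
}
@end

void UseWidget(void) {
    // At an ARC call site the compiler emits the matching half,
    // objc_retainAutoreleasedReturnValue(...). When the two calls recognize
    // each other at runtime, the object is handed straight across and the
    // autorelease/retain pair is skipped entirely.
    Widget *w = [Widget widget];
    (void)w;
}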
I'm a freak who really loved manually managing memory.
I know what you mean, I used to be kind of the same, and relished the challenge. Some people really loved (still love?) writing assembly by hand. It's not necessarily bad, but there are more productive ways to spend one's limited programming time and effort. ;-)
(1) That hasn't been true since Leopard came out. The new Apple is always removing things that older apps need to work, whether it's Classic (10.5), Rosetta (10.7), or 32-bit support and Carbon (almost assuredly 10.11 or 10.12). It irritates me, since it often kills old games that I enjoyed, but that's Apple's philosophy these days. Out with the old. Anyone who expects the GC, which was never even that widely used anyway and has had a clear migration path for ages, to stick around is deluding him/herself.
(2) No you don't. A decade ago was February 2005. Apple didn't even announce the Intel switch until June 2005, and the first Intel-compatible Mac apps weren't released until well after that. Any Mac app from earlier than 2005 would have been PowerPC-only, which won't work in any OS X version later than 10.7, since that version removed the Rosetta environment. So unless you're referring to Java applets or apps written for some similar third-party VM and not linked directly against Apple's frameworks, or you're still running OS X 10.6.x, then no, you do not have Mac apps from over a decade ago that still work.
(3) In theory it can discard objects as soon as they are unused. In practice, it can't, because of things like this:
for (NSUInteger i = 0; i < len; i++) { DoSomethingWith(bytes[i]); }
The 'data' variable is never used after the third line, so in theory the object could be released immediately after that line. However, the fact that 'bytes' refers to the internal storage of 'data' means that if that were done, the loop that follows might either process garbage data or crash. For this reason, in practice ARC will usually just insert the release when the object goes out of scope, rather than when it is no longer being referenced.
The older GC mechanism did work the way you describe, which is part of why it was so unstable. In situations like the one above, it could release 'data' before the program was done with its bytes, and that led to all kinds of nasty stuff happening.
Where ARC can be more efficient than MRC is in the reduction of autoreleased objects. Many methods that would normally return an autoreleased object now check up the stack to see if the caller is also using ARC, and if it is, they just return the object without doing any retain/release/autorelease dance around it. However, this efficiency is offset by the fact that ARC inserts a lot of extraneous retains and releases that would not be present in MRC code, and also by the fact that, despite being more efficient than a retain and autorelease, that check up the stack has some performance cost of its own. In the final tally, ARC usually comes out slightly less performant than MRC code, but the difference is slight enough that it's a decent compromise, especially since the option of using MRC (and especially dropping down to straight C) is still there when it's needed.
(1) Some old technologies are specifically removed, it's true. But Apple maintains a list of widely used apps and tests OS changes against them. Still, there are always limits. Apps that are using Cocoa and reasonably modern API continue to work for a long time. If Apple were to *never* remove old stuff, progress on new stuff would be hamstrung due to the increased maintenance cost.
(2) Okay, so *roughly* a decade, anything updated post-Intel. I still play Unreal Tournament 2004 with friends, even on the latest unreleased OS X. Although honestly, apps that old get so little use overall. I know they're near and dear to the people who still use them, but in the grand scheme of things, they are probably a tiny minority.
(3) I know the limitations; I've been using ARC since before it was publicly announced. ;-) When I said "unused" I didn't just mean that the symbol wasn't directly referenced in code. Also, much of the retain/release overhead has been mitigated by the greatly increased speed of calling those methods, due to runtime wizardry under the hood. In my experience, performance is generally pretty much the same between ARC and the *best* MRR code, but most MRR code is not the best, and contains bugs like extra/missing retain/release calls.
Yep, I also used to write ObjC apps using retain/release before GC or ARC were available. In fact manual retain/release is probably a little bit more efficient than ARC as it doesn't have to go searching up the stack to see if it should release a var at the end of a method or not.
The searching up the stack is done at compile time (pre process time).
The headline is also incorrect, since ARC is only required for GC apps, not manual apps.
I can't help but wonder why the next version won't be called OS X 11.0?
Generally, software versions go x.8, x.9, y.0, y.1
Generally x.0 of anything is to be avoided for production use. (x.0.2 and beyond are usually safe).
Oops! Ok, I get it, mind over marketing. Nevermind.
Incorrect. This has been discussed ad nauseam, but the points in software versioning are separators, not a decimal data type. For example, popular component developer Telerik includes the date in its versioning scheme: "2014.4.1209" for December 9; the next release could be "2015.1.0411" for April 11.
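If a concrete illustration helps, version components compare like a dotted list of integers rather than like a decimal fraction; a quick sketch using nothing more than Foundation string splitting:

#import <Foundation/Foundation.h>

// Compare two dotted version strings component by component, as integers.
// "10.10" sorts after "10.9" because the second components are 10 vs. 9,
// even though 10.10 < 10.9 if you misread them as decimal fractions.
static NSComparisonResult CompareVersions(NSString *a, NSString *b) {
    NSArray *as = [a componentsSeparatedByString:@"."];
    NSArray *bs = [b componentsSeparatedByString:@"."];
    NSUInteger count = MAX(as.count, bs.count);
    for (NSUInteger i = 0; i < count; i++) {
        NSInteger x = (i < as.count) ? [as[i] integerValue] : 0;
        NSInteger y = (i < bs.count) ? [bs[i] integerValue] : 0;
        if (x != y) {
            return (x < y) ? NSOrderedAscending : NSOrderedDescending;
        }
    }
    return NSOrderedSame;
}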
You'd have thought that 10.10 would have logically been 11.0, but no, Apple had to go and upset the applecart.
Perhaps we'll see a return to logic this year with OS 10.11.0.
somebody's not a software developer....do you even use computers? better leave this topic to the big boys. the versioning parts are numeric but do not indicate a strict decimal data type. you can parcel them however you wish, according to whatever schema is used by your org.
somebody's not a software developer....do you even use computers? better leave this topic to the big boys. the versioning parts are numeric but do not indicate a strict decimal data type. you can parcel them however you wish, according to whatever schema is used by your org.
No, but only software developers do software development.
You don't have to be a SW developer to be able to think rationally. Nobody should think version numbers 10.43.657(b), IP address 192.168.0.1, or the date 2015.02.21 are impossible.
It's about the same as telling someone they have to stop jabbing themselves with a fork. Having coded an iOS app prior to ARC, I remember how painful it was and how careful you had to be.
It's only painful if you don't know how to do it. After I learned exactly what to do, it's just a pattern and one that I liked, and still like to have control over. I can choose when things are stored in memory and when to remove them. Maybe I'm just a little sick in the head, but I sort of like that level of control.
It's only painful if you don't know how to do it. After I learned exactly what to do, it's just a pattern and one that I liked, and still like to have control over. I can choose when things are stored in memory and when to remove them. Maybe I'm just a little sick in the head, but I sort of like that level of control.
Well, unless someone is programming an embedded system, a driver, or an OS, I don't think a programmer should have this kind of control. The OS should take care of that. I come from a time where we really had to take care of everything. I believe these days are better.
Halle-freaking-lujah!
I have gone back and forth with a major developer - one who should know better and has deep pockets - about their rather mission-critical app crashing, with reports pointing to poorly implemented garbage collection. Hopefully this mandate will solve this problem.
I can't help but wonder why the next version won't be called OS X 11.0?
Generally, software versions go x.8, x.9, y.0, y.1
Generally x.0 of anything is to be avoided for production use. (x.0.2 and beyond are usually safe).
Oops! Ok, I get it, mind over marketing. Nevermind.
No they don't. They go however the developer wants them to go.
Here are a few examples from my Mac:
Objective-C with GC didn't work out well. It was buggy, and it never quite managed to do what it was supposed to do.
Most apps that tried using it abandoned it, and those who had luck with it should have moved on to ARC a long, long time ago.
So this is merely a notice to a very, very small percentage of app devs: it's time to move on.
I can't help but wonder why the next version won't be called OS X 11.0?
Generally, software versions go x.8, x.9, y.0, y.1
Generally x.0 of anything is to be avoided for production use. (x.0.2 and beyond are usually safe).
Oops! Ok, I get it, mind over marketing. Nevermind.
Really? This again. I feel every year we go through this discussion.
Indeed.
You'd have thought that 10.10 would have logically been 11.0, but no, Apple had to go and upset the applecart.
Perhaps we'll see a return to logic this year with OS 10.11.0.
Is this related to the issue of self-installed, non-Apple SSDs in older Macs not supporting TRIM? OS X 10.10 removed the ability to use third-party TRIM utilities without a hack, as I recall.
Oh look, my Mac's clock went from 10:09 a.m. to 10:10 a.m.; it didn't go to 11:00 a.m.! There goes another applecart! Gosh, it seems not all things progress in base-10 increments, who'd have thunk it, eh?
Hopefully you are too young to have been around to use the pre-decimal currency in the UK. Your applecart would have been screwed up big time. Try adding one penny to nine pence, or one shilling to nine shillings.
ARC inserts retain/release calls at compile time, so there is no searching the stack at runtime. ARC is generally more efficient than MRR code, because it can discard objects as soon as they are unused. Really tight MRR code can be as fast, but it's hard to get it optimal by hand, whereas with ARC the compiler can handle memory management and optimize the heck out of it.
It's ok :S
I'm a freak who really loved manually managing memory.
No you don't. A decade ago was February 2005. Apple didn't even announce the Intel switch until June 2005, and the first Intel-compatible Mac apps weren't released until well after that. Any Mac app from earlier than 2005 would have been PowerPC-only, which won't work in any OS X version later than 10.7, since that version removed the Rosetta environment. So unless you're referring to Java applets or apps written for some similar third-party VM and not linked directly against Apple's frameworks, or you're still running OS X 10.6.x, then no, you do not have Mac apps from over a decade ago that still work.
In theory it can discard objects as soon as they are unused. In practice, it can't, because of things like this:
The 'data' variable is never used after the third line, so in theory the object could be released immediately after that line. However, the fact that 'bytes' refers to the internal storage of 'data' means that if that were done, the loop that follows might either process garbage data or crash. For this reason, in practice ARC will usually just insert the release when the object goes out of scope, rather than when it is no longer being referenced.
The older GC mechanism did work the way you describe, which is part of why it was so unstable. In situations like the one above, it could release 'data' before the program was done with its bytes, and that led to all kinds of nasty stuff happening.
Where ARC can be more efficient than MRC is in the reduction of autoreleased objects. Many methods that would normally return an autoreleased object now check up the stack to see if the caller is also using ARC, and if it is, they just return the object without doing any retain/release/autorelease dance around it. However, this efficiency is offset by the fact that ARC inserts a lot of extraneous retains and releases that would not be present in MRC code, and also by the fact that, despite being more efficient than a retain and autorelease, that check up the stack has some performance cost of its own. In the final tally, ARC usually comes out slightly less performant than MRC code, but the difference is slight enough that it's a decent compromise, especially since the option of using MRC (and especially dropping down to straight C) is still there when it's needed.
It's ok :S
I'm a freak who really loved manually managing memory.
I know what you mean, I used to be kind of the same, and relished the challenge. Some people really loved (still love?) writing assembly by hand. It's not necessarily bad, but there are more productive ways to spend one's limited programming time and effort. ;-)
Originally Posted by Durandal1707
(1) That hasn't been true since Leopard came out. The new Apple is always removing things that older apps need to work, whether it's Classic (10.5), Rosetta (10.7), or 32-bit support and Carbon (almost assuredly 10.11 or 10.12). It irritates me, since it often kills old games that I enjoyed, but that's Apple's philosophy these days. Out with the old. Anyone who expects the GC, which was never even that widely used anyway and has had a clear migration path for ages, to stick around is deluding him/herself.
(2) No you don't. A decade ago was February 2005. Apple didn't even announce the Intel switch until June 2005, and the first Intel-compatible Mac apps weren't released until well after that. Any Mac app from earlier than 2005 would have been PowerPC-only, which won't work in any OS X version later than 10.7, since that version removed the Rosetta environment. So unless you're referring to Java applets or apps written for some similar third-party VM and not linked directly against Apple's frameworks, or you're still running OS X 10.6.x, then no, you do not have Mac apps from over a decade ago that still work.
(3) In theory it can discard objects as soon as they are unused. In practice, it can't, because of things like this:
The 'data' variable is never used after the third line, so in theory the object could be released immediately after that line. However, the fact that 'bytes' refers to the internal storage of 'data' means that if that were done, the loop that follows might either process garbage data or crash. For this reason, in practice ARC will usually just insert the release when the object goes out of scope, rather than when it is no longer being referenced.
The older GC mechanism did work the way you describe, which is part of why it was so unstable. In situations like the one above, it could release 'data' before the program was done with its bytes, and that led to all kinds of nasty stuff happening.
Where ARC can be more efficient than MRC is in the reduction of autoreleased objects. Many methods that would normally return an autoreleased object now check up the stack to see if the caller is also using ARC, and if it is, they just return the object without doing any retain/release/autorelease dance around it. However, this efficiency is offset by the fact that ARC inserts a lot of extraneous retains and releases that would not be present in MRC code, and also by the fact that, despite being more efficient than a retain and autorelease, that check up the stack has some performance cost of its own. In the final tally, ARC usually comes out slightly less performant than MRC code, but the difference is slight enough that it's a decent compromise, especially since the option of using MRC (and especially dropping down to straight C) is still there when it's needed.
(1) Some old technologies are specifically removed, it's true. But Apple maintains a list of widely used apps and tests OS changes against them. Still, there are always limits. Apps that are using Cocoa and reasonably modern API continue to work for a long time. If Apple were to *never* remove old stuff, progress on new stuff would be hamstrung due to the increased maintenance cost.
(2) Okay, so *roughly* a decade, anything updated post-Intel. I still play Unreal Tournament 2004 with friends, even on the latest unreleased OS X. Although honestly, apps that old get so little use overall. I know they're near and dear to the people who still use them, but in the grand scheme of things, they are probably a tiny minority.
(3) I know the limitations; I've been using ARC since before it was publicly announced. ;-) When I said "unused" I didn't just mean that the symbol wasn't directly referenced in code. Also, much of the retain/release overhead has been mitigated by the greatly increased speed of calling those methods, due to runtime wizardry under the hood. In my experience, performance is generally pretty much the same between ARC and the *best* MRR code, but most MRR code is not the best, and contains bugs like extra/missing retain/release calls.
The searching up the stack is done at compile time (pre process time).
The headline is also incorrect, since ARC is only required for GC apps, not manual apps.
Incorrect. This has been discussed ad nauseam, but the points in software versioning are separators, not a decimal data type. For example, popular component developer Telerik includes the date in its versioning scheme: "2014.4.1209" for December 9; the next release could be "2015.1.0411" for April 11.
somebody's not a software developer....do you even use computers? better leave this topic to the big boys. the versioning parts are numeric but do not indicate a strict decimal data type. you can parcel them however you wish, according to whatever schema is used by your org.
Are computers just sold to developers now?
No, but only software developers do software development.
You don't have to be a SW developer to be able to think rationally. Nobody should think version numbers 10.43.657(b), IP address 192.168.0.1, or the date 2015.02.21 are impossible.