I'm not sure whether it was your intention or not, but you make it sound like open source is foreign to Apple. Frankly, I think many in the open source world are glad Apple is around. It is not "again" but just another facet we are seeing here of Apple's continued development in this regard.
I'm not sure how you got that idea, since "once again" refers to the fact that they have done open source projects before and are still doing them.
Apple picks what technologies they will use, whether open source or home grown. WebKit, MySQL, and Apache are some existing projects that they've joined and contributed to with varying degrees of contribution. Bonjour, CUPS (Apple hired the developer), and now GCD are technologies Apple created and has sent to the open source community as new projects. I am just glad to see they have new projects going out.
GCD is a technology that sits at the kernel level, so it is critical to the operation of the OS that it works extremely well. My hope is that the open source community (BSD and Linux) grabs hold of GCD and incorporates it. While doing that, they may find and fix bugs and/or improve the code, and Apple will benefit from that. Heck, it might even benefit Microsoft if they review the code.
My company has many developers around the country and the world who contribute code to open source projects. We are beneficiaries of open source, since our Oracle instances run on the Oracle Linux distro. I'm a DBA, so I'm glad to see the community working for everyone.
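Since GCD itself keeps coming up, here is a minimal sketch of what its userland side looks like to a programmer, using the public libdispatch API (dispatch_async and friends). The semaphore and the printed message are purely illustrative scaffolding, and it assumes a blocks-capable compiler such as clang with -fblocks:

    #include <dispatch/dispatch.h>
    #include <stdio.h>

    int main(void) {
        /* A semaphore lets this plain C program wait for the async work to finish. */
        dispatch_semaphore_t done = dispatch_semaphore_create(0);
        dispatch_queue_t q = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

        /* Hand a block to the system-managed thread pool; no explicit pthreads. */
        dispatch_async(q, ^{
            printf("running on a background thread\n");
            dispatch_semaphore_signal(done);
        });

        dispatch_semaphore_wait(done, DISPATCH_TIME_FOREVER);
        dispatch_release(done); /* manual retain/release applies in plain C */
        return 0;
    }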
Speaking of which, why are the AI comments all catty and obnoxious?
because this is AI.
check all the various threads, or maybe you already have. catty, obnoxious, bitchy... that's pretty much the way it is. the s/n is not as high as in other forums.
moderation of moderation makes for lots of squabbles and infighting. but if you can wade through the chaff there is some knowledge to be gained. mostly i just come here for the articles, not the comments.
It'll take maybe 3-5 years before this is properly leveraged in mainstream apps. That's when your 16-core MacBook will really sing.
Can't wait!
Quote:
Originally Posted by MacGui
because this is AI.
check all the various threads, or maybe you already have. catty, obnoxious, bitchy... that's pretty much the way it is. the s/n is not as high as in other forums.
moderation of moderation makes for lots of squabbles and infighting. but if you can wade through the chaff there is some knowledge to be gained. mostly i just come here for the articles, not the comments.
At least this thread is filled with more reflection and less aggression. I can barely program "Hello, world" but I love following (as best I can) what's really happening under the hood.
GCD is a technology that sits at the kernel level, so it is critical to the operation of the OS that it works extremely well. My hope is that the open source community (BSD and Linux) grabs hold of GCD and incorporates it. While doing that, they may find and fix bugs and/or improve the code, and Apple will benefit from that. Heck, it might even benefit Microsoft if they review the code.
MS has had blocks for a while. The funny thing is that MS could use GCD while Linux could not.
I just have to say: that is a beautiful and informative diagram!
You might also consider Blue Box in Rhapsody - and Classic in OS X - to be at least spiritual descendants of Apple's (forgotten, but rather interesting) Macintosh Application Environment (MAE), which was a virtual Mac OS for AIX/Solaris/HP-UX, and which was in turn a descendant of A/UX's virtual Mac environment:
A/UX startmac -> MAE -> Rhapsody Blue Box -> Mac OS X Classic
I don't think the standard kind of benchmark applies here, i.e. the benchmark where we measure a computer's speed with and without the feature. If I understand GCD correctly, the main benefit will be measured in developer-time, not CPU-time. For example, a programmer who's already proficient in GCD may be able to create software that exploits multiple cores in a matter of days, whereas it might take months for a programmer without this tool to exploit multiple cores. But both solutions would run equally fast after they're debugged; it's just that GCD makes it much cheaper to debug.
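To make the developer-time point concrete, here is roughly what parallelizing a loop looks like with GCD's dispatch_apply. This is a sketch only: the squaring is a stand-in for real per-item work, and since blocks can't capture a local array directly, a pointer is captured instead:

    #include <dispatch/dispatch.h>
    #include <stdio.h>

    #define COUNT 8

    int main(void) {
        int results[COUNT];
        int *out = results; /* blocks capture this pointer by value */
        dispatch_queue_t q = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

        /* dispatch_apply spreads the iterations across the available cores
           and blocks until all of them have completed. */
        dispatch_apply(COUNT, q, ^(size_t i) {
            out[i] = (int)(i * i); /* stand-in for real per-item work */
        });

        for (int i = 0; i < COUNT; i++)
            printf("results[%d] = %d\n", i, out[i]);
        return 0;
    }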
Ah, so wait. What we should benchmark is an app with and without GCD compiled into its code, benchmarked on the two versions of BSD, to see any speed advantage that incorporating the GCD code into the app, and running it on a GCD-enabled OS, may have? That's what I would like to see.
Dave
You really cherry-picked that one, eh? I watched the show, and the general consensus seemed to be that 3.1.2 fixes issues, but definitely not all of them. It specifically didn't fix the WiFi issue for the vast majority of people who have it. Besides, I don't need to read comments to know I own two iPhones with 3.1.2 and WiFi issues on both of them.
My WiFi is flawless; love my iPhone. It isn't perfect, but it beats the pants off my friend's BlackBerry.
Apple could choose whatever license they liked since it's their code, which left me wondering whether it could be Apple's intention not to have GCD on Linux. Apple could have chosen a BSD license, and then nothing would have prevented GCD from being used directly in the Linux kernel.
However, since the BSDs and Linux don't have similar kernel internals (unlike Apple's Darwin and FreeBSD, which do), maybe this is irrelevant: there would have to be a completely new implementation of the kernel components on Linux anyway, with no code copying, which would mean the license of Apple's GCD does not matter for a Linux port of the kernel components.
When Apple first made the announcement to open-source GCD, I was interested in the impact its license would have on GPL'ed open-source operating systems, so I spent a little time reading up on it.
It seemed to me at the time that Apple had implemented GCD as two distinct components which worked with each other: the user-land library, and the underlying kernel support required by that library.
It is reasonable to predict that the implementation of kernel-level support for Linux would have to be substantially different from OS X's or FreeBSD's due to the different kernel architectures. It seems to me that there could be little, if any, code duplication for technological reasons, rendering the license limitation somewhat moot.
But under the system Apple seemed to have deployed, application developers probably wouldn't be encouraged to work directly with the kernel's support layer anyway -- they'd be linking against the userland library, whose job would be to act as an intermediate, platform-agnostic wrapper between the kernel and the application.
As for the userland library, it is a distinct component from the kernel. The Linux kernel is released under a custom version of the GPL which makes it clear that the kernel's license does not extend to any userland programs or libraries which may run on top of the kernel. So duplicating code from the userland library to run on Linux probably wouldn't be too much of a problem.
However, just as is the case on every other platform for which GCD is deployed, it would be up to each of the open-source third-party application developers to determine whether or not their own application's license permits them to legally link against the userland GCD library.
I hope this technology catches on across all of Linux land. It will help us get more bang for the buck from our hardware investments.
Yeah, that's great, Apple. No, no, really good job.
Two independent language elements, awesome: the '__block' storage modifier and the new home-brew unary operator '^'.
Why not just write
typedef void (__block *pblock)();
The __block keyword marks storage that is shared with the blocks that reference it, rather than captured as a const copy. That is the first extension. The second extension, '^', is for declaring and creating block variables. The two extensions exist for different reasons.
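For what it's worth, here is a minimal sketch of the two extensions in action (compiles with clang -fblocks; on non-Apple platforms you would also link the blocks runtime):

    #include <stdio.h>

    int main(void) {
        __block int counter = 0; /* __block storage: writable by blocks that capture it */
        int step = 10;           /* ordinary local: captured as a const copy */

        void (^bump)(void) = ^{  /* '^' appears in both the block type and the literal */
            counter += step;
        };

        bump();
        bump();
        printf("%d\n", counter); /* prints 20 */
        return 0;
    }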
No "__block keyword", no "different reasons", lee-news-sea-noobies.
Your response makes no sense whatsoever to me. Care to expand? Perhaps in coherent sentences???
Not that much. This is not a response at all.
Well... OK. Remarks that contain only copied descriptions of Apple's C language extensions and empty qualifications like "made for different reasons" are useless bullshit.
People who are interested in the problem have already bothered to read the articles and look much more deeply into it. This is no longer about the difference between defines and keywords. Neither is it about calling conventions, storage modifiers, and namespaces.