Apple's other open secret: the LLVM Compiler


Comments

  • Reply 41 of 51
    charless Posts: 301 member
    Quote:
    Originally Posted by AppleInsider View Post


    There's a lot of unpronounceable words in all capital letters in that paragraph, LOLBBQ. Let's pull a few out and define them until the synopsis starts making sense.



    LOLBBQ? Seriously?
  • Reply 42 of 51
    Quote:
    Originally Posted by auxio View Post


    From my personal experience, it's usually good to have a small core of "architects" who are very knowledgeable and passionate developers. Often this is only one or two people. I believe this is what mdriftmeyer is talking about.



    Sure, you'll have a lot of other folks involved: graphic designers, UI designers, and other developers. But it's this small, dedicated core group that dictates the main structure of the software and ensures that new features are well thought out and added properly, not just thrown in haphazardly. If you don't have this core group and just have people working in their own directions, you quickly end up with an unmaintainable pile of steaming code.



    So while I agree that no one person can create a large piece of software, I also believe that you need a small group of dedicated people to ensure things are structured properly. These people are usually responsible for the majority of the code in the project (either directly or indirectly).



    I think that history proves you right. Having observed the "computer" industry for 52 years (since 1956), it seems that the best and/or breakthrough systems (software and hardware) are developed by a handful of people -- here are two examples:



    -The Apple Computer (hardware, BASIC, Assembler): 3 People

    -The Mac Computer (Hardware, OS, GUI, MacWrite, MacDraw): 5-7 people



    I am sure that you and I can come up with many more examples.



    Small teams appear to be more effective pioneers than large teams.



    There are, likely, many reasons for this. One significant reason is the ability of a small team to communicate and interact with each other. Often the small team is in the same room or adjoining cubicles. A new problem or required/desired feature can often be discovered, prototyped, and resolved in a matter of minutes or hours.



    A larger, more formal organization is often spread out over several buildings (or even time zones). Each group has responsibility for a certain portion of the project. When interaction is required, you must set up a meeting, resolve scheduling conflicts, and often have a pre-meeting to establish the group's position on the new issue... often this requires additional background research and documentation (just to get our ducks in a row)... yadda, yadda, yadda.



    There is some number of people, say 5, where interaction and communication are optimal for performing the tasks necessary to build the project. Too many people means more time is spent interacting and communicating than doing actual work.
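
    Fred Brooks put a number on this in The Mythical Man-Month: n people have n(n-1)/2 possible communication paths, so a team of 5 has just 10 paths to manage, while a team of 50 has 1,225. The coordination burden grows with the square of the head count, while the capacity for actual work grows only linearly.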



    The optimal group interacts/communicates something like this:



    Steve: "Bill, I think we heed to change the...".

    Bill: "I talked with Andy about that and he thinks..."

    Andy: "Yeah, I looked into it and it will make the program bigger and slower...".

    Bill: "What we really ought to do is..."

    Andy: "Now, That sounds good! I can throw something together..."

    Steve: "OK, great work guys".



    Direction, Shared Goals, Talent and Trust!



    Oversimplification? Maybe to some extent! But if you have ever been involved in a small-team project, interaction like that is commonplace. Everyone is on the same wavelength and readily available. Issues are resolved with a minimum of communication. Time is spent getting the job done. Little time is spent on (unnecessary) emails, phone calls, establishing a position, research, pre-meetings, meetings, etc.



    This is not to say that there is no need for formal interaction & communication, but there is a time and place for that.
  • Reply 43 of 51
    Quote:
    Originally Posted by merdhead View Post


    You obviously haven't written any software professionally, or worked for a large development team.



    You can't come off as a genius by overgeneralizing. Like engineers, let's break down your arguments for easier problem solving.



    Quote:

    A large set of APIs makes software easier to write, not harder. I can get the gist of thousands of API calls and look up what I need as I need it. It radically reduces the amount of software I have to write.



    That would only be true if you are working with APIs that are designed to be compatible with each other. Don't get it? Try developing cross-platform applications using Win32, Cocoa, and GTK if you'd like first-hand experience of what I mean. Or better yet, a game that uses both DirectX and OpenGL.
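
    For anyone who hasn't felt that pain, here's a minimal C sketch of the kind of shim you end up writing just to show a single message portably. The Win32 and GTK calls are real; the wrapper itself is hypothetical, and Cocoa doesn't even appear because it would drag in Objective-C:

        /* msgbox.c -- a hypothetical portability shim: one trivial task, incompatible APIs */
        #if defined(_WIN32)
          #include <windows.h>
          void show_message(const char *text) {
              /* Win32: a single call, but its own window, string, and type conventions */
              MessageBoxA(NULL, text, "Notice", MB_OK);
          }
        #elif defined(USE_GTK)
          #include <gtk/gtk.h>
          void show_message(const char *text) {
              /* GTK: widget-based, needs explicit dialog creation and destruction */
              GtkWidget *dlg = gtk_message_dialog_new(NULL, GTK_DIALOG_MODAL,
                                                      GTK_MESSAGE_INFO, GTK_BUTTONS_OK,
                                                      "%s", text);
              gtk_dialog_run(GTK_DIALOG(dlg));
              gtk_widget_destroy(dlg);
          }
        #else
          #include <stdio.h>
          void show_message(const char *text) {
              /* Fallback: no native toolkit assumed */
              printf("%s\n", text);
          }
        #endif

    Multiply that by every window, control, and event in a real application, and "thousands of API calls" stops being a convenience.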



    Quote:

    Software is harder to write the more people you add, not easier. It's fine if you have a concrete spec which has long been frozen and doesn't change very much, and you know exactly what you are writing (e.g., the Linux kernel). But in every other case (and most interesting cases) it means more coordination, more communication, and less freedom to act. Large software projects (in the billions) have been known to collapse under their own weight (read "The Mythical Man-Month"). Most good software was written by individuals. See Microsoft products for what happens with large teams (I might add that the Mac versions of their Office suite used to be developed by small teams).



    Really? So it only takes a few people to write device drivers, filesystem drivers, and protocol stacks for all the common devices, filesystems, and protocols out there? Well, I guess all the shareware programmers are going to pull a complete modern OS and office productivity suite out of their asses pretty soon.



    Quote:

    Programs get more complex over time because of more developers and other people getting involved. Writing a program yourself, even a big one, means you have a much better chance of it working and of it being elegant and efficient.



    Show me a software/computer engineer who can design UIs that are both usable and ergonomic and I'll buy this BS. I've come across far too many uber software engineers who have failed usability surveys, only to realize that a good software engineer does not equal a usability expert.





    Quote:

    Finally, computers are not any more complex today than they were 10 or 20 years ago; they just have more capacity. In most senses they are simpler, because as a programmer you can harness that capacity. You don't have to hand-code in assembler to get things to run at a reasonable speed, for instance. Computer software hasn't changed appreciably since 1984. Most good computer science ideas were invented in the '60s and '70s (and maybe early '80s); it's largely been a process of refinement ever since. Here's a hint for the morons: there is no new computer science in Facebook.



    GASP! You mean multi-core processors and fibre-channel drives have existed since 1984? This is a conspiracy, I tells ya. What's the deal with the Sinclair ZX81 that I used when I was 4?



    Here is a hint for the uber narcissist computer science engineers: Computer science did not invent ATMs. Dumb ass!



    -Ad
  • Reply 44 of 51
    Quote:
    Originally Posted by Dick Applebaum View Post


    I think that history proves you right. Having observed the "computer" industry for 52 years (since 1956), it seems that the best and/or breakthrough systems (software and hardware) are developed by a handful of people -- here are two examples:



    -The Apple Computer (hardware, BASIC, Assembler): 3 People

    -The Mac Computer (Hardware, OS, GUI, MacWrite, MacDraw): 5-7 people



    I am sure that you and I can come up with many more examples.



    Small teams appear to be more effective pioneers than large teams.



    There are, likely, many reasons for this. One significant reason is the ability of a small team to communicate and interact with each other. Often the small team is in the same room or adjoining cubicles. A new problem or required/desired feature can often be discovered, prototyped, and resolved in a matter of minutes or hours.



    A larger, more formal organization is often spread out over several buildings (or even time zones). Each group has responsibility for a certain portion of the project. When interaction is required, you must set up a meeting, resolve scheduling conflicts, and often have a pre-meeting to establish the group's position on the new issue... often this requires additional background research and documentation (just to get our ducks in a row)... yadda, yadda, yadda.



    There is some number of people, say 5, where interaction and communication are optimal for performing the tasks necessary to build the project. Too many people means more time is spent interacting and communicating than doing actual work.



    The optimal group interacts/communicates something like this:



    Steve: "Bill, I think we heed to change the...".

    Bill: "I talked with Andy about that and he thinks..."

    Andy: "Yeah, I looked into it and it will make the program bigger and slower...".

    Bill: "What we really ought to do is..."

    Andy: "Now, That sounds good! I can throw something together..."

    Steve: "OK, great work guys".



    Direction, Shared Goals, Talent and Trust!



    Oversimplification? Maybe to some extent! But if you have ever been involved in a small-team project, interaction like that is commonplace. Everyone is on the same wavelength and readily available. Issues are resolved with a minimum of communication. Time is spent getting the job done. Little time is spent on (unnecessary) emails, phone calls, establishing a position, research, pre-meetings, meetings, etc.



    This is not to say that there is no need for formal interaction & communication, but there is a time and place for that.



    I've used an early Apollo mainframe, and the developers behind it were a team of only 3. We had to connect to it over a serial terminal, and the only thing you would see was a cursor at the top and a command line at the bottom. It was a text-based UI with no internationalization, only ASCII. The only way to use it was to know a combination of commands that could be looked up in a 4 lb manual.



    It was used for many things, from word processing to storing employee attendance records. I'm glad those days are over.



    The problems you are raising are related to software development and delivery processes. Many organizations tend to build many such processes out of need at a certain point in time or due to a fad, but do little to optimize them for efficiency. That is fundamentally the problem, not development team head counts.



    -Ad
  • Reply 45 of 51
    merdhead Posts: 587 member
    Quote:
    Originally Posted by adkiller View Post


    You can't come off as a genius by overgeneralizing. Like engineers, let's break down your arguments for easier problem solving.







    That would only be true if you are working with APIs that are designed to be compatible with each other. Don't get it? Try developing cross-platform applications using Win32, Cocoa, and GTK if you'd like first-hand experience of what I mean. Or better yet, a game that uses both DirectX and OpenGL.







    Really? So it only takes a few people to write device drivers, filesystem drivers, and protocol stacks for all the common devices, filesystems, and protocols out there? Well, I guess all the shareware programmers are going to pull a complete modern OS and office productivity suite out of their asses pretty soon.







    Show me a software/computer engineer who can design UIs that are both usable and ergonomic and I'll buy this BS. I've come across far too many uber software engineers who have failed usability surveys, only to realize that a good software engineer does not equal a usability expert.









    GASP! You mean multi-core processors and fibre-channel drives have existed since 1984? This is a conspiracy, I tells ya. What's the deal with the Sinclair ZX81 that I used when I was 4?



    Here is a hint for the uber narcissist computer science engineers: Computer science did not invent ATMs. Dumb ass!



    -Ad



    Wow, you really don't know what you're talking about, do you? So you resort to abuse. Most of your post is rubbish, as your final point illustrates. Without computer science, ATMs wouldn't be possible, let alone pocket calculators.



    So what are you telling us? You have no analytical skills, so you just hack until it's "good enough"? Until it runs a couple of times? You should go work for Microsoft; I hear they're hiring.



    Or maybe you should just find yourself a good therapist and stop wasting our bandwidth. Maybe when you are better and well medicated you can participate in a rational discussion.
  • Reply 46 of 51
    kim kap sol Posts: 2,987 member
    It's kinda hard to take a guy that calls himself "shithead" seriously.
  • Reply 47 of 51
    mdriftmeyer Posts: 7,503 member
    FYI: The Openstep Specification was certified in 1994. It's the foundation for OS X and covers a large portion of the APIs. It's been extended, but it's the same damn spec.



    The idea that NeXTSTEP/Openstep is far less advanced than OS X is laughable. What's changed is the inevitable progress on top of foundations decades old and refined by small groups of bright minds.



    Linux has its benevolent dictator and second-in-command, and about 90% of the code is in the hands of no more than two dozen individuals.



    I could go on, but as others have chimed in, the smaller, more agile teams always create the revolutionary stages of computing, whereas the larger teams are lucky to create the evolutionary stages. It's often by copying portions of solutions from those small groups that the large teams reach those stages at all.



    If you think there was a large team designing ZFS you're freakin' nuts.



    The same goes for ReiserFS/4, ext3/ext4, XFS, JFS, NFS, UFS, HFS/HFS+, et al.
  • Reply 48 of 51
    Quote:
    Originally Posted by merdhead View Post


    Wow, you really don't know what you're talking about, do you? So you resort to abuse. Most of your post is rubbish, as your final point illustrates. Without computer science, ATMs wouldn't be possible, let alone pocket calculators.



    Tsk, tsk, sarcasm != abuse. That should be simple for you to understand.



    Luther George Simjian was not a computer scientist. The closest equivalent would be an electronics engineer/scientist. And no, he did not know machine language or OS fundamentals either. Being the expert computer scientist you are, I'm sure you can google him. Let me know if you need help with that.



    By the way, an ATM uses much simpler logic than a pocket calculator.



    Quote:

    So what are you telling us? You have no analytical skills, so you just hack until it's "good enough"? Until it runs a couple of times? You should go work for Microsoft; I hear they're hiring.



    Like a professional engineer, try rebutting my comments. In case you have been living in a box all this while, people who are not computer engineers have analytical skills too.



    If you have an OS that'll put an end to Windows, please publish it. Why hold back?



    Quote:

    Or maybe you should just find yourself a good therapist and stop wasting our bandwidth. Maybe when you are better and well medicated you can participate in a rational discussion.



    I'll do that only if you start taking your medication first; then both of us can have a rational discussion together.



    -Ad
  • Reply 49 of 51
    Quote:
    Originally Posted by mdriftmeyer View Post


    FYI: The Openstep Specification was certified in 1994. It's the foundation for OS X and covers a large portion of the APIs. It's been extended, but it's the same damn spec.



    The idea that NeXTSTEP/Openstep is far less advanced than OS X is laughable. What's changed is the inevitable progress on top of foundations decades old and refined by small groups of bright minds.



    Linux has its benevolent dictator and second-in-command, and about 90% of the code is in the hands of no more than two dozen individuals.



    I could go on, but as others have chimed in, the smaller, more agile teams always create the revolutionary stages of computing, whereas the larger teams are lucky to create the evolutionary stages. It's often by copying portions of solutions from those small groups that the large teams reach those stages at all.



    If you think there was a large team designing ZFS you're freakin' nuts.



    The same goes for ReiserFS/4, ext3/ext4, XFS, JFS, NFS, UFS, HFS/HFS+, et al.



    Agreed; however, all of these are mere components of an OS. A component of an OS is nothing revolutionary if the supporting subsystems do not exist.



    I recall RMS being indignant about distributions referring to themselves as Linux and not as GNU/Linux operating systems. This was because Linus only provided the kernel; the system libraries, compiler, and user-land tools were contributed by many individuals from the FSF.



    There are far more revolutionary designs than XNU and OpenSTEP. L4 and DROPS come to mind, and these are pretty recent innovations. But unlike XNU and OpenSTEP, these implementations do not have an established base of users and developers.



    I believe that if these newer technologies and designs were given the same financial support and installed base, we would be commonly using them today.



    -Ad
  • Reply 50 of 51
    gastroboy Posts: 530 member
    Quote:
    Originally Posted by exscape View Post


    "The Code Generator phase then takes the optimized code and maps it to the output processor, resulting in assembly language code which is no longer human readable."



    Assembly isn't human readable? I guess that makes me a machine.



    Reminds me of Peter Cook's joke that C. P. Snow must be a robot, because he'd never seen anybody touch him with wet hands.
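
    For what it's worth, here's roughly what that "no longer human readable" output looks like for a trivial C function. This is a sketch; the exact assembly varies by target, optimization level, and compiler version:

        /* square.c -- try: clang -O2 -S square.c -o square.s */
        int square(int x) {
            return x * x;
        }

        /* Approximate x86-64 contents of square.s -- perfectly readable:
         *
         * square:
         *     movl   %edi, %eax    # copy the argument into the return register
         *     imull  %edi, %eax    # multiply it by itself
         *     retq                 # return to the caller
         */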
  • Reply 51 of 51
    vinea Posts: 5,585 member
    Quote:
    Originally Posted by mdriftmeyer View Post


    You're overlooking people with actual Engineering field credentials. Everyone I worked with at NeXT Engineering knew everything from the basics of kernel programming to the userspace APIs, and how ObjC and the Mach microkernel worked together. It was a small, deeply knowledgeable set of geniuses who actually loved to help other engineers learn and grow along the way.



    Mmm...there's knowing the fundamentals, and there's knowing every API and compiler tweak in excruciating detail. Folks specialize and depend on each other for the minute details.



    Not everyone is going to be the equivalent of a Lippman or a Meyers. Nor do you want them to be, because then you have no Tognazzinis or Raskins.



    Quote:

    Sorry, but if companies would hire actual individuals holding Engineering degrees, they would be better served.



    Arguably CS degrees aren't engineering degrees. I dunno if that was your point or not.



    Quote:

    Accredited universities in CS tend to have the following:



    Course sequences allowing students to specialize in specific areas such as:
    • computer graphics,

    • computer networking,

    • computer systems software,

    • software engineering, or

    • computer engineering.

    Of course, I'd recommend companies first hire individuals who've worked for operating system companies or within operating system projects [Linux, RiscOS, Haiku, et al.], before hiring the fresh grad who knows Flash, JavaScript, and Photoshop.



    Mmmm...perhaps. But working on an operating system does not equate to being a kernel hacker, as you seem to imply. Folks work at different levels of abstraction. Being good at one does not imply being good at another.



    This is why UIs designed by programmers suck ass.



    Quote:

    What makes the Software Industry think it can actually get away with it and not feel the ripples along the way?



    Because we've been wildly successful at it? More so than other engineering endeavors?



    Quote:

    One problem is that we are driven by 6-month cycles to bring out the next best thing before figuring out if it's got a foundation worth building on.



    What you cite as a problem is one of our greatest strengths. It's also somewhat untrue: software technology has roughly a 10-year adoption time from inception to mainstream use, as in most industries.



    Belief that we are agile is almost as good as being agile since resistance to change is mostly mental. But developers are still human.



    Quote:

    There is a reason C is still vital and FORTRAN is critical in many Engineering applications, especially Numerical Analysis -- there are decades of refinement and design behind them.



    Yes and no. There are also decades of legacy code behind them. FORTRAN exists only as a niche language today, and arguably MATLAB is used more than FORTRAN for numerical analysis.
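
    Though to be fair, the kind of numerical kernel being argued over here is tiny and language-agnostic. A minimal Newton's method sketch in C, illustrative only and not taken from any particular library:

        #include <math.h>
        #include <stdio.h>

        /* Newton's method for sqrt(a): iterate x <- (x + a/x) / 2 until it converges. */
        static double newton_sqrt(double a) {
            double x = a > 1.0 ? a : 1.0;          /* crude but safe initial guess */
            for (int i = 0; i < 50; i++) {
                double next = 0.5 * (x + a / x);
                if (fabs(next - x) < 1e-12 * x)    /* relative convergence test */
                    return next;
                x = next;
            }
            return x;
        }

        int main(void) {
            printf("%.12f\n", newton_sqrt(2.0));   /* prints ~1.414213562373 */
            return 0;
        }

    Routines like this have lived essentially unchanged in FORTRAN libraries for decades -- which is the "refinement" being claimed, and also exactly why MATLAB could wrap those libraries and take over the workflow.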



    Quote:

    The dubious C programming language is so deeply ingrained in computer graphics, compilers, and kernels -- in every aspect of programming, where it shares duties with C++, ObjC, Java, Fortran, Ada, Pascal, Eiffel, and more -- that it's clear Python, Ruby, JavaScript, the many ML functional languages, and the rest are just one big family serving to solve our needs.



    The goal isn't to learn their languages and syntax but to learn the mathematical approach to designing solutions [applications, services, classes, methods, delegates, etc.] that programmatically solve problems over data sets so large that regular symbolic mathematics would only work if we had a million years to crunch it out.



    Meh. The goal is to make solutions that work well enough to get the job done. A mathematical approach is nothing more than another tool in the toolbox, just like languages and syntax.



    Quote:

    However, the more deeply knowledgeable one is in the mathematical theories, the smaller the solutions and the more rapidly they will surface; thus picking the right tool for the job is the road to completion.



    Bullshit.



    Quote:

    That takes years of learned skill development, which many industries stopped doing over a decade ago. They want temporary solutions that mainly drive entertainment revenue streams.



    Industry jumped on the Web bandwagon and duct-taped it together.



    And amazingly, it works without the required mathematical rigor.



    Quote:

    An example of this is the Apache 1 and 2 web servers. Great products within their original design scopes. Unfortunately, they've had to be bloated to adapt to each new web technology that comes along and gets "popular" with the business markets.



    The law of diminishing returns eventually rears its ugly head, and that's when people start to look around for those who realized the endgame would require the knowledge and general scope to solve this dilemma.



    At which point we refactor OR some less crufty code base comes along to build on.



    Quote:

    One such example is the next generation, Apache 3.0.



    Go read up on what Apache 3.0 will do for the Web. It's a complete rewrite and guts out about 90% of the crap thrown in there from past ideas that made no sense.



    Yes. This is how it works.