Road to Snow Leopard: Twice the RAM, half the price, 64-bits


Comments

  • Reply 101 of 147
    sxt1 Posts: 11 member
    Quote:
    Originally Posted by ehych View Post


    I may be wrong, but I believe the minimum requirement is simply an Intel processor; it doesn't matter that the Core Duo is 32-bit, I'm fairly sure those machines are still supported. Look at this page:

    http://www.appleinsider.com/articles...erpc_macs.html



    Using a Core Duo processor means you don't get all the good 64-bit stuff in Snow Leopard, though there are plenty of other improvements besides 64-bit support.



    So this means:



    Stripping the PPC code base out of Mac OS X (Snow Leopard), but keeping both 32-bit and 64-bit support.
  • Reply 102 of 147
    Quote:
    Originally Posted by melgross View Post


    I don't believe that there won't be features that people will want.



    Even though Jobs did say that this isn't going to be a feature-upgrade release, I don't see anyone really believing that there won't be at least a few features arriving that we've already seen included in the OS but not enabled for the general user, such as resolution independence.



    Plenty of people will really want that. That's a minimum.



    You've only got to look at what they're adding in 10.6 Server to realise there are lots of new features in the OS X client as well.



    http://www.apple.com/server/macosx/snowleopard/
  • Reply 103 of 147
    Quote:
    Originally Posted by melgross View Post


    You will have to explain this.



    That was probably in reference to the performance-hurting cache-flushing phenomenon described in the article we're all commenting on in this thread. Apparently, in every version of OS X released for the Intel architecture to date, a certain cache needs to be flushed at least twice every time a system call is made, while such flushes only happen during context switches with a Windows NT kernel.



    As described in the article, it apparently has to do with the design decision Microsoft made to place a protected, shared 2GB kernel partition inside every process's virtual address space, at the expense of leaving each process with less usable memory for its own code and data. With OS X, on the other hand, each process needs to switch to a totally different virtual address space every time a system call is made, and the addresses actually used by the kernel and the application within their respective address spaces directly overlap.



    The article claims that with Snow Leopard, OS X is going to start doing the same thing as Windows - mapping a copy of the kernel into each 64-bit process's virtual address space. The kernel and the application have many terabytes to play with in each virtual address space, so avoiding overlap won't be a problem. And the trade-off of addressable memory per process versus performance will not be as severe as it has grown to become for 32-bit Windows.
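    As a rough back-of-the-envelope sketch (my own numbers, not from the article): on 32-bit Windows, the default 2GB/2GB split halves each process's usable address space, while the 48 virtual address bits that current x86-64 chips implement leave the kernel and the application a 128 TiB half each, so sharing one address space costs essentially nothing:

    ```c
    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* 32-bit Windows default split: a shared 2 GiB kernel region
           carved out of each process's 4 GiB virtual address space. */
        uint64_t va32   = 1ULL << 32;   /* 4 GiB total  */
        uint64_t kern32 = 2ULL << 30;   /* 2 GiB kernel */
        printf("32-bit user space left: %llu GiB\n",
               (unsigned long long)((va32 - kern32) >> 30));

        /* x86-64 chips of this era implement 48 virtual address bits,
           giving 256 TiB, split into two canonical 128 TiB halves. */
        uint64_t va48 = 1ULL << 48;
        printf("48-bit address space: %llu TiB\n",
               (unsigned long long)(va48 >> 40));
        return 0;
    }
    ```

    This prints 2 GiB for the 32-bit case and 256 TiB for the 48-bit case, which is why the trade-off melts away in 64-bit mode.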
  • Reply 104 of 147
    kaiwai Posts: 246 member
    Quote:
    Originally Posted by qualar View Post


    Those sort of comments make me ashamed to be a Mac User. What incorrect and arrogant rubbish.



    I'd agree - because it isn't just PC users, it is users in general. The average user is an idiot - "I need a computer to get on this internet thing - I don't know what the hell it is, but apparently I should have it" is the mentality of many people out there.
  • Reply 105 of 147
    kaiwai Posts: 246 member
    Quote:
    Originally Posted by qualar View Post


    Those sort of comments make me ashamed to be a Mac User. What incorrect and arrogant rubbish.



    Quote:
    Originally Posted by wbrasington View Post


    Ok, I give up.



    What's incorrect?



    The fact that you said 'PC users' instead of computer users in general - somehow claiming that Mac users were 'superior'.
  • Reply 106 of 147
    Quote:
    Originally Posted by melgross View Post


    While I'm not taking sides in this argument (I have enough problems of my own), didn't you say that you were in IT for 21+ years? You started when you were how old?



    I started working for them as a junior computer operator right out of high school at 17. So while I'm in my late 30s, I'm still only in my 30s - I don't think I can apply for a Medicare card yet...



    As for Vista, x64 or x86 for that matter, both work fairly well. I've had no real problems with either, but I will say x64 appears more stable now that I've been using it a while.



    As to your point about features dropped, I'm sure there have been several in Vista, but nothing I can really say I miss. In my opinion, both operating systems are getting closer and closer to having the same feel.



    For me, Leopard has been more of a nagging pain because its issues affect my work. The ones that have bugged me most are the network issues.



    I would say most of the problems with Vista in the beginning came from users installing it on, or buying, systems that simply could not handle the OS. If someone had a system with 1 GB of RAM, paging would go crazy and the system would take forever to start.
  • Reply 107 of 147
    Quote:
    Originally Posted by melgross View Post


    It's convincing when you understand it.



    So what you're saying is that you understand it, but no-one else does. It's just your little secret. Or maybe you just don't know what you're talking about.
  • Reply 108 of 147
    Quote:
    Originally Posted by merdhead View Post


    So what you're saying is that you understand it, but no-one else does. It's just your little secret. Or maybe you just don't know what you're talking about.



    I don't think it's a very well-kept secret at all.



    Consider a single thread of execution containing a tight loop performing iterative mathematical operations on a set of variables. On a traditional 32-bit x86, you basically have four general-purpose registers (*AX, *BX, *CX, and *DX) in which to hold those variables and perform the math. It is often possible to re-purpose the *SI and *DI registers for such data, and, with major caveats, sometimes (but not often) the *BP and even *SP registers. But that's still a fairly limited slate of registers for a single thread of execution to work with.



    If you have more variables to work with than will fit in the available registers, then some of those variables must necessarily be transferred back and forth to system RAM, in conjunction with the cache. Operations where one or more source and/or destination operands are located in RAM are slower than what could be accomplished if all sources and destinations were CPU registers.



    On an x64 CPU operating in long mode, you have eight more general-purpose registers available (R8 through R15), so it is possible to as much as triple the number of variables that a single thread of execution can keep track of before it needs to resort to RAM (and cache) transfers. Obviously, to be useful, a compiler must be made aware of the extra registers. But equally obviously, if an application is 64-bit, then it must have been produced by a compiler that is aware of, and thus capable of taking advantage of, the extra registers available in that mode.
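    A contrived sketch of the register-pressure point (a hypothetical loop of my own, not from the article): eight accumulators stay live across every iteration, which is more than a 32-bit x86 compiler can keep in registers at once, while in 64-bit long mode all eight fit (the classic four plus some of R8-R15):

    ```c
    #include <stdio.h>

    /* Eight accumulators are live across every iteration. Compiled for
       32-bit x86, several of them must be spilled to the stack each time
       around the loop; compiled for x86-64, the larger register file can
       hold all eight, so the loop body need not touch memory at all. */
    static long accumulate(int n) {
        long a = 0, b = 0, c = 0, d = 0, e = 0, f = 0, g = 0, h = 0;
        for (int i = 1; i <= n; i++) {
            a += i;     b += i * 2;
            c += i * 3; d += i * 4;
            e += i * 5; f += i * 6;
            g += i * 7; h += i * 8;
        }
        return a + b + c + d + e + f + g + h;
    }

    int main(void) {
        /* Sum of 1..100 is 5050; the weights 1..8 sum to 36. */
        printf("%ld\n", accumulate(100)); /* prints 181800 */
        return 0;
    }
    ```

    The arithmetic result is identical either way, of course; only the number of memory accesses per iteration differs, and that difference is where the 64-bit speed-up comes from.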



    Quote:

    If registers were so critical Intel would have added them long ago,



    They did. The Itanium architecture had 128 registers. It failed where x64 succeeded in part because Intel didn't provide a backward-compatible path for existing software. Conversely, the x86 architecture continued to thrive with few registers for as long as it did not because adding more registers wouldn't have been helpful, but because maintaining compatibility with older software was more advantageous.
  • Reply 109 of 147
    Quote:
    Originally Posted by JesseDegenerate View Post




    Welcome to 2008, old guy.




    It's always comforting to me to know that, in 20 to 30 years, you'll be scratching your head trying to understand why your children, your neighbors' kids, or anyone younger than you dismisses you out of hand with the "old guy" tag. Provided, of course, that you survive the "growing up" process.
  • Reply 110 of 147
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by aegisdesign View Post


    You've only got to look at what they're adding in 10.6 Server to realise there are lots of new features in the OS X client as well.



    http://www.apple.com/server/macosx/snowleopard/



    Of course. There is no way that 10.6 would come out with NO features.
  • Reply 111 of 147
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by lfmorrison View Post


    That was probably in reference to the performance-hurting cache-flushing phenomenon described in the article we're all commenting on in this thread. Apparently, in every version of OS X released for the Intel architecture to date, a certain cache needs to be flushed at least twice every time a system call is made, while such flushes only happen during context switches with a Windows NT kernel.



    As described in the article, it apparently has to do with the design decision Microsoft made to place a protected, shared 2GB kernel partition inside every process's virtual address space, at the expense of leaving each process with less usable memory for its own code and data. With OS X, on the other hand, each process needs to switch to a totally different virtual address space every time a system call is made, and the addresses actually used by the kernel and the application within their respective address spaces directly overlap.



    The article claims that with Snow Leopard, OS X is going to start doing the same thing as Windows - mapping a copy of the kernel into each 64-bit process's virtual address space. The kernel and the application have many terabytes to play with in each virtual address space, so avoiding overlap won't be a problem. And the trade-off of addressable memory per process versus performance will not be as severe as it has grown to become for 32-bit Windows.



    That doesn't look as though it's a detriment right now, when compared to Windows. Performance-wise, Vista has been a dog, and it's only recently that most of that has been fixed. Comparable programs also seem to perform about the same on both platforms.
  • Reply 112 of 147
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by merdhead View Post


    So what you're saying is that you understand it, but no-one else does. It's just your little secret. Or maybe you just don't know what you're talking about.



    No, basically, you don't seem to understand it.



    The register "problem" has been held up as a major reason why 64-bit x86 chips have an advantage in program run speeds over their 32-bit counterparts, while the same thing isn't true for G5s vs. G4s and earlier.



    Obviously, there are other factors at work as well.
  • Reply 113 of 147
    Quote:
    Originally Posted by extremeskater View Post


    I started working for them as a junior computer operator right out of high school at 17. So while I'm in my late 30s, I'm still only in my 30s - I don't think I can apply for a Medicare card yet...



    As for Vista, x64 or x86 for that matter, both work fairly well. I've had no real problems with either, but I will say x64 appears more stable now that I've been using it a while.



    As to your point about features dropped, I'm sure there have been several in Vista, but nothing I can really say I miss. In my opinion, both operating systems are getting closer and closer to having the same feel.



    For me, Leopard has been more of a nagging pain because its issues affect my work. The ones that have bugged me most are the network issues.



    I would say most of the problems with Vista in the beginning came from users installing it on, or buying, systems that simply could not handle the OS. If someone had a system with 1 GB of RAM, paging would go crazy and the system would take forever to start.



    More non-specific crap from the IT specialist. What networking issues? All that iPhone syncing? It's the same setup as a Windows Mobile phone: username/pass/OWA address takes 20 seconds, just like you.



    So you were installing mainframes for IBM at 17, straight out of high school, huh?



    For the record: shenanigans. I don't even think you're an IT guy.
  • Reply 114 of 147
    melgrossmelgross Posts: 33,510member
    Quote:
    Originally Posted by JesseDegenerate View Post


    More non-specific crap from the IT specialist. What networking issues? All that iPhone syncing? It's the same setup as a Windows Mobile phone: username/pass/OWA address takes 20 seconds, just like you.



    So you were installing mainframes for IBM at 17, straight out of high school, huh?



    For the record: shenanigans. I don't even think you're an IT guy.



    Please! Just a little more civility. If you're attacked, you can respond, but not just because you don't agree or don't believe someone. If someone isn't responding, then you can get a bit snarly, but otherwise keep the level down.



    Thanks.
  • Reply 115 of 147
    MacPro Posts: 19,728 member
    Quote:
    Originally Posted by alandail View Post


    There is a HUGE difference between buying more computer than you need (which happens all the time, and not just with computers) and paying extra for something you cannot use at all. I can't imagine too many people would be happy to learn that most of the extra 2 GB of RAM they just paid $250 for at Dell's recommendation is almost completely unusable by the OS. Or to learn that the OS hides the fact that it's unusable.



    Totally agree. It's grounds for a class-action lawsuit, I would imagine. The misleading and untrue information is in print, for heaven's sake!

    (Yes still in the mountains!)
  • Reply 116 of 147
    jeffdm Posts: 12,951 member
    Quote:
    Originally Posted by melgross View Post


    Nikon came out with a 12.1 MP sensor because this is the first high-quality sensor they designed by themselves. You must walk before you run. They were getting killed by Canon, and decided to concentrate on noise levels first, which is much easier to do with a low-MP sensor such as the one they came up with (which they don't actually manufacture themselves).



    "killed"? Boy, you're morbid.



    Anyway, to me, it appears they were doing pretty well in the business even before they introduced those two models.



    Quote:

    The MP race has not ended by a long shot. This has given Nikon some short term breathing room.



    Canon will come close in noise with its newer cameras, if not equal it, with higher pixel counts, which ARE important.



    I don't think the MP deal has ended, but it seems most people put way too much emphasis on it. I was happy to see a few examples of focusing on something else instead; those aren't the only ones, but it's rare enough. I understand that some people truly need it, and a great deal more think they need it but really don't; I'm not in either category. The files are already getting plenty big enough, and for the most part it really doesn't benefit me to have them get bigger when many of the current pixels are already getting thrown away en masse because of the output medium.
  • Reply 117 of 147
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by JeffDM View Post


    "killed"? Boy, you're morbid.



    Anyway, to me, it appears they were doing pretty well in the business even before they introduced those two models.



    Ah, come on, you know that's just an expression. Canon and Nikon have 90% of the D-SLR market together. Canon has had about two thirds of that 90%, so they were "killing" Nikon there. With the D3 and D700, Nikon has made up a bit of that.



    Canon won't let them keep it. They have a history of pushing back hard when required. They also have a good deal of research into sensor and processing-chip technology going on, which they have discussed publicly.



    Canon also sells about half of the point and shoot digital cameras out there.



    Quote:

    I don't think the MP deal has ended, but it seems most people put way too much emphasis on it. I was happy to see a few examples of focusing on something else instead; those aren't the only ones, but it's rare enough. I understand that some people truly need it, and a great deal more think they need it but really don't; I'm not in either category. The files are already getting plenty big enough, and for the most part it really doesn't benefit me to have them get bigger when many of the current pixels are already getting thrown away en masse because of the output medium.



    It depends on which "people" you're talking about. Most amateurs (I mean serious shooters) don't need anything above perhaps 16 MP. But pros and advanced amateurs want, and need, as much as they can get.



    For example: the Canon 1Ds Mark II was deliberately made with a 16.7 MP sensor. Why? Because a double-page magazine spread of 11 x 17 at a 150-line screen is best served by a 300 ppi file. That works out to 16.7 MP, plus or minus a tenth of an MP or so.



    But pros often have to do a bit of a trim (or the editors do), so voilà, 21 MP for the 1Ds III.



    My printer is 17" by whatever. Making a print at 300 DPI on paper (my printer raises that to 600 DPI with interpolation when printing, if I don't do it first in PS, which I do) requires a file that's (for a 3:2 format) 5100 x 7650 pixels, or 39 MP - the size of a large medium-format back. That gives me no option to do any cropping at all at that size and remain at the highest quality. And that file is increased to 156 MP for the print, as I've just explained.



    Does that give you an idea of what the problem is?
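    The arithmetic behind those figures is easy to check (my own recomputation of the numbers in the post above, nothing from the article):

    ```c
    #include <stdio.h>

    int main(void) {
        /* Double-page magazine spread: 11 x 17 inches at 300 ppi. */
        long spread = (11L * 300) * (17L * 300);          /* 3300 x 5100 px */
        printf("spread: %.1f MP\n", spread / 1e6);        /* ~16.8 MP */

        /* 17-inch-wide print, 3:2 aspect ratio, 300 dpi on paper. */
        long pw = 17L * 300;                              /* 5100 px */
        long ph = pw * 3 / 2;                             /* 7650 px */
        printf("17-inch print: %.1f MP\n", pw * ph / 1e6);      /* ~39.0 MP */

        /* Interpolating to 600 dpi doubles each axis: 4x the pixels. */
        printf("at 600 dpi: %.1f MP\n", 4.0 * pw * ph / 1e6);   /* ~156.1 MP */
        return 0;
    }
    ```

    So the 16.7, 39, and 156 MP figures all fall straight out of print size times resolution.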
  • Reply 118 of 147
    jeffdm Posts: 12,951 member
    Quote:
    Originally Posted by melgross View Post


    It depends on which "people" you're talking about. Most amateurs (I mean serious shooters) don't need anything above perhaps 16 MP. But pros and advanced amateurs want, and need, as much as they can get.



    I don't think most amateurs benefit from anything far beyond 10MP.



    Quote:

    For example: the Canon 1Ds Mark II was deliberately made with a 16.7 MP sensor. Why? Because a double-page magazine spread of 11 x 17 at a 150-line screen is best served by a 300 ppi file. That works out to 16.7 MP, plus or minus a tenth of an MP or so.



    But pros often have to do a bit of a trim (or the editors do), so voilà, 21 MP for the 1Ds III.



    My printer is 17" by whatever. Making a print at 300 DPI on paper (my printer raises that to 600 DPI with interpolation when printing, if I don't do it first in PS, which I do) requires a file that's (for a 3:2 format) 5100 x 7650 pixels, or 39 MP - the size of a large medium-format back. That gives me no option to do any cropping at all at that size and remain at the highest quality. And that file is increased to 156 MP for the print, as I've just explained.



    Does that give you an idea of what the problem is?



    There are pros who really do need it, but going much beyond 10 MP is really serving mediums that appear to be fading in relevance. I mean, there's not much point in someone demanding 600 ppi from a C-sized print like your example. Most people aren't going to look that close. I try to point out detail in letter-sized photo prints, but pretty much no one cares. Knock photos down to VGA size for email and nobody seems to be any less impressed with the photo.
  • Reply 119 of 147
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by JeffDM View Post


    I don't think most amateurs benefit from anything far beyond 10MP.



    You don't think, don't you?



    And other than a "feeling", exactly what technical justification makes you say this?



    Quote:

    There are pros who really do need it, but going much beyond 10 MP is really serving mediums that appear to be fading in relevance. I mean, there's not much point in someone demanding 600 ppi from a C-sized print like your example. Most people aren't going to look that close. I try to point out detail in letter-sized photo prints, but pretty much no one cares. Knock photos down to VGA size for email and nobody seems to be any less impressed with the photo.





    You use the expression of one who knows little of the matter.



    I can assure you that people do look closely at prints. You are just not interested in photography enough to do so, and so you think that others don't either.



    I've been in this business since 1969, when I did my first fashion photography. I've made prints, and viewed many others, over that time. There are plenty of people who would disagree with you.



    If you had ever seen really fine prints, which you obviously haven't, you would see the difference between them and soft, unprofessional ones.



    While I certainly agree that people buying $200 point-and-shoots don't need it - they make smallish prints that can't use it - those making larger prints of high quality do.



    I always find it interesting that people are willing to declare that others don't need what they themselves don't appreciate or understand.
  • Reply 120 of 147
    jeffdm Posts: 12,951 member
    Quote:

    I don't think most amateurs benefit from anything far beyond 10MP.



    Quote:
    Originally Posted by melgross View Post


    You don't think, don't you?



    And other than a "feeling", exactly what technical justification makes you say this?



    My justification: 10 MP exceeds what's necessary for a letter-size photo print at 300 ppi, with plenty of margin for cropping, and that's pretty good for amateurs. Larger prints usually mean the observer is standing back a bit farther anyway. Most amateurs are not getting their work printed as a two-page magazine spread or hung as a similarly sized or larger shot in a gallery. Even with nose-printing on the print, most people don't appear to notice or appreciate any extra detail at 600 ppi.
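    For what it's worth, the letter-size figure checks out (my own arithmetic, same method as the print math earlier in the thread): 8.5 x 11 inches at 300 ppi needs roughly 8.4 MP, so a 10 MP sensor does leave cropping headroom:

    ```c
    #include <stdio.h>

    int main(void) {
        /* Letter-size print: 8.5 x 11 inches at 300 ppi. */
        long w = 85L * 300 / 10;     /* 2550 px (8.5 in) */
        long h = 11L * 300;          /* 3300 px */
        printf("letter at 300 ppi: %.1f MP\n", w * h / 1e6); /* ~8.4 MP */
        return 0;
    }
    ```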



    Quote:
    Originally Posted by melgross View Post


    I can assure you that people do look closely at prints. You are just not interested in photography enough to do so, and so you think that others don't either.



    You misinterpreted what I said. I have no idea how you can say that in view of "I try to point out detail in letter-sized photo prints, but pretty much no one cares." I don't know how I can say that I try to point out detail and you still contend that I don't care about said detail. I'm just being practical. Except for some pros, I see little use in obsessing over details that are only of interest to the occasional nose-printer. I doubt most amateurs would stand much of a chance with that crowd anyway.



    I am interested in photography, despite your elitist assertions otherwise. It's like telling someone they aren't really interested in Macs if they don't own a Mac Pro. While there are certain niches that demand such detail, the practical reality is that the web is fast becoming the primary photo display medium, if not already, and it will only be more so in the future. Magazines are slowly declining; there's not a whole lot of room for shooters to serve that market. Being able to capture a 21 MP photo doesn't mean shit if the photo is posted to the web - it's either annoyingly large or significantly downsampled.