65nm Intel, probably not a good thing...

Posted:
in Future Apple Hardware edited January 2014
Was reading the newspaper at school today, and saw an article on a 65nm processor chip that Intel is going to release sometime next year. If I'm not mistaken, you can push such a processor past 4.5GHz? I don't think this is very good, big competition against Apple's 90nm (probably going to be out in 2004).

Comments

  • Reply 1 of 40
    Nr9 Posts: 182 member
    it's always a good thing for us cuz more competition = better products.



    anyways, Intel's 65nm won't be out until at least after 2005; they don't even have good 90nm
  • Reply 2 of 40
    chagi Posts: 284 member
    Quote:

    Originally posted by Nr9

    it's always a good thing for us cuz more competition = better products.



    anyways, Intel's 65nm won't be out until at least after 2005; they don't even have good 90nm




    I'll second this.



    Competition within the semiconductor industry is what provides us (the consumers) with computers that have generally been evolving at an amazing rate.



    I would only worry if for some reason IBM can't continue to provide competitive CPUs. Size isn't everything, and that's going to become more important soon, as there are some theoretical limits to how small a circuit can possibly get...
  • Reply 3 of 40
    IBM and Intel are both claiming the same timeline, actually. Both are expected to move to a 90nm process during '04 and then to a 65nm process in '05.



    I don't think it matters either way. Intel has gone beyond what I consider the wall of sanity, and IBM is quickly reaching that wall.



    Ten years ago I would have strangled someone in front of their own mother for another 25MHz, but in the past year things have reached a boiling point--even for high end applications. Unless you're rendering, compiling, or playing games, there's no reason a 2GHz machine can't do everything you need, all at the same time.



    I mean, when I was waiting a minute or two for a filter to render in Photoshop only to have to undo it, make changes, and give it another go I wanted a faster machine... but when the program reacts nearly as fast as I can use it, there's no point.



    I don't think I'm alone on this issue either. In the past couple of years there has been a strong push towards more efficient processing--the Pentium M/Centrino is a fantastic example of this. The market for energy and heat efficient processors for smaller form factors is booming now that it's becoming possible to have both the efficiency and the power.
  • Reply 4 of 40
    Apparently the 90nm G5 is supposed to top out somewhere around 5GHz. I don't see the problem with a 65nm Intel pushing 4.5GHz. You also seem to be forgetting that MHz isn't everything.
  • Reply 5 of 40
    Quote:

    Originally posted by jbarket

    The market for energy and heat efficient processors for smaller form factors is booming now that it's becoming possible to have both the efficiency and the power.



    I hope Apple (and IBM) enter this market soon! I'd love to see Apple's portables (and maybe an upgraded iPod) gain even more battery life while improving system responsiveness.



    Heck, I just read that the Palm Tungsten T3's processor is almost as fast as my iBook (400 MHz / 600 MHz) - on paper.
  • Reply 6 of 40
    As everyone has said, Intel aren't even shipping 90nm yet, so 65nm is pure marketing BS... it smacks of desperation.



    I'd be more impressed if they slated a proper consumer 64 bit chip for release, instead they use the usual Intel tactic... try to ramp up the speed.



    Also what about the leakage issue? The 90 nm chips are leaking current like anything... and not even shipping yet! Surely this must be fixed on the new/current process before moving on...



    IBM has nothing to worry about.



    Peace,



    TPC
  • Reply 7 of 40
    I think a more interesting development for the electronics market is one of the new diamond manufacturing processes that promises to deliver "blocks" of diamond at $5.00/carat, which could potentially replace silicon wafers in processor production. I'm not sure how long it would take for this to work its way down to a consumer-level processor, but in one way it solves the heat issue, because chips made of diamond would have a higher heat tolerance than those made of silicon. It's intriguing to me because it could answer the question of what to replace silicon with in chip manufacturing, something the industry has been working on for many years.
  • Reply 8 of 40
    yea, the diamond thing is REALLY cool; when I read about it I thought it could really change the computer industry! But yes, it would take a long time to filter down to the consumer level. If you want to read the article, it's a long but good read: http://www.wired.com/wired/archive/1...iamond_pr.html
  • Reply 9 of 40
    Quote:

    Originally posted by jbarket

    I mean, when I was waiting a minute or two for a filter to render in Photoshop only to have to undo it, make changes, and give it another go I wanted a faster machine... but when the program reacts nearly as fast as I can use it, there's no point.



    Well, I don't know what you're doing in Photoshop, but try to pass an intensive filter on a 22" x 30" document at 2400 dpi. A dual G5 will easily take a minute or two for that sort of task, and that's what's required for a quality poster. I make fine art prints at 60" x 60" and bigger, and am forced to work with a paltry 600dpi, a fourth of the capacity of my large-format printer, which means RIP resampling and a much lower quality print. I long for a PowerMac that will make working with large documents real-time. While Photoshop on a G5 may be fast enough for yesterday's standard, 300 dpi, it chokes on 600dpi and higher, which is what many print shops are asking for these days. Within five years, they'll be asking for 3200 dpi documents, and I imagine a dual G8 will still be choking. As processor technologies evolve, software will become more demanding, so we're in a perpetual state of catching up.
  • Reply 10 of 40
    That doesn't change my point at all. Like I said, there are people out there doing rendering/compiling/gaming that still push the limits.



    The point is that the average end user--even the average power user--isn't in the same boat as you are.
  • Reply 11 of 40
    gar Posts: 1,201 member
    Quote:

    Originally posted by jbarket

    That doesn't change my point at all. Like I said, there are people out there doing rendering/compiling/gaming that still push the limits.



    The point is that the average end user--even the average power user--isn't in the same boat as you are.




    nobody is



    i do my illustration work at 600dpi and almost all of my customers complain about the resolution: it's too high.

    they can't print it for a presentation, and the prepress house charges them for RIP time, etc.
  • Reply 12 of 40
    Quote:

    Originally posted by jbarket

    The market for energy and heat efficient processors for smaller form factors is booming now that it's becoming possible to have both the efficiency and the power.



    True, but the economic model upon which Intel operates is still based on continually pushing for more speed. Don't expect them to change that any time soon.





    Quote:

    Originally posted by The Placid Casual

    As everyone has said, Intel aren't even shipping 90nm yet, so 65nm is pure marketing BS... it smacks of desperation.



    Intel is by far the most dominant player in proc fab right now. They are just coming off a good quarter. They generate several tens of billions of dollars per annum in revenue.



    How, exactly, are they desperate?
  • Reply 13 of 40
    Quote:

    Originally posted by Michael Wilkie

    Well, I don't know what you're doing in Photoshop, but try to pass an intensive filter on a 22" x 30" document at 2400 dpi. A dual G5 will easily take a minute or two for that sort of task, and that's what's required for a quality poster. I make fine art prints at 60" x 60" and bigger, and am forced to work with a paltry 600dpi, a fourth of the capacity of my large-format printer, which means RIP resampling and a much lower quality print. I long for a PowerMac that will make working with large documents real-time. While Photoshop on a G5 may be fast enough for yesterday's standard, 300 dpi, it chokes on 600dpi and higher, which is what many print shops are asking for these days. Within five years, they'll be asking for 3200 dpi documents, and I imagine a dual G8 will still be choking. As processor technologies evolve, software will become more demanding, so we're in a perpetual state of catching up.



    If you are printing through PostScript, then the RIP is converting your pixel info into a line screen for printing (this breaks the pixel image down into the CMYK dots that the printer places on the page). The standard formula for finding the resolution (dpi/ppi) needed for a given line screen is 2*LPI = PPI. So for a 200-line-screen printing process, you need a 400 PPI image. Even if the printer is not using a conventional line screen, the image is being broken down to a CMYK dot pattern when it is printed. This is even true for dye-sub printers, which print a continuous-tone image (the dye bleeds together on the paper, creating a blended color, as opposed to 4-6 dots printed close together or overlapping to create an optical blend when viewed).
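    The 2*LPI = PPI rule of thumb above is easy to sketch in code. This is just my own illustration of the formula in the post (the function name and the optional quality factor are mine, not part of any printer vendor's software):

```python
# Rule of thumb from the post above: image resolution (PPI) should be
# roughly twice the halftone line screen (LPI). Some prepress houses use
# a quality factor between 1.5 and 2, so it's exposed as a parameter.

def required_ppi(lpi: float, quality_factor: float = 2.0) -> float:
    """Return the image resolution needed for a given line screen."""
    return lpi * quality_factor

# A 200 lpi screen needs a 400 ppi image:
print(required_ppi(200))   # -> 400.0
# A typical 133 lpi film screen needs a 266 ppi image:
print(required_ppi(133))   # -> 266.0
```

    Anything much beyond that resolution is simply discarded by the RIP.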



    As a graphics professional, I have never worked above 400 PPI unless I was working at 50% of the final size due to the maximum document size limitations of Quark (the document was printed from Quark at 200% for final output to film). For most bitmapped line art we only use 800 dpi for output to film on 2400 dpi imagesetters using 133 lpi. I would suggest checking with your prepress house for the "optimum" PPI needed for the type of printer or imagesetter that you are having the files output to; it is rarely the native resolution of the device. If you are having quality issues printing a 2400 ppi image, then I would suggest finding another output method, because that is probably where the issue resides, not in the file.



    Quote:

    Originally posted by gar

    nobody is



    i do my illustration work at 600dpi and almost all of my customers complain about the resolution: it's too high.

    they can't print it for a presentation, and the prepress house charges them for RIP time, etc.




    See above; your images are probably taking more than twice the needed time for the RIP to process them, not to mention the wasted time importing the images into the DTP program, and the time it takes the local computer to retrieve the high-res image, assemble the PostScript file, and send it to the printer (unless you are using an OPI system, in which case it takes more time for your network to swap in the hi-res data before sending it to the print queue). All this extra time spent by users and systems, and all this extra data, is just thrown out by the RIP in the end.
  • Reply 14 of 40
    Quote:

    Originally posted by @homenow

    If you are having quality issues printing a 2400 ppi image then I would suggest finding another output method because that is probably where the issue resides, not in the file.





    Nobody said anything about having quality issues at 2400 ppi. What I was saying was that while 2400 produces optimum quality, it chokes my box.
  • Reply 15 of 40
    Quote:

    Originally posted by Michael Wilkie

    Nobody said anything about having quality issues at 2400 ppi. What I was saying was that while 2400 produces optimum quality, it chokes my box.



    What kind of printer are you printing to? What is its maximum resolution, and what LPI are you printing at? Are you using postscript, or some other output language?



    Check with your printer manufacturer; I would be willing to bet that even at 1200 PPI you are over the resolution that can be handled by your RIP, and anything above that is just wasting time, yours and the equipment's. Even an imagesetter at 2540 dpi resolution can only handle ~158 lpi, which needs a 316 ppi image, and as far as I know there is not an inkjet printer marketed today that can match the print quality of high-end imagesetters.
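    The ~158 lpi figure above comes from a standard rule of thumb: to render about 256 gray levels per halftone cell, the device needs a 16x16 cell, so max LPI is roughly device dpi divided by 16. A rough sketch of that arithmetic (function names are mine, for illustration only):

```python
import math

# Rule of thumb: an imagesetter rendering ~256 gray levels per halftone
# cell needs a sqrt(256) = 16 dots per cell side, so max LPI ~= dpi / 16.

def max_lpi(device_dpi: float, gray_levels: int = 256) -> float:
    cell_side = math.sqrt(gray_levels)  # dots per side of a halftone cell
    return device_dpi / cell_side

def useful_ppi(device_dpi: float) -> float:
    # Combine with the 2*LPI rule: image resolution beyond this is wasted.
    return 2 * max_lpi(device_dpi)

print(round(max_lpi(2540)))     # -> 159, about the ~158 lpi quoted above
print(round(useful_ppi(2540)))  # -> 318, close to the 316 ppi figure
```

    Which is why feeding a 2400 ppi image to a 2540 dpi imagesetter mostly just burns RIP time.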
  • Reply 16 of 40
    Amorph Posts: 7,112 member
    *cough*



    To answer the question about where IBM is: According to articles I've read, they're actively researching 65nm process tech, and they have been for a while now. It's not going to be any easier a jump than 130nm or 90nm were, so the sooner they get started, the better.
  • Reply 17 of 40
    Gon Posts: 2,437 member
    Quote:

    Originally posted by The Placid Casual

    I'd be more impressed if they slated a proper consumer 64 bit chip for release, instead they use the usual Intel tactic... try to ramp up the speed.



    Frankly, I don't see why you'd be more impressed. If I can get a 32-bit processor that is twice as fast as an otherwise equivalent 64-bit processor, it will run 32-bit computation twice as fast and still do pretty well at 64-bit work. Considering how big a part of software is "naturally" 32-bit and does not benefit from 64-bit optimization, I'd rather err on the side of raw power and speed than have complicated tech that gets you an advantage in some specialized situations and probably results in a processor costing a hell of a lot more.
  • Reply 18 of 40
    Quote:

    Originally posted by Gon

    Considering how big a part of software is "naturally" 32-bit and does not benefit from 64-bit optimization



    For now.......
  • Reply 19 of 40
    Quote:

    Originally posted by jouster

    For now.......



    Forever. Most numbers that software normally has to deal with are either considerably smaller in magnitude than a billion, or they are more appropriately represented by 32- or 64-bit floating point numbers (which all PPCs support natively). This fact will not change. 32-bit machines were much faster than 16-bit machines because the older machines were only fast with numbers into the tens of thousands. 16-bit machines were much faster than 8-bit because those ancient machines were only fast with numbers into the hundreds. Each increment in word size captures fewer and fewer commonly used numbers, however, so it provides diminishing returns for increasing cost (each step up involves moving around twice as much data). The step from 64-bit to 128-bit may never happen (for everyday computers, anyhow) because the magnitude of a 64-bit number is so huge.
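    To put some numbers on the magnitude argument above, here's a quick sketch of the maximum signed value at each common word size:

```python
# Each doubling of word size captures fewer commonly used numbers:
# the max signed value at each step.

for bits in (8, 16, 32, 64):
    max_signed = 2 ** (bits - 1) - 1
    print(f"{bits:2d}-bit signed max: {max_signed:,}")

# 8-bit tops out in the hundreds (127), 16-bit in the tens of thousands
# (32,767), 32-bit around two billion, and 64-bit around 9.2 quintillion,
# far beyond what most everyday software ever counts.
```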
  • Reply 20 of 40
    Quote:

    Originally posted by Michael Wilkie

    Nobody said anything about having quality issues at 2400 ppi. What I was saying was that while 2400 produces optimum quality, it chokes my box.



    It would choke any box right now. Nothing is built to handle that well enough. Try to run it on any Intel chip, or an AMD chip... you're out of luck. If you see a G5 choke on that, its Wintel counterparts would cry like a bunch of little girls. Some aspects of tech will always evolve faster than computers can keep up with. You need a cluster, or a network render, for that kind of thing to happen efficiently.