Where the heck are the new displays?


Comments

  • Reply 41 of 67
    eds66 Posts: 119 member
    I am hoping that once the new displays come out, the current ones drop in price. I already own Apple's 20-inch Cinema Display, which I think is the best LCD on the market right now, and would like to add another. Two of those side by side would be awesome.
  • Reply 42 of 67
    matsu Posts: 6,558 member
    While we talk about the look of the new displays, we've missed one important aspect.



    The displays ought to come with DVI alongside the ADC. No cumbersome adaptor, just build both right into the display. Come to think of it, they could just supply the adaptor, since you're going to need one for power anyway if you don't use ADC, but it's ludicrous to penalize PB owners for choosing an Apple display. And if they didn't nickel and dime people for an adaptor, they'd probably pick up more sales from the high-end PC market too, just by marking the box "Mac/PC" -- kinda like iPods.
  • Reply 43 of 67
    davegee Posts: 2,765 member
    Quote:

    Originally posted by Matsu

    The displays ought to come with DVI alongside the ADC.



    No cumbersome adaptor, just build both right into the display.



    Come to think of it, they could just supply the adaptor, since you're going to need one for power anyway if you don't use ADC, but it's ludicrous to penalize PB owners for choosing an Apple display. And if they didn't nickel and dime people for an adaptor, they'd probably pick up more sales from the high-end PC market too, just by marking the box "Mac/PC" -- kinda like iPods.



    Disclaimer: The numbers given below are just guesses by me, but I'm sure they're not too far off the mark.



    Huh? You'd have Apple FORCE the 97% or 98% of desktop owners who buy a desktop with an Apple display to pay for an additional DVI connector, as well as a power brick they don't need, built into the Apple LCD, just so the 3% or 2% (even less?) of laptop buyers who also want to buy a desktop display don't have to pay for a DVI-to-ADC adapter?



    Heck, tons of people were yapping about having to pay for 'all of iLife' just to get _insert_one_app_name_ because "they'd never use the others" (and iLife is the biggest bargain going), and now you want those people to foot the bill for extra connectors and AC adapters that they'd never use?



    I just don't see it...



    Dave
  • Reply 44 of 67
    matsu Posts: 6,558 member
    Apple's 23" remains a relative bargain, but the connectors put people off. The 20" is also competitive, but ditto the connectors. The 17" has never been price competitive and probably never will be. Price reductions are in order. But at the high end of the market which the 20-23" panels describe, you will find buyers, probably enough double the numbers.
  • Reply 45 of 67
    machem Posts: 319 member
    Well, I thought it would be possible to save money on a new G5 purchase if I got an "other" 17" LCD and the $40 adapter to use it with the ADC PowerMac.



    Nope. The skinny is that at $599 (academic), I might be able to save $100 with a quality (not no-name or junk) 17" LCD and adapter. Hardly seems worth it, especially as the Studio LCD seems to be much better quality.



    A Batesias 17" silver LCD is about $420, which is the best I could find for a quality DVI LCD (though I'm not sure this one isn't RGB/3-input, which would still require another adapter).
  • Reply 46 of 67
    tfworld Posts: 181 member
    Mmm 23" displays are cool... 8)
  • Reply 47 of 67
    jasonfj Posts: 567 member
    I'd really like to see a slight resolution bump on the new 23" models. I use the current 23" screen and it's nice, but After Effects still hogs way too much of the space on it.



    What are the chances? 2400 x 1500 anyone?
  • Reply 48 of 67
    idave Posts: 1,283 member
    Quote:

    Originally posted by jasonfj



    What are the chances? 2400 x 1500 anyone?




    Oh, now you're opening a can of worms... the old resolution debate. I haven't used a 23" HD display, but aren't they already a tighter resolution than most of Apple's other displays? Perhaps what you need is a 25- or 30-incher.
  • Reply 49 of 67
    Quote:

    Originally posted by iDave

    Oh, now you're opening a can of worms... the old resolution debate. I haven't used a 23" HD display, but aren't they already a tighter resolution than most of Apple's other displays? Perhaps what you need is a 25- or 30-incher.



    The resolution debate ... yeah ... but I'd be one of the people at the head of the line to buy a higher-resolution 23"+ LCD panel.



    My eyesight is great, and my desk could use the extra space gained by moving from two monitors to one. A 30" panel wouldn't save me much desk space, though, so I'd prefer a higher-res 23".
  • Reply 50 of 67
    Quote:

    Originally posted by jasonfj





    What are the chances? 2400 x 1500 anyone?




    count me in on this one...
  • Reply 51 of 67
    matsu Posts: 6,558 member
    2500x1400 ???



    Not without more links. Current DVI tops out somewhere a little lower than that.



    ADC may cause Apple some problems at 30" panel sizes because it also has to carry power to drive the display. More resolution coupled with large displays (and stronger backlighting) might mean that ADC can't supply enough juice even though DVI can supply enough bandwidth (for something like 2240x1400).



    But I don't know, maybe the difference in backlighting and current draw between a 30" and a 23" will not be so great.



    In any case, if it's DVI, I'm pretty sure that 2048x1536 (4:3) or 2240x1400 (16:10) is the bandwidth limit for the current (two-link, six-channel) connector. Plenty, at least for the next few years.
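
    A rough sanity check of that ceiling, as a back-of-the-envelope Python sketch: the 165 MHz per-link rate is from the DVI 1.0 spec, but the 60 Hz refresh and ~25% blanking overhead are assumptions that vary by timing standard, so treat the cutoffs as approximate.

    ```python
    # Back-of-the-envelope DVI pixel-clock check. The 165 MHz per-link
    # limit is from the DVI 1.0 spec; the 25% blanking overhead is a
    # rough assumption (reduced-blanking timings need less).

    SINGLE_LINK_MHZ = 165.0
    DUAL_LINK_MHZ = 2 * SINGLE_LINK_MHZ

    def pixel_clock_mhz(width, height, refresh_hz=60, blanking=1.25):
        """Visible pixels x refresh rate, padded ~25% for blanking."""
        return width * height * refresh_hz * blanking / 1e6

    for w, h in [(1920, 1200), (2048, 1536), (2240, 1400), (2560, 1600)]:
        clk = pixel_clock_mhz(w, h)
        fits = ("single link" if clk <= SINGLE_LINK_MHZ
                else "dual link" if clk <= DUAL_LINK_MHZ
                else "beyond dual-link DVI")
        print(f"{w}x{h} @ 60 Hz -> ~{clk:.0f} MHz ({fits})")
    ```

    By this estimate 2240x1400 sits comfortably inside dual link; exactly where the ceiling falls depends on the blanking overhead assumed.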
  • Reply 52 of 67
    davegee Posts: 2,765 member
    Quote:

    Originally posted by Matsu

    In any case, if it's DVI, I'm pretty sure that 2048x1536 (4:3) or 2240x1400 (16:10) is the bandwidth limit for the current (two-link, six-channel) connector. Plenty, at least for the next few years.



    Interesting points, tho... Even if we aren't quite touching the ceiling yet, a few years might sound like a lot, but when it comes to system design it really isn't... Does anyone know "what's next" after DVI? Has a DVI-2 spec been hashed out? Something else? It's gotta be on the roadmaps of the hardware engineers in the video card industry and the display makers at the very least... Would be interesting to know...



    Dave
  • Reply 53 of 67
    shawk Posts: 116 member
    Display systems will eventually have to reproduce 35mm/70mm resolution for professional use.

    Current professional motion picture film scanners can go to 5,000 x 2,500 pixels at 48-bit.

    6,000 x 3,125 pixels at 48-bit are expected this year, with 8,000 x 4,000 pixels to follow fairly soon.



    I'm not sure if this has any meaning for large display monitor resolution and DVI/ADC bandwidth.
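
    For a sense of scale, here is the raw arithmetic those scan sizes imply, sketched in Python. The resolutions and 48-bit depth are from the post above; 24 fps is film's standard frame rate, and dual-link DVI's pixel payload is 330 Mpixel/s x 24 bits, roughly 7.9 Gbit/s.

    ```python
    # Uncompressed data rates for the film-scan sizes above, compared
    # with dual-link DVI's ~7.9 Gbit/s of pixel payload
    # (330 Mpixel/s x 24 bits per pixel).

    BITS_PER_PIXEL = 48   # 16 bits per RGB channel, per the post
    FPS = 24              # standard film frame rate
    DUAL_LINK_GBITS = 330e6 * 24 / 1e9

    for w, h in [(5000, 2500), (6000, 3125), (8000, 4000)]:
        gbits = w * h * BITS_PER_PIXEL * FPS / 1e9
        print(f"{w}x{h}: {gbits:4.1f} Gbit/s uncompressed, "
              f"~{gbits / DUAL_LINK_GBITS:.1f}x dual-link DVI")
    ```

    So even the smallest of those scans carries roughly twice the raw bits that dual-link DVI can move in real time.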
  • Reply 54 of 67
    amorph Posts: 7,112 member
    I think the problem, as shawk implies, is that we're actually heading toward an increase in resolution - real resolution (i.e. precision) rather than bigger and bigger monitors. After all, you can't get all that much bigger than the 23" before the monitor stops occupying the workspace and starts defining it, and most photographers would rather not stand across the room from an 8,000 x 4,000 display that takes up a wall and turns the electricity meter into a functional turbine when it's powered up. 8k by 4k in a window on a desktop monitor - now you're talking.



    IBM was working toward 200dpi displays before they spun off that division; work continues. The problem is that this will force a sea change. Not only will DVI become insufficient, most graphics cards will suddenly be thrown back into the position of having to struggle to push enough pixels, and software will suddenly be forced to map logical pixels to physical pixels, rather than taking advantage of a 1:1 mapping, which means a translation layer, which is a hit on performance. Worse, that will just be a stopgap until they become resolution independent, which will be an awkward and traumatic transition for a lot of software to make. Currently, even the Aqua HIG instructs programmers to place buttons no fewer and no more than x pixels from the edge of a window, or from another screen element; all that will have to change.
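
    A toy sketch of that translation layer in Python (the names, and the 72-dpi logical grid, are illustrative rather than any real API):

    ```python
    # Toy model of logical-to-physical mapping: UI geometry is laid out
    # in logical points (72 per inch here, an illustrative choice), and
    # a scale factor maps points to physical pixels at draw time.

    from dataclasses import dataclass

    @dataclass
    class Rect:
        x: float
        y: float
        w: float
        h: float

    def to_device_pixels(r: Rect, display_dpi: float,
                         logical_dpi: float = 72.0) -> Rect:
        """Map a rect in logical points onto a display's pixel grid."""
        s = display_dpi / logical_dpi
        return Rect(r.x * s, r.y * s, r.w * s, r.h * s)

    button = Rect(20, 20, 80, 24)          # laid out in points, HIG-style
    print(to_device_pixels(button, 100))   # 100 dpi panel: ~1.39x scale
    print(to_device_pixels(button, 200))   # 200 dpi panel: ~2.78x scale
    ```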



    This will be good. It will finally allow monitors to improve in the way that cameras, printers, and pretty much every other digital device has been improving for decades now. Publishers will doubtless love to have 300dpi displays, and better. But it's going to hurt to get there, because it will require breaking an assumption about display calculation that dates back to the earliest days of computing.
  • Reply 55 of 67
    mac+ Posts: 580 member
    ^ fascinating read Amorph and shawk - thanks.
  • Reply 56 of 67
    smircle Posts: 1,035 member
    Quote:

    Originally posted by Amorph

    Not only will DVI become insufficient, most graphics cards will suddenly be thrown back into the position of having to struggle to push enough pixels, and software will suddenly be forced to map logical pixels to physical pixels, rather than taking advantage of a 1:1 mapping, which means a translation layer, which is a hit on performance.



    Actually, this translation layer is already in place - it is called OpenGL and, by extension, Quartz Extreme. As windows are stored as textures, they are mapped to the screen like textures in any 3D game - only with a 1:1 relationship between logical pixels and display pixels. This relationship, however, is not hard-wired into the graphics card; Apple uses it because non-native resolutions look awkward on TFTs.
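
    A minimal sketch of that idea - a compositor sampling a window buffer like a texture, where 1:1 is just the default scale (purely illustrative; not Quartz Extreme's actual code):

    ```python
    # A compositor samples each window buffer like a texture, so 1:1 is
    # just the default scale, not something hard-wired. Nearest-neighbour
    # sampling here for brevity.

    def composite(window_px, scale):
        """Scale a window's pixel buffer (a list of rows) like a texture."""
        src_h, src_w = len(window_px), len(window_px[0])
        dst_h, dst_w = int(src_h * scale), int(src_w * scale)
        return [[window_px[int(y / scale)][int(x / scale)]
                 for x in range(dst_w)]
                for y in range(dst_h)]

    window = [[1, 2], [3, 4]]        # a tiny 2x2 "window texture"
    print(composite(window, 1.0))    # 1:1 mapping: unchanged
    print(composite(window, 2.0))    # 2x: each texel covers a 2x2 block
    ```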







    Quote:

    Worse, that will just be a stopgap until they become resolution independent, which will be an awkward and traumatic transition for a lot of software to make. Currently, even the Aqua HIG instructs programmers to place buttons no fewer and no more than x pixels from the edge of a window, or from another screen element; all that will have to change.



    The CoreFoundation API (through which programmers access Quartz) already allows for floating-point coordinates. However, there is nothing wrong with sticking to logical pixels as a scale - as long as you do not assume that 72 pixels will measure an inch on screen, you are fine. This assumption, incidentally, has been wrong since the days of multiscan displays - about 10 years now.



    What will be needed is a bypass API that allows some programs to draw pixels 1:1 without remapping, but the calls for drawing resolution-independent vectors (lines, curves, circles) have partially existed since the first Mac OS - QuickDraw - and have already been completely redone with CoreFoundation. Fonts are drawn in high-res without most programs caring about their pixel representation anyway.
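
    To illustrate why vector calls survive a resolution change, a hypothetical sketch (not QuickDraw or CoreFoundation): the same circle description rasterizes to however many device pixels the scale factor demands.

    ```python
    # The same path description yields more device pixels at a higher
    # scale factor; the caller's logical coordinates never change.

    import math

    def rasterize_circle(cx, cy, radius, scale):
        """Approximate a circle (described in logical points) with device
        pixels, sampling about once per output pixel of circumference."""
        r_px = radius * scale
        steps = max(8, int(2 * math.pi * r_px))
        return [(round(cx * scale + r_px * math.cos(2 * math.pi * i / steps)),
                 round(cy * scale + r_px * math.sin(2 * math.pi * i / steps)))
                for i in range(steps)]

    # The same logical circle on a 1:1 display and a 2x high-dpi display:
    print(len(rasterize_circle(50, 50, 10, scale=1.0)))  # 62 samples
    print(len(rasterize_circle(50, 50, 10, scale=2.0)))  # 125 samples
    ```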
  • Reply 57 of 67
    bz Posts: 40 member
    This is killing me....



    STEVE?!?! Are you listening?!?!?!



    Please release PowerMac RevB and LCD rev... soon!



    BZ
  • Reply 58 of 67
    mmmpie Posts: 628 member
    Quote:



    Huh? You'd have Apple FORCE the 97% or 98% of desktop owners who buy a desktop with an Apple display to pay for an additional DVI connector, as well as a power brick they don't need, built into the Apple LCD, just so the 3% or 2% (even less?) of laptop buyers who also want to buy a desktop display don't have to pay for a DVI-to-ADC adapter?



    Heck, tons of people were yapping about having to pay for 'all of iLife' just to get _insert_one_app_name_ because "they'd never use the others" (and iLife is the biggest bargain going), and now you want those people to foot the bill for extra connectors and AC adapters that they'd never use?



    I just don't see it...




    Huh, you'd prefer everyone who buys a PowerMac to have to shell out for a power supply that supports 140 watts more than it needs to? Power supplies that big are expensive, because they just don't have economies of scale.



    While I applaud the ease of use of ADC, I don't think it makes sense in today's market. 50% of Apple's customers buy laptops, and don't have ADC. Neither eMacs nor iMacs support ADC. Anyone who doesn't buy a PowerMac will look twice before buying an Apple monitor, because of the extra price, and because the hoops you have to jump through make the user experience _WORSE_ than it is with a typical monitor.



    Not only that, but for $699 I expect to get more than one input, one of them analog. Philips are now offering four inputs on their monitors in a similar price range. I'd be quite happy to see Apple offering an ADC and a DVI, with active analog inputs. I would prefer a built-in power supply, but at a stretch a power pack that connects to the ADC port would be acceptable (but, ironically, not as good an experience).



    Apple also have the opportunity to get ahead of the pack and offer a quad-input display (2 analog, 2 digital) with four upstream USB ports, making for an integrated KVM.
  • Reply 59 of 67
    idave Posts: 1,283 member
    Agreed. Things have changed since ADC was introduced. I'd like the option, if I buy a 23" LCD, to hook it up to an HDTV tuner and watch TV on it when I'm not using it with my PowerBook. ADC should not be the only input.
  • Reply 60 of 67
    Quote:

    Originally posted by Smircle

    Actually, this translation layer is already in place - it is called OpenGL and, by extension, Quartz Extreme. As windows are stored as textures, they are mapped to the screen like textures in any 3D game - only with a 1:1 relationship between logical pixels and display pixels. This relationship, however, is not hard-wired into the graphics card; Apple uses it because non-native resolutions look awkward on TFTs.





    This is true, although I think you might be overstating the point. There are several fundamental obstacles to overcome before we get resolution independence.



    Firstly, there are all the API layers: applications still assume a 1:1 correspondence, and many will need to be updated. Fonts and windowing elements are already there for the most part.



    Secondly, display devices need to be affordable at high resolutions. One of the problems with LCD technology is that the transistor controlling each pixel stays roughly the same size as resolution increases. Since the transistor effectively blocks some of the light from the backlight, the higher the pixel density of the LCD, the stronger the backlight must be to achieve the same overall brightness. This is a significant problem in notebook displays. When Organic LEDs become available in large sizes, they will not have this problem.



    Thirdly, there is a gap to overcome. Right now everything looks pretty decent at current Apple display DPIs when the software assumes 1:1. At slightly higher resolutions with scaling added (resolution independence), things will look significantly worse. At higher resolutions still, they will look good again. I'm not sure what the threshold DPI values are, but there is definitely a sweet spot where scaling really becomes practical. We aren't there yet except in the most high-end professional displays (read: many thousands of dollars).



    In the next few years we will see this happen. It seems to me Apple is ahead of Microsoft because a lot of resolution scaling is already built in. It is just going to take some time for the hardware to really make it practical.



    -Spyky