Dell, Lenovo, Asus debut more affordable 4K monitors

Comments

  • Reply 21 of 46
    solipsismx Posts: 19,566 member
    LOL. Fat. Chance. Who's going to make them?

    I appreciate his desire for more correctness and lack of confusion but it's simply not going to happen, and if he really wants to make sure that there isn't a misunderstanding he will just have to refer to the resolution.
  • Reply 22 of 46
    macxpress Posts: 5,808 member
    Quote:
    Originally Posted by herbapou

    You are certainly NOT getting what you paid for if you get Apple's current monitor. Instead of ripping off people, Apple should either upgrade it or stop selling it.

    Explain?

  • Reply 23 of 46
    My point is:
    3840 x 2160 should be referred to as UHD.
    4096 x 2160 should be referred to as DCI 4K.
    4096 x 2560 should be referred to as 4K.
    7680 x 4320 should be referred to as QUHD.

    Dear Consumers,

    Memorize these numbers and acronyms. There will be a quiz at the end.

    Love,
    Your Consumer Electronics Industry

    solipsismx wrote: »
    I appreciate his desire for more correctness and lack of confusion but it's simply not going to happen, and if he really wants to make sure that there isn't a misunderstanding he will just have to refer to the resolution.

    "Lack of confusion"? More acronyms = more confusion.

    From a consumer perspective (and we are talking about consumers because this is CES not CinemaCon), the only resolution they will directly make a choice about is 3840 x 2160. So why not let them call it "4K"? Isn't it close enough?
  • Reply 24 of 46
    tallest skil Posts: 43,388 member

    So if 1080p is “True HD”, that implies 720p is “fake HD”. I personally buy that, but that’s beside the bigger point.

     

    What’s “4K”? “Truer HD”? “Truest HD”?

  • Reply 25 of 46
    solipsismx Posts: 19,566 member
    From a consumer perspective (and we are talking about consumers because this is CES not CinemaCon), the only resolution they will directly make a choice about is 3840 x 2160. So why not let them call it "4K"? Isn't it close enough?

    It works for me if you just say 4K instead of the more proper 4K UHD to mean 3840x2160, but if he thinks you mean 4096x2304 there could be issues, which is why communication is important.


    PS: I'm not sure where he's getting 4096x2560 from. The 4K I know of is defined as 4096x2160, which is roughly 17:9, but his is a 16:10 aspect ratio. The only 4K 16:10 resolution I see is 4K WHXGA at 5120x3200.
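
    The ratios are easy to check. A quick back-of-the-envelope sketch (plain Python, using only the resolutions quoted in this thread; the parenthetical labels are my own reading of the numbers):

    ```python
    # Sanity check on the aspect ratios being tossed around above.
    resolutions = {
        "UHD / 4K UHD": (3840, 2160),
        "DCI 4K":       (4096, 2160),
        "4096 x 2560":  (4096, 2560),
        "4K WHXGA":     (5120, 3200),
    }

    for name, (w, h) in resolutions.items():
        print(f"{name:14s} {w}x{h} -> {w / h:.3f}:1")

    # UHD / 4K UHD   3840x2160 -> 1.778:1   (16:9)
    # DCI 4K         4096x2160 -> 1.896:1   (about 17:9)
    # 4096 x 2560    4096x2560 -> 1.600:1   (16:10)
    # 4K WHXGA       5120x3200 -> 1.600:1   (16:10)
    ```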
  • Reply 26 of 46
    solipsismx Posts: 19,566 member
    So if 1080p is “True HD”, that implies 720p is “fake HD”. I personally buy that, but that’s beside the bigger point.

    What’s “4K”? “Truer HD”? “Truest HD”?

    That's why Ultra is prefixed to HD. So what will 8K be?

    Again, I really hate all these marketing terms that were seemingly made up by the geeks inventing the technology a decade before it was even feasible to use in shipping products. They will all be outdated: in 20 years 1920x1080 TVs will be pretty bad, yet they could still be known as "True High Definition" displays.
  • Reply 27 of 46
    tallest skil Posts: 43,388 member
    Originally Posted by SolipsismX

    That's why Ultra is prefixed to HD. So what will 8K be?

    Do you swear to tell the ultra, the whole ultra, and nothing but the ultra? :p

  • Reply 28 of 46
    flaneur Posts: 4,526 member
    herbapou wrote: »
    You are certainly NOT getting what you paid for if you get Apple's current monitor. Instead of ripping off people, Apple should either upgrade it or stop selling it.

    If what they're selling is priced too high, then people will stop buying. You and I both know that the "upgrade" schedule depends on IGZO production. Sharp is getting the first shot at selling to the early adopters, at a high price to help Sharp out. When there's enough to supply Apple, we'll start seeing 4K IGZO monitors from Apple.
  • Reply 29 of 46
    hmm Posts: 3,405 member
    Quote:
    Originally Posted by SolipsismX

    If we were to scale that so it's the same display height as the current 27" display it would be about a 33.5" display, which I think would be absolutely brilliant for Xcode.

    Whether I would buy one would depend on many things, including how good of a year I'm having, but I do like that concept. The 27" is still a little tall for me. I've worked with a reasonable variety of displays, and the 24" 16:10 height is one of my favorites. It's comparable to my old 21" 1600x1200, but wider. When a display gets too tall, I find it has a subtle impact on how efficiently I can navigate. It's not that likely it would affect me if Apple made one; I would probably wait for NEC to come out with one. Typically if you wait about a year after release, they drop prices to roughly Apple's price points. If an immensely wide 24" came out this year, I would probably replace my CG243W next year, assuming work picks up (grumble).

  • Reply 30 of 46
    ascii Posts: 5,936 member

    It's amazing how quickly the prices have come down.

  • Reply 31 of 46
    marvfox Posts: 2,275 member
    That is competition for you in the computer field.
  • Reply 32 of 46
    Quote:
    Originally Posted by SolipsismX

    PS: I'm not sure where he's getting 4096x2560 from. The 4K I know of is defined as 4096x2160, which is roughly 17:9, but his is a 16:10 aspect ratio. The only 4K 16:10 resolution I see is 4K WHXGA at 5120x3200.

    4096 x 2560 was invented by the film industry because they prefer 16:10, but the entire point is moot because it's never going to be seen by anyone unless you're watching the unadulterated feed from a full-frame 4K camera. See, for instance, this $40,000 Canon production monitor. 

     

    DCI cropped it to 16:9 for the sake of everyone's sanity. 

  • Reply 33 of 46
    zoetmb Posts: 2,654 member
    Quote:
    Originally Posted by David Byers

    These "technically" aren't 4K monitors.

    "4K" is a video standard used by the movie industry that measures 4096 x 2160 (1.9:1 aspect ratio) and represents a standardized specification, including compression algorithm, bitrate, etc. These are "Ultra High Definition" monitors (UHD) that represent a display resolution of 3840 x 2160 (1.78:1 aspect ratio).

    The monitor industry needs to stop referring to these as 4K monitors when they don't meet the 4K video specification.

    That's true, except that in the movie industry the full 4096 x 2160 that the projectors are capable of generally isn't actually used. In 4K projectors, 1.85:1 movies are projected at 3996x2160 and widescreen movies are projected at 4096x1716. There is an option in the Sony 4K projector to blow the 1716 up to 2160 and then use a 1.25x anamorphic lens in projection, but almost no theatre does that because they don't want to change lenses and the Sony anamorphic lens is extremely expensive.

     

    The difference between the total number of pixels (including the ones that aren't used) in a "4K" monitor and in a UHD monitor is only 6.25%.   That's not really that big a deal.   Personally, I'm surprised the monitor industry doesn't market these as 8MP monitors instead of 4K monitors, but maybe it's because people will think, "My tiny camera is 24MP.  Why does my giant monitor only have 8MP?"  
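
    To put rough numbers on both of those points, here's a minimal sketch (plain Python). The 2.39:1 scope ratio for "widescreen" is my assumption; the other figures come straight from the post above:

    ```python
    # Back-of-the-envelope math behind the figures quoted above (sketch only).
    # Assumption: "widescreen" means the ~2.39:1 scope aspect ratio.

    dci_w, dci_h = 4096, 2160   # full DCI 4K container
    uhd_w, uhd_h = 3840, 2160   # consumer "4K UHD"

    flat_width   = round(dci_h * 1.85)   # 1.85:1 keeps the height -> 3996x2160
    scope_height = round(dci_w / 2.39)   # scope keeps the width   -> ~1714 (DCI uses 1716)

    dci_pixels = dci_w * dci_h           # 8,847,360
    uhd_pixels = uhd_w * uhd_h           # 8,294,400 (~8.3 MP, hence "8MP monitors")
    gap = 1 - uhd_pixels / dci_pixels    # 0.0625 -> the 6.25% mentioned above

    print(flat_width, scope_height, dci_pixels, uhd_pixels, f"{gap:.2%}")
    # 3996 1714 8847360 8294400 6.25%
    ```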

  • Reply 34 of 46
    philboogie Posts: 7,675 member
    bobjohnson wrote: »
    See, for instance, this $40,000 Canon production monitor. 

    DCI cropped it to 16:9 for the sake of everyone's sanity.

    Just when I'm ready to plunk down 40k (aka $40,000) for a replacement of my 30" ACD it turns out this Canon baby is 10-bit, which OSX doesn't support. Bummer¡

    Thanks for the info and link BobJohnson

  • Reply 35 of 46
    hmm Posts: 3,405 member
    Quote:
    Originally Posted by PhilBoogie

    Just when I'm ready to plunk down 40k (aka $40,000) for a replacement of my 30" ACD it turns out this Canon baby is 10-bit, which OSX doesn't support. Bummer¡

    When you get into the real specialty markets such as broadcast displays, costs go up considerably. 10-bit isn't that new; I've owned a display that supported it since 2010. You need both an OS and a GPU with the appropriate driver features unlocked. For example, AMD locks such features on Radeon cards, even with comparable chips. If you want them, you pay the FirePro price, although you can get a workable card down to the $200-300 range. The newest ones might support 4K; I would have to check.
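
    Just to put numbers on what the extra two bits per channel buy you, a minimal sketch (plain Python; the 3840-pixel-wide grey ramp is my own example, not anything from the thread):

    ```python
    # 8-bit vs 10-bit per channel, in raw numbers (sketch only).
    for bits in (8, 10):
        levels = 2 ** bits          # tonal steps per channel
        colors = levels ** 3        # total R*G*B combinations
        print(f"{bits}-bit: {levels} levels/channel, {colors:,} colors")

    # 8-bit:  256 levels/channel,  16,777,216 colors
    # 10-bit: 1024 levels/channel, 1,073,741,824 colors

    # The practical win is smoother gradients: a black-to-white ramp across a
    # 3840-pixel-wide screen repeats each 8-bit level for about 15 pixels
    # (visible banding), versus fewer than 4 pixels per level at 10 bits.
    print(3840 / 256, 3840 / 1024)  # 15.0 3.75
    ```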

  • Reply 36 of 46
    philboogie Posts: 7,675 member
    hmm wrote: »
    philboogie wrote: »
    Just when I'm ready to plunk down 40k (aka $40,000) for a replacement of my 30" ACD it turns out this Canon baby is 10-bit, which OSX doesn't support. Bummer¡

    When you get into the real specialty markets such as broadcast displays, costs go up considerably. 10-bit isn't that new; I've owned a display that supported it since 2010. You need both an OS and a GPU with the appropriate driver features unlocked. For example, AMD locks such features on Radeon cards, even with comparable chips. If you want them, you pay the FirePro price, although you can get a workable card down to the $200-300 range. The newest ones might support 4K; I would have to check.

    Didn't expect a response to my lame post, so big thanks.

    Interesting. I do know that 10-bit cards and screens are available as I have been looking at them because of my photography hobby. In all honesty I stopped looking further when I found out OSX doesn't support it. There might be a workaround; I haven't dug any further due to the high costs involved.

    I did see that Windows supports 10-bit. Would you know if Linux does as well? Just curious.
  • Reply 37 of 46
    Quote:
    Originally Posted by SolipsismX

    Our eyes and muscles are designed for a wider aspect ratio than a taller one as displays get larger.

    Ugh. That's twice in this thread you've used the "Intelligent Design" phrasing.  If indeed human eyes operate better with a particular aspect ratio, then it's because they evolved to do so.  But nothing about them was "designed".

  • Reply 38 of 46
    solipsismx Posts: 19,566 member
    Ugh. That's twice in this thread you've used the "Intelligent Design" phrasing.  If indeed human eyes operate better with a particular aspect ratio, then it's because they evolved to do so.  But nothing about them was "designed".

    I never once used any phrasing of "Intelligent Design."
  • Reply 39 of 46
    philboogie Posts: 7,675 member
    solipsismx wrote: »
    Our eyes and muscles are designed for a wider aspect ratio than a taller one as displays get larger.

    Through photography I came to learn that if you use a camera with 35mm film (or an equivalent CCD/CMOS sensor) at its 3:2 aspect ratio, a 50mm lens will give you the most natural vision. Supposedly a 45mm lens, to be more accurate.
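
    The "45mm, to be more accurate" part lines up with the old rule of thumb that a "normal" lens is roughly the diagonal of the frame. A quick check (plain Python; the 36x24mm frame size is the standard 35mm full-frame format):

    ```python
    import math

    # A "normal" lens is conventionally one whose focal length is close to the
    # diagonal of the image frame. For 35mm film / full-frame sensors (36x24mm):
    frame_w, frame_h = 36.0, 24.0              # mm
    diagonal = math.hypot(frame_w, frame_h)    # sqrt(36^2 + 24^2)

    print(f"frame diagonal: {diagonal:.1f} mm")   # 43.3 mm
    # 50mm is the traditional round-number "normal"; ~43-45mm is closer to the
    # actual diagonal, which is where the "45mm to be more accurate" figure comes from.
    ```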
  • Reply 40 of 46
    hmm Posts: 3,405 member
    Quote:
    Originally Posted by PhilBoogie

    Didn't expect a response to my lame post, so big thanks.

    Interesting. I do know that 10-bit cards and screens are available as I have been looking at them because of my photography hobby. In all honesty I stopped looking further when I found out OSX doesn't support it. There might be a workaround; I haven't dug any further due to the high costs involved.

    I did see that Windows supports 10-bit. Would you know if Linux does as well? Just curious.

    I like responding to posts that reference cool hardware, even if they are sarcastic. I'm not aware of any stable drivers for it on Linux, although a lot of VFX work is done on Linux. Some of the animation and compositing packages are certified under Fedora; it's favored for the ability to make performance tweaks more than anything. I have read that a couple of Nvidia cards supported 10-bit paths on OSX briefly and unofficially under Leopard, but I can't confirm it. Even then you wouldn't have a full path without the software developers on board; it couldn't go Nuke -> framebuffer -> screen at 10 bits the whole way. Windows supports it with specific card features enabled, so it is a rather small matrix of working setups.
