Briefly: "Lost" iPod game, Mac display dithering, Costco Apple TV pilot


Comments

  • Reply 21 of 37
    Quote:
    Originally Posted by jasondwelsh View Post


    Some people think I am funny, but mostly they just .



    Mine was not so much based fractionally, but rather where flicker becomes irritating.



    BTW, I think you are .
  • Reply 22 of 37
    Quote:
    Originally Posted by mmmdoughnuts View Post


    Mine was not so much based fractionally, but rather where flicker becomes irritating.



    BTW, I think you are .



    Why, thank you. By the way...are you a police officer?
  • Reply 23 of 37
    To say it doesn't matter if you can't tell the difference really pisses me off. I spent close to 3 grand for my Macbook Pro, it should have the best display possible....it should be exactly how they told me it was. You people that say it doesn't matter are the reason they get away with that stuff. I want what I paid for, simple as that.
  • Reply 24 of 37
    louzer Posts: 1,054 member
    Quote:
    Originally Posted by mynameisjesse View Post


    To say it doesn't matter if you can't tell the difference really pisses me off. I spent close to 3 grand for my Macbook Pro, it should have the best display possible....it should be exactly how they told me it was. You people that say it doesn't matter are the reason they get away with that stuff. I want what I paid for, simple as that.



    Hey, you got what you paid for. It's just not what you thought you'd be getting!
  • Reply 25 of 37
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by mynameisjesse View Post


    To say it doesn't matter if you can't tell the difference really pisses me off. I spent close to 3 grand for my Macbook Pro, it should have the best display possible....it should be exactly how they told me it was. You people that say it doesn't matter are the reason they get away with that stuff. I want what I paid for, simple as that.



    Are they talking about the MBP in this suit, or the MB? That's a big difference.
  • Reply 26 of 37
    yanchu Posts: 5 member
    The Apple TV is available at AAFES for $289, no tax. Not bad.
  • Reply 27 of 37
    nvidia2008 Posts: 9,262 member
    Quote:
    Originally Posted by ak1808 View Post


    If it tricks me so nicely, why should I care?

    Blue pill anyone?



    Nice.



    Quote:
    Originally Posted by Hattig View Post


    Quite clearly when the graphics card is performing temporal dithering regardless of the software it is a hardware function, not a software function.



    Indeed, I was under the impression that the dithering actually takes place within the display rather than at the output of the graphics card. This explains how you can connect an 8-bit display and get a perfect image via mirroring even while showing that same screen on a 6-bit dithered display on the MacBook or iMac. I believe the discovered phrase was a simplification by Apple, meaning that the dithering occurs within the graphics subsystem, which includes the display that actually performs the action.



    Software temporal dithering would actually alter the screen image in memory, thus affecting all attached displays. It would also be extremely wasteful and inefficient when it could be done in the display independently (it's not exactly rocket science either).
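    For anyone wondering what "temporal dithering" actually boils down to, here's a rough sketch of the basic frame-rate-control idea in Python. Purely illustrative: the function name and numbers are made up, and real panels/drivers do this in hardware with much fancier spatial patterns.

    Code:
    # Rough sketch of temporal dithering (frame-rate control, FRC).
    # An 8-bit channel value (0-255) is shown on a 6-bit panel (0-63) by
    # alternating between the two nearest 6-bit levels over successive
    # frames, so the time-average fills in the missing 2 bits.

    def dithered_level(value_8bit: int, frame: int) -> int:
        """Return the 6-bit level the panel would show on a given frame."""
        base = value_8bit >> 2         # nearest lower 6-bit level
        remainder = value_8bit & 0b11  # the 2 bits the panel can't show (0..3)
        # Show base+1 on `remainder` out of every 4 frames, base on the rest.
        return min(base + (1 if (frame % 4) < remainder else 0), 63)

    value = 130                                            # some 8-bit level
    levels = [dithered_level(value, f) for f in range(4)]
    print(levels, sum(levels) / 4, value / 4)              # [33, 33, 32, 32] 32.5 32.5

    If that loop ran in software on the framebuffer, every attached display (including a mirrored 8-bit one) would receive the already-dithered data; done in the panel or at the card's output stage, the framebuffer stays 8-bit and the external display is untouched, which matches the behavior described above.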



    Quote:
    Originally Posted by JeffDM View Post


    It doesn't work that way with everyone.



    In the case of MacBooks, I'm happy to swallow the blue pill, or apply it as a suppository, etc.



    Seriously though, the 6-bit dithering works well for 13" glossy LCD displays. I imagine it is done mostly at the LCD-hardware level. It might be something in the low-level display drivers though*. Serious techy stuff. I'm scared.



    *There have been reports that things on the exact same display look worse when running Boot Camp vs. Mac OS X.



    Quote:
    Originally Posted by EagerDragon View Post


    I can understand the dithering in the case of consumer computers.



    However, in the case of PRO computers there is a significant expectation and a significant price differential.



    At this time I have no idea if they are using dithering or not on PRO systems. If Apple is doing dithering in any PRO system, I can see them being in big trouble.



    Several applications are called pro by Apple, and two lines of machines are also called PRO. These systems and software have a large price differential from consumer software and hardware.



    I hope all is well.



    Quote:
    Originally Posted by ascii View Post


    I normally support Apple, but in this case I think it was sh*tty of them to pass off high color displays as true color, and I hope these lawyers rip their legal anus out and hand it to them.



    Quote:
    Originally Posted by melgross View Post


    One might as well sue every manufacturer that uses LCDs, as they all do this.



    As long as you aren't doing graphics and photo color edits, it won't matter.



    I'm pretty sensitive to color, as that was my business going back to the early '70s, and my wife's new 6-bit monitor has very good color. I would have to be very engaged with it to see any problems.



    Don't forget that the color gamut for TV is not much more than one million colors, even for current hi def. Very few people are bothered by that.



    I think we are looking at two things here. The first is that the MacBook is fine, but with the MacBook Pro being pushed strongly for high-level, high-definition video and photo work, 6-bit ain't cutting it.



    The second thing is, forget this 6-bit 8-bit 24-triple-88-bit whatever nonsense. The solution is to show the MacBook Pro *handling color workflow well*: pro photo and video, calibrating monitors, LCDs, etc. There are still a ton of creative hobby/enthusiast/prosumer/pro users who are quite clueless about this color workflow stuff. I used to calibrate seriously when doing web design 6 years ago (I'm getting old like Melgross, always referring to the halcyon days...) but haven't in like 3-4 years. And goodness knows print design and video are whole different "color spaces".



    So yeah, blue pill for me when it comes to the MacBook, but Apple needs to say in fine print (now that this "fiasco" is out there): *Millions of colors achieved using sophisticated 6-bit technology most of you wouldn't even understand or care about. Actual colors may vary depending on your eyes, temperature, Acts of God, and a bazillion other things.*



    Seriously though, this is a good opportunity for Apple to at least, by the end of the year, with Leopard and the MacBook Pro, have solid, easy-to-use, secure, high-level color workflows in place for prosumers/pros, and displays of appropriate quality to achieve these goals.
  • Reply 28 of 37
    nvidia2008 Posts: 9,262 member
    Quote:
    Originally Posted by melgross View Post


    Remember the controversy over screen size for CRTs?



    Hah! Old skool, bro... It is the very birth-mother of the term "viewable" in display-size specifications. It was good for everyone in the end though... I think.



    Quote:
    Originally Posted by kindwarrior View Post


    If indeed there is something inappropriate (i.e. *still*-image 6-bit-per-channel color) I'll agree, but I'm intrigued by the term "temporally" in the sentence "the graphics card temporally dithers the 6 bits per component to show up to millions of colors".....

    K



    I think "temporarily" means that all processing is done at 8-bit at the general graphics hardware level, the 6-bit-ness only happens in the instant that each frame is sent to a particular 6-bit display. That's my reading of it at this stage.



    Quote:
    Originally Posted by Louzer View Post


    As I'm sure people will be doing (esp. if this case gets settled or money changes hands). But do all of them say "show millions of colors" as Apple does? If Apple said "over 200,000 colors" with some "sort of millions, due to dithering" note at the end, they'd be fine (like they do for hard drives).



    But saying 'it doesn't matter' still ignores the big picture. You buy a $2500 laptop, you kind of expect to get some features worthy of the price, like a true 8-bit display, or at least that you're getting what you're being told you're getting. But if you're fine with that, then I'm sure you'd be fine getting a computer with a 1.5GHz processor that's overclocked to 2GHz, or a 1600x1200 display that really has a resolution of 1200x800 but uses some software techniques to 'increase' the resolution.



    BTW, when I was about to buy my MBP, I checked the comments on Amazon, and someone specifically mentioned this (and this was months ago), with links to some forum discussions and screenshots of what they were talking about. It was almost enough for me not to get one (I ended up going to the Apple Store first to see if I could see the problem, which I couldn't, so I took a chance - but Apple may have addressed it by then).



    Valid points. There has been deep dissatisfaction with MBP displays for quite a while. It has improved over time, though clearly the 6-bit issue is still a major sticking point. With the full Adobe CS3 (including After Effects), Final Cut Studio 2, and the Core 2 Duo to Santa Rosa stuff, here is a great opportunity for Apple to round out their pro notebook offering with some appropriate displays, plus good tech/software/training support with regard to color workflows. Pros don't give a frack about whether "OS X looks grainy"; it's their videos and photos, and how they translate to other media, that's important.



    Quote:
    Originally Posted by melgross View Post


    ...Don't forget that the color gamut for TV is not much more than one million colors, even for current hi def. Very few people are bothered by that.



    Yeah, I am just dumbfounded by all the LCD HDTVs that say "Billions" of colors. WTF?



    Oh, and "1080p" -- I'll say it again, the LCD HDTV manufacturers (eg. Sony [yes I'll take some heat on this]) can use whatever specs and disclaimers and what not to prove it's "REAL 1080p", but it looks ... well, pretty bloody unimpressive. Yes, upscaling, sources, etc. etc. etc.



    But certainly HDTV LCDs are another big gray area. Hmmm...
  • Reply 29 of 37
    Quote:
    Originally Posted by nvidia2008 View Post


    Oh, and "1080p" -- I'll say it again, the LCD HDTV manufacturers (eg. Sony [yes I'll take some heat on this]) can use whatever specs and disclaimers and what not to prove it's "REAL 1080p", but it looks ... well, pretty bloody unimpressive. Yes, upscaling, sources, etc. etc. etc. \



    Just for fun (actually, a little sad for me): at least when Sony declares a FULL HD 1920x1080p panel, it actually works as declared: up to 1080p.



    Did you know that Philips, with "1920x1080p", means 1920 by 1080 P(IXELS) and not P(ROGRESSIVE)? You can find it on their web site, in their catalogues, etc.



    As a matter of fact, my great 46" LCD TV only accepts 1080i(nterlaced) signals.

    Nice, for 3,999 Euros, isn't it?

    They (customer service) are calling it a "marketing mistake".



    Sigh...
  • Reply 30 of 37
    nvidia2008 Posts: 9,262 member
    Quote:
    Originally Posted by Luca Boccaccini View Post


    Just for fun (actually, a little sad for me): at least when Sony declares a FULL HD 1920x1080p panel, it actually works as declared: up to 1080p.



    Did you know that Philips, with "1920x1080p", means 1920 by 1080 P(IXELS) and not P(ROGRESSIVE)? You can find it on their web site, in their catalogues, etc.



    As a matter of fact, my great 46" LCD TV only accepts 1080i(nterlaced) signals.

    Nice, for 3,999 Euros, isn't it?

    They (customer service) are calling it a "marketing mistake".



    Sigh...



    That's tragic. ...A pity; Philips as a brand, with their Ambilight plasmas and LCD HDTVs, is one I kinda like...
  • Reply 31 of 37
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by nvidia2008 View Post


    Valid points. There has been deep dissatisfaction with MBP displays for quite a while. It has improved over time, though clearly the 6-bit issue is still a major sticking point. With the full Adobe CS3 (including After Effects), Final Cut Studio 2, and the Core 2 Duo to Santa Rosa stuff, here is a great opportunity for Apple to round out their pro notebook offering with some appropriate displays, plus good tech/software/training support with regard to color workflows. Pros don't give a frack about whether "OS X looks grainy"; it's their videos and photos, and how they translate to other media, that's important.



    The only time it matters is when one is doing bit-level correction of colors, which is to be discouraged under most circumstances. If one is doing overall corrections, which is the norm for most imaging and video, even 6-bit monitors are fine. I've seen comparisons between 6-bit and 8-bit models, and the colors (as adjusted and calibrated) were the same.





    Quote:

    Yeah, I am just dumbfounded by all the LCD HDTVs that say "Billions" of colors. WTF?



    It's interesting, isn't it? If the circuitry uses 10-bit processing, then it's billions of colors being sent to the screen. But if the screen can't reproduce all of those billions of colors, then a smaller number of them is selected to best represent the image, and those are projected.
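    The arithmetic behind those marketing words is just the per-channel levels cubed. A quick sanity check (the mapping of the numbers to the "millions"/"billions" labels is my own reading):

    Code:
    # Colors addressable at a given bit depth per channel = (2 ** bits) ** 3
    for bits in (6, 8, 10):
        print(f"{bits}-bit/channel -> {(2 ** bits) ** 3:,} colors")
    # 6-bit/channel  ->       262,144 colors ("over 200,000"; a 6-bit panel's native palette)
    # 8-bit/channel  ->    16,777,216 colors ("millions")
    # 10-bit/channel -> 1,073,741,824 colors ("billions"; 10-bit processing)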



    We're waiting for HDMI 1.3 to become standard on both outputs and inputs. When that happens, all components will be capable of showing "deep color". Until then, at least for TV, the answer is no.



    Quote:

    Oh, and "1080p" -- I'll say it again, the LCD HDTV manufacturers (eg. Sony [yes I'll take some heat on this]) can use whatever specs and disclaimers and what not to prove it's "REAL 1080p", but it looks ... well, pretty bloody unimpressive. Yes, upscaling, sources, etc. etc. etc.



    But certainly HDTV LCDs are another big gray area. Hmmm...



    Well, it depends. Some TVs, like the front-projection models I mentioned earlier, can accept 1080p but convert it to 1080i before projecting it. Others can accept 1080p but downconvert it to 720p for display.



    We have also had plasma sets that can only display 1024 x 720 (or 768) and call themselves hi-def. They have to interpolate the horizontal rez downwards from 1280 to 1024. Almost no one can see the difference.
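    For anyone curious, "interpolating downwards" is basically resampling each line to fewer pixels. A bare-bones sketch, linear interpolation only, purely illustrative and not any particular set's scaler:

    Code:
    # Downscale one row of pixel values from 1280 to 1024 samples by
    # linear interpolation -- roughly what a scaler does when a 1280-wide
    # signal lands on a 1024-wide plasma panel.

    def resample_row(row, new_width):
        old_width = len(row)
        out = []
        for i in range(new_width):
            x = i * (old_width - 1) / (new_width - 1)  # position in the source row
            left = int(x)
            right = min(left + 1, old_width - 1)
            frac = x - left
            out.append(row[left] * (1 - frac) + row[right] * frac)
        return out

    row_720p = list(range(1280))              # stand-in for one scanline
    row_panel = resample_row(row_720p, 1024)  # 1024 blended samples
    print(len(row_panel))                     # 1024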
  • Reply 32 of 37
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by nvidia2008 View Post


    That's tragic. ...A pity; Philips as a brand, with their Ambilight plasmas and LCD HDTVs, is one I kinda like...



    Philips, as a rule, doesn't seem to get very good reviews for its TVs or its monitors.
  • Reply 33 of 37
    ouragan Posts: 437 member
    Quote:

    Apple has said that it will leverage its proven capability in the area of software development to gradually add new software features and applications to Apple TV over time.





    Don't hold your breath!



  • Reply 34 of 37
    adadbby Posts: 1 member
    --I can't believe I tried to read this...

    My brain just got hurt..

    ---------------------------------

    iPod Converter

    http://www.ipodconverter.com
  • Reply 35 of 37
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by ouragan View Post


    Don't hold your breath!







    You're right. I can only hold my breath for a bit over 80 seconds. Several months would be too much.
  • Reply 36 of 37
    kidred Posts: 2,402 member
    Quote:
    Originally Posted by Feste View Post


    How is this lawsuit any different from someone suing the "motion picture" industry for claiming that they show moving pictures? The pictures don't really move. There's a display of 24 still images per second. They trick the human eye into seeing motion, where there isn't "real" motion. Big deal.



    It would be ridiculous for Apple to claim that there's "really" great color in a display, and it's a defect in your eye that makes it see crappy color. Why is the reverse any less ridiculous?



    Each frame moves into view and out of view as you state, 24 times a second. That's a ton of moving still pictures as far as I'm concerned.
  • Reply 37 of 37
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by KidRed View Post


    Each frame moves into view and out of view as you state, 24 times a second. That's a ton of moving still pictures as far as I'm concerned.



    As we've tried to explain, it's the methodology that's patented, not the idea of moving pictures.



    Each new design has new patents.