Comments
I meant mini- so then who does?
That's not correct. Why don't you go and read up on this stuff before you post? You, and a few others just post what you THINK is correct much of the time. If you had actual information, much of the arguments would never happen.
better still, why not post only, say, once every ten times rather than 'squeaking' at every opportunity in a typically inane and inflammatory manner. Perhaps reading up may make this happen... good idea Melgross (not having a go at you here by the way....)
??? Where did you get that quote from?
From stuff that you say all the time - it's just a precis!
Look, be realistic. All standards take time. What about FW 3200 that some people are talking about? What about USB 3, what about SATA 6 Gb/s? What about Light Peak?
A year from now, you will be saying that they've been around for almost two years, because we've known about them for a year now. But they will have really been out for just a few months at best.
Two years from now, they still will be new.
That's the way it works.
Why is it that even now, some new devices are still using HDMI 1.2?
When will 1.4 be on almost all new devices? About the end of 2011.
This trend seems to be growing. People hear about a new technology and they immediately expect it in the new release despite it not even being specced out. We knew about the mDP port interface being officially added to DP a year ago and it's just gone through today.
It's kind of funny, but standards haven't been keeping up with consumer demand (Pre-N anyone?).
They used to joke that people couldn't keep up with technology. I think it's bureaucracy that could never keep up with technology, or people for that matter.
I can see pluses and minuses to that... too many changes too fast and you end up with a mess. Move too slow, and you become redundant.
It's quite a few. And it's not only Dell.
By quite a few, you mean select Latitude laptops, OptiPlex desktops, Precision workstations, and professional-series displays. Not all of them, and absolutely nothing on the consumer side.
Almost any graphics card has it, or soon will have it. So any computer that comes with a graphics card will have it by default.
Lots of stuff has this:
http://www.google.com/search?client=...UTF-8&oe=UTF-8
Not anything remotely new. Most chipsets produced in the last two years have DisplayPort, as do many third-party video cards. At the end of the day, that hasn't enticed anyone to ship it on a consumer computer or display.
It's kind of funny, but standards haven't been keeping up with consumer demand (Pre-N anyone?).
Time is needed to properly spec the technology. Once it's set in stone, that's what it is forever.
You have to admit, they waffled far more than was necessary on that technology. When mainstream manufacturers are producing it before the draft is even ratified, you know you're dragging your feet a wee bit too much.
This trend seems to be growing. People hear about a new technology and they immediately expect it in the new release despite it not even being specced out. We knew about the mDP port interface being officially added to DP a year ago and it's just gone through today.
It's also the chicken and the egg. We won't see new technologies on many computers because there aren't any peripherals yet. We won't see the peripherals because they aren't on the computer.
A few companies take tentative steps, and we get stuff slowly rolling out. Then, at some point, almost everything new has it.
Look at how long it took for USB 1 to become popular.
By quite a few, you mean select Latitude laptops, OptiPlex desktops, Precision workstations, and professional-series displays. Not all of them, and absolutely nothing on the consumer side.
Not anything remotely new. Most chipsets produced in the last two years have DisplayPort, as do many third-party video cards. At the end of the day, that hasn't enticed anyone to ship it on a consumer computer or display.
That's not remotely true.
But you seem to be stuck with your idea.
It's also the chicken and the egg. We won't see new technologies on many computers because there aren't any peripherals yet. We won't see the peripherals because they aren't on the computer.
A few companies take tentative steps, and we get stuff slowly rolling out. Then, at some point, almost everything new has it.
Look at how long it took for USB 1 to become popular.
Gads. Good point. Do you remember the pre-USB days? Ugh... Really surprising how long it took for it to catch on, considering how much it simplified external peripherals.
That's not remotely true.
But you seem to be stuck with your idea.
Why bother? You're not going to listen to anything other than that Apple is infallible and that every computer and monitor on the market is currently shipping with DisplayPort.
Gads. Good point. Do you remember the pre-USB days? Ugh... Really surprising how long it took for it to catch on, considering how much it simplified external peripherals.
Following that line of argument, why didn't ADC catch on? Was it a licensing issue, or just a 'mine is better than yours' attitude in the market? I suppose it must take a lot of effort for any technology to gain enough traction that it can be regarded as a standard.
You have to admit, they waffled far more than was necessary on that technology. When mainstream manufacturers are producing it before the draft is even ratified, you know you're dragging your feet a wee bit too much.
The 802.11n spec is an unusual circumstance. They took way too long, and we needed something faster for Wi-Fi, so devices got built against drafts. On the other end of that spectrum we have OpenCL, which was ratified in record time and adopted across the board quickly.
Look at how long it took for USB 1 to become popular.
I'm predicting we'll see a lot of parallels to USB 1.0 when we look back at DP.
It's also lower resolution.
You mean like 27" and 30" displays? The resolution DP offers is required to support PC displays. For Apple, DP was the right choice over HDMI.
HDMI 1.3 apparently can drive 2560x1600 at 75 Hz. The real difference is that if you want to go beyond that at 60 Hz or above, there is a standard for it, but you'll need DP 1.2. The catch is that DP 1.2 isn't available yet, so you'll need to replace your current Mac to get a higher resolution than that.
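For anyone who wants to sanity-check those numbers, here is a rough back-of-the-envelope sketch. The required_gbps helper and its figures are illustrative assumptions: roughly 10% blanking overhead (reduced-blanking timings), 24-bit colour, and the commonly quoted effective video data rates of about 8.16 Gbit/s for HDMI 1.3, 8.64 Gbit/s for four-lane DP 1.1 (HBR) and 17.28 Gbit/s for four-lane DP 1.2 (HBR2). Treat it as an approximation, not a spec quote.
Code:
# Approximate link-bandwidth check for the modes discussed above.
# Assumptions: ~10% blanking overhead, 24 bits per pixel, and effective
# data rates of 8.16 Gbit/s (HDMI 1.3), 8.64 Gbit/s (DP 1.1 HBR) and
# 17.28 Gbit/s (DP 1.2 HBR2).

def required_gbps(h, v, refresh_hz, bpp=24, blanking_overhead=0.10):
    """Approximate video data rate in Gbit/s for a given display mode."""
    pixels_per_second = h * v * refresh_hz * (1 + blanking_overhead)
    return pixels_per_second * bpp / 1e9

links = {
    "HDMI 1.3": 8.16,
    "DP 1.1 (HBR)": 8.64,
    "DP 1.2 (HBR2)": 17.28,
}

for refresh in (60, 75, 120):
    need = required_gbps(2560, 1600, refresh)
    fits = [name for name, cap in links.items() if cap >= need]
    print(f"2560x1600@{refresh} Hz needs ~{need:.1f} Gbit/s -> {fits}")
Run as written, it shows 2560x1600 at 60 or 75 Hz only just squeezing into HDMI 1.3's budget, while anything much beyond that (higher refresh or higher resolution) needs DP 1.2, which is consistent with the point above.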
Apple didn't connect audio to the port on the motherboard.
If that's true, why didn't Apple connect it? In effect, that's what was really asked.
No, they aren't. They are capable of displaying a signal from a PC, but unless they can all do edge-to-edge output without overscan, they are just compatible.
Every HDTV I've used did proper edge-to-edge display with a computer. I suppose it's something to watch out for. Definitions vary, but to me, what separates an HDTV from a computer display is the dot pattern and dot pitch. Some HDTVs still use a honeycomb dot pattern, and most have a very coarse dot pitch.
No, you're the one who's misinformed. Is it 7.1?
While 7.1 might be desirable by some for TV use, why does it matter for computers? I don't see going beyond 5.1 for my home theater, and I'm not going above 2.1 for my computer.
Why bother? You're not going to listen to anything other than that Apple is infallible and that every computer and monitor on the market is currently shipping with DisplayPort.
You just make things up.
Following that line of argument, why didn't ADC catch on? Was it a licensing issue, or just a 'mine is better than yours' attitude in the market? I suppose it must take a lot of effort for any technology to gain enough traction that it can be regarded as a standard.
That was just an Apple standard. It had its good points, and its bad points.
Every HDTV I've used did proper edge-to-edge display with a computer. I suppose it's something to watch out for. Definitions vary, but to me, what separates an HDTV from a computer display is the dot pattern and dot pitch. Some HDTVs still use a honeycomb dot pattern, and most have a very coarse dot pitch.
Then you've been lucky. I've bought four in the last six years, from rear projectors through LCD and OLED, and none of them offered the option to turn off overscan. On the Sonys, you could get into the service menu to tweak it, if you could stumble around the various three-digit abbreviations for every function. Not for the meek.
I know this because I've been using HTPCs hooked up to TVs for years. You've been very lucky.
I haven't had the problem either. But you must have a DVI computer port on the TV. This shows one of the major limitations of HDMI.
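For readers who haven't fought with overscan on an HTPC, here is a quick illustrative sketch of what it costs. The visible_desktop helper and the 5% figure are only assumptions for illustration; the actual amount of cropping varies by set.
Code:
# Illustration of what overscan costs a computer desktop (the 5% figure is
# an assumption for illustration; real sets vary).
def visible_desktop(h_px, v_px, overscan_fraction=0.05):
    """Return the portion of the desktop left on screen after cropping."""
    return int(h_px * (1 - overscan_fraction)), int(v_px * (1 - overscan_fraction))

w, h = visible_desktop(1920, 1080)
print(f"With 5% overscan, only about {w}x{h} of a 1920x1080 desktop is visible,")
print("so menu bars and taskbars at the screen edges get cropped off.")
In practice that means the menu bar or taskbar sits partly off-screen unless the set offers some kind of 1:1 or "just scan" pixel-mapping mode, which is why the service-menu workaround comes up at all.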
So basically you're saying resolution on TV images makes a bigger difference and is more important than on computer displays? Even though we sit 10+ feet away from a TV but only a few feet from a computer display?
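The viewing-distance point can be put in rough numbers. A sketch, assuming a hypothetical 50" 1080p TV at 10 feet versus a 27" 2560x1440 monitor at 2 feet (those sizes and distances are my assumptions, not figures from the thread), and using the usual rule of thumb that 20/20 vision resolves on the order of 60 pixels per degree:
Code:
# Rough pixels-per-degree comparison; the screen sizes, resolutions and
# viewing distances are hypothetical examples.
import math

def pixels_per_degree(diag_in, h_px, v_px, distance_in):
    width_in = diag_in * h_px / math.hypot(h_px, v_px)  # physical panel width
    px_per_inch = h_px / width_in
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return px_per_inch * inches_per_degree

print("50in 1080p TV at 10 ft   :", round(pixels_per_degree(50, 1920, 1080, 120)), "px/deg")
print("27in 1440p monitor at 2 ft:", round(pixels_per_degree(27, 2560, 1440, 24)), "px/deg")
With those assumptions the TV comes out around 90 pixels per degree, already beyond what the eye resolves at that distance, while the monitor sits under 50, which is why extra resolution is much easier to see up close.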
You're arguing in circles, which means you don't really know what you're talking about.
The best part is how completely at odds it is with his 720p Apple TV vs. Blu-ray argument.