Comments
Quote:
wired is being deprecated so it makes no sense to use a much larger port interface and add HW

They could eventually reduce the port to just a power supply and have peripherals use a wireless connection. That way, third parties could easily test peripheral devices without needing the latest model of device or strict certification. But it would waste more power going wireless all the time when a simple wired connection would suffice, and simpler certification doesn't seem to be what Apple wants.
I don't think the mirroring functionality is something that would be needed often at 1080p, and if video output still works at 1080p without major artifacts, it's not much of a problem. It just seems silly to purposely design it in a way that requires more expensive adaptors and more engineering, and delivers poorer quality, when they could easily have used more pins, if that was the problem, and just required people to put the plug in the right way.
No doubt 3rd parties will be trying to make Lightning to HDMI adaptors of their own, but there seem to be delays with certification:
http://www.monoprice.com/home/news_detail.asp?news_id=184&s_keyword=
"Monoprice.com, a popular Rancho Cucamonga, Calif., vendor of discount-priced cables, hopes to start selling its own line of Lightning adapters and connectors in November but needs to get Apple's approval first, spokesman George Pappas wrote."
Apple apparently requires suppliers to use Apple-approved factories, which is good if it avoids abusive labour conditions but will add to the delays:
http://arstechnica.com/apple/2012/10/apple-revising-mfi-program-to-limit-third-party-lightning-accessories/
In some ways it seems like a crackdown on unauthorized peripherals, which could explain all the Lightning to 30-pin adaptors that are out and will allow them to avoid certifying or changing the current hardware. Imagine if they lose a large chunk of the peripheral market for the 100 million iOS devices sold per year to cheaply made passive cables. Their peripheral margins could easily be huge at $30-50 a cable. Even if they hit 1/4 of all owners and make a 60% net profit on them, it could easily be $0.5b.
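For what it's worth, a quick back-of-envelope check of that figure (a minimal sketch; the device count, attach rate, prices and margin are all the comment's own assumptions, not reported numbers):

```python
# Back-of-envelope check of the adaptor-profit guess above.
# Every input is an assumption from the comment, not a reported figure.
devices_per_year = 100_000_000      # rough iOS devices sold per year
attach_rate = 0.25                  # 1 in 4 owners buys an adaptor
net_margin = 0.60                   # assumed net profit margin

units = devices_per_year * attach_rate
for price in (30, 50):              # $30-50 per adaptor
    profit = units * price * net_margin
    print(f"${price} adaptor: ~${profit / 1e9:.2f}B profit per year")
# -> $30 adaptor: ~$0.45B profit per year
# -> $50 adaptor: ~$0.75B profit per year
```

So the $0.5b guess is roughly the low end of that range.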
If their intended purpose in designing their iDevices was to make more expensive adapters, then I'll wholeheartedly agree, but that seems like an effect of better engineering in other areas.
Monoprice is great! Despite their incredibly low prices, I've never had a problem with their quality. I've purchased most of my networking tools and supplies from them for years now. I also happen to live in a state that gets the Norco overnight delivery for $5, which is just icing on the cake.
Quote:
Originally Posted by Marvin
It does support 1080p and 3D video. It's limited by the USB 2 bandwidth so they compress the input like Apple does but with micro-USB 3 (10 pins), they can support multiple 1080p HDMI outputs:
but it's not taking video input and sending it along; it appears to the system as a video card. It requires drivers, etc., and then has different capabilities than the system's other video card, i.e. even though you might have some serious 3D acceleration on your main video card, this USB-attached device probably won't.
This is probably not what Apple is doing; it looks the same, but functionally it's quite different.
Pin count doesn't really limit bandwidth. I have two pins coming into my house which provide a 50Mb/sec network connection plus dozens of simultaneous (compressed) HD streams. However, pin count can limit latency and several other things. Lightning should have adequate performance in both categories to handle more than the requirements of a 1080p HDMI stream. That said, it still must be converted to the appropriate set of wires and signals the HDMI device on the other end of the cable needs, and that's clearly what this embedded ARM chip is doing. And it looks like either it, the software, or some component on the iDevice side is not quite capable enough (and/or has some software shortcomings), since it has some issues and some limitations (720p mirroring).
What will be interesting is whether, if/when 1080p mirroring (or 2K output) becomes supported at some point in the future, an updated adapter will be required; likewise whether there will even be a firmware update for the existing one...
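As a rough illustration of why the DisplayLink-style devices mentioned above have to compress (and why any similarly sized serial link would), here is uncompressed 1080p60 against USB 2's theoretical signalling rate; real-world USB 2 throughput is lower still:

```python
# Uncompressed 1080p60 vs. USB 2's signalling rate (rough, theoretical numbers).
width, height, fps = 1920, 1080, 60
bits_per_pixel = 24                              # 8-bit RGB, no subsampling

video_bps = width * height * fps * bits_per_pixel
usb2_bps = 480_000_000                           # USB 2 high-speed line rate

print(f"uncompressed 1080p60: {video_bps / 1e9:.2f} Gb/s")   # ~2.99 Gb/s
print(f"USB 2 line rate:      {usb2_bps / 1e9:.2f} Gb/s")    # 0.48 Gb/s
print(f"shortfall: roughly {video_bps / usb2_bps:.0f}x")     # ~6x
```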
Armchair engineers slinging pin counts.
If you're all a bunch of purists for 1080p, why do you even tolerate any lossy compression artifacts in the source media?
When was the last time any of you people were "disappointed" over 4:2:0 compression in consumer HD?
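For reference, a rough count of how much raw data 4:2:0 chroma subsampling removes from a 1080p frame compared with full 4:4:4 sampling (8-bit samples assumed):

```python
# Raw bytes per 1080p frame with and without chroma subsampling (8-bit samples).
width, height = 1920, 1080
pixels = width * height

full_444 = pixels * 3                    # a Y, Cb and Cr sample for every pixel
sub_420 = pixels + 2 * (pixels // 4)     # full-res luma, quarter-res chroma

print(f"4:4:4 frame: {full_444 / 1e6:.1f} MB")    # ~6.2 MB
print(f"4:2:0 frame: {sub_420 / 1e6:.1f} MB")     # ~3.1 MB
print(f"saving: {full_444 / sub_420:.1f}x")       # 2.0x
```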
If they have the ability to send the data fast enough down each wire, the pin count won't be a problem. USB 3 will more than double bandwidth by upping the clock rate again with the same pin count:
http://www.theregister.co.uk/2013/01/07/usb_three_point_oh_throughput_doubling/
If they were limited in the clock speeds, they'd need more wires, but if it can be done with USB, they should be able to manage it OK. The hardware design they have chosen shouldn't be the limiting factor.
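A rough sketch of the clock-rate point: per-lane throughput is the line rate times the encoding efficiency, so the same differential pair more than doubles its usable bandwidth when the signalling rate and line code change (figures for USB 3.0 versus the announced 10Gb/s update):

```python
# Per-lane throughput = line rate x encoding efficiency (same wires, faster clock).
def lane_throughput(gigatransfers_per_sec, payload_bits, coded_bits):
    return gigatransfers_per_sec * payload_bits / coded_bits   # usable Gb/s

usb3_0 = lane_throughput(5, 8, 10)      # 5 GT/s with 8b/10b encoding
usb3_1 = lane_throughput(10, 128, 132)  # 10 GT/s with 128b/132b encoding

print(f"USB 3.0 lane: {usb3_0:.1f} Gb/s")    # 4.0 Gb/s
print(f"10Gb/s update: {usb3_1:.1f} Gb/s")   # ~9.7 Gb/s on the same pins
```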
It doesn't make sense, though, that they'd say mirroring is limited to 720p while video output is 1080p. If the limit is just to do with the video conversion, why would video playback be different?
I had thought it might be to do with the high resolution of the iPad display, but the old setup can take a Retina framebuffer (2048x1536), convert it and send it down 1080p HDMI without lag, and this one can't. Once they get the 1080p frame to send out, why would it differ from video playback?
OK, so for those who want the actual FACTS and not more useless assumptions/guesses (as AppleInsider likes to do), the below is from an Apple employee.
I reiterate my previous statement, which is that it IS something in Apple's DNA, and looks like a brilliant, flexible, robust, and innovative solution, even with all the small-minded, short-sighted bitching by people who have no clue what they're talking about and no insight into what's required to move things forward. The people who have developed this stuff are infinitely smarter than the people mindlessly attacking it.
Quote:
Airplay is not involved in the operation of this adapter.
It is true that the kernel the adapter SoC boots is based off of XNU, but that’s where the similarities between iOS and the adapter firmware end. The firmware environment doesn’t even run launchd. There’s no shell in the image, there’s no utilities (analogous to what we used to call the “BSD Subsystem” in Mac OS X). It boots straight into a daemon designed to accept incoming data from the host device, decode that data stream, and output it through the A/V connectors. There’s a set of kernel modules that handle the low level data transfer and HDMI output, but that’s about it. I wish I could offer more details then this but I’m posting as AC for a damned good reason.
The reason why this adapter exists is because Lightning is simply not capable of streaming a “raw” HDMI signal across the cable. Lightning is a serial bus. There is no clever wire multiplexing involved. Contrary to the opinions presented in this thread, we didn’t do this to screw the customer. We did this to specifically shift the complexity of the “adapter” bit into the adapter itself, leaving the host hardware free of any concerns in regards to what was hanging off the other end of the Lightning cable. If you wanted to produce a Lightning adapter that offered something like a GPIB port (don’t laugh, I know some guys doing exactly this) on the other end, then the only support you need to implement on the iDevice is in software- not hardware. The GPIB adapter contains all the relevant Lightning -> GPIB circuitry.
It’s vastly the same thing with the HDMI adapter. Lightning doesn’t have anything to do with HDMI at all. Again, it’s just a high speed serial interface. Airplay uses a bunch of hardware h264 encoding technology that we’ve already got access to, so what happens here is that we use the same hardware to encode an output stream on the fly and fire it down the Lightning cable straight into the ARM SoC the guys at Panic discovered. Airplay itself (the network protocol) is NOT involved in this process. The encoded data is transferred as packetized data across the Lightning bus, where it is decoded by the ARM SoC and pushed out over HDMI.
This system essentially allows us to output to any device on the planet, irregardless of the endpoint bus (HDMI, DisplayPort, and any future inventions) by simply producing the relevant adapter that plugs into the Lightning port. Since the iOS device doesn’t care about the hardware hanging off the other end, you don’t need a new iPad or iPhone when a new A/V connector hits the market.
Certain people are aware that the quality could be better and others are working on it. For the time being, the quality was deemed to be suitably acceptable. Given the dynamic nature of the system (and the fact that the firmware is stored in RAM rather then ROM), updates **will** be made available as a part of future iOS updates. When this will happen I can’t say for anonymous reasons, but these concerns haven’t gone unnoticed.

That's all fine and dandy, but "irregardless" is not a word.
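A toy, runnable sketch of the data path the quoted post describes: the host compresses frames, packetizes them onto a generic serial link, and the adapter reassembles and decodes them at the far end. zlib stands in for the hardware H.264 encoder and a plain list stands in for the Lightning link; none of the names below are Apple's actual APIs.

```python
import zlib

# Toy model of the described data path. zlib stands in for hardware H.264,
# and a plain list stands in for the Lightning serial bus.
PACKET_SIZE = 1024

def host_send(frame: bytes, link: list) -> None:
    encoded = zlib.compress(frame)                  # encode the output stream on the fly
    for i in range(0, len(encoded), PACKET_SIZE):   # packetize it for the serial bus
        link.append(encoded[i:i + PACKET_SIZE])
    link.append(None)                               # end-of-frame marker (made up here)

def adapter_receive(link: list) -> bytes:
    chunks = []
    while (packet := link.pop(0)) is not None:      # the adapter SoC reassembles the stream...
        chunks.append(packet)
    return zlib.decompress(b"".join(chunks))        # ...decodes it, then would drive HDMI out

link = []
frame = b"\x10\x20\x30" * 100_000                   # stand-in for one captured frame
host_send(frame, link)
assert adapter_receive(link) == frame
```

The point of the design, as described, is that the host side never needs to know what connector is on the far end; only the adapter does.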
Quote:
Originally Posted by Tallest Skil
So $85 for a totally different solution instead of $49.
In what way didn't the cable work as advertised, since you didn't buy it in the first place?
Well, Mr. Smartypants, for my purposes I paid $25 (not $35) more for the Apple TV and got substantially more functionality than I would have gotten from the cable. In addition to not having to mess around with cables, I also got all the nice features of an Apple TV!
The cable I initially bought was perfectly good and functioned as advertised. It's the cable that is the subject of this article I was speaking of as not working as advertised. Obviously it's feeding subsampled video to the display instead of straight digital video, which Apple doesn't mention. The cable may solve a problem and be very clever, but the kludge is symptomatic of less than optimal planning on the part of Apple. If this isn't clear to you, you have my pity.
Quote:
Originally Posted by DESuserIGN
Well, Mr. Smartypants, for my purposes I paid $25 (not $35) more for the Apple TV
I used current prices from the Apple Stores.
It's the cable that is the subject of this article I was speaking of as not working as advertised. Obviously it's feeding subsampled video to the display instead of straight digital video, which Apple doesn't mention.
How can you know that when you don't own it? That's what's unclear to me.
Quote:
Originally Posted by Tallest Skil
How can you know that when you don't own it? That's what's unclear to me.
http://www.panic.com/blog/2013/03/the-lightning-digital-av-adapter-surprise/
If it's subsampled, it's subsampled. Do I need to own the cable to know that Apple doesn't mention that? Can't I just look at Apple's website? Does ownership somehow signify knowledge/facts to you?

PS: The Apple TV currently doesn't allow an ad-hoc connection from an iDevice (or Mac/PC) to it. You must traverse via a router. Routers intrinsically add processing overhead to the path, which also introduces latency. It would be great if you could connect directly (think of boardrooms and classrooms), so I hope they add that option soon.

Not true. You can in fact stream from an iPhone to the Apple TV via AirPlay, without the need of an additional router. Just turn on the iPhone's personal hotspot function, and it serves as the connection point.
I wonder what Charlie Brooker thinks of the new adapter. This is what he thought of the last one...
The output is definitely downgraded at least for the time being:
http://www.journaldulapin.com/2013/03/03/lightning-adapter-to-hdmi-a-real-quality-issue/
It's good of them to go with a flexible solution that allows any type of connector, but if the Lightning port is fast enough, you'd think a lossless 1080p frame would have been possible. The adaptor design also means there will probably always be lag, as there has to be a conversion step, so gaming output won't be as good - Elgato claims their USB video device is lag-free, so maybe they can work around it. There will be a bandwidth limit somewhere in the Lightning port, so it's not going to be future-proof forever.
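A rough feel for why a conversion step implies lag: if the adapter pipeline buffers even a frame or two for encoding and decoding, that alone is tens of milliseconds at 60Hz (illustrative numbers only):

```python
# Illustrative added latency for an encode -> transfer -> decode pipeline at 60 Hz.
frame_time_ms = 1000 / 60               # ~16.7 ms per frame

for buffered_frames in (1, 2, 3):
    print(f"{buffered_frames} frame(s) buffered: ~{buffered_frames * frame_time_ms:.0f} ms added lag")
# -> 1 frame(s) buffered: ~17 ms added lag
# -> 2 frame(s) buffered: ~33 ms added lag
# -> 3 frame(s) buffered: ~50 ms added lag
```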
A whole load of peripherals should be possible but they will come with a higher price and complexity in making them. There's the software and hardware approval process too. Perhaps someone will even come up with an ethernet port adaptor.
There seems to have been a limit on flash drives and USB storage drives before, so potentially what someone could do is make an adaptor that acts as a host to a storage drive and presents it to iOS as something more compatible, like, say, a WebDAV server.
Maybe the problem with the HDMI adaptor output is just to do with how quickly the iPad can encode 1080p. It might not have been doing any conversion before. Video playback is already encoded at up to 1080p. FaceTime is limited to 720p. The cameras can record 1080p but don't have to be low-latency. If this is the case, faster iPad/iPhone GPUs will be able to handle the higher resolution frames. #plannedobsolescence
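To put the real-time encoding idea in numbers: mirroring a 1080p screen at 60fps means pushing roughly twice as many pixels per second through the encoder as the device's existing real-time jobs; the frame rates below are assumptions for illustration only.

```python
# Pixels per second a hardware encoder would have to handle in real time.
workloads = {
    "720p60 mirroring (current limit)": (1280, 720, 60),
    "1080p30 camera recording":         (1920, 1080, 30),
    "1080p60 mirroring (hoped for)":    (1920, 1080, 60),
}

for name, (w, h, fps) in workloads.items():
    print(f"{name}: {w * h * fps / 1e6:.0f} Mpixel/s")
# -> 720p60 mirroring (current limit): 55 Mpixel/s
# -> 1080p30 camera recording: 62 Mpixel/s
# -> 1080p60 mirroring (hoped for): 124 Mpixel/s
```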
Wow. Everyone is basing all of this chatter on the conjecture of a reverse engineering effort? Mkay.
LOL ... Pretty funny actually, love the 'unplugging the incubator to charge the iPad' joke best (real English humor at its best there).
P.S. No doubt he is now a paid expert at Scamsung after that (or perhaps even before ...)
Plus, by the sound of it, they can sell a Mk2 with better picture quality at a later date.
Quote:
Originally Posted by gordy
Wow. Everyone is basing all of this chatter on the conjecture of a reverse engineering effort? Mkay.
While everything may not be understood, engineering is applied science. And "reverse engineering" is applying science forensically. While the embedded ARM chip could do a variety of things, currently it is subsampling. There's no real doubt about that.
In some ways it's a great solution, but intrinsically it's rather unsatisfying in its complex, kludgy inelegance.
He has a new series called Charlie Brooker's Weekly Wipe, which airs on Thursdays on BBC Two, where he talks about the week's events.
This is my favourite clip from Brooker…
[VIDEO]