
ARM chip found in Apple's Lightning Digital AV Adapter could be AirPlay decoder - Page 2

post #41 of 80
Quote:
Originally Posted by Marvin View Post

It's a very bizarre workaround - almost as if two separate teams had the tasks of designing the port and adding HDMI support. Maybe this is why they've decided to collaborate more on things.

Seems like there are some guys in the cable adapter department who want to be in the silicon logic board department, hence all the proprietary chips showing up in the cables.

Life is too short to drink bad coffee.
post #42 of 80
Quote:
Originally Posted by Marvin View Post

Anyway, the fact that there's 16 pins on the current one means that there was enough room on the smaller plug that the form factor of the plug wasn't an issue. Some people seem to suggest that Apple compromised function in pursuit of the smaller plug. They must have purposely designed it knowing that 8 pins would be enough for them and given that USB 3 only has 9-10 pins, bandwidth shouldn't be a major problem.

It's a very bizarre workaround - almost as if two separate teams had the tasks of designing the port and adding HDMI support. Maybe this is why they've decided to collaborate more on things.

There are 16 pins on the male end of the plug but only 8 are ever used at a time. That means it's effectively an 8-wire setup. The female end of the plug has pins on only one side.

Considering that it's dynamic, 8 pins should be more than enough, and as Gazoobee stated, wired is being deprecated, so it makes no sense to use a much larger port interface and add HW just so this one adapter can be smaller, less complex and cheaper. Even now I have doubts about how much it's used compared to AirPlay. Streaming to an Apple TV is a more expensive option and I bet that's used a lot more.

I'm quite happy with Lightning. It's one of my favourite aspects of the new iPhone. It's small, simple and future-forward.

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply
post #43 of 80
Quote:
Originally Posted by SolipsismX View Post

 [...] and as Gazoobee stated wired is being deprecated  [...]

The one place I can think of where this adapter is important is in corporate conference rooms or trade shows where there is no Apple TV available but there is always HDMI or DVI. I have all the adapters for Lightning and Thunderbolt for those types of situations. 

Life is too short to drink bad coffee.
post #44 of 80

Another reason to support high bandwidth is to allow high-quality streaming video. There are a lot of applications in the future that could use the iPad as a monitor for a camera (e.g., in cars) where there is no need to save the data. If Apple were smart, they would focus the Lightning connector on providing high-speed streaming input.

post #45 of 80
Quote:
Originally Posted by mstone View Post

The one place I can think of where this adapter is important is in corporate conference rooms or trade shows where there is no Apple TV available but there is always HDMI or DVI. I have all the adapters for Lightning and Thunderbolt for those types of situations. 

That's what I was thinking too. HDMI is how projectors work, so it shouldn't be an afterthought in the design of the iPad if they want it to take off in business (you don't want digital compression artefacts on your slides).

 

I am wondering how much circuitry the Lightning to USB adaptor has. Since Lightning is USB2 at the protocol level, it should have the minimum circuitry of any Lightning device, helping to show how much of what we're seeing here is video specific.

 

The Lightning connector, which frankly gets more ridiculous the more you hear about it, and the iMac's unmanufacturability, seem like design missteps to me.


Edited by ascii - 3/2/13 at 5:43pm
post #46 of 80
Quote:
Originally Posted by ascii View Post

That's what I was thinking too. HDMI is how projectors work, so it shouldn't be an afterthought in the design of the iPad if they want it to take off in business (you don't want digital compression artefacts on your slides).

I am wondering how much circuitry the Lightning to USB adaptor has. Since Lightning is USB2 at the protocol level, it should have the minimum circuitry of any Lightning device, helping to show how much of what we're seeing here is video specific.

I was under the impression that Lightning is USB only when the device reads the chip in the Lightning to USB cable and tells it to output USB. I thought that if you plug in a different connector it can dynamically alter which pins are used and how they are utilized.

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply
post #47 of 80

Looks like a frick'n epic kludge to me.

Very un-Apple like.

post #48 of 80
Quote:
Originally Posted by SolipsismX View Post


I was under the impression that Lightning is USB only when the device reads the chip in the Lightning to USB cable and tells it to output USB. I thought that if you plug in a different connector it can dynamically alter which pins are used and how they are utilized.

I thought Thunderbolt was like that, but Lightning was just USB2 with a different connector.

post #49 of 80
Originally Posted by DESuserIGN View Post
Looks like a frick'n epic kludge to me.

Very un-Apple like.

 

Can you plug it in?
>Yes (one point)
>No (zero points)

Does it work as advertised?
>Yes (one point)
>No (zero points)

Total your points. If you have more than 1 point, it is Apple-like.

Originally posted by Relic

...those little naked weirdos are going to get me investigated.
post #50 of 80

The fact that this was JUST discovered proves this isn't an issue for 99% of people, since millions of iPads have already been sold, and these have been out for a while. 

 

And yes, this is very Apple-like, which is streamlining the main device as much as possible, taking out components that most people won't take advantage of, and sacrificing some performance/functionality for a small percentage in order to simplify and streamline the product further. There have been many, many examples of Apple doing things like this previously with other products, both hardware and software. 

post #51 of 80
Quote:
Originally Posted by DESuserIGN View Post

Looks like a frick'n epic kludge to me.

Very un-Apple like.

 

 
Not only is it kludgey, it's blurry (see screenshot in Panic blog post). 
 
This is from the company that loves beautiful graphics so much. Starting with Steve Jobs' obsession with typography, to the OS X Quartz graphics engine where every frame is rendered as a PDF, to Color Calibration/Matching all through the OS, to the Retina display. 
 
And now every iPad going forward will have blurry HDMI output, and clarity is very important for its use with projectors.
post #52 of 80
Quote:
Originally Posted by Slurpy View Post

And yes, this is very Apple-like, which is streamlining the main device as much as possible, taking out components that most people won't take advantage of, and sacrificing some performance/functionality for a small percentage in order to simplify and streamline the product further. There have been many, many examples of Apple doing things like this previously with other products, both hardware and software. 

Except HDMI is not some legacy and/or specialised-use technology. The most recent example of Apple's ruthless legacy-shedding was the MacBook Pro to MacBook Pro Retina revision. The Retina dropped FireWire, dropped Ethernet, dropped the optical drive and *added* an HDMI port.

post #53 of 80

Whether it works is in question. But you've set a very low bar for "Apple-like" in my opinion.

Your criteria seem closer to Microsoft and Dell (maybe without the "as advertised" part).

 

"Is it elegant, well designed and reliable?" might be more to my point.

 

Quote:
Originally Posted by Tallest Skil View Post

 

Can you plug it in?
>Yes (one point)
>No (zero points)

Does it work as advertised?
>Yes (one point)
>No (zero points)

Total your points. If you have more than 1 point, it is Apple-like.

post #54 of 80
Quote:
Originally Posted by ascii View Post

I thought Thunderbolt was like that, but Lightning was just USB2 with a different connector.

Thunderbolt is protocol agnostic. Lightning is like the old 30-pin iPod Dock Connector, except that the Dock Connector set individual pins for features like USB, video out (composite and S-video), audio in/out, FireWire, serial, and then some pins for accessory use (perhaps ID detection and/or power), plus some unknown pins and additional ground pins. If I recall correctly, 7 of the 30 pins are ground.

With Lightning you get a dynamic interface: if you connect a Lightning cable with USB on the other end, the chip in the cable informs the device, which then adjusts accordingly. The same goes for video out, audio devices, or various accessories.
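A rough way to picture that negotiation, purely as a toy model (the profile table, role names, and pin counts below are made up for illustration; the real identification chip and protocol are undocumented):

```python
# Toy model of dynamic pin assignment on an adaptive connector.
# NOT Apple's actual Lightning protocol -- the profiles, names, and pin
# counts below are invented purely to illustrate "accessory identifies
# itself, host reassigns the same physical pins".

ACCESSORY_PROFILES = {
    "usb_cable":  {"role": "USB 2.0 data",          "data_pins": 2, "power_pins": 2},
    "av_adapter": {"role": "packetized A/V stream", "data_pins": 2, "power_pins": 2},
    "audio_dock": {"role": "digital audio",         "data_pins": 2, "power_pins": 2},
}

def negotiate(accessory_id: str) -> dict:
    """Host reads the accessory's ID chip and picks a pin configuration."""
    profile = ACCESSORY_PROFILES.get(accessory_id)
    if profile is None:
        raise ValueError(f"unknown or unauthorised accessory: {accessory_id}")
    # Same physical pins every time; only their assigned role changes.
    return {"pins_in_use": profile["data_pins"] + profile["power_pins"],
            "carries": profile["role"]}

print(negotiate("usb_cable"))   # -> {'pins_in_use': 4, 'carries': 'USB 2.0 data'}
print(negotiate("av_adapter"))  # -> {'pins_in_use': 4, 'carries': 'packetized A/V stream'}
```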

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply
post #55 of 80

By the way, I looked at and priced several of Apple's cables and decided the best solution and best bang for the buck was to buy a refurb Apple TV. At least I knew what I was getting . . . and it was actually "working as advertised," unlike this kludge cable.


Edited by DESuserIGN - 3/2/13 at 7:54pm
post #56 of 80
Originally Posted by DESuserIGN View Post
By the way, I looked at and priced several of Apple's cables and decided the best solution and best bang for the buck was to buy a refurb Apple TV. At least I knew what I was getting . . . and it was actually "working as advertised," unlike this kludge cable.

 

So $85 for a totally different solution instead of $49. 

 

In what way didn't the cable work as advertised, since you didn't buy it in the first place?

Originally posted by Relic

...those little naked weirdos are going to get me investigated.
post #57 of 80
Quote:
Originally Posted by ascii View Post

Except HDMI is not some legacy and/or specialised-use technology. The most recent example of Apple's ruthless legacy-shedding was the MacBook Pro to MacBook Pro Retina revision. The Retina dropped FireWire, dropped Ethernet, dropped the optical drive and *added* an HDMI port.

1) Legacy or not, the question is how often people want to plug this adapter into iDevices. I suspect it's not often. If we instead include the HW in the iDevice, would it make sense to increase the size, weight and cost of the iDevice (especially the iPod Touch) to accommodate this atypical use? Would it not be better to move the HW needed for connecting to HDMI for mirroring and video to an external adapter, so that the iPod Touch and possibly the iPhone wouldn't have to increase in size, weight, and cost to support the needs of a few? I think so.

PS: I remember back when Apple used to ship multiple display adapters with their Mac notebooks in the PPC days. I personally hated getting all those extra components that I'd never use.

2) Regarding the RMBPs: FireWire was dropped because it's nearly completely obsolete. FW800 is slower than USB and Lightning. Its only remaining benefit seems to be max power out. I doubt the Mac Pro will keep it.

Ethernet would have been nice, but the 8P8C modular jack is quite large, with no "mini" version like most modern connectors have, and with WiFi so commonplace I have to think that not too many use it; if they do, there is the Ethernet to USB adapter from Apple or others. With USB 3.0 you also get Gigabit Ethernet speeds, which wasn't possible before the adoption of Ivy Bridge. Unlike the display adapters I mentioned previously, adding the connector for Ethernet is extremely inexpensive, but without the room or a high need I don't think Apple should hold off moving forward on PC design.

Same goes doubly for the ODD, which I personally feel should have been removed years ago. I also doubt the Mac Pro will keep it, despite what people will say: "but there's room for it!"

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply
post #58 of 80
Quote:
Originally Posted by SolipsismX View Post


1) Legacy or not, the question is how often people want to plug this adapter into iDevices. I suspect it's not often. If we instead include the HW in the iDevice, would it make sense to increase the size, weight and cost of the iDevice (especially the iPod Touch) to accommodate this atypical use? Would it not be better to move the HW needed for connecting to HDMI for mirroring and video to an external adapter, so that the iPod Touch and possibly the iPhone wouldn't have to increase in size, weight, and cost to support the needs of a few? I think so.

...

Ethernet would have been nice, but the 8P8C modular jack is quite large, with no "mini" version like most modern connectors have, and with WiFi so commonplace I have to think that not too many use it; if they do, there is the Ethernet to USB adapter from Apple or others. With USB 3.0 you also get Gigabit Ethernet speeds, which wasn't possible before the adoption of Ivy Bridge. Unlike the display adapters I mentioned previously, adding the connector for Ethernet is extremely inexpensive, but without the room or a high need I don't think Apple should hold off moving forward on PC design.
 

 

 

I've got no problem with shifting potentially specialised circuitry to external adaptors, but if that's going to be your approach, you need to decide that up front, and make sure your one and only remaining port is up to the job. Something like Thunderbolt is, but (based on what we're seeing here) I'm not sure Lightning is. The fact that you have to set the res to 1600x900 (i.e. it can't do 1920x1080) is a dead giveaway that the port was under specified.
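For what it's worth, here is the raw arithmetic the argument turns on (assuming 24-bit colour at 60 Hz and ignoring blanking and protocol overhead):

```python
# Back-of-the-envelope: uncompressed video bandwidth vs. a USB 2-class link.
# Assumes 24 bits/pixel and 60 frames/s; blanking and protocol overhead ignored.

def raw_mbps(width, height, bpp=24, fps=60):
    return width * height * bpp * fps / 1e6

USB2_MBPS = 480  # USB 2.0 signalling rate; usable payload is lower still

for w, h in [(1600, 900), (1920, 1080)]:
    need = raw_mbps(w, h)
    print(f"{w}x{h}: ~{need:,.0f} Mbit/s raw, about {need / USB2_MBPS:.0f}x "
          f"the {USB2_MBPS} Mbit/s link")

# 1600x900  -> ~2,074 Mbit/s (~4x over a USB 2-class link)
# 1920x1080 -> ~2,986 Mbit/s (~6x over)
```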
 
Regarding the MBP Retina ports, the GigE adaptor actually plugs directly into a Thunderbolt port (I'm using one now). Thunderbolt is a work of art, and it's a real shame the iDevices didn't get it; instead Apple went the cheap route.
 
It will be interesting if a Samsung/Microsoft/Google tablet comes out with a nice fat external bus and we see it being used for industrial applications that the iPad just can't do. For the sake of a few cents saving they could have sacrificed entire markets, and given competitors a way in.
post #59 of 80
Quote:
Originally Posted by ascii View Post


I've got no problem with shifting potentially specialised circuitry to external adaptors, but if that's going to be your approach, you need to decide that up front, and make sure your one and only remaining port is up to the job. Something like Thunderbolt is, but (based on what we're seeing here) I'm not sure Lightning is. The fact that you have to set the res to 1600x900 (i.e. it can't do 1920x1080) is a dead giveaway that the port was under specified.

How is a port under-specified? Is USB Type-A under-specified? If not, would you have said that back in the USB 1.0 days when FW400 was much faster and supported power? I really don't see what any of this has to do with the port at all when everything is pointing to other HW not being included on the iDevice logic board. I certainly wouldn't have wanted to keep the 30-pin connector for another decade, and mini-USB would be a huge downgrade in every way.

Let's remember this hasn't been an issue for 6 months. But it's an issue now, only after the adapter was taken apart half a year later?
Edited by SolipsismX - 3/3/13 at 3:04am

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply
post #60 of 80
Quote:
Originally Posted by SolipsismX View Post


How is a port under-specified? Is USB Type-A under-specified? If not, would you have said that back in the USB 1.0 days when FW400 was much faster and supported power? I really don't see what any of this has to do with the port at all when everything is pointing to other HW not being included on the iDevice logic board. I certainly wouldn't have wanted to keep the 30-pin connector for another decade, and mini-USB would be a huge downgrade in every way.

Let's remember this hasn't been an issue for 6 months. But it's an issue now, only after the adapter was taken apart half a year later?

 

The quality problem appears to be due to the software on the iOS side compressing the video that it sends across the Lightning port to the mini ARM computer on the other side. Why is it compressing it? Why not send it uncompressed, or use lossless compression? It can only be because there is not enough bandwidth on the Lightning bus. That is what I mean by Lightning being underspec'd.
 
I don't know why you think Lightning is not the problem, and the solution lies on the adaptor, when no amount of clever software on the adaptor can restore information that has already been lost due to compression for transmission. 
 
The only way they will solve this (without upgrading Lightning, which would be embarrassing so soon) is by changing/tweaking the video codec in some way. H.265 is almost ready; it is still lossy, but it can achieve the same quality at half the bitrate of H.264 (probably what it's currently using). But then the ARM chip in the adaptor might not be grunty enough to decode it. Therefore I boldly predict we will see a v2.0 HDMI adaptor soon, with either a faster ARM chip or a dedicated H.265 decoder in there.
post #61 of 80
Quote:
Originally Posted by ascii View Post

I don't know why you think Lightning is not the problem, and the solution lies on the adaptor, when no amount of clever software on the adaptor can restore information that has already been lost due to compression for transmission.

Where is your evidence to support this claim that data is being removed because of the Lightning connector? You are free to assert that as a hypothesis but until it's proven you can't deny all other possibilities simply to fit your argument.

As previously stated, based on the HW in the adapter I think it's an offloading of HW that would have otherwise needed to be in each iDevice, which would have increased size, weight and cost for every single iDevice despite this adapter not being used frequently, with the negative costs falling on the iPod Touch and iPhone. The original article's statement that it's an AirPlay streaming device (a role typically handled by an Apple TV over a network) supports my hypothesis.

I think you might be getting hung up on the number of pins on HDMI v. Lightning, but there is no one-to-one ratio that makes one better simply by having more pins in the connector, especially when it comes to a dynamic port.

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply
post #62 of 80
Quote:
Originally Posted by SolipsismX 
wired is being deprecated so it makes no sense to use a much larger port interface and add HW

They could eventually reduce the port to being just a power supply, with any peripherals having to use some wireless connection. That way 3rd parties could easily test peripheral devices without needing the latest model of device or strict certification, but it would waste more power going wireless all the time when a simple wired connection would suffice, and it doesn't seem like simpler certification is what Apple wants.

I don't think the mirroring functionality is something that would be needed often at 1080p and if video output still works at 1080p and doesn't show major artifacts, it's not much of a problem. It just seems silly to purposely design it in a way that requires more expensive adaptors, more engineering and delivers poorer quality when they could easily have used more pins if that's what the problem is and just required people to put the plug in the right way.

No doubt 3rd parties will be trying to make Lightning to HDMI adaptors of their own but there seem to be delays with certification:

http://www.monoprice.com/home/news_detail.asp?news_id=184&s_keyword=

"Monoprice.com, a popular Rancho Cucamonga, Calif., vendor of discount-priced cables, hopes to start selling its own line of Lightning adapters and connectors in November but needs to get Apple's approval first, spokesman George Pappas wrote."

Apple apparently requires suppliers to use Apple-approved factories, which is good if it avoids abusive labour conditions but will add to delays:

http://arstechnica.com/apple/2012/10/apple-revising-mfi-program-to-limit-third-party-lightning-accessories/

In some ways it seems like a crackdown on unauthorized peripherals, which could explain all the Lightning to 30-pin adaptors that are out, and will allow them to avoid certifying or changing the current hardware. Imagine if they lose a large chunk of the 100 million iOS devices per year peripheral market to cheaply made passive cables. Their peripheral margins could easily be huge at $30-50 a cable. Even if they hit 1/4 of all owners and have a 60% net profit on them, it could easily be $0.5b.
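Rough arithmetic behind that last figure, using the guesses above:

```python
# Sanity check of the ~$0.5B figure. Every input is a guess from the
# paragraph above, not a real Apple number.

devices_per_year = 100e6   # iOS devices per year (stated above)
attach_rate      = 0.25    # 1/4 of owners buy a cable/adapter
avg_price        = 40      # mid-point of the $30-50 range
net_margin       = 0.60    # assumed net profit per cable

revenue = devices_per_year * attach_rate * avg_price
profit  = revenue * net_margin
print(f"revenue ~${revenue / 1e9:.1f}B, profit ~${profit / 1e9:.2f}B")
# revenue ~$1.0B, profit ~$0.60B -- in the ballpark of the $0.5B claim
```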
post #63 of 80
Quote:
Originally Posted by Marvin View Post

It just seems silly to purposely design it in a way that requires more expensive adaptors, more engineering and delivers poorer quality when they could easily have used more pins if that's what the problem is and just required people to put the plug in the right way.

If their intended purpose in designing their iDevices was to make more expensive adapters then I'll wholeheartedly agree, but that seems like an effect of better engineering in other areas.
Quote:
"Monoprice.com, a popular Rancho Cucamonga, Calif., vendor of discount-priced cables, hopes to start selling its own line of Lightning adapters and connectors in November but needs to get Apple's approval first, spokesman George Pappas wrote."

Monoprice is great! Despite their incredibly low prices I've never had a problem with their quality. I've purchased most of my networking tools and supplies from them for years now. I also happen to live in a state that gets the Norco overnight delivery for $5, which is just icing on the cake.

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply
post #64 of 80
Quote:
Originally Posted by Marvin View Post


It does support 1080p and 3D video. It's limited by the USB 2 bandwidth so they compress the input like Apple does but with micro-USB 3 (10 pins), they can support multiple 1080p HDMI outputs:
 

 

But it's not taking video input and sending it along; it appears to the system as a video card. It requires drivers, etc., and then has different capabilities than the system's other video card, i.e. even though you might have some serious 3D acceleration on your main video card, this USB-attached device probably won't.

 

This is probably not what Apple is doing; it looks the same, but functionally it's quite different.

 

Pin count doesn't really limit bandwidth. I have 2 pins coming into my house which provide a 50M/sec network plus dozens of simultaneous (compressed) HD streams. However, pin count can limit latency and several other things. Lightning should have adequate performance in both categories to handle more than the requirements of a 1080p HDMI stream. That said, it still must be converted to the appropriate set of wires and signals the HDMI device on the other end of the cable needs, and that's clearly what this embedded ARM chip is doing. And it looks like either it, the software, or some component on the iDevice side is not quite capable enough (and/or has some software shortcomings), since it has some issues and some limitations (720p mirroring).

 

What will be interesting, if/when 1080p mirroring (or 2K output) becomes supported at some point in the future, is whether an updated adapter will be required, and likewise whether there will even be a firmware update for the existing one...

post #65 of 80

Armchair engineers slinging pin counts.

If you're all a bunch of purists for 1080p, why do you even tolerate any lossy compression artifacts in the source media?

When was the last time any of you people were "disappointed" over 4:2:0 compression in consumer HD?

"Apple should pull the plug on the iPhone."

John C. Dvorak, 2007
Reply

"Apple should pull the plug on the iPhone."

John C. Dvorak, 2007
Reply
post #66 of 80
Quote:
Originally Posted by soward 
Pin count doesn't really limit bandwidth. I have 2 pins coming into my house which provide a 50M/sec network plus dozens of simultaneous (compressed) HD streams. However, pin count can limit latency and several other things. Lightning should have adequate performance in both categories to handle more than the requirements of a 1080p HDMI stream.

If they have the ability to send the data fast enough down each wire, the pin count won't be a problem. USB 3 will more than double bandwidth by upping the clock rate again with the same pin count:

http://www.theregister.co.uk/2013/01/07/usb_three_point_oh_throughput_doubling/

If they were limited in the clock speeds, they'd need more wires but if it can be done with USB, they should be able to manage it ok. The hardware design they have chosen shouldn't be the limiting factor.
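To put numbers on that (published USB rates, per direction; the comparison is only illustrative):

```python
# Usable throughput of a serial link is roughly
#   lanes x per-lane signalling rate x line-coding efficiency,
# so a faster clock and better coding raise bandwidth with the same pin count.

links = {
    "USB 2.0 (480 Mbit/s, shared bus)":      0.48,
    "USB 3.0 (5 Gbit/s, 8b/10b coding)":     5.0 * 8 / 10,      # ~4.0 Gbit/s usable
    "USB 3.x update (10 Gbit/s, 128b/132b)": 10.0 * 128 / 132,  # ~9.7 Gbit/s usable
}

for name, gbps in links.items():
    print(f"{name:42s} ~{gbps:.1f} Gbit/s")
```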
Quote:
Originally Posted by soward 
That said, it still must be converted to the appropriate set of wires and signals the HDMI device on the other end of the cable needs, and that's clearly what this embedded ARM chip is doing. And it looks like either it, the software, or some component on the iDevice side is not quite capable enough (and/or has some software shortcomings), since it has some issues and some limitations (720p mirroring).

It doesn't make sense that they'd say mirroring is limited to 720p though and video output is 1080p. If the limit is just to do with the video conversion, why would video playback be different?

I had thought it might be to do with the high resolution of the iPad display but the old setup can take a Retina framebuffer (2048x1536), convert it and send it down 1080p HDMI without lag and this one can't. Once they get the 1080p frame to send out, why would it differ from video playback?
post #67 of 80

Ok, so for those that want the actual FACTS and not more useless assumptions/guesses (As AppleInsider likes to do), the below is from an Apple employee. 

 

I reiterate my previous statement, which is that this IS something in Apple's DNA, and looks like a brilliant, flexible, robust, and innovative solution, even with all the small-minded, short-sighted bitching by people who have no clue what they're talking about and no insight into what's required to move things forward. The people who developed this stuff are infinitely smarter than the people mindlessly attacking it.

 

 

 

Quote:

Airplay is not involved in the operation of this adapter.

It is true that the kernel the adapter SoC boots is based off of XNU, but that’s where the similarities between iOS and the adapter firmware end. The firmware environment doesn’t even run launchd. There’s no shell in the image, there’s no utilities (analogous to what we used to call the “BSD Subsystem” in Mac OS X). It boots straight into a daemon designed to accept incoming data from the host device, decode that data stream, and output it through the A/V connectors. There’s a set of kernel modules that handle the low level data transfer and HDMI output, but that’s about it. I wish I could offer more details then this but I’m posting as AC for a damned good reason.

The reason why this adapter exists is because Lightning is simply not capable of streaming a “raw” HDMI signal across the cable. Lightning is a serial bus. There is no clever wire multiplexing involved. Contrary to the opinions presented in this thread, we didn’t do this to screw the customer. We did this to specifically shift the complexity of the “adapter” bit into the adapter itself, leaving the host hardware free of any concerns in regards to what was hanging off the other end of the Lightning cable. If you wanted to produce a Lightning adapter that offered something like a GPIB port (don’t laugh, I know some guys doing exactly this) on the other end, then the only support you need to implement on the iDevice is in software- not hardware. The GPIB adapter contains all the relevant Lightning -> GPIB circuitry.

It’s vastly the same thing with the HDMI adapter. Lightning doesn’t have anything to do with HDMI at all. Again, it’s just a high speed serial interface. Airplay uses a bunch of hardware h264 encoding technology that we’ve already got access to, so what happens here is that we use the same hardware to encode an output stream on the fly and fire it down the Lightning cable straight into the ARM SoC the guys at Panic discovered. Airplay itself (the network protocol) is NOT involved in this process. The encoded data is transferred as packetized data across the Lightning bus, where it is decoded by the ARM SoC and pushed out over HDMI.

This system essentially allows us to output to any device on the planet, irregardless of the endpoint bus (HDMI, DisplayPort, and any future inventions) by simply producing the relevant adapter that plugs into the Lightning port. Since the iOS device doesn’t care about the hardware hanging off the other end, you don’t need a new iPad or iPhone when a new A/V connector hits the market.

Certain people are aware that the quality could be better and others are working on it. For the time being, the quality was deemed to be suitably acceptable. Given the dynamic nature of the system (and the fact that the firmware is stored in RAM rather then ROM), updates **will** be made available as a part of future iOS updates. When this will happen I can’t say for anonymous reasons, but these concerns haven’t gone unnoticed.
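Taken at face value, the data path that description implies would look something like the sketch below. This is only a conceptual model; the function names, packet format, and codec stubs are invented for illustration and are not Apple's code.

```python
# Conceptual sketch of "encode on the host, decode in the adapter" -- NOT
# real Apple code. All names and formats here are invented for illustration.

from dataclasses import dataclass

@dataclass
class Packet:
    seq: int
    payload: bytes            # a chunk of the encoded (e.g. H.264) stream

# --- stand-ins so the sketch runs; the real work is done in hardware ---
def encode_frame(frame: bytes) -> bytes:      # host's hardware video encoder
    return frame
def decode_payload(payload: bytes) -> bytes:  # adapter SoC's decoder
    return payload
def hdmi_out(frame: bytes) -> None:           # adapter's HDMI transmitter
    print(f"HDMI frame out: {len(frame)} bytes")

def host_side(frames):
    """iOS device: encode each frame, packetize it for the serial bus."""
    for seq, frame in enumerate(frames):
        yield Packet(seq=seq, payload=encode_frame(frame))

def adapter_side(packets):
    """Adapter SoC: reassemble, decode, and drive HDMI."""
    for pkt in packets:
        hdmi_out(decode_payload(pkt.payload))

# The host only ever speaks "packets over a serial bus"; everything
# HDMI-specific lives in the adapter, so a future DisplayPort (or other)
# adapter needs new adapter firmware, not new host hardware.
adapter_side(host_side([bytes(16)] * 3))
```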

post #68 of 80
Quote:
Originally Posted by Tallest Skil View Post

 

So $85 for a totally different solution instead of $49. 

 

In what way didn't the cable work as advertised, since you didn't buy it in the first place?

Well, Mr. Smartypants, for my purposes I paid $25 (not $35) more for the Apple TV and got substantially more functionality than I would have gotten from the cable. In addition to not having to mess around with cables, I also got all the nice features of an Apple TV!

 

The cable I initially bought was perfectly good and functioned as advertised. It's the cable that is the subject of this article that I was speaking of as not working as advertised. Obviously it's feeding subsampled video to the display instead of straight digital video, which Apple doesn't mention. The cable may solve a problem and be very clever, but the kludge is symptomatic of less than optimal planning on the part of Apple. If this isn't clear to you, you have my pity.


Edited by DESuserIGN - 3/3/13 at 8:57pm
post #69 of 80
Originally Posted by DESuserIGN View Post
Well, Mr. Smartypants, for my purposes I paid $25 (not $35) more for the Apple TV

 

I used current prices from the Apple Stores.


It's the cable that is the subject of this article that I was speaking of as not working as advertised. Obviously it's feeding subsampled video to the display instead of straight digital video, which Apple doesn't mention.

 

How can you know that when you don't own it? That's what's unclear to me.

Originally posted by Relic

...those little naked weirdos are going to get me investigated.
post #70 of 80
Quote:
Originally Posted by Tallest Skil View Post

How can you know that when you don't own it? That's what's unclear to me.

 

 

http://www.panic.com/blog/2013/03/the-lightning-digital-av-adapter-surprise/

 

If it's subsampled, it's subsampled. Do I need to own the cable to know that Apple doesn't mention that? Can't I just look at Apple's website? Does ownership somehow signify knowledge/facts to you?

post #71 of 80
Quote:
Originally Posted by SolipsismX View Post

PS: The Apple TV currently doesn't allow an ad-hoc connection from iDevice (or Mac/PC) to it. You must traverse via a router. Routers intrinsically add processing overhead to the path which also introduce latency. It would be great if you could connect directly (think of boardrooms and classrooms) so I hope they add that option soon.
Not true. You can in fact stream from an iPhone to the Apple TV via AirPlay, without the need of an additional router. Just turn on the iPhone's personal hotspot function, and it serves as the connection point.
post #72 of 80
Quote:
Originally Posted by Slurpy View Post

Ok, so for those that want the actual FACTS and not more useless assumptions/guesses (As AppleInsider likes to do), the below is from an Apple employee. 

I reiterate my previous statement, which is that this IS something in Apple's DNA, and looks like a brilliant, flexible, robust, and innovative solution, even with all the small-minded, short-sighted bitching by people who have no clue what they're talking about and no insight into what's required to move things forward. The people who developed this stuff are infinitely smarter than the people mindlessly attacking it.


Quote:

Airplay is not involved in the operation of this adapter.



It is true that the kernel the adapter SoC boots is based off of XNU, but that’s where the similarities between iOS and the adapter firmware end. The firmware environment doesn’t even run launchd. There’s no shell in the image, there’s no utilities (analogous to what we used to call the “BSD Subsystem” in Mac OS X). It boots straight into a daemon designed to accept incoming data from the host device, decode that data stream, and output it through the A/V connectors. There’s a set of kernel modules that handle the low level data transfer and HDMI output, but that’s about it. I wish I could offer more details then this but I’m posting as AC for a damned good reason.



The reason why this adapter exists is because Lightning is simply not capable of streaming a “raw” HDMI signal across the cable. Lightning is a serial bus. There is no clever wire multiplexing involved. Contrary to the opinions presented in this thread, we didn’t do this to screw the customer. We did this to specifically shift the complexity of the “adapter” bit into the adapter itself, leaving the host hardware free of any concerns in regards to what was hanging off the other end of the Lightning cable. If you wanted to produce a Lightning adapter that offered something like a GPIB port (don’t laugh, I know some guys doing exactly this) on the other end, then the only support you need to implement on the iDevice is in software- not hardware. The GPIB adapter contains all the relevant Lightning -> GPIB circuitry.



It’s vastly the same thing with the HDMI adapter. Lightning doesn’t have anything to do with HDMI at all. Again, it’s just a high speed serial interface. Airplay uses a bunch of hardware h264 encoding technology that we’ve already got access to, so what happens here is that we use the same hardware to encode an output stream on the fly and fire it down the Lightning cable straight into the ARM SoC the guys at Panic discovered. Airplay itself (the network protocol) is NOT involved in this process. The encoded data is transferred as packetized data across the Lightning bus, where it is decoded by the ARM SoC and pushed out over HDMI.



This system essentially allows us to output to any device on the planet, irregardless of the endpoint bus (HDMI, DisplayPort, and any future inventions) by simply producing the relevant adapter that plugs into the Lightning port. Since the iOS device doesn’t care about the hardware hanging off the other end, you don’t need a new iPad or iPhone when a new A/V connector hits the market.



Certain people are aware that the quality could be better and others are working on it. For the time being, the quality was deemed to be suitably acceptable. Given the dynamic nature of the system (and the fact that the firmware is stored in RAM rather then ROM), updates **will** be made available as a part of future iOS updates. When this will happen I can’t say for anonymous reasons, but these concerns haven’t gone unnoticed.



That's all fine and dandy, but "irregardless" is not a word.
post #73 of 80

I wonder what Charlie Brooker thinks of the new adapter. This is what he thought of the last one...

 

post #74 of 80
Quote:
Originally Posted by DESuserIGN 
Obviously it's feeding subsampled video to the display instead of straight digital video, which Apple doesn't mention. The cable may solve a problem and be very clever, but the kludge is symptomatic of less than optimal planning on the part of Apple.

The output is definitely downgraded at least for the time being:

http://www.journaldulapin.com/2013/03/03/lightning-adapter-to-hdmi-a-real-quality-issue/

It's good of them to go with a flexible solution that allows any type of connector, but if the Lightning port is fast enough, you'd think a lossless 1080p frame would have been possible. The adaptor design also means there will probably always be lag, as there has to be a conversion step, so gaming output won't be as good - Elgato claims their USB video device is lag-free so maybe they can work around it. There will be a bandwidth limit somewhere in the Lightning port so it's not going to be future-proof forever.

A whole load of peripherals should be possible but they will come with a higher price and complexity in making them. There's the software and hardware approval process too. Perhaps someone will even come up with an Ethernet port adaptor.

There seems to have been a limit on flash drives and USB storage drives before, so potentially what someone could do is make an adaptor that acts as a host to a storage drive and presents it to iOS as something more compatible, like, say, a WebDAV server.

Maybe the problem with the HDMI adaptor output is just to do with how quickly the iPad can encode 1080p. It might not have been doing any conversion before. Video playback is already encoded at up to 1080p. FaceTime is limited to 720p. The cameras can record 1080p but don't have to be real-time. If this is the case, faster iPad/iPhone GPUs will be able to handle the higher-resolution frames. #plannedobsolescence
post #75 of 80

Wow. Everyone is basing all of this chatter on the conjecture of a reverse engineering effort? Mkay.

post #76 of 80
Quote:
Originally Posted by stike vomit View Post

I wonder what Charlie Brooker thinks of the new adapter. This is what he thought of the last one...



LOL ... Pretty funny actually, love the 'unplugging the incubator to charge the iPad' joke best (real English humor at its best there).

P.S. No doubt he is now a paid expert at Scamsung after that (or perhaps even before ...)
Enjoying the new Mac Pro ... it's smokin'
Been using Apple since Apple ][ - Long on AAPL so biased
nMac Pro 6 Core, MacBookPro i7, MacBookPro i5, iPhones 5 and 5s, iPad Air, 2013 Mac mini.
post #77 of 80
Could be a genius way to better control DRM, stop copies and justify a price. Plus, by the sound of it, they can sell a mk2 with better picture quality at a later date.
post #78 of 80
Quote:
Originally Posted by gordy View Post

Wow. Everyone is basing all of this chatter on the conjecture of a reverse engineering effort? Mkay.

While everything may not be understood, engineering is applied science. And "reverse engineering" is applying science forensically. While the embedded ARM chip could do a variety of things, currently it is subsampling. There's no real doubt about that.

In some ways it's a great solution, but intrinsically it's rather unsatisfying in its complex, kludgy inelegance.

post #79 of 80
Well, this supports the thought of there being no native video out on the Lightning connector.
post #80 of 80
Quote:
Originally Posted by digitalclips View Post

LOL ... Pretty funny actually, love the 'unplugging the incubator to charge the iPad' joke best (real English humor at its best there).

P.S. No doubt he is now a paid expert at Scamsung after that (or perhaps even before ...)

He has a new series called Charlie Brooker's Weekly Wipe, which airs on Thursdays on BBC Two, where he talks about the week's events.

This is my favourite clip from Brooker…

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply