Apple unlikely to utilize custom iPhone baseband chip until 2015 at earliest, JP Morgan says

Posted in iPhone, edited April 2014
If Apple does bring baseband processor design and production in-house as recent moves have indicated, one analyst believes that the chips are unlikely to debut in new iPhone models before 2015, thanks to the "notoriously difficult" nature of their development.


iPhone 5s logic board with Qualcomm baseband chipset. | Source: iFixit


In a Thursday morning note to investors, obtained by AppleInsider, JP Morgan analyst Rod Hall pointed to silicon firm Broadcom's recent struggle to produce an LTE modem of its own as evidence of the uphill battle awaiting Apple. Broadcom is one of the companies from which Apple has hired away a number of baseband hardware and software engineers in recent months.

Apple's choice to produce its own wireless modems would likely be motivated in part by a desire for increased power efficiency, Hall believes. Apple's current logic board designs utilize a baseband chip that is separate from the company's A-series application processors, and the company may be looking for ways to integrate the two chips into a single package.

Qualcomm, Apple's current baseband vendor, has done just that with its Snapdragon processors, and Hall believes the company would likely be open to a licensing arrangement that would allow Apple to integrate Qualcomm baseband IP on A-series cores. Such an arrangement would be beneficial to Qualcomm, as Apple is believed to have accounted for approximately one quarter of Qualcomm's 2012 revenues and losing that business would represent a significant financial hardship.

Despite the challenges, Hall believes that Apple has the internal know-how, and the ability to attract the necessary talent, to successfully develop its own modem technology, as evidenced by the success of the A-series processors. Apple is thought to be at least one year ahead of Qualcomm on that front, thanks to the "desktop class" A7 chip that powers the iPhone 5s, iPad Air, and iPad mini with Retina display.

Comments

  • Reply 1 of 15
bighype Posts: 148 member

    99.9% of these analysts are clueless fools who just say nonsense to make themselves look important. If you look back, you can see that NOTHING THEY EVER SAY is true. It's amazing how terrible their record is.

     

    And it's sad that sites reprint this nonsense. 

  • Reply 2 of 15
saarek Posts: 1,133 member
Well, if anyone can do it, Apple can. They surprised everyone with the 64-bit A7.
  • Reply 3 of 15
slurpy Posts: 5,154 member
    "...pointed to silicon firm Broadcom's recent struggle to produce an LTE modem of its own as evidence of the uphill battle awaiting Apple."

This same fucking line is stated every single time Apple gets into something new. UPHILL BATTLE!! DOOMED!! No shit, Sherlock.
  • Reply 4 of 15
    libertyforall

    Well when you have geniuses at your baseband supplier, ahem Qualcomm, say boneheaded comments like 64-bit is all marketing fluff, I can see another reason Apple would want to roll their own!

     

    http://www.zdnet.com/iphone-5s-64-bit-a7-chip-is-a-marketing-gimmick-says-qualcomm-exec-7000021590/

  • Reply 5 of 15
richl Posts: 2,213 member
    Quote:
    Originally Posted by libertyforall View Post

     

    Well when you have geniuses at your baseband supplier, ahem Qualcomm, say boneheaded comments like 64-bit is all marketing fluff, I can see another reason Apple would want to roll their own!

     

    http://www.zdnet.com/iphone-5s-64-bit-a7-chip-is-a-marketing-gimmick-says-qualcomm-exec-7000021590/


     

    The general consensus is that 64-bit has limited use at the moment. ARM64 is useful, 64-bit less so.

     

    Qualcomm employ a lot of smart people. They're the best in the industry and charge accordingly.

  • Reply 6 of 15
    Quote:

    Originally Posted by RichL View Post

     

     

    The general consensus is that 64-bit has limited use at the moment. ARM64 is useful, 64-bit less so.

     

    Qualcomm employ a lot of smart people. They're the best in the industry and charge accordingly.


     

    Anand Shimpi did an in-depth analysis of the Cyclone cores: 

    http://www.anandtech.com/show/7910/apples-cyclone-microarchitecture-detailed

     

Apple seem to have done a lot more than simply moving to a 64-bit architecture.


  • Reply 7 of 15
    richl wrote: »
    libertyforall wrote: »
    Well when you have geniuses at your baseband supplier, ahem Qualcomm, say boneheaded comments like 64-bit is all marketing fluff, I can see another reason Apple would want to roll their own!

    http://www.zdnet.com/iphone-5s-64-bit-a7-chip-is-a-marketing-gimmick-says-qualcomm-exec-7000021590/

    The general consensus is that 64-bit has limited use at the moment. ARM64 is useful, 64-bit less so.

    "64-bit has limited use at the moment. ARM64 is useful, 64-bit less so."

    Do you think this usefulness will change for the next iDevices?

    If not, why do you think Apple made this major step?

    Apparently, Apple has/had a 2-year lead over the ARM APU competition. Do you think that Apple will exploit this advantage? How?
  • Reply 8 of 15
    Quote:

    Originally Posted by bighype View Post

     

    99.9% of these analysts are clueless fools who just say nonsense to make themselves look important. If you look back, you can see that NOTHING THEY EVER SAY is true. It's amazing how terrible their record is.

     

    And it's sad that sites reprint this nonsense. 


But like a broken clock, they're right a couple of times a day.

     

    Common sense would tell me that all products that are in the pipeline for 2014 are pretty much specced down to the chips at the moment, and any recent hires that have particular skills will not have any significant input this year.

  • Reply 9 of 15
    Quote:

    Originally Posted by libertyforall View Post

     

    Well when you have geniuses at your baseband supplier, ahem Qualcomm, say boneheaded comments like 64-bit is all marketing fluff, I can see another reason Apple would want to roll their own!

     

    http://www.zdnet.com/iphone-5s-64-bit-a7-chip-is-a-marketing-gimmick-says-qualcomm-exec-7000021590/


That was a marketing exec. When all you do is make marketing claims, then everything you hear from competitors is marketing claims.

  • Reply 10 of 15
solipsismx Posts: 19,566 member
    richl wrote: »
    The general consensus is that 64-bit has limited use at the moment. ARM64 is useful, 64-bit less so.

    Qualcomm employ a lot of smart people. They're the best in the industry and charge accordingly.

    I'm not sure why you're separating out 64-bit ARM and "ARM64" since you can't use the AArch64 (aka ARM64) ISA without moving to a 64-bit ARM architecture. The article you link to is 1) not a general consensus but the opinion of one very informed person, and 2) he states unequivocally that moving to AArch64 does help with performance and efficiency, especially with Obj-C.
    Mike Ash wrote:
    In short, the improvements to Apple's runtime make it so that object allocation in 64-bit mode costs only 40-50% of what it does in 32-bit mode. If your app creates and destroys a lot of objects, that's a big deal.

    Apple took advantage of the transition to make some changes of their own. The biggest change is an inline retain count, which eliminates the need to perform a costly hash table lookup for retain and release operations in the common case. Since those operations are so common in most Objective-C code, this is a big win. Per-object resource cleanup flags make object deallocation quite a bit faster in certain cases. All in all, the cost of creating and destroying an object is roughly cut in half. Tagged pointers also make for a nice performance win as well as reduced memory use.

    ARM64 is a welcome addition to Apple's hardware. We all knew it would happen eventually, but few expected it this soon. It's here now, and it's great.
  • Reply 11 of 15
davemcm76 Posts: 265 member
    If Apple does bring baseband processor design and production in-house as recent moves have indicated, one analyst believes that the chips are unlikely to debut in new iPhone models before 2015, thanks to the "notoriously difficult" nature of their development.

    And here was me fully expecting a 5 month turnaround following the new hires and for them to be in the iPhone 6 later this year... /s

    More from the land of stating the bleeding obvious coming up at 10 - stay tuned folks.
  • Reply 12 of 15
snova Posts: 1,281 member

Silly me. All this time I thought the power drain of the baseband was due to the analog amplifier used to drive the antenna, not the digital-side circuitry. Boy, these analysts are super smart. Somehow they have figured out how to reduce the amount of energy required to send and receive a signal over the air via the antenna by integrating it into the SoC. Wow.

  • Reply 13 of 15
quinney Posts: 2,525 member
    snova wrote: »
Silly me. All this time I thought the power drain of the baseband was due to the analog amplifier used to drive the antenna, not the digital-side circuitry. Boy, these analysts are super smart. Somehow they have figured out how to reduce the amount of energy required to send and receive a signal over the air via the antenna by integrating it into the SoC. Wow.

    That's why they get paid the big bucks. :p
  • Reply 14 of 15
frac Posts: 480 member
OK... so what is the 'great' difficulty in building their own LTE baseband chip? Is it a lack of IP in that area? What did Apple's purchase of Nortel's 4G patents give them?
  • Reply 15 of 15
mde24 Posts: 27 member

    OK, so Apple have put a lot of effort into building a new ultra-low-power coprocessor (M7) for the latest round of devices so that the relatively high-power application processor (A7) can be shut down more of the time.

     

So... the analysts reckon they're going to build the always-on baseband into the A(x) (which also powers WiFi-only iPads), which means the hungry application processor is always running. Genius. Building two versions of the M(x), one with cellular and one without, is plausible, but into the A-series: never.

     

    Where do they find these people?
