bkkcanuck

About

Username: bkkcanuck
Joined:
Last Active:
Visits: 35
Roles: member
Points: 702
Badges: 1
Posts: 864
  • Mac mini: What we want to see in an update to Apple's low-cost desktop

    If you have Thunderbolt then you can upgrade graphics. You do not need to use the internal one. Internal PCI? Nobody goes that way anymore. Just buy an eGPU with fast Thunderbolt transfers.
    NO!  Hooking up an eGPU is great if you have no other option, such as when all you have is a laptop.

    Thunderbolt has overhead, Thunderbolt is limited to 4 PCIe lanes, and Thunderbolt does not have the full bandwidth that a top-of-the-line graphics card requires.

    There is a reason graphics cards are recommended to go in the slot closest to the CPU, where they use 16 PCIe lanes without the overhead of the Thunderbolt PCIe protocol.

    The more powerful the graphics card, the greater the impact.

    When you get to the top end of the graphics card range you are basically throwing away a good 30% of the performance.  The price difference between that card and a lesser card with 30% less performance is hundreds of dollars.

    Thunderbolt is basically taking 4 PCIe lanes, running them along a wire, into another box that has PCIe slots, another power supply, etc.  You are talking about hundreds of dollars wasted on an eGPU box, a couple hundred dollars of lost performance on the graphics card, and the $100 for the Thunderbolt cable.

    Talk about a waste!
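
    A rough back-of-the-envelope sketch in Python of that bandwidth gap, assuming roughly 1 GB/s of usable throughput per PCIe 3.0 lane; the 20% Thunderbolt protocol-overhead figure is only an illustrative guess, and real losses vary by card and workload:

        # Rough comparison of GPU link bandwidth: internal x16 slot vs. eGPU over Thunderbolt 3.
        # ~1 GB/s per PCIe 3.0 lane; the 20% overhead figure is an assumption for illustration.
        PCIE3_GB_PER_LANE = 1.0

        internal_x16 = 16 * PCIE3_GB_PER_LANE      # card in the slot closest to the CPU
        tb3_x4_raw   = 4 * PCIE3_GB_PER_LANE       # the 4 lanes feeding Thunderbolt
        tb3_usable   = tb3_x4_raw * 0.8            # assume ~20% lost to protocol overhead

        print(f"Internal x16 slot : {internal_x16:.1f} GB/s")
        print(f"eGPU over TB3     : {tb3_usable:.1f} GB/s")
        print(f"Fraction of x16   : {tb3_usable / internal_x16:.0%}")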
    GeorgeBMac
  • Intel's first 10nm 'Cannon Lake' processor with 32GB LPDDR4 RAM support ships

    MacPro said:
    This explains the sudden massive discounts on MacBook Pro models reported on AI these last few days, I suspect.  Well, I only want a MBP for casual use, so I grabbed a MBP with Touch Bar from Adorama thanks to the AI advertorial and have no regrets -- heck of a saving.

    Will Apple use the Radeon RX Vega M GH Graphics GPUs on the next generation MBPs I wonder?
    It is highly unlikely that this chip is the reason.  At most, we might see a small bump with a revision as another stop-gap measure until Intel gets its act together.

    Even if Intel is able to churn out chips in limited quantity, the ones that Apple needs are typically the ones with the largest die size (usually the high-end graphics variant, which takes up considerable die area).  If there is a high reject ratio, then the larger the chip, the larger (exponentially) the number of rejects -- at that point the cost to produce the chip is more than what they can sell it for.  It still seems like Intel is at the point where Cannon Lake for Apple is far away -- possibly even into 2020.
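
    To illustrate why bigger dies get hit so much harder by defects, here is a toy Python sketch using the simple Poisson yield model (yield = exp(-defect density x die area)); the defect density and die areas are made-up numbers for illustration, not Intel's actual figures:

        import math

        # Toy Poisson yield model: yield = exp(-defect_density * die_area).
        # The defect density and die areas are illustrative assumptions only.
        defects_per_cm2 = 0.5

        for name, area_cm2 in [("small die", 1.0), ("large die with big GPU", 2.0)]:
            good = math.exp(-defects_per_cm2 * area_cm2)
            print(f"{name:24s} yield ~ {good:.0%}, rejects ~ {1 - good:.0%}")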

     Soli said:
    1) I wonder what the price bump is for an extra 16GiB of RAM.

    2) I also wonder if Apple's MBP line will offer both 16GiB and 32GiB options, or just jump to 32GiB for all models the way they now only offer 16GiB. My guess is they'll offer both capacities.
    It is irrelevant to speculate on the price of the extra 16GB of RAM since it is so far off in the future (unfortunately) that RAM prices -- right now -- give us no clue on the cost to Apple for that same memory.  It could be the same price as now, it could be half, it could even be more.

    Apple could put 32GB in right now if they were to use DDR4 instead of LPDDR3 -- but they won't.  The drain on the battery from changing the first 16GB to DDR4 and then adding another 16GB of DDR4 would be significant, so we have to wait for Intel.  It is not that DDR4 consumes more power when active, or more power when inactive -- it most likely does not.  The difference is that if DDR4 memory is active, it will not switch down into low-power mode unless it is unused for a significant amount of time, whereas LPDDR3/4 switches from active to inactive almost instantaneously by comparison.  Since the OS (even with the optimization Apple did several versions ago) is constantly starting up and scheduling stuff to run in the background, LPDDR3 would go active, inactive, active, etc.  DDR4 would more likely stay active and not have enough idle time to switch down to inactive... hence it draws significantly more power just by being installed in the computer.  For the life of me, I don't know why Intel did not do an architecture update to support LPDDR4, since they obviously have been unable to master the die shrink... maybe they keep thinking "just a little more, not worth it"... yet it has been years now.
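
    A toy Python sketch of that duty-cycle argument; the wattages and idle fractions below are made-up numbers for illustration, not measured LPDDR3/DDR4 figures:

        # Toy duty-cycle model of DRAM power draw.  All numbers are
        # illustrative assumptions, not measured LPDDR3/DDR4 figures.
        ACTIVE_W, IDLE_W = 1.5, 0.1

        def avg_power(idle_fraction):
            """Average draw given how much time the DRAM actually spends in its low-power state."""
            return (1 - idle_fraction) * ACTIVE_W + idle_fraction * IDLE_W

        # LPDDR drops to idle almost instantly between background wake-ups;
        # DDR4 needs a long quiet window, so it spends far less time idle.
        print("LPDDR-like:", avg_power(0.90), "W")   # ~0.24 W
        print("DDR4-like :", avg_power(0.20), "W")   # ~1.22 W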
     


    baconstang bestkeptsecret chasm watto_cobra
  • Mac mini: What we want to see in an update to Apple's low-cost desktop

    nht said:
    mike54 said:

    I think Apple has lost many macOS sales. Most just want a reasonably priced headless mac with decent specs in which the drive and ram can be easily user replaceable. I don't think that's too difficult to do.

    Apple doesn't sell macOS so zero sales lost.  It provides macOS to sell highly profitable machines.
    They have lost sales of macOS ... and the machine itself... to HP workstations in many cases for video production because of the mess.  The reason for pre-announcing so far in advance is that the exodus was becoming a greater issue (and Apple finally decided they want to hang onto the market -- one Microsoft now seems interested in catering to).   When you work all day in one or two applications... the OS is not as big a deal...  (except if you are dead set on using Final Cut).   If there were not a coming exodus, Apple would not have felt the need to pre-announce so far in advance.
    argonaut
  • Apple modular Mac Pro launch coming in 2019, new engineering group formed to guarantee fut...

    cgWerks said:

    I meant that in regards to the mentioned "128GBps external pci expansion chassis"... that isn't possible for the Mac, unless you could bond multiple TB ports. You're just meaning it's being done in other industries, right?

    PCI Express 3.0's 8 GT/s bit rate effectively delivers 1 GB/s per lane.

    PCI Express 4.0's 16 GT/s bit rate effectively delivers 2 GB/s per lane.

    PCI Express 5.0's 32 GT/s bit rate effectively delivers 4 GB/s per lane.

    The maximum expansion your average consumer-grade CPU offers is 24 PCIe lanes (I think 4 go to the chipset, so 20 net).  Your average Xeon workstation chip gives you 44 PCIe lanes -- for a dual-CPU Xeon that effectively doubles, as it is per CPU.  The benefits of Xeon over consumer CPUs are more bandwidth throughput and ECC memory (among others).  Thunderbolt-only expansion effectively eliminates the first benefit.

    Thunderbolt uses either 2 or 4 lanes depending on the implementation.  

    So the entire bandwidth a consumer-grade CPU can currently handle is theoretically 20GB/s (20 lanes x 1GB/s on PCIe 3.0).  A Xeon workstation chip with 44 lanes would be 44GB/s.

    The maximum that 4 lanes to the CPU would give you is about 4GB/s (roughly 32Gbps).
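
    A quick Python sketch of that lane arithmetic, using the PCIe 3.0 per-lane rate listed above and the 20/44/4 lane counts quoted here (real-world throughput will be a bit lower after protocol overhead):

        # Total theoretical bandwidth = lane count x per-lane rate.
        # PCIe 3.0 effectively delivers ~1 GB/s per lane (from the figures above).
        GB_PER_LANE_PCIE3 = 1

        lanes = {"consumer CPU (net)": 20, "Xeon workstation": 44, "Thunderbolt link (x4)": 4}
        for name, count in lanes.items():
            print(f"{name:22s}: {count * GB_PER_LANE_PCIE3} GB/s theoretical")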

    ---

    PCI Express 4.0 will bring with it OCuLink version 2 (an alternative to Thunderbolt), which will have up to 16 GT/s per lane (8 GB/s total for x4 lanes).

    A graphics card (if you have one) typically slots into an x16 slot -- if you put it in an x4 slot you would be bottlenecking the graphics card (assuming it ran at all).

    ---

    For high-bandwidth expansion, PCIe provides the best bandwidth and the cheapest cost of expansion for the foreseeable future.

    cgWerks
  • The 2019 Mac Pro will be what Apple wants it to be, and it won't, and shouldn't, make ever...

    k2kw said:
    danvm said:
    k2kw said:
    Soli said:

    Bottom line: Apple owes you nothing over what the purchase agreement included, just as you owe them nothing but the cost of the machine that you choose to buy.

    Apple's success was built on the long-term commitment of its Macintosh user base and, before that, the Apple II base. Without this commitment, it most likely would have gone extinct sometime in the 1990s.

    The expectation from the committed user base is that Apple keep investing in them and their needs, and not just deliver a point in time box like most PC manufacturers do.

    Apple used to deliver products that in sum comprised an ecosystem that their committed user base could live happily in, and Steve Jobs in particular understood how important this is for the longevity and continued success of the company.  

    Under Tim Cook, Apple has started peeling away components of the ecosystem: removing items such as screens, not upgrading network components or TM capabilities and capacity, lobotomizing the server software, removing server configurations, not refreshing existing systems for literally years, and removing the ability for users to add and replace components in their systems -- such as disk, memory, GPU, and battery -- as those technologies get cheaper and more capable over the lifetime of the system. Now most configurations are frozen in time, while before they could evolve and serve the user better over their lifetime.

    Their obsession with anorexic thinness produces systems that cannot be fully utilized or expanded because there is no more room in the thermal envelope for faster components (we see this both in the trashcan and the new iMac Pro), or connecting the system for use in a real-world situation leaves the user with dongle and docking-station hell, or a bunch of additional box clutter on their desktops (MacBook Pro, Mac Pro, iMac Pro). Just to illustrate the madness of the situation, do a quick search for "rack mount for Mac Pro" and have a good laugh!

     
    Some would say this:

    "Apple's survival was built on the long-term commitment of its Macintosh user base during the '90s before Jobs returned.
    Apple's long-term success began by extending its "It Just Works" ethos into consumer products, beginning with the iPod.
    As its success grew with multiple smart mobile products (iPhone and iPad), Apple dropped 'Computer' from its name to reflect its mainstream consumer emphasis.
    With new priorities, Apple dropped some existing products like the AirPort Extreme & Express and (temporarily) the Mac Pro."

    Despite the iOS business being much larger, hopefully they will put significant resources into their pro computers with this new group.
    This is also why the new hire of the Google AI guy is so good.

    Lots of Windows programmers like the "It Just Works" (and works better) philosophy of Macs when we come home.
    (MS is re-inventing itself as a cloud company) because they know they will eventually lose the Wintel monopoly they've had on desktops.   Hopefully it is to iOS and not Chrome.



    MS is moving its priorities to the cloud, not because they are losing the desktop market, but because their cloud businesses are growing quickly.  The lead they have in the desktop market is still huge, and now they are expanding to ARM processors to keep / expand their lead.  And it looks like it will be a long time until we see something making a difference in the desktop market.

    On the article, I noticed that the issue Apple has is giving options to users.  For example, HP has four models of desktop workstations, starting as small as the Z2 Mini (it's as small as a Mac mini), one with a single CPU and two with dual CPUs.  They made sure entry-level and high-end users were covered.  That's something Apple is missing.  Personally I think the current Mac Pro is an excellent option for many users, but not all of them.  They should have added the Mac Pro to their workstation line, and not replaced the "cheese grater" model, as they did.
    I've had the Cherry Trail Surface 3 with 4GB RAM, a 128GB SSD, and LTE.   It worked but was extremely slow.   The only nice thing was the kickstand, which was nice to have lying in bed. Because of that S3 I'm doubtful about the Qualcomm 845 giving decent performance.    I have a new Surface Pro i5 for work. It's nice as a light laptop but Windows is still subpar as a tablet.   I expect the 845 to be slower than the i5.
    I was watching Leo Laporte play around with one of the new ARM/x86 emulation devices -- and the performance for anything that was actually being emulated (i.e. not ARM code) was very, very bad.  I use a MacBook, which for a lot of stuff is actually quite reasonable -- not everything I do needs big horsepower (some dev work, web, video, spreadsheets, documents, editing, compiling, etc.) -- for all that it works fine.  I watched, and although he was saying it was "sluggish but workable" -- I would have thrown it against the wall in 5 minutes at most... it was not.  The x86 emulation just is not doable on that machine in any reasonably usable way.
    cornchip