KITA

About

Username
KITA
Joined
Visits
127
Last Active
Roles
member
Points
1,479
Badges
0
Posts
410
  • Missouri school touts success with iPad Pro curriculum, saves nearly $600K annually on har...

    genovelle said:
    KITA said:
    foggyhill said:
    But, but, but, but... I love crap, I love Chromebooks,
    I love saving $50 up front, paying $200 on the back end, and getting less.
    Where would we be without the craptastic in our lives?

    (yes, this is sarcasm).
    Why the hate for Chromebooks? They seem to be very successful for K12.

    I think the implementation this school has done with their iPads is interesting, but in university, an actual macOS or Windows laptop would still be required.

    I can't imagine an engineering student at this school using an iPad Pro as their only device. This is one area a 2-in-1 computer, such as Microsoft's Surface, would be very well suited for (take notes in OneNote, type on a real keyboard in Word, open up a model in SolidWorks, etc.).
    Because they are crap 💩, just like the netbooks they replaced.

    There are hundreds of better accessory keyboards available for an iPad than for a Surface.

    There is a reason these devices, despite all the ads and heavy placement, do not sell at any real scale after 6 years of trying. For instance, Microsoft is estimated to have sold a whopping 1.5 million its first year, topping out at 6 million 2 years ago. For a little perspective, the original iPad sold its first million in 28 days, bettering the original iPhone's 74 days. So, basically, Microsoft may finally pass, in 6 years of estimated cumulative sales, the 20 million mark that the iPad achieved in its first year on the market, while Apple was already at 360 million reported sales as of March of last year. It is just not selling because it is a niche product.

    You can call it that, but what makes a Chromebook crap?

    The best keyboard accessory for the iPad (the Brydge) is also made for the Surface Pro; neither comes even remotely close to the keyboard on a Surface Book.

    Since when was the Surface the only Windows 2-in-1?

    Rayz2016 said:
    KITA said:
    foggyhill said:
    But, but, but, but... I love crap, I love Chromebooks,
    I love saving $50 up front, paying $200 on the back end, and getting less.
    Where would we be without the craptastic in our lives?

    (yes, this is sarcasm).
    Why the hate for Chromebooks? They seem to be very successful for K12.

    I think the implementation this school has done with their iPads is interesting, but in university, an actual macOS or Windows laptop would still be required.

    I can't imagine an engineering student at this school using an iPad Pro as their only device. This is one area a 2-in-1 computer, such as Microsoft's Surface, would be very well suited for (take notes in OneNote, type on a real keyboard in Word, open up a model in SolidWorks, etc.).

    Even if this were the case (and I’m not sure it is), the real problem is that you seem to think that universities only run engineering courses. 
    I only pointed out that there are limitations for engineering students; at no point did I state that universities "only run engineering courses".
  • The 2019 Mac Pro will be what Apple wants it to be, and it won't, and shouldn't, make ever...

    cpelham said:
    I don’t see many people commenting about what is going to be needed, hardware- and software-wise, to create 4K/8K 360° VR, 3D, and AR content in the coming years. Cook has commented many times in recent years that this is where he thinks tech/content is going and that Apple wants to play a big part in creating and delivering it. From what I understand, we are going to need much more powerful machines, and I think they must be taking this extra year to try to design for this future, which may well require something other than Intel inside, and possibly faster buses and cables to drive, say, two 8K monitors.
    We have workstations with immense amounts of power now, but at a high cost.

    For example, NVIDIA's DGX Station:



    It brings to market a considerable boost in performance via its 2560 Tensor cores, which put out 500 TFLOPS of mixed-precision performance on top of the 60 TFLOPS of single precision from the CUDA cores. It even utilizes NVLink 2.0 for speeds much greater than PCIe. The same system was used to run the Unreal Engine / Star Wars real-time ray tracing demo (video).



    All of this comes at a cool price of $50,000.

    Bringing the cost down is the next step.
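A back-of-the-envelope check of the DGX Station figures quoted above, assuming NVIDIA's publicly stated per-GPU specs for the Tesla V100 (640 Tensor cores, ~125 TFLOPS mixed precision, ~15 TFLOPS FP32); the per-GPU numbers here are the assumption, the totals match the post:

```python
# Sketch: rebuild the DGX Station totals from assumed per-GPU V100 specs.
GPUS = 4                       # a DGX Station ships with four V100s

tensor_cores_per_gpu = 640     # assumed per-V100 Tensor core count
mixed_tflops_per_gpu = 125.0   # assumed Tensor-core mixed-precision TFLOPS
fp32_tflops_per_gpu = 15.0     # assumed CUDA-core single-precision TFLOPS

total_tensor_cores = GPUS * tensor_cores_per_gpu   # 2560, as quoted
total_mixed_tflops = GPUS * mixed_tflops_per_gpu   # 500 TFLOPS, as quoted
total_fp32_tflops = GPUS * fp32_tflops_per_gpu     # 60 TFLOPS, as quoted

print(total_tensor_cores, total_mixed_tflops, total_fp32_tflops)
```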
  • Apple modular Mac Pro launch coming in 2019, new engineering group formed to guarantee fut...

    KITA said:
    KITA said:
    KITA said:

    [...] The main components of your PC are tied together with a bus called PCIe. Your graphics card, your SSD communicate with the CPU over PCIe. Now consider extending that PCIe bus over a cable outside the case of your PC: this is Thunderbolt.
    Great in theory, except that Thunderbolt only manages a fraction of the speed an internal PCIe bus provides. It's not actually equivalent.

    According to your non-equivalence theory, all of those 4K and 5K Thunderbolt monitors are hoaxes, then, since they wouldn't be able to display 4K video as it is pumped out by the CPU/GPU?


    He's literally saying that PCIe 3.0 x16 offers 32 GB/s, while Thunderbolt 3 offers 5 GB/s. So an external GPU, for example, is limited to 5 GB/s (2.5 GB/s each way).
    How did you get those numbers?
    Thunderbolt 3 has a total bandwidth of 40 Gb/s; convert to GB/s and we get 5 GB/s.

    B = byte
    b = bit 
    8 bits = 1 byte
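The arithmetic above can be sketched directly; the 32 GB/s PCIe figure is the one used in this thread, included here only for contrast:

```python
# Unit conversion behind the 5 GB/s figure: Thunderbolt 3's 40 Gb/s
# link rate divided by 8 bits per byte.
BITS_PER_BYTE = 8

tb3_gbps = 40                          # Thunderbolt 3, gigabits per second
tb3_GBps = tb3_gbps / BITS_PER_BYTE    # -> 5.0 gigabytes per second

pcie_x16_GBps = 32                     # PCIe 3.0 x16 figure from the post above

print(tb3_GBps, pcie_x16_GBps / tb3_GBps)  # 5.0 GB/s, a 6.4x gap
```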
    Yes, that's it. Then what is the significance of those numbers in terms of GPU performance? What is the difference between using a graphics card internally in an x16 PCIe slot and using it in a Thunderbolt 3 eGPU enclosure? Is that difference big enough to justify using an internal slot? eGPU testing articles will certainly reveal that; I'm just asking without digging further.
    That's a bit of a loaded question as a lot depends on the setup.

    Here's a 3D benchmark with an older dGPU.

    PCIe x16 3.0 (external monitor):

    [removed]

    Thunderbolt 3 (external monitor):

    [snip]

    Of course, there are more scenarios than the above. For example, when using an eGPU with a laptop, you either connect an external monitor to the eGPU directly, or you use the laptop's internal display. When using the internal display, the bandwidth for the GPU over TB3 is cut even further to 2.5 GB/s and the performance will be even lower.

    The situation becomes even more complex for multi GPU setups, as even PCIe 3.0 x16 can be a bottleneck. As I pointed out above, NVIDIA, for their V100 GPUs, uses NVLink 2.0. In a four GPU configuration, each GPU has 200 GB/s of bandwidth over the NVLink.

    TL;DR - Thunderbolt 3 is not going to cut it.
    Of course there is a reason Apple soldered two GPUs onto the main board of the Mac Pro. It could have put in a decent single GPU and suggested eGPUs for the rest, but that is not the case. My point is, presenting the performance loss of an eGPU as a weakness of Thunderbolt doesn't do it justice. Despite all those performance losses, eGPUs are still recommended as long as they offer better performance than the internal GPU:
    https://9to5mac.com/2017/11/27/egpu-amd-rx-vega-64-macos-high-sierra-beta-gpu-video/

    The FPS in your screenshots goes from 90.5 to 76.1; is that such a big deal? Can the gamer even perceive 90.5 fps? If not, then this is irrelevant unless you're an FPS freak.
    What are you talking about? That's just a benchmark; the absolute FPS is meaningless, the performance lost is what the focus should be on. On a newer card, like the GTX 1080 Ti, you're looking at a 30% decrease.

    However, beyond gaming, we're talking about use as a workstation. If you used a modern workstation card, such as a Quadro GV100, you'd cripple its performance. You'd also rule out the potential for using multiple GPUs.
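The percentage loss implied by the benchmark numbers traded in this thread (90.5 fps over PCIe x16 vs 76.1 fps over Thunderbolt 3, for the older card) works out as follows:

```python
# Relative performance loss from moving an older dGPU to a TB3 enclosure,
# using the two FPS figures quoted in this thread.
pcie_fps = 90.5   # PCIe 3.0 x16, external monitor
tb3_fps = 76.1    # Thunderbolt 3, external monitor

loss = (pcie_fps - tb3_fps) / pcie_fps
print(f"{loss:.1%}")   # ~15.9% slower over TB3 for this card
```

The ~30% figure cited for a GTX 1080 Ti would be the same calculation with that card's numbers.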

  • Apple modular Mac Pro launch coming in 2019, new engineering group formed to guarantee fut...

    KITA said:
    KITA said:

    [...] The main components of your PC are tied together with a bus called PCIe. Your graphics card, your SSD communicate with the CPU over PCIe. Now consider extending that PCIe bus over a cable outside the case of your PC: this is Thunderbolt.
    Great in theory, except that Thunderbolt only manages a fraction of the speed an internal PCIe bus provides. It's not actually equivalent.

    According to your non-equivalence theory, all of those 4K and 5K Thunderbolt monitors are hoaxes, then, since they wouldn't be able to display 4K video as it is pumped out by the CPU/GPU?


    He's literally saying that PCIe 3.0 x16 offers 32 GB/s, while Thunderbolt 3 offers 5 GB/s. So an external GPU, for example, is limited to 5 GB/s (2.5 GB/s each way).
    How did you get those numbers?
    Thunderbolt 3 has a total bandwidth of 40 Gb/s; convert to GB/s and we get 5 GB/s.

    B = byte
    b = bit 
    8 bits = 1 byte
    Yes, that's it. Then what is the significance of those numbers in terms of GPU performance? What is the difference between using a graphics card internally in an x16 PCIe slot and using it in a Thunderbolt 3 eGPU enclosure? Is that difference big enough to justify using an internal slot? eGPU testing articles will certainly reveal that; I'm just asking without digging further.
    That's a bit of a loaded question as a lot depends on the setup.

    Here's a 3D benchmark with an older dGPU.

    PCIe x16 3.0 (external monitor):

    [removed]

    Thunderbolt 3 (external monitor):

    [removed]

    Of course, there are more scenarios than the above. For example, when using an eGPU with a laptop, you either connect an external monitor to the eGPU directly, or you use the laptop's internal display. When using the internal display, the bandwidth for the GPU over TB3 is cut even further to 2.5 GB/s and the performance will be even lower.

    The situation becomes even more complex for multi GPU setups, as even PCIe 3.0 x16 can be a bottleneck. As I pointed out above, NVIDIA, for their V100 GPUs, uses NVLink 2.0. In a four GPU configuration, each GPU has 200 GB/s of bandwidth over the NVLink.

    TL;DR - Thunderbolt 3 is not going to cut it.
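The interconnect figures traded in this thread, side by side; all numbers are the ones used in the posts above, not independently measured values:

```python
# Bandwidth ladder as quoted in this thread (GB/s).
bandwidth_GBps = {
    "Thunderbolt 3 (total)": 5,            # 40 Gb/s / 8 bits per byte
    "TB3 driving internal display": 2.5,   # per the eGPU discussion above
    "PCIe 3.0 x16": 32,
    "NVLink 2.0 (per V100, 4-GPU box)": 200,
}

# Print slowest to fastest to make the gaps obvious.
for link, gbps in sorted(bandwidth_GBps.items(), key=lambda kv: kv[1]):
    print(f"{link:35s} {gbps:6.1f} GB/s")
```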
  • Apple modular Mac Pro launch coming in 2019, new engineering group formed to guarantee fut...

    KITA said:

    [...] The main components of your PC are tied together with a bus called PCIe. Your graphics card, your SSD communicate with the CPU over PCIe. Now consider extending that PCIe bus over a cable outside the case of your PC: this is Thunderbolt.
    Great in theory, except that Thunderbolt only manages a fraction of the speed an internal PCIe bus provides. It's not actually equivalent.

    According to your non-equivalence theory, all of those 4K and 5K Thunderbolt monitors are hoaxes, then, since they wouldn't be able to display 4K video as it is pumped out by the CPU/GPU?


    He's literally saying that PCIe 3.0 x16 offers 32 GB/s, while Thunderbolt 3 offers 5 GB/s. So an external GPU, for example, is limited to 5 GB/s (2.5 GB/s each way).
    How did you get those numbers?
    Thunderbolt 3 has a total bandwidth of 40 Gb/s; convert to GB/s and we get 5 GB/s.

    B = byte
    b = bit 
    8 bits = 1 byte