Marvin

About

Username: Marvin
Visits: 86
Roles: moderator
Points: 3,237
Badges: 2
Posts: 14,668
  • Apple has already partially implemented fix in macOS for 'KPTI' Intel CPU security flaw

    Why is everyone so panicked?

    In order to exploit the flaw the "attacker gains physical access by manually updating the platform with a malicious firmware image through flash programmer physically connected to the platform’s flash memory. Flash Descriptor write-protection is a platform setting usually set at the end of manufacturing. Flash Descriptor write-protection protects settings on the Flash from being maliciously or unintentionally changed after manufacturing is completed.
    If the equipment manufacturer doesn't enable Intel-recommended Flash Descriptor write protections, an attacker needs Operating kernel access (logical access, Operating System Ring 0). The attacker needs this access to exploit the identified vulnerabilities by applying a malicious firmware image to the platform through a malicious platform driver."

    as explained by Intel:
    https://www.intel.com/content/www/us/en/support/articles/000025619/software.html

    In everyday language: the attacker needs physical access to your computer's interior. And all that effort for what? For accessing kernel VM pages into which macOS never puts critical information. Keeping critical information in wired memory is the ABC of kernel programming in Apple's engineering culture, and wired memory is the memory that cannot be paged out to VM. When the computer is turned off, no critical information resides anywhere on your storage media.
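
    To make "wired" concrete: the userland analogue is mlock(), which pins pages in physical RAM so they can never be written out to swap. A minimal Python sketch of the same idea one level up from the kernel (POSIX, via ctypes; assumes the default memlock limit allows at least one page):

    ```python
    import ctypes, ctypes.util, mmap

    PAGE = mmap.PAGESIZE
    buf = mmap.mmap(-1, PAGE)                  # anonymous RAM-backed buffer
    addr = ctypes.addressof(ctypes.c_char.from_buffer(buf))

    libc = ctypes.CDLL(ctypes.util.find_library("c"), use_errno=True)
    # mlock() pins the pages in physical RAM; they can never be paged to disk.
    if libc.mlock(ctypes.c_void_p(addr), ctypes.c_size_t(PAGE)) != 0:
        raise OSError(ctypes.get_errno(), "mlock failed")

    buf[0:8] = b"secret!!"                     # key material now lives only in pinned RAM
    # ... use the secret, then zero it before releasing ...
    buf[0:PAGE] = b"\x00" * PAGE
    libc.munlock(ctypes.c_void_p(addr), ctypes.c_size_t(PAGE))
    buf.close()
    ```
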
    That, however, is a different issue. Intel's statement on this one is here:

    https://newsroom.intel.com/news/intel-responds-to-security-research-findings/

    This issue is about user-level software being able to access kernel-level data. As Intel says, there are other attacks that do similar things:

    https://www.vusec.net/projects/anc/
    http://www.tomshardware.com/news/aslr-apocalypse-anc-attack-cpus,33665.html
    http://www.cs.ucr.edu/~nael/pubs/micro16.pdf

    These try to bypass a security feature (ASLR), rather than being a direct security flaw:

    https://security.stackexchange.com/questions/18556/how-do-aslr-and-dep-work

    One way they do it, according to the KAISER paper, is to measure the timing between a memory access and the error-handler callback. When the access hits the cache, the timing changes, so they can figure out where the legitimate memory addresses are:

    https://gruss.cc/files/kaiser.pdf
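
    A toy simulation of that principle (Python's timers are far too coarse for a real cache attack, so a sleep() stands in for the cache-hit/miss latency gap):

    ```python
    import time, random

    # The hidden kernel layout the attacker wants to recover:
    MAPPED = {random.randrange(64) for _ in range(8)}

    def probe(page: int) -> float:
        """Time one probe. Mapped pages take the fast (cached) path;
        unmapped ones take the slow fault path."""
        start = time.perf_counter()
        if page not in MAPPED:
            time.sleep(0.002)          # slow path: TLB/cache miss + fault
        return time.perf_counter() - start

    # Recover the layout purely from timing, without ever reading the data:
    recovered = {p for p in range(64) if probe(p) < 0.001}
    print(recovered == MAPPED)         # True: the layout leaks via timing alone
    ```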

    That paper suggests ARM isn't affected, as it uses a separate mapping method for user and kernel tables:

    "All three attacks have in common that they exploit that the kernel address space is mapped in user space as well, and that accesses are only prevented through the permission bits in the address translation tables. Thus, they use the same entries in the paging structure caches. On ARM architectures, the user and kernel addresses are already distinguished based on registers, and thus no cache access and no timing difference occurs. Gruss et al. and Jang et al. proposed to unmap the entire kernel space to emulate the same behavior as on the ARM architecture."

    According to Intel, it doesn't allow write access. It may be possible to get read access to sensitive data; that seems to be the case given the system updates. They are trying to come up with an industry-wide solution. Maybe there's a way to obfuscate memory data using a random key and a reversible operation, and hide the key. It might be too slow for some things but worthwhile for sensitive data like passwords and encryption keys, which don't need to be accessed frequently. OS developers can do this themselves.
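
    A minimal sketch of that obfuscation idea, using XOR with a random key as the reversible operation (caveat: the key here still lives in the same process, so this raises the bar rather than solving the problem):

    ```python
    import secrets

    class MaskedSecret:
        """Keep a secret XOR-masked in memory and unmask it only on use.
        Reading the masked buffer alone reveals nothing without the key."""
        def __init__(self, plaintext: bytes):
            self._key = secrets.token_bytes(len(plaintext))   # random mask
            self._masked = bytes(a ^ b for a, b in zip(plaintext, self._key))

        def reveal(self) -> bytes:
            return bytes(a ^ b for a, b in zip(self._masked, self._key))

    pw = MaskedSecret(b"hunter2")
    # ... much later, only at the moment it's actually needed:
    assert pw.reveal() == b"hunter2"
    ```
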
  • Tesla stops accepting BitCoin, nearly entire cryptocurrency market hammered

    Beats said:
    Can’t believe Dogecoin is an actual thing...
    All the coins make as much sense as each other. They are all useful as an IOU system.

    Person A wants to buy something for $50 online in a similar way to using cash. Although the transaction has a ledger, the goods/services don't have to be recorded.
    Person A gives $50 fiat to coin seller (bitcoin, dogecoin, whatever coin) and gets coins ($50 IOU).
    Person A gives the coin IOU to Person B in return for the $50 worth of goods/services.
    Person B sells coins for $50. If the person who got $50 from A is the same person who gives $50 back to B, it's just a way of making transactions like using cash online.

    The mining aspect is silly and is a flawed way to prove transactions. The idea behind it is that people who invest more into computing would be more trusted, but that's been proven not to be the case: people have managed to get over 50% of the compute power and double-spend their coins.

    Proof of Stake crypto will take over from Proof of Work eventually. People are just investing more into Proof of Work systems because they can make more money - they are effectively generating money, but it's been purposely slowed down over time to make it more stable, and it increases energy usage to crazy levels as well as causing supply shortages of GPUs. Proof of Stake doesn't need miners.
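
    For anyone unfamiliar with what mining actually computes: it's a brute-force search for a nonce whose hash clears a difficulty target, nothing more. A minimal sketch:

    ```python
    import hashlib

    def mine(block_data: bytes, difficulty: int = 4) -> int:
        """Find a nonce so SHA-256(block || nonce) starts with `difficulty`
        hex zeros. All the mining energy goes into this loop, and each
        extra zero multiplies the expected work by 16."""
        target = "0" * difficulty
        nonce = 0
        while not hashlib.sha256(block_data + str(nonce).encode()).hexdigest().startswith(target):
            nonce += 1
        return nonce

    print(mine(b"A pays B 50 coins"))   # whoever finds it first appends the block
    ```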

    The main problem with Proof of Stake is that it shows how little general use crypto would have. An IOU system is just an extra transaction to have to make and few people would bother. The interest in crypto has all been driven by people buying early coins and making 1000x returns. Someone recently sold millions in Dogecoin from a $145m wallet and there are others:

    https://www.msn.com/en-us/money/topstocks/a-2425-billion-dogecoin-whale-lurks-but-robinhood-ceo-says-e2-80-98we-don-e2-80-99t-have-significant-positions-in-any-of-the-coins-we-keep-e2-80-99/ar-BB1gqWXv

    The Winklevoss twins are sitting on a few billion in coins. Crypto is often lauded as a currency that frees people from the corrupt system of government-controlled finance, but a lot of the people who made huge gains were already rich and had the capital to invest. If someone with $10m had invested $100k into each of the top 10 coins, they could easily have made $100m, and those people are selling the coins for fiat now that everyone has access to buying coins. It'll end up as another system for transferring money from poor people to rich people.

    Elon Musk is potentially doing this to promote Dogecoin, which uses less energy:

    https://www.msn.com/en-us/news/technology/spacex-will-launch-a-moon-mission-funded-by-dogecoin-in-2022/ar-BB1gFKOa
    https://www.deseret.com/2021/5/10/22423052/dogecoin-environment-energy-use

    Tesla could even make their own coins if they wanted. If they made their own currency and got the value up, they could potentially make more from that than they do from their business.
  • Intel's Alder Lake chips are very powerful, and that's good for the entire industry

    DuhSesame said:
    The only advantage for Apple Silicon on paper seems to just be the power efficiency.
    The biggest advantages are the GPUs and special silicon for hardware encoding.

    Intel's GPUs are still terrible. This is their latest and greatest IGP, and it's less than 1/5th the performance of a 3060, which is roughly equivalent to an M1 Max:

    https://www.notebookcheck.net/GeForce-RTX-3060-Laptop-GPU-vs-Iris-Xe-G7-96EUs_10478_10364.247598.0.html

    Matching that requires pairing Intel's chip with a 3rd-party GPU, which itself will draw 40W+, while Intel's chart compares against the entire SoC power.

    For efficiency, Apple's node advantage contributes the most, and Intel will eventually catch up. Intel plans to match them by 2025, but for now they're still more than a full node behind in density.

    Also, the only reason they are improving now is that Apple dropped them; their previous roadmap was half the pace.

    The other downside to Intel was the price of higher-end chips. People will be surprised when the Max Duo and possibly higher chips come out and performance per dollar gets compared vs Intel.
  • Apple attorneys threaten UK market exit if court orders 'unacceptable' patent fees

    seanj said:
    mcdave said:
    They may have to leave the UK market to set an example and cool the current wave of attacks. What would be the cost of not leaving?
    Presumably you think they should leave the Texas market too? Absolutely stupid comment. As if legal courts pay attention to childish threats like this.

    The UK is way too profitable for Apple to even consider doing this.
    There are estimates of iPhone sales in the UK here:

    https://www.finder.com/uk/iphone-sales-statistics

    The total UK population is around 68m, and that source estimates around 7m iPhone sales per year. The active user base is split 60/40 with Android, with around 19m UK iPhone users:

    https://www.emarketer.com/newsroom/index.php/iphones-gaining-market-share-in-us-but-losing-in-uk/

    Total UK iPhone sales are estimated at around 52m units between 2011 and 2019. At an ASP of $800, that's $41.6b in revenue, and a 25% net profit margin puts profit around $10b. Issuing a $7b charge over a couple of patents is insane and would of course lead Apple to consider leaving the UK to avoid paying it. The fee seems to be applied to every iPhone sold worldwide, but it's equivalent to $134 for every iPhone ever sold in the UK. The judicial system around patents needs reform everywhere; it's crazy that this kind of threat to Apple's business is allowed to happen. Apple doesn't even make the cellular chips.
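
    The back-of-envelope math, for anyone checking the numbers:

    ```python
    units = 52e6                  # estimated UK iPhone sales, 2011-2019
    revenue = units * 800         # $800 ASP
    profit = revenue * 0.25       # 25% net margin
    per_device = 7e9 / units      # the $7b charge spread over every UK iPhone
    print(f"${revenue/1e9:.1f}b revenue, ${profit/1e9:.1f}b profit, "
          f"${per_device:.2f} per iPhone ever sold in the UK")
    # $41.6b revenue, $10.4b profit, $134.62 per iPhone ever sold in the UK
    ```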

    https://www.mintz.com/insights-center/viewpoints/2231/2019-03-panoptis-recent-victory-against-huawei-demonstrates-why
    https://wccftech.com/apple-to-pay-506-million-for-infringing-on-panoptis-4g-lte-patents/

    There are amounts listed here:

    "PanOptis argues that it should be awarded $4.22 per iPhone, $3.62 per iPad that infringes its patents, and $2.25 per Apple Watch."

    https://appleworld.today/2021/03/31/patent-trollin-panoptis-wants-even-more-moolah-from-apple/

    If the courts allowed every patent owner to do the same, the costs would eventually amount to far more than the cost of the entire product, which makes no sense.
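
    To put the stacking problem in numbers (only PanOptis' rate comes from the filing; the holder count is a made-up illustration):

    ```python
    panoptis_rate = 4.22          # $ per iPhone, from the filing
    holders = 50                  # hypothetical: cellular SEPs span dozens of firms

    stacked = panoptis_rate * holders
    print(f"${stacked:.2f} per iPhone")   # $211.00 if every holder won the same rate
    # Add the iPad ($3.62) and Watch ($2.25) rates and the royalties alone
    # start to rival the cost of the hardware itself.
    ```
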
  • Redesigned Mac Pro with up to 40 Apple Silicon cores coming in 2022

    Ok, so how much will this $40k MacPro cost?
    The good thing is it's entirely up to Apple, whereas before they were charging a markup on top of Intel's and AMD's prices.

    Intel charges a few thousand dollars for high-end Xeons; this one is in the Mac Pro:
    https://www.amazon.com/Intel-CD8069504248702-Octacosa-core-Processor-Overclocking/dp/B086M6P8D6

    An AMD Radeon Pro VII based on Vega architecture retails for $2740:
    https://www.newegg.com/amd-100-506163/p/N82E16814105105

    Apple charges $10k for 4 Radeon Pro GPUs. Cutting Intel and AMD out means it instantly cuts out around $15k of costs from a $24k Mac Pro. If Apple continued charging that, the extra $15k would all be profit.

    For the easiest manufacturing, they'd use multiple units of the chips that go in their laptops and iMac models. An entire MBP/iMac retails under $2.5k; with a 30% margin, that's about $1,750 to manufacture, and the CPU/GPU part of that would be well under $500. It will have bundled memory, though, up to 64GB per chip. Using 4 of them shouldn't cost more than $2k with 64GB RAM total. The whole machine will likely start around $5k, but the lowest quad-chip model will perform like the $24k Mac Pro.

    If they price it too high, people will just buy multiple MBPs/iMacs. I'd guess the price range starts around $5k and goes up to $10k with maximum memory (256GB), and a bit higher for 8TB+ SSD. Hopefully the XDR will come down in price a bit too, so a decent Pro plus XDR stays under $10k.
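
    Rough numbers behind that estimate (all assumptions, not Apple's actual costs):

    ```python
    mbp_retail, margin = 2500, 0.30
    build_cost = mbp_retail * (1 - margin)   # ~$1,750 to build a whole MBP/iMac
    quad_chip_cost = 2000                    # 4 SoCs + 64GB RAM, per the estimate above
    intel_amd_equiv = 15000                  # the Xeon + quad Radeon Pro it replaces
    print(f"~${intel_amd_equiv - quad_chip_cost:,} of component cost removed")  # ~$13,000
    ```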

    It may still have PCIe slots to support things like the Afterburner card and other IO cards, but I don't see any reason to include slots for GPUs, and they can always build an external box for IO cards with a single internal slot.
    zimmie said:
    If it's using the same GPU cores as the M1, clocked at the same speed, a 64-core GPU could do 20.8 TFLOPS, while a 128-core GPU could do 41.6 TFLOPS. For comparison, a GeForce RTX 3090 (the top consumer card from Nvidia) does up to 35.6 TFLOPS, and a Radeon RX 6900 XT (the top consumer card from AMD) does up to 23 TFLOPS.

    Considering the RTX 3090 and RX 6900 XT still universally sell for more than double their MSRP, I wonder if Apple will have scalping problems. Their system of allowing backorders mitigates scalping, but doesn't eliminate it. With the added demand from blockbros, it may be difficult to get one for a year or so.
    It would be a terrible shame if a significant use of these machines is for block chain mining.  (But if it boosts my AAPL investment, I'll be crying all the way to the bank.)
    I could see them being used in render farms and servers. They would be extremely efficient and cost-effective machines.

    Crypto mining would depend on the software being optimized for Apple Silicon and the overall price of the compute units:

    https://otcpm24.com/2021/03/01/apple-m1-vs-nvidia-ethereum-hash-rate-comparison-which-is-more-capable-for-crypto-mining/

    The M1 does 2MH/s on the GPU at around 10-15W; the Nvidia 90HX goes up to 86MH/s at 320W, and a 3090 can do over 100MH/s at under 300W.
    A Mac Pro would be expected to get 16x the M1, so 32MH/s at around 200W.
    An Nvidia 3090 is around $3k, so I don't see people buying $5k+ Mac Pros specifically for mining, but if someone already planned to have a server array of Macs, they might use them for mining. Mining will probably become obsolete in the next year anyway.
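
    Working those figures out per watt:

    ```python
    figures = {                     # (MH/s, watts), numbers as above
        "M1 GPU":           (2,   12.5),
        "Mac Pro (16x M1)": (32,  200),
        "Nvidia 90HX":      (86,  320),
        "RTX 3090":         (100, 300),
    }
    for name, (mh, w) in figures.items():
        print(f"{name:>17}: {mh / w:.2f} MH/s per watt")
    # M1 and Mac Pro land at 0.16 MH/s/W vs 0.33 for a 3090 costing under
    # $3k -- which is why nobody would buy a Mac Pro just to mine.
    ```
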
  • Epic calls Apple's 'Fortnite' & developer tool block 'overbroad retaliation'

    I hope Apple loses, as it is stifling competition. Why is it that apps that offer physical goods are treated differently than those that offer virtual goods? Also, how is Apple entitled to 30 percent of billions of dollars of virtual goods revenue, when it doesn't cost nearly that much to run the App Store and offer the services that it does? Also, why is it that some apps are restricted from even mentioning an alternative payment system? Seems draconian.

    I'm not advocating for an end result that leads to alternative app stores on iOS. But at minimum I think there should be competition in regards to in-app payments. Apple shouldn't be the sole supplier. I'm sure if competition opened up, costs would come down for consumers.
    The people who profit from the store are helping subsidize the provision of the App Store for everyone. Successful companies try to ensure that no part of the company makes a loss. Apple pays a lot for cloud services for over 1 billion users and over 1 million apps:

    https://www.theverge.com/2019/4/22/18511148/apple-icloud-cloud-services-amazon-aws-30-million-per-month
    https://www.apple.com/newsroom/2018/12/apple-to-build-new-campus-in-austin-and-add-jobs-across-the-us/

    "Apple plans to invest $10 billion in US data centers over the next five years, including $4.5 billion this year and next. Apple’s data centers in North Carolina, Arizona and Nevada are currently being expanded."

    There are more services than payments: there are real-time push notifications (billions per hour), iCloud data storage and access, location services and more. Apple offers 150,000 APIs.

    Offering other payment services could easily open Apple up to fraud: if someone decides to offer bitcoin payments and buyers are tricked into losing a lot of money, Apple can't recover the funds for the App Store user.

    Apple's main objective is providing a safe, secure, trusted App Store for their hardware customers. There are too many bad actors that want to abuse this.

    Some people consider 30% to be too high, but digital is different from the provision of physical goods, because physical goods come with a tangible cost. Offering a taxi ride has the cost of the vehicle, the driver's income, the fuel. It's hard to know the profit margin on any given physical product or service; mostly it's small, and taking 30% of a physical product or service would easily make most of them unsustainable. While there's a cost of production with digital products too, it's mostly the time the developer spent making the product, and it's much easier to adjust pricing. A digital product is also often unique, with little competition: there's no competition for Epic's in-game currency, it only works in Epic's games, so they can charge whatever they want.
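
    A concrete example of why a flat 30% works for digital goods but not physical ones (the cost figures are illustrative):

    ```python
    def seller_net(price, unit_cost, platform_cut=0.30):
        """What the seller keeps after the platform takes its cut of gross."""
        return price * (1 - platform_cut) - unit_cost

    # $50 taxi ride where car, driver and fuel eat ~$45 of it:
    print(seller_net(50, 45))   # -10.0 -> the ride loses money under a 30% cut
    # $50 of in-game currency with ~zero marginal cost:
    print(seller_net(50, 0))    # 35.0  -> still hugely profitable
    ```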

    For apps that compete with Apple, like music subscription services, the fee is more of a problem because Apple can sell the same music cheaper than Spotify. But for long-term subscriptions the fee drops to 15%, and most of Spotify's users are subscribed outside the store; Apple only monetizes 0.5% of Spotify's premium users, according to Apple:

    https://www.idropnews.com/news/spotify-pays-apple-tax-on-less-than-1-of-subscribers-while-demanding-money-back-from-songwriters/109615/

    That's very far from stifling them. For all the stifling people talk about, there are a lot of billion-dollar companies being built using the App Store.

    It seems as though companies like Epic want hardware platforms to be like cable/internet companies, where the cable company provides the cable and anyone can compete offering services over it. But software platforms have inherent security risks, and while desktop platforms have managed this effectively, mobile has more at risk - biometrics, location data, mobile banking, more personal data - and it's harder to notice and deal with security issues. Just the fact that people didn't notice apps were copying clipboard data shows this.

    On some level it would be beneficial for consumers to be able to install software from anywhere, like game emulators on the Apple TV, but the same is true of games consoles. It would be nice for developers not to have to pay fees, but most would continue doing so anyway because it's easier to set up (payment processing services are fairly easy to set up, merchant services are not), and just the biggest companies would avoid it. That means the little people end up covering the costs of the App Store while the biggest companies pay nothing, and that's not a fair system.

    Just saying it's not fair to charge a percentage doesn't make sense when Epic (and every other company) does the same on its own store. They charge 12%; who's to say their server costs justify charging any given games company 12% of its revenue? They even did the same thing, forcing developers to use their payment system:

    https://www.gamasutra.com/view/news/355283/Epic_Games_Store_policy_change_expands_ingame_payment_options_for_devs.php

    "Until now, it was only possible to use an Epic-provided payment service, but the company said it wanted to give developers more freedom."

    They had a change of heart after a year, and now they think they have the right to dictate terms to other companies with different business models. Every business has a right to negotiate terms, but what Epic is doing is juvenile, literally: they are trying to get the children who play their game to back them, when the only difference for those players is $2 off a top-up their parents pay for anyway.
  • Video: Putting the iMac Pro thermals to the test

    arthurba said:
    if Apple are throttling and HP are not then Apple's real world performance is shot.
    It's pretty rare that any task run on a highly multi-core machine will use exactly 100% of the cores at all times, so potentially dropping 10% is meaningless. For a half-hour processing task, that's a 3-minute difference. The 10% drop was also measured while running both CPU and GPU at max, which nothing practical ever does. Workstations aren't meant to run benchmarks 24/7; benchmarks are just helpful for understanding a machine's performance class relative to others. Workstations are used in workflow tasks, and someone here is using one in an 8K RED workflow with DaVinci Resolve:

    https://forum.blackmagicdesign.com/viewtopic.php?f=21&t=68351

    "As a RED user I can say it's been really great so far. I have 10 core/64GB RAM/Vega 64 GPU/4 TB SSD. I also have a Red Rocket-X connected via TB3 in a Sonnet box. It cuts through 8K files like a knife through hot butter. Today is literally the first time in years that I felt like my computer was helping me work faster rather than holding be back.

    There are a few times I can remember such a large step-up in performance. Moving from floppy disks to a 10MB hard drive in 1976 (Cromemco Z2-D); moving from Sun 2 to Sun 3 workstation (1986); moving from SPARC to MIPS R10000 (1996); moving from Pentium to dual-Xeon (Mac Pro 2008); and now. It's a big deal--and not a moment too soon. I've been using a 2012 Mac Pro and a 2013 new Mac Pro, and both have long fallen behind what my cameras have been demanding. Now I feel like I'm back to being able to do creative things without constantly bumping into performance limitations of one sort or another. Which feels great!"

    The performance and price of future models will just get better from this point on too; CPUs/GPUs can still improve another 2-4x. GPU performance is at a point now where the traditional workstation form factor is no longer essential. Fitting 11 TFLOPS of compute power into an iMac while sustaining around 70 degrees at full load is a good achievement.
  • Lunar details a 'very exciting time' for high end Mac Pro, Apple Silicon

    Lunar Animation has a nice-looking studio. They mostly have iMac Pros and a couple of Mac Pros:

    https://www.lunaranimation.com/about-us

    They have an array of 30 Mac minis in the back.

    Here they detail the improvements the Mac Pro gives them over the iMac Pro:

    https://www.lunaranimation.com/lunar-blog/2021/03/02/mac-pro-a-year-in-the-studio
    https://www.lunaranimation.com/lunar-blog/2019/12/23/using-the-new-mac-pro

    The iMac Pros with 16GB of video memory would sometimes run out, and the Mac Pro GPUs with 32GB would be enough. Unified memory in Apple Silicon should handle this well as long as the total is enough. 64GB of unified memory should be able to handle it; DDR5 might allow up to 128GB, but the bandwidth isn't clear. The consoles use GDDR6 unified memory but only up to 16GB, and the Xbox splits it into pools of different bandwidth, which they said they couldn't do on PC:

    https://www.psu.com/news/ps5-gddr6-ram-vs-xbox-series-x-gddr6-ram-which-is-better/

    I guess they could have a 64GB unified DDR5 cache plus 16GB of GDDR6 just for GPU tasks, and the GPU would swap in and out of main memory as needed. If the MBPs and iMacs had 16/8 base up to 64/16, that would be fine.

    The M1 is around 1/3 the CPU performance of the high-end Mac Pro, so upgrading their array of minis would be like having 10 high-end Mac Pros for $21k (a single high-end-CPU Mac Pro is $13k, or $24k with the GPUs). The Intel minis probably perform OK too - they'd be around 70% of that - but they're much noisier with fans running at full speed, since they're 65W chips: 3x the power usage, 3x the heat:

    https://support.apple.com/en-us/HT201897

    But they'd gain much more performance from GPU-accelerated software - they get a 10-20x speedup vs the CPU - and having high-performance GPUs with access to lots of video memory in every Apple Silicon machine will help things.
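
    The arithmetic on that:

    ```python
    minis, price_each = 30, 700      # ~$21k array, per the totals above
    m1_share = 1 / 3                 # M1 ~= 1/3 of the 28-core Mac Pro's CPU

    total = minis * price_each       # $21,000
    mac_pro_equiv = minis * m1_share # ~10 high-end Mac Pros' worth of CPU
    print(total, mac_pro_equiv)      # 21000 10.0 -- vs $13k per Mac Pro
    ```
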
  • Intel-based MacBook Pro is Intel's latest anti-Apple campaign target

    Intel then went on to offer data on the matter, including claims that over half of "today's most popular games" are not supported on macOS itself. This attack is a little misguided for Intel, as Intel-based Macs running macOS still exist, and they cannot run games like Grand Theft Auto V or Cyberpunk 2077 natively either.

    Emulation was also mentioned, but shown to be ineffective with a demonstration of Valheim running poorly in Parallels on a Mac.
    Parallels gaming on M1 isn't even official yet and bugs are being fixed; the Valheim glitches are fixed using a config value (1:20).

    Between all the compatibility options, there's a good list of working games:

    https://www.applegamingwiki.com/wiki/M1_Parallels_Windows_compatible_games_list
    https://www.applegamingwiki.com/wiki/M1_CrossOver_Windows_compatible_games_list
    https://www.applegamingwiki.com/wiki/M1_native_compatible_games_list
    https://www.applegamingwiki.com/wiki/M1_Rosetta_2_compatible_games_list

    People also don't have to choose Mac or PC. Decent gaming PCs are cheap now - a 1660 Ti laptop for $900 runs most games at 1080p Ultra 60FPS:

    https://www.amazon.com/HP-Pavilion-1920x1080-Core-i5-10300H-GeForce/dp/B08CVTP6T6
    https://www.notebookcheck.net/NVIDIA-GeForce-GTX-1660-Ti-Max-Q-GPU-Benchmarks-and-Specs.418762.0.html

    A PC like that can sit in a corner somewhere and stream via remote play to either a TV or a Mac; that way games run on a better-quality display and the fan noise is less noticeable. That's a lot easier to set up than messing with software compatibility programs.

    Obviously, for people who only want a single computer for everything and need decent gaming, a PC will be the better option, and that's been true for many years. Boot Camp improved things and for the last 8 years or so offered the possibility of decent gaming, but it still needed rebooting all the time.
    blastdoor said:
    The fundamental problem with Mac gaming is software and consistency/commitment from apple. These are issues well within apple’s power to fix and I’m glad intel is pushing apple to fix them. But intel might regret providing that push if apple actually acts on it. 
    The Mac gaming audience is just much smaller - around 1/10th to 1/20th the size of the Windows platform's. Making maybe 10% extra revenue in return for years of support isn't worthwhile. The OpenGL/Metal/Vulkan issue doesn't help matters, but loads of developers support iOS, so the main issue is the audience.

    Intel's comparison is just bizarre: they are effectively promoting Nvidia, one of their biggest competitors. Nvidia is trying to buy ARM, and even without a buyout there's nothing stopping them from making their own ARM PCs:

    https://www.pcgamer.com/nvidia-arm-gaming-laptops/

    Then they'll be cheaper than Intel PCs and will easily take significant market share away from Intel. Game devs will start supporting ARM better, which helps Apple.

    Intel is flailing in panic just now because everybody can see that nobody needs them any more.
  • Apple AR headset could cost consumers over $2,000

    At that price point, it better be mining BitCoins when I’m not wearing it.
    It has the processing power.
    If it really had this spec, it would be able to replace a laptop entirely for some, but it sounds like a lot of guesswork. If Apple had the ability to fit an M1 Pro into a compact headset, they could have made the MBPs thinner and lighter too. The 14" MBP's components, including the battery, would all have to fit inside the headset.

    It also wouldn't have as much space as in the mockup, as there has to be room for the nose at the bottom.

    The Oculus Quest 2's on-board chip is around 1 TFLOPS:

    https://uploadvr.com/oculus-quest-2-benchmarks/

    The test there puts it at around 1/6th of an Nvidia 1060, which is around Nintendo Switch level for on-board rendering. The Switch aims for 30FPS, but VR ideally needs 90FPS.

    The Oculus Quest 2 has a link system (both wired and wireless) to display output from a console or PC, which is much more usable for VR performance-wise.

    A fanless M1 or A15 makes way more sense in a headset. Strapping an iPhone to a head is much easier than strapping on a MacBook Pro, and it would keep the price below $1k. It could also have a link system to let more powerful Macs stream content, but iPhone-class hardware would be enough to handle video streaming, mobile games and social media.

    These wildly varying reports suggest that leakers have pretty much zero credible information on this product and are playing guessing games like everybody else.

    I think internally it will have similar hardware to an iPhone. One example of a phone AR system teaches the piano in a way similar to games like Guitar Hero.

    Phone hardware is easily capable of rendering that, including physically-based materials for realistic 3D objects. For anything heavier, they can have a wireless connection to a Mac, or a special adaptor that plugs into any video output and streams over direct Wi-Fi.