Marvin

About

Username: Marvin
Visits: 116
Roles: moderator
Points: 6,086
Badges: 2
Posts: 15,322
  • Apple's iOS 18 AI will be on-device preserving privacy, and not server-side

    What IQ level will the on-device AI be? 40?
    The larger models need a lot of memory, but even the smaller ones are capable of much more meaningful replies than assistants like Siri. Meta released some models similar to ChatGPT and they work very well:

    https://ai.meta.com/blog/code-llama-large-language-model-coding/

    The smallest Llama-7B model only needs 6GB of memory and around 10GB of storage:

    https://www.hardware-corner.net/guides/computer-to-run-llama-ai-model/

    The larger 70B model needs 40GB of RAM and is considered close to GPT-4:

    https://www.anyscale.com/blog/llama-2-is-about-as-factually-accurate-as-gpt-4-for-summaries-and-is-30x-cheaper

    Apple published a paper about running larger models from flash storage instead of RAM:

    https://appleinsider.com/articles/23/12/21/apple-isnt-behind-on-ai-its-looking-ahead-to-the-future-of-smartphones
    https://arxiv.org/pdf/2312.11514.pdf

    Perhaps they can bump storage sizes up by 32-64GB and preinstall a GPT-4-level AI model locally. Responses are nearly instant when running locally. They may add a censorship/filter model to avoid getting into trouble for certain types of response.
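
    As a rough sketch of how little code it takes to run one of these models locally, here's an example using the third-party llama-cpp-python bindings (my own choice of tooling, not anything Apple has announced) with a placeholder model filename standing in for whatever quantized Llama build you download:

    import llama_cpp  # pip install llama-cpp-python

    # Placeholder path: any locally downloaded, quantized (e.g. 4-bit GGUF) Llama build.
    llm = llama_cpp.Llama(model_path="llama-2-7b-chat.Q4_K_M.gguf", n_ctx=2048)

    # Everything runs on the local machine; no request leaves the device.
    out = llm("Q: What is an eigenvector?\nA:", max_tokens=128, stop=["Q:"])
    print(out["choices"][0]["text"])

    A censorship/filter model could then simply be a second, smaller classifier pass over the prompt and the reply before anything is shown.
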
    40domi said:
    AI is well overhyped and unintelligent consumers are swallowing it hook, line & sinker 🤣
    They are powerful tools. You can try them here; the first is a chat, the second an image generator:

    https://gpt4free.io/chat/
    https://huggingface.co/spaces/stabilityai/stable-diffusion

    For example, you can ask the chat 'What is an eigenvector?' or 'What is a good vegetarian recipe using carrots, onions, rice, and peppers?' and it will give a concise answer faster than digging through search engine results. The image generators can produce poor-quality output, but they are good for concepts with a specific description. An image prompt can be 'concept of a futuristic driverless van' to get ideas of what an Apple car could have looked like. Usually image prompts need to be tuned a lot to give good-quality output.

    The AI chats are context-aware, so they remember what was asked previously. If you ask 'What is the biggest country?', it will answer, and if you then ask 'What is the capital of that country?', it knows which country is being referred to.
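
    The context handling is simpler than it sounds: the client just keeps the conversation so far in a list and resends it with every new question. A minimal sketch, again assuming the llama-cpp-python bindings and a placeholder model file:

    from llama_cpp import Llama  # pip install llama-cpp-python

    llm = Llama(model_path="llama-2-7b-chat.Q4_K_M.gguf", n_ctx=2048)  # placeholder model file
    history = []

    def ask(question):
        # The whole conversation so far is sent each turn; that is all the
        # "memory" the model has, so follow-ups can refer back to earlier answers.
        history.append({"role": "user", "content": question})
        reply = llm.create_chat_completion(messages=history)
        answer = reply["choices"][0]["message"]["content"]
        history.append({"role": "assistant", "content": answer})
        return answer

    print(ask("What is the biggest country?"))
    print(ask("What is the capital of that country?"))  # "that country" resolves via the history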

    They are much better than search engines for certain types of information. Search engines are good for news, shopping, and social media, but when you just need an answer to a very specific question, it's tedious to click through each of the links to find it.
  • Apple now allows classic game emulators on the App Store

    What about old Mac games, like titles from Ambrosia Software (Mars Rising, etc.)?
    There was a developer that released a whole bunch of pinball games.
    Or Hellcats. 

    That would be great!
    OS 9 can be run via UTM. Apple's change here may have been influenced by AltStore, which will be one of the first third-party app stores:

    https://www.theverge.com/24100979/altstore-europe-app-marketplace-price-games
    https://altstore.io/

    The featured apps on that store are emulators like Delta, UTM, Dolphin:

    https://getutm.app/

    Here is UTM running OS 9:

    [embedded media]

    Emulation speed on M-series chips is very good. Some people think M-series chips are overpowered for mobile devices, but the benefit shows with CPU-intensive apps:

    [embedded media]

    Nintendo recently sued an emulator developer:

    https://www.polygon.com/24090351/nintendo-2-4-million-yuzu-switch-emulator-settlement-lawsuit

    Some emulators can run current-gen games, which can result in the company losing money, e.g. Nintendo Switch games like Mario Wonder and Zelda:

    [embedded media]

    Nintendo could make a lot of money with an official mobile emulator for their old systems, especially if they legally supply the ROMs.
  • iOS 18 is coming soon with AI, a new interface, and accessibility: what to expect at WWDC ...

    nubus said:
    I will be shocked if Apple has a massive overhaul of Siri and iOS with AI features this summer, since they seem to be late to the AI party and these features do not get into released software and hardware in a short period of time. I suspect the 2025 series of processors (A and M series) will be the first ones to have significantly improved AI hardware, and it seems unrealistic to believe this year's A18 processor (which is already designed and in pre-production) will have anything more than just core and clock increases. Of course Apple will hype any AI-related enhancements to the moon, but I suspect 2025 will be the year we see significant implementation of AI in Apple's devices.
    Neural Engine in M3 is far behind similar processors from Intel and even A16. Could Apple skip M3 Ultra to focus on M4 AI?
    AI doesn't only use the Neural Engine, though; it can also use the CPU and GPU. Total AI performance is the sum of all the compute parts.

    M3 was designed to allow for different core counts, so M3 Ultra could be a monolithic chip instead of two chips joined together, which would also allow the possibility of an M3 Extreme chip with 3x the GPU power of the M3 Max.

    One of Apple's biggest strengths for AI is unified memory. This shows up in tests with big models. Here is a test of a 70-billion-parameter model running locally:

    [embedded media]

    With the smaller models, the 4090 is around 2x the speed of the M3 Max, but at 5:50 in the video, the large model can't fit into the GPU memory. It needs 40GB of VRAM and the 4090 has 24GB, so performance is much slower. GPUs with that much memory are expensive enterprise GPUs, or you have to use dual GPUs.

    M3 Ultra will be 2x the M3 Max and could have up to 256GB of RAM.
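
    A quick back-of-the-envelope calculation shows why the memory ceiling matters (weights only; the KV cache and runtime overhead add several GB on top):

    # Rough estimate of the memory needed just to hold a model's weights.
    def weight_memory_gb(params_billions, bits_per_weight):
        return params_billions * 1e9 * bits_per_weight / 8 / 1e9

    for bits in (16, 8, 4):
        print(f"70B model at {bits}-bit: ~{weight_memory_gb(70, bits):.0f} GB")

    # ~140 GB at 16-bit, ~70 GB at 8-bit, ~35 GB at 4-bit: even the 4-bit
    # version doesn't fit in a 24GB 4090, but it fits easily in a Mac with
    # 64GB or more of unified memory.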

    The A17 has a 35 TOPS Neural Engine. Microsoft Copilot running locally is reported to need 40 TOPS:

    https://www.tomshardware.com/pc-components/cpus/intel-confirms-microsoft-copilot-will-soon-run-locally-on-pcs-next-gen-ai-pcs-require-40-tops-of-npu-performance

    I expect the A17 and A18 will be fast enough for local AI models. They might still prefer to run them on a server to save battery life, and RAM is very limited on mobile devices; the device would need 16GB of RAM even for the smaller models.

    Apple's vertically integrated hardware model shows its strength again with AI, and it's something most manufacturers will struggle to compete with. Intel, AMD, and discrete GPU hardware doesn't have the memory setup to handle it, nor is it efficient enough to run the models without heating up too much, and NPUs will be so varied across retail models that software can't scale across the whole platform.
  • Canva's Affinity deal will shake the Adobe status quo

    abriden said:

    Firstly, Cinema 4D was nowhere near the $3600 price-point when I first bought it and I was able to use the perpetual licences for several years at a time between upgrades, so my point stands. It was affordable to run Cinema 4D and other 3D software alongside the Adobe apps I needed, and now it is not affordable for peripheral use. Secondly, you seem to have limited life experience. If you had actually had the misfortune of an accident or chronic health condition that impacted your work life in the longer-term, you might realise how condescending your comment sounds. In the absence of any humility, perhaps you might consider just how fast your own circumstances could be up-ended.
    That's to do with the price, not the subscription model, and the same would apply if they increased the standalone price and someone didn't have a license. A lot of companies started to see that their existing business model wasn't sustainable long-term. Specialist software like Maxon's isn't used by millions of people, and when people stop paying, the company runs the risk of going bankrupt or letting employees go. This nearly happened a few times with Avid; they were bought out last year:

    https://sonicatlas.co/stg-buys-avid/

    People having tough circumstances in life is irrelevant to how a business with 30,000 employees runs itself; Adobe has an ongoing payroll of around $1.5 billion per year. You're suggesting that businesses should structure their business model around scenarios where someone isn't able to pay $20-50/month. If someone's income stream depends on the software, it's their responsibility to have measures in place, like insurance, paying for it on credit, or asking their customers to pay something upfront. It's not Adobe's responsibility to deal with this, just like it's not the responsibility of an internet service provider, a mortgage lender, and so on.

    Recurring revenue keeps businesses sustainable, and every business aims for a long-term sustainable model. Perpetual licenses aren't sustainable for big businesses or ones with cloud services. Adobe's integration of cloud-based AI across their product line is only possible with recurring revenue. Similarly, Canva could only exist with recurring revenue, and it's why they were able to buy out Serif.
  • Canva's Affinity deal will shake the Adobe status quo

    abriden said:
    Adobe Creative Cloud is a relative bargain if one is working professionally, on a full-time basis. However, what happens if one's employment is paused or reduced, either by market forces, family commitments or chronic health issues. In these situations, CC is immediately expensive and yet without it one loses access to much of their work as well as the ability to maintain and update their skill-set should circumstances improve.

    Furthermore, it used to be possible to work across different disciplines but with each now subject to subscription-pricing it is almost impossible to ...for example, I spent years invested in Cinema 4D but it was not my core speciality and now it's impossible to sustain that subscription alongside Adobe CC and others.

    It is fine incentivising students to use certain tools, but with an uncertain job market, globally, how many of them can afford to maintain the software between graduating and finding the roles for which they studied.

    I believe that these developers need to rethink their strategies to serve real-world employment circumstances as they evolve throughout one's lifecycle. 
    The combined monthly price of the full Adobe Suite ($60/m) and Cinema 4D ($80/m) is $140 per month.

    The standalone purchase price of Cinema 4D used to be around $3600, and the Adobe suite was $2600.

    Adobe initially kept the standalone option but so many people jumped on the monthly option that it didn't make sense to keep it. Customers drove this trend, not the big companies.

    Anyone complaining that they'd struggle to come up with $140/month definitely couldn't come up with a $6200 lump sum, which is equivalent to nearly 4 years of subscription revenue.
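
    Spelling out the arithmetic behind that 'nearly 4 years' figure, using the prices above:

    adobe_cc, cinema_4d = 60, 80            # monthly subscription prices in USD
    lump_sum = 2600 + 3600                  # old standalone prices for both

    monthly = adobe_cc + cinema_4d          # $140/month
    months = lump_sum / monthly             # ~44 months
    print(f"${lump_sum} up front = {months:.0f} months (~{months / 12:.1f} years) of subscriptions")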

    The subscription model removes much of the incentive for piracy because pretty much everyone has some amount of monthly income. Students can easily come up with the student price of $15-20/month. The average student loan in the US is $30k; over a four-year degree, the subscription would be around 3% of that.

    Monthly payments are why the iPhone is so popular. Hardly anyone would pay $700+ outright for a phone but $20/month is negligible.

    The arguments about health issues or whatever causing people to not be able to pay their subscription don't make any sense. If people reached that level of poverty, they'd lose access to their accommodation, food, internet, transport too. Software subscriptions would be the least of their worries.

    Apple could do the same with Final Cut and Logic: sell them for $3-5/month, ideally with yearly options ($36-59/year), and add in some cloud storage for exports so families can edit movies and share them with people.