auxio

About

Username
auxio
Joined
Visits
125
Last Active
Roles
member
Points
4,741
Badges
2
Posts
2,728
  • iPhone isn't securable enough for the South Korean military - but Android is

    Is bog-standard Android open source?
    The trick with Android is that if you want your Android device to have access to Google services (search, maps, mail, etc.), then you need to be GMS certified. Those APIs & apps aren't open source.

    AFAIK, Samsung devices are GMS certified, so there's no way they can block Google's apps from using the microphone. So it appears to be a symbolic ban based on favouring the local manufacturer. Though perhaps they have a special arrangement with Google on this.
  • Gaming and AI are in Mac's future, even with low memory capacities


    jdw said:
    Sorry, but all the defenses of Apple on this are total poppycock.  I'm a huge defender of Apple, and have been since 1984.  People who go around doing nothing but trashing Apple are trash themselves.  But let's face it, folks, RAM is one area Apple has ALWAYS failed the consumer on.  Apple Silicon and RAM efficiency talk is garbage.  What matters is PRACTICAL USABILITY.  Does somebody online talking about efficiency magically make 8GB good enough for YOU?  No.  No, it does not.
    I just got into the technical details, but hey, stick with the "magical" thinking because you can't comprehend the actual engineering details.

    The big problem is that every application you use these days is web, web, web, which adds a bunch of extra overhead (a JavaScript runtime) and doesn't allow for proper hardware optimization like the RAM/VRAM optimization I described above. The Google heads would love for you to be on the web using Chrome for everything, because that's where they make their money (tracking everything you do). Second-rate software development companies love it too because it saves them development costs: they don't need to hire people who truly understand how operating systems and hardware work (so that things can be properly optimized for different systems). And unfortunately it leads to all the bloat and overhead which requires more memory and CPU power.

    I guarantee that if you stick to native applications outside of a web browser, you'll find you rarely (if ever) hit the limits of the average computer unless you're doing very specialized things like software engineering, scientific data analysis, CAD, video editing, etc. And if you're doing those, you should be looking at the pro models, which have higher base specifications.
  • Gaming and AI are in Mac's future, even with low memory capacities

    elijahg said:
    auxio said:
    Unified memory IS more efficient than DDR. There’s no 1x1 comparison between the two. 
    DDR (double data rate) is just the data throughput of RAM, which doesn't say anything about how much memory is required by applications.

    However, the point about unified memory being more efficient (apps require less overall RAM) is correct. Without it, apps which need to (for example) display an image on the screen must store a copy of that image in both CPU memory (RAM) and GPU memory (VRAM). With unified memory they only need one copy, because both the CPU and GPU can access the same memory. The same holds true for machine learning and the NPU (neural processing unit).

    All that said, people without critical thinking skills (i.e. the majority of the population) simply follow the "bigger is better" logic. And so if Apple hopes to sell to such people, they'll have to bump the specs, even if those people will never need that extra memory.
    The second half is not true. The same image data does not need to be in RAM and VRAM.
    Please enlighten me as to how the GPU can load an image from disk directly into VRAM when it has no ability to access the hard drive (GPU is just specialized graphics compute power with no device I/O capabilities).

    Sure, once the image has been loaded from disk by the CPU into RAM and then copied to a texture/buffer in VRAM, it can be purged from RAM. But that doesn't negate the fact that, at image load time, you'll have 2 copies of the image in memory (and thus require double the memory) unless the GPU and CPU can share the same memory (i.e. unified memory).

    I could go on about how, if you use a lot of images, you'll need to cache some in RAM since you typically have more RAM than VRAM (or you need the VRAM for other things like vertex buffers, frame buffers, etc.), but I think the original example is a good, simplified explanation. It's a similar situation with AI/ML and the NPU.
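
    To make that load path concrete, here's a minimal Metal sketch of the discrete-GPU case. It's only an illustration, assuming an already-decoded RGBA8 pixel buffer; the function name and parameters are made up, not any particular app's code. The CPU stages the pixels in a shared buffer (RAM), then a blit copies them into a private, VRAM-resident texture, so two copies of the image exist until the staging buffer is released.

    import Metal

    // Hypothetical sketch of the discrete-GPU path: the CPU decodes into RAM, then the
    // blit engine copies that data into a VRAM-resident (storageModePrivate) texture.
    // Until the staging buffer is released, the pixel data exists twice.
    func uploadToDiscreteGPU(device: MTLDevice, queue: MTLCommandQueue,
                             pixels: [UInt8], width: Int, height: Int) -> MTLTexture? {
        // 1. CPU-side copy: decoded pixels sit in a shared staging buffer (system RAM).
        guard let staging = device.makeBuffer(bytes: pixels,
                                              length: pixels.count,
                                              options: .storageModeShared) else { return nil }

        // 2. GPU-side destination: a private texture that only the GPU can touch (VRAM).
        let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .rgba8Unorm,
                                                            width: width, height: height,
                                                            mipmapped: false)
        desc.storageMode = .private
        guard let texture = device.makeTexture(descriptor: desc),
              let cmd = queue.makeCommandBuffer(),
              let blit = cmd.makeBlitCommandEncoder() else { return nil }

        // 3. RAM -> VRAM copy; both copies coexist until `staging` goes away.
        blit.copy(from: staging, sourceOffset: 0,
                  sourceBytesPerRow: width * 4, sourceBytesPerImage: width * 4 * height,
                  sourceSize: MTLSize(width: width, height: height, depth: 1),
                  to: texture, destinationSlice: 0, destinationLevel: 0,
                  destinationOrigin: MTLOrigin(x: 0, y: 0, z: 0))
        blit.endEncoding()
        cmd.commit()
        return texture
    }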

  • Gaming and AI are in Mac's future, even with low memory capacities

    Unified memory IS more efficient than DDR. There’s no 1x1 comparison between the two. 
    DDR (double data rate) is just the data throughput of RAM, which doesn't say anything about how much memory is required by applications.

    However, the point about unified memory being more efficient (apps require less overall RAM) is correct. Without it, apps which need to (for example) display an image on the screen must store a copy of that image in both CPU memory (RAM) and GPU memory (VRAM). With unified memory they only need one copy, because both the CPU and GPU can access the same memory. The same holds true for machine learning and the NPU (neural processing unit).

    All that said, people without critical thinking skills (i.e. the majority of the population) simply follow the "bigger is better" logic. And so if Apple hopes to sell to such people, they'll have to bump the specs, even if those people will never need that extra memory.
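
    For contrast with the discrete-GPU sketch above, here's a minimal sketch of the unified-memory case, again with a made-up function name and an assumed RGBA8 pixel buffer. Because the CPU and GPU share one physical pool of memory, the texture can live in shared storage and the CPU writes straight into it; there's no separate VRAM-resident copy to create.

    import Metal

    // Hypothetical sketch of the unified-memory path: one shared allocation that
    // both the CPU and GPU can access, so no second VRAM copy is ever made.
    func uploadOnUnifiedMemory(device: MTLDevice,
                               pixels: [UInt8], width: Int, height: Int) -> MTLTexture? {
        assert(device.hasUnifiedMemory, "This path assumes a unified-memory GPU")

        let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .rgba8Unorm,
                                                            width: width, height: height,
                                                            mipmapped: false)
        desc.storageMode = .shared   // one allocation, visible to both CPU and GPU

        guard let texture = device.makeTexture(descriptor: desc) else { return nil }

        // The CPU writes the pixels directly into the texture's backing storage;
        // the GPU samples those same bytes, so the image only occupies memory once.
        pixels.withUnsafeBytes { raw in
            texture.replace(region: MTLRegionMake2D(0, 0, width, height),
                            mipmapLevel: 0,
                            withBytes: raw.baseAddress!,
                            bytesPerRow: width * 4)
        }
        return texture
    }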
  • Rumor: M4 MacBook Pro with AI enhancements expected at the end of 2024

    ajmas said:
    tyler82 said:
    How many years until Apple actually has decent AI though? They're not really a leader right now. I'd wait until then to buy a current computer that could actually take advantage of AI advancements.
    What do you mean by decent AI and how do you define AI? Also AI for what purpose? 
    To mindlessly drive up the value of shares, silly. AI is the new tech buzzword: if you don't use it in your quarterly report, your company valuation drops significantly.