nadriel
About
- Username
- nadriel
- Joined
- Visits
- 34
- Last Active
- Roles
- member
- Points
- 200
- Badges
- 0
- Posts
- 92
Reactions
Kanye West won't release 'Donda 2' on 'oppressive' Apple Music or Spotify
Oppressive, yeah, like having fans wait two hours for nothing and facing no consequences for failing to meet deadlines. I guess the $200 device is actually more like a $20 device. This all feels like his T-shirts. I'm not a fan, I don't listen to his music or really follow him, but he ends up in the news far too often because of his ridiculousness.
Apple shares 'Foundation' recap ahead of season finale
entropys said: Is it heresy to confess I think what Goyer has done to the story has improved it?
After the first few episodes I decided to take this as something inspired by the books and enjoy it as is. So far it reads like a space opera rather than the contest between superior intellects that the books really are. Oh, and I like that this feels like a totally different storyline.
Perfect? No, I still have my complaints. Enjoyable? Sure, though not for purists who feel offended by changes.
Apple rejecting apps is unfair competition, declare rejected app developers
lkrupp said: elijahg said: pwrmac said: Buy an Android! You are not forced by Apple to buy their services or devices.
Loud and widespread complaining is a pretty effective way to attract Apple's attention and bring about change, same as with other big companies. I'd really like them to allow sideloading, because it's a pain in the butt to renew the certificate every so often for apps I'm never putting on the App Store. Please leave Android out of this; I'm really not interested in it. I'd rather have the software I want on hardware I already own and like.
Back to the topic: with everything moving to shitty subscription models, I'm really with Apple on this. It's an exhausting business, milking people. More power to users and developers, but I'm really against greedy practices that skim pennies on a platform they're indebted to for their success.
Apple investigating RISC-V instruction set architecture, job listing shows
RISC-V processors are already used in a multitude of specialised cases: Seagate is testing them to "enable massive parallel computational" storage solutions; Nvidia uses RISC-V cores in their GPUs, likely handling some IO ("and beyond"); Western Digital is moving to RISC-V controllers in their hard drives. Don't take my words as gospel, but one advantage of RISC-V (as an ISA) over ARM (and x86) is that it doesn't carry baggage from ages ago. Not that information theory and computation theory have advanced so far that they're from a different universe, but some things can be done more efficiently with modern approaches. Benchmarks made by RISC-V developers highlight specific cases where their platform has an advantage over the others, but in general the difference is really small, on the order of +/- 5% for a specific compilation with a specific compiler (like GCC). Note the minus.
The other advantage is that it's really modular (the core ISA is small) and easy to customise, allowing custom instructions for ASIC purposes (for example, handling a high amount of IO in a GPU). Even though it's free and open source, the same problem comes up as with other ISAs: if you want something done, you need to hire people or buy the service if you can't do it yourself.
I'd see RISC-V becoming a huge thing in accelerators and in some specific controllers (power, USB, modem, etc.) within the next five years. Even though there are already some machines running Linux on RISC-V, they are low-performance and have a really long uphill climb to become even remotely mainstream.
Sketchy rumor claims Samsung courting former Apple engineers for custom chip project
There seems to be a huge misunderstanding around about optimization… When using high-level languages, most of the optimization is done by the compiler (assuming the code isn't total garbage), not by some magical level of understanding of the hardware you're coding for (though in rare occasions people like that do exist).
Lower-level languages (assembly, for example) are used in concert with high-level languages to optimize heavy tasks by making specific operations fast. This could be a ready-made function in the language, or an API you call, for example, to calculate an approximation of a square root; imagining and creating the most efficient way to do such a calculation is beyond most people (even at Apple). And in general, all low-level code is processor-specific and can't be ported without some modification, i.e., a specific processor instruction doesn't exist on another platform, or there's some other limitation or difference.
A first level of optimization is removing unnecessary loops (especially loops within loops) and if statements; then there are practices that are more efficient with memory access (arrays vs. matrices vs. dictionaries, etc.); and lastly there's parallelism, which is very hard to implement in some cases. I'm not saying the people at Apple are average or stupid, just that I don't expect the average coder at Samsung (or any other big tech company) to be different from the one at Apple, really.
I don't pretend to know much about hardware design, and I think Apple currently has pretty much the best silicon (the best compromise between choices) in their hardware, but I don't think they're as far ahead as the marketing and hype try to make us believe. Looking at TSMC vs. Intel chips under an electron microscope reveals little actual difference. The YouTuber der8auer has done exactly that: https://youtu.be/1kQUXpZpLXI, or Google the title "14nm and 7nm are not what you think it is" (not the full title) if you don't trust the link.
edit: YouTuber name typo