nadriel
About
- Username: nadriel
- Joined
- Visits: 34
- Last Active
- Roles: member
- Points: 200
- Badges: 0
- Posts: 92
Reactions
Apple investigating RISC-V instruction set architecture, job listing shows
RISC-V processors are already used in a multitude of specialised cases: Seagate is testing them to "enable massive parallel computational" storage solutions; Nvidia uses RISC-V cores in their GPUs, likely handling some IO ("and beyond"); Western Digital is moving to RISC-V controllers in their hard drives.
Don't take my words as gospel, but one of the advantages of RISC-V (as an ISA) over ARM (and x86) is that it doesn't carry baggage from ages ago. Not that information theory and computation theory have advanced so far that they're from a different universe, but some things can be done more efficiently with modern approaches. Benchmarks made by RISC-V developers highlight some specifics where their platform has an advantage over the others, but in general the difference is really small, something like +/- 5% in a specific compilation on a specific compiler (like GCC). Note the minus.
The other advantage is that it's really modular (the core is really small), easy to customise, and allows creating custom instructions for ASIC purposes (for example, handling a high amount of IO in a GPU). Even though it's free and open source, the same problem comes around as with other ISAs: if you want something done, you need to hire people or buy the service if you can't do it yourself.
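To make that modularity concrete, here's a toy Python sketch (my own illustration, not real tooling) of why the base decoder can stay tiny: RV32 keeps the major opcode in the low 7 bits of each instruction, and extensions — including vendor "custom-*" instructions — slot into reserved opcode values without touching the base:

```python
def decode_rv32(word):
    """Toy decoder: classify a 32-bit RV32 instruction by its major opcode.

    The base ISA keeps the major opcode in the low 7 bits; extension and
    vendor ("custom-*") instructions get their own reserved opcode slots,
    so adding instructions never changes the base decode path.
    """
    OPCODES = {
        0b0110011: "OP (base integer, R-type: add, sub, ...)",
        0b0010011: "OP-IMM (I-type: addi, slti, ...)",
        0b0000011: "LOAD",
        0b0001011: "custom-0 (reserved for vendor extensions)",
    }
    return OPCODES.get(word & 0x7F, "unknown/unimplemented")

# 0x00b50533 encodes `add a0, a0, a1` in RV32I
print(decode_rv32(0x00b50533))
```

A real vendor core would map its accelerator instructions into the custom-0/custom-1 opcode space; everything else about the core is untouched, which is the whole selling point.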
I'd see RISC-V becoming a huge thing in accelerators and in some specific controllers (power, USB, modem etc.) within the next 5 years. Even though there are already some machines running Linux on RISC-V, they are low performance and have a really long uphill climb to become even remotely mainstream.
Kanye West won't release 'Donda 2' on 'oppressive' Apple Music or Spotify
Oppressive, yeah, like having fans wait 2 hours for nothing and facing no consequences for failing to keep deadlines. I guess the $200 device is actually more like a $20 device. This all feels like his T-shirts.
Not a fan, I don't listen to his music nor really follow him, but he gets in the news way too often due to his ridiculousness.
Apple readying 96W USB-C power adapter for 16-inch MacBook Pro
philboogie said: I never gave it much thought, but this piece on a breakdown of an Apple power adapter (a switching power supply) is a must-read:
"Macbook charger teardown: The surprising complexity inside Apple's power adapter"
http://www.righto.com/2015/11/macbook-charger-teardown-surprising.html
Quote: "Apple's involvement with switching power supplies goes back to 1977 when Apple's chief engineer Rod Holt designed a switching power supply for the Apple II. According to Steve Jobs:
"That switching power supply was as revolutionary as the Apple II logic board was. Rod doesn't get a lot of credit for this in the history books but he should. Every computer now uses switching power supplies, and they all rip off Rod Holt's design."
This is a fantastic quote, but unfortunately it is entirely false. The switching power supply revolution happened before Apple came along, Apple's design was similar to earlier power supplies[4] and other computers don't use Rod Holt's design. Nevertheless, Apple has extensively used switching power supplies and pushes the limits of charger design with their compact, stylish and advanced chargers."
But again, this reminded me that it's not only greed that drives the cost of Apple products. Thanks for the link, interesting read!
Apple shares 'Foundation' recap ahead of season finale
entropys said:Is it heresy to confess I think what Goyer has done to the story has improved it?
After the first few episodes I decided to take this as something inspired by the books and enjoy it as is. So far it feels like a space opera rather than the superior-intelligence measuring contest between characters that the books really are. Oh, and I like that this feels like a totally different storyline.
Perfect? No, I still have my complaints. Enjoyable? Sure, but not for fundamentalists who feel offended by changes.
USB-C group hopes new logos will solve customer confusion
Apple rejecting apps is unfair competition, declare rejected app developers
lkrupp said: elijahg said: pwrmac said: Buy an Android! You are not forced by Apple to buy their services or devices.
Loud and widespread complaining is a pretty effective way to get Apple's attention and bring about change, same with other big companies really. I'd really like them to allow sideloading apps, because it's a pain in the butt to renew the certificate every so often for apps that I'm never putting on the App Store. Please leave Android out of this, I'm really not interested in it. I'd rather have what I want on hardware I already own and like.
Back to the topic: with everything going to shitty subscription models, I'm really with Apple on this. It's an exhausting business milking people. More power to users and developers, but I'm really against greedy practices skimming pennies on a platform they're indebted to for their success.
Sketchy rumor claims Samsung courting former Apple engineers for custom chip project
There seems to be a huge misunderstanding around optimization… When using high-level languages, most of the optimization is done by the compiler (assuming the code isn't total garbage), not by some magical level of understanding of the hardware you're coding on (in rare occasions people with that do exist too).
Lower-level languages (for example assembly) are used in concert with high-level languages to optimize heavy tasks by making them do specific jobs fast. This could be a ready-made function in the language or an API you call, for example to calculate an approximation of a square root; imagining and creating the most efficient way to do such a calculation is above most people (even at Apple). And in general all lower-level language use is processor specific and can't be ported without modifications, i.e. a specific processor call doesn't exist on another platform, or there's some other limitation or difference.
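As a concrete taste of that kind of low-level square-root trickery, here's the famous "fast inverse square root" bit hack (popularized by Quake III), reproduced in Python via struct purely for illustration — in practice this sort of thing lives in C or assembly, and it only works because it exploits the IEEE 754 float bit layout:

```python
import struct

def fast_inv_sqrt(x):
    """Approximate 1/sqrt(x) with the classic bit-level hack.

    Reinterprets the float's bits as an integer, applies the magic
    constant for a first guess, then refines with one Newton-Raphson step.
    """
    i = struct.unpack("<I", struct.pack("<f", x))[0]   # float bits -> int
    i = 0x5F3759DF - (i >> 1)                          # magic first guess
    y = struct.unpack("<f", struct.pack("<I", i))[0]   # int bits -> float
    return y * (1.5 - 0.5 * x * y * y)                 # one refinement step

print(fast_inv_sqrt(4.0))  # close to 0.5
```

The point isn't the exact trick but the category: hardware-aware code like this is unportable by nature (it assumes 32-bit IEEE floats), which is exactly why it stays fenced off behind a few hot functions.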
The first level of optimization is removing unnecessary loops (especially loops within loops) and if statements; then there are practices that are more efficient with memory access (arrays vs matrices vs dictionaries etc.); and lastly parallelism, which is very hard to implement in some cases. I'm not saying that people at Apple are average or stupid, just that I don't expect the average coder at Samsung (or at any other big tech company) to be different from the one at Apple, really.
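A minimal sketch of that first level — the same job done with a loop-within-a-loop versus a single pass using a data structure with cheap lookups:

```python
def common_slow(a, b):
    """Elements of a that also appear in b - nested loops, O(len(a)*len(b))."""
    out = []
    for x in a:
        for y in b:          # inner loop rescans b for every x
            if x == y:
                out.append(x)
                break
    return out

def common_fast(a, b):
    """Same result - build a set once, then one O(1)-lookup pass, O(len(a)+len(b))."""
    lookup = set(b)
    return [x for x in a if x in lookup]

print(common_fast([1, 2, 3, 4], [2, 4, 6]))  # [2, 4]
```

Nothing clever or hardware-specific here — it's exactly the kind of rewrite any competent coder at any of these companies would make, which is the point.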
I don’t pretend to know much about hardware design, and I think Apple has pretty much the best (compromise between choices for) silicon in their hardware currently, but I don’t think they’re as far ahead as the marketing and hype would have us believe. Looking at TSMC vs Intel chips under an electron microscope reveals little actual difference. Video from YouTuber der8auer, who has done exactly that: https://youtu.be/1kQUXpZpLXI, or google the title “14nm and 7nm are not what you think it is” (not the full title) if you don’t trust the link.
edit: YouTuber name typo
Greg Joswiak confirms iPhone's future move to USB-C
lkrupp said: nadriel said: This should’ve happened years ago! Apple obviously hasn’t cared about improving Lightning, apart from some small incremental changes.
The current devices and protocols aren’t even close to the max USB-C has to offer.
Granted, the switch will be painful for some. Or not really: my Lightning gadgets won’t become obsolete or stop working when their USB-C counterparts come out. I’ll either sell them or use them until they become inoperable.
People really need to step back and take a chill pill here. This is, and will be, an upgrade. Win-win.
ANSI, ISO, JIS, FCC, ECC and so on: standards already govern what radio frequencies cellular devices use, how batteries are manufactured, what plastics and materials can be used with foodstuffs, lead paint, environmental standards, safety standards on arsenic usage, etc.
Now, this is only port standardization for similar devices. This is a good thing. There is a provision in this law that takes future ports into account, and the bureaucrats don’t make the decision in a vacuum. There will be lobbying from the industry.
This is by no means the death of innovation. Not for ports, just like the standards for radio equipment weren’t for Wi-Fi, cellular and so on. And heck yeah, I’m going to gloat over this. This should’ve happened across all Apple devices along with the MacBooks 7 years ago.
iPhone 13 Pro Max supports faster 27W charging, but only temporarily
So, it works pretty much as intended? Don't all fast-charging gizmos start to limit the wattage after 50% or so? Li-ion batteries reach peak voltage around that point, and after that there are diminishing returns; batteries heat up, etc. I'm not an expert, but I think this is just the nature of battery chemistry and not an Apple thing, other than Apple seeming more aggressive about preserving battery lifetime with smart software limitations.
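A crude toy model of that taper (all numbers here are made up for illustration, not Apple's actual charging curve): hold peak wattage up to roughly half charge, then back off toward a trickle as the cell approaches its peak voltage.

```python
def charge_power_watts(soc, peak_w=27.0, taper_start=0.5, trickle_w=5.0):
    """Toy charging curve: constant power until `taper_start` state of
    charge (0..1), then a linear taper down to a trickle near full.

    Real chargers follow constant-current/constant-voltage stages and
    thermal limits; this just sketches the overall shape.
    """
    if soc <= taper_start:
        return peak_w
    frac = (soc - taper_start) / (1.0 - taper_start)
    return peak_w - (peak_w - trickle_w) * frac

for soc in (0.2, 0.5, 0.75, 1.0):
    print(f"{soc:.0%}: {charge_power_watts(soc):.1f} W")
```

The shape is the takeaway: headline wattage only applies to the early, low-voltage part of the charge, so "27W, but only temporarily" is how every fast charger behaves.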
Greg Joswiak confirms iPhone's future move to USB-C
This should’ve happened years ago! Apple obviously hasn’t cared about improving Lightning, apart from some small incremental changes.
The current devices and protocols aren’t even close to the max USB-C has to offer.
Granted, the switch will be painful for some. Or not really: my Lightning gadgets won’t become obsolete or stop working when their USB-C counterparts come out. I’ll either sell them or use them until they become inoperable.
People really need to step back and take a chill pill here. This is, and will be, an upgrade. Win-win.