melgross

About

Username: melgross
Joined:
Visits: 111
Last Active:
Roles: member
Points: 10,525
Badges: 2
Posts: 33,510
  • Google is downplaying Android to focus its future on Chrome OS

    The main reason Google is walking away from Android, at least for tablets and notebooks, is AOSP. For anyone who doesn't know, that's the Android Open Source Project, the actual open-source version of Android (Android itself is NOT open software, despite what a lot of people think).

    AOSP was intended to be the poor man's, or poor manufacturer's, version of Android. The idea was that companies selling cheap phones in poor countries would want an OS unencumbered by the licensing requirements of "real" Android. As people got more money, those companies, and their customers, would easily transition to licensed Android devices. Those devices carry much stricter licensing requirements, and can use Google's official services, such as search, maps, the Google Play Store, etc.

    Companies using AOSP aren't allowed to call it Android, and companies making Android devices aren't allowed to use AOSP.

    But that plan went awry. Google pulled out of China, leaving the field to Chinese manufacturers backed by the Chinese government, and as those manufacturers rose, they had no incentive to move to Android.

    So companies using AOSP began to develop their own services: browsers, maps, search engines, etc. Real Android became less necessary. Google continued to remove services from AOSP in the hope of pushing companies toward licensed Android, but it didn't work. Now about 65% of all "Android" phones out there are really AOSP phones, which Google has little control over and gets nothing from. Instead of being a gateway to Android and Google's services, AOSP has become a burden.

    That's one of the reasons Google turned the Chrome browser into Chrome OS. How successful they'll be at moving phones to it is hard to say, as they've moved Chrome OS to x86 from ARM.
  • Why the 'iPhone 8' may see Apple finally adopt OLED

    I'm reading the same things here that I've read elsewhere. Things that simply aren't true.

    One is that OLEDs have better color. That's not true. Color is the result of a number of decisions. LCD panels, backlights, and filters can give color gamuts that are just as wide as OLED's, and when lifetimes are taken into account, even wider.

    The second involves contrast. There's an amazing lack of understanding of what contrast means. Yes, OLED can have dead blacks (though it doesn't always, for technical reasons). But a small decrease in black level compared to the best LCD doesn't translate into better contrast, or more contrast, in any practical sense. A major reason is that LCDs can be much brighter than OLED. That gives a wider range of light output, a range that OLED can't match.

    Another reason is that our eyes can only accommodate a certain range of contrast. In normal situations, our irises open and close automatically as we look around our environment. It's why we can see into dark shadows and still pick up detail where a camera records a black hole. But when we're viewing graphics, photos, or video, we can't do that. We see the entire image at once, and our eyes can't accommodate the brightness shifts. So very dark grey looks perfectly black, while brighter areas look bright, brighter than OLED can get. So contrast may easily look greater on the LCD screen.
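    To put a rough number on that, here's a back-of-the-envelope Python sketch of how reflected ambient light eats into a contrast advantage. Every nit, lux, and reflectance figure below is an illustrative assumption, not a measurement of any real panel.

    ```python
    # Rough sketch: effective on-screen contrast once ambient light reflects
    # off the glass. All numbers are illustrative assumptions, not measurements.
    from math import pi

    def effective_contrast(white_nits, black_nits, ambient_lux, reflectance=0.05):
        # Diffusely reflected luminance (nits) ~= reflectance * illuminance / pi.
        reflected = reflectance * ambient_lux / pi
        return (white_nits + reflected) / (black_nits + reflected)

    # Hypothetical panels: a bright LCD vs. a dimmer OLED with near-perfect black.
    for lux in (0, 50, 500):  # dark room, dim room, bright office
        lcd = effective_contrast(white_nits=650, black_nits=0.45, ambient_lux=lux)
        oled = effective_contrast(white_nits=400, black_nits=0.0005, ambient_lux=lux)
        print(f"{lux:>4} lux: LCD {lcd:>9,.0f}:1   OLED {oled:>9,.0f}:1")
    ```

    In a pitch-black room the OLED's near-zero black wins by orders of magnitude; add ordinary room light, and the reflected glare sets the effective black level for both panels, letting the brighter LCD pull even or ahead.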

    And that old bugaboo about efficiency. Sure, in theory OLEDs can be more efficient. But while OLED efficiency has been rising, so has LCD's, mainly because backlights, which are made from inorganic LEDs that are far more efficient than OLEDs, have also gained in efficiency. The relative efficiency of OLED and LCD has therefore remained about the same, which is to say that neither has an advantage at this point.

    Apple uses an OLED screen for the Watch not because of efficiency, but because it's simpler to make and significantly thinner, with less unlit edge width.

    We also see that Samsung's display only reaches a maximum brightness of under 400 nits, though auto mode will bring that to slightly over 600 in bright, direct sunlight. Allowing more than that otherwise shortens the life of the display. Apple's phones go to 550-600 nits with manual adjustment, reaching about 700 in auto mode in direct sun. There's also some burn-in with OLEDs, though it's gotten better over the years.

    An interesting exception to the maximum output is the Apple Watch Series 2, which reaches 1,000 nits in direct sunlight. I've been trying to figure out how Apple does that. Either they use a higher quality display, or a slightly different design, perhaps with some of their own IP, or they simply figure we don't look at our Watch displays that much, so the decrease in lifespan doesn't matter. I've looked at the Watch display under one of my microscopes, and it does have an unusual makeup: one long vertical blue subpixel running alongside a small horizontal red subpixel, above which is an almost square green subpixel. Most of the space between the subpixels is black. I've never seen that arrangement before. Since blue is the first to go, possibly that long blue subpixel is there to allow greater brightness.

    Will all of this change some day? Sure, but it's not the case now.

    Why would Apple be interested? For one thing, there's an efficiency case to be made if Apple wants to follow what some OLED phone manufacturers have been doing with always-on displays, or some other small-area display function. In a case where only 1% or so of the display elements need to be lit, OLED is the choice, because an LCD needs its entire backlight on the whole time, though I've seen some experimental displays with segmented backlights. In addition, when you need a display with punched-out holes, say for speakers, microphones, cameras, etc., it's much easier to do that with an OLED than an LCD. So for an entire phone face covered by a display, OLED would be the choice.
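    A toy model of that 1% case makes the asymmetry obvious. The wattages below are made-up placeholders chosen only to show the shape of the argument, not figures for any real panel.

    ```python
    # Emissive vs. transmissive power for an always-on clock (~1% of pixels lit).
    # All wattages are invented placeholders, not real panel measurements.
    OLED_FULL_WHITE_W = 0.80  # hypothetical OLED, every pixel at full brightness
    OLED_DRIVER_W = 0.05      # fixed driver/controller overhead
    LCD_BACKLIGHT_W = 0.40    # backlight burns this even for a mostly black frame
    LCD_PANEL_W = 0.05        # LCD logic/driver overhead

    def oled_power(lit_fraction):
        # Emissive: power scales with how much of the image is actually lit.
        return OLED_DRIVER_W + OLED_FULL_WHITE_W * lit_fraction

    def lcd_power(lit_fraction):
        # Transmissive: the whole backlight stays on no matter what's shown.
        return LCD_PANEL_W + LCD_BACKLIGHT_W

    print(f"1% lit clock  : OLED {oled_power(0.01):.3f} W vs LCD {lcd_power(0.01):.3f} W")
    print(f"90% white page: OLED {oled_power(0.90):.3f} W vs LCD {lcd_power(0.90):.3f} W")
    ```

    At 1% lit, the emissive display is an order of magnitude cheaper to run; flip to a mostly white screen and the advantage reverses.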
  • Apple Car research focusing on use of Tesla-style induction motor

    I use three-phase induction motors in some of my equipment. To control them, I use what are called VFDs, which, depending on the motor, run off either single-phase 120 or single-phase 240 and convert it to three-phase 240 (at the outlet in most places we actually measure about 220). VFD stands for variable frequency drive. By varying the frequency, the speed of the motor can be varied over a wide range, often from zero up to possibly 3,600 RPM, as with my motors. These are complex devices. They have control over every parameter of the motor, and they monitor motor characteristics and health. They offer reverse, jog, degree of turn, and most importantly, constant torque.
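    The speed relationship is the standard synchronous-speed formula, rpm = 120 x f / poles; that's where a figure like 3,600 RPM comes from for a 2-pole motor at 60 Hz. A quick sketch (the frequencies and pole counts here are just examples):

    ```python
    # Synchronous speed of an induction motor: rpm = 120 * f / poles.
    # A VFD varies f, so shaft speed tracks the drive's output frequency
    # (minus a little slip under load).
    def sync_rpm(freq_hz, poles):
        return 120 * freq_hz / poles

    for f in (10, 30, 60, 90):
        print(f"{f:>2} Hz -> {sync_rpm(f, 2):>6.0f} rpm (2-pole), "
              f"{sync_rpm(f, 4):>6.0f} rpm (4-pole)")
    # 60 Hz on a 2-pole motor gives 3600 rpm synchronous, matching the
    # figure mentioned above.
    ```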

    There are special "inverter duty" motors for this purpose, because motors don't like to run off their designed frequency of either 50 or 60 Hz. If they do, they get too hot, and the VFD will then shut them down. So while a drill press can use a regular (cheaper) three-phase motor with a VFD, it's very disconcerting to have a mill or lathe shut off in the middle of a long cut.

    The technology the article is talking about is almost exactly the same technology we use. But our equipment isn't meant to be bumping along the road while in use.
  • Apple's 'iPhone 8' to boast larger Plus-sized battery in form factor similar to 4.7" iPhon...

    We continue to read about how much more efficient OLED screens are, particularly because black pixels aren't illuminated from behind. We rarely read, in those same articles, just how much more power OLED pixels consume when they're driven to a bright level.

    Apple's Watch uses an OLED screen, according to Apple, because it's thinner than an LCD with its LED backlight, not because it's significantly more efficient. The entire reason OLED devices often have a black background is their inefficiency at brighter levels. The interesting thing here is that while OLEDs have become more efficient over the years, so have LCD LED backlights. The two screen types are at about the same overall efficiency, and apparently will continue to be for some time.
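    One way to see that trade-off is to ask at what average picture level (APL) an emissive panel stops beating a constant backlight. A toy calculation, with invented wattages standing in for real panel numbers:

    ```python
    # Where does an emissive OLED stop beating a constant-backlight LCD?
    # Both wattages are invented for illustration, not real measurements.
    OLED_FULL_WHITE_W = 0.80  # hypothetical: every pixel at full brightness
    LCD_TOTAL_W = 0.45        # hypothetical: backlight + panel, content-independent

    def oled_w(apl):
        # apl in [0, 1]: average fraction of full white across the frame.
        return OLED_FULL_WHITE_W * apl

    for apl in (0.1, 0.3, 0.5, 0.7, 0.9):
        winner = "OLED" if oled_w(apl) < LCD_TOTAL_W else "LCD"
        print(f"APL {apl:.0%}: OLED {oled_w(apl):.2f} W vs LCD {LCD_TOTAL_W:.2f} W -> {winner}")

    # Crossover at apl = LCD_TOTAL_W / OLED_FULL_WHITE_W, about 56% with these
    # numbers: dark UIs favor OLED, bright mostly-white content favors the LCD.
    ```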

    OLEDs do have some other advantages, mainly the mythical edge-to-edge screen. And, as I've mentioned, they're thinner. The disadvantages include the still-problematic burn-in. While that has improved over time, it still exists, and it's part of the continuing problem of shorter overall screen life. The shorter screen life and burn-in are related to the reason OLEDs aren't nearly as bright as LCDs. When more power is poured in, they, like every other illumination device, get hotter. But OLEDs can't run as hot as an inorganic LED, so they can only take so much power. That means their brightness is restricted.

    The latest OLED screens are stuck below 400 nits in normal mode. They can jump to over 600 for a short time in direct daylight, but there's no manual control over that high-brightness mode. LCDs can get to over 600 nits in normal mode, and up to 700 nits in bright daylight. It's true that most of the time that isn't needed, but when it is, it makes a big difference.

    I don't know what Apple is doing with the Apple Watch Series 2 OLED screens, as Apple states that they can reach 1,000 nits in direct daylight, and indeed, it's a lot brighter than my friend's first-gen Apple Watch under those conditions. Either Apple has made a breakthrough that their manufacturer uses exclusively for them, or Apple isn't worried about shortening the screen's lifetime, since watches aren't used as much as smartphones.

    But my take on these stories is to be just a bit skeptical about the virtues of OLEDs. While they're better than they used to be, as is everything electronic, they're not yet a paragon of virtue.
  • iPhone 11 Pro Max screen secures 'highest ever A+ grade' in lab testing

    Apple claims (and they have a fair number of OLED patents) that while their screens are made by Samsung, they include Apple technology. I don't know exactly what that means, but it's somewhat different from the screens Samsung uses for itself, and presumably from those it sells to others besides Apple.
  • Text of FCC 'Proposal to Restore Internet Freedom' released, eradicates net neutrality rul...

    Well, here we go, just another move by the Trump administration to take more rights away from us. Removing these rules, which were hard fought for, will allow ISPs to decide which sites they'll carry. One day, if someone at Comcast, Spectrum, AT&T, Verizon, or the others is a Windows person, we may not be able to get AppleInsider from them. Isn't that just great?
  • Why Apple uses integrated memory in Apple Silicon -- and why it's both good and bad

    Ok, so the writer gets it wrong, as so many others have when it comes to M series RAM packaging. One would think that this simple thing would be well understood by now. So let me make it very clear: the RAM is NOT on the chip. It is NOT "in the CPU itself." As we should all know by now, it's in two packages soldered to the substrate, which is the small board that the SoC itself is soldered to. The lines from Apple's fabric, which everything on the chip is connected with, extend across that substrate to the RAM chips. The RAM chips are therefore separate from the SoC, and certainly not in the CPU itself.

    As we also know, Apple offers several different levels of RAM for each M series chip they sell. That means there's no limit to how much RAM they can offer, up to the number of memory lines that can be brought out. This is no different from any traditional computer; every CPU and memory controller has a limit to how much RAM can be used. So it seems to me that Apple could, if it wanted to, have sockets for those RAM packages, which would add no latency and would allow exchangeable RAM packages. Apple would just have to extend the maximum number of memory lines out to the sockets; how many got used would depend on the amount of RAM in the package. That's nothing new. That's how it's done. Yes, under that scheme you'd have to remove a smaller RAM package when getting a larger one, but that's also normal. The iMac had limited RAM slots, and we used to do that all the time. Apple could also add an extra two sockets in addition to the RAM that comes with the machine, so possibly there would be two packages soldered to the substrate and two more sockets for RAM expansion.

    Remember that Apple sometimes does something a specific way not because that's the way it has to be done, but because they decided this was the way they were going to do it. We don't know where Apple is going with this in the future. It's possible that the M2, which is really just a bump from the M1, is something to fill the time while we're waiting for the M3, which, with the 3nm process it's being built on, is expected to be more than just another bump in performance. Perhaps extended RAM capability is part of that.
  • New energy regulations prompt Dell to stop sales of high-performance PCs in six states

    lkrupp said:
    Because Dell is a privately owned company Michael Dell can do whatever he wants, so he pulled high-end products from those states. I wonder how the big-time gamers in California will react to this. 

    If Dell were still a publicly owned corporation there would have been a crap-storm of major proportions and Dell stock would plummet. I sometimes wish that Apple could go private so it could give the finger to stupidity. Buy me out, Apple! I’m ready.
    I guess you don’t know this, but Dell hasn’t been a private company for years. Its stock is up today to $96.
  • Apple announces M1 as first Mac Apple Silicon chip

    Something's not right. They were talking about software running up to 3.8 times faster, so how could performance be equal to a two-core Air chip? I just watched it, and I didn't get that it was equivalent. We're all missing something.

    It just occurred to me what they said. I also said this on Ars Technica. It's the four efficiency cores that are equal to the MacBook Air's x86 chip, not the entire M1.
  • ARM Mac coming in first half of 2021, says Ming-Chi Kuo

    lkrupp said:
    Any ideas on how Apple will handle the X86 code of current apps to run on ARM architecture? I am not educated on this. Is ARM close enough to X86 that the transition will be easy or will it require a Rosetta-like translation framework like the move from Moto 68000 to X86 did. Will we have universal binaries again or something else during the transition?
    This is the problem I've been wondering about for some time. While some people dismiss it as an issue, or in most cases don't even think about it (aren't aware it is an issue), it's the biggest issue Apple will need to deal with. In previous changeovers, even Apple was very lax in getting its own big apps out; it took them a year. It took a long time for Adobe and Microsoft, with their massive software, to come over too.

    ARM is not close to x86. It's optimized for battery life over performance. Apple and ARM have made significant advances on that front, but the instruction sets are different enough. We know from previous attempts at emulation that a processor family needs to be about five times as powerful to run software at the same speed as the family it's emulating. That hasn't changed. Microsoft supposedly does it now, with their "universal" SDK. But they don't, really. They require software to be rewritten and recompiled for ARM. And there have still been issues with performance, specific features, and bugs.

    I'm not saying it can't be done, because obviously it can. But if Apple really is going to release a device next year, there will either be significant limitations, or they've figured out a way around them. My suggestion, which no one here has ever commented on, as far as I remember, is to add a dozen x86 instructions to the chip. It's been found that 80% of the slowdown between chip families comes from about a dozen instructions. The chip, or the OS, could hand execution over to those when native x86 software needs them. Individual instructions aren't patented, or copyrighted, as far as I know. If true, that would give Apple a way around the problem.
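    To illustrate just the shape of that argument (the opcode names, costs, and instruction mix below are entirely invented), here's a toy sketch of an emulator where a dozen hot instructions get a fast path, standing in for native hardware support, while everything else takes the generic slow path:

    ```python
    # Hypothetical sketch: a toy emulator where a handful of "hot" guest
    # instructions get dedicated fast handlers (standing in for native
    # hardware support), while everything else takes a generic slow path.
    # Opcode names, costs, and the trace are invented for illustration.
    SLOW_PATH_COST = 20   # interpreted/translated cost per op (arbitrary units)
    FAST_PATH_COST = 1    # cost when the host handles the op directly

    FAST_HANDLED = {"add", "mov", "cmp", "jcc", "load", "store",
                    "push", "pop", "test", "lea", "call", "ret"}  # ~a dozen hot ops

    def run_cost(trace, fast_ops=frozenset()):
        # Total cost to execute a guest instruction trace.
        return sum(FAST_PATH_COST if op in fast_ops else SLOW_PATH_COST
                   for op in trace)

    # A made-up trace where hot ops dominate, as they do in real code.
    trace = ["add", "mov", "load", "cmp", "jcc"] * 180 + ["div", "cpuid"] * 50

    pure = run_cost(trace)
    assisted = run_cost(trace, fast_ops=FAST_HANDLED)
    print(f"pure emulation: {pure}, with fast paths: {assisted}, "
          f"speedup ~{pure / assisted:.1f}x")
    ```

    Because the hot ops dominate the trace, accelerating just that handful recovers most of the lost performance, which is the whole point of the suggestion.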