nht

About

Username: nht
Joined:
Visits: 115
Last Active:
Roles: member
Points: 2,007
Badges: 1
Posts: 4,522
  • Boeing 737 Max pilots didn't have flight simulators, and trained on iPads instead

    kingofsomewherehot said:
    Software is not required to ensure it remains in the air. (That statement would be valid for an Airbus.... you should avoid flying on them.) 
    Would you care to provide a citation? Stating that all Airbus aircraft will crash without computer control is quite a sweeping statement. If you’re referring to fly-by-wire, my understanding is that the A320 has a mechanical backup, and all others have electrical (not electronic) backups in the case of multiple computer failures. If you’re suggesting that no Airbus aircraft would have the potential to survive a “Gimli glider”-like incident, I’d be interested to see your source.

    [edit] Guess what I just found: an Air Transat A330 landed safely after travelling 75 miles without any power  https://www.telegraph.co.uk/travel/travel-truths/can-a-plane-fly-with-no-one-engines/

    And of course the “Miracle on the Hudson” was an A320, so I’m really not sure what you’re on about. 
    That’s without engine power, not without computers.  You can maybe land an Airbus, but the mechanical backups only give you rudder and stabilizer.  At least on the older ones.  The newer Airbuses have a BCM (Backup Control Module), which counts as a computer but really isn’t likely to fail at the same time the normal control systems do...or at least no more likely than total hydraulic failure...which leaves most airliners screwed regardless.

    An Airbus is safe to fly. 
  • Google up to $9.4 billion in total fines to EU, with latest $1.7 billion AdSense penalty

    I don't care for Trump, but we should be fining EU companies $94B in response, since the EU has elected to target US tech companies because EU tech companies have been singularly uncompetitive vs the US and China...and the EU is too chicken to target China.
  • 5G iPhone unlikely until 2020, given Intel modem announcement


    Anyway, regarding the subject material, the wait until 2020 for 5G isn't a bad thing. There isn't going to be a network to speak of, in much the same way that Apple waited until LTE was built out better.
    You are not recognizing that this puts Apple in a real bind. On the one hand, yes, they will incur the additional cost of making the next iPhone compatible with what will surely be a partially implemented network. But on the other hand, many people (like me) will wait until 2020 to upgrade, since there is zero reason to buy something that will be technologically obsolete a year later.

    How should Apple navigate this trade-off right after experiencing one of the toughest years they've had financially thanks to decelerating iPhone sales?
    Lol...2017 Revenues: $229B.  2018 Revenues: $265B.  

    You're a terrible concern troll.  Toughest year = highest revenues ever.

    Horror of horrors...Apple slipped to $84B in Q1 2019...5% off 2018 and higher than 2015, 2016, or 2017.
  • Intel officials believe that ARM Macs could come as soon as 2020

    wizard69 said:
    tipoo said:
    BS. If it were coming to macOS then AMD Threadripper and Ryzen would already be here.
    How does this statement make any sense? What does AMD have to do with Apple planning to switch to their own ARM chips, with AMD using x86/AMD64? 
    It means Intel is deflecting. Apple needs Thunderbolt, period. It's the only reason they've stuck with Intel after Zen came out. Intel has ZERO threat of ARM supplanting them on the desktop and laptop, never mind the Data Center. They have every concern of AMD and future generations using their superior products for LESS COST.

    Apple was ecstatic when Intel announced Thunderbolt would be open sourced. Intel has dragged its feet for nearly 2 years since the announcement and it is still not royalty free and released.

    So there is no rational basis for Apple to invest heavily into augmenting their ARM designs for a workstation [Mac Pro], never mind the desktop/laptop. [And no, iOS is fast because it is very limited in the multi-user/multithreaded, multi-core processing that will be a must on macOS. There are literally hundreds to thousands of processes/threads that are and can be running inside OS X; ARM won't ever supplant what is coming down the pike.]

    Basic threads and processes on my MacBook Pro 13: 1391 threads, 346 processes. ARM would get slammed with that, and that is nothing when pushing an iMac Pro or Mac Pro.

    If you think Apple is going to screw over developers with ARM with the Mac Pro you're effing nuts.

    Intel bound Apple when Apple [and as a former NeXT/Apple Engineer I was there] needed a fusion of legitimacy, especially when IBM crapped the bed. At NeXT we made a Quad FAT architecture for the OS because Motorola fucked us over more times than you can imagine on their designs. HP did the same thing. The HP PA-RISC ran circles around x86 at the time. HP did nothing to follow through.

    Sun was just a clusterfuck of stupid with regards to the OpenStep initiative. Sun wanted all revenues on the hardware and to force us to cut the cost of OpenStep licensing. So people were "shocked" that it didn't take off? Please.

    ARM dictates designs. Apple modifies but within those design specs.

    You keep believing those pissant benchmarks the mobile world shows as performance figures. Throw 500 processes and 2000 threads at an iPhone and it crashes. There is a reason Apple has very limited subsets of functionality tuned around the tightly coupled hardware constraints.
    The Mac Pro is effectively dead.  Developers really don’t care about architecture as much as they do about performance, and it is pretty clear now that ARM has real advantages here, mainly because they can have cores running at 1-2 watts at higher clock rates than Intel or AMD.  This leads to the prospect of a Mac Pro running 50 to 100 cores at far higher clock rates than can be achieved with x86.  
    Um no.  First, ramp the clock and you ramp power usage.  A 3.3GHz ARMv8 draws 125W, or around 3.9W/core (32 cores); see the sketch after these two points.  Intel isn't suffering from low performance per watt...they're just expensive to get the good stuff.  If you had to build a server farm, would you choose an $800 ARM or an $899 Threadripper or EPYC?  That's a no-brainer...go AMD and run everything like before.  And Intel being $2000 for a Xeon really only means that if a price war comes, Intel has room to maneuver. 

    Second, developers do care about architecture, as many tools don't run well on ARM.  Like almost all of them.  That means an ARM desktop is a second-tier platform for virtualization, Docker, dev tools, infrastructure, driver support, etc.  Not to mention major apps will lag just like last time, and you lose the ability to run Windows apps.
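    A quick sketch of the per-core arithmetic in the first point, using only the figures quoted here; the helper name is just for illustration, and any competing x86 part's numbers are left as inputs rather than assumed:

    def per_core(watts, cores, price_usd):
        """Watts and dollars per core, for rough server-farm comparisons."""
        return watts / cores, price_usd / cores

    # The 3.3GHz ARMv8 part quoted above: 125W package, 32 cores, roughly $800.
    w_per_core, usd_per_core = per_core(125, 32, 800)
    print(f"{w_per_core:.2f} W/core, ${usd_per_core:.0f}/core")   # ~3.91 W/core, ~$25/core

    # Plug a Threadripper/EPYC TDP, core count, and street price into the same
    # helper to make the cost side of that comparison concrete.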

  • Untangling monitor resolution and size -- how to pick the best display for home and office...

    nht said:

    Even when the listing says it's for a 4K monitor and you know 4K is good, that's little to no help. It's because 4K, like most monitor standards, is utterly useless on its own. You need to know that 4K on a 21-inch monitor will look great and that 4K on a 49-inch one will be bad.

    And if only it were that simple. It's easy to appreciate that 4K at 49 inches is going to be fuzzier than the sharpness of that 21-inch 4K monitor. But that latter one is likely to make everything so small that it's unusable, too.
    This article is bogus.  
    • A 49" 4K display will not look "bad". I'm sitting in front of one at 30+ inches away.  It allows you to reduce the scaling and gain more desktop space while maintaining a very high level of readability.
    • DPI is also another "meaningless" spec, because it gives you the exact same data that 4K plus size gives you.  The missing element is viewing distance, which gives you pixels per degree (PPD).
    • At a normal 20-40 inch seating distance a 49" 4K display is between 30 PPD and 62 PPD (see the sketch below for the math).  60 PPD is the rule of thumb for a "retina display".
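    A minimal sketch of the PPD arithmetic behind those numbers, assuming a flat 16:9 UHD (3840x2160) panel viewed straight on; the function name and the Python form are just for illustration:

    import math

    def ppd(diagonal_in, px_w, px_h, distance_in):
        """Pixels per degree of visual angle for a flat panel viewed straight on."""
        # Pixels per inch from the native resolution and the diagonal size.
        ppi = math.hypot(px_w, px_h) / diagonal_in
        # One degree of visual angle spans roughly 2*d*tan(0.5 deg) inches at distance d.
        inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
        return ppi * inches_per_degree

    # 49" 4K panel at the near and far ends of a 20-40 inch seating range:
    print(round(ppd(49, 3840, 2160, 20)))   # ~31 PPD
    print(round(ppd(49, 3840, 2160, 40)))   # ~63 PPD

    The small differences from the 30 and 62 figures above come down to rounding and which small-angle approximation is used.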
    While you are welcome to your opinion on the applicability of the article, given that we're talking about a desktop monitor and viewing ranges are effectively the same from desk to desk, PPI is fine in this context.

    PPD as a metric makes more sense when comparing living room viewing distances, which can vary by feet, whereas viewing distances from desk to desk will vary by inches.

    While I appreciate your reading the article, given our previous conversations I'm nearly positive that your tech acumen is a bit higher than that of the reader this article is aimed at. While it works for you, I don't think that you'd recommend a 49-inch 4K display to any given monitor purchaser.
    20"-40" is the normal seating range for computer use.  At 20" a 27" 4K display is just shy of retina (57 PPD).  At 40" a 27" 4K display is overkill at 114 PPD from a resolution perspective.  That is less than 2 feet worth of variance that results in a large differences in resolution.  

    Distance IS the key factor in both required resolution and desktop size.  If you can reach out and touch the monitor screen you are closer to the 20" number.  If you can't touch your monitor without moving you are closer to that 40" number.  This is something that anyone can determine even without a ruler.  It's the difference between slouching back in your chair and standing (close) at a standing desk.
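    The same arithmetic, as a self-contained check of the 57 and 114 PPD figures for a 27" UHD panel (again just a sketch):

    import math

    ppi = math.hypot(3840, 2160) / 27                       # ~163 pixels per inch on a 27" UHD panel
    for distance_in in (20, 40):
        ppd = ppi * 2 * distance_in * math.tan(math.radians(0.5))
        print(f'{distance_in}": {ppd:.0f} PPD')             # 20": ~57 PPD, 40": ~114 PPD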

    27" is good for those that can easily touch their monitor.  
    43-49" is good for those that are more than a couple inches from touching their monitor.

    If you are short on desktop real estate and are forced to drop to the middle scaled desktop (2560x1440) because everything is too small on the 27" BenQ, then a 43-49" UHDTV for $500 is a better option if you can run it at native resolution and still read everything.

    And yes, I do recommend the Sony 43" UHDTV for $500 as a monitor for many folks...especially after they see mine.  For the college student, it doubles as a TV.  For a software dev it replaces the standard two-24" monitor setup with more space for stuff.  At home I use a 43" Samsung.  However, it depends on whether they like sitting close or far.

    Folks that like to sit at 20", not so much, because it occupies too much of your field of view.  When I lean on my standing desk I can't see the whole screen at once.  That's too close or too big.
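    To put a rough number on the field-of-view point, a sketch assuming a flat 16:9 panel viewed straight on (the function name is illustrative):

    import math

    def horizontal_fov_deg(diagonal_in, distance_in, aspect=(16, 9)):
        """Approximate horizontal field of view of a flat screen, in degrees."""
        aw, ah = aspect
        width_in = diagonal_in * aw / math.hypot(aw, ah)    # screen width in inches
        return math.degrees(2 * math.atan(width_in / (2 * distance_in)))

    print(round(horizontal_fov_deg(43, 20)))   # ~86 degrees at a 20" viewing distance
    print(round(horizontal_fov_deg(43, 40)))   # ~50 degrees at a 40" viewing distance

    At 20" a 43" panel fills most of your horizontal view, which lines up with the "too close or too big" effect described above.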