misa

About

Username
misa
Joined
Visits
34
Last Active
Roles
member
Points
270
Badges
1
Posts
827
  • Some US airlines prohibiting use of any Samsung phones in wake of Note 7 recall



    Reports have circulated that the public relations problem for Samsung is extending past just the Galaxy Note 7. AppleInsider can confirm that some American Airlines and Delta flights are instructing all Samsung phone users to power down, and not use their devices in flight.

    "Its hard to tell at a glance if a phone is a Note 7 or not," we were told when we asked about the in-flight safety briefing. "But we can tell if something's a Samsung from the aisle."

    This is one of the consequences of all smartphones chasing the "thin" rabbit: nobody can tell one year's model from another once a protective case or battery case is attached.

    Samsung's best option here, as sad as it is, is to recall 100% of the phones, issue new ones as the "Samsung Galaxy Note 7.1," and send any that aren't defective to "refurbishment" to have an "R" or something engraved on the model number, so that people don't end up with lemon models off auction sites and Craigslist at some later point.

    Samsung could also just write the entire line off, but I somehow doubt that will happen. Sure, it's fine and dandy when you force users to upgrade every 2.5 years, but prematurely making devices fail due to a bad part source is a very expensive lesson that you don't want to have to learn twice.

    As for the airlines, they're right: there is no way to tell which model is which, short of programming the Wi-Fi access point in the plane to alert the flight crew when a device whose MAC address vendor prefix belongs to Samsung is active, and confiscating the devices. (An access point can't see a phone's IMEI; the closest it gets is the manufacturer prefix of the Wi-Fi MAC.)
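    To make the idea concrete, here is a minimal sketch of that vendor check. The OUI prefixes below are illustrative placeholders, not an authoritative Samsung registry, and modern phones can randomize their MAC, so this is a thought experiment rather than a workable enforcement tool:

```python
# Sketch: flagging associated Wi-Fi clients by manufacturer prefix (OUI).
# An AP sees each client's MAC address; the first three bytes identify
# the vendor. The prefixes below are examples only, not a real registry.

WATCHED_OUIS = {"8C:77:12", "F0:25:B7", "5C:0A:5B"}  # placeholder prefixes

def vendor_flagged(mac: str, ouis=WATCHED_OUIS) -> bool:
    """Return True if the MAC's 3-byte vendor prefix is on the watch list."""
    prefix = ":".join(mac.upper().split(":")[:3])
    return prefix in ouis

# The AP would run this over its association table and alert the crew.
print(vendor_flagged("8c:77:12:aa:bb:cc"))  # True
print(vendor_flagged("00:11:22:aa:bb:cc"))  # False
```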
  • Tim Cook email response tells sender to 'stay tuned' for Mac refresh

    emoeller said:
    It's all about the processors.   The sooner Apple moves to its own chips the better....
    We have their M chips for the iPhones, A chips for the mobile devices, S chips for the watches, and now the W chip for the wireless AirPods.  We're seeing Apple design their own chips rather than taking things off the shelf more and more.  It would seem to me that creating an entirely new chip (non-x86 architecture) would be something they are contemplating, considering the slowdown with Kaby Lake and all.  It's been a good long while since we've had a shakeup in the processor world, and x86 has been great for us all, but the time might be right for Apple to create their own chips and have Intel manufacture them.
    It's not going to happen. People have been saying Apple will do this for several years (just like removing the headphone jack, switching to OLED, adopting the "edge screen", bigger phablet screens, and so forth), but there is no evidence that Apple is going to do it.

    The A10 is not equal to any 2016-era desktop CPU in performance; neither was last year's A9. Apple can NOT pull a Microsoft here and create a parallel ARM laptop platform, then admit failure. Microsoft failed on that front because Windows is not designed to be used with a single input.

    Apple would rather improve the performance of the A-series chips until the iPad Pro beats whatever Intel puts out in the sub-15-watt, stupidly-thin-laptop market, and then quietly discontinue all the low-end MacBook Pros, because at that point the iPad Pro is essentially better than any possible laptop. Apple will still put out x86 laptops, but the best performance will be in the iPad Pro and whatever high-end laptop with a dedicated GPU is still available. The iMac, Mac mini and Mac Pro will always remain x86 because those devices are simply not used that way. You're not going to get someone to connect a Cintiq to an iPad, and you're not going to plug video editing equipment into one. It's just not possible to do "pro" work with an iPad that involves any other hardware.

    As it is, for Apple to scale the A10 up to the performance of an i7-4790, it needs to become 55% faster without using any more power. Even Intel's highest-end Xeons don't scale that well: overclock a 3.6GHz processor to 4.0GHz and it goes from 150 watts to 280 watts, and risks melting the socket. To scale the A10 up to the performance of a 22-core Xeon would require only a 20% increase in clock speed, but 22 cores, which means imagining a PCB 11 times larger than what is in the iPhone. That is not going to happen either.
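    A quick back-of-the-envelope check of the figures above (these are the numbers quoted in the post, not measured data; the 11x area factor assumes the A10's two big cores):

```python
# Overclocking example from the post: 3.6 GHz -> 4.0 GHz, 150 W -> 280 W.
clock_gain = 4.0 / 3.6 - 1   # ~11% more clock speed...
power_gain = 280 / 150 - 1   # ...for ~87% more power draw
print(f"clock +{clock_gain:.0%}, power +{power_gain:.0%}")

# 22-core Xeon comparison: the A10 has 2 big cores, so matching 22 cores
# means roughly 11x the silicon -- hence "a PCB 11 times larger".
cores_needed = 22
a10_big_cores = 2
print(f"area factor: ~{cores_needed // a10_big_cores}x")
```

    The point of the math: power scales far worse than linearly with clock speed, so "just clock it higher" is never free.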

    People quickly forget the circumstances of why Apple switched from 68K to PPC, and from PPC to Intel, in the first place. The 68K-to-PPC move was fine because the PPC could run the 68K software, including much of the OS itself. NeXTSTEP/OS X was built to run on anything; so was Windows NT. However, Microsoft's mistake in trying to ship an ARM version of Windows was that NONE of the software works on ARM, as there has never been a requirement for "fat binaries," and Microsoft's legacy software all requires two-button mouse input or meta-key keyboard shortcuts. Neither is available on a touch screen, hence an entirely new UI is needed. OS X, however, has never required more than one mouse button (this might actually be a long-con with the eventual goal of touch screens, who knows), and OS X has Launchpad, which functions identically to iOS's way of launching software, yet it remains a full OS and doesn't try to shoehorn you into something unfamiliar.

    Apple could, but it won't. It keeps "switching to its own CPUs" as leverage over Intel. Even if Apple ever switched to its own chips, it would likely end up coming back to Intel for fabrication, and probably wouldn't get the latest process Intel uses for its own chips. That is always the risk for Apple: it wants to use its own chips but can't find enough capacity to produce them.

    And that is why Apple will continue to use Intel's chips: it has no reason to switch unless the Intel-Apple relationship sours, which is why it switched away from IBM/Motorola in the first place. Apple will not be able to produce better chips for laptops and desktops than Intel, and if the laptops run different chips than the desktops, people simply won't buy the laptops at all.

  • Apple defends decision to ditch 3.5mm jack, says AirPods development began years ago

    tmay said:
    Looks like iPhone 7 is the impetus for a rapidly expanding market for Bluetooth headphones and EarPods.

    Most people will move on from wired and never look back.
    Nope. The A2DP profile only carries lossy audio (SBC, plus optionally AAC or MP3), not lossless. So Bluetooth is a downgrade.

    For all the whining about the removal of the 3.5mm jack, I think the people defending this decision are losing sight of what it means. It's the antithesis of the "smaller, lighter" race Apple has been chasing: requiring a dongle of any sort actually makes the footprint about 3" larger. Sorry, that is not what I want.

    But I'll point out that the Nokia headphones I had been using forever had a "dongle" too: it extended the headphones by a foot so the playback controls could be clipped to your coat, and it also worked as a break-away cable in case you took your coat off without first removing the headphones from your head. It was still a 3.5mm cable, but it added a function to ANY headphones.

    This is why removing the 3.5mm jack is stupid. Apple could have solved the waterproofing and kept the jack, or added a USB-C or second Lightning port. But NO, instead they removed it entirely and replaced it with nothing, leaving lossy Bluetooth as the only "standard". What idiot at Apple proposed this without a replacement?

    So this destroyed functionality has consequences for the vast majority of people:
    - People who use their expensive headphones for long periods of time, can no longer charge their phone
    - People who use their phone with the car on a road trip, can no longer charge their phone
    - People who listen to music for 8-12 hours in the office can no longer charge their phone
    - People who use wireless headphones end up needing two or three pairs to last an entire day.
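    The "two or three pairs" figure in that last point is simple arithmetic, assuming the roughly 5-hour earbud battery cited for AirPods and ignoring mid-day recharging:

```python
# How many fully charged earbud pairs cover a day of listening,
# given a per-pair battery life (assumed ~5 hours, per the post).
import math

def pairs_needed(listening_hours: float, battery_hours: float = 5.0) -> int:
    """Pairs required to cover the day, with no recharging in between."""
    return math.ceil(listening_hours / battery_hours)

print(pairs_needed(8))   # 2 pairs for an 8-hour office day
print(pairs_needed(12))  # 3 pairs for a 12-hour day
```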

    Nintendo, Nokia and Motorola have all done this, and all quickly abandoned the effort when people complained that they couldn't charge their device. Had Apple introduced a wireless charging dock that charges the headphones and phone together, maybe this would have gone over better. Instead, we're expected to use our phones for only 5 hours with a wireless headset, or to skip Apple Music entirely while using Lightning headphones, since the charge will run out and the batteries don't last long enough for commute plus office, or a road trip.

    Effectively, the iPhone 7 is the most poorly-thought-out phone Apple has ever made. Why would anyone want to downgrade to this? It's not even remotely comparable to the removal of 3.5" floppy drives, serial ports and optical drives, which were already on the way out and had better alternatives. What alternative is introduced here? NONE.

  • Apple officially ditches headphone jack for Lightning, will include adapter in iPhone 7 box

    fallenjt said:

    zmas said:
    Remember when you could charge your phone and listen to music at the same time?
    use an adapter (sure to come), or better yet wireless, and quit your bitching. or just live with analog tech for the rest of your life. but GTFO either way. 
    The BT Buds don't last very long between charges. That's reserved for the Beats ones at $159 (or more).
    I put my phone in my car and charge it from a 12V adapter. The car really does not handle any phone apart from Windows ones very well w.r.t. accepting calls and using the hands-free, so I use a headphone with a mic and stop to take calls. The phone is still being charged.

    So now it is one thing or the other. In my eyes, this is a huge loss of functionality and ease of use.
    Until a dual TB adapter comes out there is no way that I will upgrade my phone.
    Sorry Apple you screwed up, big time.

    This is the dumbest reason for justifying the use of an iPhone: the analog headphone port! And Apple didn't screw up... maybe only for a few of you, not the rest of the world. Remember the floppy disk, optical drives in the Mac, and the 30-pin connector in iDevices? Whiny people complained no matter what, and Apple still sold millions more...
    The reason Apple dumped the 3.5" floppy was because NOBODY WAS USING IT ANYMORE by the year 2000 (it wasn't available on the iMac in 1998). It was a standard feature on desktop computers from 1984 until around 2008. CD-ROM had been replacing it since 1992, and CD-R drives were standard features by 2002. Memory cards and USB sticks didn't gain popularity until USB 2.0; up until that point, FireWire was the trend. Optical drives were dumped prematurely, because by that point (around 2006) people were using USB sticks with larger capacities than CDs. Now you can get a 128GB USB stick for $25, which is faster than burning a write-once Blu-ray disc.

    Yet at no point did anyone say "hey, let's remove the 3.5mm jack and force everyone to use USB or Bluetooth," because Bluetooth is not standard fare on PCs, and Apple actually offers a digital optical S/PDIF output in the Mac's 3.5mm jack, which doesn't exist in the iPhone/iPod. S/PDIF over TOSLINK has been available since 1998, but here's the thing: that optical connection only supports two-channel 48kHz audio or compressed 5.1/7.1 audio. The Lightning connector doesn't provide anything better than this, so what was the point?

    The tradeoffs are worse in this case, as the 3.5mm jack is universal on ALL consumer hardware. One might argue that it's obsolete... but "wireless" is not the correct solution whatsoever. The correct solution would have been to put TWO Lightning connectors on the iPhone, or a USB-C connector for charging. Since neither happened, someone needs to be fired for this.

    All that's going to happen now is that people who want to listen to their music for 12 hours will have to carry around multiple Bluetooth earbuds, or "battery packs" will start coming with headphone jacks.
  • Apple officially ditches headphone jack for Lightning, will include adapter in iPhone 7 box

    GrizRuss said:
    The adapter should have two tails, one to go to your old headphones and one to connect the charging cable so you can charge and listen through headphones at the same time. Forget the adapter, how do you charge your phone while listening with the Lightning Ear Buds? How do the geniuses at Apple not think of that? Steve Jobs is truly dead.
    AirPods + charging is easy -- just plug in while using wireless. if both your phone and your headphones are dead, you failed, try again next time. 
    You clearly never used an iPod or iPhone.

    Wander around outside for a few hours and notice how many people have headphones or earbuds on. When I worked at a call center, that was a 10-hour shift without an opportunity to charge the phone (eventually I bought a counterfeit charger off eBay just to leave at work), and I was still able to listen to music on the phone for a complete shift, plus the commute to and from work. That was not going to be possible with an iPhone then, or even now, so removing the 3.5mm jack makes the iPhone a non-option for people who listen to music for long periods of time.

    Five-hour earbuds are not going to cut it, and neither will a phone with less than a full day's charge. This is why I keep repeating that Apple did not learn the lesson from Nintendo's clamshell GBA, nor from the times Nokia did the same thing. People DO NOT WANT a single interface jack, nor do they want dongles hanging off their device. You may as well make the device twice as thick and double the battery life, because that's what you're doing by introducing a dongle of any kind.
