Apple debuts $999 iPhone X with OLED Super Retina Display & Face ID authentication


Comments

  • Reply 341 of 352
    cornchip Posts: 1,048 member

    Also while I think the animated emoji look cool and will be a hit with certain demographics it is odd that Apple chose it as one of the main selling features of a $1000 phone. Does Apple really think parents are going to be shelling out $1000 to buy their teenage kids the X? I can’t see adults using this feature much.
    I've pointed this out in another thread, but you really have to look past the emoji. Apple is just using emoji as a tool to show everyone what can be done with the new tech without having to invent and explain some new app. Emoji are something people can instantly understand and get on board with; meanwhile, it's the developers Apple is really doing this for (besides the whole face-recognition unlock thing). There will be some interesting uses for this tech; we'll just have to wait and see what ingenious developers come up with.
  • Reply 342 of 352
    melgross said:

    melgross said:

    airnerd said:
    sog35 said:
    avon b7 said:
    I like the notch. It makes a break with styling. Android phones have been there too, with similar negative opinions from some, but I like this design over, say, the Sharp screen with corners cut off.

    You'll have to ask me in a couple of years if I still like it, though.

    From the photos I don't like the bezel. It seems like the phone is wearing a built-in bumper case, but as that's likely a photo problem, I'll reserve final judgment until I hold one in my hand.


    I’d love to know why Apple decided to embrace the notch. Was there no seamless way to hide it without it looking like a hack? Now that they’ve gone OLED I’m surprised we didn’t get dark mode which might have helped hide it.
    They don't need "help" to hide it, they could've just made that part of the screen black, easily, and "hid" it. That's it. They didn't, meaning they tried it and didn't like it, and you know they tried it. Go ask Alan Dye why not.
    I don’t know that they tried it. But if they prefer the notch to making that part of the screen black then I question who’s making the decisions here. Ben Bajarin who is about as pro Apple as anyone said he hopes developers don’t embrace the notch. John Gruber said the notch (especially in landscape mode) looks “goofy”. I hope Apple does some PR around this and explains their thinking behind it. I’d like to know. Also while I think the animated emoji look cool and will be a hit with certain demographics it is odd that Apple chose it as one of the main selling features of a $1000 phone. Does Apple really think parents are going to be shelling out $1000 to buy their teenage kids the X? I can’t see adults using this feature much.
    People always have problems with a new design language when it first comes out. Remember all the hand-wringing about the AirPod design?

    It's the same with the notch. After a while you won't notice it at all.

    I think you are wrong about the Animoji. Everyone I spoke to who is NOT a kid/teenager absolutely loves the feature.
    I work with a broad mix of ages, from 23 to 60. No one I spoke with gives one crap about Animojis. It's a bar trick that won't be used after the first day.
    I doubt they’re right. Animoji are already popular on a number of gaming networks that have them.
    Don't think in terms of silly animated characters.  

    Here's another cool application. Set those silly animated emojis aside and think in terms of Facetune and other apps that let a person whiten their teeth and cover over blemishes. Now imagine being able to run through that process and, instead of ending up with a cleaned-up static image, you get a set of rules to apply to your face in real time. So now when you FaceTime or go on a video chat in FB Messenger or any other app, your face is tuned up to look its best.

    Imagine having at your disposal not just the Snapchat or FB Messenger masks, but professional clothing, suit and tie for a man for example, automatically layered onto your image.  Now you can dress appropriately for business video conference calls you take from home, while actually wearing casual clothing.  


    A bit of imagination can surface unlimited potential. 
    Or register emotion, intent, and so forth. How long before Siri can interpret facial gestures?
    I would imagine that it can now. These Animojis are duplicating facial expressions. I worked on studies of this many years ago. It's a major research area. There's no difficulty in mapping your expressions to what we already know of them. In fact, it's pretty simple; Apple just has to want to do it. They could even decide to finally give faces to Siri, and give them expression. Apple's Animojis are better than others I'm seeing on gaming networks, but not very much better. Apple is animating more muscles. They got all the major ones, which is why they work so well. Others use fewer, but I can see them upgrading to meet Apple's new standard. But I can also see Apple using another twenty or so, which would give them more subtlety of expression, assuming the Animojis themselves can be seen using them. Right now, they're not complex enough for that.
    I meant as far as actual functionality, i.e. Siri recognizes when you shake your head no or scowl back at her, and reacts accordingly. That sort of thing. I imagine that they're already working on all this though.

    Apple seeks Siri engineer with psychology background to build out deep human-computer interactions
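The kind of expression mapping discussed in this exchange can be sketched in a few lines of Python. This is purely illustrative: the coefficient names loosely mirror ARKit-style blend shape identifiers, but the thresholds and the classification logic here are invented, not how Apple actually does it.

```python
# Illustrative sketch: mapping per-muscle activation coefficients
# (as a face-tracking system might report them, 0.0-1.0 per frame)
# to a coarse expression label. Thresholds are invented for
# illustration; real systems expose dozens of such values per frame.

def classify_expression(coeffs):
    """Map a dict of 0.0-1.0 activation coefficients to a label."""
    smile = (coeffs.get("mouthSmileLeft", 0.0) +
             coeffs.get("mouthSmileRight", 0.0)) / 2
    frown = coeffs.get("browDownLeft", 0.0)
    jaw = coeffs.get("jawOpen", 0.0)
    if smile > 0.5:
        return "happy"
    if frown > 0.5:
        return "displeased"
    if jaw > 0.6:
        return "surprised"
    return "neutral"

# One fake frame of tracking data: a broad smile, mouth mostly closed.
frame = {"mouthSmileLeft": 0.8, "mouthSmileRight": 0.7, "jawOpen": 0.1}
print(classify_expression(frame))  # -> happy
```

The point of the sketch is the commenters' observation: once a system reports per-muscle activations, reading intent (a head shake, a scowl) is a thresholding and classification problem on top of data the tracker already produces.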

  • Reply 343 of 352
    melgross Posts: 30,368 member
    I meant as far as actual functionality, i.e. Siri recognizes when you shake your head no or scowl back at her, and reacts accordingly. That sort of thing. I imagine that they're already working on all this, though.

    Apple seeks Siri engineer with psychology background to build out deep human-computer interactions

    The Japanese have been working on this for years, with robots with very realistic heads and expressions. The Japanese are really big on this because of their cultural limitations. Not so much here. But I’ve always thought it would be interesting to see a face on the screen that we would be talking to rather than just the disembodied voice, as now.

    It would be interesting if the conversation had emotional content on both ends.
  • Reply 344 of 352
    gatorguy Posts: 18,589 member
    melgross said:

    The Japanese have been working on this for years, with robots with very realistic heads and expressions. The Japanese are really big on this because of their cultural limitations. Not so much here. But I’ve always thought it would be interesting to see a face on the screen that we would be talking to rather than just the disembodied voice, as now.

  • Reply 345 of 352
    avon b7 Posts: 2,202 member
    gatorguy said:
    melgross said:

    The Japanese have been working on this for years, with robots with very realistic heads and expressions. The Japanese are really big on this because of their cultural limitations. Not so much here. But I’ve always thought it would be interesting to see a face on the screen that we would be talking to rather than just the disembodied voice, as now.

    Ha! And there I was thinking of Holly from Red Dwarf!
  • Reply 346 of 352
    melgross Posts: 30,368 member
    sog35 said:


    Wow, this from CNN. Seems some are beginning to shift to the correct terminology. I'm impressed. Maybe the tech sites will review the terms too as it becomes apparent which contexts the two terms - face recognition versus facial recognition - apply to.

    could you explain the difference?

    I don't see what the big deal is between face and facial.

    Face recognition is the term used to describe the process of identifying a specific person, such as from a database of known persons (a no-fly list, for example).

    There's also face detection, which is the process of detecting the elements of a human face within a scene.  This is typically a precursor to application of face recognition algorithms, used to identify the owner of a face in a scene.

    Then there's facial recognition, which is the process of detecting specific facial expressions (smiling, frowning, sadness, etc.). This term is often used in the medical world to characterize specific inabilities of patients to recognize meaning in human faces. Or, I suppose, one could use the term facial recognition to mean the detection of someone who has recently come from a spa treatment appointment. (Kidding.)


    This whole misunderstanding of the terms is what's causing the debate over biometric identification/authentication. FACIAL recognition is not FACE recognition. It can be used as a step in a face recognition process, but facial recognition is simply the process of identifying each feature of a face: eyes, nose, mouth, smiling, frowning, etc. It is not used to determine whose face it is.

    Facial recognition returns the result 'here is the mouth, and it is smiling,' which is useful for mapping that feature onto the face of an avatar or game character, or for mapping an overlay onto the person's actual face, a la Snapchat. But facial recognition does NOT return the result 'this is Phil Schiller's face.' That biometric identification step is done by a method called face recognition. I know, I know, they sound the same. But facial recognition, face recognition, and face detection are all three different things. Apple's acquisition of Faceshift and others suggests Apple will be employing facial recognition for AR apps, NOT for biometric identification. Face recognition, as stated by Phil Schiller at 1:29:40 in the event, is the correct term to describe what's happening now behind Face ID.

    I noticed that, in a current article about Apple’s solution vs. others’ solutions, where the writer does say that Apple’s is better, the terms face recognition and facial recognition are used interchangeably throughout the article. He starts with face, uses it a few times, then changes to facial, and then back to face.

    It seems to me that even someone writing an article in a major publication forgets the difference between those two terms. It’s odd that he does it within the same article, and that the editor didn’t catch it. Editors at major publications do actually read articles before publication and fix errors, so likely the editor doesn’t understand the differences either.
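The three-way distinction drawn in the comment above (face detection finds faces, face recognition identifies whose face it is, facial recognition reads what the face is doing) can be made concrete with a toy sketch. Every function name and data structure here is invented for illustration; no real face-analysis API is being shown.

```python
# Toy pipeline illustrating the terminology. All data is fake.

def detect_faces(scene):
    """Face detection: find face regions within a scene."""
    return [obj for obj in scene if obj.get("type") == "face"]

def recognize_face(face, known_faces):
    """Face recognition: match a detected face to a known identity."""
    return known_faces.get(face.get("signature"), "unknown")

def recognize_expression(face):
    """Facial recognition: read the expression, not the identity."""
    return face.get("expression", "neutral")

# A fake scene containing one face and one non-face object.
scene = [
    {"type": "face", "signature": "sig-001", "expression": "smiling"},
    {"type": "tree"},
]
known = {"sig-001": "Phil Schiller"}

for face in detect_faces(scene):           # detection
    who = recognize_face(face, known)      # recognition (identity)
    doing = recognize_expression(face)     # facial (expression)
    print(f"{who} is {doing}")             # -> Phil Schiller is smiling
```

The sketch shows why the terms get conflated: detection feeds both of the other steps, but only the identity-matching step is biometric authentication.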
  • Reply 347 of 352
    melgross Posts: 30,368 member

    gatorguy said:
    melgross said:

    The Japanese have been working on this for years, with robots with very realistic heads and expressions. The Japanese are really big on this because of their cultural limitations. Not so much here. But I’ve always thought it would be interesting to see a face on the screen that we would be talking to rather than just the disembodied voice, as now.

    That was a great show, but it was just for fun, and an attempt to show some of what the future might be like.

    But the Japanese are very serious about this.
  • Reply 348 of 352
    asdasd said:
    The presentation seemed a bit off today, lacking something.

    The X is good, but maybe not as good as it could have been if they had gotten Touch ID to work. I actually think that ARKit and the emotes will be big on both new devices.
    Tim Cook appeared very nervous, especially when compared to other presenters, and he loosened up only slightly as he exited and returned. I kept wishing that the supremely confident Jonathan Ive were hosting the event.


  • Reply 349 of 352
    Caocao888 said:
    Tim Cook appeared very nervous, especially when compared to other presenters, and he loosened up only slightly as he exited and returned. I kept wishing that the supremely confident Jonathan Ive were hosting the event.
    Ha ha?
  • Reply 350 of 352
    If memory serves me right, didn't Apple say they were going to release a second-generation iPhone SE sometime in March of next year, 2018?
    Why would they have said that? And why would they do it? The SE exists for the people who desire a smaller phone, but Apple obviously doesn’t care about that, or they’d have made a new 4” phone with a case that looked like the iPhone 7’s when it was first launched.
    Sorry for the late reply, but this has been a busy year with hardly any time to post.

    This was what I was talking about. I knew I was right on the money; it just took a few minutes to look it up.

    http://appleinsider.com/articles/17/08/04/apple-said-to-be-targeting-march-quarter-2018-launch-for-second-gen-iphone-se



  • Reply 351 of 352
    This was what I was talking about. I knew I was right on the money; it just took a few minutes to look it up. http://appleinsider.com/articles/17/08/04/apple-said-to-be-targeting-march-quarter-2018-launch-for-second-gen-iphone-se
    Huh! Thanks. That’s a weird article, particularly since they’ve been updating the iPhone SE’s internals alongside new iPhone releases already. It wouldn’t necessarily be a 2nd gen since we’ve already had that. It does get confusing to keep the same name and swap out internals, though.
  • Reply 352 of 352
    melgross said:

    melgross said:

    airnerd said:
    sog35 said:
    avon b7 said:
    I like the notch. It makes a break with styling. Android phones have been there too with similiar negative opinions from some but I like this design over say, the Sharp screen with corners cut off.

    You'll have to ask me in a couple of years if I still like it, though.

    From the photos I don't like the bezel. It seems like the phone is wearing a built-in bumper case, but as that's likely a photo problem, I'll reserve final judgment until I hold one in my hand.


    I’d love to know why Apple decided to embrace the notch. Was there no seamless way to hide it without it looking like a hack? Now that they’ve gone OLED I’m surprised we didn’t get dark mode which might have helped hide it.
    They don't need "help" to hide it, they could've just made that part of the screen black, easily, and "hid" it. That's it. They didn't, meaning they tried it and didn't like it, and you know they tried it. Go ask Alan Dye why not.
    I don’t know that they tried it. But if they prefer the notch to making that part of the screen black, then I question who’s making the decisions here. Ben Bajarin, who is about as pro-Apple as anyone, said he hopes developers don’t embrace the notch. John Gruber said the notch (especially in landscape mode) looks “goofy”. I hope Apple does some PR around this and explains their thinking behind it. I’d like to know. Also, while I think the animated emoji look cool and will be a hit with certain demographics, it is odd that Apple chose them as one of the main selling features of a $1000 phone. Does Apple really think parents are going to be shelling out $1000 to buy their teenage kids the X? I can’t see adults using this feature much.
    People always have problems with new design language when it first comes out. Remember all the hand wringing about Airpod design?

    It's the same with the notch. After a while you won't notice it at all.

    I think you are wrong about the Animoji. Everyone I spoke to who is NOT a kid/teenager absolutely loves the feature.  
    I work with a broad mix of ages, from 23 to 60. No one I spoke with gives one crap about Animojis. It's a bar trick that won't be used after the first day.  
    I doubt they’re right. Animoji are already popular on a number of gaming networks that have them.
    Don't think in terms of silly animated characters.  

    Here's another cool application.  Set those silly animated emojis aside and think in terms of Facetune and other apps that allow a person to whiten their teeth and cover over blemishes.  Now imagine being able to run through that process and instead of resulting in a cleaned up static image, you get a set of rules to apply to your face in realtime.  So now when you FaceTime or go on a video chat in FB Messenger or any other app, your face is tuned up to look its best.  

    Imagine having at your disposal not just the Snapchat or FB Messenger masks, but professional clothing, suit and tie for a man for example, automatically layered onto your image.  Now you can dress appropriately for business video conference calls you take from home, while actually wearing casual clothing.  


    A bit of imagination can surface unlimited potential. 
    Or register emotion, intent, and so forth. How long before Siri can interpret facial gestures? 
    I would imagine that it can now. These Animojis are duplicating facial expressions. I worked on studies of this many years ago; it's a major research area. There's no difficulty in mapping your expressions to what we already know of them. In fact, it's pretty simple. Apple just has to want to do it. They could even decide to finally give faces to Siri, and give them expression.

    Apple's Animojis are better than others I'm seeing on gaming networks, but not very much better. Apple is animating more muscles. They got all the major ones, which is why they work so well. Others use fewer, but I can now see them upgrading to meet Apple's new standard. But I can also see Apple using another twenty or so, which would give more subtlety to expressions, assuming the Animojis themselves can be seen using them. Right now, they're not complex enough for that.
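    To make the mapping idea concrete, here's a rough Python sketch of how per-muscle tracking coefficients could be reduced to a coarse expression label. The coefficient names are loosely modeled on ARKit's blend shape keys, but the names, thresholds, and rules here are illustrative assumptions, not Apple's actual API.

    ```python
    # Illustrative sketch only: each coefficient runs from 0.0 (muscle relaxed)
    # to 1.0 (fully engaged). Key names mimic ARKit's ARFaceAnchor.blendShapes
    # style, but this is a toy classifier, not Apple's implementation.

    def classify_expression(coeffs: dict) -> str:
        # Average the left/right pairs so a lopsided smirk still counts.
        smile = (coeffs.get("mouthSmileLeft", 0.0) +
                 coeffs.get("mouthSmileRight", 0.0)) / 2
        frown = (coeffs.get("browDownLeft", 0.0) +
                 coeffs.get("browDownRight", 0.0)) / 2
        jaw_open = coeffs.get("jawOpen", 0.0)

        # Simple hand-picked thresholds; a real system would learn these.
        if jaw_open > 0.6 and smile < 0.3:
            return "surprised"
        if smile > 0.5:
            return "happy"
        if frown > 0.5:
            return "displeased"
        return "neutral"

    print(classify_expression({"mouthSmileLeft": 0.8, "mouthSmileRight": 0.7}))  # happy
    ```

    The point is that once the major muscles are tracked, an assistant reacting to a smile or a scowl is mostly a matter of deciding where to draw these thresholds.
    
    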
    I meant as far as actual functionality, i.e. Siri recognizes when you shake your head no or scowl back at her, and reacts accordingly. That sort of thing. I imagine that they're already working on all this though.

    Apple seeks Siri engineer with psychology background to build out deep human-computer interactions

    I remember something about a computer psychologist in an Isaac Asimov novel.