roake

About
- Username: roake
- Joined
- Visits: 151
- Last Active
- Roles: member
- Points: 1,477
- Badges: 1
- Posts: 821

Reactions
Green texts in iMessages nudge teens to use iPhones
dmorgan said:
lkrupp said: SMS is SMS is SMS. Apple’s iMessage is their proprietary system and I think they are under no obligation to produce cross-platform versions of it unless they choose to, like Apple Music. Calling it anti-competitive or anti-trust is just another red herring hoisted up the flagpole to see if anyone salutes.
Critics want Apple to be just another tech company. We all should know that by now.
Everything we know about the 'iPhone 14 Pro' so far
georgie01 said:
Fred257 said: I’ve switched from iPhone 12 mini to Google Pixel and I couldn’t be happier. Battery life is amazing. Voice dictation is perfect and faster than ever. Tasks are incredibly fast. Always-on display is amazing. Weather on the Lock Screen is perfect and I strongly needed it. Touch ID during the pandemic is a complete necessity for me and much faster than Face ID. I’m afraid that I’m now a convert to Android. I’ve been an Apple phone fan since the 3G. Apple is using an old compartmentalized notion that you need each separate thing for a complete Apple experience. It’s an old outdated model that needs to be reconstructed. I’m not going to buy an Apple Watch to unlock my iPhone with a mask. I’m not going to have my iPad side by side with my computer. Apple is falling behind with everything except the Mac and Apple Watch. The iPhone and iPad are total crap these days. I updated from the first version of the SE to the 12 mini. What a depressing upgrade. Goodbye Apple iPhone. Your years-ahead thinking and solidifying of your products a year or two out needs to be redone. Apple’s compartmentalization is destroying Apple’s products’ strength. The new MacBook Pros are fantastic and I have one. Cheers
I’m also not following your argument. You don’t need an Apple Watch to use an iPhone, nor an iPad to use a MacBook, etc. They aren’t incomplete devices in need of each other. You could say there isn’t much overlap between things like the iPad and MacBook, but many people see that ‘compartmentalisation’ as a good thing.
Not sure how having the weather on the Lock Screen is something anyone strongly needs—does that extra few seconds to unlock your device really make or break you? Or even just (gasp) stepping outside? Seems like a trite ‘need’.
As far as mask wearing, you could always move somewhere that doesn’t subscribe to the false idea that cloth masks are statistically anything more than virtue signalling and a security blanket…? Seriously though, during the time my area required masks, adapting to Face ID was a non-issue. Pulling the mask down for 2 seconds to unlock your phone is surely safer than removing it entirely to eat…
All of that to say that I too love the Apple Watch unlock feature for people wearing masks. I wear a mask everywhere, but not a “fashionable” cloth one. Never be ashamed for doing the smart thing.
Source: I’m an ICU physician and reluctant COVID-19 expert who watches people linger on life support and witnesses firsthand *many* COVID-19 deaths every week despite everything we can do.
TL;DR: Cloth masks don’t work, but wear a mask. Vaccines don’t prevent you from getting COVID, but do prevent it from killing you. Apple Watch iPhone unlock with masks is awesome.
Apple AR headset, new Mac Pro and more expected in 2022
OutdoorAppDeveloper said:
Hopefully the main focus of Apple's AR glasses will not be games. The real value of AR glasses is that they can give the user augmented intelligence. Using AI and computational vision, AR glasses could give you a better understanding of the world around you. A simple example would be the ability to zoom in to see distant objects. Computer vision has exceeded the ability of human eyesight for quite a while now. Glasses that let you see in the dark, find objects you are looking for, or look around corners or behind solid objects would quickly become essential items. Tagging of objects and locations becomes possible as well. You could select Yelp reviews and see what people are saying about a restaurant you are looking at. You could select a nature channel and see information about each plant or animal you can see. Like all new technology, AR will have its downsides as well. It is not clear how you build AR glasses without some kind of camera built in, and walking around with an always-on camera is still not socially acceptable.
Your overall points about the technology are valid, and I don’t disagree with them, but let us pay proper respects to human eyesight.
Human eyesight is a fantastically incredible mechanism with so much “processing” and so many “features” that we physicians can only begin to fathom the most basic parts of it. The eyeball is only a small part of human eyesight. It lets the light in and does some simple analog tricks to separate out frequency and different fields of your vision. It does this by refracting parts of the image onto parts of the retina, with different receptors (many lay people hear about rods and cones, but that’s the tip of the iceberg for “processing” in the eyeball). The information then gets essentially converted to digital. There are several layers that the image overlays, and different nerve fibers carry this off to different sectors, criss-crossing behind the eyes like some complex freeway interchange and racing through so many ancillary processing systems along the “optic” pathway that it boggles the mind. These images can activate thousands of different brain responses before the “seeing” processing in the back part of the brain even takes place, depending on what is coming in through the eye. While I refer to this as an image, the processing utilizes far more information taken in through the eye than what you may think of as an “image.” As this information is being processed, it is analyzed for a fantastic amount of information before an image is produced, best we can tell. These centers can identify basic threats to safety, process your position in 3D space, and feed real-time movement information to your balance centers and separately to your multiple threat systems, depending on what “pre-processing” has so far been taken from the input. All this happens in the more basic centers far before you “see” the images. There are far too many functions here to even brush the surface. There are many, many thick tomes on the topic that cause sleepless nights for medical students, specialized neurologists, scientists/researchers, and the like. Those are just basic processing features.
Then, when your brain sends a processed image to your higher brain, the spectrum of possibilities for further processing and responses blooms almost into the infinite. All this happens in the tiniest fraction of time before you even get the chance to consciously register the image. Eyesight processes longitudinally over time as well, not just in “snapshots.” Eyesight is processing of fluid data streaming in continuously, not just “frames.” These things are really remarkable, and we barely understand enough to even know that they are there to research. Human eyesight can process something you have NEVER seen before, stratify a threat level, and cross-categorize it using so many variables that you can react reasonably to it within a fraction of a second without having a damned clue what it is.
Computerized “vision” is mostly made up of really cool but really simple tricks that do things like shift wavelengths or apply simple magnification, allowing us to see things that we would not normally register. This shifted light information is then dumped into the eye for the real processing. Identification of objects using “machine learning” is another really amazing trick, but it isn’t one-millionth as incredible as human eyesight and processing. The computer’s “sight” is extremely fundamental: an algorithm essentially pattern-matches something in an image to a known database. These mechanisms are getting very sophisticated, and they seem like magic to some people.
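The “pattern-matching against a known database” idea above can be sketched in a few lines of Python. This is a hedged toy illustration, not any real vision system: the labels and three-number feature vectors are made up, whereas real classifiers compare high-dimensional embeddings produced by a trained network.

```python
import math

# Hypothetical "known database": feature vectors extracted from labeled
# example images. Real systems store learned embeddings, not hand-picked
# numbers like these.
known = {
    "cat":   [0.9, 0.1, 0.3],
    "dog":   [0.8, 0.4, 0.2],
    "plant": [0.1, 0.9, 0.7],
}

def identify(features):
    """Nearest-neighbor match: return the label whose stored vector is
    closest (by Euclidean distance) to the query image's features."""
    return min(known, key=lambda label: math.dist(known[label], features))

# A query vector near the "plant" entry matches "plant".
print(identify([0.15, 0.85, 0.6]))
```

The point of the sketch is the contrast the post draws: the machine’s “recognition” is just a distance computation against stored examples, with no analog of the layered biological processing described above.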
While I LOVE the things that AR is bringing our way, I would argue that computer vision has not exceeded the ability of human eyesight, and probably never will. What it will do is augment our vision by dropping some extra data into the eyeball that human eyesight will process. I think we will see all the things you discussed come into play in the very near future, and I’ll be one of the first to get on board, especially if it’s an Apple product that I know will be very well supported.
Apple will alert users exposed to state-sponsored spyware attacks
Anilu_777 said:
“Install apps from the App Store”. Either this is a “duh, don’t jailbreak your phone if you’re a journalist in a dangerous country” or a prelude to possible alternate App Stores ¯\_(ツ)_/¯