My two major issues are:
1. Did Apple just cripple the “world phone” capabilities? There are very good reasons to swap SIMs when you travel to certain countries, such as a) local plans are FAR cheaper in most cases, b) better access to cell networks (in some countries you theoretically have access on the mainstream networks, but in some areas only the small networks have good coverage, and AT&T, T-Mobile, etc. aren’t necessarily supported), and c) having a local number, which reduces confusion and makes it more convenient for locals (such as family or clients) to call you. I thought 1 physical SIM + 1 eSIM was the most flexible option. I wonder if they will have another version for some countries that offers the physical SIM slot.
2. Is the max storage 512GB when it was previously 1TB?!? They announced storage “up to a terabyte,” but in both the order screens and the tech specs for the iPhone 14 Pro Max, the maximum storage option is 512GB. My trusty old iPhone 13 Pro Max has 1TB of storage. Maybe these are errors on the site that they will correct.
Other than those issues, I feel like this is a decent upgrade over the 13. I especially like the new cameras.
I would still love to see under-the-display Touch ID (with a “poison finger” option).
I would also love to have the ability to utilize the Apple Pencil’s features with the iPhone. I know there are things they have said to suggest it’s not coming, but I still want it.
Off-topic, argh! Why does Appleinsider’s text box for entering posts always scroll to the top when we select a word by double-tapping, then try to move a selection dot to include more text (ex. select an entire sentence)? This makes editing painful and, at least for me, discourages posting. This only occurs for me on Appleinsider.com. Is there a solution to this issue on the iPhone?
UPDATE #1: Storage issue no longer an issue
Now I see the 1TB option. Either I did something wonky the first time around, or the page was corrected. Could be either!
UPDATE #2: WTF, Apple!?!
Taken directly from the FAQ at the bottom of the official Apple iPhone 14 Pro page: "iPhone 14 and iPhone 14 Pro models will arrive ready to activate with eSIM.* Note, these models cannot activate with a physical SIM." and "*Not all carriers support eSIM. Use of eSIM in iPhone may be disabled when purchased from some carriers."
I did see someone stating the "international version" would have a physical SIM tray and not be limited to eSIM only. Is that true? If so, how do I get THAT version of the phone? I travel!
I made a new post with just that latter info and question just to highlight it.
As I write this, I’m uncertain what the new sensors are (the article does not yet say, and I cannot currently watch the presentation), but I’m excited to learn. Probably temperature and… what else?

As a physician, I have been very impressed with the Apple Watch. I love the idea of Apple adding to the sensor platform to monitor health parameters.
Physicians call vital signs “vital” for a reason. Simple parameters such as temperature, blood pressure, heart rate, breathing rate, and oxygen levels convey critical information.
Apple is gradually adding the ability to monitor vital signs nearly continuously. They also monitor additional information, such as heart rhythm. What makes Apple stand out is the high quality of the measurements and Apple’s development of extremely clever algorithms to extrapolate additional information from the data they measure, far more than the sum of the parts. When they are able to add blood pressure monitoring, glucose monitoring, etc., the result will be pretty amazing. I can imagine Siri being able to “diagnose” dehydration, respiratory/lung problems, severe infection (sepsis), sleep apnea, seizures, orthostatic hypotension (blood pressure dropping when you stand up), early Parkinson’s disease, and that’s just the tip of the iceberg. Each sensor they add could exponentially increase the number of potential issues the watch/iPhone could detect.
TL;DR - Doctors love this crap!
HYPER HyperJuice Battery
User Manual with more info
- 27,000 mAh Battery Capacity
- Up to 245W of Power Output
- 2 x 100W USB Type-C Ports
- 2 x 65W USB Type-C Ports
- 100W Passthrough USB Charging
- OLED Indicator
- PD, Qualcomm 4.0+ & PPS Compatible
- 3' USB Type-C Cable Included
- Dimensions (W x H x D): 7.5 x 3.2 x 1.1" / 191 x 81 x 28 mm
- Weight: 765 g / 1.69 lb
georgie01 said:
Fred257 said:
I’ve switched from the iPhone 12 mini to a Google Pixel and I couldn’t be happier. Battery life is amazing. Voice dictation is perfect and faster than ever. Tasks are incredibly fast. The always-on display is amazing. Weather on the Lock Screen is perfect and I strongly needed it. Touch ID during the pandemic is a complete necessity for me and much faster than Face ID. I’m afraid that I’m now a convert to Android. I’ve been an Apple phone fan since the 3G. Apple is using an old compartmentalized notion that you need each separate thing for a complete Apple experience. It’s an old, outdated model that needs to be reconstructed. I’m not going to buy an Apple Watch to unlock my iPhone with a mask. I’m not going to have my iPad side by side with my computer. Apple is falling behind with everything except the Mac and Apple Watch. The iPhone and iPad are total crap these days. I upgraded from the first version of the SE to the 12 mini. What a depressing upgrade. Goodbye Apple iPhone. Your years-ahead thinking and solidifying of your products a year or two out needs to be redone. Apple’s compartmentalization is destroying Apple’s products’ strength. The new MacBook Pros are fantastic and I have one. Cheers
I’m also not following your argument. You don’t need an Apple Watch to use an iPhone, nor an iPad to use a MacBook, etc. They aren’t incomplete devices in need of each other. You could say there isn’t much overlap between things like the iPad and MacBook, but many people see that ‘compartmentalisation’ as a good thing.
Not sure how having the weather on the Lock Screen is something anyone strongly needs—does that extra few seconds to unlock your device really make or break you? Or even just (gasp) stepping outside? Seems like a trite ‘need’.
As far as mask wearing goes, you could always move somewhere that doesn’t subscribe to the false idea that cloth masks are statistically anything more than virtue signalling and a security blanket…? Seriously though, during the time my area required masks, adapting to Face ID was a non-issue. Pulling the mask down for 2 seconds to unlock your phone is surely safer than removing it entirely to eat…
All of that to say that I too love the Apple Watch unlock feature for people wearing masks. I wear a mask everywhere, but not a “fashionable” cloth one. Never be ashamed for doing the smart thing.
Source: I’m an ICU physician and reluctant COVID-19 expert who watches people linger on life support and witnesses firsthand *many* COVID-19 deaths every week despite everything we can do.
TL;DR: Cloth masks don’t work, but wear a mask. Vaccines don’t prevent you from getting COVID, but they do prevent it from killing you. Apple Watch iPhone unlock with masks is awesome.
OutdoorAppDeveloper said:
Hopefully the main focus of Apple's AR glasses will not be games. The real value of AR glasses is that they can give the user augmented intelligence. Using AI and computational vision, AR glasses could give you a better understanding of the world around you. A simple example would be the ability to zoom in to see distant objects. Computer vision has exceeded the ability of human eyesight for quite a while now. Glasses that let you see in the dark, find objects you are looking for, or look around corners or behind solid objects would quickly become essential items. Tagging of objects and locations becomes possible as well. You could select Yelp reviews and see what people are saying about a restaurant you are looking at. You could select a nature channel and see information about each plant or animal you can see. Like all new technology, AR will have its downsides as well. It is not clear how you build AR glasses without some kind of camera built in, and walking around with an always-on camera is still not socially acceptable.
Your overall points about the technology are valid, and I don’t disagree with them, but let us pay proper respects to human eyesight.
Human eyesight is a fantastically incredible mechanism with so much “processing” and so many “features” that we physicians can only begin to fathom the most basic parts of it. The eyeball is only a small part of human eyesight. It lets the light in and does some simple analog tricks to separate out frequency and different fields of your vision. It does this by refracting parts of the image onto parts of the retina, with different receptors (many laypeople hear about rods and cones, but that’s the tip of the iceberg for “processing” in the eyeball). The information then gets essentially converted to digital. There are several layers that the image overlays, and different nerve fibers carry this off to different sectors, criss-crossing behind the eyes like some complex freeway interchange and racing through so many ancillary processing systems on their “optic” journey that it boggles the mind. These images can activate thousands of different brain responses before the “seeing” processing in the back part of the brain even takes place, depending on what is coming in through the eye. While I refer to this as an image, the processing utilizes far more information taken in through the eye than what you may think of as an “image.” As this information is being processed, it is analyzed for a fantastic amount of information before an image is produced, best we can tell. These centers can identify basic threats to safety, process your position in 3D space, and feed real-time movement information to your balance centers and separately to your multiple threat systems, depending on what the “pre-processing” has so far taken from the input. All this happens in the more basic centers far before you “see” the images. There are far too many functions here to even brush the surface. There are many, many thick tomes on the topic that cause sleepless nights for medical students, specialized neurologists, scientists/researchers, and the like. Those are just basic processing features.
Then, when your brain sends a processed image to your higher brain, the spectrum of possibilities for further processing and responses blooms almost into the infinite. All this happens in the tiniest fraction of time, before you even get the chance to consciously register the image. Eyesight processes longitudinally over time as well, not just in “snapshots.” Eyesight is processing of fluid data streaming in continuously, not just “frames.” These things are really remarkable, and we barely understand enough to even know that they are there to research. Human eyesight can process something you have NEVER seen before, stratify a threat level, and cross-categorize it using so many variables that you can react reasonably to it within a fraction of a second without having a damned clue what it is.
Computerized “vision” is mostly made up of really cool, but really simple, tricks that do things like shift wavelengths or apply simple magnification to let us see things we would not normally register. This shifted light information is then dumped into the eye for the real processing. Identification of objects using “machine learning” is another really amazing trick, but it isn’t one-millionth as incredible as human eyesight and processing. The computer’s “sight” is extremely fundamental: an algorithm essentially pattern-matches something in an image against a known database. These mechanisms are getting very sophisticated, and they seem like magic to some people.
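To make the “pattern-matching against a known database” point concrete, here is a deliberately toy sketch (my own illustration, not how any real vision system actually works): classify a new input by finding the closest example in a small labeled reference set. The feature vectors and labels are entirely made up.

```python
# Toy illustration of "pattern matching to a known database":
# a new "image" (here reduced to a 3-number feature vector) gets
# the label of whichever stored example it most closely resembles.
import math

# Hypothetical reference database: feature vector -> label
database = {
    (0.9, 0.1, 0.0): "apple",
    (0.1, 0.8, 0.1): "leaf",
    (0.0, 0.1, 0.9): "sky",
}

def classify(features):
    """Return the label of the nearest known pattern (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    nearest = min(database, key=lambda known: dist(known, features))
    return database[nearest]

print(classify((0.85, 0.15, 0.05)))  # closest stored pattern is "apple"
```

That is the whole trick, dressed up in modern systems with vastly bigger databases and learned features, which is exactly why it is impressive engineering but nothing like the layered biological processing described above.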
While I LOVE the things AR is bringing our way, I would argue that computer vision has not exceeded the ability of human eyesight, and probably never will. What it will do is augment our vision by dropping some extra data into the eyeball for human eyesight to process. I think we will see all the things you discussed coming into play in the very near future, and I’ll be one of the first to get on board, especially if it’s an Apple product that I know will be very well supported.