A look at Apple's secretive strategies set to unfold at WWDC 2018

Posted in iPhone, edited May 2018
Apple's annual Worldwide Developer Conference is often mistaken for a hardware product launch event. In fact, the main purpose of WWDC is to introduce Apple's latest software enhancements and tools to the third-party developers who use them to create apps for Apple's platforms. Yet both hardware and software play a role in advancing Apple's future plans--and WWDC serves as the company's unfurling roadmap of strategic direction. Here's a look at what we've already seen and what we can expect next week.


Apple's evolving development strategies at WWDC

WWDC has always sought to offer Apple's third-party developers deeper insight into the company's overall strategic direction for its software platforms, largely with the purpose of convincing them to invest their time and effort into building apps on top of them.

At the same time, WWDC has also hosted hardware introductions and refreshes, such as last year's iMac Pro and the new HomePod. The latter had very little apparent relevance for third-party development. However, there are previous examples of Apple outlining the beginnings of a strategy at WWDC that didn't fully emerge for several years.

Twenty years ago at WWDC 1998, Steve Jobs detailed Apple's plans for what would eventually ship as Mac OS X. The new software didn't actually ship even in beta form until 2000, and it took another two years before Apple made its advanced new OS the default platform for its Mac hardware. During that eternity of delay, Apple debuted the iBook and new PowerBook G3 models and introduced the iPod.

Ten years later at WWDC 2008, Jobs focused on a new mobile platform--which at the time Apple was calling "OS X iPhone." The company was also introducing the concept of the iOS App Store, offering developers the ability to work with the same development APIs it used internally to create the iPhone apps bundled on the device.

Jobs also laid out a future strategy for iCloud, then known as MobileMe. The company's new take on cloud-based app storage also took years to work through its first awkward stages, but within a few years it too became a solid part of Apple's overall strategy. It is now deeply woven into macOS, iOS and Apple's other platforms, and drives new hardware including HomePod.

Two years later Apple showed off iPad--its vision of tablet-based iOS computing that had actually predated its plans for a smartphone--and began calling its mobile platform "iOS." The real timeline of how things were developing inside of Apple was not the same as the public releases that the rest of us were observing from the outside.

Machine Learning, AI vision and AR

Last year, Apple further exposed internal work as public development interfaces for third parties to build on top of. These included Core ML, the machine-learning framework for computer vision and natural language processing that Apple had already been using inside Siri, its Camera app and the QuickType keyboard. Now developers can build their own model-based ML features.
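As a rough sketch of the surface this opened up for developers, classifying an image with Core ML via the Vision framework takes only a few lines. The `Classifier` model class here is a placeholder for whatever class Xcode generates from a compiled .mlmodel added to the project, not a real Apple API:

```swift
import UIKit
import Vision
import CoreML

// Sketch: classify a photo with a bundled Core ML model via Vision.
// "Classifier" stands in for the class Xcode generates for any .mlmodel.
func classify(_ image: UIImage) throws {
    let model = try VNCoreMLModel(for: Classifier().model)
    let request = VNCoreMLRequest(model: model) { request, _ in
        // Vision returns classification observations sorted by confidence.
        if let top = (request.results as? [VNClassificationObservation])?.first {
            print("\(top.identifier) (\(top.confidence))")
        }
    }
    guard let cgImage = image.cgImage else { return }
    try VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])
}
```

The model runs entirely on-device, which is consistent with the privacy posture discussed later in this article.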

IBM has since released "Watson Services for Core ML" support for MobileFirst apps to analyze images, classify visual content and train models using its Watson Services, bringing Core ML into enterprise mobile apps.


IBM was quick to put Core ML to work in the enterprise


Apple also detailed at WWDC how the "magic" behind its dual-lens Portrait photos works, and exposed its Depth API for developers to use in their own apps.

It did the same for iPhone X's depth-sensing TrueDepth camera, enabling third parties to do facial tracking, and it unveiled an entirely new set of ARKit tools for building Augmented Reality experiences that work on any A9-or-better iOS device.
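ARKit's entry point is small. A minimal world-tracking sketch (assuming a standard iOS app with a view controller) looks roughly like this:

```swift
import UIKit
import ARKit

// Minimal ARKit sketch: start a world-tracking session in a SceneKit AR view.
final class ARViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal  // detect floors and tables
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```

Behind that single `run()` call, ARKit fuses camera frames with motion-sensor data--the Visual Inertial Odometry discussed below--which is what let it deploy across existing A9 hardware.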

Motion, wearables, VR, Metal and GPU

ARKit is a major extension of the work Apple has done in Core Motion, which Jobs first introduced in 2010 on the gyroscope-equipped iPhone 4. Apple has since built custom motion coprocessors into its mobile devices to track ongoing physical activity for health and fitness features in HealthKit. Paired with camera-based ML vision, iOS devices can now handle Visual Inertial Odometry.



We can expect to see big new leaps over last year's surprising introductions, with not only ARKit but also applied advancements to health-related fitness tracking in Apple Watch. The applications of motion were important enough for Apple to make them a major focus of its annual presentation to shareholders this February, showing a video of people whose lives were changed or even saved by its wearable technology, which serves as a fitness coach, bionic bodyguard and mobile link to the outside world.

Apple is particularly well positioned in wearables, already offering a tactile, glanceable computer that looks like a fashion watch and an audio AR experience in the form of affordable earbuds. Rumored to be next are glasses presenting visual AR and Virtual Reality experiences, something Apple advanced last year in the form of new VR content development on Macs, tied to external GPU support.

Apple has also progressively rolled out advancements to Metal, its hardware-optimized framework for GPU coding, first introduced in 2014.

Over the last few years, it brought Metal to the Mac, advanced what Metal can do and last year moved its A11 Bionic chip to Metal 2 support using a custom Apple GPU design--with surprisingly little fanfare for such a massive undertaking.

iOS UI, Intelligence and messaging apps

Specific to iPad, last year Apple introduced iOS 11 Drag and Drop, a multitasking navigation experience that required rewriting the entire upper-level interface. This year, with another year of polish and development, we can expect iOS 12 to focus on stability after a growth period that introduced more UI glitches than iOS users have perhaps ever had to endure.

There's also talk of bringing iOS and macOS UI development closer together, potentially making it easier to adapt iOS code to run on Macs. In previous years, Apple did something similar with APFS, radically updating and harmonizing the core file system with minimal interruption for users on both iOS and Macs.

We'd also like to see advancements made in areas such as Data Detectors and text services, where iOS and Macs often come close to greatness before falling over sideways. Recent features such as Continuity-based copy and paste across devices, Handoff of documents, Maps links from addresses, indoor mapping and calculated travel times for calendar events suggest that there is a lot more in the pipeline waiting to increase the intelligence our modern devices can offer.

With Apple's emphasis on fitness and Apple Watch mobility, perhaps we'll see Maps catch up in offering urban biking directions and supplying better off-road and offline maps for hikers, mountain bikers, skiers and other outdoor sports enthusiasts.

Another thread of advances that has progressively emerged at WWDC: new iMessage features, and particularly iMessage apps, which last year supported the creation of Apple Business Chat, a new platform for companies to use in direct communications with their clients, using rich in-chat software elements to handle orders and payments, schedule appointments and make custom selections.
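Those rich in-chat elements are built on the Messages framework. As a minimal sketch, an iMessage app extension can insert an interactive message into the conversation like this--the `sendOrder` helper and its caption are hypothetical examples, not part of any real Business Chat API:

```swift
import Messages

// Minimal iMessage app extension sketch: insert an interactive message
// (say, an order confirmation) into the active conversation.
final class MessagesViewController: MSMessagesAppViewController {

    // Hypothetical helper illustrating the pattern; "Order" wording is made up.
    func sendOrder(named name: String) {
        guard let conversation = activeConversation else { return }

        let layout = MSMessageTemplateLayout()
        layout.caption = "Order: \(name)"

        let message = MSMessage()
        message.layout = layout

        // Stages the message in the input field; the user still taps send.
        conversation.insert(message) { error in
            if let error = error { print(error) }
        }
    }
}
```

The same `MSMessage` plumbing underlies both consumer iMessage apps and the in-chat widgets Business Chat presents to customers.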


With custom iMessage Apps, enterprise developers can add interactive software elements to chat


When iMessage apps first appeared, some people only saw frivolity and an unwelcome complication to their simple chat app. Apple was actually rolling out the beginnings of a new platform.

Other new developments in the WWDC pipeline

There are similar advances Apple appears to be incrementally preparing. As we've previously noted, Apple has made recent acquisitions including Texture, the "Netflix of magazines," with the apparent intent of deploying a periodical subscription service.

Apple has already been working to embellish News and add more video content and is also working to develop custom content for Apple Music and what may eventually be its own parallel subscription video service.

Apple also recently acquired Lattice Data, a programming and execution framework for statistical inference over piles of unstructured data, and Workflow, the automation tool that builds actions into events you can trigger with a touch.


Hey Siri: Workflow makes complex tasks into a triggerable event


By integrating Workflow with Siri, Apple could greatly expand the richness of what both developers and even casual users can do, by making it easy to set up regular tasks that could be triggered by a voice request, or a tap on Apple Watch.

Apple also reportedly acquired the employees of Init.ai, a natural language startup, and recently hired John Giannandrea, Google's chief of artificial intelligence, to head up Apple's machine learning and AI strategy.

Regarding that hire, Apple's chief executive Tim Cook noted in a message to employees that "John shares our commitment to privacy and our thoughtful approach as we make computers even smarter and more personal."

More Siri at WWDC

Apple has been investing in making Siri's voice more natural with human intonation. It has also replicated the "always listening" features first introduced by Amazon and Google, allowing iOS, Apple Watch and now HomePod to listen for "Hey Siri."

Another way Siri could get better is for more commands to work locally. Apple has already made some initial stabs in this direction, but expect to see more at this year's WWDC.

As with any voice assistant, when the cloud is down or connectivity is lost, Siri isn't very useful at all. Recently, when my Internet went down, HomePod couldn't even tell me what time it was, despite all of its onboard computing power. Playback controls and basic tasks like telling the time and reading local data (such as calendar and contacts) are all things Siri could drastically improve upon by doing more processing locally.
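Nothing here is a real Apple API, but the idea is easy to sketch in a few lines: answer what the device can handle on its own, and only escalate to the cloud when it's actually reachable. All of the types and keyword matching below are hypothetical illustrations:

```swift
import Foundation

// Hypothetical sketch only -- none of these types are real Apple APIs.
// A local-first dispatcher: answer simple requests on-device and escalate
// everything else to the cloud only when the network is reachable.
enum IntentResult {
    case handledLocally(String)
    case needsCloud
    case unavailable
}

func handle(_ utterance: String, cloudReachable: Bool) -> IntentResult {
    let query = utterance.lowercased()
    if query.contains("what time") {
        // Telling the time needs nothing but the local clock.
        let formatter = DateFormatter()
        formatter.timeStyle = .short
        return .handledLocally(formatter.string(from: Date()))
    }
    if query.contains("pause") || query.contains("play") {
        // Playback control is a purely local operation.
        return .handledLocally("ok")
    }
    // Anything else (weather, general knowledge) still needs the cloud.
    return cloudReachable ? .needsCloud : .unavailable
}
```

Even this toy version would have answered the "what time is it" request above with the Internet down.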

For devices like AirPods that lack the ability to do much local computation, the mesh of Continuity could handle Siri tasks on your phone or potentially even Apple Watch, accelerating how quickly a voice command can be put into action, even when Internet cloud connectivity is limited.


Hello? Sorry, Alexa isn't here.


While it's easy to complain about Siri and its deficiencies in comparison to rival services with their own unique strategies and intents, it's rarely noted that Apple's strategy involves making Siri work across far more languages and regions. That pertains not only to the languages it seeks to understand but also to the interests of users in other countries. Siri supports information on over 100 sports leagues globally, for example, making it relevant to users seeking updates on games being played in their home country.

There are still lots of areas where Siri needs to radically improve to become truly useful. But a key differentiation in how it works compared to other voice services is that--just as with iMessages--its communications are encrypted both ways, and requests you make are not added to a marketer's profile.

When you ask Siri about the weather, for example, your request is handled without linking your query to your Apple ID and forever remembering who asked about weather where, extrapolating your location and deciding what you might be enticed to buy in the future based on your question.

Apple's efforts to respect users' privacy and keep Siri data secured rather than mining it for marketing opportunities are not lost upon buyers. It appears that Apple will have an easier time improving how Siri works than rivals will have in earning back users' trust. Currently, the fascination with the entertainment-level novelty of voice is limited and likely short-lived. Once we start commonly using voice commands for more important things, the nature of their security and trustworthiness will matter far more than whether a complex conversational query could be handled back in 2018.

Amazon and Google were once hailed as having a tremendous advantage over Apple due to their openly expressed policy of consuming user data and freely using it to improve their own services. But Apple then detailed its own differential privacy efforts to anonymize samples of user content without risking the leakage of personally identifying data.


Apple was once supposedly at a disadvantage for caring about users' privacy


As a result, Apple now employs similar deep machine learning on vast amounts of user data without the same privacy concerns, but Amazon and Google can't ever be fully trusted with the data they have taken, nor do consumers generally trust that anyone outside of Apple really cares about their privacy. The "potential for improvement" tables have turned.

Apple's reputation moat

Recall that several years ago, Android phones essentially had a monopoly on 4G LTE service, a truly compelling and vast jump in data speed over what iPhone 4S could do at the time. That advantage lasted for years, but today is irrelevant.

Qualcomm is now trying to resurrect this advantage by advertising the potential for 1.2Gbps mobile data on its Android chipsets--something that isn't even available in practice from typical mobile networks. But that marketing hasn't stopped Apple's iPhones from being the most popular devices around the world--even with a substantial price premium.

If novel features like voice search and AI were really compelling features that drove significant numbers of buyers to new hardware, Google's Pixel 2 and Andy Rubin's Essential phone would not have been total duds. The reality is that mainstream buyers consider factors like longevity, reliability and brand experience, and that gives Apple a reprieve from chasing down every short-term tech fad and brief feature advantage its competitors can offer.

That doesn't mean, however, that Apple can just sit around basking in its past accomplishments. This year we expect to see some major advances for not just the (dare I say it, "beleaguered") Siri, but other initiatives including HomeKit, ARKit, Core ML, Metal and GPUs, and of course wearables, with Apple Watch seeing new fitness-related capabilities and expanded ways to make easy purchases and activate devices via NFC.

There's another device in particular that we are likely to see more about at WWDC, which will be considered tomorrow. What do you expect to see more of at WWDC? Let us know in the comments below.

Comments

  • Reply 1 of 28
    Demonkid Posts: 3, unconfirmed member
    I'd like to see more VR news. I was surprised last year by the on-stage Star Wars demo. Since then, though, it's been hard to get any support from SteamVR and Vive, and while I've managed to create some VR experiences using Unity, it would be nice to hear about some extra VR support from Apple.
  • Reply 2 of 28
    ascii Posts: 5,936, member
    Only 1 week to go! 

    I think we will see a big update on VR for the Mac. Remember last year, they:
    - started selling eGPU kits with HTC Vive headsets
    - released beta Steam VR libraries for Mac
    - had a live demo of a woman using a VR headset to edit a movie scene? 
    Well the eGPU support is part of macOS now, so that part's done. I predict VR versions of Final Cut Pro and Xcode. That's what I think the conference logo (first picture in this article) represents: using Xcode in VR.

    In terms of Mac hardware, I think we will see the return of the 17" MacBook Pro. Why? Because 17" notebooks have made a comeback in general over the last year; just look around, even Dell has them. Plus, how did they placate the pros who are waiting for the new Mac Pro? By releasing a very high-end iMac model. I think they will also release a very high-end MacBook Pro model for the same reasons. The extra size will allow a bigger battery, which will support DDR4 and therefore allow 32GB RAM.
  • Reply 3 of 28
    FCP X already supports VR headsets, since last year's update. Here's a 360° video I shot at the official reveal:
  • Reply 4 of 28

    One should not forget about HomeKit - there are a lot of Smart Home Gateway Users eagerly waiting for being able to a) expose their connected devices to HomeKit and b) integrate HomeKit-Devices into their Gateway. A lot of work has been put into this area by 3rd party gateway developers lately and all of them are waiting for the Go of Apple (my vendor has it running in beta and it works as designed - one-directionally integrating Z-Wave/ZigBee and Enocean-Devices into HomeKit only so far).

    HomeKit itself will without any doubt be more integrated into Voice services and hopefully into macOS (a dedicated Home app is still missing), while a lot of supported functionality is generally missing from the iOS Home App UI at the moment. I doubt that HomeKit will be exposed to 3rd party Voice systems, though.

  • Reply 5 of 28
    ascii Posts: 5,936, member
    FCP X already supports VR headsets, since last year's update. Here's a 360° video I shot at the official reveal:
    Yep, it has the ability to create content for VR headsets, but does it let you make normal 2D videos--and instead of editing them with mouse and keyboard, enter a VR world and edit them with hand gestures? Because the latter is what I think might be coming.
  • Reply 6 of 28


    I'd love to see more of this, specifically for cars. A modern take on the Haynes manual perhaps, AR guides for servicing and repairs :smiley: 

  • Reply 7 of 28
    dick applebaum Posts: 12,527, member
    Swift 5:


  • Reply 8 of 28
    BigDann Posts: 66, member
    hblaschka said:

    One should not forget about HomeKit - there are a lot of Smart Home Gateway Users eagerly waiting for being able to a) expose their connected devices to HomeKit and b) integrate HomeKit-Devices into their Gateway. A lot of work has been put into this area by 3rd party gateway developers lately and all of them are waiting for the Go of Apple (my vendor has it running in beta and it works as designed - one-directionally integrating Z-Wave/ZigBee and Enocean-Devices into HomeKit only so far).

    HomeKit itself will without any doubt be more integrated into Voice services and hopefully into macOS (a dedicated Home app is still missing), while a lot of supported functionality is generally missing from the iOS Home App UI at the moment. I doubt that HomeKit will be exposed to 3rd party Voice systems, though.

    A Home server is really needed! One that has deep security for the users protected content and a second location for the shared content like music, TV shows & movies. Then add to it the AI smarts of Siri so it can run more locally, add HomeKit server, and lastly HomePod type of devices server so the content it will serve up is more central and the ability of the device to discriminate the different users within the household.
  • Reply 9 of 28
    Folio Posts: 698, member
    Upgraded Siri and a smaller cheaper HomePod for bathrooms, kitchens etc. Such a one-two punch could swing momentum.
  • Reply 10 of 28
    wizard69 Posts: 13,377, member
    Sadly the author seems to have no idea what will happen at WWDC. Also, WWDC is as much about hardware as it is software; the point of WWDC is to get developers to use new features that are often tied to new hardware. Of course I have my own guesses:

    1. Swift 5 will be huge this year with the ABI solid. Hopefully we will see indications that the rest of the suite will flesh out.

    2. New Macs for the desktop. The desktop and even the laptops are pretty pathetic at this point. I'm not sure if the transition to ARM will happen; if not, more halfway machines.

    3. More hardware to accelerate machine learning and AI-type apps. Expect a coprocessor for Macs, mainly because Intel is way behind the curve here. This may be processed in the ARM chip already in some Macs, but the reality is a full-fledged coprocessor could offer an incredible increase in ML performance.

    4. I actually agree that the more Siri can do locally the better. This however requires performance, which is still lacking in some platforms. However, Apple needs to start sooner rather than later, so expect a lot of hardware to go obsolete.

    5. Speaking of hardware, I wouldn't be surprised to see custom x86 hardware in partnership with Intel or AMD. The goal would be to integrate Apple-specific hardware. This would include camera hardware, the ML/AI hardware and other features Apple has invested a lot of time and money into. Custom x86 chips are about the only way for Apple to go outside of its own ARM chips.

    6. Expect Apple to start to build more AI/ML-like features into its operating systems. I fully expect Knowledge Navigator-like functionality. Frankly this is why Siri needs to become locally centered. This may be what the rumored Star project is all about: hardware and software optimized for a different sort of interaction with the user. The OS basically becomes an agent with natural language processing that shifts user interaction into the future.

    In any event, a quick off-the-top-of-my-head list. Oh, by the way, don't forget all the small things that make a huge difference.
  • Reply 11 of 28
    Rayz2016 Posts: 6,957, member
    BigDann said:
    hblaschka said:

    One should not forget about HomeKit - there are a lot of Smart Home Gateway Users eagerly waiting for being able to a) expose their connected devices to HomeKit and b) integrate HomeKit-Devices into their Gateway. A lot of work has been put into this area by 3rd party gateway developers lately and all of them are waiting for the Go of Apple (my vendor has it running in beta and it works as designed - one-directionally integrating Z-Wave/ZigBee and Enocean-Devices into HomeKit only so far).

    HomeKit itself will without any doubt be more integrated into Voice services and hopefully into macOS (a dedicated Home app is still missing), while a lot of supported functionality is generally missing from the iOS Home App UI at the moment. I doubt that HomeKit will be exposed to 3rd party Voice systems, though.

    A Home server is really needed! One that has deep security for the users protected content and a second location for the shared content like music, TV shows & movies. Then add to it the AI smarts of Siri so it can run more locally, add HomeKit server, and lastly HomePod type of devices server so the content it will serve up is more central and the ability of the device to discriminate the different users within the household.
    I’ll be very surprised if something like this shows up. How about loading the stuff onto a shared spot on iCloud that can be seen by the family?
  • Reply 12 of 28
    seanismorris Posts: 1,624, member
    I was expecting a new 12.9” iPad refresh by now...

    Now?
  • Reply 13 of 28
    dewme Posts: 5,362, member
    I really (really!) hope that Apple doesn't fall into creating a SDK/API/Platform/Toolkit galaxy of time sucking black holes like the ones that Microsoft created for itself throughout the 90s and 00s. An endless parade of unfinished, ever changing, and unfulfilled promises perched precariously atop fragile SDKs, APIs, Platforms, Toolkits, and fantastical reimagined architectures that kept tens of thousands of engineers very busy for years building technical tidbits that would too often never even see the light of day and constantly keeping customers sitting on their hands year after year waiting for the next big thing that would transform their business and pump up their bottom line. It ends up being an endless chain of pass-along promises and everyone ends up losing - except the company selling the toolkits and technical book publishers.

    Apple's success is based on its ability to deliver highly compelling, easy to use, valuable, and reliable products (and to a lesser degree - services). These big geek laden mashups like WWDC are fabulous opportunities (and boondoggles) for those who will be building the pieces and parts to get with the program so they can help bring the next round of products and services to market. But Apple has to be very careful to always be selling this technology to the technologists who "touch the code" and don't try to sell technology to the managers, directors, VPs, SVPs, C-suite residents, and especially end-customers. There's nothing quite as horrifying as seeing your CEO get up at a big customer-facing conference and start spewing the virtues of service oriented architecture (SOA) like they're selling Swiss Steak as the daily special at Bob Evans. "You, our most valued customers, you need some SOA (pronounced 'so ahh') and by golly we're the ones who are gonna bring you the best SOA on the planet, with some help from our friends in Redmond no less, and it's going to be totally awesome. You're gonna love it. Maybe with some green beans on the side." 

    It's all about the products. Don't forget that.
  • Reply 14 of 28
    Soli Posts: 10,035, member
    Rayz2016 said:
    BigDann said:
    hblaschka said:

    One should not forget about HomeKit - there are a lot of Smart Home Gateway Users eagerly waiting for being able to a) expose their connected devices to HomeKit and b) integrate HomeKit-Devices into their Gateway. A lot of work has been put into this area by 3rd party gateway developers lately and all of them are waiting for the Go of Apple (my vendor has it running in beta and it works as designed - one-directionally integrating Z-Wave/ZigBee and Enocean-Devices into HomeKit only so far).

    HomeKit itself will without any doubt be more integrated into Voice services and hopefully into macOS (a dedicated Home app is still missing), while a lot of supported functionality is generally missing from the iOS Home App UI at the moment. I doubt that HomeKit will be exposed to 3rd party Voice systems, though.

    A Home server is really needed! One that has deep security for the users protected content and a second location for the shared content like music, TV shows & movies. Then add to it the AI smarts of Siri so it can run more locally, add HomeKit server, and lastly HomePod type of devices server so the content it will serve up is more central and the ability of the device to discriminate the different users within the household.
    I’ll be very surprised if something like this shows up. How about loading the stuff onto a shared spot on iCloud that can be seen by the family?
    Yeah, if that long-standing request hasn't happened by now, it seems improbable it would happen at all, as Apple has shifted so much focus to iCloud.

    And then there's the number of server functions that used to be part of OS X Server at a high price that have been folded into the standard macOS build; if you want to create a home server, you're probably better off just doing so with a Mac mini or even some old Mac you may have lying around.
    edited May 2018
  • Reply 15 of 28
    DanielEran Posts: 290, editor
    dewme said:
    I really (really!) hope that Apple doesn't fall into creating a SDK/API/Platform/Toolkit galaxy of time sucking black holes like the ones that Microsoft created for itself throughout the 90s and 00s. An endless parade of unfinished, ever changing, and unfulfilled promises perched precariously atop fragile SDKs, APIs, Platforms, Toolkits, and fantastical reimagined architectures that kept tens of thousands of engineers very busy for years building technical tidbits that would too often never even see the light of day and constantly keeping customers sitting on their hands year after year waiting for the next big thing that would transform their business and pump up their bottom line. It ends up being an endless chain of pass-along promises and everyone ends up losing - except the company selling the toolkits and technical book publishers.

    Apple's success is based on its ability to deliver highly compelling, easy to use, valuable, and reliable products (and to a lesser degree - services). These big geek laden mashups like WWDC are fabulous opportunities (and boondoggles) for those who will be building the pieces and parts to get with the program so they can help bring the next round of products and services to market. But Apple has to be very careful to always be selling this technology to the technologists who "touch the code" and don't try to sell technology to the managers, directors, VPs, SVPs, C-suite residents, and especially end-customers. There's nothing quite as horrifying as seeing your CEO get up at a big customer-facing conference and start spewing the virtues of service oriented architecture (SOA) like they're selling Swiss Steak as the daily special at Bob Evans. "You, our most valued customers, you need some SOA (pronounced 'so ahh') and by golly we're the ones who are gonna bring you the best SOA on the planet, with some help from our friends in Redmond no less, and it's going to be totally awesome. You're gonna love it. Maybe with some green beans on the side." 

    It's all about the products. Don't forget that.
    Part of MSFT's problem was that it was creating entirely software-based solutions that had to be paid for via licensing.

    Many of the new APIs that Apple is opening up are (as the article notes) actually internal work that it is making usable to third parties. Core ML isn't chasing after everyone else's ML API, but rather exposes the work Apple did internally to develop ML-based features in Camera, Siri and the word-suggesting QuickType keyboard. That's the same formula behind building iPhone apps, then opening up the SDK to third parties to build more.


  • Reply 16 of 28
    dick applebaum Posts: 12,527, member
    I think that the Home Server/HomeKit/Siri issues can be resolved!

    Consider, If you had a Home Server, likely, you would:

    • connect HomeKit devices via WiFi with a bridge if needed
    • percolate up/trickle down cloud data and buffer locally LRU -- as iTunes interfaces iCloud files on the Mac over WiFi
    • access local data from various iDevices and Macs via WiFi -- same as above, except local files only
    • control the whole thing with Siri over WiFi - with Siri capabilities processed locally and tailored to your use of Siri

    So, where do these local data reside, and what does this Home Server Box look like (tech specs)? Simply stated, it's [what's a] computer with some storage, WiFi, and voice recognition.

    The Home Server can be a logical implementation rather than a physical one. It can run concurrently on multiple iPhones, iPads and Macs -- all connected by WiFi and controlled by local Smart Siri.

    Smart Siri will know about all your data, where they reside, and if they are currently accessible -- and be at your service!

    Think about it!

  • Reply 17 of 28
    Putzy Posts: 8, member
    Local Siri is a no-brainer, and in fact Apple's devices used to operate this way before Siri was introduced. On my iPhone 3G and 3GS, I could say things to my phone like 'play this playlist', 'shuffle all songs', or 'call (insert contact here)'. I did not need an internet connection and it was instant. Asking Siri to set a timer for ten minutes and waiting for it to send this to the cloud and then wait for a response is kinda ridiculous, especially when I'm in the car in a remote area. Siri is all of a sudden unusable, and all I want to do is play a song or set a reminder. I should not need an internet connection for this. Huge step backwards.

    Drives me crazy because I live in a city that doesn't have connection on underground transport and connectivity is also bad outside of the city when on the road.
  • Reply 18 of 28
    rogifan_new Posts: 4,297, member
    Apple please make improvements to Siri so it’s consistent across devices in terms of capabilities and responses. Please fix notifications so they’re grouped by app on the lock screen and when you dismiss something on one device it’s dismissed on all devices. And for the love of god please up the free iCloud storage to at least 10GB. The company’s services strategy shouldn’t be about nickel and diming customers over cloud storage.
  • Reply 19 of 28
    Eric_WVGG Posts: 968, member
    I say this every year, but god I wish we could get "complications" on the Home Lock Screen. I want Weather to be as accessible and "just there" as the time; swiping and digging through widgets doesn't cut it.
    edited May 2018
  • Reply 20 of 28
    Eric_WVGG said:
    I say this every year, but god I wish we could get "complications" on the Home Screen. I want Weather to be as accessible and "just there" as the time; swiping and digging through widgets doesn't cut it.
    At some point Apple might abandon the grid of rounded rectangles approach, but I wouldn't hold my breath. Apple has a long, long history of always allowing much less user customization than (some) users ask for.  They have been incredibly successful with that approach, so why would they change?