MacBook Pro Touch Bar could be revived as a strip that supports Apple Pencil

Posted in Future Apple Hardware, edited May 9

Apple keeps researching how to have the iPad-centric Apple Pencil do the work of the old Touch Bar on the surface of a future MacBook Pro.

The patent doesn't appear to rule out a touch screen Mac, but it's focused on the Touch Bar-like strip



It's just a patent application, and not only may it never be granted, it may not lead to an actual product even if it is. Except that this application, made public on May 9, 2024, is an expansion of a series of previous ones -- including some that have actually been granted.

So this is not a skunkworks project by a few Apple people; it is a project that has, at the very least, been under continuous development. Mildly hidden under the dull title of "Mountable tool computer input," it's really about how an Apple Pencil could be used with a MacBook Pro.

That sounds as though it obviously means drawing and writing on a MacBook Pro screen. But perhaps with a mind to Steve Jobs's famous criticism that with a stylus "you have to get 'em, put 'em away, you lose 'em, yuck," it is more about storing the Apple Pencil.

There's a chance that Apple is only thinking of a MacBook Pro for very tidy people because, just like the previous versions of the patent application, this new one talks of people having a separate device to draw on.




"[Some] computing devices, such as laptop computers, can have a touch screen positioned in or adjacent to a keyboard of the device that can be configured to provide many more functions than a set of traditional keys," begins the patent.

"However, an ancillary touch screen can be difficult to use in some cases," it continues. "Touch typists may dislike using the touch screen because it lacks tactile feedback as compared to a set of mechanical, moving keys."

Specifically, the proposal is that users wield the Pencil in at least roughly the area where Apple used to include the Touch Bar.

"The touch screen is also generally positioned near the user's hands and therefore may be prone to being obscured from the user's vision by their own hands," says the patent. "Also, even when the user looks at the touch screen, it is positioned at a different focal distance from the user as compared to the main display, so the user must readjust their head or eyes to effectively read and interact with the touch screen..."

It's a wonder Apple ever bothered with a Touch Bar. Yet the company still wants to do something with that space, and the idea persists: the Bar may be replaced by a touch panel.

"The touch panel may include a touch-sensitive surface that, in response to detecting a touch event, generates a signal that can be processed and used by other components of the electronic device," continues Apple. "A display component of the electronic device may display textual and/or graphical display elements representing selectable virtual buttons or icons, and the touch sensitive surface may allow a user to navigate and change the content displayed on the display screen."

None of this would immediately seem to fix Apple's criticisms of the Touch Bar. A user would have to break off typing, pick the Pencil up out of its holder, then write or tap with it on the touch-sensitive strip.

A stylus may be more natural than a Touch Bar



Yet that may still be more natural than the Touch Bar. It interrupts the user's typing, but looking away from the screen to find the Pencil feels like a more natural movement.

And rather than trying to remember a control that is a small spot on the Touch Bar -- one which also moves around -- picking up a Pencil is a lot easier.

Detail from the patent showing a housed Apple Pencil being swiped across



There is one more thing, though. In a few patent drawings, a user is shown tapping on the Pencil, or touching it, or swiping across it.

The Apple Pencil itself could then offer Touch Bar-like controls while it's in the holder.

This patent application, like its previous granted versions, is credited to Paul X. Wang, Dinesh C. Mathew, and John S. Camp. Wang is listed on many previous patents for Apple, including more on user input devices.



Read on AppleInsider

Comments

  • Reply 1 of 18
command_f Posts: 422 member
    I never liked the Touch Bar but this seems to throw away its better bit (you could see what it was supposed to do as it changed). Now, we swap the extra tap to wake it up before it will do anything for extracting an Apple Pencil from that awkward corner between keyboard and display. Then we tap/draw in an unlabelled space?

    I'm not convinced. Part of the Touch Bar's downfall was that it didn't do anything that standard macOS features ('cos not everyone has a Touch Bar) didn't do at least as well: I can't see this being any different.
  • Reply 2 of 18
    If Apple wanted to include the touchbar AND the physical row of keys on the new m1 macbook pro models, there was room to do it.  There is certainly ample space on the 16" model.  I don't think many would object to having the touchbar in addition to physical F keys - it was the substitution that was divisive.  

    Alternatively...and most obviously...Apple could have given the MacBook Pros a touch capable display.  Before MacOS users lose their $h1t about it not being a touch OS and should never be (until Apple does it and you love it) - I'm not talking about making the OS touch capable.  I'm talking about giving developers touch capabilities within their Mac apps.  

    For example...

    One of the best features of the touch bar is that it becomes a play scrub bar in Final Cut, iMovie or Quicktime.  If the screen has touch capability, the timeline in FCP could simply be scrubbable via touch.  Before some lose their $h1t about finger prints on your laptop screen...well - you touch your iphone & ipad right?  Also, whoever wants to happily ignore it could do so.
  • Reply 3 of 18
DAalseth Posts: 2,813 member
    OR this is for the next generation Magic Keyboard for the iPad. 
  • Reply 4 of 18
omasou Posts: 594 member
    I really liked the touch bar, especially the sliders for adjusting volume and brightness.

    For me function keys are completely useless and never used.

    Like the control key; function keys are only necessary for Windows compatibility.
edited June 2022
  • Reply 5 of 18
dewme Posts: 5,437 member
    I think the main issue with the Touch Bar was its location. Apple’s insistence that Macs never have touch screens ties their hands behind their back when it comes to finding UI real estate to incorporate new interactive controls. Using the Touch Bar required taking your eyes off the screen and looking down, which breaks some people’s workflow. Since Apple wasn’t opposed to this disruption they could have simply incorporated smart controls into a glass trackpad in some manner. 

    Apple’s trackpads are unusually large. It’s conceivable that certain regions of their massive trackpad could be used on-demand and in an app-centric context to place additional controls directly on the trackpad, e.g., a vertical or horizontal slider/fader along an edge of the trackpad that would pop up when the user selects an on-screen control that requires a variable value. Using the trackpad as both a control and display surface may make sense, or it might be a complete disaster, i.e., Touch Bar Fail v2.0. 

    I’m actually not trying to suggest any kind of UI concept, all I’m saying is that Apple has painted themselves in a corner by insisting “fingers off” when it comes to the primary display surface. Adding a secondary display surface with touch by eliminating the function keys didn’t work out so well in the long run. Doing something similar with the trackpad may be a repeat of that previous failure.

    Popping up touch controls, fully in-context with an app’s workflow, directly on the screen and within the un-averted sight lines of the user, is an obvious solution to the problems the Touch Bar was trying to solve. I hate fingerprint smudges as much as anyone does, and I’ll never like typing on glass, but how many more crazy ways of dancing around the “no fingers on the screen” edict will Apple have to come up with before they finally admit that the obvious solution is probably the right solution? 
  • Reply 6 of 18
ravnorodom Posts: 702 member
I can't live without Function Keys because many of my Photoshop shortcuts are assigned to Function Keys. The Touch Bar is useless for me because when I am on a desktop, I use "fn" key combos all the time to control volume and other things. So it's natural for me to do the same on a MacBook Pro. For Apple to have the Touch Bar on the MacBook Pro only doesn't help someone like me who uses a desktop and a laptop frequently.
  • Reply 7 of 18
jib Posts: 56 member
I really liked the Touch Bar and although it was never implemented to its full potential, I would be very happy to see it come back on both MacBook and desktop Macs. One option for a new Touch Bar might be to temporarily blank (a portion of) it and allow the Apple Pencil to be used.
To avoid some of the complaints about the previous Touch Bar, it need not replace the top row of control keys, and it could be located elsewhere, for example by (or on) the touchpad on a MacBook, or on a bezel below the actual screen.
  • Reply 8 of 18
fastasleep Posts: 6,432 member
    What’s with all the recycled old articles getting pushed to the home page again?
  • Reply 9 of 18
crowley Posts: 10,453 member
    If you need to look away from the screen then it’s a dead end.  Touch Bar was a dumb idea from the start.
  • Reply 10 of 18
jcallows Posts: 150 member
    I just love the Touch Bar!  I hardly use it but it's just so cool to look at!  I especially love how it flickers, flashes lights and changes colors whenever you switch apps.  It's not distracting-- it's just exciting!  Why let all those cheap laptop makers like Dell, HP and Samsung have all the fun and be the only ones that have these colorful flashing lights? Nothing says refined and class more than a lit-up, candied-color strip staring at your face from the keyboard.  Moreover, the added level of complexity and increased cost for something that's rarely, if ever, used, just adds more spice to life by making the MacBooks more challenging to fix and to afford.  Kudos, Apple, on your new direction!  But if you do decide to remove the beloved Touch Bar, please replace it with something equally as annoying, like say, oh, a big ugly notch!

    Signed, 
    An Anonymous Member of Apple's Post-Jony Ive Design Team
  • Reply 11 of 18
40domi Posts: 117 member
    omasou said:
    I really liked the touch bar, especially the sliders for adjusting volume and brightness.

    For me function keys are completely useless and never used.

    Like the control key; function keys are only necessary for Windows compatibility.
Totally agree with you. If Apple ever makes it an option on the MBP again, I will upgrade my M3.
  • Reply 12 of 18
eriamjh Posts: 1,668 member
    crowley said:
    If you need to look away from the screen then it’s a dead end.  Touch Bar was a dumb idea from the start.
I agree with this 100%. When the results are on screen, no one looks down except at notes or references, not at controls.
  • Reply 13 of 18
mikethemartian Posts: 1,380 member
    They could just put a touch bar on the far top end of their wireless trackpad and see if developers find good use for it. That way it doesn’t take away function keys on the keyboard and it can be used with any Mac.
  • Reply 14 of 18
AppleZulu Posts: 2,050 member
    dewme said:
    I think the main issue with the Touch Bar was its location. Apple’s insistence that Macs never have touch screens ties their hands behind their back when it comes to finding UI real estate to incorporate new interactive controls. Using the Touch Bar required taking your eyes off the screen and looking down, which breaks some people’s workflow. Since Apple wasn’t opposed to this disruption they could have simply incorporated smart controls into a glass trackpad in some manner. 

    Apple’s trackpads are unusually large. It’s conceivable that certain regions of their massive trackpad could be used on-demand and in an app-centric context to place additional controls directly on the trackpad, e.g., a vertical or horizontal slider/fader along an edge of the trackpad that would pop up when the user selects an on-screen control that requires a variable value. Using the trackpad as both a control and display surface may make sense, or it might be a complete disaster, i.e., Touch Bar Fail v2.0. 

    I’m actually not trying to suggest any kind of UI concept, all I’m saying is that Apple has painted themselves in a corner by insisting “fingers off” when it comes to the primary display surface. Adding a secondary display surface with touch by eliminating the function keys didn’t work out so well in the long run. Doing something similar with the trackpad may be a repeat of that previous failure.

    Popping up touch controls, fully in-context with an app’s workflow, directly on the screen and within the un-averted sight lines of the user, is an obvious solution to the problems the Touch Bar was trying to solve. I hate fingerprint smudges as much as anyone does, and I’ll never like typing on glass, but how many more crazy ways of dancing around the “no fingers on the screen” edict will Apple have to come up with before they finally admit that the obvious solution is probably the right solution? 
    The suggestion that looking just below your screen to see the touch bar is too hard or too distracting just seems entirely nonsensical to me. It's no more destructive of the user workflow than is looking at a file menu on the top of the screen or the dock at the bottom of it. Your eyes already move around to look at things. 

    Apple doesn't want to implement touch on the MacBook Pro screen because

    a) macOS is not designed for touch. Tapping at menu-driven options with your finger doesn't work well because, for instance, menus are small, your fingers aren't and your hand would always be in front of things you need to see. Now that's something that will interrupt your workflow.

    and

    b) macOS is designed to run not just MacBooks, but desktop Macs all the way up to Mac Pro. Doing something like changing macOS to look more like iPadOS to improve a touch experience on a MacBook Pro would degrade the UI experience on desktop Macs, which would never be ideal for a touchscreen experience. There are already ergonomic and engineering issues with tapping at a MBP screen, but desktop Macs use larger screens at varying distances and elevations and in varying screen multiples that would make trying to tap away at them with your finger a complete ergonomic mess. That problem then leads to the idea of having both a touch-optimized UI as well as a cursor-optimized UI, depending on the user's current interaction - in essence a hybridized macOS and iPadOS. That of course describes Windows, the all-things-for-all-people-on-all-occasions operating system that has for decades served as a great example to remind Apple what not to do.
    edited May 9
  • Reply 15 of 18
dewme Posts: 5,437 member
    AppleZulu said:
    dewme said:
    I think the main issue with the Touch Bar was its location. Apple’s insistence that Macs never have touch screens ties their hands behind their back when it comes to finding UI real estate to incorporate new interactive controls. Using the Touch Bar required taking your eyes off the screen and looking down, which breaks some people’s workflow. Since Apple wasn’t opposed to this disruption they could have simply incorporated smart controls into a glass trackpad in some manner. 

    Apple’s trackpads are unusually large. It’s conceivable that certain regions of their massive trackpad could be used on-demand and in an app-centric context to place additional controls directly on the trackpad, e.g., a vertical or horizontal slider/fader along an edge of the trackpad that would pop up when the user selects an on-screen control that requires a variable value. Using the trackpad as both a control and display surface may make sense, or it might be a complete disaster, i.e., Touch Bar Fail v2.0. 

    I’m actually not trying to suggest any kind of UI concept, all I’m saying is that Apple has painted themselves in a corner by insisting “fingers off” when it comes to the primary display surface. Adding a secondary display surface with touch by eliminating the function keys didn’t work out so well in the long run. Doing something similar with the trackpad may be a repeat of that previous failure.

    Popping up touch controls, fully in-context with an app’s workflow, directly on the screen and within the un-averted sight lines of the user, is an obvious solution to the problems the Touch Bar was trying to solve. I hate fingerprint smudges as much as anyone does, and I’ll never like typing on glass, but how many more crazy ways of dancing around the “no fingers on the screen” edict will Apple have to come up with before they finally admit that the obvious solution is probably the right solution? 
    The suggestion that looking just below your screen to see the touch bar is too hard or too distracting just seems entirely nonsensical to me. It's no more destructive of the user workflow than is looking at a file menu on the top of the screen or the dock at the bottom of it. Your eyes already move around to look at things. 

    Apple doesn't want to implement touch on the MacBook Pro screen because

    a) macOS is not designed for touch. Tapping at menu-driven options with your finger doesn't work well because, for instance, menus are small, your fingers aren't and your hand would always be in front of things you need to see. Now that's something that will interrupt your workflow.

    and

    b) macOS is designed to run not just MacBooks, but desktop Macs all the way up to Mac Pro. Doing something like changing macOS to look more like iPadOS to improve a touch experience on a MacBook Pro would degrade the UI experience on desktop Macs, which would never be ideal for a touchscreen experience. There are already ergonomic and engineering issues with tapping at a MBP screen, but desktop Macs use larger screens at varying distances and elevations and in varying screen multiples that would make trying to tap away at them with your finger a complete ergonomic mess. That problem then leads to the idea of having both a touch-optimized UI as well as a cursor-optimized UI, depending on the user's current interaction - in essence a hybridized macOS and iPadOS. That of course describes Windows, the all-things-for-all-people-on-all-occasions operating system that has for decades served as a great example to remind Apple what not to do.
    A blast from the past!

    Anything that Apple does that takes your attention and focus away from the work being done on-screen is only going to reduce productivity. But I think that Apple has actually articulated a possible solution to the problem that addresses many of our concerns, including losing focus by having to look down at the keyboard area, clumsiness and fatigue of using touch screens, fingers/hands blocking the view and sightline to onscreen elements, and a lingering need to visually navigate around on-screen menus using pointing devices that move your focus. We need to look no further than the Apple Vision Pro to consider some potential ways of dealing with user focus, UI control, text selection, etc.

If Apple could bring some of the eye tracking, finger tracking, and voice control innovation from Vision Pro and other products to their intelligent displays, they could potentially provide a way to coordinate a user's focus with the control of specific on-screen actions. For example, if the user focused on the start of a block of text or selectable content, they could invoke a voice command to initiate an action, like "Begin Selection" (or "Control KB" for WordStar fans), move their focus to the end of the block and say "End Selection" (or "Control KK"), followed by an action word like "Copy", "Cut", "Delete", "Move", "Make Bold", "Make Italic", "Underline", etc. The user would then move their focus to the insertion point and give the action command, like "Paste", or say "Menu" to bring up a choice of options including things like "Mail", "Print", "Send to", etc. With a keyboard key tied to the action, the user's focus would never leave the screen. Same thing with accessing menus or form controls, i.e., use eye tracking and either audible commands, finger clicking (which would require hand tracking like on Vision Pro), or quick keyboard shortcuts overlaid on an on-screen popup list, to initiate the action, or to revert a previous action with an audible "Undo" or by visually clicking a main toolbar or a context-aware floating toolbar containing action buttons, much like what is used on Vision Pro and Apple Watch. If the user wanted to empty the trash, they would focus on the trash bucket and say "Empty the trash". Likewise things like "Open Folder", "Rename Folder", "Create Folder", and similar file management functions could be done visually with voice, focus-based context menus, or keyboard shortcuts. In a large office you probably don't want everyone blabbing away, so all commands would have a non-audible equivalent.

The goal here is to provide a "virtual touchscreen" or "visual touchscreen" that does not require physically touching the screen. Action controls that were previously delegated to the Touch Bar could be overlaid on the monitor's display surface (perhaps with translucency) and controlled via voice, virtual clicking, or keyboard shortcuts. Controls could be provided for scrolling left/right, VCR-style controls, up/down numerical controls, etc., all of which use the voice recognition, visual control, eye tracking, and hand/gesture tracking built into the display itself. This would definitely require a lot more intelligence in Apple's displays, both the built-in ones and the Studio Display and Pro Display XDR. Finally, Apple would have a way to encourage monitor buyers to go with Apple's "Intelligent Displays" rather than the dumb/generic displays from third-party vendors that only provide a display surface. Much of the required functionality could be put into the intelligent display itself. Hopefully the functionality would be orders of magnitude better than Apple's current Studio Display. I thought the Studio Display had at least some of the guts of an iPad, but somehow all of its processing potential has not been utilized to any great extent.
  • Reply 16 of 18
fastasleep Posts: 6,432 member
    Another recycled old patent post... Great.
  • Reply 17 of 18
michelb76 Posts: 639 member
    Why would they ruin a perfectly good laptop again?
  • Reply 18 of 18
wood1208 Posts: 2,917 member
Good to learn from a mistake and not make a similar mistake again.