lorin schultz

About

Username: lorin schultz
Joined:
Visits: 150
Last Active:
Roles: member
Points: 2,660
Badges: 1
Posts: 2,771
  • Here are the five biggest iPad Pro problems, because no device is perfect

    I disagree for the simple reason that the pen is mightier than the sword/mouse. Ever tried drawing with a mouse? Ever tried to be precise with a trackpad? All they are good for is moving stuff around but you can do that with a touchscreen and get immense precision using the Pencil.
    I used to work in a setting that provided a mouse, keyboard, touch screen, and physical control surface as methods of interacting with various components of the system. While some input methods only worked for certain tasks, there was a lot of overlap, so operators often had a choice. It didn't take long to establish that certain tasks were best accomplished with the touch screen, others were better handled by the mouse, etc. I suspect that's what we're discovering with the iPad, too.

    I think the Pencil is an excellent idea and appreciate its value. I just think there's benefit to be realized from a more traditional pointing device as well (not "instead"), particularly when one is typing, using an external monitor, or the iPad is positioned vertically. 

    Just because you’ve done something all the time doesn’t make it the best option. In fact, if you’re serious about video editing and precision on a Mac, you’d use a jog control, not a mouse
    Agreed, and I do. I have a keyboard in the middle, mouse to one side, and jog/shuttle on the other. Each is best at some aspect of interacting with the software, and each really sucks for certain operations. None is ideal on its own. It's the combination of various methods that makes it most efficient.

    The Pencil adds a form of control that's better than the others in some respects but worse in others. It's not the best choice for every kind of input on its own, but making it a nutritious part of this complete breakfast improves productivity.

    While my complaint is largely semantic I completely disagree that it’s the lack of the mouse that means that the iPad Pro can’t be used as a desktop/laptop replacement. If that’s so then why the hell is Adobe bringing over full Photoshop? Why is AutoDesk bringing over the full AutoCAD engine? The issue is not lack of mouse support but lack of software support and we are starting to see this changing now thanks to the original iPad Pro.
    The whole "Is it or isn't it?" argument depends as much on preferred working methods as on device capability. I continue to use a laptop for many tasks that could very easily be handled by an iPad. I dislike typing on the virtual keyboard and like to have the display propped up in front of me instead of flat on the desk. Neither is better or worse than the other; it's simply a matter of what I like.

    There's also the question of what one is doing with it. Some tasks, like the site assessment examples cited elsewhere, are obviously much better handled by an iPad than a laptop. Others, like post-production, may impose requirements that make a more traditional computer a better choice. Different tools for different jobs. There's overlap in their capabilities, but each is going to excel at certain things while sucking at others.

    The iPad Pro combined with the Apple Pencil is a very precise device and those decrying its abilities just don’t understand how computing is going to evolve because they’re stuck in the past.
    I agree with you about the Pencil but not your conclusion about motives. It's not so much being stuck in the past as preferring not to automatically discard useful tools just because something new comes along, at least not until the new thing covers what the old thing did or provides a viable alternative. So far the Pencil augments other input methods very, very well, but doesn't yet replace them.

    For me, the Pencil isn't a good alternative to a mouse, even though it offers greater precision and more intuitive selection capability. I realize that doesn't seem to make sense, but it comes back to workflow.

    In my line of work, speed is essential. Keyboard shortcuts are much faster than menu operations with a pointer, so that's where my hands are most of the time. Of course, I need to navigate the project and make selections, but it's faster and easier to slide my hand a few inches to slide-drag-click than it would be to pick up a pencil and move my hand to the screen. Either method yields the same result, and neither is much better or worse than the other by itself, but one fits into the workflow better than the other. I certainly could use the Pencil instead of a mouse, but it would be a little slower and quite a bit less comfortable.
  • Apple should keep Lightning for now, but USB-A has to die

    ascii said:
    Because with digital transmission you don't just have a raw signal being sent but can have an entire protocol defined.
    I don't disagree with most of what you wrote, and applaud both you taking the time to explain and your visionary view of how things could be.

    It's not really a visionary view; it's just how other digital ports already work, but it probably came across a bit too advocate-like.

    lorin schultz said:
    That's true, but can you think of any examples of how auto-configured multichannel playback would benefit a user holding an iPad? Are you going to watch a Dolby Atmos movie on an iPad? Does the potential for inter-device operability outweigh the benefits and convenience of a quick and easy analog audio connection?
    It's a false alternative though, because a digital connection can be both powerful and easy to use. Powerful because auto-negotiation allows the iPad to discover the best output available to it, and easy to use because auto-negotiation means the user doesn't have to configure connection parameters at each end.

    Of course in the real world it's not always like that, because different companies define different standards and then refuse to interoperate with each other, and that's where I think your analog argument is strongest. But that situation is not the fault of digital technology per se, and therefore it doesn't automatically follow that analog is the answer. It does follow that *some* kind of fallback mode is needed; however, that could be a digital fallback mode, such as an open-source codec (MP3?) that all devices must support by legislation, meaning you know that when you plug it in it will always work no matter what, even if only at the fallback level (and hopefully better than that).
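
    (Purely to illustrate the kind of negotiation I mean, here's a toy sketch in Swift. None of these type or format names come from any real standard; the point is only that the two ends settle on the best format they both understand, or drop to a mandated baseline that always works.)

    enum AudioFormat: Int, Comparable {
        case fallbackPCM = 0        // the mandated "always works" baseline
        case stereoLossless = 1
        case multichannelAtmos = 2

        static func < (lhs: AudioFormat, rhs: AudioFormat) -> Bool {
            lhs.rawValue < rhs.rawValue
        }
    }

    struct SinkCapabilities {
        let supportedFormats: Set<AudioFormat>
    }

    // Pick the best format both ends understand, or drop to the baseline.
    func negotiate(source: Set<AudioFormat>, sink: SinkCapabilities) -> AudioFormat {
        source.intersection(sink.supportedFormats).max() ?? .fallbackPCM
    }

    // A phone that can do everything, talking to plain stereo headphones:
    let phone: Set<AudioFormat> = [.fallbackPCM, .stereoLossless, .multichannelAtmos]
    let headphones = SinkCapabilities(supportedFormats: [.fallbackPCM, .stereoLossless])
    print(negotiate(source: phone, sink: headphones))   // stereoLossless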

    And regarding analog connectors, are they really so simple and issue-free? Don't you sometimes have a problem with low volume/signal level, or electrical interference from other devices? A couple of times I've bought laptops where there's noise over the headphones when the drive spins up, and I've thought, "Oh no, I'm stuck with this for years now."

    1. I'm not convinced that digital offers enough additional resistance to transmission errors over analog to make it universally preferred. If it were as bulletproof as pundits claim, I wouldn't have skips, pops, or bursts of digital noise in some of the songs I've ripped from CD. It doesn't take much to break cross-checking.

    2. Real-time error correction requires buffering, and therefore throughput delay (or what is commonly mislabelled "latency"). Unless the audio output somehow communicates that delay time to the video system, and the video system is capable of compensating for that delay, audio and video will go out of sync. There may also be ramifications for real-time audio production, like in the cases of iPhones or iPads being used as instruments or personal monitor mixers for performers.

    1. The CD was invented in 1982, though, and has very basic error correction; a lot of rippers even ignore the error-correction bits by default. In the modern age, with all the experience we have from streaming, we can do much better.

    2. Buffering is one way to do it, but not the only way; redundancy is another. If your cable has more bandwidth than you need (as you said below), why not just transmit five copies of the data at all times, in parallel? When you have a proper protocol defined you have that flexibility.
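
    (Again just a toy sketch, with an invented frame layout rather than any real wire format: every parallel copy carries a checksum, and the receiver simply plays the first copy that still verifies, so nothing has to be buffered waiting for a retransmission.)

    // Stand-in integrity check; a real link would use a proper CRC.
    func frameChecksum(of samples: [Int16]) -> UInt32 {
        samples.reduce(UInt32(0)) { ($0 &* 31) &+ UInt32(UInt16(bitPattern: $1)) }
    }

    struct ReceivedFrame {
        let samples: [Int16]    // audio payload as it arrived over the wire
        let checksum: UInt32    // checksum as it arrived over the wire

        var isIntact: Bool { checksum == frameChecksum(of: samples) }
    }

    // With N copies sent in parallel, play the first one that survived intact.
    func recover(copies: [ReceivedFrame]) -> [Int16]? {
        copies.first(where: { $0.isIntact })?.samples
    }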


    What level of detail can not be accurately transmitted over two or three feet of wire? The sound of atoms banging into each other? Ultrasonics? Is it worth trading the benefits of the headphone jack for the ability to transmit sounds no one can hear? Even the crappiest wire on the planet exceeds the capability of the best transducers.

    Ok, I guess compression is one thing you might not need, but it was just one example of flexibility really.

    No argument that what you describe is theoretically superior. The question for me is whether the ACTUAL BENEFITS outweigh the inconvenience, cost, and complication imposed by that approach. There are many, many, many more cases of a headphone jack being not only better-than-adequate but also much more cost-effective and convenient than there are examples of how the user will genuinely benefit from moving digital conversion and amplification out of the iDevice. And, more specifically, the Apple-supplied dongle doesn't achieve those objectives anyway. And it adds another layer of power consumption.
    I'm not someone who likes complicated designs; like you (I suspect), I think simple/just-works is better. I just think that "digital everywhere" makes for a simpler world overall, even if there's some short-term pain. And I'm not saying existing digital standards are perfect and couldn't do with some intelligent fallback modes or other changes.
    That post should be a "sticky" at the top of the forum for anyone who thinks the headphone jack needs to stay. We may not all ever completely agree on which deficiencies are worth suffering and which aren't, because that's just an issue of personal preference, but yours has been the most well thought out, lucidly explained, and non-confrontational discussion on the matter I've ever encountered. Bravo!
  • Here are the five biggest iPad Pro problems, because no device is perfect

    crowley said:
    mac_128 said:
    crosslad said:
    Here’s how to solve your problems:

    1 External Drive support - use a WiFi Drive
    2 Lack of mouse - use the Apple Pencil
    3 Headphone jack - use a dongle or a device with a USB-C jack. 3.5mm headphone jacks have gone from mobile devices
    4 Overpowered - come on, rendering a video in less than half the time is a problem? It will also future-proof the iPad. 
    5 Storage - see 1
    1. I agree. Or the cloud.
    2. Pencil is the worst possible mouse substitute I can imagine, as not only must one move their hands from the keyboard to touch the screen, but then they have to pick up and put down a pencil, with no support to stabilize it in mid-air.
    But the mouse causes Carpal Tunnel Syndrome. I haven’t heard of anyone getting carpal tunnel syndrome caused by Pencil usage.

    Pencil is the worst possible mouse substitute when playing Call of Duty. But for that, the mouse is not the best input device either; there are game controllers for that. Yet I don't see people discussing "mouse or game controller" in gaming forums, so why are we having such a pointless discussion here?

    Pencil can do everything that a mouse can do and even more. The mouse is a mechanical pointing device of the 1960s. Pencil is a 21st-century technology, with state-of-the-art engineering encompassing both the display and the device. Pencil is not a stylus; a stylus is a stick compared to Pencil. What the mouse interface provided and the touch interface couldn't was precision data selection. With Pencil, precision data selection is possible, even better than the best mouse or trackpad can provide.
    You’re taking my response out of context of the article to which we’ve all been replying. The Pencil is great for fine-tuned selection and editing directly on an iPad lying flat on a table. It’s terrible for an iPad propped up on a keyboard stand. It’s terrible when using an external monitor. Heck, even if the iPad is lying flat and the user is merely typing on it, having to take their hands off the keyboard to pick up and put down a Pencil is still worse than using a mouse. And forget about using it while sitting on a couch as one might with a MacBook. That’s where a trackpad rules the day. Apple has added a virtual trackpad, but that’s far from ideal, even on native Apple apps, though still better than the Pencil, which you’d have to pull off its magnetic docking perch while holding the iPad with one hand, and then reattach before continuing with the typing. So in general I’d say a mouse or trackpad would still be better for most everything in daily computer use, aside from drawing, editing, and taking notes.

    lowededwookie said:
    [...] I can edit video on an iPhone just as easily as using iMovie on the Mac
    "Easily," yes. Accurately, no. Fine adjustments are difficult using a finger on a small screen.
    There is Pencil for that.
    Besides, even putting all that aside, the iPad Pro's marketing includes using the keyboard stand and an external monitor. Both make touch a less effective control method than using a mouse.
    If you'd watched the Keynote you'd know (or you already know) that the reason to attach a 4K monitor to the iPad Pro is to follow iMovie edits in real-time 4K, since the iPad's own display is not 4K. The actual iPad page on Apple's site mentions only "USB-C for ... external display" and says nothing more about that external display. There is no point in presenting it as the main "computer display" of the iPad.
    So the user has to continually shift their eyes back and forth from the iPad to the screen, as well as take their hands off the keyboard, to lift a Pencil to the vertical screen, and hover it unsupported in midair to make detailed selections, then put the Pencil down to continue using the keyboard; all the while shifting their eyes from the iPad to the 4K monitor and back as they make the fine adjustments and make sure the Pencil is where they think it’s supposed to be on the iPad? Doesn’t really sound more efficient than a mouse ... in fact it sounds a lot worse, even if Apple only intended the external monitor to be used solely as a 4K reference display and not a workspace.
    All your long anecdotal narration is irrelevant. The iPad is not for desktop usage. If you don't want to leave your comfy desk, get a laptop. Neither the folding keyboard nor the 4K display is a main component of the iPad Pro. The monitor is there only to watch 4K iMovie edits. Besides that, there is absolutely no point in buying a 4K monitor for the iPad Pro.

    OK, if your point is to get a trackpad on that foldable keyboard, then this is not possible: 1) How will you power it? 2) What if people with disabilities or long fingernails want to attach a mouse to that keyboard? How will you power both? 3) There is no pointer in iOS. Your request requires the whole UI to be rewritten for the mouse interface. That won't happen; buy a Surface, it has both touch and mouse. I am out of this mouse discussion.
    1) It's not anecdotal. It's reality. But it is true that the iPad is obviously not capable of desktop usage -- thus this discussion, triggered by Apple's contention that it is a laptop/desktop replacement. Most agree that it COULD be, but isn't.

    2) Your second paragraph is just throwing out objections that have no basis in reality.
    - How will you power it? The same way Bluetooth keyboards and mice are powered now, or via the Smart Connector. That problem was solved years ago.
    - What about people with long fingernails? They will do the same as they do today.
    - The entire UI needs to be rewritten: that just isn't true. Not by a long shot. iOS already has multiple pointers -- the finger is one, and the two-finger touch on the iPad works as another.
    On the UI "issue", the Apple TV has a trackpad interface and no pointer.
    The difference being the Apple TV has only specific targets. There's no such thing as a random placement of the tool. For something like editing a photo, you have to be able to see what the pen is going to touch. That means either looking at the iPad, which is fine but makes the external monitor superfluous, or Apple adding support for a mouse with a cursor to iOS. Even that won't work very well with the Pencil, though. It's one of the reasons I got rid of my Wacom tablet -- you have to "hover" the tip of the stylus over the work area to move the cursor. It's not a deal-breaker (obviously, look at how many people use Wacom) but I found it unnatural and restrictive.
  • Apple should keep Lightning for now, but USB-A has to die

    polymnia said:
    ascii said:
    I think we should focus on getting rid of analog ports (the headphone jack being the only remaining one) and going all digital.
    I'm curious why you want the headphone jack removed? What advantage do you perceive from that?

    Headphones are analog. They have transducers in them. At some point before the speaker, the signal MUST be converted to analog and amplified.

    The phone or tablet already has a digital-to-analog converter and an amplifier. Removing the headphone jack doesn't mean they can be removed too, because they're required for the speaker(s) on the device itself. By removing the headphone jack, those parts of the chain have to be duplicated in the form of a dongle hanging inelegantly on the outside of the device, instead of just using the parts that already exist, tucked neatly inside the device.

    On devices with only one "digital" port, like a phone or tablet, removing the headphone jack means that any wired audio connection ties up the port so it can't be used for anything else. That complicates some really common use cases, like using the device in the car. With only a Lightning port on the phone I can either charge it or listen to it, not both, unless I add a dongle that does nothing more than duplicate parts that are already inside the phone!

    None of this is insurmountable. Adapters and wireless alternatives exist. I just don't see how they offer any ADVANTAGE. They add cost, require charging additional devices, and are less convenient. How is this BETTER than just leaving the headphone jack where it is/was?
    Duplication is what bugs you? What about duplicate holes in my iPhone? One (Lightning) that does pretty much everything and the other (headphone jack) that does only one thing.

    In your explanation of the supposed requirement for dongles to replace the functionality of the missing headphone jack, you conveniently omit the fact that many of us have made the jump to AirPods or other Bluetooth headphones. I have. Why should I have to have an extra hole cut into the bottom of my phone because you are stuck in the past with a dwindling band of other complainers? I say, bring on the future. In fact, let’s also lose the Lightning jack as soon as it’s feasible. I’d love the iPhone to be stripped down to the simplest, most waterproof & structurally uncompromised form possible. 

    Leave the extra ports for iPads & Macs!
    I have answers to some of your questions, but it seems you're not really interested in having a respectful discussion so much as wanting to demonstrate your advanced state of tech evolution by hurling shade.

    Since the only one who gets anything out of that is you, I'm going to politely excuse myself. You carry on. Just remember to put the Kleenex box back where you found it when you're done.
  • Apple should keep Lightning for now, but USB-A has to die

    ascii said:
    Because with digital transmission you don't just have a raw signal being sent but can have an entire protocol defined.
    I don't disagree with most of what you wrote, and applaud both you taking the time to explain and your visionary view of how things could be.

    That said, some potential benefits are outweighed by practical realities. I'd like to address some of those, partly just to play Devil's Advocate, but also to illustrate how trying to make something better can actually make it worse.

    ascii said:
    The computers in the phone and speaker can talk to each other and describe each other's capabilities, so that the phone knows how many speakers there are and their configuration, and what the best-quality signal they can handle is.
    That's true, but can you think of any examples of how auto-configured multichannel playback would benefit a user holding an iPad? Are you going to watch a Dolby Atmos movie on an iPad? Does the potential for inter-device operability outweigh the benefits and convenience of a quick and easy analog audio connection?

    ascii said:
    [...] once the transmission starts there can be error correction and retransmission, errors are very easy to detect in digital data.
    1. I'm not convinced that digital offers enough additional resistance to transmission errors over analog to make it universally preferred. If it were as bulletproof as pundits claim, I wouldn't have skips, pops, or bursts of digital noise in some of the songs I've ripped from CD. It doesn't take much to break cross-checking.

    2. Real-time error correction requires buffering, and therefore throughput delay (or what is commonly mislabelled "latency"). Unless the audio output somehow communicates that delay time to the video system, and the video system is capable of compensating for that delay, audio and video will go out of sync. There may also be ramifications for real-time audio production, like in the cases of iPhones or iPads being used as instruments or personal monitor mixers for performers.
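
    (To put the sync problem in concrete terms, here's a made-up sketch -- not any real API -- of what the video side would have to do, and could only do if the audio output actually reports how much delay it adds.)

    struct AudioSink {
        // Total buffering/processing delay the sink adds, in seconds,
        // IF it is able to report one at all.
        let reportedOutputLatency: Double?
    }

    // Hold the video back by whatever the audio path reports. If nothing is
    // reported we have to assume zero, and lip sync drifts by however much
    // the sink is actually buffering.
    func videoPresentationTime(for mediaTime: Double, sink: AudioSink) -> Double {
        mediaTime + (sink.reportedOutputLatency ?? 0)
    }

    let dumbOutput  = AudioSink(reportedOutputLatency: nil)    // tells us nothing
    let smartOutput = AudioSink(reportedOutputLatency: 0.120)  // reports a 120 ms buffer
    print(videoPresentationTime(for: 10.0, sink: dumbOutput))  // 10.0 -- picture leads sound
    print(videoPresentationTime(for: 10.0, sink: smartOutput)) // ~10.12 -- compensated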

    ascii said:
    There could also be digital compression to a higher quality signal than could otherwise be sent over a thin wire.
    What level of detail can not be accurately transmitted over two or three feet of wire? The sound of atoms banging into each other? Ultrasonics? Is it worth trading the benefits of the headphone jack for the ability to transmit sounds no one can hear? Even the crappiest wire on the planet exceeds the capability of the best transducers.

    ascii said:
    Yes, a signal must ultimately be converted to analog to be heard, but digital transmission is so much more powerful/flexible than a raw analog signal that this conversion should be pushed as far downstream as possible. Ideally right at the speaker, but at least after any kind of transmission through wires or air.
    No argument that what you describe is theoretically superior. The question for me is whether the ACTUAL BENEFITS outweigh the inconvenience, cost, and complication imposed by that approach. There are many, many, many more cases of a headphone jack being not only better-than-adequate but also much more cost-effective and convenient than there are examples of how the user will genuinely benefit from moving digital conversion and amplification out of the iDevice. And, more specifically, the Apple-supplied dongle doesn't achieve those objectives anyway. And it adds another layer of power consumption.