UI fun

Comments

  • Reply 21 of 70
    ipodandimac Posts: 3,273 member
    Damn... I was hoping for the Minority Report stuff that's used when Tom Cruise is looking at the merry-go-round. I'm talking massive screen, gloves, and lifting/throwing things all over the place. If I had the money I would put one together myself, 'cause it would be easy as hell, but oh well. This is pretty cool though.
  • Reply 22 of 70
    kickaha Posts: 8,760 member
    You oughtta try it with a projector. We got yer massive screen right here, buddy...



    And... no gloves.



    And... gesture support with something like Cocoa Gestures lets you do all sorts of neat things. If we can eke a bit more out of it, I plan on using it during my Keynote presentation for my defense. Want to point at something on the screen? Screw a little laser pointer - raise your hand into the camera view, and voila, they see you in 14' glory, and you point at what you want. Done? Drop your hand, the content is clear again. Want the next slide? Lift your hand into camera view and slide it left. Previous slide? Slide right. Etc.
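    (If you're curious how that kind of swipe mapping could be wired up, here's a rough Python sketch sitting on top of any fingertip/hand tracker. This isn't the FaceTop code - hand_x_position(), next_slide(), and previous_slide() are hypothetical stand-ins for whatever tracker and slide-control hooks you'd actually use.)

        import time

        SWIPE_THRESHOLD = 0.35   # fraction of the frame width the hand must travel
        SWIPE_WINDOW = 0.5       # seconds allowed for that travel

        def hand_x_position():
            """Hypothetical tracker call: the hand's normalized x position in the
            camera frame (0.0 = left edge, 1.0 = right edge), or None if no hand
            is in view."""
            raise NotImplementedError

        def next_slide():
            print("next slide")       # swap in your Keynote/remote-control hook

        def previous_slide():
            print("previous slide")

        start_x = start_t = None
        while True:
            x = hand_x_position()
            if x is None:                  # hand dropped out of view: reset
                start_x = start_t = None
            elif start_x is None:          # hand just entered view: note where and when
                start_x, start_t = x, time.time()
            elif time.time() - start_t <= SWIPE_WINDOW:
                if x - start_x < -SWIPE_THRESHOLD:    # slid left
                    next_slide()
                    start_x = start_t = None
                elif x - start_x > SWIPE_THRESHOLD:   # slid right
                    previous_slide()
                    start_x = start_t = None
            else:                          # took too long: treat as a fresh start
                start_x, start_t = x, time.time()
            time.sleep(0.03)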



    This is just the tip o' the iceberg.
  • Reply 23 of 70
    torifile Posts: 4,024 member
    Cool stuff, kick. It's good to see that someone's PhD work actually DOES something... Mine, on the other hand... an exercise in futility. At least I don't need significant results to graduate...
  • Reply 24 of 70
    defiant Posts: 4,876 member
    Quote:

    Originally posted by Kickaha

    Besides, do you know the *HELL* I've gone through the past few months keeping my blasted mouth shut?!?



    I understand. But you can always PM me for a 'private beta'.



    I think I understand what it means to be silent about something really cool. You want to tell the whole world, but you just can't. Sucks.
  • Reply 25 of 70
    kickaha Posts: 8,760 member
    Quote:

    Originally posted by torifile

    Cool stuff, kick. It's good to see that someone's PhD work actually DOES something... Mine, on the other hand... an exercise in futility. At least I don't need significant results to graduate...



    That's the suckiest part. This isn't my PhD work!



    Four years of solid work on the dissertation, and *THIS* stupid thing gets all the attention. Go figure.



    Ah well, it's still cool.
  • Reply 26 of 70
    thuh freak Posts: 2,664 member
    OK, so it's patented, and closed source (you evil bastard), but could you divulge some of the trickier puzzles or pitfalls? Like, so a freak could get a head start on an illegal version?



    A more specific question: does this thimble have anything special about it? Magick potion? LEDs? Just colorful paint?
  • Reply 27 of 70
    sunrein Posts: 138 member
    Looks really excellent. My mind reels at all the possible applications for this technology. For musicians particularly...
  • Reply 28 of 70
    kickaha Posts: 8,760 member
    Virtual theremin, anyone?



    freak, you know I can't do that... sheesh. I can't talk about anything that's not in the published papers... http://www.cs.unc.edu/~stotts/work/#TVF Knock yerself out.



    And yeah, this thing rocks with SubEthaEdit - two people at remote locations, interacting with the *same document* and both can edit it. "No, dipstick, this line of code *right here* (point point point)..." Works great with one person solo at a machine, or two at different locations.
  • Reply 29 of 70
    kim kap sol Posts: 2,987 member
    Kickaha and Co: building a generation of beefy-armed kids.



    I can picture it now... year 2015, a bunch of people flailing their huge arms out of habit or maybe a nervous tic.
  • Reply 30 of 70
    ast3r3x Posts: 5,012 member
    Pull up that old video of this thing. I can't wait to use it (we will be able to eventually, correct?)... I'll buy a FW camera just for this.



    The thing that gets me so excited is having the iSight simply not register my fingers while they're down on the keyboard typing. Lift my hand up... maybe just one finger so the others don't have to leave the keyboard... and click, drag, select, edit, whatever... and go back to typing.



    HOW LONG DO WE HAVE TO WAIT?
  • Reply 31 of 70
    thuh freak Posts: 2,664 member
    Quote:

    Originally posted by Kickaha

    freak, you know I can't do that... sheesh. I can't talk about anything that's not in the published papers... http://www.cs.unc.edu/~stotts/work/#TVF Knock yerself out.



    I know, was only teasin'. But if large amounts of source code were to find their way into, I don't know, an email, or a PM, a freak wouldn't tell anyone. Wink wink.
  • Reply 32 of 70
    kickaha Posts: 8,760 member
    Talk to your employers, or developers you know.



    Tell them that if they're interested, to contact the UNC Office of Tech Development (http://research.unc.edu/otd/) and speak to Jim Deane regarding the FaceTop technology.



    Seriously.



    Oh, and Jim's a busy guy... serious inquiries only.



    Honestly, this is the fastest way to get this out there, because I can't give the yes or no - only OTD can.



    FWIW, that article has already generated more than a couple inquiries, but they're all for in-house use, not commercial or consumer product interest.
  • Reply 33 of 70
    hobbes Posts: 1,252 member
    Wow, really interesting.



    Kickaha, did you find that using gestures is easier and/or more accurate for users with their mirror image on the screen, or about the same? The mirror image that fades in and out is a lovely idea, but I'm curious whether it's as necessary for solo use.



    Also, can you use this system while decked out in crimson garb?
  • Reply 34 of 70
    kickaha Posts: 8,760 member
    Various clothing (and lit exit signs, we found out at one conference) does indeed give the system fits at times...



    The mirror image is the entire *key* to this system. Turn it off, and people have a hell of a time coordinating where their hand is in the air with where the cursor is.



    See, the whole problem boils down to one of spatial registration - previous systems used multiple cameras, radio transmitters, what have you, to figure out where in space the fingertip was, then most tried to map out the user's line of sight relative to that, and then project the fingertip onto the surface of the screen, and figure out what the person was pointing at.



    Instead, we bypass the whole problem by letting one of the more advanced spatial registration systems we have available handle it... the user's brain. By using the user's own eye-hand coordination, we eliminate the need for costly hardware or computationally expensive algorithms. The user sees their own image, and their own fingertip, and in a couple of seconds figures out how far they need to move their hand in what direction/angle to move the cursor to a given location.



    One thing we've noticed a lot of people do almost automatically is figure out the boundaries - how far they have to move their finger to hit the top, bottom, left and right edges. After that, it's locked in their brain, and away they go. We don't tell them to do it, they just do it instinctually.



    So no, removing the image doesn't help... in fact it makes the wetware have fits.



    We get that question a lot, and the easiest way to show it is to have the person interacting with the system, and I'll just reach over and turn the image off. Suddenly, it's not so easy for them. If I move the camera slightly, disturbing their mental map of the spatial registration, they utterly lose their place.
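    (To make the "let the brain do the registration" point concrete: the whole pipeline can stay in 2D. Below is a minimal Python/OpenCV sketch of the idea - mirror the camera feed, find a brightly colored fingertip marker, and scale its camera coordinates straight onto the screen. It's an illustration, not the FaceTop source; the HSV range and screen size are assumptions you'd tune yourself, and the cursor handoff is just a print.)

        import cv2

        SCREEN_W, SCREEN_H = 1440, 900                    # your display resolution
        LOWER, UPPER = (100, 150, 80), (120, 255, 255)    # HSV range for the marker (tune this)

        cap = cv2.VideoCapture(0)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            frame = cv2.flip(frame, 1)    # mirror it, so it behaves like a reflection

            # Find the colored fingertip marker in plain 2D camera coordinates.
            hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
            mask = cv2.inRange(hsv, LOWER, UPPER)
            m = cv2.moments(mask)
            if m["m00"] > 0:
                cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
                # No 3D reconstruction, no line-of-sight model: just scale the
                # camera point onto the screen. The user, watching their own
                # mirrored image, closes the loop with eye-hand coordination.
                sx = int(cx / frame.shape[1] * SCREEN_W)
                sy = int(cy / frame.shape[0] * SCREEN_H)
                print("cursor ->", sx, sy)   # hand off to your cursor-control layer

            # The semi-transparent self-image over the desktop is the other half
            # of the trick; here we just show the mirrored feed in a window.
            cv2.imshow("mirror", frame)
            if cv2.waitKey(1) == 27:         # Esc quits
                break
        cap.release()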
  • Reply 35 of 70
    Mr Beardsley
    So this is kinda like ToySight for the OS? Cool. All my friends love playing ToySight. It'd be a hoot to film them with their arms flailing all over.
  • Reply 36 of 70
    kickaha Posts: 8,760 member
    Quote:

    Originally posted by Mr Beardsley

    So this is kinda like ToySight for the OS? Cool. All my friends love playing ToySight. It'd be a hoot to film them with their arms flailing all over.



    Yeah, we actually predate ToySight by a few months, but it's that idea on steroids. They're doing edge detection of movement deltas, and it works nicely for what they need. We're doing... a bit more.
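    (For anyone wondering what "edge detection of movement deltas" boils down to in practice: the basic ToySight-style trick is differencing successive frames and checking where the change lands. Rough Python/OpenCV sketch below - a simplification for illustration, not what ToySight actually ships; the hot-zone location and thresholds are made up.)

        import cv2

        cap = cv2.VideoCapture(0)
        ok, prev = cap.read()
        prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

        # A "hot zone" in the corner of the frame; enough motion there counts as a hit.
        ZONE = (slice(0, 120), slice(0, 120))

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

            # Movement delta: wherever this frame differs from the last one.
            delta = cv2.absdiff(gray, prev)
            _, moving = cv2.threshold(delta, 25, 255, cv2.THRESH_BINARY)

            if (moving[ZONE] > 0).sum() > 500:    # enough changed pixels in the zone
                print("zone hit!")

            prev = gray
            cv2.imshow("motion", moving)
            if cv2.waitKey(1) == 27:    # Esc quits
                break
        cap.release()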
  • Reply 37 of 70
    vox barbara Posts: 2,021 member
    Actually, you want some UI fun, you get some UI fun
  • Reply 38 of 70
    kickaha Posts: 8,760 member
    A pox on thee! I'm gloating here!
  • Reply 39 of 70
    Not trying to downplay your achievements at all. ToySight is just the closest thing I can compare it to. And, everyone raves about ToySight. If your system does more, then I imagine everyone will rave about it too.
  • Reply 40 of 70
    kickaha Posts: 8,760 member
    Oh, ToySight is a kickass technology, no doubt about it. Heck, I was stunned that no one had done this before us *at all*. I mean, it wasn't exactly an earth-shattering idea, and I thought it was kind of obvious, but... *shrug*



    Anyway, it's fun.