Best Practices of Touch Screen Interface Design (voltagecreative.com)
14 points by wmeredith on May 16, 2008 | 18 comments



The Newton provided some nice UI guidelines. One of the ones I liked was putting controls on the bottom of the screen and the results/display on the top. This prevented your hand from obscuring the results.


That's a good point. I've appended it to bullet point #7.


I think that direct manipulation on a touch screen is a big deal.

"Grabbing" an object by touching it and then being able move it is an even better illusion on a touch screen than with a mouse. And it is delightful.


But "grabbing" is a less-efficient solution. Just as moving the mouse around may be less efficient than typing on that keyboard; grabbing stuff to interact with them can be a waste of time. More intuitive != more productive; and the current problem (IMHO) with mobile devices is that they have such low efficiency rates that their productivity _despite_ the advantage of being always on and always with you still suffers (as a result of size, awkwardness, and time-consuming adaptations we need to use).


There's more too.

There's a boatload of research in CHI and Human Factors about input mechanisms. Much of it is crappy. Some of the best was done in the late '60s and revolved around the "McRuer Crossover Model".

There's a neat bit of work looking at what kinds of models a user has to build in order to control a noisy system efficiently. It turns out you can cleanly calculate when the task gets too hard. This has since been corroborated by flight testing of (among other things) fighter aircraft.

The bottom line is that the less (mental) work you have to do, the better you'll do.

Each integration counts as work, and by that measure:

touchscreen < [mouse|trackpad] < [joystick|nubbin]

With a mouse, you have to think "I have to move it for this amount of time to get to where I want". You integrate its motion to get the final position.

Joysticks & pointing nubbins are one step worse: "I have to hold it in this deflection to cause it to move the cursor for this amount of time."

Touch typing on a keyboard falls outside this space because there is no pointing/tracking task at all.

And, for reference, reversing a two-stage tractor trailer is 5 levels of integration. Getting an oil tanker into harbor is 6!
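
To make the integration hierarchy concrete, here's a toy simulation (my own sketch, not from the CHI literature -- the gain and time constants are made up). It accumulates tracking error for a 1-D pointing task where the operator's command passes through "order" integrators before moving the cursor:

    def tracking_cost(order, gain=2.0, dt=0.001, t_end=10.0):
        """Accumulated |error| over t_end seconds of chasing a unit-step
        target when the command is integrated `order` times:
        0 ~ touchscreen, 1 ~ mouse/trackpad, 2 ~ joystick/nubbin."""
        state = [0.0] * (order + 1)      # [position, velocity, ...]
        cost = 0.0
        for _ in range(int(t_end / dt)):
            err = 1.0 - state[0]
            cost += abs(err) * dt
            if order == 0:
                state[0] = 1.0           # touch: the finger lands on the target
            else:
                state[order] = gain * err            # proportional "operator"
                for i in range(order, 0, -1):
                    state[i - 1] += state[i] * dt    # integrate down the chain
        return cost

    print(round(tracking_cost(0), 3))  # ~0.001: touch is effectively instant
    print(round(tracking_cost(1), 3))  # ~0.5:   velocity control converges smoothly
    print(round(tracking_cost(2), 3))  # ~6.4:   acceleration control oscillates;
                                       # a purely proportional operator can't damp it

The order-2 case is the crossover model's point: McRuer found that trained operators adapt until the open loop looks like w_c * e^(-tau*s) / s near crossover, so every extra integration in the plant forces the operator to generate that much more "lead" (predicting velocity before acting) -- which is exactly the extra mental work.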


This is awesome stuff, Sanj, thank you for sharing. Your "about" section in your profile is blank -- do you have a blog or site or anything?


My company blog needs more comments before it'll be worth reading!


I totally disagree. Building a handheld application requires getting into the mindset of the UI it demands. I say this having worked on handhelds for, um, 15 years now.

And you're badly undervaluing delight.

Let me give a concrete example from my own work.

A buddy and I wrote a text-adventure parser so you could play stuff like Zork and HHGTTG on the Newton. It worked really well and was reasonably popular, at least partly because of the interface.

Understand that you always have access to a keyboard on the Newton -- there was even an external one.

But what we did was have a word automatically entered into the input box when you tapped on that word in the description above.

That was scads faster than tapping something out. And made users wicked happy.
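
For flavor, here's a minimal sketch of that tap-a-word idea in Python/Tkinter (obviously not the original Newton code -- the layout and word handling are my own guesses at the behavior):

    import tkinter as tk

    root = tk.Tk()
    root.title("tap-a-word sketch")

    # Description on top, controls on the bottom, per the Newton guideline.
    text = tk.Text(root, height=5, wrap="word")
    text.insert("1.0", "You are standing in an open field west of a white "
                       "house, with a boarded front door.")
    text.config(state="disabled")    # read-only, but still receives clicks
    text.pack(fill="both", expand=True)

    entry = tk.Entry(root)
    entry.pack(fill="x", side="bottom")

    def tap_word(event):
        # Resolve the click to a character index, then expand it to the word.
        index = text.index(f"@{event.x},{event.y}")
        word = text.get(f"{index} wordstart", f"{index} wordend").strip()
        if word.isalpha():
            entry.insert("end", word + " ")

    text.bind("<Button-1>", tap_word)
    root.mainloop()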


Thanks for sharing your experience in this field. But (and forgive me if I'm mistaken) your application was a special case where the touchscreen UI made more sense, just as the Nintendo DS has a usability edge with its touchscreen. But for more general "productivity" applications -- from email clients to word processors to graphic design software (where the "word bias" isn't an issue) -- I don't see touchscreens being the more efficient way to go.

In my opinion, delight is great, but not when it comes at the expense of productivity. For games and leisure, productivity is a non-issue and - you're absolutely right - delight and simplicity should be the developer's number one focus. But for productivity tools, making things easier and quicker to get done is the most important thing. There's nothing more frustrating than banging out an email on a touch-screen keyboard; for productivity tools, increased productivity is the delight.


So perhaps point #8 should be:

don't use a touchscreen when you shouldn't.


sanj,

Your comments have been pretty insightful. Would you mind sharing a little more about your background wrt handheld development?


Sure. I ended up with a Newton from school when Apple was dumping the MP100s. A buddy and I started writing stuff for them -- some games and a VT100 emulator.

Bits of them here: http://web.archive.org/web/20041210150544/http://www.scrawls...

The game I mentioned is here: http://web.archive.org/web/20030811081833/scrawlsoft.com/pro...

When the Newton was discontinued on 1998-02-27, we got a call from the guys at Palm (mostly ex-Newton folks) to lure us into coding for PalmOS rather than WinCE. We only wrote one neat bit: http://www.powermedia.com/pilot/index.shtml

About a year after that I helped cofound PatientKeeper, http://www.patientkeeper.com, where I wrote the first 7ish versions of the handheld app before handing it off to some really talented engineers we'd hired. I spent the next several years writing the server side of that system while watching them take ideas I'd had to new, amazing places.

My last project there resulted in this: http://www.patientkeeper.com/news_and_events/press_releases/...

Which, by my calculations, represented ~70% of their $45m FY08 revenue.


It's about size. And the mouse is not less efficient than the keyboard for certain tasks. For others (especially, I'd imagine, for the Hacker News crowd) it can be less efficient, but I don't think that's much of a general rule.

I love the keyboard on my full-sized computers. But recently I tried to ssh into a server from my iPhone to fix an issue, and typing was a bitch. Despite gestures and shortcuts and my aliases, I couldn't really get anything done and gave up, leaving it for later.

I don't know what your idea of productivity is, but some things are very amenable to being grabbed and manipulated. It's just so much simpler and faster. For example, I use irssi on my phone to get on IRC when Colloquy stops working, and I wish I could flick back and forth between windows -- or manipulate and split windows with just my fingers, zooming in and out and dragging things. Instead, I have to type it all out, and it is a very annoying pain in the ass. I know irssi isn't a mobile app, but it sure makes a good example.


This is one of the reasons I use my Wacom tablet all the time.

Now I just need to buy one of those expensive Wacoms that let you write right on the screen <_<


I'd like one where I could use my hands on the screen with an appropriate UI at a reasonable price.

I have a Wacom Intuos tablet, which can be sweet for the likes of Photoshop, but I got an iPhone a year ago and (with the exception of badly designed apps, of which there are plenty) it's just not the same.

In fact, every single LCD I use has fingerprints all over it because sometimes I forget they're not touchscreens and I touch the display to do something.

But the iPhone just feels different from most touchscreen-based systems I've used. Probably the UI. It's a lot more responsive and intuitive.


Good to see that this is being blogged about. There's been a lot of research done to find the most efficient button sizes for handheld touchscreens. I think the next big push is going to involve design guidelines for truly mobile apps, apps you use while actually navigating the world and not stopping to work.

See http://www.mediateam.oulu.fi/publications/pdf/1076.pdf for a paper called "Target size study for one-handed thumb use on small touchscreen devices". It's a good starting point.

I had to think hard about this when we created Mobiphos, http://nirmalpatel.com/mobiphos.pdf, to study mobile collocated photo sharing.
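
For a rough feel of why target size matters so much, Fitts's law is the standard model behind this kind of study. Here's the generic Shannon formulation (not the specific analysis from the paper above; the a/b coefficients are illustrative placeholders, not fitted thumb data):

    import math

    def fitts_mt(distance_mm, width_mm, a=0.1, b=0.15):
        """Predicted movement time in seconds: MT = a + b * log2(D/W + 1).
        a and b are per-user, per-device constants."""
        return a + b * math.log2(distance_mm / width_mm + 1)

    # Halving a button from ~9.6 mm (a size the thumb-use literature tends
    # to recommend) to 4.8 mm, at 40 mm away, adds ~0.85 bits of difficulty:
    print(round(fitts_mt(40, 9.6), 3))   # ~0.455 s
    print(round(fitts_mt(40, 4.8), 3))   # ~0.583 s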


Yes, I wholeheartedly agree. This is one of the strengths of touch screen interfaces. Reaching out and manipulating something is just about the most natural way to interact with it. Just watch a baby for a few minutes; their instinct is to touch everything.


It could be argued that the reason they touch is that they haven't learned to use other (more efficient) senses in tandem with the unique human ability to deduce and comprehend. We can touch things to see what they do, or we can process what we see and hear to arrive at the same conclusion with less physical effort and in less time.



