2011/02/28

Even older than Raskin

As promised, a post about Fitts's Law - the second post about GUI paradigms.

Paul Fitts was an Ohio State University psychologist who developed a model of human movement (see Wikipedia). It turned out to be a very accurate and usable model, predicting how fast we (humans) can accurately aim at a target, depending on the target's size and distance. This model is called Fitts's Law.

Fitts's Law became so widely used in GUI design because it applies well to pointing devices such as the mouse. It predicts, for example, why targets on the edge of your screen are so easy to hit (and targets in the corners of your screen even easier). For details, see the (somewhat dated) article on AskTog: First Principles of Interaction Design. Or, for a straightforward explanation of the impact of Fitts's Law, see Particletree's Visualizing Fitts's Law.
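To make the model concrete: in its common "Shannon" formulation, Fitts's Law predicts the movement time as MT = a + b * log2(D/W + 1), where D is the distance to the target and W its width. Here's a minimal sketch in Python; the constants a and b are made-up placeholders (in reality they're fitted empirically per device and user), and the edge example simply models the fact that a target on a screen edge behaves as if it were much deeper, because the pointer stops there.

    import math

    def movement_time(distance, width, a=0.1, b=0.15):
        # Shannon formulation of Fitts's Law: MT = a + b * log2(D/W + 1).
        # a and b are placeholder values; real ones are fitted per device and user.
        index_of_difficulty = math.log2(distance / width + 1)  # in bits
        return a + b * index_of_difficulty

    # A small button far away in the middle of the screen: slow to acquire.
    print(movement_time(distance=800, width=5))    # ~1.20 s

    # The same button on the top edge: the pointer stops at the edge, so the
    # target is effectively much "deeper" - the Macintosh menu bar trick.
    print(movement_time(distance=800, width=200))  # ~0.45 s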

The reason I'm posting about it, though, is that we're still discovering new ways to apply a well-known law from 1954:
- In 1954, Fitts's Law was "discovered"
- In 1984, Fitts's Law was used "for real" for the first time (I think) in a graphical user interface (GUI) with the Apple Macintosh, whose menu bar always sits along the top edge of the screen and is therefore very easy to target
- In 2003, Apple introduced Exposé with Active Screen Corners, making effective use of these easy-to-hit areas to trigger functions on your computer
- In 2009, Windows 7 was introduced, using the edges for arranging your windows quickly: smash a window against the top edge to maximise it, against the left edge to fill half of your screen, et cetera.

That means that, 55 years after Fitts developed his model of human movement, we're still finding new ways to implement it the right way. That's a long time - about two human generations. And it makes me look forward to the ways we'll find, over the next 30 years, to operate touch devices :-)...

2011/02/25

About User Interfaces

Last week I did some research for a tiny project I'm doing at the University of Amsterdam.

This project is about the development of User Interfaces through the years. For example, back in the olden days your monitor was... well... just a speaker! Imagine a processor ticking at 8 kHz, with the checksum of the "input bits" being redirected to a speaker. It turns out you get a strange kind of melody. If you were to watch a stream of dots passing by at that speed, you wouldn't notice an unevenness in the pattern. But when one bit is missing, the pitch changes dramatically - so monitoring was done by ear. Dr. Gerard Alberts has posted some videos of these early computers, but I can't find them right now (so I'll post 'em later).

Anyhow, while looking for the history of User Interfaces I stumbled across the book "The Humane Interface" by Jef Raskin. I ordered it at the university library and read it last week. It's a really good read, and it got me thinking about how User Interfaces are supposed to work. Raskin explores User Interfaces from the viewpoint of humans and human shortcomings (very interesting!). Some human shortcomings (such as the time we need to switch from one task to another) can be taken into account by UIs. Read the wiki page, and if you're not convinced, read the book :-). Some of the points Raskin makes are (points are from Wikipedia, explanations are mine):
  • Modelessness. Computers shouldn't have different "modes" in which they react differently - for example, ctrl-c copying text in one context and terminating a program in another means the same action triggers different behavior depending on the mode.
  • Monotony of design. System/UI designers should figure out the best way to do something and implement only that one - it confuses users when dozens of options are offered for one task.
  • Every action must be undoable. Programmers have long known this and call it version control ;-). The effect, however, is that save buttons should become obsolete. A computer should never discard your work. This also diminishes the need for dialog boxes (see the sketch after this list).
  • Elimination of warning screens. People tend to automate tasks, almost making it a reflex to click away a dialog box.
  • Universal use of text. Icons are okay, but they should be accompanied by text so it's obvious what they do.
  • When you stop using your computer, you shouldn't have to wait for booting, but continue with the task where you left off.
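To illustrate what "every action must be undoable" asks of a program, here's a minimal, hypothetical sketch of the classic command-pattern undo stack in Python. It's not Raskin's design or anyone's real editor - it just shows the mechanism: every edit knows how to reverse itself, so nothing is ever lost and no "are you sure?" dialog is needed.

    class InsertText:
        # One undoable edit: it knows how to apply and how to revert itself.
        def __init__(self, position, text):
            self.position, self.text = position, text

        def apply(self, doc):
            return doc[:self.position] + self.text + doc[self.position:]

        def revert(self, doc):
            return doc[:self.position] + doc[self.position + len(self.text):]

    class Editor:
        def __init__(self):
            self.document = ""
            self.undo_stack = []

        def do(self, command):
            self.document = command.apply(self.document)
            self.undo_stack.append(command)

        def undo(self):
            if self.undo_stack:  # undoing with an empty stack is a no-op
                self.document = self.undo_stack.pop().revert(self.document)

    editor = Editor()
    editor.do(InsertText(0, "Hello"))
    editor.do(InsertText(5, ", world"))
    editor.undo()
    print(editor.document)  # -> "Hello"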
The undo functionality especially struck me. Of course, there will be operations where it's necessary (because of legal requirements) to save things explicitly. But still, a computer should simply never, never discard my work! When the lights go out and I turn my computer back on, my work should still be there.

This was 2000. Eleven years ago - nearly two computer lifespans (remember, in 2000 you were still playing around with Windows ME, if you were on Microsoft. The first stable Microsoft consumer OS - Windows XP - wasn't even finished at that moment!).

What you should know, however, is that Jef Raskin designed the original Macintosh interface. Not all of his ideas were implemented, though. Today the new specifications of Mac OS X 10.7 (Lion) were made public. And guess what? Three in a row:

  • Auto Save - "Say good-bye to manual saving. Auto Save in Mac OS X Lion automatically saves your work - while you work - so you don’t have to."
  • Versions - Versions records the evolution of a document as you create it. Mac OS X Lion automatically creates a version of the document each time you open it and every hour while you’re working on it. If you need to revert to an older version or retrieve part of a document, Versions shows you the current document next to a cascade of previous versions (...) so you can see how your work looked at any given time. 
  • Resume - If you’ve ever restarted your Mac, you know what’s involved. First you save your work, then close all your apps, then spend valuable time setting everything up again. With Resume, that time-consuming process is a thing of the past. Resume lets you restart your Mac — after a software update, for example — and return to what you were doing. With all your apps back in the exact places you left them. In fact, whenever you quit and relaunch an app, Resume opens it precisely the way you left it. So you never have to start from scratch again.
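Lion's actual implementation is Apple's own, of course, but purely as an illustration of the Auto Save / Versions idea, here's a toy Python sketch that snapshots a document into a timestamped copy on every save, so any earlier state can be brought back. Everything in it (file names, the versions directory) is made up for the example.

    import shutil, time
    from pathlib import Path

    def save_with_version(document, path, versions_dir="versions"):
        # Write the document, but first snapshot the previous state into a
        # timestamped copy - so a "save" can never destroy earlier work.
        path, versions = Path(path), Path(versions_dir)
        versions.mkdir(exist_ok=True)
        if path.exists():
            stamp = time.strftime("%Y%m%d-%H%M%S")
            shutil.copy2(path, versions / f"{stamp}-{path.name}")
        path.write_text(document)

    save_with_version("first draft", "essay.txt")
    time.sleep(1)  # ensure a distinct timestamp for the demo
    save_with_version("second draft", "essay.txt")
    print(sorted(p.name for p in Path("versions").iterdir()))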
Amazing - ideas from eleven years ago (and technically no problem for over a decade) still inspiring today's computer makers. Of course, Apple will claim this as their invention. We know it isn't, but still, they're the first to include these features in a consumer-grade OS - which makes it (for me at least) worth considering when buying my next computer.

Next time I'll dig into some more UI stuff with Fitts's Law (even older than Raskin).

2011/02/24

Twin-platform development with Silverlight for Desktop and WP7

It's been a while since my last post. I'm hoping to post more in the coming weeks, and eventually become a really active blogger. Just kidding :-).

From now on, I'll use this blog to post about my research project (my bachelor's final project). The project will be about developing an application for different devices, with the focus on the Windows desktop and Windows Phone. Pretty new territory for me as a Linux user, but I'm curious about the stuff Microsoft has to offer! (And for that sentence I will be dragged before the Open Source Inquisition.)

At the end of the research, I will produce a whitepaper containing advice, best practices, and pitfalls for developing for two different devices. In addition, I will deliver a proof-of-concept application that I will develop myself. My research will be largely based on literature (scientific papers and articles), community content (weblogs, general articles), and documented experience.

The next couple of posts will be about my explorations of the Microsoft universe; if you're interested in my quest and findings, stay tuned!

2010/10/12

Windows Phone 7

I'm not fully convinced by the Windows Phone 7 ideas, but they have some cool videos that really won me over - one called "Really?" and another called "Season of the Witch".

The cool thing about the concept is that Windows Phone is not another mobile OS that immerses you in its experience. As Steve Ballmer stated: "You get in, out, and back to life." - so the videos make fun of everyone, always and everywhere, staring at their mobile phones. And they've got a point there. Maybe we have no need for another "beautiful phone" that uses even more of your time. Not another phone that you just "keep using" because it looks so beautiful and works so well. Maybe it's time for phones that use less of your time, that don't get in the way of communicating with others, but still let you do the things you want to do.

Or maybe it's time for users who can handle priorities. Maybe it's not the phone that needs to change. Maybe it's the people and their behaviour, which a different type of mobile phone won't change. Even then, the videos are still funny.

To finish: one quote from the developer videos (announcing the most recent tools available): "Go get 'em. Be inspired, be awesome, stay nerdy" :-).

Android 2.2 - update

Ok, so even Android 2.2 isn't a silver bullet ;-).

After using it for a while, I had to turn off JIT compiling, because it made my phone... well... run slower. The only reason I can think of is a lack of internal memory. Native code is simply larger - and thus uses more memory - than bytecode, and my two-year-old G1 - the first Android handset - doesn't have that much RAM (around 74 MB; compare that to the myTouch and iPhone 4 with 512 MB).

Still, it's an amazing system. A state-of-the-art smartphone OS running quite smoothly on rather old hardware. Thumbs up for Cyanogen (and a $10 donation).

2010/08/15

Android 2.2 - Android grown mature :)

At first, I could hardly believe it - on my two-year-old (!) G1, I installed a new Android OS and it actually ran faster than any version before! After digging in, I could hardly believe another thing - that Google managed to go on for three years with interpreted apps that knew no compiling... But finally, Android seems to have matured.

Highlights for the moment (I've been running it for only two hours now):
- JIT: speeeeeedy :)
- Gallery: integrates with Picasa, and the "stack peek" we know from the iPad is implemented in a really nice way. Ever been looking at a photo and, after a while, realised that it was floating in 3D, following the direction of your hand? It's a WOW. Two thumbs up for Android (ok, and for Cyanogen, who makes this available for my old G1).

2009/04/29

Defaultism

Three decades ago, a software company named Microsoft successfully started to use a new marketing device: the OEM license. With an OS pre-installed on your newly bought PC, users no longer thought about which OS to use - the default, pre-installed OS did indeed meet most requirements, so why would you ever take the effort to look further?

Much has changed since then, but one thing has not: users still mainly use the software bundled with the computer or OS, until it lacks important features or they already have experience with a piece of software that isn't present by default. Of course, I am guilty of this behavior as well: back in the days when I was still using Windows, I started using a Linux distro because I was very sure it was better. But I was disappointed: as an inexperienced user, I found that not all peripherals worked right out of the box, and messing with configuration files in Vi is quite difficult if all you've ever worked with is Dreamweaver. I didn't start using Linux until I got a distribution that worked smoothly out of the box with default settings (or with little and very easy configuration). This behavior I call defaultism.

Most Linux users don't act that way. At least, they think they don't, because they install an OS that's not bundled with the computer by default. After some playing around with desktop environments they start looking around and discover the shell. More and more, they become aware of its powerful possibilities. But this is where it ends for most people.
For most Linux distributions, Bash (the GNU Bourne-Again Shell) is the default. And it's a pretty good one. It's powerful, relatively easy to use, and there is a lot of documentation about it on the internet - and hey, it's the default! Sometimes users try other shells like the C shell, the Korn shell, TCSH or ZSH. By default, however, those shells aren't configured the way Bash is. So users experience them as "difficult", say they have a "lack of possibilities" or aren't "accessible" enough. All because a decent configuration file is absent.

Over the last few months I've become enthusiastic about a particular shell - ZSH. But not before I got a useful .zshrc file - the ZSH config file you store in your home directory to make the shell useful. ZSH is much faster than Bash, extremely configurable, and available on nearly every Unix box. Including Mac. Including BSD. Including every Linux distro I've known so far.

Right now I notice I'm falling back into defaultism again - all I do is make some minor tweaks to existing zsh config files to get things working my way. I don't dive deep into the zshrc file structure to fine-tune it to my wishes. Again, I'm sticking with default settings.

Same story with Vim: I discovered Vim, got used to the basics, and became very excited about its ease and efficiency of use. Still, I didn't dive into the configuration to make it even more useful. I explored only a few of its possibilities - although even those few are very useful, and even now Vim outperforms any other editor I've ever used. But still I stick to defaults.

"Never change a winning team" is well-known, but applied somewhat too easy. I think as computer users, sometimes we don't know whether we're winning or not.

In many cases, admittedly, we do know. As a beginning C programmer, I knew I wasn't winning when I didn't use Makefiles. As a decent company, you know you're not winning when you don't have a website (besides, are you decent at all without a website?). In those cases, as soon as you get the opportunity, you leap up and make sure the winning gear is in house - before you lose even more.
Sometimes, however, we don't know whether we're winning or not. I never knew I wasn't winning when I used Linux without the shell - until I experienced the advantages. I never knew I wasn't winning while sticking to Eclipse or gedit - until I discovered the advantages of Vim. And I know I'll be winning once I set my applications up to my preferences.

Undoubtedly there are improvements out there still unknown to me - every time I choose the non-defaultist's way, I find some. So I've got to keep my eyes open for the non-default way to keep winning.

(P.S. Wanna get excited about ZSH too?
ZSH: the last shell you'll ever need (Fried CPU)
ZSH for productivity (Prashblog)
Phil!'s ZSH prompt
ZSH description from ArchLinux wiki)