Getting above the information
Are zooming user interfaces (ZUIs) a future for human-machine interaction?
They don’t concretely map to anything we’ve got in the real world. Binoculars, maybe. Something here about vision and sliding focus -> see “Yoga for Your Eyes” (1999) video / book by Meir Schneider.
Personally, I think of them as more of an art fad thing than anything else.
Can’t imagine actually using them to find information: the UI still forces me to remember a hierarchical or list structure for the data I have and haven’t seen. It’s really a glorified tree, and we all know how well trees work for people who don’t speak geek. Not very.
Do we even need an information overview in the first place?
How we use a radio
- Do I want to listen to music or talk?
- What kind of music do I want to listen to?
- What stations play that kind of music?
We find a source we like, and we poke one of the little buttons on our dash to store it. Maybe we’re seeing a similar thing happen with RSS / Atom and news aggregators.
But if our information were classified like our music, we’d have the RIAA arguing over whether something is “rock” or “pop”. Maybe we need a social tagging system for music genres as well. Then Sixpence None the Richer could be tagged “gospel”, and Britney Spears could be tagged “bad”.
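The nice thing about tags over a single genre field is that nobody has to win the “rock vs. pop” argument: every listener can attach whatever label they like, and an artist can live under several at once. A minimal sketch of that idea (all names here, like tag_index and tagged_with, are made up for illustration):

```python
# A toy social-tagging index: tag -> set of artists.
# Multiple tags per artist, no single authority deciding the "real" genre.
from collections import defaultdict

tag_index = defaultdict(set)

def tag(artist, label):
    """Any user can attach any label to any artist."""
    tag_index[label].add(artist)

def tagged_with(label):
    """Look up everything carrying a given label."""
    return sorted(tag_index[label])

# Different listeners, different opinions -- both stick.
tag("Sixpence None the Richer", "pop")
tag("Sixpence None the Richer", "gospel")
tag("Some Other Band", "pop")

print(tagged_with("pop"))     # the same artist shows up under "pop"...
print(tagged_with("gospel"))  # ...and under "gospel" at the same time
```

Finding music then becomes a lookup by whatever label makes sense to you, rather than remembering where someone else filed it in a tree.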
Back to the not-quite-original question
What does getting a “bird’s-eye view” of our information gain us? Has anyone considered that maybe even that’s going to be too much?