Thoughts, links, and essays from Peter Merholz

I hate passwords, too. Posted on 04/03/2002.

Bonnie Nardi, a design anthropologist, speaks out on what collaboration software can and ought to support. I first saw Bonnie at IA 2000, where she gave a great talk on ecological design, ideas she had earlier written down in her important work, Information Ecologies: Using Technology with Heart.

One of the things that seems so obvious, but clearly isn't given how computers behave, is her point that things like passwords and firewalls are deeply obstructive. IT types have convinced us that we NEED these things, and security wonks have sold us on the THREATS out there. But much of what these measures actually do is block the free flow of ideas, or prevent people from using their own systems ("Which password did I use here, again?"), and I believe the degree of security folks are forced to place on their own systems is far too draconian.

10 comments so far.



My position is that usability and security are two concepts naturally at odds with each other.

- Security's purpose is to keep people out, to put up barriers.

- Usability's goal is to remove barriers, to facilitate people getting to things easily.

A software architect friend and I had engaged in lengthy debates about this. He claimed that security wasn't that much at odds with usability. He put faith in things like "single sign-on" and directory services, yadda yadda yadda.

At least that was his position until he had to work on a large project with "single sign-on" types of goals. He ran into technical glitches, caused by vendor product design decisions, while trying to make the security framework genuinely usable. One day he made a point of telling me I was right: security systems in and of themselves hamper usability and provide no value of their own. They are just necessary evils that help limit risk. They do nothing to help deliver the core functionality of a system.
Posted by Lyle Kantrovich @ 04/03/2002 06:57 PM PST [link to this comment]

It's good to see Peter talkin' libertarian.

But there's no reason for usability and security to be at odds with each other. This is shaping up as a classic debate between user-centrists, who make things easy for people, and machine-centrists, who make things easy for computers and the folks who program em.

But these days, it's usually feasible to do both.

The "single sign-on" project Lyle's friend worked on seemed like it was doomed from the beginning, but I've personally worked on projects where ALL the security features were invisible and worked fine. In that case, as in many cases, security was desired by the provider of a service, not the user. To prevent abuse of the system. And the only people who were affected by it were those who wanted to hack.

Users don't want security that often. Like Peter says, it's often forced on them by the services -- how many of us actually care if someone else uses our New York Times account? (You don't even need an account, when hackers conveniently set up known username-password combinations on such sites for all to use.)

Users should only have to be concerned with security when they themselves care. There are plenty of usable interfaces to weighty security systems. The PGP module plugs into Eudora very nicely. Ebay lets you choose what functionality to protect.

The main point is that security, like most other functionality, should sit quietly in the background until someone wants it, and then it should present itself for easy use within the appropriate context. As a piece of a system, it's as valid as any other piece. When it's used in a way that violates the above, you could chalk it up to bad design -- but as with so many other "problems", it's more often a purposeful move driven by financial intent.
Posted by Travis Wilson @ 04/03/2002 10:42 PM PST [link to this comment]
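Travis's point, that security should lie dormant until a sensitive action is invoked and then surface in context, can be sketched as a wrapper in Python. Everything here (the Session class, the hard-coded password, the function names) is invented for illustration, not anyone's real system:

```python
import functools

class Session:
    """Tracks whether the user has elevated this session."""
    def __init__(self):
        self.unlocked = False

    def unlock(self, password, expected="hunter2"):
        # A real system would verify against a credential store,
        # not a default argument.
        self.unlocked = (password == expected)
        return self.unlocked

def protected(func):
    """Surface security only when a sensitive action is invoked."""
    @functools.wraps(func)
    def wrapper(session, *args, **kwargs):
        if not session.unlocked:
            raise PermissionError(func.__name__ + " requires unlocking first")
        return func(session, *args, **kwargs)
    return wrapper

def browse(session):
    # Everyday use: no security friction at all.
    return "listing items"

@protected
def change_payment_details(session):
    # Sensitive action: security appears only here, in context.
    return "payment details updated"
```

Here `browse()` carries no security apparatus at all, while `change_payment_details()` refuses to run until the session is unlocked, so the user only meets the barrier at the moment it is actually relevant.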

I disagree strongly with the attitude that security is incompatible with usability; and even more with the idea that security is unimportant of itself (but that's another rant).

There's a really interesting study on the subject here:

One of the most interesting things it points out is that we need more effective models for security, as at the moment they're difficult to reflect in software.

See also the thread on this mailing list (scr) - the topic to look at is 'Internet Explorer: Danger in Numbers' (discussion starts at the top of the archive page).
Posted by Celia Romaniuk @ 04/03/2002 11:42 PM PST [link to this comment]

i hold up my "agree" flag in Celia's direction...

from Neal Stephenson's homepage

"Basically I think that security measures of a purely technological nature, such as guns and crypto, are of real value, but that the great bulk of our security, at least in modern industrialized nations, derives from intangible factors having to do with the social fabric, which are poorly understood by just about everyone. If that is true, then those who wish to use the Internet as a tool for enhancing security, freedom, and other good things might wish to turn their efforts away from purely technical fixes and try to develop some understanding of just what the social fabric is, how it works, and how the Internet could enhance it. However this may conflict with the (absolutely reasonable and understandable) desire for privacy."

People who deal with people's experience of tech in context (UI/IA/UX... whateverdude), speshly those with an "information ecology" bent, would seem well placed to engage in "efforts away from purely technical fixes", providing real people with real digital security and privacy rather than "security and privacy products".

blimey, neal stephenson's good isn't he.
Posted by matt @ 04/04/2002 03:29 PM PST [link to this comment]

This idea of "studying the social fabric" to understand how to enhance it in online environments is something I've been thinking about a lot lately. Mostly, I've been thinkin' about Trust. In fact, I'm writing a paper comparing notions of trust from research in the social sciences to trust and "trust management" from the computer security and cryptography perspective.

Some key points from the forthcoming paper:
- Most "trust management" systems are no more than complex systems for authentication.
- Most security and cryptographic systems that talk about "trust" are actually talking about a very small aspect of social trust (e.g. reputation management).
- There have been a few attempts to formalize Trust as a computational concept but they typically fall very short of capturing the important nuances of social Trust. (Not so surprising, really.)
- In the end, formalizing trust is likely impossible. The key will be providing tools that focus on specific aspects of Trust and help people manage their relationships with people and companies.

If anyone's interested, email me and I'll let ya know when it's done.
Posted by tpodd @ 04/05/2002 01:02 PM PST [link to this comment]
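tpodd's first two bullets can be made concrete with a toy contrast (all names and data below are invented): authentication answers a yes/no identity question, while a "reputation" score reduces social trust to a single number drawn from past feedback, capturing only a narrow slice of what trust means between people.

```python
# Hypothetical sketch: "trust management" as mere authentication,
# versus a reputation score in the eBay mold.

CREDENTIALS = {"alice": "s3cret"}        # identity database
FEEDBACK = {"alice": [1, 1, 1, -1, 1]}   # +1/-1 ratings from past dealings

def authenticate(user, password):
    """Binary gatekeeping: are you who you claim to be?"""
    return CREDENTIALS.get(user) == password

def reputation(user):
    """A 0..1 score from feedback -- a proxy for trust, not trust itself."""
    ratings = FEEDBACK.get(user, [])
    if not ratings:
        return 0.0
    return ratings.count(1) / len(ratings)
```

Neither function says anything about whether you'd lend alice your car; that gap between a computable score and social trust is exactly the shortfall the comment describes.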

You guys sound just like my technical analyst friend before he saw the light. Here's a simple test: hold a deadbolt lock in your hand and then tell me what value you can get from it *in and of itself* -- you get nothing (okay, you can use it as a paperweight, but...).

When you put the lock on a house, it does provide some security, BUT security is just a necessary evil. The homeowner will get no more value from the house because the lock is there.

As a necessary evil, it's also important to note that security often keeps out or hampers the ability of the right people to get value from something. Ever locked your keys in your car? Forgot your password?

A gun is a tool -- I can hunt up some dinner with it. Crypto? It just clouds everyone else's eyes, but does nothing for me personally except add more complexity to using something.

Ideally security should be transparent to the user and work flawlessly to keep bad people out, but show me a world where that happens. It's only in your dreams.
Posted by Lyle Kantrovich @ 04/05/2002 04:31 PM PST [link to this comment]


You sound like a designer friend of mine before his server got hacked.

The major problem with security is that, in order for it to work, you have to understand it. You can't just expect it to happen automagically -- Microsoft have tried to implement that sort of thinking, and their products are riddled with holes that compromise your machine and your data.

The 'Why Johnny can't encrypt' paper is interesting and I agree with its conclusions that we need to change the way people think about security. There are lots of ideas about this but this text entry margin is too small to contain them.

What I do find interesting is that security designers (because you *do* have to design security, and it is an art as well as a science) have a lot in common with IAs (and copywriters and ...) -- both argue that they need to be involved in the design process from the beginning, because their input is not something that can be bolted on at the end.

I'll leave you with a question -- if security is so useless, why do Systems Administrators put so much stock in it and make their own lives difficult by locking down the machines under their control? For fun?

Actually, I'll leave you with a link -
"How much security is secure enough?"
Posted by simon @ 04/08/2002 03:27 AM PST [link to this comment]
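simon's closing question gestures at a deny-by-default posture: sysadmins lock machines down so that nothing is permitted unless someone explicitly grants it. A minimal sketch of that stance, with invented names:

```python
# Hypothetical "secure by default" sketch: every capability is denied
# until an administrator explicitly grants it, rather than the machine
# shipping open and getting locked down later.

class Machine:
    def __init__(self):
        self.granted = set()   # empty set: deny everything by default

    def grant(self, capability):
        """An admin deliberately opens one capability."""
        self.granted.add(capability)

    def allowed(self, capability):
        return capability in self.granted
```

The design choice is that forgetting to configure something fails closed (access denied) rather than open, which is the inconvenience-for-safety trade the comment describes.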

I held Hoover Dam in my hand and didn't get any value from it in and of itself. It was just heavy. Good thing they put it next to a large force of water, so it could serve its purpose by countering the force, and add value to an agricultural system.

Another example of something that, when well-built, doesn't act until acted against.
Posted by Trav @ 04/08/2002 10:57 AM PST [link to this comment]

Yes, it's quite astonishing to hear presumably technologically-sophisticated folks argue that security is not a value-add. Granted there is some poorly implemented and poorly practiced security out there, but to hear the concept itself attacked is quite surprising to me (and I've been around the usability arena a bit, published at CHI, etc... I've also played in the security field a little bit, just to pre-empt the cred questions).

I'm curious whether those who think that security is damaging to the social fabric also believe that privacy is damaging to the social fabric? While security is occasionally at odds with privacy, data and personal information can't really be protected without good security in place.

Simon is exactly right when he talks about how security and usability both need to be considered from the beginning of any system design to be effective.

Anyway, just some offhand reactions. Oh, and check out Cybersecurity Today and Tomorrow: Pay Now or Pay Later for a good and relatively brief overview of security concepts, risks, threats, and principles.
Posted by Medley @ 04/12/2002 01:01 PM PST [link to this comment]

Oops. Messed up the link (*sigh*). The report is here.
Posted by Medley @ 04/12/2002 02:49 PM PST [link to this comment]


All contents are © 1998 - 2002 Peter Merholz.