You (or your organization, mission, family, etc.) pass the “onion test” for honesty if each layer hides, but does not mislead about, the information hidden within.

When people get to know you better, or rise higher in your organization, they may find out new things, but they should not be shocked by the types of information that were hidden. If they are, you messed up in designing the outer layers to appropriately describe the kind of thing that might be inside.

Examples

Positive Example: 

Outer layer says "I usually treat my health information as private."

Next layer in says: "Here are the specific health problems I have: Gout, diabetes."
 

Negative example:

Outer layer says: "I usually treat my health info as private."

Next layer in: "I operate a cocaine dealership.  Sorry I didn't warn you that I was also private about my illegal activities."
 

Negative example:

Outer layer says: "Is it ok if I take notes on our conversation?"

Next layer in: “Here’s the group chat where I mocked each point you made to 12 people, some of whom know you.”

 

Positive Example: 

Outer layer says "Is it ok if I take notes on our conversation?  Also, I’d like to share my unfiltered thoughts about it with some colleagues later."

Next layer in says: "Jake thinks the new emphasis on wood-built buildings won’t last. Seems overconfident."

------------------------------------------------------------------------------------------------

Passing the test is a function both of what you conveyed (explicitly and implicitly) and of others’ expectations. If mocking group chats are the norm, then no warning is needed to avoid shock and surprise. The illusion of transparency can bite here.

Andrew:

Social friction minimization is the default trend that shapes the outer layers of a person or institution, by eroding away the bits of information that might cause offence, leaving layers of more pungent information underneath.  The “onion model” of honesty or integrity is that each layer of your personality or institution should hide but not mislead about the layer underneath it.   This usually involves each layer sharing something about the kinds of information that are in the next layer in, like “I generally keep my health information private”, so people won’t assume that a lack of info about your health means you’re doing just fine health-wise.

It takes a bit of work to put sign-posts on your outer layer about what kinds of information are inside, and it takes more work to present those sign-posts in a socially smooth way that doesn’t raise unnecessary fears or alarms.  However, if you put in that work, you can safely get to know people without them starting to wonder, “What else is this person or institution hiding from me?”  And, if everyone puts in that work, society in general becomes more trustworthy and navigable.

I started using the onion model in 2008, and since then, I’ve never told a lie.  It’s surprisingly workable once you get the hang of it.  Some people think privacy is worse than lies, but I believe the opposite is true, and I think it's worth putting in the effort to quit lying entirely if you’re up to the challenge.  Going a bit further, you can add an outer layer of communications that basically tells people what kinds of things you’re keeping private, so not only have you not lied, you’ve also avoided misleading them.  That’s the whole onion model.

Chana:

I have found this model extremely useful in the last few months of talking about organizational strategy, as a way of carving between “not everyone gets to know everything” and “actively pointing people in the wrong direction about what’s true lacks integrity,” and of avoiding “I didn’t lie, but I knowingly misled.”

So far I have thought about it as a backwards-reflecting device - what on the inside would people be shocked to find out, and how can I make sure they are not shocked - rather than as forward-thinking signposting of all the things I might want to signpost, but I could imagine that changing. (I.e., right now I’m taking this as a useful quick tool, rather than a full orientation to honesty as Andrew does, but that could definitely change.)

In general, over the last few years I have shifted pretty far towards “transparency, honesty, earnestness are extremely powerful and fix a lot of things that can otherwise go wrong.”

On a different note, for me, virtue ethics is attractive, but not real, and tests for integrity are important and useful pointers at things that frequently go wrong and can go better, rather than referenda on your soul. I would guess there are situations in which glomarizing is insufficient, and focusing too much on integrity will reveal the existence of secrets you have no interest in revealing, at least if you are not massively skilled at it.

[Some small edits made, including to the title, for clarification purposes]


I think the first negative example is not particularly good. The outer layer is not related to the inner layer. People have a general expectation that others will be private about any illegal activities. Operating a cocaine dealership is negative, but that's really a completely separate concern from social issues of transparency and trust.

A possibly better negative example here might be 'I have an STD and don't inform sex partners about it'.

 

This is helpful. My entire career revolves around "conceal, but don't mislead" and even I'm still learning where lines are. Thank you for this post.

I'm curious how this applies to infohazards specifically. Without actually spilling any infohazards, could you comment on how one could do a good job applying this model in such a situation?

Perhaps "I won't tell you things I think will be negative for the world to be more public" or "by default, I won't tell you things I think will make you worse off"

Curious about disagreement votes if people want to air them, but no pressure.

Edit: I want to highlight that I do appreciate the compassion that I think is part of the model in your post, and I don't mean this comment as a personal attack but rather a very specific criticism.

Huh, I'm not sure why I didn't voice my disagreement initially. My vote was because the phrases you suggested come across as arrogant and patronising, in my opinion.

I think it's sometimes obvious that you'd not tell someone you care about something that would hurt them; and at other times whether you should tell them or not is something that needs to be established explicitly according to their preferences. If it's not your private information, it should be an extremely rare occasion anyway.

In either case, you’re also hiding the information without giving any indication of what they can expect it to be like, which perhaps contradicts your model.

I admit my disagreement partially has to do with rejecting the concept of infohazards, which I find arrogant and patronising in general.

Really interesting!

I get the impression that you do organizational consulting. I have been in various business environments where I watched organizational consultants work from my perspective as an employee. 

I am curious how your approach and ethics let you handle:

  • emperor wears no clothes organizational problems: everyone seems to think some X is really great, but X is a fiction and only you see that.
  • elephant in the room communication situations: there's something everyone knows about, fears, and won't talk about and it's the problem that needs handling. 
  • covert consulting needs: you’re consulting, but the problems are so obviously rooted in leadership or the organization itself that you must either leave or pursue organizational change covertly, regardless of what management identified as the problems to fix.

These situations were a test of consultant integrity, from what I saw, but they also show up in everyday life, where fictions, secrets, or politics conflict with desire for integrity.