
For a personal decision, I'd like to know if a person's expected impact is roughly proportional to their hours worked (keeping output per hour fixed). Suppose the decision would make you work x% fewer hours on useful things but keep your performance in the remaining hours the same - you won't be more rested. The x% just goes into work that's not helpful for career capital or impact. In other words, you're x% less productive. Does that mean you have roughly x% less expected impact?

Discussion

One reason your expected impact may decrease by >x% is that personal impact is (supposedly) heavy-tail distributed across people. To be in the heavy tail you'd need to be roughly at your highest productivity. So being x% less productive could reduce your expected impact by well over x%. 

More than x% impact loss seems intuitive when you consider large x. Say you reduce your work time by 70%, and keep your productivity in the remaining 30% fixed. This seems to almost completely kill your chances of becoming a heavy-tail top performer in your field as you won't be able to invest in yourself enough to stay competitive.[1][2]

On the other hand, your impact depends on factors other than the quantity of your work: talent and luck. In fact, talent and luck may be the main reason why impact seems heavy-tailed. This view suggests that, if you work 20% less, your chances of being in the heavy tail don't change much, and your expected impact decreases only by ca. 20%.

Edit: The answer seems to depend on the career path. In this case it's academic research or startup founder.


Answers

The law of logarithmic utility has also been applied to research funding[74]: a simple rule of thumb is that a dollar is worth 1/X times as much if you are X times richer, so doubling someone's income is worth the same amount no matter where they start.[75] Past the point of increasing returns to scale, the next dollar donated at, say, the $500k funding mark might have 10x as much impact as a dollar donated after the $5m mark.

A useful first approximation might be that hours worked behave similarly: past the point of increasing returns to scale, the next hour worked at the 10h/week mark might have 10x as much impact as the hour worked after the 100h/week mark (an hour might be worth 1/X times as much if you work X times more). More realistically, if you work a 40h week vs. an 80h week, the hours leading up to 80h/week are only ~half as valuable (but I definitely think the 1st hour of the day is often 10x more valuable than the 10th).
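As a rough illustration of that 1/X rule of thumb (a minimal sketch in arbitrary units, assuming the marginal value of the X-th weekly hour is proportional to 1/X past some point):

```python
import math

def marginal_value(hour):
    """Value of the X-th weekly hour under the 1/X rule of thumb (arbitrary units).
    Only meant to apply past the point of increasing returns to scale."""
    return 1.0 / hour

# Hour 10 vs. hour 100 of the week: ~10x more valuable.
print(marginal_value(10) / marginal_value(100))  # 10.0

# Hour 40 vs. hour 80 of the week: ~2x more valuable.
print(marginal_value(40) / marginal_value(80))   # 2.0

# Crude cumulative comparison (ignores that the rule shouldn't apply to the first few hours):
second_40 = sum(marginal_value(h) for h in range(41, 81))
first_40 = sum(marginal_value(h) for h in range(1, 41))
print(second_40 / first_40)  # ~0.16: the second 40h/week add far less than the first 40h/week
```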

CS professor Cal Newport says that if you can do Deep Work™ for 4h/day, you're hitting the mental speed limit, the amount of concentration your brain is actually able to give. Poincaré could only work 4 hours a day.

This suggests that it'd be better to work 5h/day for 7 days a week rather than 7h/day for 5 days, and, all else equal, to hire more researchers at lower pay rather than fewer at higher pay.

Ideally, you'd do admin / research management in the afternoons. But then I feel like long days are also sometimes useful in research, because it takes some time to 'upload' the current research project into your mind in the morning, and you'd have to reboot it the next day. I remember someone very productive saying, and I can confirm from personal experience, that you can partially 'reset' the buildup of adenosine with a 1.5h nap (one full sleep cycle) after working the morning, and then continue working 'another morning' in the afternoon.

It's important to keep in mind that you always want to prevent burnout by keeping work efficiency high (= total work time / time in office). The section "Work All the Time You Work" in Eat That Frog says that you don't want to spend your intended work time not working, such that you then have to spend your intended leisure time working.

But yes, this is all different in winner-takes-all markets.

[anonymous]:

CS professor Cal Newport says that if you can do Deep Work™ for 4h/day, you're hitting the mental speed limit

and:

the next hour worked at the 10h/week mark might have 10x as much impact as the hour worked after the 100h/week mark

Thanks Hauke, that's helpful. Yes, the above would be mainly because you run out of steam at 100h/week. I want to clarify that I assume this effect doesn't exist. I'm not talking about working 20% less and then relaxing. The 20% of time lost would also go into work, but that work has no benefit for career capital or impact.

Hauke Hillebrandt:
Yes - I think running out of steam does some of the work here, but assuming that you prioritize the most productive tasks first, my sense is this should still hold.
[anonymous]:
It seems to depend on your job. E.g. in academia there's a practically endless stream of high-priority research to do, since each field is way too big for one person to solve. Doing more work generates more ideas, which generate more work.
[anonymous]:
Another framing on this: As an academic, if I magically worked more productive hours this month, I could just do the high-priority research I otherwise would've done next week/month/year, so I wouldn't do lower-priority work. 

Startup founder success is sometimes winner-take-all (Facebook valued at hundreds of billions of dollars, Myspace at ~$0).

If that's true in your market, then the question reduces to how likely that additional 20% is to make you better than your competitor. My guess is that you will be competing against people who are ~equally talented and working at 100%, so the final 20% of your work effort is relatively likely to push you into being more productive than them (meaning that ~100% of the value is lost by you cutting your work hours 20%).

I assume this is less true in academia.

I'd guess that quite often you'd either win anyway or lose anyway, and that the 20% doesn't make the difference. There are so many factors that matter for startup founder success (talent, hard work, network, credentials, luck) that it would be surprising if the competition were often so close that a 20% reduction in working time changes things.

Another way to put this: it seems likely that Facebook would still be worth hundreds of billions of dollars, and Myspace ~$0, had the Facebook founders worked 20% less.

I don't have a good object-level answer, but maybe thinking through this model can be helpful.

Big picture description: We think that a person's impact is heavy-tailed. Suppose that the distribution of a person's impact is determined by some concave function of hours worked. We want working more hours to increase the mean of the impact distribution, and probably also the variance, given that this distribution is heavy-tailed. But we plausibly want additional hours to affect the distribution less and less, if we're prioritising perfectly (as Lukas suggests) -- that's what concavity gives us. If talent and luck play important roles in determining impact, then this function will be (close to) flat, so that additional hours don't change the distribution much. If talent is important, then the distributions for different people might be quite different, and signals about how talented a person is are informative about what their distribution looks like.

This defines a person's expected impact in terms of hours worked. We can then see whether this function is linear or concave or convex etc., which will answer your question.

More concretely: suppose that a person's impact is lognormally distributed with parameters $\mu$ and $\sigma$, that $\mu$ is an increasing, concave function of hours worked, $h$, and that $\sigma$ is fixed. I chose this formulation because it's simple but still enlightening, and has some important features: expected impact, $\exp(\mu + \sigma^2/2)$, is increasing in hours worked and the variance is also increasing in hours worked. I'm leaving $\sigma$ fixed for simplicity. Suppose also that $\mu(h) = \log h$, which then implies that expected impact is $h \cdot \exp(\sigma^2/2)$, i.e. expected impact is linear in hours worked.
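A minimal simulation sketch of this linearity claim (assuming the model above with $\mu(h) = \log h$ and an arbitrary fixed $\sigma = 1$; the specific numbers are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.0  # arbitrary fixed spread parameter, for illustration only

def simulated_mean_impact(hours, n_samples=1_000_000):
    """Monte Carlo estimate of E[impact] when impact ~ Lognormal(mu=log(hours), sigma)."""
    mu = np.log(hours)
    return rng.lognormal(mean=mu, sigma=sigma, size=n_samples).mean()

for h in [10, 20, 40, 80]:
    analytic = h * np.exp(sigma**2 / 2)  # exp(mu + sigma^2/2) with mu = log(h)
    print(f"{h:>2}h: simulated mean ~ {simulated_mean_impact(h):.1f}, analytic {analytic:.1f}")
```

Note that the $\exp(\sigma^2/2)$ factor cancels in any ratio of expected impacts, so with $\sigma$ fixed it's the shape of $\mu(h)$ that determines how impact scales with hours.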

Obviously, this probably doesn't describe reality very well, but we can ask what changes if we change the underlying assumptions. For example, it seems pretty plausible that impact is heavier-tailed than lognormally distributed, which suggests, holding everything else equal, that expected impact is convex in hours worked, so you lose more than 20% impact by working 20% less.

Getting a good sense of what the function of hours worked (here $\mu(h)$) should look like is super hard in the abstract, but seems more doable in concrete cases like the one described above. Here, the median impact is $\exp(\mu(h)) = h$, if $\mu(h) = \log h$, so the median impact is linear in hours worked. This doesn't seem super plausible to me. I'd guess that the median impact is concave in hours worked, which would require $\mu$ to be more concave than $\log h$, which suggests, holding everything else equal, that expected impact is concave in hours worked. I'm not sure how this changes if you consider other distributions though -- it's a peculiarity of the lognormal distribution that the mean is linear in the median, if $\sigma$ is held fixed, so things could look quite different with other distributions (or if we tried to determine $\mu$ and $\sigma$ from $h$ jointly).
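And a minimal sketch of that concavity point (assuming, purely for illustration, $\mu(h) = a \log h$ with $a = 0.5$, i.e. a $\mu$ more concave than $\log h$):

```python
import math

sigma = 1.0
a = 0.5  # illustrative only: mu(h) = a*log(h) is more concave than log(h) when a < 1

def mean_impact(hours):
    """E[impact] = exp(mu + sigma^2/2) with mu = a*log(hours), i.e. hours**a * exp(sigma^2/2)."""
    return hours**a * math.exp(sigma**2 / 2)

print(mean_impact(80) / mean_impact(40))  # ~1.41 < 2: doubling hours less than doubles expected impact
print(mean_impact(32) / mean_impact(40))  # ~0.89: working 20% less costs ~11% of expected impact here
```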

Median impact being linear in hours worked seems unlikely globally -- like, if I halved my hours, I think I'd more than halve my median impact; if I doubled them, I don't think I would double my median impact (setting burnout concerns aside). But it seems more plausible that median impact could be close to linear over the margins you're talking about. So maybe this suggests that the model isn't too bad for median impact, and that if impact is heavier-tailed than lognormal, then expected impact is indeed convex in hours worked.

This doesn't directly answer your question very well but I think you could get a pretty good intuition for things by playing around with a few models like this.

After a little more thought, I think it might be helpful to think about/look into the relationship between the mean and median of heavy-tailed distributions and in particular, whether the mean is ever exponential in the median.

I think we probably have a better sense of the relationship between hours worked and the median than between hours worked and the mean because the median describes "typical" outcomes and means are super unintuitive and hard to reason about for very heavy-tailed distributions. In particular, arguments like those given by Hauke seem mo...

[anonymous]:

Thanks Aidan, I'll consider this model when doing any more thinking on this. 

This is a bit of a summary of what other people have said, and a bit of my own conceptualisation:

A) If the work is not competitive (not a winner-takes-all market), then:

  • For some jobs, marginal returns on quality-adjusted time invested will decrease, and you lose less than 20% of impact. This is true for jobs where some activities are clearly more valuable than others, so that you cut the less valuable ones.
  • For some jobs, marginal returns on quality-adjusted time invested will increase, and you lose more than 20% of impact. This could be e.g. because you have some maintenance activities that are fixed costs (like reading papers to stay up to date), or have increasing returns because you benefit from deep immersion.

 

B) If the work is competitive (a winner-takes-all market), either:

  • you are going to win anyway, in which case the same as above applies, or
  • you are going to lose anyway, in which case whether or not you spend 20% of your time on something else doesn't matter, or
  • working less is causing you to lose the competition, in which case you lose 100% of value.

 

Of course, this is nearly always gradual because the market is not literally winner-takes-all, just winner-takes-a-lot-more-than-second. For example, if you're working towards an academic faculty position, then maybe a position at a tier 1 uni is twice as impactful as one at a tier 2 uni, which is twice as impactful as one at a tier 3 uni, and so on (the tiers would be pretty small for the difference to be only 2x, though).
 

On average, the more "competitive" a job, and the closer the distance between you and the competition, the more value you lose from working 20% less.


Nearly every job has some degree of "competitiveness"/"winner-takes-all-market" going on, but for some jobs this degree is very small (e.g. employee at an EA org), and for others it's large (academia before you get a tenure-track position, for-profit startup founder).

 

For academic research, I'd guess that from looking at A) alone, you'd get roughly linear marginal returns, and how much B) matters depends on your career stage. It matters a lot before you get a tenure-track position (because the market is "winner-takes-much-more-than-second" and competition is likely close, since so many people compete for these positions). Once you have a tenure-track position, it depends on what you want to do. E.g., if you try to become the world leader in a popular field, then competition is high. If you want to research some niche EA topic well, then competition is low.

Comments

There's also an argument that impact diminishes by <20%: the hours you'll cut out first will be your least important hours (assuming you're prioritizing well). 

I think the main argument for >20% is that you might get increasing returns from deep immersion and mastery of a field (this is a version of the point you made about "making it in the heavy tail").

I think it depends on the type of work you're doing. If you work at an EA org and do very generalist tasks with a lot of prioritizing on the go (for example, some or all of the following: hiring, headhunting/recruiting, developing strategy docs, mentoring, etc.), I could imagine that you lose <20%.

By contrast, if you're a researcher doing cutting-edge work, you may benefit from deep immersion, so I'd expect you to lose >20%.

Also, if you're on a career path where getting promoted is important (for instance because you want to make it to an influential position in government or academia), you almost certainly lose >20% because of the inherent competitiveness of the career track. 

Another case where you lose >20% with 20% fewer hours: earning to give as a normal employee (not as an entrepreneur).

Salary is ~linear in hours worked. You can only donate the part of the salary above a certain baseline, because you need the rest for your living costs*. Let's say you can donate 40% of your salary if you work 40h/week. If you work 32h/week, you can only donate 20% of a full-time salary. That's 50% less impact for 20% fewer hours (a quick calculation is sketched after the caveats below).

Caveat 1: If you instead donate a fixed percentage of your salary, it doesn't work like this.
Caveat 2: I'm neglecting non-donation impact here.
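A quick sketch of the arithmetic above (assuming the illustrative numbers: a full-time salary normalised to 1.0, salary linear in hours, and fixed living costs of 0.6):

```python
def donatable_share(hours, full_time_hours=40, living_costs=0.6):
    """Donatable fraction of a full-time salary, assuming salary is linear in hours
    and living costs are a fixed 60% of the full-time salary."""
    salary = hours / full_time_hours  # full-time salary normalised to 1.0
    return max(salary - living_costs, 0.0)

full = donatable_share(40)  # 0.4
part = donatable_share(32)  # 0.2
print(f"{1 - part / full:.0%} less donated for 20% fewer hours")  # 50% less
```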

[anonymous]:

Thanks Lukas, that's helpful. Some thoughts on when you'd expect diminishing returns to work: probably this happens when you're in a job at a small org or department where you have a limited amount to do. On the other hand, a sign that there's lots to do would be if your job requires more than one person (with roughly the same skills as you).

In this case, the career is academia or startup founding.
