
The following question was submitted to the EA librarian. Any views expressed in questions or the librarian’s response do not necessarily represent the Centre for Effective Altruism’s views.

We are sharing this question and response to give a better idea of what using the service is like and to benefit people who have similar questions. Questions and answers may have been edited slightly to improve clarity or preserve anonymity.

Question

"What does "orthogonality" and "orthogonal" mean when used by EAs? According to google it just means that two lines are perpendicular, but from the blurbs, I've read about the "orthogonality thesis" there is no mention of linear algebra, and instead mentions interdependence between intelligence and motivation ....

Answer

This is a great question! You've managed to pin down your confusion very precisely, and it's a very understandable confusion to have. You've touched on a few related things here, and I'll try to show the link between them.

Orthogonality in maths

As you said, to most people orthogonality probably just means being at a right angle to something, e.g. when people are standing, their feet are orthogonal to their legs (at least along the longest dimension of each). In linear algebra it has the same meaning, but an interesting property appears that makes orthogonality useful to think about. I don't know how much linear algebra you know, but orthogonal (nonzero) vectors are linearly independent of each other; that is, you can't write one vector in terms of a vector that is orthogonal to it. If I throw a ball at you and you know the velocity of the ball but not its position, there is nothing you can do with the velocity information to derive information about the position of the ball (see a related discussion of independence on ResearchGate). An example of two vectors that are not orthogonal is the speed of the ball in mph and in kph: there is a simple transformation that gives you the speed in kph if you only know the speed in mph.
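
To make this concrete, here is a minimal sketch in Python (my own illustration, not part of the original answer; it assumes NumPy is installed). The dot product of orthogonal vectors is zero, echoing the velocity/position example, while the mph and kph readings are related by a simple scalar transformation:

```python
import numpy as np

# Velocity and position as orthogonal vectors: knowing one component
# tells you nothing about the other.
velocity = np.array([3.0, 0.0])   # the ball's velocity, along the x-axis
position = np.array([0.0, 5.0])   # the ball's position, along the y-axis
print(np.dot(velocity, position))  # 0.0 -- orthogonal

# Speed in mph vs kph: not orthogonal, since one fully determines the other.
speed_mph = 60.0
speed_kph = speed_mph * 1.609344   # a simple linear transformation
print(speed_kph)                   # ~96.56 -- derivable from mph alone
```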

EAs talking about orthogonality

People who use orthogonal in conversation are often pointing at the lack of a relationship between two variables. Imagine the following conversation:

Sarah: “I understand that EAs generally would prioritise distributing bed nets over training guide dogs; I just don’t get the same sense of fulfilment from the callous, calculation-driven act of providing bed nets.”

Harriet: “I understand that providing bed nets might feel calculating; I just think that whether an act feels calculating is orthogonal to how good the act is. The thing I really care about is which intervention brings about more happiness.”

Here, Harriet is pointing out that the calculating feeling isn’t really related to how much good is done, which is fundamentally what she believes makes an action worth taking. I think EAs often say things are orthogonal that aren’t orthogonal in the strict technical sense; they are normally pointing towards two factors not being dependent on each other.
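
As a rough illustration of this looser, statistical sense of “orthogonal” (my own sketch, assuming NumPy; the variable names are hypothetical): two independently drawn variables show roughly zero correlation, i.e. knowing one tells you nothing about the other.

```python
import numpy as np

rng = np.random.default_rng(0)
felt_calculating = rng.normal(size=10_000)  # how calculating each act feels
good_done = rng.normal(size=10_000)         # how much good each act does

# The draws are independent, so the sample correlation is close to zero:
# one variable carries no information about the other.
print(np.corrcoef(felt_calculating, good_done)[0, 1])
```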

The orthogonality thesis

I think that this video does a better job of explaining the orthogonality thesis than I will. Essentially, intelligence and motivation are not dependent on each other (they are orthogonal): it is possible to have very intelligent agents that have ‘bad’ goals. This point counters a common view that artificial general intelligence won’t go wrong and maximise some bad moral goal (e.g. converting the whole world into stamps), because it will supposedly be intelligent enough to choose a better goal. I encourage you to watch the video linked above for a better explanation.
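
To see why the two can vary independently, here is a toy sketch (my own illustration, not from the video; `hill_climb` and the goal functions are invented for this example): the same generic search procedure, standing in for “intelligence”, will happily optimise whichever goal function, standing in for “motivation”, you plug into it.

```python
def hill_climb(goal, state=0, steps=1000):
    """A generic optimiser ('intelligence'): greedily moves to whichever
    neighbouring state scores best under the goal it is given."""
    for _ in range(steps):
        state = max((state - 1, state, state + 1), key=goal)
    return state

# Two very different 'motivations'; the optimiser is indifferent between them.
maximise_happiness = lambda s: -(s - 42) ** 2    # utility peaks at 42
maximise_stamps = lambda s: -(s - 500) ** 2      # utility peaks at 500 stamps

print(hill_climb(maximise_happiness))  # 42
print(hill_climb(maximise_stamps))     # 500
```

The point of the sketch is that nothing in the search procedure constrains which goal it serves: making the optimiser better (more steps, a smarter search) makes it better at pursuing whatever goal it was handed, good or bad.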
