Jarrah

13 karma · Joined Feb 2023

Comments (3)

I think you're right here. It tends to be senior people who have that capability, and there aren't enough of them in the industry. What makes this especially hard for us is that EAs tend to be younger and early- to mid-career.

Air gaps can work for networks that don't need much data coming in or out. That used to be true of industrial control systems, and perhaps of weapons systems. But when I've talked to industrial control systems experts about it, even they recommended against air-gapping, because the gap will be bridged out of operational necessity whether you like it or not - often by dirty USB drives that bypass your security and that you have no control over. I strongly believe that the volume of external data AI research needs to process makes air-gapping impractical.

If someone has enough IT skills to get an entry-level position, I would encourage them to take that route. If they don't, I would nudge them towards a degree, which will both help to motivate them and give them a credential to help get in the door.

The problem might come down to security being nuanced, complex, hard to measure, and effective only when tied to the mission - so it often requires a lot of judgement. In my experience it's easy for contractors to apply the same cookie-cutter security they've always done and miss the point.

Two real examples that may be illustrative:

A company with altruistic goals wanted to reduce the risk of a compromise that could prevent them from achieving the mission, so they hired contractors to support cybersecurity. The contractors recommended working on security policy and set to work on it. One of the benefits of policy is demonstrating security compliance, so that other businesses are comfortable buying your services. The policies were designed along these lines, even though sales wasn't the true motivation for security, and they were out of touch with the organisation's culture. For example, staff were told they had to follow the policies for the good of the company, including "don't be the reason [company] loses a sale".

The company's motivation and culture were explained clearly to the contractors. But it's unusual for an organisation to care about its mission more than money, and common for companies to pretend to, so I can understand why the contractors had a hard time taking it at face value.

Another example of the disconnect is that many companies and security professionals explicitly do not attempt to defend against nation-state attacks, and ignore external harms. I talked to a successful cybersecurity professional (a CISO at a large tech company) about the security difficulties faced by AI startups and the damage a leak of powerful technologies could do to the world. One of their recommendations was for AI labs to get cyber insurance so they would be financially protected against a compromise. I argued that this doesn't protect against a foreign state brainwashing its citizens with a large language model, and they agreed, but their initial reaction was that the AI lab can't get sued over that anyway. In fairness, I don't believe they were callous - just not used to thinking about risks beyond company success and survival.

Different contractors may be better, and there may be some out there that 'get' it, but it's an added difficulty when it's already hard to find and vet information security expertise.

I think it can work to hire contractors for specific technical tasks that require deep expertise but not as much mission judgement, e.g. deploying a security product.

I don't believe the issue is limited to information security - I remember Tara discussing the difficulty of outsourcing financial accounting.