
Summary

ML4Good is currently seeking expressions of interest from organisations interested in partnering with us, as well as from individuals interested in teaching assistant and organiser roles. These roles play a key part in expanding our bootcamps into new countries and regions.

Please find more information on our website.

Why we're seeking expressions of interest

One of our current goals is scaling ML4Good internationally: we'd like to bring bootcamps to a larger set of countries over the next year, including several regions that have so far had few opportunities within AI safety. We are currently investigating which locations to prioritise.

We are looking for organisations and individuals who would be interested in helping with this goal. Our current collaboration with Condor Initiative on this July's bootcamp in Brasil is an example of the former.

This is not a commitment on your part; we are hoping to gauge interest before finalising our decisions. We have included role descriptions below, but if you have a different idea of what the collaboration could look like, we'd still be interested in hearing from you.

What is ML4Good? 

ML4Good bootcamps are intensive ten-day events that provide upskilling in deep learning, discussion of governance, and deep dives into conceptual AI safety ideas for individuals motivated to work on addressing the societal risks posed by advanced AI systems.

ML4Good was started as a project of EffiSciences in France in 2022. Since then it has become a growing international network, with bootcamps running in Switzerland, Germany, the UK and Brasil.

--> Please find more information at www.ml4good.org

Why is this important?

EffiSciences has conducted substantial exploratory work in AI safety field-building and found that running these short but intense bootcamps is a particularly effective way to jumpstart participants' careers in AI safety. Past bootcamps have been a great way of fostering the AI safety community in the regions where they are held, as well as growing participants' confidence in their knowledge of AI safety and their ability to contribute. Perhaps most importantly, participants, organisers and the teaching team find the bootcamps, and the connections formed there, very motivating.

None of this can happen without organisers and TAs working behind the scenes, taking care of everything you can think of and much more!

Tentative Role Descriptions

Organisers and Partner Organisations

We are looking for people interested in part-time operations work in AI safety field building. The time commitment for this role is around 150 hours over 4 months, including attendance at the 10-day event itself.

We are also looking for organisations interested in partnering with us to host an ML4Good bootcamp. Similarly to the above, we'd ideally like someone from your organisation to work with us on operations, particularly venue search, country-based outreach, and attendance at the event itself. Please have a low bar for filling out this form; it doesn't entail any commitment from you at this stage, and we are open to conversations about tailoring the collaboration further.

What's in it for you?

As an individual: We have found that organising ML4Good is a highly rewarding experience, and participants are very appreciative of the effort that goes into the bootcamps. It's an opportunity to gain experience in operations and test your fit on a substantial project with a relatively low time commitment. If someone you know wants to get more involved with the AI safety community in an operations or community-building role, this is a good opportunity to point them towards.

As an organisation: This is an opportunity to develop connections between organisations and for us to learn from each other. If you were considering running a similar programme yourself, partnering would reduce duplicated effort and make use of the resources already developed for this format, as well as our network of experienced teachers and TAs. If you think your region could benefit from this kind of programme, we'd love to collaborate.

Express interest here

Teaching Assistants

We are looking for people interested in being a teaching assistant at an ML4Good bootcamp.

This involves around 15 hours of prep work prior to the bootcamp, and approximately 9 hours of work per day at the 10-day bootcamp with some time off. The bootcamp is intense—please be prepared for this!

You will be working with the head teacher and the other TAs. You will have ownership of some of the sessions and be able to contribute to shaping the curriculum.

The role includes: teaching some sessions; supporting participants in other sessions; ensuring sessions are prepared for; running discussion groups; chatting with participants and helping to create a friendly and welcoming atmosphere!

We are looking both for people willing to travel and for those local to the countries where we are organising bootcamps. These locations are not yet confirmed, and we are welcoming expressions of interest from all regions at present.

We take on people with a variety of experience levels and areas of expertise as Teaching Assistants. TAing is a great way for highly motivated individuals who are less confident with the content or with teaching itself to upskill. A teaching team with varied areas of expertise allows participants to explore different areas and team members to complement each other. So if you feel hesitant about expressing interest for these reasons, we encourage you to go ahead anyway.

What's in it for you?

Past TAs have found that the role has helped them solidify their own understanding of AI safety technical topics, improve their communication and teaching skills, and develop their views through discussions with participants and other members of the teaching team. They’ve also found it really motivating to help people on their own career transitions into AI safety and to get more involved with the AI safety community.

Express interest in TAing here
