

I did not like the analogies. They do not seem to make an effort to point at something meaningful; they are superficial.

For example, password salts are mixed into the password before hashing. If you instead attach the salt after hashing, salting becomes near useless, since the underlying digest is still the unsalted hash.

You could make the analogy with concatenating the salt to the head or tail of the string, which would be fine.
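To illustrate why the ordering matters, here is a minimal Python sketch (SHA-256 is used only for illustration; a real system should use a dedicated password-hashing function such as argon2, scrypt, or bcrypt):

```python
import hashlib
import os

def hash_password(password: str, salt: bytes) -> bytes:
    # Salt concatenated BEFORE hashing: identical passwords with
    # different salts produce completely unrelated digests.
    return hashlib.sha256(salt + password.encode()).digest()

def broken_hash_password(password: str, salt: bytes) -> bytes:
    # Salt attached AFTER hashing: the first 32 bytes are still the
    # plain unsalted hash, so an attacker can strip the salt and look
    # the digest up in a precomputed table.
    return hashlib.sha256(password.encode()).digest() + salt

salt_a, salt_b = os.urandom(16), os.urandom(16)

# Proper salting: same password, different salts -> different digests.
assert hash_password("hunter2", salt_a) != hash_password("hunter2", salt_b)

# Broken "salting": the hash part is identical regardless of the salt.
assert (broken_hash_password("hunter2", salt_a)[:32]
        == broken_hash_password("hunter2", salt_b)[:32])
```

Whether the salt goes at the head or the tail of the string before hashing makes no real difference; whether it goes in before or after hashing makes all the difference.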

Randomly adding / subtracting extra pieces to either rockets or cryptosystems is playing with the worst kind of fire, and will eventually get you hacked or exploded, respectively.

"Randomly" doing stuff to a neural network is bad too, but that misses the point: you are not doing "random" modifications. I'm not an engineer, yet I bet there are tons of modular parts in a rocket.

By April 14? You are brave!

I'm just guessing, but I imagine it would involve a ton of coding in practice, and tinkering with variations of existing models to make them work.

To start from nothing, this book I heard about on Gelman's blog comes to mind: https://dataorigami.net/Probabilistic-Programming-and-Bayesian-Methods-for-Hackers/

I'd like a PDF version too. I appreciated the effort that went into the online highlighter, but with a PDF I can take arbitrary notes, and they stay saved on my computer forever in case I need to find something I read along with my thoughts on it (it does happen).

In particular I'd like a PDF with WIDE MARGINS. Not like normal newspaper PDFs, where you have to take notes in collapsible stickies or find a sophisticated note-taking app that is not compatible with other readers.

- Whether candidates had worked in a company with more than 1000 employees - we excluded this in favour of looking at whether candidates had worked at a FAANG company; it was not possible to include both since the variables are not independent.

I'm confused: why can't you include two predictors if they are not independent? I'm assuming that by "independent" you mean correlation 0. If you instead mean no collinearity, i.e., linearly independent vectors of predictors, then feel free to ignore my comment.
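To illustrate the distinction, a quick NumPy sketch with made-up data: ordinary least squares handles strongly correlated predictors just fine; only perfect collinearity (the design matrix losing full column rank) makes the coefficients unidentifiable:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Two strongly correlated, but not collinear, predictors.
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + 0.1 * rng.normal(size=n)   # corr(x1, x2) is roughly 0.99
y = 2.0 * x1 - 1.0 * x2 + rng.normal(scale=0.1, size=n)

# OLS via least squares: recovers coefficients close to [2, -1]
# despite the high correlation (standard errors are inflated, but
# the fit is well-defined).
X = np.column_stack([x1, x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)

# Perfect collinearity is the actual problem: the design matrix
# drops to rank 1, so no unique coefficient vector exists.
X_bad = np.column_stack([x1, 2.0 * x1])
print(np.linalg.matrix_rank(X_bad))
```

So "the variables are not independent" on its own does not force you to drop one of them; only an exact (or near-exact) linear relationship does.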