I think that these kinds of interview quizzes are a generally terrible practice. I prefer to bring people in and have them pair with people on real things, that are really happening, that are entirely representative of the real work and the real team dynamics. All parties involved get a much, much better SNR out of arrangements like that. A bunch of contrived nonsense representative of nothing used as a measuring stick for hiring has always seemed nonsensical to me.
In any case, if one were resolved to pursue a practice like this, as your colleague was, then what they should have done is pick a problem they definitely don't know the answer to and then sit down and work it out actively in cooperation with the candidate. That would have been about a thousand times more useful as an interviewing activity.
Knowing the answer ahead of time makes the dynamic worse. Things that aren't obvious will seem obvious once you already know the answer. It sounds like your colleague made things a degree worse by not actually knowing the answer while assuming that they somehow did intuitively.
> I prefer to bring people in and have them pair with people on real things, that are really happening, that are entirely representative of the real work and the real team dynamics.
That should be the most important test, but most daily realistic work is really a matter of practice. By testing it you mostly measure how much practice the person has had with that particular toolchain and workflow, not how smart they are or how good a programmer they are. Brainteasers are unrealistic, and relying on them alone is stupid, but they give you some insight into how well someone can think on their feet. If I were looking for a newbie to train, I'd always prefer someone who can't finish the real-world test but knows how to think technically, over the opposite.
Even if they don't know the tools or languages, which in our case they almost certainly won't (Rust, Agda, TLA+, etc.), you get to see how they interact with a situation where they need to direct some of their own learning: whether they ask questions, whether they can synthesize insights despite a lack of deep familiarity, how they react to having naivety exposed, etc.
Most of the challenging work is collaborative problem solving, not exam proctoring, and it's important to see how a person interacts with that. You'll see how much direction they're likely to need, if they can navigate and contribute positively to team dynamics, and so on.
We don't work on cryptocurrencies. Right now we work on automated testing & verification tools for "cyber-physical systems". Our target markets & customers often have safety-critical considerations.
We use Agda to create proofs for some of the components of our platform.