Any studies that support the answer "no" for both questions?
Technical interview performance has ~0 correlation with performance on the job? I find that hard to believe. Is the mean on-the-job performance of a software engineer who fails Google's interviews really as good as that of one who passes? And if so, why do most companies want to hire Google software engineers?
As for my second question, it should have been "is it easier to measure this performance reliably than other skills that may be relevant?" It sounds like this would be hard to answer. But this kind of interview seems best suited for repeatable evaluation (although, of course, you have the problem that people will prepare and bias the results — but I don't see how you can avoid that with any repeatable way of measuring performance).
Finally, "and companies use them in spite of that". Any guesses why?
> Any studies that support the answer "no" for both questions?
Google had a study — linked to death around here — which showed that the scores hires received on their interviews didn't correlate with their job performance ratings. There are of course biases and flaws in that study, but it underscores a separate point: the burden of proof is on showing that interview scores *do* correlate with performance, not on proving the negative.
> Finally, "and companies use them in spite of that". Any guesses why?
The bandwagon effect, cargo-culting, ego boosting/confirmation bias from applying a hazing ritual, and laziness all come to mind.