It's sad this is the reality of hiring now. It's byzantine and wasteful, and I'm surprised that companies are willing to burn money this way. There are great candidates who may not perform well in an interview, and there are a lot of serious clowns who can waltz through this process. The questions being asked are not likely to yield useful information, and the whole thing seems designed by pick-up artists to neg candidates into accepting bad offers. Why stop at 6 interviews?
I understand your sentiment, but, having been on the other side of the fence, there are a lot of people that apply for jobs they're completely unqualified for, and they end up being most of your candidate pool. (Come to think of it, this would make a decent whiteboard question about reservoir sampling without replacement...)
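For what it's worth, a minimal sketch of that whiteboard question, assuming the classic Algorithm R for uniform sampling without replacement from a stream of unknown length:

```python
import random

def reservoir_sample(stream, k):
    """Keep a uniform sample of k items, without replacement,
    from a stream whose length isn't known in advance (Algorithm R)."""
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            # Fill the reservoir with the first k items.
            reservoir.append(item)
        else:
            # Item i replaces a random slot with probability k / (i + 1).
            j = random.randint(0, i)
            if j < k:
                reservoir[j] = item
    return reservoir
```

You could frame it as "sample k resumes uniformly from a pile you can only read once"; the candidate has to argue the k/(i+1) replacement probability keeps the sample uniform.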
Another way to think about it is that these interview processes often have a ~1% acceptance rate, and the softball questions typically filter 80-90% of candidates.
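To make those numbers concrete (taking 85% as the midpoint of the 80-90% figure, purely for illustration):

```python
# Hypothetical rates from the comment above.
overall_acceptance = 0.01        # ~1% of all applicants get an offer
softball_filter = 0.85           # easy screens remove 85% of the pool

softball_pass_rate = 1 - softball_filter          # 15% survive the screens
later_rounds_rate = overall_acceptance / softball_pass_rate

# Even among candidates who clear the softballs, only ~7% get offers,
# so the later rounds are still doing most of the rejecting.
print(f"{later_rounds_rate:.1%} of softball survivors receive offers")
```

In other words, the softball questions mostly protect interviewer time; the hard filtering happens later.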
So, yeah, it sucks for a qualified candidate if they fail half their interviews, but they'll end up getting a job somewhere. That's much better for them than working at a company where 99% of the engineers can't perform basic coding tasks or explain how products in their industry segment are designed.
I was thinking of a variant where there's a finite pool of candidates, and everyone is implementing a similar algorithm. Each time a candidate is hired from the pool, a candidate with IID skill level is put into the pool.
What does the average skill level in the pool converge to over time?
(This might be too hard for a whiteboard problem.)
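One way to poke at it numerically, under some assumptions I'm adding (skills are IID standard normal, and the employer always hires the top candidate in the pool):

```python
import random

def simulate_pool(pool_size=100, hires=10_000, seed=0):
    """Repeatedly hire the best candidate from a finite pool and
    replace them with a fresh IID draw; return the pool's mean skill."""
    rng = random.Random(seed)
    pool = [rng.gauss(0, 1) for _ in range(pool_size)]
    for _ in range(hires):
        best = max(range(pool_size), key=lambda i: pool[i])
        # The hired candidate leaves; an IID replacement joins the pool.
        pool[best] = rng.gauss(0, 1)
    return sum(pool) / pool_size
```

Because the maximum keeps getting skimmed off, the stationary pool average sits below the population mean; running the sketch with different pool sizes shows how strong the skimming effect is.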
A simpler process: read and assess resumes, then hold a single interview with HR and the hiring manager or a panel. And this process should fit within the larger operational framework and processes of a company.
That sounds terrible. Most jobs are for specialty positions, so you'd need tens of thousands of different certification tracks.
A written engineering competency test (basically, an industry-standard set of whiteboard questions) sounds good, but, having hired credentialed-but-incompetent engineers from other fields, I can definitely say it helps a lot less than you'd think.
There is no reason why we can't have that, and I don't think you'd need that many tracks. Even something as narrow as cybersecurity has several different (ISC)2 tracks. They could have a different cert for each major programming language, for instance.
So, I write mission-critical software that sits behind narrow interfaces. In my field, security and correctness are basically synonyms. Other people manage large fleets of machines, and there, intrusion detection is probably the best you can do. Still other people write frontend software with tens of thousands of dependencies, so dependency management is the name of the game. Still others deal with PII and credit card numbers, and in that space compliance matters deeply.
I could list many more sub-disciplines of cybersecurity, but I'm already up to five (and haven't even mentioned cryptography!). None of the skill sets from the five I mentioned really translate well to each other.
Also, what purpose would a cert for a programming language serve? For things like C++ and JavaScript, there are so many sub-dialects that the cert wouldn't say much about whether you could work inside a particular code base. For things like Rust and Go, the certificate would need to expire every few years.
Also, it takes an experienced developer a day or two to pick up a new language, while getting a certificate for one would take longer than that. That lands us right where we are now: such certifications exist, but they're simultaneously too much trouble to be worth pursuing and not worth the paper they're printed on.