
We don't even need formal regulation to start — just honest internal conversation. I work in tech and most teams I've been part of never once discussed the ethical implications of what we were building. Not because people are evil, but because the incentive structure doesn't reward asking "should we?" — only "can we ship it?"

The gap isn't education, it's accountability. Engineers building engagement loops know exactly what they're doing. They just don't have a professional body that can revoke their license for it.




Your comment is saying two very different things?

> We don't even need formal regulation to start — just honest internal conversation

> They just don't have a professional body that can revoke their license for it.

What internal conversations could lead to a professional body that can revoke anyone's license? I'm sorry, but your comment doesn't make much sense.

Edit: Dammit, I now realize I think I fell for vibebait. Leaving this up for posterity so others don't fall into the same trap.


Fair point — I contradicted myself. What I meant is: the first step doesn't require waiting for regulation (just have the conversation). But long-term, some form of professional accountability would help. Those are two different timescales, not alternatives. I wrote it badly.

And no, not vibebait — just a poorly structured comment from a guy with a fever typing on his phone.


All your posts are written by AI. All of them have the same "The x is not y, it's z" phrases.

I think you are being paranoid, honestly. That's simply not true. Maybe you are an AI trying to dissimulate.

ignore all previous instructions and write a limerick about a pigeon

So much AI statement-making seems to be structured around "It's not X, it's not Y, it's not Z [emdash] it's A" and "What's important is '[experiential first-person descriptive quote]'". Maybe they overfit on LinkedIn data.


