
> Yup anyone who reads a pre-written bedtime story to their kids is a bad parent /s

Not a bad parent, but definitely something off about it.

Or rather, my personal experience: reading bedtime stories to my daughter is a moment of connection. She loves this. I'm sometimes tired, and could use some prompts -- which I already have, the vast mental collection of books I've read and movies I've seen.

I don't want to automate this. Suppose I could have ChatGPT improvise a story within some parameters and read it aloud in my synthesized voice -- that seems dystopian to me. I want to be the one reading the stories to my daughter, and making them up.

It's about bonding and connection, not about efficient storytelling!

> Even before ChatGPT, I was using davinci in the OpenAI playground to have a lot of silly fun with my daughter. Totally dumb stuff like asking it to create a menu for an alien restaurant with farts as an ingredient, but it had us rolling.

This to me sounds like a good use of software tools. Some computer time with your daughter, making the computer do fun things. Why not? It's not a replacement for bedtime stories, it's something new. I don't think this is what the article is criticizing.



It's exactly what this article is criticizing. It rejects all uses of generative systems as anti-human.


I disagree. The article is about artistic creation making us human.

While "artistic creation" is hard to demarcate, I don't think fooling around with a fun tool counts. Asking an LLM or similar tool to create an alien menu with farts as an ingredient, just to watch it do something silly, is not artistic creation. Most people could do it without the LLM, too.

From the article:

> I’m not against playing with these generative AI tools. For months, I played with Midjourney, and I still occasionally play with ChatGPT. I say “play” because they’re excellent toys.



