How my son got into Duke

Penelope Trunk on asking her son to practice writing papers before he begins college:

My son said, “I don’t need to practice. I’ll do the triple AI approach: ask ChatGPT to write the paper, tell ChatGPT to rewrite the paper to not sound like ChatGPT, and then use ChatGPT to check the grammar.”

Is that cool, weird, upsetting, clever, or something of a gray area? Yes, all of that.

As a longtime editor, I have an automatic gag reflex when I see the next generation of writers gravitating toward AI tooling. There’s no need for elaboration; we all know the wide spectrum of opinions thanks to the AI posts littering RSS feeds everywhere.

A couple of thoughts and observations on this passage:

  • Asking AI to write a draft, then rewrite it so it doesn’t sound like itself, merely shows that AI is incapable of hitting the mark on the first try. The more personal your post, the less likely AI is to replicate it. Conversely, the more generalized your content, the more likely AI is to replicate it, but at the expense of sounding like ChatGPT.
  • If we can ask AI to rewrite itself to escape AI-reeking prose, why can’t it do that in the first place? What is it about the second prompt that makes the writing more effective? Or is the second prompt an indication that the first prompt could have been better? In other words, is that a robot factor or a human one?
  • The attraction to AI seems less about its quality than about being able to shovel off responsibility to something else. So, is the root cause a lack of personal responsibility, or is it something closer to simply not knowing what to do with a blank page? Because if it’s the latter, AI is a heavy-handed solution where personal prompts could be equally effective at helping writers beat the blank page.
  • I would be devastated if someone said my writing reads like AI output. That hasn’t happened, thankfully, but I believe that’s because I write about personal experiences that are outside of AI’s wheelhouse. If I do get that criticism, though, I know it means I didn’t “put enough of myself” into my writing.
  • If I have to “put enough of myself” into my writing, then what does that mean for objective writing in journalism? Subjectivity finds its way into any profession, though, so are scientific disciplines, say, physics, bound to let AI write research methodologies to prevent subjective leaks?
  • Human news writing has always been rife with the conflict between objectivity and subjectivity. Is the price of objectivity the loss of subjectivity? Again, AI is a heavy-handed solution here.
  • Isn’t subjectivity baked into AI as it is? I mean, it’s scraping content from a wide variety of sources. That includes personal blogs, like this one. My site has been scraped, so is my content polluting the objectivity of AI output that leans on it? Maybe that becomes less of a thing the more training a model goes through?
✏️ Handwritten by Geoff Graham on April 1, 2024

2 Comments

  1. April 16, 2024

    I have to say I came away from that article a bit confused. She spent all that time talking about how she was told her writing was like ChatGPT, and then ends by telling us her son’s strategy for writing papers in college is to use ChatGPT.

    I appreciated the insights you shared. I like that you talked about personal responsibility because I think it does play into it. I also liked that you explored whether facing a blank page might be more of the issue and whether we might help writers by building better tools to get them going.
