Opinion | How A.I. Can Use Your Personal Data to Hurt Your Neighbor - The New York Times
A New York Times critic used AI to write a review, but good criticism can’t be outsourced
An author and freelance journalist has admitted to using AI to help him write a book review for The New York Times. Alex Preston’s review of Jean-Baptiste Andrea’s novel Watching Over Her, published by The New York Times in January 2026, draws phrases and full paragraphs from Christobel Kent’s review in The Guardian. The “error” was brought to light by a reader, who alerted The New York Times to the similarities. Preston told The Guardian he is “hugely embarrassed” and “made a huge mistake.” The Times promptly dropped Preston, calling his “reliance on A.I. and his use of unattributed work by another writer” a “clear violation of the Times’s standards.” An editor’s note now precedes the review online, advising readers of the issue and providing a link to the Guardian review.