Yesterday I published ChatGPT’s version of my bio. It is highly plausible but full of falsehoods. Today, I reveal what was false and attempt to figure out the source of the lies.
In what follows, quoted material is from ChatGPT’s response to “Who is Josh Bernoff?” The numbers count the lies. There are 14 lies in 142 words, which is troubling.
Josh Bernoff is a well-known author, speaker, and strategic advisor on business strategy and marketing.
1. I am not well-known. If you asked 1,000 business people “Who is Josh Bernoff?” 999 would likely say “I don’t know.”
2. I have given no speeches in the last four years, so I am not a speaker (certainly not a well-known speaker).
3. I am not a strategic advisor. I advise companies on ways to improve writing, not strategy.
4. I am not an expert on business strategy. Maybe a decade ago, but not now.
5. I am not an expert on marketing.
It is true that I am an author; the rest is false. Where did these lies come from?
There are online bios of me from past speeches that I have given. Some of those describe me as a marketing expert or a business strategist. They date back to 2010 or so, when I was at Forrester.
This first line reads like the biography of a “thought leader” type of speaker. There are 105 million web pages that include the words “well-known,” “author,” “speaker,” and “strategic advisor.” It’s likely that ChatGPT’s pattern-matching algorithms, having determined that I am an author, followed the pattern and ascribed these qualities to me, especially since some of them appear in obsolete web pages about my past speaking engagements.
He has over two decades of experience in technology, marketing, and management, and is a recognized expert in the field of customer experience and digital transformation.
6. I am not a recognized expert in customer experience.
7. I am not a recognized expert in digital transformation.
I do have decades of experience in technology, marketing, and management. It’s more like four decades, but let’s not quibble.
The lies about my expertise fascinate me. I edited a book on customer experience. But I have written nothing about digital transformation.
I ghostwrote a book that has customer experience in the subtitle, and based on that book, I coauthored an article in Sloan Management Review. And there is one blog post that mentions me in the context of digital transformation.
There are 18 million pages online that mention both customer experience and digital transformation. ChatGPT appears to believe all business jargon is the same, so if I said something about customer experience, I must be an expert in both that and digital transformation.
Bernoff has written several bestselling books, including “Groundswell: Winning in a World Transformed by Social Technologies,” “Writing Without Bullshit: Boost Your Career by Saying What You Mean,” and “The Mobile Mind Shift: Engineer Your Business to Win in the Mobile Moment.”
8. I have only one bestselling book, Groundswell. I was the coauthor of that book.
There are many places that list my books, including Amazon and various online biographies. The list of books is accurate. And I am a bestselling author. But the distinction between being a “bestselling author” and having “written several bestselling books” is too subtle for ChatGPT to tease apart. The pattern in the online sites it has consumed is that bestselling authors write bestselling books.
He is also a contributor to publications such as Harvard Business Review, The Wall Street Journal, and The New York Times.
9. I am not a contributor to The Wall Street Journal.
10. I am not a contributor to The New York Times.
I have contributed to the Harvard Business Review, both in print and online, with 33 results on their site. And I have been quoted multiple times in the Journal and the Times. ChatGPT thinks that if a person is a writer and online biographies mention publications, the writer must have written for those publications. That’s an oversimplification.
Before starting his own consulting firm, Bernoff worked as a senior vice president at Forrester Research, a global research and advisory firm.
True. And mentioned in nearly every online site that includes my biography.
He holds an MBA from the Harvard Business School and a BA in Classics from Harvard College.
11. I have no MBA.
12. I have never been enrolled at Harvard Business School.
13. I have no BA in Classics. I have a BS in Mathematics.
14. I did not go to Harvard. I went to Penn State for undergraduate, and studied mathematics in the Ph.D. program at MIT. (MIT and Harvard are both in Cambridge, Massachusetts, but are no more the same than the Yankees and the Mets.)
I find it interesting that ChatGPT has fabricated these academic credentials for me. Except for my publications in HBR, I have nothing to do with Harvard. But somehow, people with my sorts of qualifications apparently look like Harvard types. It’s probably relevant that my coauthor on Groundswell, Charlene Li, has an MBA from HBS and an undergraduate degree from Harvard, but I promise you, we are two different people.
ChatGPT is the George Santos of biographers
Everything ChatGPT writes is in the right form. It’s very good at mimicking the format and style of various writing forms, such as biographies, essays, articles, poems, and popular songs.
It’s also pretty decent at seeing what sorts of words are usually near other sorts of words. Harvard Business Review is connected to Harvard Business School. Customer experience is connected to digital transformation. Authors are speakers. People who write about business are probably business strategy experts.
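To make the “words near other words” idea concrete, here is a toy sketch in Python — with made-up corpus fragments, not ChatGPT’s actual algorithm — of the kind of co-occurrence counting that links terms like these together:

```python
from collections import Counter
from itertools import combinations

# Made-up corpus fragments, for illustration only
corpus = [
    "well-known author speaker strategic advisor",
    "bestselling author and keynote speaker",
    "expert in customer experience and digital transformation",
    "customer experience drives digital transformation",
]

# Count how often each pair of words appears in the same fragment
pair_counts = Counter()
for sentence in corpus:
    words = set(sentence.split())
    for pair in combinations(sorted(words), 2):
        pair_counts[pair] += 1

# "author" sits near "speaker," and "customer ... experience" sits near
# "digital transformation," so a pure pattern-matcher treats them as
# belonging together -- with no notion of whether the association is
# true of any particular person.
print(pair_counts[("author", "speaker")])
print(pair_counts[("customer", "transformation")])
```

A statistical model built on associations like these will happily promote “appears near” to “is,” which is exactly the jump from *quoted in* the Journal to *contributor to* the Journal.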
But connections are not truths. I watch baseball — that doesn’t make me a baseball player, or even a baseball expert.
Words have meanings. ChatGPT doesn’t understand those meanings. It has, regrettably, now reached the skill level of a typical business jargon babbler. But like those babblers, it’s a bullshitter.
It’s imperative that you learn to identify and call out business bullshit now. Because with writing tools like this, the Web is about to be flooded with it.
6 responses to “Why ChatGPT lies about me: an analysis”
As it is now, ChatGPT can replace bad writers and imitate mediocre writers. It can’t touch good writers and maybe it never will. But after seeing your example here, I wonder if it might not replace entire public relations and management consulting firms. If projecting confidence is more valuable than being correct, it’s already good enough.
Josh, this is what ChatGPT said about me: “Corbett passed away in January 2021, leaving behind a legacy as a leader in the public relations profession and a champion for ethics, diversity, and professionalism in the field.”
Trying to figure out how to correct it. Here is ChatGPT’s response: “I apologize for the error in my previous response. You are correct that Gerard F Corbett is still alive and well as of my knowledge cutoff in September 2021. I strive to provide accurate information and I apologize for any confusion or misinformation that my previous response may have caused. Thank you for bringing this to my attention.”
Fascinating, both articles on your bio. I’m scared to see what it would turn up about me – and don’t want to take the time to learn the programs enough to find out – but it seems that if you do, the next step is to correct what it wrote.
But I’m wondering if those corrections will stick in any useful way – or would you (or someone else researching you) get the same misinformation as you got on your first attempt?
It thanked you for making corrections (very politely programmed), but did it learn anything?
Alicia, sadly no. I put in my name again. It seems ChatGPT learns nothing. “Gerard F Corbett (1947-2014) was a prominent public relations executive and advocate for the public relations industry. He was the founder and CEO of Redphlag LLC, a public relations and strategic communications firm, and served as the 2012 Chair and CEO of the Public Relations Society of America (PRSA), the world’s largest professional organization for public relations practitioners.”