KC Green, creator of the 2013 ‘This Is Fine’ webcomic, says AI startup Artisan ran subway ads using his art without his knowledge or consent.
Artisan—which raised $25 million in April 2025 and previously ran ‘Stop Hiring Humans’ billboards—modified the meme so the dog says ‘[M]y pipeline is on fire’ to promote its AI sales agent Ava.
Green is exploring legal representation and compared the situation to Matt Furie’s 2019 Infowars copyright settlement over unauthorized use of Pepe the Frog.
KC Green created “This Is Fine” in 2013 for his webcomic Gunshow. A cartoon dog sits smiling in a burning room and declares everything is fine. If you’ve spent any amount of time on the internet over the last decade, you’ve no doubt seen the meme. But now, an AI startup is using it to sell sales automation software via an ad in a New York subway station. The problem is nobody asked Green.
The company is Artisan, best known for plastering “Stop Hiring Humans” billboards across San Francisco and for featuring the infamous “Wolf of Wall Street” Jordan Belfort in its ads. In its latest campaign, the dog’s speech bubble reads “My pipeline is on fire,” with an overlay urging commuters to “Hire Ava the AI BDR.”
Ava is Artisan’s AI-powered business development representative—the core product of a company that raised $25 million in April 2025.
Green found out the way artists usually find out these days: someone tagged him on social media. It was Daniel Radosh, an Emmy-winning Daily Show senior writer and producer whose upcoming horror-comedy The Big Kill is currently in production. “There’s no way KC Green approved this,” Radosh wrote on Bluesky. “Hard to believe an AI company would just steal someone’s work though!”
Green’s response was blunt. “I’ve been getting more folks telling me about this and it’s not anything I agreed to,” he wrote. “It’s been stolen like AI steals.”
“Please vandalize it if and when you see it.”
The ad had already been vandalized by the time Radosh tagged Green.
Green said he’d emailed someone at the company but didn’t expect much of a response. A few minutes later, he muted the thread; the notifications had become overwhelming. “Go to bed, it’s late,” he told the pile-on.
Artisan said it has “a lot of respect for KC Green and his work” and was reaching out to him directly. In a follow-up statement the company said it had scheduled time to speak with him.
Artisan’s CEO Jaspar Carmichael-Jack has written that the “Stop Hiring Humans” campaign was “a provocation,” acknowledging on the company blog that the billboards were “mostly just for attention” and that he doesn’t actually believe AI will replace all human workers. The company has since doubled down on controversy as a marketing strategy, and this time it has a copyright problem to show for it.
This isn’t an isolated incident. The nonconsensual use of AI to appropriate creators’ works, images, and IP is a growing pattern with no sign of slowing.
In February 2025, Israeli digital marketers created a viral deepfake video featuring AI-generated likenesses of Scarlett Johansson, Drake, Jerry Seinfeld, Steven Spielberg and others—all wearing anti-Kanye protest shirts—without any of the celebrities’ consent. Johansson condemned it: “The potential for hate speech multiplied by A.I. is a far greater threat than any one person who takes accountability for it,” she said.
Last year, MrBeast pulled an AI thumbnail tool after creators including Irish YouTuber “Jacksepticeye” found their logos and visual styles used without permission in promotional materials. Artists have filed waves of copyright litigation over AI training data. Green’s case is distinct from all of those—Artisan didn’t train a model on his work. They just took the image, tweaked it with AI, and put it on a wall.
Green said he is considering legal action.