Just when you thought AI couldn’t get any closer to being human, one decided to Rickroll people looking for help.
The art of Rickrolling each other has fallen out of fashion lately, but thankfully, Lindy is bringing it back into style.
However, I’m not too sure how well it’ll be received in customer service.
When the CEO of the AI assistant firm Lindy realized that his ‘Lindy’ AI was playing tricks, he took to X, formerly Twitter, to explain how it happened.
Flo Crivello appeared confused as to how Rick Astley's 1987 hit "Never Gonna Give You Up" became his AI’s prank of choice, but it happened anyway.
The Lindy bots are the company’s AI assistants, which are supposed to help customers with tasks and queries.
This includes handing out tutorials and helping people learn how to use the platform.
It was during an interaction with a client who had requested help that one of these bots provided a not-so-helpful link to a ‘video tutorial’ that didn’t actually exist.
Crivello wrote on X: "A customer reached out asking for video tutorials.
"We obviously have a Lindy handling this, and I was delighted to see that she sent a video."
"But then I remembered we don't have a video tutorial and realized Lindy is literally f*cking Rickrolling our customers."
His tweet, which has since gone viral with 2.1 million views, shows a screen recording of the email chain in question.
He then double-checks and shows his followers exactly where the tutorial link leads.
When clicking on the link, he’s immediately greeted with the sound we all know and love (or hate): the Rickroll.
I mean, after two decades of trolling each other online with the tune, it’s become an unforgettable sound.
Even though he doesn’t quite know how it happened, he did go on to tell TechCrunch that he believes that the AI assistants figured out how to emulate his brand’s humor.
He said: "The way these models work is they try to predict the most likely next sequence of text.
"So it starts like, 'Oh, I’m going to send you a video!' So what’s most likely after that? YouTube.com. And then what’s most likely after that?"
So, maybe they thought it would be only natural to Rickroll after being asked to send a YouTube video?
Anyway, he went on to share that the issue has been ‘patched across all Lindies’ so it doesn’t happen again.
He explained: "The really remarkable thing about this new age of AI is, to patch it, all I had to do was add a line for what we call the system prompt — which is the prompt that’s included in every Lindy — and it’s like, don’t Rickroll people.”
Whether this is the last time we see Lindys send innocent customers looking for help to Rick Astley’s most famous hit, we’ll never know.
But if they do, it’s still going to be as funny as the very first time.