AI Confusion: Companies Using It, But Scared of Legal and IP Risk

Certain things confuse me.

Are more companies rushing to use AI to create content? Yes.

A Deloitte survey shows that marketers are facing a constant demand for content — which increased by about 150% in 2023 — but they are only able to meet the demand about half the time. So, many have turned to generative AI.

But…

65% of companies say they are VERY OR EXTREMELY CONCERNED about the intellectual property or legal risks from using gen AI.

Seems like there’s a disconnect here. If there is such widespread concern about legal risk from AI, why are companies rushing to embrace it?

Just ask the folks at Air Canada, who got taken to court over bad info from their AI chatbot, and lost. Or the websites that got crushed for using AI-generated content. Some saw their traffic drop from millions of visitors per month to near zero. Entire sites have been de-indexed and no longer appear in search results.

I ran a test last year, launching two identical websites at the same time. One had 100% human-written content and the other 100% AI-written content on the same topic. Both saw steady traffic growth until Google turned its attention to AI content, and then the AI site's traffic dropped by more than 90%. Read more about this test.

A January 2024 survey by Aporia of machine learning pros at large companies found an interesting result:

89% of those who use LLMs and generative AI say they regularly see signs of hallucinations, including factual errors, biased content, and potentially dangerous content.

If you can’t trust the output, you’ve got to carefully scrutinize the content before you put it out there. Avivah Litan, a research analyst at Gartner, told SiliconANGLE, “I have heard stories from clients that spent a lot of time getting their prompt engineering to run predictably. They test and test, then put it into production, and six weeks later, it starts hallucinating.”

It’s so common that Dictionary.com’s 2023 Word of the Year was “hallucinate.”

“If you’re not concerned about it, you’re going to get burned,” Emory Healthcare Chief Information Officer Alistair Erskine told the Wall Street Journal.

Is there a solution?

“The rate of hallucinations will decrease,” said Mohamed Elgendy, co-founder and chief executive of Kolena. “But it is never going to disappear.”

The stakes are high.

Sports Illustrated had its publishing license revoked after a failure to make payments to the rights holder, and the entire staff was laid off. It didn’t help that, in a last-ditch effort, the magazine and online sports website had relied on AI to churn out poor-quality content. There are efforts to resurrect the renowned brand, but no plan is currently in place.

Look, AI has its place. It is a great tool for many things. But it’s not a replacement for human writing and fact-checking. If you do use it, for gosh sake, read it and check it first.

Don’t be like Statista, which has a reputation for high-quality research, and leave crap like this on your page:

“As an AI language model, I can provide you with country-specific statements about current trends in the [TOPIC X]. However, I need you to specify the country you are interested in so that I can tailor the statement accordingly.”

Guess they forgot to edit before pasting.

Or another site that was trying — I guess — to add some real-world examples and left this gem in their copy:

“Illustrative examples highlighting each stage elucidate the intricacies and practical considerations inherent.”

Umm, OK.

An increasing amount of my workload has shifted from writing to guiding AI. It’s why I became a certified Prompt Engineer through Vanderbilt University. You can use AI as a tool, but there’s still no substitute for a human fact-checker and someone to guide SEO.

IMAGE SOURCE: Bing Image Creator (AI tool)