AI-generated content is fine, but...
...will machines ever be able to deliver the creative stuff?
Sorab Ghaswalla | Jan 10
In a previous newsletter, I talked about how machine learning (ML), a sub-discipline of artificial intelligence (AI), was “creating” and editing content. In the last few years, AI has “invaded” almost every profession. It has replaced humans in some tasks and is heading toward replacing them in more.
One area where ML has been deployed with a degree of success is content, specifically content that is based on data, i.e. content that is “black and white,” with no shades of gray.
News outlets such as Bloomberg have successfully deployed AI to file certain types of business reports, such as company results and market round-ups. Others are even using the technology to have machines file sports reports.
AI can “create” fresh content and even edit existing content. There are also some state-of-the-art paraphrasing tools available that come in handy for rewrites, and grammar-check software too. What’s more, a slice of modern-day “digital” marketing activity has also been taken over by AI.
Suffice it to say that any process with some kind of repetitive task is well served by ML, since ML is used to automate processes running on copious amounts of data that humans simply cannot cope with.
So popular has the use of AI in content generation become that online universities are now offering courses that teach students how to use AI tools for content creation and digital marketing. The target audience for such courses is content marketers, whose primary weapons are content forms such as blogs, videos, white papers, and so on.
Would it be correct then to say that ML has gone mainstream in the production of content? The answer is no.
On a scale of 1 to 10, ML is probably at 1.5 in the world of content. Quality is still an issue where machine-generated or even machine-corrected content is concerned. While AI-driven editing and paraphrasing technology has certainly improved, things are far from perfect.
Not that AI isn’t learning; it keeps getting better with time. But one must not forget that AI is ultimately controlled by humans, and machine-based content software is only as good as the algorithm written for it. By humans. And humans are imperfect, right? Until a degree of viability is reached, commercialization of these services will not happen. Nobody wants an imperfect system.
You see, contrary to popular (mis)perception, AI does not have the “intelligence” to create new or independent thought, a.k.a. content, although companies such as Google (through its DeepMind project) have managed to build machines (AlphaGo Zero) that can teach themselves without human intervention (not on the content front, though).
So, there. We said it. Creating new content out of a thought process, as humans do, is still a no-go for machines. Learning to create content from previous iterations without any human intervention? Yes, that is possible. The very nature of ML tools is to “learn” from the copious amounts of data fed to them.
Here, a reference to natural language generation (NLG) would not be out of place. NLG software automatically turns data into human-friendly prose. But, once again, NLG just can’t produce content on its own. It requires a structured data set (previous content) and pre-set templates, after which it spews out content within the selected template.
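To make the template idea concrete, here is a minimal sketch of how template-based NLG works: structured data in, prose out. The company, figures, and template wording are all invented for illustration; real NLG systems are far more elaborate, but the principle is the same.

```python
# Hypothetical pre-set template; the machine only fills in the slots.
TEMPLATE = (
    "{company} reported revenue of ${revenue}m for {quarter}, "
    "{direction} {change}% from the same quarter last year."
)

def generate_report(record):
    """Turn one row of structured data into a sentence via the template."""
    change = round(100 * (record["revenue"] - record["prior"]) / record["prior"], 1)
    return TEMPLATE.format(
        company=record["company"],
        revenue=record["revenue"],
        quarter=record["quarter"],
        direction="up" if change >= 0 else "down",
        change=abs(change),
    )

# Invented sample data for one company.
row = {"company": "Acme Corp", "revenue": 120, "prior": 100, "quarter": "Q3"}
print(generate_report(row))
# -> Acme Corp reported revenue of $120m for Q3, up 20.0% from the same quarter last year.
```

Note that the machine never “decides” what to say; the human-written template decides, and the software merely fills in the blanks.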
So the answer to the million-dollar question, “Will I (a content creator) be replaced by a machine?” is a firm “No.” At least not until machines are developed that can get “creative” like humans. Not until machines have a mind of their own and can develop content without any preset templates or previous iterations. Will there ever be such machines? The answer to that is linked to another question: “Will scientists ever be able to create a flesh-and-blood human being in their labs?” Same for the brain, then, mate. For all the fancy gobbledygook about AI and machines you read daily, remember that at heart it’s only about data and algorithms.
Here’s a YouTube video, courtesy of Synthesia, that talks about “synthetic media”.
So, should you use AI tech in your content efforts? What are the pros and cons?
The biggest positive is that you can use AI to handle the complexity of your content. For example, automation can create over 1,000 descriptions of the same product in a very short time, a task that is near impossible for humans. Or you might use a machine to perform a complex research task requiring simultaneous retrieval across hundreds of media sources. You save time and money.
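The thousand-descriptions claim above is less magical than it sounds: it is usually just one template crossed with every product variant. A tiny sketch, with hypothetical product data and wording:

```python
import itertools

# Hypothetical template for a product listing.
TEMPLATE = "The {name} in {color} ({size}) is now available for ${price:.2f}."

def describe_all(names, colors, sizes, price=19.99):
    """Cross every product variant with the template: thousands of
    descriptions in moments, a task impractical to do by hand."""
    return [
        TEMPLATE.format(name=n, color=c, size=s, price=price)
        for n, c, s in itertools.product(names, colors, sizes)
    ]

descriptions = describe_all(
    names=["T-shirt", "Hoodie"],
    colors=["red", "blue", "black"],
    sizes=["S", "M", "L", "XL"],
)
print(len(descriptions))  # 2 names x 3 colors x 4 sizes = 24 variants
```

Scale the three lists up and the same loop yields thousands of descriptions, which is exactly the kind of repetitive, data-heavy work the article says machines are good at.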
AI also helps in the personalization of content. At speed.
On the downside, a human is always in control, which means wrong information or data can make its way into the machine and force it to produce erroneous output.
Because it lacks emotion, a machine cannot tell sense from nonsense, or exercise common sense for that matter, which can become a problem in some types of content generation.
It also has no clue about human behavior, which is a hurdle in situations where a judgement call is required.
Customization is fine, but ML-generated content just can’t reach the depth of context and conversation of a human-written piece. For example, it can’t put forth an argument in a debate or take a stand on a contentious issue. It can, though, tell you the date the controversy started.
The story of AI in the world of content has only just begun. Nobody can say for sure how it will end. Yes, an educated guess can be made based on the milestones crossed so far. But one thing is for sure - AI in content is not going away.