GPT-3 and content writing - Express Writers

What Is GPT-3 and Will It Take Over Our Writing Jobs?


As a writer and the owner of a writing agency, I've watched artificial intelligence (AI) come after our jobs for a while now. Can the work be done cheaper? By a robot? Time and time again, the answer has been no. I even wrote a two-part series a couple of years ago for Content Marketing Institute debunking the myth that robots can write content and replace a real, breathing human writer.

Today, technology is growing at a faster pace than ever. From manufacturing to rote office tasks, AI is replacing human workers in practically every industry. Anyone doing rote, repetitive work is quickly being replaced by a machine. In fact, AI expert Kai-Fu Lee believes as many as 40 percent of the world's jobs will one day be automated.

So, as of 2020, could content writing be among them? With the launch of GPT-3 in May 2020, many people think it's possible. Before you panic, however, keep reading. I'm taking a closer look at what GPT-3 is, how it's causing us to rethink content, and why artificial intelligence will never replace certain activities – particularly those that rely on human creativity. Ready? Let's dive in.

What Is GPT-3?

GPT stands for "Generative Pre-trained Transformer." It's a language AI created by San Francisco-based tech company OpenAI – backed by, yes, Elon Musk. GPT-3, the third edition of GPT, rolled out in May 2020. In July 2020, OpenAI announced a waitlist for access, inviting entrepreneurs, academic researchers, and members of the general public to sign up as beta testers.

Transformer-based AI language generators have been around for a few years, first appearing in 2017. OpenAI's original GPT appeared in 2018, and its upgrade, GPT-2, was released in February 2019.
Compared to the original GPT, GPT-2 was a sleek and sophisticated model, with some 1.5 billion parameters for constructing text. At its initial release, OpenAI considered the full model "probably too dangerous to release," though it later said it had "seen no strong evidence of misuse so far." GPT-3 is two orders of magnitude more powerful than GPT-2, with 175 billion parameters. It's better equipped than any other AI model out there to spin out realistic, convincing text.

And it does. Check out this pop song about Harry Potter, written in the style of Taylor Swift: Arram Sabeti used GPT-3 to produce numerous pieces of content, including instructional manuals, interviews, and pop songs.

How GPT-3 Works

The GPT-n series uses what's called an autoregressive language model, built on the same deep learning mechanisms used in natural language processing (NLP). (If any of those words sound familiar, it's because I've talked about them briefly before. Google now deploys NLP to better understand search intent – that's what BERT was all about.)

Let's break down what all of that means. An autoregressive language model attempts to identify the best next word in a string of words, based on the words that come before it. For a very simple example, consider the first line of the Taylor Swift song above. In English, sentences follow a certain structure: usually something like subject – verb – object. If we start with a subject – the most obvious choice being "Harry" – then we know the next word needs to be a verb.

Sure, we could shove any verb in there. That's not how our brains actually work when we're writing. In autoregressive modeling, however, the algorithm only considers the word(s) that came before. Right now, that's our starting point. So, Harry. Harry what? Harry's got. Got what? Glasses. That "Harry's got glasses" ended up as the first line is partly chance.
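That word-by-word loop can be sketched in a few lines of code. This is only a toy illustration of the autoregressive idea – a made-up bigram table where each next word is sampled based solely on the previous word – and bears no resemblance to GPT-3's actual 175-billion-parameter transformer; the words and probabilities are invented for the example.

```python
import random

# Toy bigram table: for each word, the possible next words.
# (Invented for illustration -- not real training data.)
bigrams = {
    "Harry's": ["got"],
    "got": ["glasses", "a", "books"],
    "a": ["scar"],
}

def generate(start, max_words=4, seed=None):
    """Autoregressively extend `start`, one sampled word at a time."""
    rng = random.Random(seed)
    words = [start]
    while len(words) < max_words:
        choices = bigrams.get(words[-1])
        if not choices:            # no known continuation: stop early
            break
        words.append(rng.choice(choices))  # model only sees the last word
    return " ".join(words)

print(generate("Harry's", seed=0))
```

Because the continuation is sampled, running this with different seeds can yield "Harry's got glasses," "Harry's got books," or "Harry's got a scar" – which is exactly the randomness described above.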
It could just as easily have spit out "Harry's got a scar" or "Harry's got books." The model was clearly given more data about Harry and his physical features, and as a result, the grammar and context are both correct. The lyrics feel genuine.

That's a major step up from what AI could do just three years ago, when I tested out another model. Given the subject (content marketing) and some basic data about it, the AI returned a brilliant gem of utter nonsense. The state of AI in 2020 is a whole other level compared to what it was in 2017. (Source: CMI.) In 2017, it was much easier to laugh at the idea of AI taking over things like news stories, wiki articles, and – yes – web content. But if you're looking at that Taylor Swift song and getting an uneasy feeling in your stomach, you're not alone.

Why Some Fear GPT-3 Will Make Writers Obsolete

GPT-3 was specifically built to create realistic, convincing text – and it does exactly that. In fact, some people have already attempted to use it for content writing. Will that make writers in this industry obsolete? Here's why some people think so:

Early creations are convincing. Early research around GPT-3 has shown it can – and does – generate news articles that human readers have difficulty distinguishing from the real thing. That makes it one of the most important (if not frightening) advances in AI yet.

GPT-3 can seemingly predict the future. Want to predict the future? Tell GPT-3 what's going on and ask what happens next. Researchers did exactly that, updating the AI about the 2020 pandemic. It predicted which industries would be most affected and what would happen to the world economy as a result.

But really: if you're a writer with any skill whatsoever, AI is not coming to take away your job. Here's why.

5 Reasons Why …