In 2023, I watched a YouTube video that bent my life in a new direction.

I was learning more about screenplay storytelling when I stumbled onto a video of an indie filmmaker talking about using ChatGPT to write a script.

He said he had just finished writing the first draft of a script. A first draft would typically take him six weeks working with a co-writer, but with ChatGPT he was able to crank out a beat sheet and dialogue in a day. A day!?

I was able to find the video!

Once I heard him say this, I had to learn more about generative AI and how it could aid the creative process. That curiosity launched me into a 6-month deep dive into machine learning, Python, and gen AI.

Here are a handful of takeaways from my 6-month deep dive, with an emphasis on how I think it will affect online learning:

Generative AI is better at teaching a concept than a person.

What shocked me, even with ChatGPT-3.5, was the clarity with which it explained a concept. The LLM produced explanations better than most websites. And unlike a person, it didn't need rewrites or an editor. Plus, I could ask it to explain the concept in different ways, such as explaining it to me as if I were in 5th grade. I could ask it for examples. Even better, I could ask it for examples from a specific industry.

My first experience of gen AI explaining concepts better than the teacher was during an intro Python course on Coursera. Earlier in the course, I was getting confused by the teacher's exemplar. It just didn't seem clear. I literally copied and pasted the question into ChatGPT, and not only did it spit out the correct answer, but the code comments were so much clearer than the instructor's that I was actually able to understand the coding concept. Then and there I knew teaching was going to change. I went on to use ChatGPT to help explain almost every exercise.
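To give a sense of what that looked like, here's a made-up exercise in the same spirit (not the actual Coursera question), with the kind of comment-heavy explanation ChatGPT tends to produce:

```python
# Hypothetical exercise: sum only the even numbers in a list.
# The comments walk through the reasoning the way ChatGPT's answers did.

def sum_even_numbers(numbers):
    total = 0                     # start the running total at zero
    for n in numbers:             # look at each number in the list, one at a time
        if n % 2 == 0:            # the % operator gives the remainder; 0 means "even"
            total += n            # only even numbers get added to the total
    return total                  # hand the final sum back to the caller

print(sum_even_numbers([1, 2, 3, 4, 5, 6]))  # prints 12 (2 + 4 + 6)
```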

Gen AI is neither software nor a person. It's something in between.

The concept of "something in between" is difficult for people and companies to grasp. Gen AI isn't traditional software: it often makes mistakes. On the other hand, it also isn't a person: it can't understand context the way a person does. Gen AI lands somewhere in between.

When working with gen AI, you can speak to it like a person and it will, for the most part, understand. However, you'll have to coach it and massage its outputs to get the result you want. This is a skill that only comes with a lot of time working with the LLM: at least 10 hours, ideally over 100.
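As a concrete (if simplified) illustration of that coaching loop, here's a minimal sketch assuming the OpenAI Python SDK; the model name, prompts, and feedback are placeholders, not a recipe:

```python
# A minimal sketch of "coaching" an LLM: keep its draft in the conversation,
# then give feedback and ask for a revision. Placeholder model and prompts.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

messages = [
    {"role": "system", "content": "You are a curriculum writer for an intro Python course."},
    {"role": "user", "content": "Draft a one-paragraph lesson intro on Python loops."},
]

first = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
draft = first.choices[0].message.content

# Coach the model: add its draft back into the conversation, then ask for changes.
messages.append({"role": "assistant", "content": draft})
messages.append({"role": "user", "content": "Good start. Make it shorter, friendlier, and add one concrete example."})

revised = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(revised.choices[0].message.content)
```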

Gen AI is a tool, rather than a complete replacement.

In learning, we won't be able to use LLMs to craft an entire curriculum. Currently, what is called the LLM's "context window" is too limited. For example, we can use it as a tool to draft syllabi (similar to the beat sheet in the video above) and specific sections, such as individual lessons within a curriculum. Beyond that, the LLM likely won't create curriculum specific enough for what we need.

However, keep in mind that one of the reasons LLMs are limited in their ability to craft more complex items, such as in-depth curriculum for a six-week course, is the limited context window. Google just released an advanced version of Gemini that can hold around 750,000 words in context, which is a total game-changer.
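To put that number in perspective, here's a rough back-of-the-envelope sketch; the words-per-token ratio is a common heuristic for English prose, not an exact figure:

```python
# Rough check: does a curriculum draft fit in a model's context window?
# Assumes the common ~0.75 words-per-token rule of thumb (an approximation).

WORDS_PER_TOKEN = 0.75  # rough heuristic; varies by text and tokenizer

def fits_in_context(word_count: int, context_tokens: int) -> bool:
    """Return True if roughly word_count words fit within context_tokens tokens."""
    return word_count <= context_tokens * WORDS_PER_TOKEN

# A 1,000,000-token window holds roughly 750,000 words of prose.
print(fits_in_context(word_count=50_000, context_tokens=1_000_000))   # True
print(fits_in_context(word_count=900_000, context_tokens=1_000_000))  # False
```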

Very soon the general public may be able to give gen AI long, detailed instructions so that the output is much more specific to what we want. Additionally, gen AI will soon be able to create images and diagrams to supplement the curriculum, which will be enormously helpful from a learning perspective (and crazy to consider the cost savings).

One writer will be able to do the work of three writers and an editor.

One writer who can prompt well is able to do the work of several writers.

Now, that doesn't mean we should use LLMs for all our writing. For me, writing is a process of understanding: getting the jumbled-up thoughts in my head down on paper. And after writing, I can explain the ideas better verbally.

While working with an LLM does help me understand a topic better, particularly how to explain a topic concisely, it doesn't help me organize my own thoughts like writing does.

If we still want people who can explain a topic well in a classroom, a training, or even through a video platform like TikTok, the process of writing is still necessary. Clear writing = clear thinking.

A writer who knows how to work well with gen AI won't need an editor.

I've worked with editors for a few years, and I think if you prompt the LLM correctly, it can come very close to replacing an editor. And I think the software and pre-training for this task will get there in the next couple of years.

We will still need someone to ensure that the writing adheres to the style guide and is factually accurate. However, can't a diligent writer do that? Additionally, the better a writer can prompt an LLM, the closer they will be able to get the writing to the style guide.

Additionally, a larger context window means you can give the LLM more information, which may help it better understand the role it needs to play and the end result you would like to achieve with the writing.
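For example, here's a minimal sketch of what packing a style guide and a draft into a single editing prompt could look like; the rules and draft text are invented for illustration:

```python
# A sketch of an LLM editing pass: combine a style guide and a draft into one
# prompt. The style-guide rules and draft below are made up for illustration.

style_guide = """
- Use active voice.
- Spell out numbers under ten.
- Keep sentences under 25 words.
"""

draft = "The curriculum was written by 3 writers and it was reviewed over several long meetings."

prompt = (
    "You are acting as a copy editor.\n"
    "Edit the draft so it follows every rule in the style guide below.\n"
    "Flag any factual claims you cannot verify instead of rewriting them.\n\n"
    f"STYLE GUIDE:\n{style_guide}\nDRAFT:\n{draft}"
)

# The prompt string can then be sent to whichever LLM you're working with.
print(prompt)
```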

The future of the writer role will certainly change.

When the more advanced abilities of gen AI were released, many people were saying that writers would eventually become prompt engineers. Even if LLM technology advances significantly, I don't think that's entirely true.

It's difficult for LLMs to fully understand the context of what you're trying to write. They're good at bland, unoriginal content. Yes, the writing can be prompted to be more creative, but it often misses the mark on tone and purpose. Sometimes the output is fun and better than you imagined. More often, though, I've been frustrated with it.

So, I think the future role of the writer will be more of a coach and editor. You'll need to know how to communicate with the LLM, like a coach, to rework the writing.

Most importantly, and I think the largest takeaway for me, is that writers will need to keep their "noticing" ability. They still need to be able to tell good writing from bad, and to notice when the LLM has created the content they were originally looking for. At least for now, we need people with the skill set to do the noticing.
