Publishing 350 Blog Posts with ChatGPT Content

I spent last week spamming the internet with useless content, with the help of AI. There is no grand result to the experiment. I just wanted to see whether I could build a system that produces trash articles quickly and then publishes them. And that I did.

It was not that hard once I got the hang of it.

So I am writing this blog post so you don’t have to go through the experimentation phase, and you can jump straight to execution.

What did I start with?

Here are several things:

#1 OpenAI API: Make an account on OpenAI. The account is free. You will find the API keys under your profile section.

The API key costs money, so add a card. Don’t worry, it is not expensive. You can create a lot of content for a single dollar, and for this experiment, a dollar is enough.

#2 Python and PyCharm: We will be using a Python script to automate the content creation. So download Python and PyCharm. Both are free. Watch a video to see how to set them up.

#3 Excel or any other spreadsheet alternative.

#4 WordPress website: I made a dummy website on a subdomain. More on that later in the article.

#5 ChatGPT: I don’t know Python, so I used ChatGPT to write the script.

That’s it. Now, moving on.

The Process

#1 Generating the Python script

I asked ChatGPT to create a Python script that uses the OpenAI API to generate the articles. But it was not enough.

I wanted to automate the process and did not want to feed the keywords to the program by hand.

So I asked for a program that fetches the keywords from a column of an Excel sheet and then uses the OpenAI API to generate an article for each one.

It worked. But the issue was I had to feed it a small list of keywords at a time, because the code tended to break and I was losing the results.

The output was only written once every item in the list had been processed.

So I made another change in the script.

Now the output is saved on every iteration. So if the code breaks at keyword 15, we still have the output saved up to keyword 14.

That was the solution.
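On top of the per-iteration saves, a rerun could also skip keywords that already have an article, so a crash at keyword 15 doesn’t mean redoing 1–14. A small sketch of that check (the column names here are assumptions matching the sheet described below):

```python
import pandas as pd

def pending_keywords(df, keyword_col="keywords", article_col="Article"):
    """Return only the keywords that don't have a saved article yet."""
    if article_col not in df.columns:
        # First run: nothing generated yet, process everything
        return list(df[keyword_col])
    # Treat missing values and empty strings as "not done yet"
    mask = df[article_col].isna() | (df[article_col].astype(str).str.strip() == "")
    return list(df.loc[mask, keyword_col])
```

Loading the last saved output file and looping over `pending_keywords(df)` instead of the whole column would make the script resumable.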

After several tweaks and tuning, it turned into something like this:

import openai
import pandas as pd

openai.api_key = "ADD YOUR OPENAI API KEY"

def generate_article(prompt, model_engine):
    response = openai.Completion.create(
        engine=model_engine,
        prompt=prompt,
        max_tokens=4000,  # near text-davinci-003's 4097-token limit; lower this if requests fail
        n=1,
        stop=None,
        temperature=0.5,
    )

    article = response.choices[0].text
    return article.strip()

# ask for file path
file_path = input("Enter the file path: ")

# ask for column name
column_name = input("Enter the column name: ")

# set the model engine
model_engine = "text-davinci-003"

# read the Excel file
df = pd.read_excel(file_path)

# create a list to hold the articles
articles = []

# generate articles for each keyword in the specified column
for i, keyword in enumerate(df[column_name]):
    prompt = f"Compile top 10 book quotes from {keyword}"
    article = generate_article(prompt, model_engine)
    articles.append(article)
    df.loc[i, 'Article'] = article
    output_file_path = f"output_{i}.xlsx"
    df.to_excel(output_file_path, index=False)
    print(f"Article written for {keyword} saved to {output_file_path}")

# save the final output file
output_file_path = "output_final.xlsx"
df.to_excel(output_file_path, index=False)
print(f"All articles saved to {output_file_path}")

Add your OpenAI API key in place of ADD YOUR OPENAI API KEY. Don’t remove the quote marks – they’re part of the code.

The line model_engine = "text-davinci-003" sets the model. You can change the model engine as you wish.

Here is the list of all the models: https://platform.openai.com/docs/models/gpt-3-5

#2 Preparing the Sheet

You need to prepare the sheet to feed the script. The script will ask you for the file path of the sheet.

Copy File Path

Copy the file path and give it to the code when it asks for it. Remove the quotation marks from the pasted file path.
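As a small convenience (not in the original script), the quote-stripping can be done in code, since Windows’ “Copy as path” wraps the path in quotation marks:

```python
def clean_path(raw):
    """Strip surrounding whitespace and the quotes Windows adds to copied paths."""
    return raw.strip().strip('"')
```

With this, file_path = clean_path(input("Enter the file path: ")) accepts the path with or without the quotes.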

What goes into the Sheet?

Do the keyword research yourself and add the keywords. I asked ChatGPT for the most popular books and added them to column A of the sheet, naming the column keywords.

So when the script asks for the column name, I give it keywords.

In the code, there is a section: prompt = f"Compile top 10 book quotes from {keyword}"

As you can see, Compile top 10 book quotes from {keyword} is my prompt.

You can change it too. The prompt should match your keywords.

For example, if the sheet has the names of cricket players, it could be: Find the top 10 facts about {keyword}

Or anything you like. Think of your own use case.
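One way to sketch this (the names here are illustrative, not part of the original script) is to keep the prompt as a template string, so swapping use cases only means changing one line:

```python
# Change this one string to repurpose the whole script
PROMPT_TEMPLATE = "Compile top 10 book quotes from {keyword}"

def build_prompt(keyword, template=PROMPT_TEMPLATE):
    """Fill the keyword into the prompt template."""
    return template.format(keyword=keyword)
```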

#3 Running the Code

Here are several screenshots showing how the code runs.

[Screenshots 1–3: the Python script running]

So it begins.

#4 Spamming the Internet

Now the exciting part. Spamming the Internet.

I had already created the WordPress site on a subdomain of one of the domains I own. I am not buying a new domain name for this project.

Importing all the posts to WordPress

I installed the Content Excel Importer plugin. This plugin takes an Excel sheet and publishes the posts automatically.

But you have to modify the sheet a little bit.

The output sheet looks like this.

Output Sheet with 350 Posts

So I added two columns and used sheet formulas (basically CONCATENATE) to fill them:

Title: =CONCATENATE("Top 10 Best ", A2, " Quotes")

URL: =CONCATENATE(A2, " quotes")
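If you prefer to stay in Python, the same two columns can be added with pandas right after the generation loop (the column names are assumptions based on the sheet described above):

```python
import pandas as pd

def add_import_columns(df, keyword_col="keywords"):
    """Add the Title and URL columns the importer plugin expects."""
    df = df.copy()
    df["Title"] = "Top 10 Best " + df[keyword_col] + " Quotes"
    df["URL"] = df[keyword_col] + " quotes"
    return df
```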

The final sheet looks like this.

Turning the sheet importable

Then I imported it all at once.

Matching the Sheet Headers with the Fields

Match the fields with the Sheet headers. And upload.

Importing all the posts

Challenge

The challenging part is indexation. I don’t think Google is going to index the blog.

Still, I made the sitemap and submitted it to Search Console.

The latest status is this:

Index state

But I have plans to order some web 2.0 and comment links to see if that gets it indexed. Maybe I will do it.

Update: Everything Indexed

All the blog posts were indexed successfully. I used a Python script (made by ChatGPT) to automate the indexing via Google’s Indexing API.

from oauth2client.service_account import ServiceAccountCredentials
import httplib2
import json

# https://developers.google.com/search/apis/indexing-api/v3/prereqs#header_2
JSON_KEY_FILE = r"C:\Users\Admin\Desktop\credentials.json.json"
SCOPES = ["https://www.googleapis.com/auth/indexing"]

credentials = ServiceAccountCredentials.from_json_keyfile_name(JSON_KEY_FILE, scopes=SCOPES)
http = credentials.authorize(httplib2.Http())


def indexURL(urls, http):
    ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

    for u in urls:
        content = {}
        content['url'] = u.strip()
        content['type'] = "URL_UPDATED"
        json_ctn = json.dumps(content)

        response, content = http.request(ENDPOINT, method="POST", body=json_ctn)

        result = json.loads(content.decode())

        # For debug purpose only
        if ("error" in result):
            print("Error({} - {}): {}".format(result["error"]["code"], result["error"]["status"],
                                              result["error"]["message"]))
        else:
            print("urlNotificationMetadata.url: {}".format(result["urlNotificationMetadata"]["url"]))
            print("urlNotificationMetadata.latestUpdate.url: {}".format(
                result["urlNotificationMetadata"]["latestUpdate"]["url"]))
            print("urlNotificationMetadata.latestUpdate.type: {}".format(
                result["urlNotificationMetadata"]["latestUpdate"]["type"]))
            print("urlNotificationMetadata.latestUpdate.notifyTime: {}".format(
                result["urlNotificationMetadata"]["latestUpdate"]["notifyTime"]))


"""
List of URLs to be indexed
"""
urls = ['http://bookquotes.inpin.in/pere-goriot-by-honore-de-balzac-quotes/']

indexURL(urls, http)

JSON_KEY_FILE: There is a long process to get this file; the prerequisites link in the code comment documents it. Once you have the JSON file, paste its location into the code in the correct place.

Finally, add the URLs to the ‘List of URLs to be indexed’ section. Run the code. That’s it.
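Hardcoding URLs one by one gets tedious for 350 posts. A small helper (hypothetical, not in the original script) could read them from a text file, one URL per line:

```python
def load_urls(path):
    """Read URLs from a text file, one per line, skipping blank lines."""
    with open(path) as f:
        return [line.strip() for line in f if line.strip()]
```

Then indexURL(load_urls("urls.txt"), http) submits them all. Keep in mind the Indexing API has daily quota limits, so a large batch may need to be spread across days.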

What’s the point?

As I mentioned, the point was to use AI to spam the internet and work out the process. We can think of use cases later.

Maybe with good keyword research and sound prompting, we can guide OpenAI to generate good articles. I mean, surely there is a niche that can grow on this type of content.

Let’s see. Do let me know what you think about it in the comments. 🙂

Faizan Fahim

Faizan Fahim is a content marketing executive at Improving Pune. Here he writes about marketing, SEO, and writing. Learning new things every day, with an open mind. Join him on Twitter.
