Azure and ChatGPT – Using PowerShell for OpenAI APIs

It’s no secret that I’m a big, pro-subscription-paying fan of ChatGPT (AppLocker in Intune or: How I Learned to Love the (ChatGPT) Bot and Start Worrying). As much as I’m eager to see where this tech goes, I’m also not blind to the fact that trusting any service so… blindly… is dumb. Especially considering how coy OpenAI has been about their data collection.

So the first part is going to be about privacy concerns. Followed by some Azure stuff.

Note: The first part of this blog post is an opinion piece about how I’ve seen ChatGPT progress. If you’re more interested in the nitty-gritty technical details, then skip ahead to the PowerShell and ChatGPT API section below.

It’s true that you can delete your conversation history, yet its terms and conditions mysteriously neglect to mention whether your precious data remains in the clutches of OpenAI. Surely, the keepers of this technological marvel wouldn’t keep our secrets… or would they?

On the 31st of March, Italy (the land of pizza, pasta, and an absurdly persistent, T-1000-like determination for collecting parking tickets years later) unleashed its wrath upon the unsuspecting ChatGPT (ChatGPT banned in Italy over privacy concerns – BBC News). The disruptive artificial intelligence, Italy claimed, had allegedly dared to defy the mighty GDPR, the European Union’s legendary protector of data privacy. Much like a modern-day Pinocchio, it had been caught being naughty, and so, with a heavy heart, Italy banished the wayward AI. But it seems that ChatGPT could come back to Italy by end of April – POLITICO

I’m not going to get into my own personal views of the matter, but I will say this much: I’m more concerned about data harvesting and privacy (i.e. GDPR) violations than ChatGPT becoming a humanity-destroying Skynet.

Even Microsoft, which is betting big on OpenAI by throwing literal billions at them, has warned its own employees not to share sensitive data (Microsoft Warns Employees Not to Share Sensitive Data With ChatGPT (businessinsider.com)). Imagine that! The creators of Clippy cautioning their own people against divulging company secrets. One wonders why Microsoft is that paranoid about a product that they most likely have a huge amount of influence over. They might not own OpenAI (just like VENZO doesn’t own me), but the fat stacks of cash definitely buy some sway (just like how I like my job at VENZO and my managers obviously hold a little sway over me. Right, Rolf?).

Remember the bug that caused ChatGPT’s chat histories to be shared among users? Pepperidge farm remembers (ChatGPT bug temporarily exposes AI chat histories to other users – The Verge).

ChatGPT is technically still in beta, so bugs are to be expected. But hot damn, imagine, if you will, your doctor seeking the wisdom of ChatGPT to diagnose their patient (that’s you). You’d be livid! And rightly so! Now, take that righteous fury and apply it to your company’s sensitive information. Imagine the uproar if you, in your infinite wisdom, accidentally fed ChatGPT the details of your company’s top-secret projects. Your employer would be apoplectic, and who could blame them?

Personally, I think Microsoft made a great decision banking on AI, which seems to have completely blindsided major players like Google – a company that for years was rumored to have an endgame-level AI in use internally. Then, suspiciously soon after ChatGPT entered the ring like a roid-raging WWE wrestler on super bath salts, Bard launched, and suddenly I stopped putting so much credence into said rumors. Maybe they have something else up their sleeve, but Google has steadily been slipping during Pichai’s reign – having moved from a tech company using ad revenue and data collection as a means to an end, to an ad revenue and data collector using tech.

But this post is hardly about Google. Nor the main focal point of this entire blog. So… that said…

Hey! Did you know that ChatGPT can also be used for Azure stuff??

Azure OpenAI Service

On March 9th, 2023, Microsoft announced that ChatGPT is now available in the Azure OpenAI Service. Over 1,000 Microsoft customers are already developing their own AI applications and services with models like GPT-3.5, Codex, and DALL-E 2, which are available on the Azure OpenAI Service today. Microsoft also offers support to their customers on practical use cases, implementation, custom requirements, security, compliance, and the responsible use of AI: https://www.microsoft.com/en-us/ai/responsible-ai?activetab=pivot1:primaryr6

We can request access to the Azure OpenAI Service, and once approved, we can use ChatGPT for our own purposes. This move is part of Microsoft’s focus on integrating AI into all their products and services, using the cloud technology of Microsoft Azure.

And you can sign up for the preview right here! https://oai.azure.com

But alas, my subscription is not enabled to deploy Azure OpenAI services 😔

But maybe they’re fans of my blog (hahaha) and will let me sign up for the preview anyway!

Note: Mere minutes after I uploaded this post, I got the email saying that I’ve been accepted, so I guess I’ve run out of excuses and will have to follow up with a new post 😅

OK… so how else can we use ChatGPT with Azure?

Since I can’t do a write-up on the Azure OpenAI Service, I figured the next best thing was to figure out what I can do with ChatGPT right now! And there’s a bunch of things you can already use it for in Azure, via the API Management service, or even starting off simpler with a Logic App. A web-based chatbot for your website, maybe?

I’m still learning what the possibilities are – so let’s start where I started: making basic API calls with PowerShell and seeing how far I get. Why PowerShell even though Azure supports Python? Well, because PowerShell is the scripting language I’m strongest in, plus I try to stay as Microsoft-native as I can ¯\_(ツ)_/¯

PowerShell and ChatGPT API

This bit walks you through the process of obtaining the API key for your ChatGPT account and using a PowerShell script to send it questions and read back the answers. Let’s dive in!

If you’re not familiar with APIs then check out my post about Using Graph to Manage Intune Devices. Don’t worry if you’re not interested in MS Graph – it’s more for the introduction where I go into what APIs are and a basic overview of them.

But as a basic refresher, an API is like a waiter at a restaurant. It carries your order (the request, e.g. a POST) to the kitchen (the server) and brings back the dish (the data) you’ve asked for. In this blog, we’ll be talking specifically about how APIs serve your order with ChatGPT.

But why? Why, you ask, would we want to go through the hassle? What’s wrong with just using the web GUI? Because of all the cool stuff you can do with it! Customize that baby! Automate stuff! Visualize data! And more!

Let’s gooooooo!

Step 1 Get Your ChatGPT API Key

First things first, you’ll need your API key to interact with your ChatGPT account. Here’s how to grab it:

  • Log in to your ChatGPT account and head over to the account settings or developer dashboard. https://platform.openai.com/account/api-keys

  • After login, click on ‘Personal‘ on the top-right side and then click on ‘View API keys‘ as seen in the below image.
  • Click the ‘Create new secret key’ button. A secret key will be generated – copy it and save it in Notepad or anywhere else for now, because it is required in the upcoming steps.
  • Copy the API key and save it in a secure location, like a digital vault guarded by Cerberus (not to be confused with Kerberos. Sorry, just a bit of nerd humor). One slightly less mythological way of stashing it is sketched right after this list.
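Hardcoding the key straight into a script is fine for a quick test, but you probably don’t want it sitting around in plain text forever. Here’s a minimal sketch of one alternative – the OPENAI_API_KEY variable name is just a convention I’m using for this example, not something the API itself requires:

# One-time setup: store the key as a user-level environment variable
[Environment]::SetEnvironmentVariable('OPENAI_API_KEY', 'sk-your-key-here', 'User')

# In your scripts (in a new PowerShell session), read it back instead of hardcoding it
$apiKey = $env:OPENAI_API_KEY
if (-not $apiKey) {
    Write-Host 'No API key found - set the OPENAI_API_KEY environment variable first.'
}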

Step 2 Using your API key

Now that you have your API key, you’re ready to dig into the PowerShell script I hacked together to connect to the ChatGPT API and ask questions. Replace [YOUR_API_KEY] with the key you grabbed in Step 1; the API URL is the endpoint our requests get sent to.

# Set the API URL and API key
$apiUrl = "https://api.openai.com/v1/completions"
$apiKey = "[YOUR_API_KEY]"

What’s a ChatGPT model, you ask? Born from the mystical realm of artificial intelligence, a ChatGPT model is a specific version of the OpenAI GPT (Generative Pre-trained Transformer) family. GPT models come in various flavors, with different sizes and capabilities. Each has its unique strengths and weaknesses, from smaller models like “text-ada” and “text-curie” to the largest, “text-davinci-003”, with its 175 billion parameters. It’s like the multicellular organism that will one day be developed into Skynet. And I, for one, welcome our new AI overlords – in the hope that they one day crawl this blog post and see that I’m one of the “good ones”.

Step 3 Building the request

I’m using the text-davinci-003 model. Just to clarify, “text-davinci-003” is one of the GPT-3 models offered by OpenAI. It’s known for being one of the largest and most advanced models in the GPT-3 family, with 175 billion parameters (more than the number of stars in our galaxy, probably).

And like I wrote, there are a bunch of different models you can try using. So check them out here: Models – OpenAI API. But to keep things short, sweet, and simple, I won’t be going into detail regarding other models.
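If you’d rather poke at the model list from PowerShell instead of the docs, the API also exposes a models endpoint you can query with the same key. A quick sketch, assuming $apiKey is already set like in the snippet further up:

# List the models your key has access to (GET https://api.openai.com/v1/models)
$models = Invoke-RestMethod -Uri "https://api.openai.com/v1/models" -Method Get -Headers @{
    "Authorization" = "Bearer $apiKey"
}

# Print just the model IDs, sorted alphabetically
$models.data.id | Sort-Object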

The following code sets the stage for connecting to the ChatGPT API, preparing your question, and converting the request into a format the API can understand. With this script, we embark on a journey to seek answers from our future AI overlord!

Here is some code ready for you to copy and paste:

# Set the prompt text
$question = "Who is your Daddy and what does he do?"
$question

# Build the request body
$body = @{
    "model" = "text-davinci-003"
    prompt  = $question
}

# Convert the request body to JSON
$jsonBody = $body | ConvertTo-Json

So what does the above mean? Let’s break it down:

  • $question = “…”: This line is where we define our query. In this case, we’re asking who ChatGPT’s Daddy is and what he does.
  • $body = @{…}: This block of code creates a hashtable that forms the request body for the API call. Within the curly braces, we define the model we wish to use (in this case, the mighty “text-davinci-003”) and the prompt, which is the question you asked earlier.
  • “model” = “text-davinci-003”: The model we’re using. Replace this with another model if you want but make sure that the appropriate API URL corresponds to it.
  • prompt = $question: In this line, we set the ‘prompt’ key in the hashtable to the value of $question, which is the question entered.
  • $jsonBody = $body | ConvertTo-Json: This line takes the hashtable we created earlier and converts it into a JSON object using the ConvertTo-Json cmdlet. The JSON object, now stored in $jsonBody, is the format required when making the API call to ChatGPT (you can see roughly what it ends up looking like just below).
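For the curious, if you print $jsonBody after running the snippet above, the converted request body looks roughly like this (spacing and key order can vary a bit between PowerShell versions):

$jsonBody

# Output (roughly):
# {
#     "model": "text-davinci-003",
#     "prompt": "Who is your Daddy and what does he do?"
# }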

Step 4 Sending the request

We’ve got our authentication in place and the request we want to make – now on to the final part of our API project: sending the request.

# Send the request to the API
try {
    $response = Invoke-RestMethod -Uri $apiUrl -Method Post -Headers @{
        "Content-Type"  = "application/json"
        "Authorization" = "Bearer $apiKey"
    } -Body $jsonBody -ErrorAction Stop

    # Get the response text
    $responseText = $response.choices[0].text.Trim()

    # Print the response text
    Write-Host $responseText
} catch {
    Write-Host $_.Exception.Message
}

So this is what it does:

  • $response = Invoke-RestMethod…: In this line, our hero invokes the mighty ‘Invoke-RestMethod’ cmdlet, which sends a POST request to the ChatGPT API using the provided URI, headers, and JSON body. The API’s response is captured and stored in the $response variable.

  • -Headers @{…}: Within the ‘Invoke-RestMethod’, we set the headers for our API request, specifying the “Content-Type” as “application/json” and the “Authorization” as “Bearer $apiKey” (where $apiKey is your sacred API key).

  • -Body $jsonBody -ErrorAction Stop: Here, we provide the JSON body we prepared earlier, and set the ‘ErrorAction’ to ‘Stop’, ensuring that any errors encountered will halt the script’s execution and trigger the ‘catch’ block.

  • $responseText = $response.choices[0].text.Trim(): After successfully receiving the API’s response, this line extracts the text from the first choice (as there could be multiple responses) and trims any excess whitespace, storing the result in $responseText.

  • Write-Host $responseText: With the wisdom of ChatGPT now in hand, this line displays the response text on the console, allowing you to marvel at the AI’s knowledge.

When the response comes back, the script prints it to the console. It’s really that easy! And if there’s an error with the API request, the script will catch it and let you know what went wrong.
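To give you an idea of what the script is actually digging the text out of, the raw completions response is a JSON object along these lines – the values below are made up for illustration:

# Inspect the full response object instead of just the extracted text
$response | ConvertTo-Json -Depth 5

# Output (roughly - values are illustrative):
# {
#     "id": "cmpl-...",
#     "object": "text_completion",
#     "model": "text-davinci-003",
#     "choices": [
#         {
#             "text": "\n\nI do not have a Daddy...",
#             "index": 0,
#             "finish_reason": "stop"
#         }
#     ],
#     "usage": {
#         "prompt_tokens": 10,
#         "completion_tokens": 20,
#         "total_tokens": 30
#     }
# }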

But yeah… the output I listed above. Yeah, I know I’m not great at structuring blog posts. Maybe I should’ve gotten ChatGPT to proofread it first…

Step 5 Putting the whole script together

After the previous step (Sending the request) you’ll be ready to run the script and you should see the below output (though keep in mind that ChatGPT will probably have a different response).

# Set the API URL and API key
$apiUrl = "https://api.openai.com/v1/completions"
$apiKey = "[YOUR_API_KEY]"

# Set the prompt text
$question = "Who is your Daddy and what does he do?"
$question

# Build the request body
$body = @{
    "model" = "text-davinci-003"
    prompt  = $question
}

# Convert the request body to JSON
$jsonBody = $body | ConvertTo-Json

# Send the request to the API
try {
    $response = Invoke-RestMethod -Uri $apiUrl -Method Post -Headers @{
        "Content-Type"  = "application/json"
        "Authorization" = "Bearer $apiKey"
    } -Body $jsonBody -ErrorAction Stop

    # Get the response text
    $responseText = $response.choices[0].text.Trim()

    # Print the response text
    Write-Host $responseText
} catch {
    Write-Host $_.Exception.Message
}

And the output! Ignore the path… I was lazy and ended up throwing the script in the wrong dir.

Step 6 Having a proper conversation

But that’s a hardcoded question. Why would I do such a thing? To keep things simple and easier to follow. And if you’ve gotten this far, then you should understand the structure and logic, and we can get a little fancier. Let’s throw a Read-Host prompt in so you don’t always have to dig into the code and change the variable.

# Set the prompt text
$question = Read-Host "Ask a question"

# Build the request body
$body = @{
    "model" = "text-davinci-003"
    prompt  = $question
}

# Convert the request body to JSON
$jsonBody = $body | ConvertTo-Json

And you’ll be able to interact with your API script. The change: $question = Read-Host “Ask a question”. This line prompts you to enter a question by displaying “Ask a question” on the console. Once you input a question, it is stored in the variable $question.

But what a one-sided conversation! Let’s turn up the fanciness just a bit more and include some code for a back-and-forth. We’ll accomplish that by throwing in a cheeky while loop:

# Set the API URL and API key
$apiUrl = "https://api.openai.com/v1/completions"
$apiKey = "[YOUR_API_KEY]"

while ($true) {
    # Set the prompt text
    $question = Read-Host "Ask a question (or 'q' to quit)"

    if ($question -eq 'q') {
        break
    }

    # Build the request body
    $body = @{
        "model" = "text-davinci-003"
        prompt  = $question
        n = 2 # Number of responses the API should return; they get concatenated below
    }

    # Convert the request body to JSON
    $jsonBody = $body | ConvertTo-Json

    # Send the request to the API
    try {
        $response = Invoke-RestMethod -Uri $apiUrl -Method Post -Headers @{
            "Content-Type"  = "application/json"
            "Authorization" = "Bearer $apiKey"
        } -Body $jsonBody -ErrorAction Stop

        # Get the response text and concatenate the responses
        $responseText = ""
        foreach ($choice in $response.choices) {
            $responseText += $choice.text.Trim() + ' '
        }

        # Print the response text
        Write-Host $responseText
    } catch {
        Write-Host $_.Exception.Message
    }
}

So what did we do? Essentially, the previous code snippet asks a single, fixed question, while the second snippet allows for continuous user input and can generate longer answers by concatenating multiple responses. Specifically:

  • The  $question variable is set using the Read-Host cmdlet inside a while loop. This allows you to input a question each time the loop iterates, enabling a continuous Q&A session with the API.
  • The loop continues until you enter ‘q’, which breaks the loop and terminates the script.
  • The n parameter is added to the $body hashtable, which indicates the number of responses the API should return. In this case, n is set to 2. The script concatenates the responses, which can help generate longer answers (if you’d rather control length and creativity directly, have a look at the sketch just below).
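As a side note, the completions API also accepts parameters like max_tokens (an upper limit on how long each response can be) and temperature (roughly, how adventurous the output is). Here’s a sketch of the request body with those added – the values are just examples, not recommendations:

# Build the request body with optional tuning parameters
$body = @{
    "model"     = "text-davinci-003"
    prompt      = $question
    max_tokens  = 256   # Cap the length of each response (in tokens)
    temperature = 0.7   # 0 = very deterministic, higher = more creative/random
}

# Convert the request body to JSON, same as before
$jsonBody = $body | ConvertTo-Json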

And that’s it.

Wrapping things up

You got it up and running! That’s awesome! Well done, you!

But now what?

Remember in the beginning when I gave some examples of why you’d want to take this first step? Let’s go into a bit more detail before you take off.

Automation: With a script, you can automate tasks that require repetitive queries or information retrieval from ChatGPT. For instance, consider the task of generating summaries for a collection of news articles or research papers. Instead of manually copying and pasting text into the web GUI each time, you could use a script to batch process these documents and automatically generate concise summaries, saving you precious time and effort.
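To make that concrete, here’s a rough sketch of what such a batch job could look like. The folder path, file filter, and summarization prompt are all made-up examples – the plumbing is the same Invoke-RestMethod call we built earlier:

# Hypothetical example: summarize every .txt file in a folder
$apiUrl = "https://api.openai.com/v1/completions"
$apiKey = "[YOUR_API_KEY]"

Get-ChildItem -Path "C:\Temp\Articles" -Filter *.txt | ForEach-Object {
    $article = Get-Content -Path $_.FullName -Raw

    $body = @{
        "model"    = "text-davinci-003"
        prompt     = "Summarize the following article in three sentences:`n`n$article"
        max_tokens = 150
    }

    $response = Invoke-RestMethod -Uri $apiUrl -Method Post -Headers @{
        "Content-Type"  = "application/json"
        "Authorization" = "Bearer $apiKey"
    } -Body ($body | ConvertTo-Json) -ErrorAction Stop

    # Write the summary next to the original file
    $response.choices[0].text.Trim() | Set-Content -Path ($_.FullName -replace '\.txt$', '.summary.txt')
}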

Customization: A script allows you to fine-tune the way you interact with the ChatGPT API. By adjusting parameters like the number of tokens, temperature, or response length, you can tailor the output to your specific needs. For example, a marketer working on an email campaign might use a script to generate multiple, diverse subject lines by tweaking the temperature parameter, allowing them to choose the most engaging option for their audience.
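And as a toy illustration of that marketer scenario (the prompt and values are invented, and $apiUrl/$apiKey are assumed to be set as earlier), you could ask for several variants in one call and print each candidate separately:

# Hypothetical example: generate five subject line variants at a higher temperature
$body = @{
    "model"     = "text-davinci-003"
    prompt      = "Write a catchy email subject line for a spring sale on office chairs."
    n           = 5     # Ask for five separate completions
    temperature = 1.2   # Higher temperature = more variety between them
    max_tokens  = 20
}

$response = Invoke-RestMethod -Uri $apiUrl -Method Post -Headers @{
    "Content-Type"  = "application/json"
    "Authorization" = "Bearer $apiKey"
} -Body ($body | ConvertTo-Json) -ErrorAction Stop

# Print each candidate on its own line
$response.choices | ForEach-Object { Write-Host $_.text.Trim() }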

Data Analysis and Visualization: By using a script to fetch data from the API, you can process or visualize it using additional tools or libraries in your programming environment. For instance, a data analyst might use a script to compare the sentiment of responses generated by different GPT models and visualize the results in a heatmap. This can help uncover patterns, biases, or trends in the AI-generated content, enabling you to make more informed decisions about which model to use for specific tasks.

In summary, using a script to access the ChatGPT API opens a world of possibilities. Don’t be limited by the web GUI – test its limits. Push its boundaries. And see what sort of cool ideas for projects you can come up with!

Just keep in mind that by using the ChatGPT API, you are billed based on the number of tokens you use. Tokens are the basic units of text, and both input and output tokens count toward your usage. For example, if you send a 10-token prompt and receive a 20-token response, you’ll be billed for 30 tokens. Don’t worry too much about it though, as the pricing most likely won’t bankrupt you: Pricing (openai.com).
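Handily, the API reports the token count on every call, so you don’t have to guess. A small sketch of keeping an eye on it, assuming $response holds a completions response from one of the scripts above:

# The completions response includes a usage object with the token counts
$usage = $response.usage
Write-Host ("Prompt: {0} tokens, completion: {1} tokens, total billed: {2} tokens" -f $usage.prompt_tokens, $usage.completion_tokens, $usage.total_tokens)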
