Python: implement Chat history, and refactor prompt config/prompt config template (#5023)

### Motivation and Context

This PR addresses a few of our remaining urgent work items: #4856,
#4630.

<!-- Thank you for your contribution to the semantic-kernel repo!
Please help reviewers and future users, providing the following
information:
  1. Why is this change required?
  2. What problem does it solve?
  3. What scenario does it contribute to?
  4. If it fixes an open issue, please link to the issue here.
-->

### Description

In this PR:
- The prompt template and prompt template config are refactored to be similar
to their .NET counterparts.
- The KernelPromptTemplate is introduced as the default prompt template (it
will need to become more dynamic in the future once other prompt templates,
such as Handlebars or Jinja2, are supported).
- The methods related to `create_semantic_function` were removed, and a
`create_function_from_prompt` method was added in their place.
- ChatHistory was introduced to replace ChatMessage; a ChatHistory instance is
now passed into the complete chat/text methods (see the sketch after this list).
- As of these latest changes, all unit and integration tests are
passing.
- New kernel examples were added to exercise this new code.
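
For reference, here is a minimal end-to-end sketch of how these pieces fit together, stitched from the snippets in this diff. It is not code taken from the PR itself: the `ChatHistory` import path, the `create_function_from_prompt` parameter names, the `OpenAIChatPromptExecutionSettings` class, and the `get_service`/`complete_chat` calls are assumptions and may differ slightly from the final API.

```python
import asyncio

import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import (
    OpenAIChatCompletion,
    OpenAIChatPromptExecutionSettings,  # assumed class name
)
from semantic_kernel.contents import ChatHistory  # assumed import path


async def main() -> None:
    kernel = sk.Kernel()

    # Register a chat service with the new kernel.add_service API
    # (as shown in the README change in this diff).
    api_key, org_id = sk.openai_settings_from_dot_env()
    service_id = "chat-gpt"
    kernel.add_service(
        OpenAIChatCompletion(
            service_id=service_id,
            ai_model_id="gpt-3.5-turbo",
            api_key=api_key,
            org_id=org_id,
        )
    )

    # create_function_from_prompt replaces create_semantic_function.
    # The parameter names below are assumptions for illustration.
    joke_function = kernel.create_function_from_prompt(
        function_name="Joke",
        plugin_name="FunPlugin",
        prompt="Tell a short, clean joke about {{$input}}.",
    )
    joke = await kernel.invoke(
        joke_function, sk.KernelArguments(input="time travel to dinosaur age")
    )
    print(joke)

    # ChatHistory replaces ChatMessage and is what gets passed into the
    # complete chat/text methods; the method name used here is an assumption.
    chat_history = ChatHistory()
    chat_history.add_system_message("You are a helpful assistant.")
    chat_history.add_user_message("Tell me another joke about time travel.")

    chat_service = kernel.get_service(service_id)
    settings = OpenAIChatPromptExecutionSettings(service_id=service_id)
    reply = await chat_service.complete_chat(chat_history, settings)
    print(reply[0])


if __name__ == "__main__":
    asyncio.run(main())
```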

<!-- Describe your changes, the overall approach, the underlying design.
These notes will help understanding how your code works. Thanks! -->

### Contribution Checklist

<!-- Before submitting this PR, please make sure: -->

- [X] The code builds clean without any errors or warnings
- [X] The PR follows the [SK Contribution
Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md)
and the [pre-submission formatting
script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts)
raises no violations
- [X] All unit tests pass, and I have added new tests where possible
- [ ] I didn't break anyone 😄

---------

Co-authored-by: Eduard van Valkenburg <eavanvalkenburg@users.noreply.github.com>
moonbox3 and eavanvalkenburg committed Feb 22, 2024
1 parent 52d043b commit ec02d89
Showing 120 changed files with 5,242 additions and 3,019 deletions.
18 changes: 16 additions & 2 deletions python/README.md
@@ -33,11 +33,25 @@ kernel = sk.Kernel()

# Prepare OpenAI service using credentials stored in the `.env` file
api_key, org_id = sk.openai_settings_from_dot_env()
kernel.add_chat_service("chat-gpt", OpenAIChatCompletion("gpt-3.5-turbo", api_key, org_id))
kernel.add_service(
OpenAIChatCompletion(
service_id="chat-gpt",
ai_model_id="gpt-3.5-turbo",
api_key=api_key,
org_id=org_id
)
)

# Alternative using Azure:
# deployment, api_key, endpoint = sk.azure_openai_settings_from_dot_env()
# kernel.add_chat_service("dv", AzureChatCompletion(deployment, endpoint, api_key))
# kernel.add_service(
# AzureChatCompletion(
# service_id="dv",
# deployment_name=deployment,
# base_url=endpoint,
# api_key=api_key
# )
# )

# Wrap your prompt in a function
prompt = kernel.create_semantic_function("""
75 changes: 49 additions & 26 deletions python/notebooks/00-getting-started.ipynb
@@ -31,20 +31,10 @@
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Option 1: using OpenAI\n",
"\n",
"**Step 2**: Add your [OpenAI Key](https://openai.com/product/) key to a `.env` file in the same folder (org Id only if you have multiple orgs):\n",
"\n",
"```\n",
"OPENAI_API_KEY=\"sk-...\"\n",
"OPENAI_ORG_ID=\"\"\n",
"```\n",
"\n",
"Use \"keyword arguments\" to instantiate an OpenAI Chat Completion service and add it to the kernel:"
"### Configure the service you'd like to use via the `Service` Enum."
]
},
{
@@ -53,21 +43,28 @@
"metadata": {},
"outputs": [],
"source": [
"from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion\n",
"from services import Service\n",
"\n",
"api_key, org_id = sk.openai_settings_from_dot_env()\n",
"\n",
"kernel.add_chat_service(\n",
" \"chat-gpt\",\n",
" OpenAIChatCompletion(ai_model_id=\"gpt-3.5-turbo-1106\", api_key=api_key, org_id=org_id),\n",
")"
"# Select a service to use for this notebook (available services: OpenAI, AzureOpenAI, HuggingFace)\n",
"selectedService = Service.OpenAI"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Option 1: using OpenAI\n",
"\n",
"**Step 2**: Add your [OpenAI Key](https://openai.com/product/) key to a `.env` file in the same folder (org Id only if you have multiple orgs):\n",
"\n",
"```\n",
"OPENAI_API_KEY=\"sk-...\"\n",
"OPENAI_ORG_ID=\"\"\n",
"```\n",
"\n",
"Use \"keyword arguments\" to instantiate an OpenAI Chat Completion service and add it to the kernel:\n",
"\n",
"## Option 2: using Azure OpenAI\n",
"\n",
"**Step 2**: Add your [Azure Open AI Service key](https://learn.microsoft.com/azure/cognitive-services/openai/quickstart?pivots=programming-language-studio) settings to a `.env` file in the same folder:\n",
@@ -87,14 +84,23 @@
"metadata": {},
"outputs": [],
"source": [
"from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion\n",
"service_id = None\n",
"if selectedService == Service.OpenAI:\n",
" from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion\n",
"\n",
"deployment, api_key, endpoint = sk.azure_openai_settings_from_dot_env()\n",
" api_key, org_id = sk.openai_settings_from_dot_env()\n",
" service_id = \"chat-gpt\"\n",
" kernel.add_service(\n",
" OpenAIChatCompletion(service_id=service_id, ai_model_id=\"gpt-3.5-turbo-1106\", api_key=api_key, org_id=org_id),\n",
" )\n",
"elif selectedService == Service.AzureOpenAI:\n",
" from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion\n",
"\n",
"kernel.add_chat_service(\n",
" \"chat_completion\",\n",
" AzureChatCompletion(deployment_name=deployment, endpoint=endpoint, api_key=api_key),\n",
")"
" deployment, api_key, endpoint = sk.azure_openai_settings_from_dot_env()\n",
" service_id = \"chat_completion\"\n",
" kernel.add_service(\n",
" AzureChatCompletion(service_id=service_id, deployment_name=deployment, endpoint=endpoint, api_key=api_key),\n",
" )"
]
},
{
@@ -113,10 +119,27 @@
"metadata": {},
"outputs": [],
"source": [
"plugin = kernel.import_semantic_plugin_from_directory(\"../../samples/plugins\", \"FunPlugin\")\n",
"try:\n",
" plugin = kernel.import_plugin_from_prompt_directory(service_id, \"../../samples/plugins\", \"FunPlugin\")\n",
"except ValueError as e:\n",
" # Don't fail if we try to add the plug in again\n",
" # This is just for the sake of the example\n",
" # Once the plugin has been added to the kernel it will fail\n",
" # to add it again, if this cell is run multiple times without\n",
" # restarting the kernel\n",
" pass"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"joke_function = plugin[\"Joke\"]\n",
"\n",
"print(await joke_function(\"time travel to dinosaur age\"))"
"joke = await kernel.invoke(joke_function, sk.KernelArguments(input=\"time travel to dinosaur age\", style=\"super silly\"))\n",
"print(joke)"
]
}
],
81 changes: 24 additions & 57 deletions python/notebooks/01-basic-loading-the-kernel.ipynb
@@ -34,29 +34,7 @@
"metadata": {},
"outputs": [],
"source": [
"import semantic_kernel as sk\n",
"from semantic_kernel.connectors.ai.open_ai import (\n",
" AzureChatCompletion,\n",
" OpenAIChatCompletion,\n",
")"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"You can instantiate the kernel in a few ways, depending on your use case."
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"# Simple instance\n",
"kernel_1 = sk.Kernel()"
"import semantic_kernel as sk"
]
},
{
@@ -66,7 +44,7 @@
"source": [
"When using the kernel for AI requests, the kernel needs some settings like URL and credentials to the AI models.\n",
"\n",
"The SDK currently supports OpenAI and Azure OpenAI, other services will be added over time.\n",
"The SDK currently supports OpenAI and Azure OpenAI, among other connectors.\n",
"\n",
"If you need an Azure OpenAI key, go [here](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/quickstart?pivots=rest-api)."
]
@@ -77,39 +55,10 @@
"metadata": {},
"outputs": [],
"source": [
"kernel = sk.Kernel()\n",
"from services import Service\n",
"\n",
"kernel.add_chat_service( # We are adding a text service\n",
" \"Azure_curie\", # The alias we can use in prompt templates' config.json\n",
" AzureChatCompletion(\n",
" deployment_name=\"my-finetuned-Curie\", # Azure OpenAI *Deployment name*\n",
" endpoint=\"https://contoso.openai.azure.com/\", # Azure OpenAI *Endpoint*\n",
" api_key=\"...your Azure OpenAI Key...\", # Azure OpenAI *Key*\n",
" ),\n",
")\n",
"\n",
"kernel.add_chat_service( # We are adding a text service\n",
" \"OpenAI_chat_gpt\", # The alias we can use in prompt templates' config.json\n",
" OpenAIChatCompletion(\n",
" ai_model_id=\"gpt-3.5-turbo\", # OpenAI Model Name\n",
" api_key=\"...your OpenAI API Key...\", # OpenAI API key\n",
" org_id=\"...your OpenAI Org ID...\", # *optional* OpenAI Organization ID\n",
" ),\n",
")"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"When working with multiple services and multiple models, the **first service** defined\n",
"is also the \"**default**\" used in these scenarios:\n",
"\n",
"* a prompt configuration doesn't specify which AI service to use\n",
"* a prompt configuration requires a service unknown to the kernel\n",
"\n",
"The default can be set and changed programmatically:"
"# Select a service to use for this notebook (available services: OpenAI, AzureOpenAI, HuggingFace)\n",
"selectedService = Service.OpenAI"
]
},
{
@@ -118,7 +67,25 @@
"metadata": {},
"outputs": [],
"source": [
"kernel.set_default_text_completion_service(\"Azure_curie\")"
"kernel = sk.Kernel()\n",
"\n",
"service_id = None\n",
"if selectedService == Service.OpenAI:\n",
" from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion\n",
"\n",
" api_key, org_id = sk.openai_settings_from_dot_env()\n",
" service_id = \"oai_chat_gpt\"\n",
" kernel.add_service(\n",
" OpenAIChatCompletion(service_id=service_id, ai_model_id=\"gpt-3.5-turbo-1106\", api_key=api_key, org_id=org_id),\n",
" )\n",
"elif selectedService == Service.AzureOpenAI:\n",
" from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion\n",
"\n",
" deployment, api_key, endpoint = sk.azure_openai_settings_from_dot_env()\n",
" service_id = \"aoai_chat_completion\"\n",
" kernel.add_service(\n",
" AzureChatCompletion(service_id=service_id, deployment_name=deployment, endpoint=endpoint, api_key=api_key),\n",
" )"
]
},
{