chatgpt_prompt_engineering

Notebooks from the course ChatGPT Prompt Engineering for Developers from deeplearning.ai.

This short course, taught by Isa Fulford (OpenAI) and Andrew Ng (DeepLearning.AI), describes how LLMs work, provides best practices for prompt engineering, and shows how LLM APIs can be used in applications for a variety of tasks, including:

  • Summarizing (e.g., summarizing user reviews for brevity)
  • Inferring (e.g., sentiment classification, topic extraction)
  • Transforming text (e.g., translation, spelling & grammar correction)
  • Expanding (e.g., automatically writing emails)

In addition, you’ll learn two key principles for writing effective prompts, how to systematically engineer good prompts, and how to build a custom chatbot.
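For example, a summarization prompt in the style of the course might look like the sketch below; the review text, delimiter choice, and word limit are illustrative and not taken verbatim from the course notebooks.

# Use clear delimiters and an explicit length limit, two of the
# prompting tactics covered in the course.
review = "I bought this desk lamp last week; shipping was fast, but the bulb arrived broken."

prompt = f"""
Summarize the product review below, delimited by <review> tags, in at most 10 words.

<review>{review}</review>
"""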

Notes on using the OpenAI API outside of this classroom

To install the OpenAI Python library:

!pip install openai

The library needs to be configured with your account's secret API key, which is available from your account settings on the OpenAI website.

You can either set it as the OPENAI_API_KEY environment variable before using the library:

!export OPENAI_API_KEY='sk-...'

Or, set openai.api_key to its value:

import openai
openai.api_key = "sk-..."
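The course notebooks wrap this setup in a small helper along the lines of the sketch below. It assumes the pre-1.0 openai interface implied by openai.api_key above; the model name, temperature, and the final example call are illustrative rather than the exact course code.

import os
import openai

# Assumption: read the key from the OPENAI_API_KEY environment variable
# instead of hard-coding it in the notebook.
openai.api_key = os.getenv("OPENAI_API_KEY")

def get_completion(prompt, model="gpt-3.5-turbo"):
    """Send a single-turn prompt and return the model's reply (openai<1.0 ChatCompletion API)."""
    messages = [{"role": "user", "content": prompt}]
    response = openai.ChatCompletion.create(
        model=model,
        messages=messages,
        temperature=0,  # low temperature for more deterministic output
    )
    return response.choices[0].message["content"]

print(get_completion("Translate 'good morning' into French."))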

A note about the backslash

  • In the course, we use a backslash \ to make the text fit on the screen without inserting newline '\n' characters.
  • GPT-3 isn't noticeably affected by whether or not you insert newline characters. But when working with LLMs in general, consider whether the newline characters in your prompt may affect the model's performance, as in the short example below.
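For example, in Python a trailing backslash inside a string literal continues the line without adding a newline, while a line break inside a triple-quoted string becomes a literal '\n' in the prompt:

prompt_continued = "Summarize the review below in one sentence, \
focusing on the main complaint."

prompt_multiline = """Summarize the review below in one sentence,
focusing on the main complaint."""

print(repr(prompt_continued))   # a single line with no '\n'
print(repr(prompt_multiline))   # contains a '\n' between the two clauses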
