This article focuses on calling the tool chain (function calling) through the OpenAI SDK.
Install openai
pip install openai
export OPENAI_API_KEY="YOUR OPENAI KEY"
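The SDK reads OPENAI_API_KEY from the environment. A minimal sketch to confirm the variable exported above is visible to the Python process, without making any API request:

import os

# Confirm the key exported above is visible; OpenAI() will pick it up automatically
assert os.environ.get("OPENAI_API_KEY"), "OPENAI_API_KEY is not set"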
Define the tool function
from openai import OpenAI
import json

client = OpenAI()
# Tool function
def get_current_weather(location, unit="fahrenheit"):
    """Get the current weather in a given location"""
    if "tokyo" in location.lower():
        return json.dumps({"location": "Tokyo", "temperature": "10", "unit": unit})
    elif "san francisco" in location.lower():
        return json.dumps({"location": "San Francisco", "temperature": "72", "unit": unit})
    elif "paris" in location.lower():
        return json.dumps({"location": "Paris", "temperature": "22", "unit": unit})
    else:
        return json.dumps({"location": location, "temperature": "unknown"})

# Description of the tool, passed to the SDK so the model understands what the tool does and how to call it
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        },
    }
]
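Before involving the model, the tool function can be exercised directly; a quick local check of the two branches:

print(get_current_weather("Tokyo", unit="celsius"))
# {"location": "Tokyo", "temperature": "10", "unit": "celsius"}
print(get_current_weather("Berlin"))
# {"location": "Berlin", "temperature": "unknown"}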
Have the model call the tool chain
Note: when calling tools (i.e. passing tools=tools), try to keep the prompt (the messages) free of other instructions; otherwise the model may hallucinate, calling functions that do not exist or ignoring the other instructions. In the result below, ChatCompletionMessage has content=None, meaning it produced no answer text and only made tool calls.
The tool_choice parameter controls the tool-calling mode (see the sketch after this list):
- To force the model to call a specific function, set tool_choice to that function's name.
- Set tool_choice: "none" to force the model to generate a user-facing message instead of calling a tool.
- Note that the default behavior (tool_choice: "auto") lets the model decide by itself whether to call a function.
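A minimal sketch of forcing a specific function, using the tools list defined above and the messages defined in the next step:

# Force the model to call get_current_weather instead of letting it choose
response = client.chat.completions.create(
    model="gpt-3.5-turbo-1106",
    messages=messages,
    tools=tools,
    tool_choice={"type": "function", "function": {"name": "get_current_weather"}},
)

# tool_choice="none" forbids tool calls; tool_choice="auto" (the default) lets the model decide.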
In the example below, a single prompt asks about three cities, so the model returns multiple tool calls in a single response.
messages = [{"role": "user", "content": "What's the weather like in San Francisco, Tokyo, and Paris?"}]
def run_conversation():
    # Step 1: send the conversation and available functions to the model
    response = client.chat.completions.create(
        model="gpt-3.5-turbo-1106",
        messages=messages,
        tools=tools,
        tool_choice="auto",  # auto is default, but we'll be explicit
    )
    response_message = response.choices[0].message
    print(response_message)
    return response_message

response_message = run_conversation()
'''
Execution result
ChatCompletionMessage(content=None, role='assistant', function_call=None, tool_calls=[ChatCompletionMessageToolCall(id='call_hoCM3KJJRQhkupaZR7gguZjr', function=Function(arguments='{"location": "San Francisco", "unit": "celsius"}', name='get_current_weather'), type='function'), ChatCompletionMessageToolCall(id='call_OTiRmgF7F7AHx92RHnfa8pSl', function=Function(arguments='{"location": "Tokyo", "unit": "celsius"}', name='get_current_weather'), type='function'), ChatCompletionMessageToolCall(id='call_zpIGNP5BuxpVVq2EUdSg8f1x', function=Function(arguments='{"location": "Paris", "unit": "celsius"}', name='get_current_weather'), type='function')]
)
'''
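Each entry in tool_calls carries the target function name and its JSON-encoded arguments; a small sketch of inspecting them (the same parsing the aggregation step below performs):

for tool_call in response_message.tool_calls:
    args = json.loads(tool_call.function.arguments)
    print(tool_call.id, tool_call.function.name, args)
# prints one line per parallel tool call, e.g. the San Francisco, Tokyo, and Paris calls above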
Execute the calls and aggregate the results
Note that when the tool results are appended to messages below, their role is neither user nor assistant but tool.
By appending every message of the conversation (the user prompt, the assistant's tool calls, and each tool result) to messages and sending it back, the model can see the tool outputs and summarize them into a final answer for the user.
def main(response_message):
    tool_calls = response_message.tool_calls
    if tool_calls:
        # Step 3: call the function
        # Note: the JSON response may not always be valid; be sure to handle errors
        available_functions = {
            "get_current_weather": get_current_weather,
        }  # only one function in this example, but you can have multiple
        messages.append(response_message)  # extend conversation with assistant's reply
        # Step 4: send the info for each function call and function response to the model
        for tool_call in tool_calls:
            function_name = tool_call.function.name
            function_to_call = available_functions[function_name]
            function_args = json.loads(tool_call.function.arguments)
            function_response = function_to_call(
                location=function_args.get("location"),
                unit=function_args.get("unit"),
            )
            messages.append(
                {
                    "tool_call_id": tool_call.id,
                    "role": "tool",  # note: the role here is neither user nor assistant, but tool
                    "name": function_name,
                    "content": function_response,
                }
            )  # extend conversation with function response
        # Aggregate and output the result
        second_response = client.chat.completions.create(
            model="gpt-3.5-turbo-1106",
            messages=messages,
        )  # get a new response from the model where it can see the function responses
        return second_response
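Wiring the two steps together: a small driver sketch (final_response is a name introduced here for illustration; main returns a value only when the first response actually contains tool calls):

final_response = main(response_message)
print(final_response.choices[0].message.content)
# prints the model's natural-language summary of the three weather results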