101. Using the API provided by the Groq website
Calling the API from different environments
1. Creating an API key
Register an account at console.groq.com, log in, and create a new API key. Fill in the details and save; the system generates a unique key, a string beginning with gsk_. Save the key somewhere safe, for example:
groqapikey1: gsk_... (a string beginning with gsk_)
2. Calling the API with curl
Use curl to list all available models:
- Open a terminal
- Run curl with the arguments below and check the result

```bash
curl -X GET "https://api.groq.com/openai/v1/models" \
  -H "Authorization: Bearer gsk_YOUR_API_KEY" \
  -H "Content-Type: application/json"
```
Call a free model with curl:
- Open a terminal
- Enter the command

```bash
curl https://api.groq.com/openai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer gsk_YOUR_API_KEY" \
  -d '{
    "model": "llama-3.3-70b-versatile",
    "messages": [{
      "role": "user",
      "content": "Explain the importance of fast language models"
    }]
  }'
```
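Whatever the client language, the reply text always sits at choices[0].message.content in the returned JSON. A small Python sketch of pulling it out (the sample payload below is abbreviated and illustrative, not real API output):

```python
import json

# Abbreviated sample of a chat completion response (illustrative, not real output)
raw = """
{
  "object": "chat.completion",
  "model": "llama-3.3-70b-versatile",
  "choices": [
    {
      "index": 0,
      "message": { "role": "assistant", "content": "Fast language models keep latency low." },
      "finish_reason": "stop"
    }
  ],
  "usage": { "prompt_tokens": 18, "completion_tokens": 9, "total_tokens": 27 }
}
"""

resp = json.loads(raw)
reply = resp["choices"][0]["message"]["content"]
print(reply)                          # Fast language models keep latency low.
print(resp["usage"]["total_tokens"])  # 27
```

The usage block is also worth reading when you watch your quota.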
3. Calling the API from JavaScript
Single-turn chat from the browser console:
- Open a browser
- Open the browser console
- Enter the following JavaScript
```javascript
fetch("https://api.groq.com/openai/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "Authorization": "Bearer gsk_YOUR_API_KEY"
  },
  body: JSON.stringify({
    model: "llama-3.3-70b-versatile",
    messages: [
      { role: "user", content: "hi" }
    ]
  })
})
  .then(r => r.json())
  .then(data => {
    console.log("AI:", data.choices[0].message.content);
  })
  .catch(console.error);
```
Log the full JSON response from the browser console:
- Open a browser and its console
- Enter the following code
```javascript
fetch("https://api.groq.com/openai/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "Authorization": "Bearer gsk_YOUR_API_KEY"
  },
  body: JSON.stringify({
    model: "llama-3.3-70b-versatile",
    messages: [
      { role: "user", content: "hi" }
    ]
  })
})
  .then(r => r.json())
  .then(console.log)
  .catch(console.error);
```
Streaming output with Node.js:
- Install the SDK first: npm install openai (the import syntax below requires "type": "module" in package.json, or a .mjs extension)
- Create testapi.js with the following code
```javascript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "gsk_YOUR_API_KEY",
  baseURL: "https://api.groq.com/openai/v1"
});

async function main() {
  const stream = await client.chat.completions.create({
    model: "llama-3.3-70b-versatile",
    stream: true,
    messages: [
      { role: "user", content: "Explain the importance of fast language models" }
    ]
  });
  for await (const chunk of stream) {
    const text = chunk.choices?.[0]?.delta?.content;
    if (text) process.stdout.write(text);
  }
}

main();
```

Run it with node:

```bash
node testapi.js
```
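The for await loop above consumes a sequence of chunks, each carrying only a small delta of the reply in choices[0].delta.content. The stitching logic, sketched in Python with made-up chunk data:

```python
# Illustrative stream chunks: each event carries only a small delta of the reply
chunks = [
    {"choices": [{"delta": {"role": "assistant", "content": "Fast "}}]},
    {"choices": [{"delta": {"content": "models "}}]},
    {"choices": [{"delta": {"content": "rock."}}]},
    {"choices": [{"delta": {}, "finish_reason": "stop"}]},
]

# Stitch the pieces together, skipping chunks that carry no content
text = ""
for chunk in chunks:
    piece = chunk["choices"][0]["delta"].get("content")
    if piece:
        text += piece

print(text)  # Fast models rock.
```

Note the final chunk has an empty delta and only a finish_reason, which is why the content check is needed.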
Build a web page to chat with the model:
- Create a testchat.html file
- Write the code below (note: the API key is embedded in the page, so anyone who can view the source can read it; use this only for local testing)
```html
<html>
<body>
<textarea id="out" style="width:100%;height:200px;"></textarea>
<input id="msg" /><button onclick="go()">Send</button>
<script>
const KEY = "gsk_YOUR_API_KEY";
async function go() {
  const msg = document.getElementById("msg").value;
  const out = document.getElementById("out");
  const res = await fetch("https://api.groq.com/openai/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Authorization": "Bearer " + KEY
    },
    body: JSON.stringify({
      model: "llama-3.3-70b-versatile",
      messages: [
        { role: "user", content: msg }
      ]
    })
  });
  const data = await res.json();
  out.value += "\nAI: " + data.choices[0].message.content;
}
</script>
</body>
</html>
```
- Open the page in a browser and test the conversation
4. Calling the API from Python
Inspect the full JSON output.
Install the environment:

```bash
pip3 install openai
```

Write the code:
```python
from openai import OpenAI
import json

client = OpenAI(
    api_key="gsk_YOUR_API_KEY",
    base_url="https://api.groq.com/openai/v1"
)

resp = client.chat.completions.create(
    model="llama-3.3-70b-versatile",
    messages=[{"role": "user", "content": "hi"}]
)

print(json.dumps(resp.model_dump(), indent=2, ensure_ascii=False))
```

Run it and check the output.
Multi-turn conversation in Python:
(Uses the same environment as above.) Write the code:
```python
from openai import OpenAI

client = OpenAI(
    api_key="gsk_YOUR_API_KEY",  # must be plain ASCII
    base_url="https://api.groq.com/openai/v1"
)

history = []
while True:
    user = input("You: ")
    if user.lower() == "exit":
        break
    history.append({"role": "user", "content": user})
    resp = client.chat.completions.create(
        model="llama-3.3-70b-versatile",
        messages=history
    )
    ai = resp.choices[0].message.content  # read the reply text
    print("AI:", ai)
    history.append({"role": "assistant", "content": ai})
```

Run it and chat; type exit to quit.
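The history list above grows without bound and will eventually exceed the model's context window. A simple mitigation, sketched here (the trim_history helper and the cap of 20 messages are illustrative choices, not part of the Groq API), is to send only the most recent messages:

```python
def trim_history(history, max_messages=20):
    """Keep only the most recent messages so requests stay within the context window."""
    return history if len(history) <= max_messages else history[-max_messages:]

# Example: 25 alternating user/assistant messages
history = [
    {"role": "user" if i % 2 == 0 else "assistant", "content": f"msg {i}"}
    for i in range(25)
]
trimmed = trim_history(history)
print(len(trimmed))            # 20
print(trimmed[-1]["content"])  # msg 24
```

In the chat loop you would pass trim_history(history) as messages instead of the full list.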
5. Calling the API from C++
Print the returned JSON on Linux.
Download the httplib.h header file:

```bash
wget https://raw.githubusercontent.com/yhirose/cpp-httplib/master/httplib.h
```

or:

```bash
curl -O https://raw.githubusercontent.com/yhirose/cpp-httplib/master/httplib.h
```

Create testgroq.cpp:
```cpp
#define CPPHTTPLIB_OPENSSL_SUPPORT  // required for httplib::SSLClient
#include "httplib.h"
#include <iostream>
#include <string>

int main() {
    // HTTPS client
    httplib::SSLClient cli("api.groq.com", 443);
    // Change to true if you want to verify the server certificate
    cli.enable_server_certificate_verification(false);

    std::string body = R"({
        "model": "llama-3.3-70b-versatile",
        "messages": [
            { "role": "user", "content": "hi" }
        ]
    })";

    httplib::Headers headers = {
        {"Content-Type", "application/json"},
        {"Authorization", "Bearer gsk_YOUR_API_KEY"}
    };

    auto res = cli.Post("/openai/v1/chat/completions", headers, body, "application/json");
    if (res) {
        std::cout << "Status: " << res->status << "\n";
        std::cout << "Body:\n" << res->body << "\n";
    } else {
        std::cout << "Request failed\n";
    }
    return 0;
}
```

Compile (Linux):
```bash
g++ testgroq.cpp -o testgroq -lssl -lcrypto
```

The OpenSSL runtime ships with most Linux distributions, but compiling may require the development headers (for example, the libssl-dev package).
Streaming output in C++ on Linux.
Create teststream.cpp:
```cpp
#define CPPHTTPLIB_OPENSSL_SUPPORT  // required for httplib::SSLClient
#include "httplib.h"
#include <iostream>
#include <sstream>
#include <string>

int main() {
    // Create an HTTPS client
    httplib::SSLClient cli("api.groq.com", 443);
    cli.enable_server_certificate_verification(false);  // simplified for this example

    // Request headers
    httplib::Headers headers = {
        {"Content-Type", "application/json"},
        {"Authorization", "Bearer gsk_YOUR_API_KEY"}
    };

    // Streaming request body
    std::string body = R"({
        "model": "llama-3.3-70b-versatile",
        "stream": true,
        "messages": [
            { "role": "user", "content": "hello" }
        ]
    })";

    // Send the POST request and receive the response as a stream
    auto res = cli.Post(
        "/openai/v1/chat/completions",
        headers,
        body,
        "application/json",
        [&](const char *data, size_t len) {
            std::string chunk(data, len);
            // SSE format: data: {...}\n\n
            std::istringstream ss(chunk);
            std::string line;
            while (std::getline(ss, line)) {
                if (line.rfind("data:", 0) == 0) {
                    std::string json = line.substr(5);
                    if (json.find("[DONE]") != std::string::npos) {
                        std::cout << "\n[stream finished]\n";
                        return true;
                    }
                    // Look for the content field
                    auto pos = json.find("\"content\":");
                    if (pos != std::string::npos) {
                        auto start = json.find("\"", pos + 10);
                        auto end = json.find("\"", start + 1);
                        if (start != std::string::npos && end != std::string::npos) {
                            std::string token = json.substr(start + 1, end - start - 1);
                            std::cout << token << std::flush;
                        }
                    }
                }
            }
            return true;  // keep receiving
        }
    );

    if (!res) {
        std::cout << "Request failed\n";
    }
    return 0;
}
```
Create an httplib_wrapper.cpp file that compiles httplib into a static library, then create a Makefile so httplib is not recompiled on every build (the Makefile assumes the streaming source is named stream.cpp, so rename teststream.cpp accordingly):

```makefile
CXX = g++
CXXFLAGS = -O2 -Wall -std=c++17
LDFLAGS = -lssl -lcrypto

all: stream

libhttplib.a: httplib_wrapper.o
	ar rcs libhttplib.a httplib_wrapper.o

httplib_wrapper.o: httplib_wrapper.cpp httplib.h
	$(CXX) $(CXXFLAGS) -c httplib_wrapper.cpp -o httplib_wrapper.o

stream: stream.o libhttplib.a
	$(CXX) stream.o -L. -lhttplib $(LDFLAGS) -o stream

stream.o: stream.cpp
	$(CXX) $(CXXFLAGS) -c stream.cpp -o stream.o

clean:
	rm -f *.o *.a stream
```

Compile:
```bash
make
```

Run:

```bash
./stream
```
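The substring search for "content": in the C++ callback is fragile: it breaks if the reply itself contains an escaped quote, and a chunk boundary can split a line in half. For reference, the same SSE parsing expressed with a real JSON parser in Python (the sample chunk below is made up to mimic the wire format):

```python
import json

def parse_sse_chunk(chunk: str):
    """Extract content tokens from one SSE chunk of an OpenAI-style streaming response."""
    tokens = []
    for line in chunk.splitlines():
        if not line.startswith("data:"):
            continue
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break
        event = json.loads(payload)
        delta = event["choices"][0].get("delta", {})
        if delta.get("content"):
            tokens.append(delta["content"])
    return tokens

# Example chunk mimicking the wire format: two events and a terminator
chunk = (
    'data: {"choices":[{"delta":{"content":"Hel"}}]}\n'
    'data: {"choices":[{"delta":{"content":"lo"}}]}\n'
    "data: [DONE]\n"
)
print("".join(parse_sse_chunk(chunk)))  # Hello
```

A production client would also buffer partial lines across chunks before parsing.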
Appendix
Python script to check the available models:

```python
from openai import OpenAI
```
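A sketch of how such a model-checking script might continue, assuming the OpenAI-compatible client and the /openai/v1/models endpoint; the model_ids helper, the sample payload, and the sorting are illustrative choices:

```python
def model_ids(payload):
    """Extract and sort the model ids from a /models-style response payload."""
    return sorted(m["id"] for m in payload["data"])

def check_models(api_key):
    """Query Groq for available models (requires network, a valid key, and pip install openai)."""
    from openai import OpenAI
    client = OpenAI(api_key=api_key, base_url="https://api.groq.com/openai/v1")
    return sorted(m.id for m in client.models.list().data)

# Offline example of the JSON shape returned by GET /openai/v1/models
sample = {
    "object": "list",
    "data": [
        {"id": "llama-3.3-70b-versatile", "object": "model"},
        {"id": "whisper-large-v3", "object": "model"},
    ],
}
print(model_ids(sample))  # ['llama-3.3-70b-versatile', 'whisper-large-v3']
```

Calling check_models("gsk_YOUR_API_KEY") would return the live list.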
All articles in this blog are licensed under CC BY-NC-SA 4.0 unless stating additionally.
