Commit db9c960

Merge branch 'main' into fix-noise-cancellation-signal
2 parents f21453f + bebda79

13 files changed (+18241, −11241 lines)

docusaurus/video/docusaurus/docs/api/recording/storage.mdx (+43, −13)
````diff
@@ -58,7 +58,25 @@ await call.startRecording({
 <TabItem value="py" label="Python">

 ```py
-// TODO: code example for Python
+# 1. create a new storage with all the required parameters
+aws_s3_config = S3Request(
+    s3_region='us-east-1',
+    s3_api_key='my-access-key',
+    s3_secret='my-secret',
+)
+
+response = client.video.create_external_storage(
+    name='my-s3',
+    storage_type='s3',
+    bucket='my-bucket',
+    path='directory_name/',
+    aws_s3=aws_s3_config,
+)
+
+# 2. update the call type to use the new storage
+client.video.update_call_type(name='allhands', external_storage='my-s3')
+
+# 3. alternatively, specify the storage when starting the call recording
+call.start_recording(recording_external_storage='my-s3')
 ```

 </TabItem>
````
````diff
@@ -130,7 +148,11 @@ await call.startRecording({
 <TabItem value="py" label="Python">

 ```py
-# TODO: code example for Python
+# 1. update the call type to use Stream S3 storage
+client.video.update_call_type('my-call-type', external_storage="stream-s3")
+
+# 2. specify Stream S3 storage when starting call recording
+call.start_recording(recording_external_storage="stream-s3")
 ```

 </TabItem>
````
````diff
@@ -159,26 +181,26 @@ curl -X POST "https://video.stream-io-api.com/video/call/default/${CALL_ID}/star

 | Name | Description | Required |
 |---------------|-------------|----------|
-| name | | |
-| storage_type | | |
-| bucket | | |
-| custom_folder | | |
+| name | unique name | yes |
+| storage_type | s3, gcs or abs | yes |
+| bucket | bucket name | yes |
+| custom_folder | path inside the bucket | |

 ## Amazon S3

 To use Amazon S3 as your storage provider, you have two authentication options: IAM role or API key.

 If you do not specify the `s3_api_key` parameter, Stream will use IAM role authentication. In that case, make sure the correct IAM role is configured for your application.
````
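The IAM-role fallback described in the paragraph above can be sketched in plain Python. The helper below is illustrative only (it is not part of Stream's SDK); it simply mirrors the documented rule that omitting `s3_api_key` selects IAM-role authentication:

```python
# Illustrative sketch only -- not Stream SDK code.
# Mirrors the documented rule: no s3_api_key means IAM-role authentication.
iam_role_config = {"s3_region": "us-east-1"}  # no key/secret -> IAM role

key_based_config = {
    "s3_region": "us-east-1",
    "s3_api_key": "my-access-key",
    "s3_secret": "my-secret",
}

def uses_iam_role(config: dict) -> bool:
    """Hypothetical helper: True when no API key is supplied."""
    return "s3_api_key" not in config

print(uses_iam_role(iam_role_config))   # True
print(uses_iam_role(key_based_config))  # False
```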
````diff
-| Name | Description | Required |
-|------------|-------------|----------|
-| s3_region | | yes |
-| s3_api_key | | |
-| s3_secret | | |
+| Name | Required |
+|------------|----------|
+| s3_region | yes |
+| s3_api_key | |
+| s3_secret | |

 There are two ways to configure authentication on your S3 bucket:
 - By providing a key and secret
-- Or by having Stream's AWS account assume a role on your SQS queue.
+- Or by having Stream's AWS account assume a role on your S3 bucket.
 With this option you omit the key and secret; instead, you set up a resource-based policy that grants Stream write permission on your S3 bucket.
 The following policy needs to be attached to your bucket (replace the value of `Resource` with the fully qualified ARN of your S3 bucket):
````
````diff
@@ -230,13 +252,21 @@ await serverSideClient.createExternalStorage({
 ```

 </TabItem>
+
 <TabItem value="py" label="Python">

 ```py
-# TODO: code example for Python
+response = client.video.create_external_storage(
+    name='my-gcs',
+    storage_type='gcs',
+    bucket='my-bucket',
+    path='directory_name/',
+    gcs_credentials="content of the service account file",
+)
 ```

 </TabItem>
+
 <TabItem value="curl" label="cURL">

 ```bash
````

openapi/chat-openapi-clientside.json (+1, −1)

Large diffs are not rendered by default.
