From 6e480ae0c9afe2bf4b687186f314b774af29571b Mon Sep 17 00:00:00 2001
From: Chen Yang <2023225768@qq.com>
Date: Sat, 22 Jun 2024 14:51:15 +0800
Subject: [PATCH] fix: tensors created by torch.tensor do not share memory
 with ndarrays, while tensors created by torch.from_numpy do

---
 chapter_preliminaries/ndarray.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/chapter_preliminaries/ndarray.md b/chapter_preliminaries/ndarray.md
index bce0c7ff5..3d784804f 100644
--- a/chapter_preliminaries/ndarray.md
+++ b/chapter_preliminaries/ndarray.md
@@ -602,7 +602,7 @@ computation(X, Y)
 
 :begin_tab:`pytorch`
 Converting a tensor defined by the deep learning framework [**to a NumPy tensor (`ndarray`)**] is easy, and vice versa.
-The torch tensor and the numpy array share their underlying memory, so changing one of them with an in-place operation also changes the other.
+When a torch tensor is converted to a numpy array with its `numpy()` method, or a numpy array is converted to a torch tensor with `torch.from_numpy`, the two share their underlying memory, so changing one of them with an in-place operation also changes the other. By contrast, `torch.tensor` creates a new copy of the numpy array, and the two do not share memory.
 :end_tab:
 
 ```{.python .input}
@@ -614,7 +614,7 @@ type(A), type(B)
 
 ```{.python .input}
 #@tab pytorch
 A = X.numpy()
-B = torch.tensor(A)
+B = torch.from_numpy(A)
 type(A), type(B)
 ```
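
Not part of the patch itself, but a minimal standalone sketch of the behavior the commit message describes. It assumes a CPU tensor; `X` is redefined locally (in the book it comes from earlier in the chapter), and `C` is an extra illustrative name not used in the patched snippet.

```python
import numpy as np
import torch

X = torch.arange(4, dtype=torch.float32)

# Tensor.numpy() and torch.from_numpy() give views of the same buffer (CPU),
# so an in-place update through one is visible through the others.
A = X.numpy()              # numpy view of X's storage
B = torch.from_numpy(A)    # tensor view of the same storage
A[0] = 100.0
print(X[0].item(), B[0].item())  # both print 100.0

# torch.tensor() copies the data, so later changes to A do not affect C.
C = torch.tensor(A)
A[1] = 200.0
print(A[1], C[1].item())   # 200.0 vs. the original value 1.0
```

Running the sketch shows the first pair of values changing together, while the `torch.tensor` copy keeps its original value.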