
Commit

Linear Algebra: Chapter 5 - 3D Linear Transformations - Straightforward example to puzzle (#424)

* Update index.mdx

A more applicable example in the final puzzle introducing non-square matrices. The new example is also better related to the puzzle question.
Also caught a few typos - hopefully didn't introduce any.

* Update index.mdx

fixed indices of matrices from ij,ik,... to 1,2,...

* Update index.mdx

changed example back to original,
modified typo fix as suggested.
JJones780 committed May 23, 2024
1 parent b0c3bf5 commit 8c348f8
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions public/content/lessons/2016/3d-transformations/index.mdx
@@ -378,7 +378,7 @@ Here's another puzzle for you: It's also meaningful to talk about a linear trans

A simple example from 3d to 2d would be the shadow cast by a 3d object onto a 2d plane. However, in order for the transformation to be linear, the light rays should be considered parallel to each other to simplify the problem. If you consider the light source to be something really far away, like the sun, then this is a reasonable choice to make.

-You can represent these transformations with matrices. The number of columns corresponds to the dimension of the input and the number of rows corresponds to the dimension of the output. For example, a matrix that maps coordinates on a sphere to a plane would have three columns and two rows.
+You can represent these transformations with matrices. The number of columns corresponds to the dimension of the input and the number of rows corresponds to the dimension of the output. For example, a matrix that maps coordinates in 3d to 2d would have three columns and two rows.

$$
A = \left[\begin{array}{ccc}
@@ -387,7 +387,7 @@ a_4 & a_5 & a_6
\end{array}\right]
$$

-It's meaningful to talk about multiplying these matrices when the number of columns on the left matrix is equal to the number of rows on the right matrix. That whey when apply these matrices to a vector, reading right to left, the dimensions of the input and output match up.
+It's meaningful to talk about multiplying these matrices when the number of columns on the left matrix is equal to the number of rows on the right matrix. That way, when we apply these matrices to a vector, reading right to left, the dimensions of the input and output are as we expect.

$$
\left[\begin{array}{cc}
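The passage changed in this commit states the dimension rule for rectangular matrices: columns match the input dimension, rows match the output dimension. A minimal plain-Python sketch of that rule (not part of the lesson file; the `matvec` helper is hypothetical, written just for illustration):

```python
def matvec(M, v):
    """Apply matrix M (a list of rows) to vector v.

    Each row must have as many entries as v has components:
    the column count is the input dimension, the row count
    is the output dimension.
    """
    assert all(len(row) == len(v) for row in M), \
        "column count must equal the input dimension"
    return [sum(m * x for m, x in zip(row, v)) for row in M]

# Two rows, three columns: a map from 3d to 2d,
# matching the 2x3 matrix A in the lesson text.
A = [[1, 2, 3],
     [4, 5, 6]]

w = matvec(A, [1, 0, -1])  # a 3d input vector
print(w)                   # [-2, -2]: a 2d output vector
```

The same shape bookkeeping explains the multiplication rule quoted in the second hunk: composing maps `B` then `A` (reading right to left) only makes sense when `A`'s column count equals `B`'s row count, so each intermediate vector has the dimension the next matrix expects.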
