Thread: Using the product rule for a partial derivative of a matrix / vector function...

1. Suppose we have a function consisting of a series of matrices multiplied by a vector:
f(X) = A * B * b
--where X is a vector containing elements that are contained within A, B, and/or b,
--A is a matrix, B is a matrix, and b is a vector

Each matrix and the vector are expressed in terms of more variables, ie...
X = (x1, x2, x3)

A =
[ x1 + y1   y4        y7      ]
[ y2        x2 + y5   y8      ]
[ y3        y6        x3 + y9 ]

B =
[ y1        x2 + y4   x3 + y7 ]
[ x1 + y2   y5        y8      ]
[ y3        y6        y9      ]

b = [y1 y2 y3]' (' means transposed)

Now we want to find the Jacobian of f - ie the partial derivative of f wrt X.

One way to do this is to multiply the two matrices together, then multiply the result by the vector, producing one 3x1 vector in which each element is an algebraic expression resulting from the matrix multiplication. The partial derivatives could then be computed element by element to form a 3x3 Jacobian. This would be feasible in the above example, but the problem I'm actually working on is a lot more complicated (and so I would also have to look for patterns in order to simplify the result afterwards).
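For what it's worth, this brute-force approach can be sketched symbolically. A minimal sketch using SymPy (assuming SymPy is available), with the example matrices above; `f` is the 3x1 vector of expressions and `jacobian` does the element-by-element differentiation:

```python
# Direct approach: multiply everything out, then differentiate per element.
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')
y = sp.symbols('y1:10')  # y[0] = y1, ..., y[8] = y9

A = sp.Matrix([[x1 + y[0], y[3],      y[6]],
               [y[1],      x2 + y[4], y[7]],
               [y[2],      y[5],      x3 + y[8]]])
B = sp.Matrix([[y[0],      x2 + y[3], x3 + y[6]],
               [x1 + y[1], y[4],      y[7]],
               [y[2],      y[5],      y[8]]])
b = sp.Matrix([y[0], y[1], y[2]])

f = A * B * b                    # 3x1 vector of algebraic expressions
J = f.jacobian([x1, x2, x3])     # 3x3 Jacobian, element by element
```

For a larger problem this still produces large expressions, which is exactly the pattern-hunting problem described above, but it is a useful ground truth to check any product-rule decomposition against.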

I was wanting to try to use the chain rule and/or the product rule for partial derivatives if possible. However, with the product rule you end up with (dA/dX) * B * b + A * (dB/dX) * b + A * B * (db/dX), where each derivative is wrt the vector X. I understand that the derivative of a matrix wrt a vector is actually a 3rd order tensor, which is not easy to deal with. Even if that is not correct, the other terms still have to evaluate to matrices in order for the matrix addition to be valid. If I use the chain rule instead, I still end up with the derivative of a matrix wrt a vector.
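One way to sidestep the 3rd-order tensor is to apply the product rule one component x_k at a time: for each fixed k, dA/dx_k is an ordinary matrix (one "slice" of the tensor), and the product rule gives one column of the Jacobian. A sketch verifying this against the direct Jacobian with SymPy (assuming SymPy is available; symbols match the example above):

```python
# Product rule applied slice-by-slice: each column of the Jacobian is
# (dA/dx_k) B b + A (dB/dx_k) b + A B (db/dx_k), an ordinary matrix product.
import sympy as sp

x = sp.symbols('x1 x2 x3')
y = sp.symbols('y1:10')

A = sp.Matrix([[x[0] + y[0], y[3],        y[6]],
               [y[1],        x[1] + y[4], y[7]],
               [y[2],        y[5],        x[2] + y[8]]])
B = sp.Matrix([[y[0],        x[1] + y[3], x[2] + y[6]],
               [x[0] + y[1], y[4],        y[7]],
               [y[2],        y[5],        y[8]]])
b = sp.Matrix([y[0], y[1], y[2]])

f = A * B * b
J_direct = f.jacobian(list(x))

cols = []
for xk in x:
    dA = A.diff(xk)   # one matrix slice of the 3rd-order tensor dA/dX
    dB = B.diff(xk)
    db = b.diff(xk)   # zero here, since b contains no x's
    cols.append(dA * B * b + A * dB * b + A * B * db)
J_product = sp.Matrix.hstack(*cols)

assert sp.expand(J_direct - J_product) == sp.zeros(3, 3)
```

This is just the product rule from the text written per-column, so no tensor machinery is needed; the tensor only appears if you insist on differentiating wrt all of X at once.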

Is there an easier way to break down a matrix calculus problem like this? I've scoured the web and cannot seem to find a good direction.

2. Originally Posted by datahead8888 I understand that the derivative of a matrix wrt a vector is actually a 3rd order tensor, which is not easy to deal with.
Why not? A tensor can be represented by a matrix, and the same operations can be applied to it. I doubt very much that there is another solution to this apart from the two you have mentioned.

3. I guess I was a bit confused about how you can use a 3rd order tensor in a matrix calculus expression like this.
Would a 3rd order tensor (from the derivative of a matrix wrt a vector) times a matrix (the derivative of a vector wrt a vector) yield another matrix? How would the familiar rules of matrix multiplication (row times column) carry over to three dimensions with a 3rd order tensor times a matrix, or with one 3rd order tensor times another?
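The "row times column" rule generalises to a contraction over a shared index. A small numeric sketch with NumPy's einsum (random numbers as a stand-in for an actual dA/dX): if T[i, j, k] = dA[i, j]/dx_k, then contracting the j index with a vector b yields an ordinary matrix.

```python
# Tensor-times-vector as an index contraction: (T b)[i, k] = sum_j T[i,j,k] b[j].
import numpy as np

rng = np.random.default_rng(0)
T = rng.standard_normal((3, 3, 3))   # stand-in for the 3rd-order tensor dA/dX
b = rng.standard_normal(3)

# einsum spells out which index is summed over...
M = np.einsum('ijk,j->ik', T, b)

# ...which is exactly "row times column" with one spectator index k:
M_loop = np.zeros((3, 3))
for i in range(3):
    for k in range(3):
        M_loop[i, k] = sum(T[i, j, k] * b[j] for j in range(3))

assert np.allclose(M, M_loop)
```

Which indices get contracted is a choice you must state explicitly (that is what the index notation buys you); there is no single canonical "tensor times matrix" product the way there is for two matrices.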

If I could systematically and cleanly use 3rd order tensors for problems like this it would be wonderful - this is the only way to do problems like this arbitrarily without instead playing pattern recognition games between several separate results...

4. Originally Posted by datahead8888 If I could systematically and cleanly use 3rd order tensors for problems like this it would be wonderful - this is the only way to do problems like this arbitrarily without instead playing pattern recognition games between several separate results...
Perhaps you would benefit from an introductory text on tensor calculus and "index gymnastics" - you just need to be careful to remember that while every tensor can be given a matrix representation (in some coordinate basis), not every matrix is automatically a tensor.