Maths - Inertia tensor - Newsgroup Discussion

From: mjb
Newsgroups: sci.physics
Subject: Inertia Tensor Fundamentals
Date: Mon, 09 Apr 2007 08:27:53 +0100

People seem quite pedantic about using the term 'inertia tensor' rather
than, say, 'inertia matrix'. Why is this?

Is it because its form changes depending on the number of dimensions that we are working in? For instance:

When rotating in 2 dimensions, then the torque (scalar) is related to the
angular acceleration (scalar) by a scalar (grade 0 tensor) second moment
of mass.

When rotating in 3 dimensions, then the torque bivector is related to the
angular acceleration bivector by a 3x3 matrix (grade 2 tensor).

So what happens in 4 dimensions? (I'm not necessarily talking about
space-time here) I'm just trying to extend the above sequence to 4 space
dimensions. As I understand it, both torque and angular acceleration would
be represented by a 6 dimensional bivector? So how do we relate these two
6D bivectors? Do we use a 6x6 matrix? Or if we follow the above sequence then it would be some sort of grade 4 hypermatrix? But I can't imagine how this fits into the equation T = I a?

So my best guess so far is that the inertia tensor is always a matrix? Is
there a scientific way to work this out? I would do an experiment but I
don't know how to apply a torque in 4D?

Is the importance of the term 'inertia tensor' more to do with the way that
it changes with a change of coordinates? For instance if we rotate the
coordinates with the rotation matrix (or is it a tensor) [R]? then:
T' = [R] T
a' = [R] a
so:
T' = [R] I [R]^t a'
so
I' = [R] I [R]^t
I don't know much about tensors; do all tensors vary in this way with a
change of coordinates?
Does it make any difference that 'T' and 'a' are bivectors (as opposed to
vectors)?
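[Editor's note: the transformation law asked about here can be checked numerically. A minimal Python sketch, treating T and a as 3-vectors (the 3D duals of the bivectors) and using an arbitrary made-up inertia matrix and rotation angle: if T = I a, T' = [R] T and a' = [R] a, then I' = [R] I [R]^t does satisfy T' = I' a'.]

```python
import math

# Numerical check of the transformation law: with T = I a, T' = R T and
# a' = R a, the matrix I' = R I R^t satisfies T' = I' a'.
# 3x3 matrices as nested lists; I and the angle are arbitrary stand-ins.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(A):
    return [[A[j][i] for j in range(3)] for i in range(3)]

def matvec(A, v):
    return [sum(A[i][k] * v[k] for k in range(3)) for i in range(3)]

th = 0.7  # rotation about the z axis
R = [[math.cos(th), -math.sin(th), 0.0],
     [math.sin(th),  math.cos(th), 0.0],
     [0.0,           0.0,          1.0]]

I = [[2.0, 0.3, 0.1],  # an arbitrary symmetric inertia matrix
     [0.3, 3.0, 0.2],
     [0.1, 0.2, 4.0]]

a  = [1.0, -2.0, 0.5]                    # 'angular acceleration'
Tp = matvec(R, matvec(I, a))             # T' = R (I a)
ap = matvec(R, a)                        # a' = R a
Ip = matmul(matmul(R, I), transpose(R))  # I' = R I R^t

# T' should equal I' a'
assert all(abs(x - y) < 1e-12 for x, y in zip(Tp, matvec(Ip, ap)))
```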

Martin


From: "Eric Gisse"
Newsgroups: sci.physics
Subject: Re: Inertia Tensor Fundamentals
Date: 9 Apr 2007 00:43:22 -0700

On Apr 8, 11:27 pm, mjb <mjb> wrote:
> People seem quite pedantic about using the term 'inertia tensor', rather
> than say 'inertia matrix', why is this?

A tensor is not a matrix.

A rank 2 tensor has a matrix representation, but that does not mean an
arbitrary tensor is representable as a matrix.

>
> Is it because its form changes depending on the number of dimensions that we
> are working in? For instance:

No.

[...]

> So my best guess so far is that the inertia tensor is always a matrix? Is
> there a scientific way to work this out? I would do an experiment but I
> don't know how to apply a torque in 4D?

A rank 2 tensor is always representable as a nxn matrix where n is the
number of dimensions. The three dimensional inertia tensor has a 3x3
matrix representation, for example.
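[Editor's note: the 3x3 matrix representation can be built directly from the standard point-mass formula I_jk = sum_i m_i (|r_i|^2 delta_jk - r_ij r_ik). A minimal Python sketch; the masses and positions are made up for illustration.]

```python
# Sketch: inertia tensor of point masses from
# I_jk = sum_i m_i (|r_i|^2 delta_jk - r_ij r_ik).
def inertia_tensor(masses, positions):
    I = [[0.0] * 3 for _ in range(3)]
    for m, r in zip(masses, positions):
        r2 = sum(c * c for c in r)
        for j in range(3):
            for k in range(3):
                I[j][k] += m * ((r2 if j == k else 0.0) - r[j] * r[k])
    return I

# Two unit masses on the x axis: no resistance to spin about x,
# and I = 2 m d^2 about y and z.
I = inertia_tensor([1.0, 1.0], [(1.0, 0.0, 0.0), (-1.0, 0.0, 0.0)])
print(I[0][0], I[1][1], I[2][2])   # 0.0 2.0 2.0
```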

>
> Is the importance of the term 'inertia tensor' more to do with the way that
> it changes with a change of coordinates? For instance if we rotate the
> coordinates with the rotation matrix (or is it a tensor) [R]? then:
> T' = [R] T
> a' = [R] a
> so:
> T' = [R] I [R]^t a'
> so
> I' = [R] I [R]^t
> I don't know much about tensors, do all tensors vary in this way with a
> change of dimensions?

Actually, that is exactly how they transform. It is just that they
aren't representable as a single matrix at higher ranks.

> Does it make any difference that 'T' and 'a' are bivectors (as opposed to
> vectors)?

Vector - a (0,1) tensor. Bivector - a (1,0) tensor.

The inertia tensor is a rank 2 symmetric tensor - and is not a
bivector.

>
> Martin


From: mjb
Newsgroups: sci.physics
Subject: Re: Inertia Tensor Fundamentals
Date: Mon, 09 Apr 2007 09:03:08 +0100

> A rank 2 tensor is always representable as a nxn matrix where n is the
> number of dimensions. The three dimensional inertia tensor has a 3x3
> matrix representation, for example.

So what form would an inertia tensor take in 4 dimensions? From what you say it's not a 6x6 matrix? Is it a grade 4 tensor?

> Vector - a (0,1) tensor. Bivector - a (1,0) tensor.

So is a bivector the same as a covector?

Martin


From: "Eric Gisse"
Newsgroups: sci.physics
Subject: Re: Inertia Tensor Fundamentals
Date: 9 Apr 2007 08:26:22 -0700

On Apr 9, 12:03 am, mjb <mjb> wrote:
> > A rank 2 tensor is always representable as a nxn matrix where n is the
> > number of dimensions. The three dimensional inertia tensor has a 3x3
> > matrix representation, for example.
>
> So what form would an inertia tensor take in 4 dimensions? From
> what you say its not a 6x6 matrix? is it a grade 4 tensor?

4x4.

The rank of a tensor has to do with the number of indexes it carries.
However you don't have a four dimensional inertia tensor - there are
only 3 spatial dimensions.

>
> > Vector - a (0,1) tensor. Bivector - a (1,0) tensor.
>
> So is a bivector the same as a covector?

Sounds like it.

>
> Martin


From: mjb
Newsgroups: sci.physics
Subject: Re: Inertia Tensor Fundamentals
Date: Mon, 09 Apr 2007 17:11:36 +0100

Eric Gisse wrote:

>> So what form would an inertia tensor take in 4 dimensions? From what
>> you say its not a 6x6 matrix? is it a grade 4 tensor?
>
> 4x4.

So are you saying that in 4 dimensions the inertia tensor is represented by
a 4x4 matrix? (i.e. two indices) This would imply that an inertia tensor is
always a matrix? I thought (from reading Feynman's lectures) that rotation
in 'n' dimensions happens in a 'directed plane' which is represented by a
bivector. So in 4D there are 6 combinations of 2 from 4, so this bivector is
represented by 6 scalar values. So if it is a matrix it should be 6x6?
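[Editor's note: the count of 6 follows from choosing an unordered pair of axes out of n, i.e. n(n-1)/2 independent components. A quick Python check for the dimensions discussed in this thread:]

```python
from itertools import combinations

# Independent components of a bivector (one antisymmetric index pair)
# in n dimensions: one per unordered pair of axes, i.e. n(n-1)/2.
counts = {n: len(list(combinations(range(n), 2))) for n in (2, 3, 4)}
print(counts)   # {2: 1, 3: 3, 4: 6}
```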

> The rank of a tensor has to do with the number of indexes it carries.
> However you don't have a four dimensional inertia tensor - there are
> only 3 spatial dimensions.

Well I must admit I haven't seen 4 spatial dimensions, but I thought one of
the big advantages of tensors is that the concepts can be extended to 'n'
dimensions, as a mathematical exercise at least, even if it does not
represent a proven physical theory. Are you saying that tensors can't be
used in this way?

>> > Vector - a (0,1) tensor. Bivector - a (1,0) tensor.
>>
>> So is a bivector the same as a covector?

This might be an issue of terminology? I have been using the term bivector
to mean the result of cross product of two vectors, which represents a
directed plane, which can represent (infinitesimal or continuous) rotation?
And I've been using covector to represent a covariant index or 'linear
operator'.
Is this correct terminology? And is there a way to show if these two are the
same thing?

Thanks,

Martin


From: "Eric Gisse"
Newsgroups: sci.physics
Subject: Re: Inertia Tensor Fundamentals
Date: 11 Apr 2007 14:40:51 -0700

On Apr 9, 8:11 am, mjb wrote:
> Eric Gisse wrote:
> >> So what form would an inertia tensor take in 4 dimensions? From what you
> >> say its not a 6x6 matrix? is it a grade 4 tensor?
>
> > 4x4.
>
> So are you saying that in 4 dimensions the inertia tensor is represented by
> a 4x4 matrix? (ie two indices) This would imply that an inertia tensor is
> always a matrix? I thought (from reading Feynmans lectures) that rotation
> in 'n' dimensions happens in a 'directed plane' which is represented by a
> bivector. So in 4D there are 6 combinations of 2 from 4 so this bivector is
> represented by 6 scalar values. So if it is a matrix it should be 6x6?
>

Rotations in general can be described by a tensor.

There isn't a 4 dimensional inertia tensor anyway - there are only 3
spatial dimensions. This does not change in relativity.

Look - each index grants you n entries. A rank 1 tensor - n. Rank 2 -
n^2. Rank 3 - n^3. You can describe a rank 3 tensor with n nxn
matrices, but not with one single matrix.

> > The rank of a tensor has to do with the number of indexes it carries.
> > However you don't have a four dimensional inertia tensor - there are
> > only 3 spatial dimensions.
>
> Well I must admit I haven't seen 4 spacial dimensions but I thought one of
> the big advantages of tensors is that the concepts can be extended to 'n'
> dimensions, as a mathematical exercise at least, even if it does not
> represent a proven physical theory. Are you saying that tensors can't be
> used in this way?

It can, it just doesn't have any physical meaning.

>
> >> > Vector - a (0,1) tensor. Bivector - a (1,0) tensor.
>
> >> So is a bivector the same as a covector?
>
> This might be an issue of terminology? I have been using the term bivector
> to mean the result of cross product of two vectors, which represents a
> directed plane, which can represent (infinitesimal or continuous) rotation?

Ah. That makes more sense now.

A bivector in 3 dimensions is a 3-vector, but that is not true in
general. In general, it is a tensor.

It does _NOT_ represent rotation. It represents something orthogonal
to the two things you are wedging.

> And I've been using covector to represent covariant index or 'linear
> operator'.
> Is this correct terminology? and is there a way to show if these two are the
> same thing?

No, because they are not.

>
> Thanks,
>
> Martin


From: "Jim Black"
Newsgroups: sci.physics
Subject: Re: Inertia Tensor Fundamentals
Date: 9 Apr 2007 17:55:11 -0700

On Apr 9, 2:43 am, "Eric Gisse" wrote:
> On Apr 8, 11:27 pm, mjb <mjb> wrote:
> > Does it make any difference that 'T' and 'a' are bivectors (as opposed to
> > vectors)?
>
> Vector - a (0,1) tensor. Bivector - a (1,0) tensor.

You're thinking of covectors, and I think your order is opposite to
the standard one, in which a vector is of type (1,0) and a covector of
type (0,1). A bivector is an antisymmetric tensor of type (2,0).

--
Jim E. Black


From: mjb
Newsgroups: sci.physics
Subject: Re: Inertia Tensor Fundamentals
Date: Tue, 10 Apr 2007 08:23:07 +0100

Jim Black wrote:

> You're thinking of covectors, and I think your order is opposite to
> the standard one, in which a vector is of type (1,0) and a covector of
> type (0,1).  A bivector is an antisymmetric tensor of type (2,0).

Jim,

I came across bivectors through Geometric Algebra/Clifford Algebra where
they represent directed areas and rotations in 'n' dimensions. In Geometric
Algebra they appear as if they have one index, but then I don't think it
has the concept of multiple indices? How would I define the bivector in
tensor algebra? Is it the outer product (or do I mean the wedge product?)
of two basis vectors in the plane required?

I guess I need to find out about the various types of multiplication in
tensor algebra. Do you know of any sources that don't assume too much maths
background?

Thanks,

Martin


From: "Jim Black"
Newsgroups: sci.physics
Subject: Re: Inertia Tensor Fundamentals
Date: 10 Apr 2007 21:51:44 -0700

On Apr 10, 2:23 am, mjb <mjb> wrote:
> Jim Black wrote:
> > You're thinking of covectors, and I think your order is opposite to
> > the standard one, in which a vector is of type (1,0) and a covector of
> > type (0,1). A bivector is an antisymmetric tensor of type (2,0).
>
> Jim,
>
> I came across bivectors through Geometric Algebra/Clifford Algebra where
> they represent directed areas and rotations in 'n' dimensions. In Geometric
> Algebra they appear as if they have one index, but then I don't think it
> has the concept of multiple indices? How would I define the bivector in
> tensor algebra? Is it the outer product (or do I mean the wedge product?)
> of two basis vectors in the plane required?

Unfortunately I'm not familiar with Clifford Algebra. In tensor
notation, you can do a wedge product by taking the tensor product and
antisymmetrizing:

A_uv = (1/2) (B_u C_v - B_v C_u)
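[Editor's note: a small Python sketch of this antisymmetrization, with made-up vectors, also showing how in 3D the independent components reproduce the cross product (up to the factor of 1/2).]

```python
# Sketch of the antisymmetrized (wedge) product
# A_uv = (1/2)(B_u C_v - B_v C_u); B and C are made-up 3-vectors.
def wedge(B, C):
    n = len(B)
    return [[0.5 * (B[u] * C[v] - B[v] * C[u]) for v in range(n)]
            for u in range(n)]

B, C = [1.0, 2.0, 3.0], [4.0, 5.0, 6.0]
A = wedge(B, C)

# In 3D the independent components relate to the cross product:
# (B x C)_x = 2 * A[1][2], etc. (cyclic).
cross = [B[1]*C[2] - B[2]*C[1],
         B[2]*C[0] - B[0]*C[2],
         B[0]*C[1] - B[1]*C[0]]
assert abs(2 * A[1][2] - cross[0]) < 1e-12
assert abs(2 * A[2][0] - cross[1]) < 1e-12
assert abs(2 * A[0][1] - cross[2]) < 1e-12
```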

> I guess I need to find out about the various types of multiplication in
> tensor algebra. Do you know of any sources that don't assume too much maths
> background?

I don't know how much I can help you with picking a good textbook,
although you may find this online text helpful:

http://www.math.odu.edu/~jhh/counter2.html

Fortunately there's not too much to learn. The tensor product is done
just by multiplying components, and everything else you can get by
permuting indices and contracting.
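[Editor's note: for example, a Python sketch with made-up components: the tensor product multiplies components, and contracting the resulting pair of indices recovers the dot product.]

```python
# Tensor (outer) product just multiplies components: P_uv = B_u C_v.
# Contracting over u = v then recovers the familiar dot product.
def outer(B, C):
    return [[b * c for c in C] for b in B]

B, C = [1, 2, 3], [4, 5, 6]
P = outer(B, C)                       # a rank-2 array of products
dot = sum(P[u][u] for u in range(3))  # contraction over the index pair
print(dot)   # 32
```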

--
Jim E. Black


From: Uncle Al
Newsgroups: sci.physics
Subject: Re: Inertia Tensor Fundamentals
Date: Mon, 09 Apr 2007 08:06:42 -0700

mjb wrote:
>
> People seem quite pedantic about using the term 'inertia tensor', rather
> than say 'inertia matrix', why is this?
>
> Is it because its form changes depending on the number of dimensions that we
> are working in? For instance:
>
> When rotating in 2 dimensions, then the torque (scalar) is related to the
> angular acceleration (scalar) by a scalar (grade 0 tensor) second moment
> of mass.
>
> When rotating in 3 dimensions, then the torque bivector is related to the
> angular acceleration bivector by a 3x3 matrix (grade 2 tensor)
>
> So what happens in 4 dimensions? (I'm not necessarily talking about
> space-time here) I'm just trying to extend the above sequence to 4 space
> dimensions. As I understand it, both torque and angular acceleration would
> be represented by a 6 dimensional bivector? So how do we relate these two
> 6D bivectors? Do we use a 6x6 matrix? Or if we follow the above sequence
> then it would be some sort of grade 4 hypermatrix? But I can't imagine how
> this fits into the equation T = I a?

There are no rotation axes in 4-D.

> So my best guess so far is that the inertia tensor is always a matrix? Is
> there a scientific way to work this out? I would do an experiment but I
> don't know how to apply a torque in 4D?
>
> Is the importance of the term 'inertia tensor' more to do with the way that
> it changes with a change of coordinates? For instance if we rotate the
> coordinates with the rotation matrix (or is it a tensor) [R]? then:
> T' = [R] T
> a' = [R] a
> so:
> T' = [R] I [R]^t a'
> so
> I' = [R] I [R]^t
> I don't know much about tensors, do all tensors vary in this way with a
> change of dimensions?
> Does it make any difference that 'T' and 'a' are bivectors (as opposed to
> vectors)?
>
> Martin

--
Uncle Al


From: mjb
Newsgroups: sci.physics
Subject: Re: Inertia Tensor Fundamentals
Date: Mon, 09 Apr 2007 16:44:15 +0100

Uncle Al wrote:
> There are no rotation axes in 4-D.

I don't think I used the word 'axes' did I?

I thought (from reading Feynman's lectures) that rotation in 'n' dimensions
happens in a 'directed plane' which is represented by a bivector.

So in 4D there are 6 combinations of 2 from 4 so this bivector is
represented by 6 scalar values.

As I understand it 3D is a special case where a vector is the dual of the
bivector.

What are you trying to say here? are you telling me I'm wrong about this?

Martin


From: John C. Polasek
Newsgroups: sci.physics
Subject: Re: Inertia Tensor Fundamentals
Date: Mon, 09 Apr 2007 21:24:54 -0400

On Mon, 09 Apr 2007 08:27:53 +0100, mjb wrote:

>People seem quite pedantic about using the term 'inertia tensor', rather
>than say 'inertia matrix', why is this?
>
>Is it because its form changes depending on the number of dimensions that we
>are working in? For instance:
>
>When rotating in 2 dimensions, then the torque (scalar) is related to the
>angular acceleration (scalar) by a scalar (grade 0 tensor) second moment
>of mass.
>
>When rotating in 3 dimensions, then the torque bivector is related to the
>angular acceleration bivector by a 3x3 matrix (grade 2 tensor)
>
>So what happens in 4 dimensions? (I'm not necessarily talking about
>space-time here) I'm just trying to extend the above sequence to 4 space
>dimensions. As I understand it, both torque and angular acceleration would
>be represented by a 6 dimensional bivector? So how do we relate these two
>6D bivectors? Do we use a 6x6 matrix? Or if we follow the above sequence
>then it would be some sort of grade 4 hypermatrix? But I can't imagine how
>this fits into the equation T = I a?
>
>So my best guess so far is that the inertia tensor is always a matrix? Is
>there a scientific way to work this out? I would do an experiment but I
>don't know how to apply a torque in 4D?
>
>Is the importance of the term 'inertia tensor' more to do with the way that
>it changes with a change of coordinates? For instance if we rotate the
>coordinates with the rotation matrix (or is it a tensor) [R]? then:
>T' = [R] T
>a' = [R] a

No, T' = [R]^t T [R]
You must premultiply and postmultiply the inertia tensor T by the
rotation matrix and its inverse.
In 4 dimensions you'll find it hard to define a rotation matrix.

>so:
>T' = [R] I [R]^t a'
>so
>I' = [R] I [R]^t
>I don't know much about tensors, do all tensors vary in this way with a
>change of dimensions?
>Does it make any difference that 'T' and 'a' are bivectors (as opposed to
>vectors)?
>
>Martin
John Polasek


From: mjb
Newsgroups: sci.physics
Subject: Re: Inertia Tensor Fundamentals
Date: Tue, 10 Apr 2007 08:37:46 +0100

John C. Polasek wrote:
> In 4 dimensions you'll find it hard to define a rotation matrix.

I read somewhere, I can't remember where and my memory may be faulty, that
one of the advantages of tensors is that equations can be defined in a way
that applies to any number of dimensions? Is this not true?

Is there a way to define the rotation matrix (or should that be rotation
tensor?) in terms of bivectors?

Thanks,

Martin


From: "Jim Black"
Newsgroups: sci.physics
Subject: Re: Inertia Tensor Fundamentals
Date: 9 Apr 2007 18:29:36 -0700

On Apr 9, 2:27 am, mjb <mjb> wrote:
> People seem quite pedantic about using the term 'inertia tensor', rather
> than say 'inertia matrix', why is this?
>
> Is it because its form changes depending on the number of dimensions that we
> are working in? For instance:
>
> When rotating in 2 dimensions, then the torque (scalar) is related to the
> angular acceleration (scalar) by a scalar (grade 0 tensor) second moment
> of mass.
>
> When rotating in 3 dimensions, then the torque bivector is related to the
> angular acceleration bivector by a 3x3 matrix (grade 2 tensor)
>
> So what happens in 4 dimensions? (I'm not necessarily talking about
> space-time here) I'm just trying to extend the above sequence to 4 space
> dimensions. As I understand it, both torque and angular acceleration would
> be represented by a 6 dimensional bivector? So how do we relate these two
> 6D bivectors? Do we use a 6x6 matrix? Or if we follow the above sequence
> then it would be some sort of grade 4 hypermatrix? But I can't imagine how
> this fits into the equation T = I a?

You want a linear map from rank-2 tensors to rank-2 tensors. You need
a rank-4 tensor. The equation T = I a would now look like:

T_mn = (1/2) I_mnpr a_pr

Repeated indices indicate summation, so what I've written above
actually means:

T_mn = (1/2) sum_{p=1}^{4} sum_{r=1}^{4} I_mnpr a_pr

I've thrown in the factor of (1/2) to compensate for duplicate terms
such as I_1212 a_12 and I_1221 a_21.
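[Editor's note: a Python sketch of this contraction. The components of I below are a made-up example, chosen so that I acts as the identity on antisymmetric arrays, which makes the factor of (1/2) easy to check.]

```python
n = 4
d = lambda i, j: 1.0 if i == j else 0.0   # Kronecker delta

# Made-up rank-4 example: I_mnpr = d(m,p)d(n,r) - d(m,r)d(n,p), which
# acts as the identity on antisymmetric a, so contraction returns a.
I = [[[[d(m, p) * d(q, r) - d(m, r) * d(q, p)
        for r in range(n)] for p in range(n)]
      for q in range(n)] for m in range(n)]

def contract(I, a, n):
    # T_mn = (1/2) sum_{p,r} I_mnpr a_pr, written as explicit sums
    T = [[0.0] * n for _ in range(n)]
    for m in range(n):
        for q in range(n):
            for p in range(n):
                for r in range(n):
                    T[m][q] += 0.5 * I[m][q][p][r] * a[p][r]
    return T

a = [[0.0] * n for _ in range(n)]
a[0][1], a[1][0] = 2.0, -2.0   # an antisymmetric 'angular acceleration'
T = contract(I, a, n)
assert T[0][1] == 2.0 and T[1][0] == -2.0
```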

--
Jim E. Black



From: mjb
Newsgroups: sci.physics
Subject: Re: Inertia Tensor Fundamentals
Date: Tue, 10 Apr 2007 08:50:58 +0100

Jim Black wrote:
> You want a linear map from rank-2 tensors to rank-2 tensors. You need
> a rank-4 tensor. The equation T = I a would now look like:
>
> T_mn = (1/2) I_mnpr a_pr
>
> Repeated indices indicate summation, so what I've written above
> actually means:
>
> T_mn = (1/2) sum_{p=1}^{4} sum_{r=1}^{4} I_mnpr a_pr
>
> I've thrown in the factor of (1/2) to compensate for duplicate terms
> such as I_1212 a_12 and I_1221 a_21.

Does this mean that, in general, Inertia Tensors are always rank-4?

Is it just in 3 dimensions that we can take advantage of the duality between
vectors and bivectors and therefore reduce the torque and angular
acceleration to rank 1 and the inertia tensor to rank 2?
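[Editor's note: the 3D duality asked about here can be written as v_i = (1/2) eps_ijk B_jk using the Levi-Civita symbol. A minimal Python sketch with a made-up bivector:]

```python
# In 3D the dual of an antisymmetric rank-2 array B is the vector
# v_i = (1/2) eps_ijk B_jk -- the reduction from rank 2 to rank 1.
def eps(i, j, k):
    # Levi-Civita symbol for indices 0, 1, 2
    if (i, j, k) in {(0, 1, 2), (1, 2, 0), (2, 0, 1)}: return 1
    if (i, j, k) in {(0, 2, 1), (2, 1, 0), (1, 0, 2)}: return -1
    return 0

def dual(B):
    return [0.5 * sum(eps(i, j, k) * B[j][k]
                      for j in range(3) for k in range(3))
            for i in range(3)]

B = [[ 0,  3, -2],
     [-3,  0,  1],
     [ 2, -1,  0]]   # made-up bivector with dual components (1, 2, 3)
print(dual(B))   # [1.0, 2.0, 3.0]
```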

Thanks,

Martin


From: "Jim Black"
Newsgroups: sci.physics
Subject: Re: Inertia Tensor Fundamentals
Date: 10 Apr 2007 21:55:39 -0700

On Apr 10, 2:50 am, mjb <mjb> wrote:
> Jim Black wrote:
> > You want a linear map from rank-2 tensors to rank-2 tensors. You need
> > a rank-4 tensor. The equation T = I a would now look like:
>
> > T_mn = (1/2) I_mnpr a_pr
>
> > Repeated indices indicate summation, so what I've written above
> > actually means:
>
> > T_mn = (1/2) sum_{p=1}^{4} sum_{r=1}^{4} I_mnpr a_pr
>
> > I've thrown in the factor of (1/2) to compensate for duplicate terms
> > such as I_1212 a_12 and I_1221 a_21.
>
> Does this mean that, in general, Inertia Tensors are always rank-4?
>
> Is it just in 3 dimensions we can take advantage of the duality between
> vectors and bivectors and therefore reduce the torque and ang acceleration
> to rank 1 and inertia tensor to rank 2?

Yes.

--
Jim E. Black


From: John C. Polasek
Newsgroups: sci.physics
Subject: Re: Inertia Tensor Fundamentals
Date: Wed, 11 Apr 2007 15:43:27 -0400

On Mon, 09 Apr 2007 08:27:53 +0100, mjb wrote:

>People seem quite pedantic about using the term 'inertia tensor', rather
>than say 'inertia matrix', why is this?

Because it is most easily manipulated as a matrix.

>Is it because its form changes depending on the number of dimensions that we
>are working in? For instance:
>
>When rotating in 2 dimensions, then the torque (scalar) is related to the
>angular acceleration (scalar) by a scalar (grade 0 tensor) second moment
>of mass.
>
>When rotating in 3 dimensions, then the torque bivector is related to the
>angular acceleration bivector by a 3x3 matrix (grade 2 tensor)
>
>So what happens in 4 dimensions? (I'm not necessarily talking about
>space-time here) I'm just trying to extend the above sequence to 4 space
>dimensions. As I understand it, both torque and angular acceleration would
>be represented by a 6 dimensional bivector? So how do we relate these two
>6D bivectors? Do we use a 6x6 matrix? Or if we follow the above sequence
>then it would be some sort of grade 4 hypermatrix? But I can't imagine how
>this fits into the equation T = I a?
>
>So my best guess so far is that the inertia tensor is always a matrix? Is
>there a scientific way to work this out? I would do an experiment but I
>don't know how to apply a torque in 4D?
>
>Is the importance of the term 'inertia tensor' more to do with the way that
>it changes with a change of coordinates? For instance if we rotate the
>coordinates with the rotation matrix (or is it a tensor) [R]? then:
>T' = [R] T
No, T' = [R]^t T [R]
You must premultiply and postmultiply the inertia tensor T by the
rotation matrix and its inverse when they are written as matrices.
In 4 dimensions you'll find it hard to define a rotation matrix.

>a' = [R] a
No, Torque = RTR'a with column vectors, or
Torque = aR'TR with row vectors,
where a = domega/dt.
Can you see the logic in the last equation? Even if we don't know how
to transform a rank-2 tensor, we leave it alone, and take a into R'
to rotate a backwards, run it through the tensor, then rotate it
forward with R. It's a tensor sandwich.

>T' = [R] I [R]^t a'
The last is nonsense - it says T' = a', since the two R's multiply to the identity
>I' = [R] I [R]^t
>I don't know much about tensors, do all tensors vary in this way with a
>change of dimensions?
T is a tensor because it transforms like its vectors do, and the
operations can be carried out as matrices or as subscripted tensor
components.
Just because a matrix is declared a tensor doesn't make it so. The
metric tensor of relativity is really a mirror matrix, because it
doesn't transform like its vectors and because its determinant is -1.
Forget this mapping folderol. In physics a rank-2 tensor represents
the characteristics of a he-man object.
>Does it make any difference that 'T' and 'a' are bivectors (as opposed to
>vectors)?
You could do with a little study, but pick your text carefully. If
it's by a full-fledged mathematician you'll be lost shortly.
>Martin
John Polasek



metadata block
see also:
Correspondence about this page

Book Shop - Further reading.

Where I can, I have put links to Amazon for books that are relevant to the subject; follow the link to get more details of the book or to buy it.

Schaum's Outline of Theory and Problems of Tensor Calculus - I'm finding this hard going. It starts off with a review of linear algebra, matrix notation, etc. It redefines a lot of conventions which are hard to relearn, such as superscripts instead of subscripts to identify elements, and a summation convention; then it goes into coordinate transformations. I can't find any definition of what a tensor is. I think this book is aimed at people who already have some knowledge of the subject.

I can't find any mention in this book of the term hyper-matrix.


This site may have errors. Don't use for critical systems.

Copyright (c) 1998-2017 Martin John Baker - All rights reserved - privacy policy.