One of the difficult things to understand about tensors is their terminology, especially the indices. Here we introduce contravariant and covariant tensor indices.
We have already seen how the rank and dimension of tensors relates to their indices.
 Rank: the number of indices required to determine a component.
 Dimension: the range of each index.
We now see that it is useful to have 2 types of indices:
 Contravariant tensor indices, shown as superscripts.
 Covariant tensor indices, shown as subscripts.
Vectors and Covectors
Usually a vector is denoted as a column like this:
a 
b 
c 
This can represent, say, a position relative to the origin at x=a, y=b, z=c.
Sometimes we might write this as (a,b,c) just to make it more convenient to write on the page.
However what is the meaning of a true row vector like this?
a  b  c 
It takes the values of x, y and z and gives a scalar with the following value:
x*a + y*b + z*c
We can express these in tensor terminology as follows, for a vector we can use superscripts to denote the elements:
t^{1} 
t^{2} 
t^{3} 
and for a covector we can use subscripts:
t_{1}  t_{2}  t_{3} 
which, applied to a vector with components x^{i}, gives a scalar value of:
s = ∑ t_{i}x^{i}
where:
 i = index values that we are summing over, in the above case 1, 2 and 3.
 t_{i} = tensor elements, in this case the values of the covector.
 x^{i} = components of the vector, measured against a unit length basis.
 s = scalar value.
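We can check this numerically. Here is a minimal Python sketch, with illustrative component values (not taken from the text), showing a covector applied to a vector to produce a scalar:

```python
# Covector components t_1, t_2, t_3 and vector components x^1, x^2, x^3
# (illustrative values only).
t = [2.0, 3.0, 4.0]   # covector (row)
x = [1.0, 0.5, 2.0]   # vector (column)

# s = sum over i of t_i * x^i, exactly the x*a + y*b + z*c pattern above.
s = sum(ti * xi for ti, xi in zip(t, x))  # 2*1 + 3*0.5 + 4*2 = 11.5
```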
Matrix as a Tensor
If we combine vectors and covectors we can write a matrix as a tensor t^{i}_{j}:
t^{1}_{1}  t^{1}_{2}  t^{1}_{3}
t^{2}_{1}  t^{2}_{2}  t^{2}_{3}
t^{3}_{1}  t^{3}_{2}  t^{3}_{3}
with the superscripts denoting the rows and the subscripts denoting the columns. A matrix with one contravariant index and one covariant index is known as a 'linear operator'. It is possible to define other types of matrix, with two contravariant or two covariant indices, but for now, back to the linear operator:
p'^{1}     t^{1}_{1}  t^{1}_{2}  t^{1}_{3}   p^{1}
p'^{2}  =  t^{2}_{1}  t^{2}_{2}  t^{2}_{3}   p^{2}
p'^{3}     t^{3}_{1}  t^{3}_{2}  t^{3}_{3}   p^{3}
This is equivalent to these linear equations:
p'^{1} = t^{1}_{1} p^{1} + t^{1}_{2} p^{2} + t^{1}_{3} p^{3}
p'^{2} = t^{2}_{1} p^{1} + t^{2}_{2} p^{2} + t^{2}_{3} p^{3}
p'^{3} = t^{3}_{1} p^{1} + t^{3}_{2} p^{2} + t^{3}_{3} p^{3}
Remember that, in this context, superscripts are treated as indices and not exponents. The above equations can be represented as:
p'^{1} = ∑ t^{1}_{i} p^{i}
p'^{2} = ∑ t^{2}_{i} p^{i}
p'^{3} = ∑ t^{3}_{i} p^{i}
Where 'i' is summed over the values 1, 2 and 3. In general, where an index is repeated it is summed over its range, in this case 3 dimensions.
These three equations can be combined into one equation:
p'^{k} = ∑ t^{k}_{i} p^{i}
This can be interpreted as follows: the repeated index 'i' is summed in each equation, in this case over the values 1, 2 and 3. The remaining superscript index 'k' enumerates the equations.
Einstein's Summation Convention
A repeated index within a term of an equation indicates summation over that index. So we can remove the summation symbol ∑ in the above equations and it will be implied.
So the equation:
p'^{k} = t^{k}_{i} p^{i}
represents a complete matrix equation as above.
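NumPy's einsum function implements exactly this convention, so we can sketch the linear operator example numerically (the matrix and vector values below are illustrative assumptions, not from the text):

```python
import numpy as np

# Linear operator t^k_i (rows = contravariant index k, columns = covariant i).
t = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [4.0, 0.0, 1.0]])
p = np.array([1.0, 1.0, 1.0])

# p'^k = t^k_i p^i with the sum over the repeated index i implied.
p_prime = np.einsum('ki,i->k', t, p)

# The same result from the three explicit summations.
p_check = np.array([sum(t[k, i] * p[i] for i in range(3)) for k in range(3)])
```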
Basis
A vector 'a' can be defined as a linear combination of a set of basis vectors: e_{1}, e_{2}, e_{3} ...
This can be written as a vector equation:
a = a^{1} e_{1} + a^{2} e_{2} + a^{3} e_{3} ...
or in terms of linear equations:
a^{i} = a^{1} e_{1}^{i} + a^{2} e_{2}^{i} + a^{3} e_{3}^{i}
a^{j} = a^{1} e_{1}^{j} + a^{2} e_{2}^{j} + a^{3} e_{3}^{j}
a^{k} = a^{1} e_{1}^{k} + a^{2} e_{2}^{k} + a^{3} e_{3}^{k}
Don't forget: superscripts are indexes in this context, not exponents.
So the vector a is represented as a linear combination of the basis vectors. These basis vectors need to be linearly independent (in three dimensions: not parallel to each other and not all lying in one plane). It can provide simplifications if the basis vectors are orthogonal (mutually perpendicular), although this is not a requirement. We will treat the orthogonal and non-orthogonal cases separately.
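Finding the components a^{i} for a given basis amounts to solving the linear equations above. Here is a minimal Python sketch, using an assumed non-orthogonal (but independent) basis:

```python
import numpy as np

# Illustrative basis vectors: independent but not mutually perpendicular.
e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([1.0, 1.0, 0.0])     # not perpendicular to e1
e3 = np.array([0.0, 0.0, 1.0])
E = np.column_stack([e1, e2, e3])  # basis vectors as columns

a = np.array([3.0, 2.0, 5.0])

# Solve a = a^1 e1 + a^2 e2 + a^3 e3 for the components a^i.
components = np.linalg.solve(E, a)

# Rebuild a from the linear combination to confirm.
a_rebuilt = components[0] * e1 + components[1] * e2 + components[2] * e3
```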
Dual Basis
We now define vector 'a' in terms of an alternative set of basis vectors: e^{1}, e^{2}, e^{3} ... which are the dual of the above basis.
This can be written as a vector equation:
a = a_{1} e^{1} + a_{2} e^{2} + a_{3} e^{3} ...
or in terms of linear equations:
a^{i} = a_{1} e^{1i} + a_{2} e^{2i} + a_{3} e^{3i}
a^{j} = a_{1} e^{1j} + a_{2} e^{2j} + a_{3} e^{3j}
a^{k} = a_{1} e^{1k} + a_{2} e^{2k} + a_{3} e^{3k}
The basis vectors and the dual basis vectors are related by:
e^{i} • e_{j} = δ^{i}_{j}
where
δ^{i}_{j} = Kronecker delta (1 when i = j, 0 otherwise, so it acts as the identity element)
We can calculate the dual basis from:
V = volume of parallelepiped = e_{1} • (e_{2} × e_{3})
so:
e^{1}= (1/V)(e_{2} × e_{3})
and so on for the other bases.
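The cross-product construction can be sketched in Python; the basis values are illustrative assumptions, and we check the defining relation e^{i} • e_{j} = δ^{i}_{j} at the end:

```python
import numpy as np

# An illustrative non-orthogonal basis.
e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([1.0, 1.0, 0.0])
e3 = np.array([0.0, 0.0, 1.0])

# V = e1 . (e2 x e3), the volume of the parallelepiped.
V = np.dot(e1, np.cross(e2, e3))

# Dual basis: e^1 = (1/V)(e2 x e3) and cyclic permutations.
d1 = np.cross(e2, e3) / V
d2 = np.cross(e3, e1) / V
d3 = np.cross(e1, e2) / V

# e^i . e_j should give the Kronecker delta (the identity matrix).
G = np.array([[np.dot(d, e) for e in (e1, e2, e3)] for d in (d1, d2, d3)])
```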
Example 1
If the basis vectors are:
e_{1} = (1, 0, 0)
e_{2} = (0, 1, 0)
e_{3} = (0, 0, 1)
then what are the dual basis vectors?
V = e_{1} • (e_{2} × e_{3}) = 1
using e^{1}= (1/V)(e_{2} × e_{3}) we get:
e^{1}= e_{2} × e_{3}= (1,0,0)
e^{2}= e_{3} × e_{1}= (0,1,0)
e^{3}= e_{1} × e_{2}= (0,0,1)
Example 2
If the basis vectors are (the standard basis rotated 45° about the z axis, with 0.7071 ≈ 1/√2):
e_{1} = (0.7071, 0.7071, 0)
e_{2} = (-0.7071, 0.7071, 0)
e_{3} = (0, 0, 1)
then what are the dual basis vectors?
V = e_{1} • (e_{2} × e_{3}) = 1
using e^{1} = (1/V)(e_{2} × e_{3}) we get:
e^{1} = e_{2} × e_{3} = (0.7071, 0.7071, 0)
e^{2} = e_{3} × e_{1} = (-0.7071, 0.7071, 0)
e^{3} = e_{1} × e_{2} = (0, 0, 1)
Since this basis is orthonormal, it is equal to its own dual.
Components of a Tensor
These methods define the tensor in terms of its components (the elements of the vector, matrix, etc.). It helps if we have a single algebraic expression which allows us to determine all the component values; it is best if this is expressed in terms of a dot product.
Transformations and Vectors
If x is a physical vector then it can be represented by the following expressions:
x = x^{i} e_{i} = x_{i} e^{i} = x'^{i} e'_{i} = x'_{i} e'^{i}
where:
 x^{i} = contravariant component of vector x
 e_{i} = covariant basis vector
 x_{i} = covariant component of vector x
 e^{i} = contravariant basis vector
 x'^{i} = contravariant component of vector x after being transformed
 e'_{i} = covariant basis vector after being transformed
 x'_{i} = covariant component of vector x after being transformed
 e'^{i} = contravariant basis vector after being transformed
 ' = this is usually written as a bar (a line over the top) but that's difficult to do on a web page.
We can invert these to give the two dual or reciprocal bases:
x • e_{i} = x_{i}
x • e^{i} = x^{i}
So e^{i} is a different basis.
When we are using the dual frame then:
x_{i} = g_{ij} x^{j}
x^{i} = g^{ij} x_{j}
or, written out as a matrix equation:
x_{1}     g_{11}  g_{12}  g_{13}   x^{1}
x_{2}  =  g_{21}  g_{22}  g_{23}   x^{2}
x_{3}     g_{31}  g_{32}  g_{33}   x^{3}
where
 g_{ij} = e_{i} • e_{j}
 g^{ij} = e^{i} • e^{j}
g is known as the Gram matrix; it converts a vector's components to their reciprocal (dual) form. If its determinant is nonzero then the basis vectors are linearly independent.
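Lowering and raising an index with the Gram matrix can be sketched numerically (the basis and components are illustrative assumptions):

```python
import numpy as np

# Illustrative non-orthogonal basis vectors.
e = [np.array([1.0, 0.0, 0.0]),
     np.array([1.0, 1.0, 0.0]),
     np.array([0.0, 0.0, 1.0])]

# Gram matrix g_ij = e_i . e_j.
g = np.array([[np.dot(ei, ej) for ej in e] for ei in e])

x_up = np.array([1.0, 2.0, 3.0])   # contravariant components x^j
x_down = g @ x_up                  # lower the index: x_i = g_ij x^j

# g^{ij} is the inverse Gram matrix; it raises the index back.
g_inv = np.linalg.inv(g)
x_back = g_inv @ x_down

# A nonzero determinant means the basis vectors are linearly independent.
det = np.linalg.det(g)
```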
Contravariant tensor
If the basis vectors are transformed according to the relation:
e_{i}= t^{j}_{i} e'_{j}
And the components x^{i} of a vector x are transformed according to the relation:
x^{i}= t'^{i}_{j} x'^{j}
Then index 'i' is contravariant, that is, the components transform in the opposite way to the bases.
The tangent vector to a differentiable curve is a contravariant vector.
If we take the example of a vector field:
T'^{i} = T^{r} ∂x'^{i} / ∂x^{r}
= ∑ T^{r} ∂x'^{i} / ∂x^{r}
= T^{1} ∂x'^{i} / ∂x^{1} + T^{2} ∂x'^{i} / ∂x^{2} + T^{3} ∂x'^{i} / ∂x^{3} + ...
Example take the mapping:
x' = x*cos(θ) - y*sin(θ)
y' = x*sin(θ) + y*cos(θ)
and put this in tensor notation:
x'^{0} = x^{0}*cos(θ) - x^{1}*sin(θ)
x'^{1} = x^{0}*sin(θ) + x^{1}*cos(θ)
so
T'^{i} = T^{r} ∂x'^{i} / ∂x^{r} gives:
T'^{0} = T^{0}*cos(θ) - T^{1}*sin(θ)
T'^{1} = T^{0}*sin(θ) + T^{1}*cos(θ)
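For this rotation the partial derivatives ∂x'^{i}/∂x^{r} are just the entries of the rotation matrix, so the contravariant components transform by that same matrix. A Python sketch with an assumed angle and vector:

```python
import numpy as np

theta = 0.3                        # illustrative angle
# Jacobian dx'^i/dx^r of the rotation is the rotation matrix itself.
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

T = np.array([2.0, 1.0])           # components T^r in the old frame

# T'^i = T^r dx'^i/dx^r (sum over the repeated index r).
T_prime = np.einsum('ir,r->i', R, T)
```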
Covariant tensor
If the basis vectors are transformed according to the relation:
e_{i}= t^{j}_{i} e'_{j}
And the components x_{i} of a vector x are transformed according to the relation:
x_{i}= t^{j}_{i} x'_{j}
Then index 'i' is covariant, that is, the components transform in the same way as the bases.
The gradient of a differentiable function is a covariant vector.
T'_{i} = T_{r} ∂x^{r} / ∂x'^{i}
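To see the covariant rule in action, we can check numerically that a gradient transformed this way keeps the scalar contraction T_{i} x^{i} unchanged (the rotation angle and component values are illustrative assumptions):

```python
import numpy as np

theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # x'^i = R^i_r x^r

grad = np.array([1.0, 4.0])        # covariant components T_r
# dx^r/dx'^i is the inverse Jacobian, so T'_i = T_r (R^-1)^r_i.
grad_prime = np.einsum('r,ri->i', grad, np.linalg.inv(R))

# The contraction T_i x^i is a scalar: the same in both frames.
x = np.array([2.0, 5.0])
x_prime = R @ x
s_old = grad @ x                   # 1*2 + 4*5 = 22
s_new = grad_prime @ x_prime
```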