1. ## Linear Algebra

Specifically Discrete Dynamical Systems.

My professor doesn't give us the answers to the homework, and neither does the textbook, so I've mostly been teaching myself how to do the problems and it's worked so far.
I've spent a good 3-4 hours trying to understand this today and I just do not get it.

Classify the origin as an attractor, repeller, or saddle point of the dynamical system x_k+1 = Ax_k. Find the directions of greatest attraction and/or repulsion.

A = [ v1 v2 ] v1 = [.3 -.3] v2 = [.4 1.1]

(how are matrices usually written on a computer? seems awkward to me)

Now, I know you find the eigenvalues, which the book says are the diagonal values of A in echelon form, but using that method on a problem that has an answer in the back does not give me the eigenvalues they say.

I'm going to stare at this problem for like an hour more before I go to bed, and hopefully someone can help out by the time I have class tomorrow.

edit: I feel like it's something stupidly easy that I'm just missing.

edit2: ok lol got it, just had to subtract λ down the diagonal and then take the determinant of A - λI. will just use this thread if I ever have trouble again.
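For anyone checking their work on this one: a quick numpy sanity check (numpy is just for verification here, not the by-hand method the class wants):

```python
import numpy as np

# The matrix from the problem: columns v1 = [.3, -.3] and v2 = [.4, 1.1]
A = np.array([[ 0.3, 0.4],
              [-0.3, 1.1]])

# The eigenvalues are the solutions of det(A - lambda*I) = 0
eigvals, eigvecs = np.linalg.eig(A)
print(sorted(eigvals.real))   # eigenvalues: 0.5 and 0.9

# Both |lambda| < 1, so the origin is an attractor.  The component along
# the eigenvector for the smallest |lambda| decays fastest, so that
# eigenvector gives the direction of greatest attraction.
```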

3. ## Re: Linear Algebra

Physics and math double major

The homework quiz ended up being on the one problem the book explained, which I conveniently forgot how to do after learning this type of problem.

Oh whale.

4. ## Re: Linear Algebra

So, test on a few chapters tomorrow. Going to kind of use this as a checklist for what I know and what the topics are.

Coordinate systems - good

[x]_B is the coordinate vector of x relative to the basis B
x = P_B[x]_B (and therefore) P_B^-1 x = [x]_B
p(t) = a0 + a1*t + a2*t^2 + ... + an*t^n corresponds to the coordinate vector [a0;...;an] in R^(n+1)
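The x = P_B[x]_B relationship in a tiny numpy sketch (the basis here is made up for illustration):

```python
import numpy as np

# Hypothetical basis B = {b1, b2} for R^2, invented for this example
b1, b2 = np.array([1.0, 0.0]), np.array([1.0, 1.0])
P_B = np.column_stack([b1, b2])   # change-of-coordinates matrix [b1 b2]

x = np.array([3.0, 2.0])
x_B = np.linalg.solve(P_B, x)     # [x]_B = P_B^-1 x
assert np.allclose(P_B @ x_B, x)  # x = P_B [x]_B
print(x_B)                        # [1. 2.], i.e. x = 1*b1 + 2*b2
```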

Dimension of a vector space - good

Find a basis; dim = number of vectors in it. If there's no basis (for Nul A: no free variables), dim = 0.
dim Col A + dim Nul A = n (n is number of columns)
Nul A found by: row reduce [A 0] to reduced echelon form, then write the pivot variables in terms of the free variables.

Rank - good

rank A + dim Nul A = n
rank A = dim Col A
For row equivalent matrices : Use echelon form to find rank A, basis for Col A uses the original matrix.
Basis for Row A uses echelon (NOT reduced). Just rows with pivots.
Basis Nul A found same as before.
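The rank A + dim Nul A = n theorem, checked on a small made-up matrix:

```python
import numpy as np

# Hypothetical 3x4 matrix: the third row is row 1 + row 2
A = np.array([[1., 2., 0., 1.],
              [0., 0., 1., 2.],
              [1., 2., 1., 3.]])
n = A.shape[1]

rank = np.linalg.matrix_rank(A)
print(rank)        # 2, since only two rows are independent
print(n - rank)    # dim Nul A = 2, so rank + nullity = n = 4
```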

Change of Basis - okay. straightforward but the language confuses me.

P = [ [d1]A [d2]A [d3]A ] then [x]A = P[x]D
/*if D = {d1,d2,d3} and F = {f1,f2,f3} and f1 = 2d1 - d2 + d3, f2 = 3d2 + d3, f3 = -3d1 + 2d3
P(D<-F) = [ [f1]_D [f2]_D [f3]_D ] = [2 0 -3; -1 3 0; 1 1 2]
Finding [x]_D for x = f1 - 2f2 + 2f3 is simply P(D<-F) times [1;-2;2], because the coefficients of x form [x]_F.*/
[c1 c2 | b1 b2] ~ [I | P(C<-B)]
P(B<-C) = P(C<-B)^-1
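Row reducing [c1 c2 | b1 b2] to [I | P(C<-B)] is the same as computing C^-1 B, which numpy can check (the bases here are invented for illustration):

```python
import numpy as np

# Hypothetical bases B and C for R^2, made up for this example
c1, c2 = np.array([1., 0.]), np.array([1., 2.])
b1, b2 = np.array([3., 2.]), np.array([1., 4.])
C = np.column_stack([c1, c2])
B = np.column_stack([b1, b2])

# [C | B] ~ [I | P(C<-B)] is equivalent to solving C * P = B
P_CB = np.linalg.solve(C, B)   # P(C<-B)
P_BC = np.linalg.inv(P_CB)     # P(B<-C) = P(C<-B)^-1

# Converting coordinates both ways describes the same vector x
x_B = np.array([1., 1.])
x_C = P_CB @ x_B
assert np.allclose(C @ x_C, B @ x_B)
```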

Applications to Markov chains - meh fractions suck

state vectors
word problems
x_k+1 = Px_k
Steady-State vector : (P - I)x = 0
Probability vector: divide steady state by sum of its entries.
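The steady-state recipe above on a made-up 2-state chain (columns sum to 1):

```python
import numpy as np

# Hypothetical transition matrix, invented for illustration
P = np.array([[0.9, 0.2],
              [0.1, 0.8]])

# Steady state: (P - I)x = 0, i.e. the eigenvector for lambda = 1
eigvals, eigvecs = np.linalg.eig(P)
x = eigvecs[:, np.argmin(np.abs(eigvals - 1))].real

# Probability vector: divide the steady state by the sum of its entries
q = x / x.sum()
print(q)                      # [2/3, 1/3]
assert np.allclose(P @ q, q)  # q really is fixed by P
```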

Eigenvectors and Eigenvalues - sehr gut

(A - λI)x = 0
if A - λI has free variables (columns linearly dependent, so nontrivial solutions exist) then λ is an eigenvalue. For the corresponding eigenvector, solve the system and take a vector from the solution set, you know what I mean. Same stuff you've been doing.
Basically a bunch of preliminary stuff for sections coming up

Characteristic equation - good

det(A-λI) = 0 is the characteristic equation for A; its solutions are all possible values of λ. Expand the determinant to get the characteristic polynomial.
Let A be an n x n matrix. A is invertible iff:
a. The number 0 is not an eigenvalue of A
b. the determinant of A is not zero.

detAB = detA*detB
detA^T=detA
if A is triangular, the determinant is the product of the diagonal entries
row replacement doesn't change det A. row switching flips the sign: multiply by (-1)^r, where r is the number of row switches.
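All four determinant facts can be spot-checked numerically on random matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# det(AB) = det(A) * det(B)
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))

# det(A^T) = det(A)
assert np.isclose(np.linalg.det(A.T), np.linalg.det(A))

# Triangular matrix: determinant is the product of the diagonal entries
T = np.triu(A)
assert np.isclose(np.linalg.det(T), np.prod(np.diag(T)))

# Swapping two rows multiplies the determinant by -1
A_swapped = A[[1, 0, 2], :]
assert np.isclose(np.linalg.det(A_swapped), -np.linalg.det(A))
```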

Diagonalization - good

If A = PDP^-1 then A^k = PD^kP^-1
Diagonalizing an n x n matrix A:
1. Find eigenvalues
2. Find n linearly independent eigenvectors of A
3. Construct P from vectors in step 2
4. Construct D from the corresponding eigenvalues
(check: AP = PD)
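The four steps (and the AP = PD check) on a small made-up matrix:

```python
import numpy as np

# Hypothetical matrix with distinct eigenvalues, so it's diagonalizable
A = np.array([[4., 1.],
              [0., 2.]])

# Steps 1-2: eigenvalues and eigenvectors; step 3: P has the
# eigenvectors as columns; step 4: D has the eigenvalues on the diagonal
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)

assert np.allclose(A @ P, P @ D)   # check: AP = PD

# A^k = P D^k P^-1 (D^k just raises each diagonal entry to the k)
A5 = np.linalg.matrix_power(A, 5)
assert np.allclose(A5, P @ np.diag(eigvals**5) @ np.linalg.inv(P))
```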

Eigenvectors and Linear Transformations - wat

mapping sounds confusing but is easy
D = {d1, d2} and B = {b1, b2} are bases for vector spaces V and W
let T : V -> W be a linear transformation with T(d1) = 3b1 - 3b2, T(d2) = -2b1 + 5b2
Matrix for T relative to D and B has [T(d1)]_B and [T(d2)]_B as its columns: [3 -2; -3 5]
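The key rule is that [T(d1)]_B and [T(d2)]_B become the columns; a small numpy check (x here is an invented test vector):

```python
import numpy as np

# Columns are the B-coordinate vectors of T(d1) and T(d2):
# T(d1) = 3 b1 - 3 b2, T(d2) = -2 b1 + 5 b2
M = np.column_stack([[3., -3.], [-2., 5.]])

# For x = 1*d1 + 2*d2, [T(x)]_B = M [x]_D
x_D = np.array([1., 2.])
print(M @ x_D)     # [-1.  7.], i.e. T(x) = -1 b1 + 7 b2
```

Same check by hand: T(d1 + 2 d2) = (3b1 - 3b2) + 2(-2b1 + 5b2) = -b1 + 7b2.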

Discrete Dynamical Systems - semi good

for |λ| < 1, as k -> ∞, λ^k -> 0
origin is an attractor when all |λ| < 1
origin is a repeller when all |λ| > 1
origin is a saddle point when some |λ| < 1 and some |λ| > 1
probably gonna yolo questions on this
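The three cases make a one-screen checker (a sketch: it ignores the borderline |λ| = 1 case, and `classify_origin` is just a name I made up):

```python
import numpy as np

def classify_origin(A):
    """Classify the origin for x_{k+1} = A x_k by comparing |lambda| to 1."""
    mags = np.abs(np.linalg.eigvals(A))
    if np.all(mags < 1):
        return "attractor"
    if np.all(mags > 1):
        return "repeller"
    return "saddle point"

# The matrix from the first post (eigenvalues 0.5 and 0.9)
print(classify_origin(np.array([[0.3, 0.4], [-0.3, 1.1]])))  # attractor
# A made-up example with eigenvalues 2 and 0.5
print(classify_origin(np.array([[2.0, 0.0], [0.0, 0.5]])))   # saddle point
```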

Inner product, Length, and Orthogonality - good

basically calc III with matrices
dot product : multiply and add. gives scalar
unit vector in direction of given vector : divide everything by length
Distance between two vectors : subtract them and find distance of that vector
Orthogonal vectors : dot product = 0
two vectors are orthogonal iff : ||u + v||^2 = ||u||^2 + ||v||^2
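All of the above in one numpy snippet (u and v are invented examples):

```python
import numpy as np

u = np.array([3., 4.])
v = np.array([-4., 3.])

print(u @ v)                   # 0.0 -> u and v are orthogonal
print(np.linalg.norm(u))       # length of u: 5.0
print(u / np.linalg.norm(u))   # unit vector in the direction of u
print(np.linalg.norm(u - v))   # distance between u and v

# Pythagorean check: orthogonal iff ||u + v||^2 = ||u||^2 + ||v||^2
assert np.isclose(np.linalg.norm(u + v)**2,
                  np.linalg.norm(u)**2 + np.linalg.norm(v)**2)
```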

ok I think that's everything I guess if anyone feels like reading through all of it and wants to comment on something wrong or just whatever then go ahead but this is mostly for me and yes this is a run-on sentence dealwithit

5. ## Re: Linear Algebra

Dunno if it helps to point out why these conditions are equivalent - since the eigenvalues are the solutions of det(A - λI) = 0, if 0 is an eigenvalue then det(A - 0·I) = det A = 0. And invertibility has A·A^-1 = I, so det(A) · det(A^-1) = 1; that equation is impossible when det A = 0, since det(A^-1) would have to be 1/0.

I just find it easier to remember stuff based on quick explanations of where it comes from than by rote. If I remember a proof wrong then a step becomes illogical; if I remember a sentence wrong it's hard to tell.
