CSE1205 Linear Algebra -- Exam Pattern Analysis
Based on 11 past exams (2019--2024): 3 Midterms, 4 Endterms/Finals, 4 Resits
MID = Midterm
END = Endterm
RES = Resit
1. Solving / Analyzing Linear Systems (with parameter) Very High
MID 2020 Q7 • 2023 Q1 • 2024 Q1 |
END 2019 Q1 |
RES 2019 Q1 • 2022 Q2 • 2023 Q1 • 2024 Q1
Given a linear system (often as an augmented matrix) with a parameter h or α, determine for which values the system is consistent, inconsistent, has a unique solution, infinitely many solutions, or a specific number of free variables.
Methods to Solve
- Row reduce the augmented matrix [A | b] to echelon form, keeping the parameter symbolic
- Look for contradictory rows (e.g., [0 0 ... 0 | k] with k ≠ 0) to find inconsistency conditions
- Count pivot positions vs. free variables: #free vars = #columns - #pivots
- For "exactly 1 free variable": need exactly (n-1) pivots and no contradiction in the augmented column
- A system is consistent with infinitely many solutions iff it has at least one free variable and no contradictions
Example (Midterm 2024 Q1): For which value(s) of h is x1a1 + x2a2 + x3a3 = b inconsistent? Row reduce and find h = -5 or h = -3.
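The consistency check above can be sketched numerically: a system is consistent iff rank(A) = rank([A | b]). The matrix and parameter values below are hypothetical, not the exam's.

```python
import numpy as np

# Hypothetical rank-1 coefficient matrix: row 2 = 2 * row 1
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

def consistent(h):
    """Ax = b is consistent iff appending b does not raise the rank."""
    b = np.array([[3.0], [h]])
    return np.linalg.matrix_rank(A) == np.linalg.matrix_rank(np.hstack([A, b]))

print(consistent(6.0))  # True: b = [3; 6] respects the row dependence
print(consistent(5.0))  # False: row reduction yields [0 0 | k] with k != 0
```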
2. Matrix Equation Solving (Solve for X) Very High
MID 2020 Q4 • 2023 Q5 • 2024 Q5 |
END 2019 Q16 • 2022 Q2 • 2023 Q4 • 2024 Q11 |
RES 2022 Q3 • 2023 Q2 • 2024 Q4
Given an equation involving invertible matrices A, B, C, X with inverses and transposes, isolate X. Typically multiple-choice with tricky orderings.
Methods to Solve
- Key rules: (AB)^-1 = B^-1 A^-1, (A^T)^-1 = (A^-1)^T, (AB)^T = B^T A^T
- Isolate X by multiplying both sides on the correct side (left or right) by the appropriate inverse
- When transposes are involved: take the transpose of both sides if needed, remembering (AB)T = BTAT
- Order matters! Matrix multiplication is not commutative -- always multiply on the same side
- Simplify fully: no expressions like (AB)^-1 in the final answer
Example (Midterm 2020 Q4): A^T B^-1 (C^T + X)^T = C. Answer: X = (B^T A^-1 - I)C^T.
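These isolation rules are easy to sanity-check numerically. The sketch below uses hypothetical invertible matrices (not the exam's) to verify the reversed-order inverse rule and to solve AXB = C for X.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical matrices; adding 3I makes them almost surely invertible
A = rng.standard_normal((3, 3)) + 3 * np.eye(3)
B = rng.standard_normal((3, 3)) + 3 * np.eye(3)
C = rng.standard_normal((3, 3))
inv = np.linalg.inv

# (AB)^-1 = B^-1 A^-1 -- note the reversed order
print(np.allclose(inv(A @ B), inv(B) @ inv(A)))

# Isolate X in AXB = C: multiply by A^-1 on the left and B^-1 on the right
X = inv(A) @ C @ inv(B)
print(np.allclose(A @ X @ B, C))
```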
3. Determinant Computation Very High
MID 2020 Q3 • 2023 Q10 • 2024 Q10 |
END 2019 Q14-15 |
RES 2019 Q6 • 2022 Q7
Compute the determinant of a 3×3, 4×4, or 5×5 matrix. Often combined with det properties (column swaps, scalar multiplication).
Methods to Solve
- Cofactor expansion: expand along a row/column with the most zeros
- Row reduction to upper triangular, then det = product of diagonal entries (track sign changes from row swaps)
- For triangular/block-triangular matrices: det = product of diagonal entries
- Properties: det(cA) = c^n det(A) for an n×n matrix; det(AB) = det(A)det(B); a row swap flips the sign; adding a multiple of one row to another doesn't change det
- Column swap: changes sign of determinant
Example (Midterm 2024 Q6): If det[a b c; d e f; g h i] = x, find det(-3B) where B has columns permuted. Answer: (-3)^3 · det(B) = -27x.
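A quick numerical check of the scaling and column-swap rules, on a hypothetical 3×3 matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [3.0, 1.0, 4.0],
              [0.0, 2.0, 5.0]])   # hypothetical matrix
d = np.linalg.det(A)

# det(cA) = c^n det(A) for an n×n matrix (here n = 3, c = -3)
print(np.isclose(np.linalg.det(-3 * A), (-3) ** 3 * d))

# Swapping two columns flips the sign of the determinant
print(np.isclose(np.linalg.det(A[:, [1, 0, 2]]), -d))
```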
4. Matrix Inverse Computation High
MID 2020 Q5 • 2023 Q6 |
END 2019 Q12 • 2022 Q3 • 2023 Q4 |
RES 2022 Q5
Compute A^-1 for a 3×3 matrix, or solve AX = B (via A^-1 or by row reducing [A | B] directly). Sometimes asks for the sum of entries or a specific entry.
Methods to Solve
- Augmented matrix method: row reduce [A | I] to [I | A^-1]
- For 2×2: A^-1 = (1/det A) · [d -b; -c a]
- Check det(A) ≠ 0 first; if det(A) = 0, the inverse does not exist (write DNE)
- For AX = B: compute X = A^-1 B, or row reduce [A | B] directly
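A numerical version of the same workflow on a hypothetical 3×3 matrix: np.linalg.inv plays the role of the [A | I] reduction, and np.linalg.solve plays the role of row reducing [A | B] directly.

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])   # hypothetical; det(A) = 4, so invertible
B = np.array([[1.0], [0.0], [1.0]])

Ainv = np.linalg.inv(A)
print(np.allclose(A @ Ainv, np.eye(3)))   # sanity check: A A^-1 = I

X = np.linalg.solve(A, B)                 # solves AX = B without forming A^-1
print(np.allclose(X, Ainv @ B))
```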
5. Linear Transformations -- Standard Matrix Very High
MID 2020 Q6,Q8 • 2023 Q2,Q3 • 2024 Q2,Q4 |
END 2019 Q5 • 2022 Q1 • 2023 Q1 |
RES 2024 Q2
Find the standard matrix of a composed transformation: typically rotation (counter/clockwise) followed by reflection (across x-axis, y=-x, etc.), or shear. Also: given T on specific vectors, find T on another vector.
Methods to Solve
- Rotation matrix (counter-clockwise by θ): [cosθ -sinθ; sinθ cosθ]
- Clockwise rotation by θ: use -θ in the formula
- Reflection across x-axis: [1 0; 0 -1]; y-axis: [-1 0; 0 1]; y=x: [0 1; 1 0]; y=-x (i.e., x+y=0): [0 -1; -1 0]
- Composition: If S comes after T, then standard matrix = [S][T] (right to left)
- For "find T(v) given T on other vectors": express v as a linear combination of the known inputs, then use linearity
Example (Midterm 2024 Q4): Reflect through y=-x then rotate clockwise 120°. Compute [Rotation]·[Reflection].
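The 2024 example can be checked numerically; the composite matrix is [Rotation]·[Reflection] because the reflection is applied first.

```python
import numpy as np

theta = -2 * np.pi / 3                       # clockwise 120 degrees -> negative angle
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
F = np.array([[ 0.0, -1.0],
              [-1.0,  0.0]])                 # reflection across y = -x

M = R @ F                                    # reflection first, then rotation
# By hand: e1 -> [0, -1] under F; rotating that clockwise 120° gives [-sqrt(3)/2, 1/2]
print(np.allclose(M[:, 0], [-np.sqrt(3) / 2, 0.5]))
```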
6. Null Space Dimension / Rank-Nullity Theorem Very High
MID 2020 Q10 • 2023 Q9 • 2024 Q9 |
END 2019 Q7 |
RES 2019 Q2,Q4 • 2022 Q4
Given matrix dimensions and/or rank, find dim(Nul A). Or: for which parameter h does dim(Nul A) equal a specific value?
Methods to Solve
- Rank-Nullity Theorem: rank(A) + dim(Nul A) = n (number of columns)
- rank(A) = number of pivot columns after row reduction
- dim(Nul A) = number of free variables = n - rank(A)
- For parametric versions: row reduce with parameter, find when a pivot disappears
Example (Midterm 2020 Q10): A is 9×11 with rank 6. dim(Nul A) = 11 - 6 = 5.
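The theorem can be checked with numpy's matrix_rank on a hypothetical matrix:

```python
import numpy as np

# Hypothetical 3x4 matrix; row 2 is twice row 1, so the rank drops to 2
A = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 4.0, 6.0, 8.0],
              [0.0, 1.0, 1.0, 0.0]])

rank = np.linalg.matrix_rank(A)
dim_nul = A.shape[1] - rank          # Rank-Nullity: dim Nul A = n - rank(A)
print(rank, dim_nul)                 # 2 2
```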
7. Column Space Basis Identification High
MID 2020 Q12 |
END 2022 Q5 • 2024 Q3 |
RES 2019 Q3 • 2023 Q9
Given A row-equivalent to an echelon form U, determine which sets of original columns {ai} form a basis for Col(A).
Methods to Solve
- Pivot columns of A (columns corresponding to pivot positions in U) always form a basis for Col(A)
- Any set of dim(Col A) linearly independent vectors from Col(A) is also a basis
- Non-pivot columns are linear combinations of pivot columns, but can sometimes replace pivot columns in a basis
- Standard basis vectors ei are NOT guaranteed to be in Col(A) -- they form a basis for Col(A) only if A has full row rank
- To check: a set of rank(A) original columns is a basis iff the corresponding columns of U are linearly independent (row operations preserve dependence relations among columns)
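One numerical way to locate pivot columns (a sketch, not the hand method of reading them off U): greedily keep each column that raises the rank of the set kept so far. The matrix is hypothetical.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0, 1.0],
              [0.0, 0.0, 1.0, 1.0],
              [1.0, 2.0, 4.0, 2.0]])   # hypothetical; column 1 = 2 * column 0

pivot_cols = []
for j in range(A.shape[1]):
    trial = pivot_cols + [j]
    if np.linalg.matrix_rank(A[:, trial]) == len(trial):
        pivot_cols.append(j)            # column j is independent of the earlier picks

print(pivot_cols)   # these original columns form a basis for Col(A)
```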
8. Coordinate Vectors w.r.t. a Basis High
MID 2020 Q13 • 2023 Q8 • 2024 Q8 |
END 2022 Q6 |
RES 2019 Q9,Q10 • 2022 Q6 • 2023 Q5 • 2024 Q9
Given a basis B = {b1, b2, ...} and a vector w, find [w]B. Sometimes asks whether w lies in Span(B) at all.
Methods to Solve
- Set up the equation c1b1 + c2b2 + ... = w
- Row reduce the augmented matrix [b1 b2 ... | w]
- If consistent: the solution gives the coordinate vector [w]B
- If inconsistent: w is not in Span(B), write DNE
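Numerically, the same steps are a solve against the matrix [b1 b2], plus a residual check for the DNE case. Basis and vector here are hypothetical.

```python
import numpy as np

b1 = np.array([1.0, 0.0, 1.0])
b2 = np.array([0.0, 1.0, 1.0])
w  = np.array([2.0, 3.0, 5.0])          # hypothetical: w = 2*b1 + 3*b2

M = np.column_stack([b1, b2])
coords, *_ = np.linalg.lstsq(M, w, rcond=None)
in_span = np.allclose(M @ coords, w)    # the system is consistent iff this holds
print(in_span, np.round(coords, 6))     # True [2. 3.]
```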
9. Linear Independence / Dependence (with parameter) Medium
MID 2020 Q9 • 2024 Q3 |
END 2019 Q2 |
RES 2023 Q11
For which value(s) of α is a given set of vectors linearly dependent (or: has span dimension less than k)?
Methods to Solve
- Form the matrix with the vectors as columns, row reduce
- Vectors are linearly dependent iff det = 0 (for square matrices) or iff there's a free variable
- For parametric: row reduce, find the parameter value that creates a zero row / eliminates a pivot
- dim(Span) < k iff the vectors are linearly dependent
10. Eigenvalues and Eigenvectors Very High
MID 2020 Q11 |
END 2019 Q11,Q14-15 • 2022 Q7-8 • 2023 Q5-7 • 2024 Q4,Q7 |
RES 2019 Q11,Q13 • 2022 Q8 • 2023 Q4 • 2024 Q4,Q6
Find eigenvalues (including complex), eigenvectors, algebraic/geometric multiplicities. Given constraints on eigenvalues, determine properties of det(A) or invertibility.
Methods to Solve
- Characteristic polynomial: det(A - λI) = 0
- Eigenspace: Eλ = Nul(A - λI); find by row reducing (A - λI)
- Algebraic multiplicity: multiplicity of λ as root of the characteristic polynomial
- Geometric multiplicity: dim(Eλ) = dim Nul(A - λI)
- For triangular matrices: eigenvalues are the diagonal entries
- det(A) = product of eigenvalues; trace(A) = sum of eigenvalues
- If Av = λBv with A, B invertible: multiply on the left by A^-1 to get A^-1Bv = (1/λ)v, so 1/λ is an eigenvalue of A^-1B
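np.linalg.eig gives a quick check of the det/trace facts; the 2×2 matrix below is hypothetical.

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [1.0, 5.0]])            # eigenvalues 3 and 6
vals, vecs = np.linalg.eig(A)

print(sorted(np.round(vals.real, 6)))                 # [3.0, 6.0]
print(np.isclose(np.prod(vals), np.linalg.det(A)))    # det(A) = product of eigenvalues
print(np.isclose(np.sum(vals), np.trace(A)))          # trace(A) = sum of eigenvalues

for lam, v in zip(vals, vecs.T):                      # each pair satisfies Av = λv
    assert np.allclose(A @ v, lam * v)
```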
11. Diagonalizability (with parameter) High
END 2022 Q7 • 2023 Q7 • 2024 Q4 |
RES 2019 Q12-13 • 2022 Q17
For which value of a parameter is a matrix diagonalizable? Check if geometric multiplicity equals algebraic multiplicity for each eigenvalue.
Methods to Solve
- A is diagonalizable iff for every eigenvalue, geometric multiplicity = algebraic multiplicity
- If all eigenvalues are distinct ⇒ automatically diagonalizable
- For repeated eigenvalue λ with a.m. = k: check if dim Nul(A - λI) = k
- Row reduce (A - λI) with the parameter to find when g.m. = a.m.
- Symmetric matrices are always (orthogonally) diagonalizable
Example (Endterm 2022 Q7): A has eigenvalue 5 with a.m.=2. A is diagonalizable iff g.m.(5) = 2, which requires α = 1.
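A numerical sketch of the multiplicity check (the function name and matrices are mine, not the course's): algebraic multiplicity by counting eigenvalues, geometric multiplicity via the rank of A - λI.

```python
import numpy as np

def is_diagonalizable(A, tol=1e-8):
    """Check g.m. == a.m. for every eigenvalue (numerical sketch)."""
    vals = np.linalg.eigvals(A)
    n = A.shape[0]
    for lam in np.unique(np.round(vals, 6)):
        am = int(np.sum(np.isclose(vals, lam, atol=1e-6)))            # algebraic mult.
        gm = n - np.linalg.matrix_rank(A - lam * np.eye(n), tol=tol)  # dim Nul(A - λI)
        if am != gm:
            return False
    return True

print(is_diagonalizable(np.array([[5.0, 0.0], [0.0, 5.0]])))  # True: g.m. = a.m. = 2
print(is_diagonalizable(np.array([[5.0, 1.0], [0.0, 5.0]])))  # False: g.m. = 1 < a.m. = 2
```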
12. Discrete Dynamical Systems High
END 2022 Q8 • 2023 Q8 • 2024 Q2 |
RES 2023 Q7 • 2024 Q8
Given x_{k+1} = A x_k, find the general solution, a specific component formula, or classify the origin (attractor / repeller / saddle).
Methods to Solve
- Diagonalize A = PDP^-1, then x_k = PD^kP^-1 x_0
- General solution: x_k = c1·λ1^k·v1 + c2·λ2^k·v2
- Use initial condition x0 to solve for c1, c2
- Origin classification:
- Attractor: all |λi| < 1
- Repeller: all |λi| > 1
- Saddle: mixed (one <1, one >1 in absolute value)
- If one eigenvalue is 1: origin is neither attractor nor repeller
Example (Endterm 2024 Q2): A = [4 2; 1 5], eigenvectors [2;-1] and [1;1] with eigenvalues 3, 6. General solution: c1·3^k·[2;-1] + c2·6^k·[1;1]. With x0 = [1;4]: c1 = -1, c2 = 3, so y_n = 3·6^n + 3^n.
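The closed form in the example can be verified against repeated multiplication by A:

```python
import numpy as np

# Endterm 2024 Q2: x_{k+1} = A x_k with x_0 = [1; 4]
A = np.array([[4.0, 2.0],
              [1.0, 5.0]])
x = np.array([1.0, 4.0])

for n in range(6):
    # second component should follow the closed form y_n = 3*6^n + 3^n
    assert np.isclose(x[1], 3 * 6**n + 3**n)
    x = A @ x
print("y_n = 3*6^n + 3^n matches repeated multiplication for n = 0..5")
```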
13. Least-Squares Solutions / Best Fit Line Very High
END 2019 Q21 • 2022 Q13-14 • 2023 Q10 • 2024 Q8 |
RES 2019 Q19 • 2022 Q13 • 2023 Q10 • 2024 Q5
Find the least-squares solution of an inconsistent Ax = b, or the best-fit line y = a + bx through given data points.
Methods to Solve
- Normal equations: A^T A x̂ = A^T b
- Compute A^T A and A^T b, then solve the resulting system
- If A^T A is singular: the LS solution has free variables (parametric)
- Best-fit line: set up A = [1 x1; 1 x2; ...], b = [y1; y2; ...], solve the normal equations for [a; b]
- LS error: ||b - A x̂|| = ||b - b̂||
Example (Endterm 2024 Q8): Points (1,10),(-1,-25),(3,10),(2,10). Set up design matrix, solve normal equations to get y = -10 + 9x.
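The 2024 example checks out with the normal equations in numpy:

```python
import numpy as np

# Endterm 2024 Q8 data points
xs = np.array([1.0, -1.0, 3.0, 2.0])
ys = np.array([10.0, -25.0, 10.0, 10.0])

A = np.column_stack([np.ones_like(xs), xs])   # design matrix rows [1, x_i]
coef = np.linalg.solve(A.T @ A, A.T @ ys)     # normal equations A^T A [a; b] = A^T y
print(np.round(coef, 6))                      # [-10.   9.] -> y = -10 + 9x
```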
14. Orthogonal Projection onto a Subspace Very High
END 2019 Q13 • 2022 Q10-11 • 2023 Q9 • 2024 Q5 |
RES 2019 Q17 • 2022 Q12,Q14 • 2023 Q8 • 2024 Q10
Find proj_W(y) where W = Span{b1, b2}, or find the projection matrix, or compute the distance from y to W.
Methods to Solve
- If {b1, b2} are orthogonal: proj_W(y) = ((y·b1)/(b1·b1))b1 + ((y·b2)/(b2·b2))b2
- If not orthogonal: first apply Gram-Schmidt, or use the projection matrix P = A(A^T A)^-1 A^T
- Projection matrix: P = UU^T if U has orthonormal columns spanning W
- Distance from y to W: ||y - proj_W(y)||
- The standard matrix of the projection satisfies P^2 = P and P^T = P
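A numerical check of the projection-matrix properties, using hypothetical (non-orthogonal) spanning vectors:

```python
import numpy as np

A = np.column_stack([[1.0, 1.0, 0.0],
                     [1.0, 0.0, 1.0]])        # hypothetical basis of W, not orthogonal
P = A @ np.linalg.inv(A.T @ A) @ A.T          # P = A (A^T A)^-1 A^T

print(np.allclose(P @ P, P), np.allclose(P.T, P))   # P^2 = P and P^T = P

y = np.array([1.0, 2.0, 3.0])
resid = y - P @ y
print(np.allclose(A.T @ resid, 0.0))                 # y - proj is orthogonal to W
print(np.isclose(np.linalg.norm(resid), 4 / np.sqrt(3)))   # distance from y to W
```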
15. Gram-Schmidt Process Medium
END 2022 Q12 • 2024 Q5-6 |
RES 2019 Q18
Apply Gram-Schmidt to a set of vectors to produce an orthogonal basis. Often asks for the third vector v3 (up to rescaling).
Methods to Solve
- v1 = b1
- v2 = b2 - ((b2·v1)/(v1·v1))v1
- v3 = b3 - ((b3·v1)/(v1·v1))v1 - ((b3·v2)/(v2·v2))v2
- Verify: each vi should be orthogonal to all previous vj (dot product = 0)
- "Up to rescaling" means any scalar multiple of your answer is also correct
16. Orthogonal Complement (W⊥) Medium
MID 2023 Q7 |
END 2024 Q9 |
RES 2022 Q15 • 2023 Q9 • 2024 Q7
Find a basis for W⊥, or find a vector orthogonal to Col(A).
Methods to Solve
- W⊥ = Nul(A^T) where the columns of A span W (i.e., Col(A) = W)
- If W is given by spanning vectors: form the matrix with those vectors as rows, then find its null space
- If W is given by a homogeneous equation (e.g., 3x2 + 2x3 + 3x4 = 0): the coefficient vector (0, 3, 2, 3) spans W⊥, since W is the null space of that coefficient row
- Vectors orthogonal to Col(A) lie in Nul(A^T), i.e., the left null space
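Numerically, Nul(A^T) can be read off the SVD (the helper below is my own sketch; W is hypothetical):

```python
import numpy as np

def null_space_basis(M, tol=1e-10):
    """Orthonormal basis for Nul(M), taken from the SVD of M."""
    _, s, vt = np.linalg.svd(M)
    rank = int(np.sum(s > tol))
    return vt[rank:].T                # remaining right-singular vectors span Nul(M)

A = np.column_stack([[1.0, 0.0, 1.0],
                     [0.0, 1.0, 1.0]])   # columns span W
W_perp = null_space_basis(A.T)           # W⊥ = Nul(A^T)

print(W_perp.shape)                      # (3, 1): one basis vector in R^3
print(np.allclose(A.T @ W_perp, 0.0))    # orthogonal to every spanning vector of W
```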
17. Subspace Identification Medium
END 2019 Q6 • 2023 Q2-3
Determine whether a given subset of Rn is a subspace. Find intersection of subspaces.
Methods to Solve
- Check the three conditions: (1) contains zero vector, (2) closed under addition, (3) closed under scalar multiplication
- Common non-subspaces: sets not containing 0, affine subsets (e.g., a plane not through the origin), the set of all unit vectors (not closed under scaling)
- Null spaces and column spaces of matrices are always subspaces
- Intersection of subspaces: set up the system where both spanning representations are equal, solve
18. Determinant Properties (row ops, scalar mult, column swaps) Medium
MID 2024 Q6 |
END 2023 Q5 |
RES 2023 Q3
Given det(A) = x, compute det of a modified matrix (columns swapped, rows scaled, etc.).
Methods to Solve
- Row swap: multiplies det by -1
- Scaling a row by c: multiplies det by c
- det(cA) = c^n det(A) for an n×n matrix
- Column permutation: each transposition (swap of two columns) flips the sign
- A cyclic permutation of 3 columns = 2 transpositions ⇒ no sign change
- det(AB) = det(A)det(B)
19. Matrix Powers via Diagonalization Medium
END 2019 Q9 |
RES 2019 Q15 • 2022 Q9-10 • 2024 Q3
Compute A^k or simplify A^100 using diagonalization or a recursive identity like A^5 = -3A^2.
Methods to Solve
- Diagonalization: A = PDP^-1 ⇒ A^k = PD^kP^-1
- D^k is diagonal with entries λi^k
- Recursive identity: if A^5 = -3A^2 and A is invertible, then A^3 = -3I; apply it repeatedly to reduce the exponent
- A^100 = (A^3)^33 · A = (-3)^33 A, etc. — careful with the algebra
- For rotation by angle θ: A^k = I when kθ = 2πm for some integer m; the smallest such k is 2π/θ when that is a whole number
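The A^k = PD^kP^-1 identity, checked against numpy's matrix_power on a hypothetical diagonalizable matrix:

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [1.0, 5.0]])                    # hypothetical; distinct eigenvalues 3, 6
vals, P = np.linalg.eig(A)

k = 10
Ak = P @ np.diag(vals**k) @ np.linalg.inv(P)  # A^k = P D^k P^-1
print(np.allclose(Ak, np.linalg.matrix_power(A, k)))
```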
20. Complex Eigenvalues and A = PCP^-1 Decomposition Medium
END 2019 Q11 • 2022 Q9 • 2024 Q10 |
RES 2019 Q14 • 2022 Q11 • 2023 Q6
Given a 2×2 matrix with complex eigenvalues a ± bi, express A = PCP^-1 where C = [a -b; b a]. Or: identify the rotation-scaling geometric interpretation.
Methods to Solve
- Find eigenvalues a ± bi from the characteristic polynomial
- Find an eigenvector v for eigenvalue a - bi (convention varies)
- P = [Re(v) | Im(v)] and C = [a -b; b a]
- Geometric interpretation: rotation by the angle φ with cos φ = a/r and sin φ = b/r (φ = arctan(b/a) only when a > 0), scaled by r = √(a² + b²)
- Matrix is "similar to" C means they share eigenvalues; (a,b) are uniquely determined (with b > 0)
21. Invertibility / det(A) from Matrix Equations High
MID 2020 Q1,Q11 • 2023 Q11 |
END 2019 Q4 • 2022 Q1 • 2024 Q11 |
RES 2019 Q5 • 2024 Q12
Given a matrix equation (e.g., A^4(A^-1)^T = -2A^T or A^3 = -5(A^T)^-1), find det(A). Also: for which α is A singular?
Methods to Solve
- Take determinants of both sides: det(AB) = det(A)det(B), det(A^T) = det(A), det(A^-1) = 1/det(A)
- det(cA) = c^n det(A) for an n×n matrix
- Solve the resulting polynomial equation for det(A)
- For singularity: compute det(A) in terms of α and set = 0
- If A and B are square and AB is invertible, then both A and B are invertible (det(A)det(B) = det(AB) ≠ 0)
Example (Endterm 2024 Q11): A^3 = -5(A^T)^-1 with A 4×4. Taking det and writing a = det(A): a^3 = (-5)^4/a, so a^4 = 625, giving det(A) = ±5.
22. Range / Image of a Linear Transformation (with parameter) Medium
MID 2023 Q3 • 2024 Q2 |
END 2024 Q1
For which value(s) of h does a vector b lie in the range (column space) of a transformation T?
Methods to Solve
- b is in the range of T iff the system Ax = b is consistent (A is the standard matrix of T)
- Row reduce [A | b] and check for contradictions
- If parameterized: find values of h that avoid contradictory rows
23. LU Decomposition Low
RES 2019 Q7-8
Find the LU decomposition A = LU. Identify columns of L and U.
Methods to Solve
- Row reduce A to echelon form U using only replacement operations (no row swaps)
- L is lower triangular with 1s on diagonal; entries below diagonal are the negatives of the multipliers used
- If row swaps are needed: use PA = LU (permuted LU)
24. Academic Reasoning (True/False Proofs) Very High
END 2019 Q22-25 • 2022 Q16-17 • 2023 Q11-12 • 2024 Q12-13 |
RES 2019 Q20-22 • 2022 Q16-17 • 2023 Q11-12 • 2024 Q11-12
Prove a statement is true, or provide an explicit numerical counterexample to show it's false. Worth 3-5 points each. Appears on every endterm and resit (2 questions each).
Common Topics & Strategies
- Frequent true statements:
- If v is an eigenvector of A and of B, then v is an eigenvector of AB (prove by direct computation)
- ||u-v|| = ||u+v|| ⇒ u ⊥ v (expand the dot products)
- Eigenvalues of an orthogonal projection are 0 or 1 (use P^2 = P)
- A matrix equation (e.g., 2A^2 + 3A = 4I) implies invertibility (express A^-1 algebraically)
- If A^T A is diagonal, the columns of A are orthogonal (use the definition of the entries of A^T A)
- Frequent false statements (need counterexample):
- A, B diagonalizable ⇒ AB diagonalizable (use simple 2×2 upper triangular matrices)
- A row equivalent to B and B diagonalizable ⇒ A diagonalizable
- A similar to symmetric B ⇒ A orthogonally diagonalizable
- (A+B)(A-B) = A^2 - B^2 (fails because AB ≠ BA in general)
- λ = 1 eigenvalue of A ⇒ λ = 1 eigenvalue of row-equivalent B (row operations preserve only the eigenvalue 0, i.e., singularity)
- v eigenvector of A^2 ⇒ v eigenvector of A
- Proof strategies: direct computation, Invertible Matrix Theorem, properties of determinants, dot product expansion
- Counterexample strategies: Try 2×2 matrices with simple entries (0,1); upper triangular matrices; diagonal matrices
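The counterexample recipe in action, for the (A+B)(A-B) statement: two simple 2×2 matrices with entries 0 and 1 that do not commute.

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0, 0.0],
              [1.0, 0.0]])

lhs = (A + B) @ (A - B)
rhs = A @ A - B @ B
print(np.allclose(lhs, rhs))        # False: the cross terms -AB + BA survive
print(np.allclose(A @ B, B @ A))    # False: A and B do not commute
```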
25. Orthonormal Matrices and Properties Low
END 2019 Q19 • 2022 Q15 |
RES 2019 Q16
Properties of orthogonal matrices (Q^T Q = I vs QQ^T = I), whether products of orthogonal matrices are orthogonal, etc.
Methods to Solve
- Q has orthonormal columns iff Q^T Q = I
- Q has orthonormal rows iff QQ^T = I
- For square Q: Q^T Q = I ⇔ QQ^T = I ⇔ Q^-1 = Q^T
- Product of orthogonal matrices is orthogonal: (UV)^T(UV) = V^T U^T U V = I
- Sum of orthogonal matrices is NOT necessarily orthogonal (counterexample: I + (-I) = 0)
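The product and sum claims can be confirmed numerically with a rotation and a reflection (both orthogonal):

```python
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation
R = np.array([[0.0, 1.0],
              [1.0, 0.0]])                         # reflection across y = x

print(np.allclose(Q.T @ Q, np.eye(2)))             # Q^T Q = I
M = Q @ R
print(np.allclose(M.T @ M, np.eye(2)))             # product is still orthogonal
S = np.eye(2) + (-np.eye(2))                       # I + (-I) = 0
print(np.allclose(S.T @ S, np.eye(2)))             # False: sums need not be orthogonal
```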
Generated from 11 exams: Midterms 2020, 2023, 2024 | Endterms 2019, 2022, 2023, 2024 | Resits 2019, 2022, 2023, 2024
CSE1205 Linear Algebra, TU Delft