45  Topics for Presentations/Posters/Papers

Warning

To be updated….

Presentations/posters will take place in the week beginning April 27. Papers will be due the same week.

Presentations.

Posters.

Papers.

45.1 Rubric

Here are the things that I’ll be looking for in your presentations/posters/papers:

Content (4 points)

  • Accuracy: the method/results/proof/examples should be correct.
  • Originality: tell us something that we haven’t seen in lectures but that links to existing topics or extends the content covered in class or on the homework assignments.
  • Give examples to motivate/explain key ideas.

Numerical Implementation (2 points)

  • Show us some numerical results/experimentation

Presentation (2 points)

  • You will be graded on how clearly you present the key ideas.
  • Think about how you want to present (whiteboard/slides/notebook) - what is the best way to convey the main point?

Questions (2 points)

  • How well are you able to answer the questions at the end?
  • Bonus points for asking good questions.

(The following is in no particular order. To be updated….)

45.2 Convergence of Jacobi and Gauss-Seidel

On the midterm we proved that Jacobi and Gauss-Seidel (GS) converge for strictly diagonally dominant (SDD) matrices. Convergence actually holds under a weaker assumption: show that Jacobi and GS converge when A is irreducibly diagonally dominant.

45.3 “From potential theory to matrix iterations in six steps”

Pick out something that you find interesting from the paper by Driscoll, Toh, and Trefethen (1998). (More than one person can do this; just coordinate so that you are not giving the same presentation.)

45.4 Convergence of Conjugate Gradient

Give some ideas in the proof of Theorem 11.2 from the lecture notes.

45.5 Preconditioned CG

Use, e.g., an incomplete LU factorisation to improve the conditioning of a matrix. Give numerical examples showing that this improves the convergence behaviour.

45.6 Arnoldi Iteration

The corresponding section of Driscoll and Braun (2018) would be quite useful.
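
As a starting point, a bare-bones Arnoldi iteration might look like the following sketch (assuming no breakdown, i.e. that the Krylov subspace keeps growing):

```python
import numpy as np

def arnoldi(A, q1, m):
    """m steps of Arnoldi: builds orthonormal Q and upper Hessenberg H
    satisfying A Q[:, :m] = Q[:, :m+1] @ H."""
    n = len(q1)
    Q = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    Q[:, 0] = q1 / np.linalg.norm(q1)
    for j in range(m):
        v = A @ Q[:, j]
        for i in range(j + 1):          # modified Gram-Schmidt
            H[i, j] = Q[:, i] @ v
            v -= H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(v) # assumes no breakdown (norm > 0)
        Q[:, j + 1] = v / H[j + 1, j]
    return Q, H

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 100))
Q, H = arnoldi(A, rng.standard_normal(100), 20)

# Check the Arnoldi relation and the orthonormality of Q:
print(np.linalg.norm(A @ Q[:, :20] - Q @ H))
print(np.linalg.norm(Q.T @ Q - np.eye(21)))
```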

45.7 GMRES
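
For numerical experiments, SciPy's built-in GMRES could serve as a baseline before (or alongside) your own implementation; a minimal sketch on a well-conditioned nonsymmetric system:

```python
import numpy as np
import scipy.sparse.linalg as spla

# Nonsymmetric, well-conditioned test matrix (shifted random matrix).
rng = np.random.default_rng(2)
n = 200
A = 4 * np.eye(n) + rng.standard_normal((n, n)) / np.sqrt(n)
b = rng.standard_normal(n)

x, info = spla.gmres(A, b)  # info == 0 signals convergence
print(info, np.linalg.norm(A @ x - b) / np.linalg.norm(b))
```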

45.8 MINRES

When A is symmetric (or Hermitian), the Arnoldi iteration simplifies to the Lanczos iteration (and GMRES becomes MINRES).
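
A quick experiment with SciPy's MINRES on a symmetric *indefinite* system, where CG is not applicable (a sketch; the test matrix is constructed with eigenvalues on both sides of zero):

```python
import numpy as np
import scipy.sparse.linalg as spla

# Symmetric indefinite matrix: eigenvalues in [-5, -1] and [1, 5].
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((200, 200)))
eigs = np.concatenate([np.linspace(-5, -1, 100), np.linspace(1, 5, 100)])
A = Q @ np.diag(eigs) @ Q.T
b = rng.standard_normal(200)

x, info = spla.minres(A, b)  # info == 0 signals convergence
print(info, np.linalg.norm(A @ x - b) / np.linalg.norm(b))
```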

45.9 Solving Poisson’s equation in 2d

See Poisson’s equation in 2d.

45.10 Chebyshev differentiation matrices
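
One possible starting point: the (N+1)-point Chebyshev differentiation matrix, following the classical construction in Trefethen's Spectral Methods in MATLAB (a sketch):

```python
import numpy as np

def cheb(N):
    """Chebyshev differentiation matrix D and Chebyshev points x on [-1, 1]."""
    if N == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.pi * np.arange(N + 1) / N)
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    dX = x[:, None] - x[None, :]
    D = np.outer(c, 1 / c) / (dX + np.eye(N + 1))  # off-diagonal entries
    D -= np.diag(D.sum(axis=1))                    # diagonal: negative row sums
    return D, x

D, x = cheb(8)
# Spectral differentiation is exact for polynomials of degree <= N:
print(np.max(np.abs(D @ x**3 - 3 * x**2)))
```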

45.11 (other) Applications of the Discrete Fourier Transform

Find something that you find interesting in, e.g., Brigham, E. O. (1988), The Fast Fourier Transform and Its Applications. For example, how can you use the FFT to compute the product of two polynomials? Why is this useful?
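
The polynomial-product example can be sketched in a few lines: zero-pad both coefficient vectors to the length of the product, multiply pointwise in frequency space, and transform back (this is discrete convolution via the FFT):

```python
import numpy as np

def poly_mul_fft(p, q):
    """Multiply polynomials (coefficient lists, lowest degree first) via the FFT.

    The product has degree deg(p) + deg(q), so we pad to length
    len(p) + len(q) - 1 before transforming.
    """
    n = len(p) + len(q) - 1
    fp = np.fft.rfft(p, n)
    fq = np.fft.rfft(q, n)
    return np.fft.irfft(fp * fq, n)

# (1 + 2x)(3 + x + x^2) = 3 + 7x + 3x^2 + 2x^3
print(np.round(poly_mul_fft([1, 2], [3, 1, 1])))  # [3. 7. 3. 2.]
```

This costs O(n log n) rather than the O(n^2) of the schoolbook method, which is one answer to "why is this useful?".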

45.12 Richardson-Lucy iteration for image de-blurring

Suppose you have a grayscale image represented by a matrix in [0,1]^{n\times m} that has been flattened into a vector b \in [0,1]^{nm} (e.g. b represents a NASA image from the Hubble Space Telescope), and a blurring matrix A. The observed image is then b = Ax, where x is the original (sharp) image. The Richardson-Lucy method is an iterative method that tries to recover x.
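
A sketch of the basic multiplicative update on a toy 1D "image" (assuming a nonnegative blurring matrix with positive column sums; the classical derivation is for Poisson noise, and all names here are illustrative):

```python
import numpy as np

def richardson_lucy(A, b, iters=200):
    """Richardson-Lucy iteration for b = A x with nonnegative data.

    Update: x <- x * A^T(b / (A x)) / (A^T 1). Preserves nonnegativity.
    """
    x = np.full(A.shape[1], b.mean())   # positive initial guess
    At1 = A.T @ np.ones(A.shape[0])
    for _ in range(iters):
        x = x * (A.T @ (b / (A @ x))) / At1
    return x

# Toy 1D signal blurred by a Gaussian convolution matrix.
n = 100
t = np.arange(n)
A = np.exp(-0.5 * ((t[:, None] - t[None, :]) / 2.0) ** 2)
A /= A.sum(axis=0)                      # normalise columns
x_true = np.zeros(n); x_true[30] = 1.0; x_true[60] = 0.5
b = A @ x_true                          # observed (blurred) signal

x_rec = richardson_lucy(A, b)
print(np.linalg.norm(x_rec - x_true), np.linalg.norm(b - x_true))
```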

45.13 A posteriori error analysis and mesh refinement

See Suli and Mayers (2012), Section 14.5, on FEM.

45.14 Fornberg algorithm

Fornberg’s algorithm computes finite-difference weights on arbitrary (not necessarily equispaced) nodes.
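
A compact transcription of Fornberg's recursive algorithm (a sketch; check it against the original 1988 paper before presenting):

```python
import numpy as np

def fd_weights(z, x, m):
    """Fornberg's algorithm: weights c[j, k] such that the k-th derivative
    at z is approximated by sum_j c[j, k] * f(x[j]), for k = 0, ..., m."""
    n = len(x)
    c = np.zeros((n, m + 1))
    c[0, 0] = 1.0
    c1, c4 = 1.0, x[0] - z
    for i in range(1, n):
        mn = min(i, m)
        c2, c5, c4 = 1.0, c4, x[i] - z
        for j in range(i):
            c3 = x[i] - x[j]
            c2 *= c3
            if j == i - 1:
                for k in range(mn, 0, -1):
                    c[i, k] = c1 * (k * c[i - 1, k - 1] - c5 * c[i - 1, k]) / c2
                c[i, 0] = -c1 * c5 * c[i - 1, 0] / c2
            for k in range(mn, 0, -1):
                c[j, k] = (c4 * c[j, k] - k * c[j, k - 1]) / c3
            c[j, 0] = c4 * c[j, 0] / c3
        c1 = c2
    return c

# Equispaced nodes recover the classical central-difference stencils:
c = fd_weights(0.0, np.array([-1.0, 0.0, 1.0]), 2)
print(c[:, 1])  # first derivative:  [-0.5  0.   0.5]
print(c[:, 2])  # second derivative: [ 1.  -2.   1. ]
```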

45.15 Google PageRank and the Power Method

Explain how Google uses the power method (see the midterm) in its page-ranking algorithm.
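
A sketch on a hypothetical 4-page web (the link matrix and damping factor 0.85 are illustrative; PageRank is the dominant eigenvector of the column-stochastic "Google matrix"):

```python
import numpy as np

# Column-stochastic link matrix: page j splits its "vote" equally
# among the pages it links to.
L = np.array([[0,   0,   1, 1/2],
              [1/3, 0,   0, 0  ],
              [1/3, 1/2, 0, 1/2],
              [1/3, 1/2, 0, 0  ]])

alpha = 0.85                       # damping factor
n = L.shape[0]
G = alpha * L + (1 - alpha) / n    # Google matrix (still column-stochastic)

# Power method: iterate r <- G r; converges to the dominant eigenvector
# since the second-largest eigenvalue has modulus at most alpha.
r = np.full(n, 1 / n)
for _ in range(100):
    r = G @ r
print(r)                           # PageRank scores (sum to 1)
```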

45.16 Convergence analysis of FEM

What is the order of convergence of FEM in terms of the regularity of the solution and the polynomial degree of the basis?

45.17 Driscoll and Braun (2018)

Pick out a couple of sections from Driscoll and Braun (2018) that we haven’t covered in class.


Driscoll, Tobin A., and Richard J. Braun. 2018. Fundamentals of Numerical Computation. https://fncbook.com/.
Driscoll, Tobin A., Kim-Chuan Toh, and Lloyd N. Trefethen. 1998. “From Potential Theory to Matrix Iterations in Six Steps.” SIAM Review 40 (3): 547–78.
Suli, Endre, and David F. Mayers. 2012. An Introduction to Numerical Analysis. Cambridge, England: Cambridge University Press.