What’s going on?

Quite some time since my last post - I’ve been really busy with different stuff. First of all, I had to work a bit more as I was falling behind schedule, but I’m back on track now. I haven’t worked much on niven lately, but the new object management is in good shape. I also polished the integrated math suite a bit; it’s more flexible and generic than before and has a few more useful features. For university, I took some time to implement a few things - iterative solvers, FFT and LUP decomposition. Read on for more details.

  • Iterative solvers - solve equation systems by iterating from a first (guessed) solution. Sorted by difficulty … actually, Richardson is the trickiest, as you need a correction factor smaller than two divided by the spectral radius of the matrix (if you think “huh?” - I also had no idea, but it turns out that the spectral radius is less than or equal to any matrix norm, so bounding it is not that difficult)
    • Richardson
    • Jacobi
    • Gauss-Seidel, normal and with successive over-relaxation (which did not change much on the test data)
  • FFT - fast polynomial multiplication by transforming between coefficient and point-value representation. Although I did a “by-the-book” implementation, I’m not at all satisfied with the quality - the Intel MKL offers much better numeric stability than the naive solution I implemented. I think the problem is the following: instead of computing each n-th root of unity directly, I start from the primitive one and multiply it up to n times, so the rounding error accumulates; a good complex exponentiation should be much more precise here.
  • LUP decomposition - fast and exact solving of linear equations, plus very fast determinants. I implemented it with pivoting, so I can solve a wider range of problems and get better numeric stability. This one was real fun, and it turned out to be simpler than expected.
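To make the iterative-solver idea concrete, here is a minimal sketch of Richardson and Jacobi iteration in plain Python. The function names, the test matrix, and the choice of correction factor are my own illustrations, not taken from the post; for Richardson, choosing the factor below two divided by a matrix norm is a sufficient condition when the matrix is symmetric positive definite, exactly because the spectral radius is bounded by any matrix norm.

```python
def richardson(A, b, theta, iters=200):
    # x_{k+1} = x_k + theta * (b - A x_k).
    # For symmetric positive definite A this converges whenever
    # 0 < theta < 2 / rho(A); since rho(A) <= ||A|| for any matrix norm,
    # theta < 2 / ||A|| is a safe (computable) choice.
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        r = [b[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        x = [x[i] + theta * r[i] for i in range(n)]
    return x

def jacobi(A, b, iters=100):
    # Each sweep solves row i for x_i, using the previous iterate for all
    # other unknowns; converges e.g. for strictly diagonally dominant A.
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        x = [(b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
             for i in range(n)]
    return x
```

For the diagonally dominant system A = [[4, 1], [1, 3]], b = [1, 2] (exact solution x = 1/11, y = 7/11), both solvers converge quickly; the infinity norm of A is 5, so theta = 0.2 < 2/5 is a valid Richardson factor.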
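As a sketch of the FFT-based polynomial multiplication described above - and of the stability fix the post suggests - here is a small recursive radix-2 version that computes every twiddle factor directly via complex exponentiation instead of accumulating powers of the primitive root. All names and the integer-rounding at the end are my own illustrative choices.

```python
import cmath

def fft(a, invert=False):
    # Recursive radix-2 Cooley-Tukey transform; len(a) must be a power of two.
    n = len(a)
    if n == 1:
        return a[:]
    even = fft(a[0::2], invert)
    odd = fft(a[1::2], invert)
    sign = 1 if invert else -1
    out = [0j] * n
    for k in range(n // 2):
        # Compute each root of unity directly from its angle - no error
        # accumulation from repeated multiplication of the primitive root.
        w = cmath.exp(sign * 2j * cmath.pi * k / n)
        out[k] = even[k] + w * odd[k]
        out[k + n // 2] = even[k] - w * odd[k]
    return out

def poly_mul(p, q):
    # Coefficients -> point values, pointwise multiply, transform back.
    n = 1
    while n < len(p) + len(q) - 1:
        n *= 2
    fp = fft([complex(c) for c in p] + [0j] * (n - len(p)))
    fq = fft([complex(c) for c in q] + [0j] * (n - len(q)))
    prod = fft([x * y for x, y in zip(fp, fq)], invert=True)
    # The inverse transform needs a 1/n scaling; round back to integers here
    # since the example inputs are integer coefficient vectors.
    return [round((c / n).real) for c in prod[:len(p) + len(q) - 1]]
```

For example, (1 + 2x)(3 + 4x) comes out as the coefficient vector [3, 10, 8].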
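Finally, a compact sketch of LUP decomposition with partial pivoting, covering the two uses mentioned above: solving linear systems and computing determinants cheaply from the factorization. Function names and the combined L/U storage layout are my own assumptions, not the post’s actual implementation.

```python
def lup_decompose(A):
    # Doolittle decomposition with partial pivoting: P A = L U, where L is
    # unit lower triangular and U upper triangular, stored in one matrix.
    n = len(A)
    A = [row[:] for row in A]       # work on a copy
    perm = list(range(n))           # row permutation P
    swaps = 0
    for k in range(n):
        # Choose the largest pivot in column k for numerical stability.
        p = max(range(k, n), key=lambda i: abs(A[i][k]))
        if p != k:
            A[k], A[p] = A[p], A[k]
            perm[k], perm[p] = perm[p], perm[k]
            swaps += 1
        for i in range(k + 1, n):
            A[i][k] /= A[k][k]
            for j in range(k + 1, n):
                A[i][j] -= A[i][k] * A[k][j]
    return A, perm, swaps

def lup_solve(LU, perm, b):
    n = len(b)
    # Forward substitution against L (unit diagonal), with permuted b.
    y = [0.0] * n
    for i in range(n):
        y[i] = b[perm[i]] - sum(LU[i][j] * y[j] for j in range(i))
    # Back substitution against U.
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(LU[i][j] * x[j] for j in range(i + 1, n))) / LU[i][i]
    return x

def lup_det(LU, swaps):
    # det(A) = (-1)^swaps * product of U's diagonal entries.
    d = -1.0 if swaps % 2 else 1.0
    for i in range(len(LU)):
        d *= LU[i][i]
    return d
```

Once the O(n^3) decomposition is done, each solve is only two O(n^2) substitution passes, and the determinant falls out of the diagonal for free - which is where the “very fast determinants” come from.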

