Lecture 34.
Least squares.
I began by reviewing the normal
equations used to fit a line to a set of data points. I then used the concise
matrix notation to represent the line fitting equations. I pointed out how the
matrix version easily generalizes for any linear combinations of functions, and
in particular polynomial functions. I wrote out the underlying matrix for
fitting a degree k polynomial to a set of data points. The matrix we got is a
Vandermonde matrix, and we know those are likely to be ill-conditioned. Thus I
mentioned in passing that there are other, better-conditioned techniques for
solving the least squares curve fitting problem.
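The polynomial fit described above can be sketched as follows. This is a minimal illustration using NumPy rather than MATLAB, with made-up data points; it forms the Vandermonde matrix, solves the normal equations directly, and compares against NumPy's QR-based `lstsq` solver, one of the better-conditioned alternatives.

```python
import numpy as np

# Sample data points (made up for illustration only).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 2.7, 5.8, 6.6, 7.5])

k = 2  # degree of the fitting polynomial

# Vandermonde matrix: column j holds x**j, so A @ c evaluates
# c[0] + c[1]*x + ... + c[k]*x**k at each data point.
A = np.vander(x, k + 1, increasing=True)

# Normal equations: (A^T A) c = A^T y.
c = np.linalg.solve(A.T @ A, A.T @ y)

# Better-conditioned alternative: least squares via orthogonal
# factorization, avoiding the squared condition number of A^T A.
c_lstsq, *_ = np.linalg.lstsq(A, y, rcond=None)

print(c)
print(c_lstsq)
```

On a small well-conditioned example like this the two answers agree; the difference only becomes visible for higher degrees or wider data ranges, where the Vandermonde matrix degrades.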
I then showed how we could
transform an exponential function into a linear function so that least squares
line fitting could be applied.
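The exponential-to-linear transformation works by taking logarithms: if y = a·e^(bx), then log y = log a + b·x, which is a straight line in x. A minimal NumPy sketch, with data values made up to roughly follow y = 2·e^x:

```python
import numpy as np

# Data roughly following y = a * exp(b * x), with a = 2, b = 1
# (values made up for illustration).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 5.5, 14.7, 40.5, 109.0])

# Taking logs turns y = a * exp(b x) into log y = log a + b x,
# a line in x, so ordinary least squares line fitting applies.
A = np.column_stack([np.ones_like(x), x])
coef, *_ = np.linalg.lstsq(A, np.log(y), rcond=None)

a = np.exp(coef[0])  # recover a from the fitted intercept log a
b = coef[1]          # fitted slope is b directly
print(a, b)
```

Note that fitting in log space minimizes the residuals of log y, not of y itself, so the result differs slightly from a true nonlinear least squares fit of the exponential.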
Finally I demonstrated the
censusgui program from Moler. I also showed solutions to the two questions from
Moler on HW-12.
On Thursday I
will continue with solutions to HW-12 and move on to the practice final if time
permits. On Friday, our last lecture, I will finish up with the practice
final.
Posted: Tue - November 29, 2005 at 11:49 AM