Talk:QR decomposition

What does "and the usage for linear least squares problems on the example from Polynomial_regression" mean, specifically, as a task requirement? --[[User:Rdm|Rdm]] 16:02, 17 June 2011 (UTC)
:It means that there is already an existing task on RC ([[Polynomial regression]]) which requires linear least squares, and since LLS is one use case for QR, that task can be used here as an example. The Go and R solutions of [[Polynomial regression]] already use QR instead of the normal equations approach. --[[User:Avi|Avi]] 20:52, 17 June 2011 (UTC)
:According to the Wikipedia page on least squares, QR decomposition is supposed to be more numerically stable than faster approaches such as the normal equations. But we can solve the Polynomial regression example exactly, so I am not sure that example is a good one for QR-based least squares fitting. --[[User:Rdm|Rdm]] 16:34, 17 June 2011 (UTC)
::The advantage is that it already exists, has many solutions, and can be used as a comparison. --[[User:Avi|Avi]] 20:52, 17 June 2011 (UTC)
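
For concreteness, here is a minimal sketch (Python/NumPy, not taken from either task page) of the QR route to that fit, assuming the data set used on [[Polynomial regression]] (y = 3x^2 + 2x + 1 sampled at x = 0..10): the design matrix is factored as A = QR and the coefficients come from back-substituting R c = Q^T y.

<pre>
import numpy as np

# Data assumed to match the Polynomial regression task: y = 3x^2 + 2x + 1 at x = 0..10.
x = np.arange(11, dtype=float)
y = np.array([1, 6, 17, 34, 57, 86, 121, 162, 209, 262, 321], dtype=float)

# Design (Vandermonde) matrix with columns 1, x, x^2.
A = np.vander(x, 3, increasing=True)

# Thin QR factorization: A = Q R, with Q having orthonormal columns and R upper triangular.
Q, R = np.linalg.qr(A)

# Least-squares coefficients from R c = Q^T y, avoiding the normal equations A^T A c = A^T y.
c = np.linalg.solve(R, Q.T @ y)
print(c)  # approximately [1. 2. 3.]
</pre>

In exact arithmetic this gives the same coefficients as the normal equations; the point of QR here is only that it never forms A^T A, whose condition number is the square of that of A.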