Optimization

  • The \(5^{th}\) set of lecture notes is linked here: Optimization Lecture Notes 5 (Google Drive Link), Lecture Notes 5 (GitHub Link).
  • Difference between necessary and sufficient conditions: a sufficient condition is enough to guarantee the result, but the result can hold without it. Recalled how Taylor’s theorem can be used to give a sufficient condition: if the gradient is \(0\) and the Hessian is positive definite at some point (with the rest of the setup as in the notes), then that point is a local minimum.
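As a quick illustration (my own example, not from the lecture), here is a SymPy sketch that checks this sufficient condition for \(f(x, y) = x^2 + xy + y^2\) at the origin:

```python
# A minimal sketch: verify the second-order sufficient condition for
# f(x, y) = x^2 + x*y + y^2 at the origin.
import sympy as sp

x, y = sp.symbols("x y", real=True)
f = x**2 + x*y + y**2

grad = [sp.diff(f, v) for v in (x, y)]   # (2x + y, x + 2y)
H = sp.hessian(f, (x, y))                # Matrix([[2, 1], [1, 2]])

point = {x: 0, y: 0}
print([g.subs(point) for g in grad])     # [0, 0]: the origin is a critical point

# Positive definite iff every eigenvalue is strictly positive.
eigs = H.subs(point).eigenvals()         # {3: 1, 1: 1} (eigenvalue: multiplicity)
print(all(ev > 0 for ev in eigs))        # True, so the origin is a local minimum
```

Note the condition is sufficient but not necessary: \(f(x, y) = x^4 + y^4\) has a local minimum at the origin even though its Hessian there is the zero matrix.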
  • Problems on optimization over open sets, including some problems that are not obviously posed over an open set but can be seen as such after a suitable transformation.
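For instance (a standard example of such a transformation, not necessarily the one from lecture): minimizing \(x^2 + y^2\) subject to \(x + y = 1\) is not posed over an open set, but substituting \(y = 1 - x\) converts it into minimizing \(g(x) = x^2 + (1 - x)^2\) over all of \(\mathbb{R}\), which is open. Then \(g'(x) = 4x - 2 = 0\) gives \(x = 1/2\), and \(g''(x) = 4 > 0\), so the constrained minimizer is \((1/2, 1/2)\).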
  • Least squares regression.
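Least squares fits this theme: minimizing \(\lVert Aw - b \rVert^2\) over all of \(\mathbb{R}^n\) is optimization over an open set, and setting the gradient \(2A^{T}(Aw - b)\) to zero yields the normal equations \(A^{T}A\,w = A^{T}b\). A small NumPy sketch on synthetic data (my own example, not from the lecture):

```python
# A minimal sketch: least squares as unconstrained (open-set) optimization.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(50, 3))                  # design matrix
w_true = np.array([2.0, -1.0, 0.5])
b = A @ w_true + 0.01 * rng.normal(size=50)   # noisy observations

# Solve the normal equations directly...
w_normal = np.linalg.solve(A.T @ A, A.T @ b)
# ...or use the numerically preferable library routine.
w_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

print(w_normal)   # both are close to w_true
print(w_lstsq)
```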
  • Used the Implicit Function Theorem to convert constrained extremum problems into optimization problems over open sets. I can’t stress enough how important this tool from calculus is. The Implicit Function Theorem is critical to many proofs in analysis, especially when we don’t have nice explicit expressions to work with. Its proof follows as a corollary of the Inverse Function Theorem and is rarely done well in standard textbooks or in a multivariable calculus class. I am writing up an expository proof, which will be posted soon.
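To illustrate how the theorem enables this conversion (my example, not from the notes): to maximize \(f(x, y) = xy\) on the circle \(x^2 + y^2 = 1\) near a point with \(y > 0\), note that \(\partial_y(x^2 + y^2 - 1) = 2y \neq 0\), so the Implicit Function Theorem gives a local solution \(y = \varphi(x) = \sqrt{1 - x^2}\). The problem becomes maximizing \(h(x) = x\sqrt{1 - x^2}\) over the open interval \((-1, 1)\), where \(h'(x) = \frac{1 - 2x^2}{\sqrt{1 - x^2}} = 0\) gives \(x = 1/\sqrt{2}\), recovering the maximizer \((1/\sqrt{2}, 1/\sqrt{2})\). Here \(\varphi\) happens to have an explicit formula, but the theorem guarantees such a \(\varphi\) exists (and is differentiable) even when no closed form is available, which is exactly when it earns its keep.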