
Exercise

Section Logistic Regression discussed logistic regression as a machine learning (ML) method that learns a linear hypothesis map by minimizing the logistic loss. The logistic loss has computationally pleasant properties: it is smooth and convex. However, in some applications we might ultimately be interested in the accuracy or, equivalently, the average 0/1 loss.
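For concreteness, one common convention (an assumption here, not fixed by the exercise) uses binary labels [math]y \in \{-1,1\}[/math] and a linear hypothesis [math]h(\mathbf{x}) = \mathbf{w}^{T}\mathbf{x}[/math], so that

[math]
L\big((\mathbf{x},y),h\big) = \log\big(1+\exp(-y\,h(\mathbf{x}))\big)
\quad\text{(logistic loss)},
\qquad
L^{(0/1)}\big((\mathbf{x},y),h\big) =
\begin{cases}
1 & \text{if } y\,h(\mathbf{x}) \le 0,\\
0 & \text{otherwise}
\end{cases}
\quad\text{(0/1 loss)}.
[/math]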

Can we upper bound the average [math]0/1[/math] loss using the average logistic loss incurred by a given hypothesis on a given training set?
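To make the two quantities in the question concrete, the following is a minimal sketch that computes the average logistic loss and the average 0/1 loss of a given linear hypothesis on a toy training set. The data, the weight vector, and the convention that [math]y\,h(\mathbf{x}) \le 0[/math] counts as a misclassification are illustrative assumptions, not part of the exercise.

<syntaxhighlight lang="python">
import numpy as np

# Toy training set: feature vectors X (m x n) and labels y in {-1, +1}.
# Values are illustrative only.
X = np.array([[1.0, 2.0], [2.0, -1.0], [-1.5, 0.5], [0.5, -2.0]])
y = np.array([1, 1, -1, -1])

# A given (arbitrary) linear hypothesis h(x) = w^T x.
w = np.array([0.8, 0.3])
scores = X @ w  # h(x^{(i)}) for each data point

# Average logistic loss: (1/m) * sum_i log(1 + exp(-y^{(i)} h(x^{(i)})))
avg_logistic = np.mean(np.log1p(np.exp(-y * scores)))

# Average 0/1 loss: fraction of data points with y^{(i)} h(x^{(i)}) <= 0
# (i.e., misclassified under the sign convention assumed above).
avg_01 = np.mean(y * scores <= 0)

print(f"average logistic loss: {avg_logistic:.4f}")
print(f"average 0/1 loss:      {avg_01:.4f}")
</syntaxhighlight>

Running such a comparison for different weight vectors can help build intuition about how the two averages relate before attempting a formal bound.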