SSRN Author: Chris Hennessy (SSRN Content)
https://privwww.ssrn.com/author=106237
https://privwww.ssrn.com/rss/en-us
Tue, 04 May 2021 01:26:54 GMT

REVISION: Goodhart's Law and Machine Learning: A Structural Perspective
We develop a structural framework illustrating how penalized regression algorithms affect Goodhart bias when training data is clean but covariates are manipulated at cost by future agents facing prediction models. With quadratic manipulation costs, bias is proportional to the sum of squared slopes, micro-founding Ridge; Lasso is micro-founded in the limit under increasingly steep cost functions. However, standard penalization is inappropriate if costs depend on percentage rather than absolute manipulation. Nevertheless, with known costs of either form, the following algorithm is proven manipulation-proof: within the training data, evaluate candidate coefficient vectors at their respective incentive-compatible manipulation configurations. Moreover, we obtain analytical expressions for the resulting coefficient adjustments: slopes (the intercept) shift downward if costs depend on percentage (absolute) manipulation. Statisticians ignoring agent-borne manipulation costs select socially suboptimal ...
https://privwww.ssrn.com/abstract=3639508
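The quadratic-cost mechanism in the abstract can be sketched numerically: if manipulating a covariate by m costs (c/2)m², an agent facing slope β_j optimally shifts covariate j by β_j/c, inflating future predictions by Σ_j β_j²/c (the sum-of-squared-slopes term that micro-founds Ridge). Under absolute-manipulation costs, the stated analytical fix shifts the intercept down by exactly that amount. A minimal numpy sketch, where the data-generating process, the cost parameter c, and all variable names are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
c = 2.0                       # assumed quadratic cost: manipulating by m costs (c/2)*m**2
n, p = 500, 3
a_true = 1.0
beta_true = np.array([0.5, -0.3, 0.8])

# Clean historical training data (no manipulation yet)
X = rng.normal(size=(n, p))
y = a_true + X @ beta_true + 0.1 * rng.normal(size=n)

# Ordinary least squares on the clean data
Z = np.column_stack([np.ones(n), X])
coef = np.linalg.lstsq(Z, y, rcond=None)[0]
a_hat, beta_hat = coef[0], coef[1:]

# A future agent facing the model shifts covariate j by argmax_m beta_j*m - (c/2)*m**2,
# i.e. m_j* = beta_j / c, so predictions are inflated by sum(beta**2)/c:
m_star = beta_hat / c
goodhart_bias = beta_hat @ m_star          # = sum(beta_hat**2) / c

# With absolute-manipulation costs and homogeneous agents, shifting the
# intercept down by exactly the bias makes the model manipulation-proof.
a_adjusted = a_hat - goodhart_bias

# Check on fresh agents: they manipulate, the adjusted model stays unbiased on average.
X_new = rng.normal(size=(n, p))
y_new = a_true + X_new @ beta_true          # true outcomes (noise omitted for clarity)
pred = a_adjusted + (X_new + m_star) @ beta_hat
print(float(pred.mean() - y_new.mean()))    # close to zero
```

The cancellation is exact in expectation: the manipulation term m_star @ beta_hat added to every prediction is precisely the goodhart_bias subtracted from the intercept.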
https://privwww.ssrn.com/2020582.html
Mon, 03 May 2021 15:01:34 GMT

REVISION: Goodhart's Law and Machine Learning
We examine how the choice of penalized regression prediction algorithm affects Goodhart bias and deadweight costs when historical training data is clean, but covariates can be manipulated at a cost by future agents who face the algorithm. With naïve prediction and quadratic costs, Goodhart bias and manipulation costs are proportional to the sum of squared slope coefficients, supporting Ridge penalization. A general algorithm is presented for addressing future model-induced manipulation: within the training data, evaluate each candidate coefficient vector's predictive power at its respective incentive-compatible data manipulation configuration. With homogeneous agents, we obtain simple analytical expressions for downward adjustments of the historical intercept and/or slope coefficients that eliminate future Goodhart bias. In addition to such downward coefficient drift over time, Goodhart's Law also predicts positive correlation between covariates and regression coefficients, as agents attempt to ...
https://privwww.ssrn.com/abstract=3639508
https://privwww.ssrn.com/1924567.html
Thu, 23 Jul 2020 15:00:48 GMT
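The general algorithm described in the abstracts (score each candidate coefficient vector on the training data at its own incentive-compatible manipulation configuration) can be sketched as a direct loss minimization. A hedged numpy sketch, assuming homogeneous agents with absolute quadratic cost (c/2)m² so each covariate shifts by β_j/c; the simulated data, cost parameter, and plain gradient descent are illustrative choices, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
c = 2.0                                    # assumed quadratic manipulation cost parameter
n, p = 400, 2
a_true, beta_true = 0.5, np.array([1.0, -0.6])

# Clean historical training data
X = rng.normal(size=(n, p))
y = a_true + X @ beta_true + 0.1 * rng.normal(size=n)

# Manipulation-proof criterion: evaluate each candidate (a, beta) at the
# covariate configuration agents would choose when facing it, X + beta/c.
def loss(a, beta):
    resid = y - a - (X + beta / c) @ beta
    return np.mean(resid ** 2)

# Minimize by plain gradient descent (illustrative optimizer choice).
# Note d/d beta_j of (X + beta/c) @ beta is X[:, j] + 2*beta_j/c.
a, beta = 0.0, np.zeros(p)
lr = 0.05
for _ in range(5000):
    resid = y - a - (X + beta / c) @ beta
    grad_a = -2 * resid.mean()
    grad_beta = -2 * (X + 2 * beta / c).T @ resid / n
    a -= lr * grad_a
    beta -= lr * grad_beta

# Slopes stay near the clean-data slopes, while the intercept absorbs a
# downward shift of roughly sum(beta**2)/c, pre-empting future Goodhart bias.
print(a, beta)
```

Because the manipulation term (β·β)/c is the same for every training observation under these assumptions, the minimizer keeps the OLS slopes and lowers only the intercept, matching the abstracts' analytical adjustment for absolute-manipulation costs.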