optimization - Can variance be replaced by absolute value in this objective function?


The objective function I initially modeled is as follows:

argmin var(f(x),g(x))+var(c(x),d(x))

where f, g, c, and d are linear functions.

In order to be able to use linear solvers, I remodeled the problem as follows:

argmin abs(f(x),g(x))+abs(c(x),d(x))

Is it correct to change variance to absolute value in this context? I'm pretty sure both imply the same meaning: having the least difference between the two functions.

You haven't given enough context to answer the question. Though the question doesn't seem to be about regression, it is in many ways similar to the question of choosing between least squares and least absolute deviations approaches to regression. If a term in the objective function is in some sense an error term, the appropriate way to model that error depends on the nature of the error distribution. Least squares is better if there is normally distributed noise. Least absolute deviations is better in a nonparametric setting and is less sensitive to outliers. If the problem has nothing to do with probability at all, other criteria need to be brought in to decide between the two options.
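The outlier-sensitivity contrast can be seen in the simplest possible case: fitting a single constant to data. Minimizing the sum of squared deviations gives the mean, while minimizing the sum of absolute deviations gives the median. A minimal sketch (the data values here are made up for illustration):

```python
import numpy as np

# Fitting a constant c to data:
#   minimizing sum((x - c)^2)  -> c = mean   (least squares)
#   minimizing sum(|x - c|)    -> c = median (least absolute deviations)
data = np.array([1.0, 2.0, 3.0, 4.0, 100.0])  # one large outlier

least_squares_fit = data.mean()      # pulled strongly toward the outlier
least_abs_dev_fit = np.median(data)  # barely affected by the outlier

print(least_squares_fit)  # 22.0
print(least_abs_dev_fit)  # 3.0
```

The single outlier drags the least-squares fit far from the bulk of the data, while the absolute-deviation fit stays put.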

Having said this, the two ways of measuring distance are broadly similar. One is small if and only if the other is -- though they won't be equally small. If they are similar enough for your purposes, the fact that absolute values can be linearized is a motivation to use them. On the other hand -- if the variance-based objective is a better expression of what you are interested in, the mere fact that you can't use LP isn't sufficient justification to adopt absolute values. After all -- quadratic programming is not much harder than LP, at least below a certain scale.
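For reference, the linearization mentioned above works by introducing an auxiliary variable t with t >= f(x) - g(x) and t >= g(x) - f(x), then minimizing t. A minimal sketch using `scipy.optimize.linprog` on a toy instance (the particular functions and bounds are made up for illustration):

```python
from scipy.optimize import linprog

# Minimize |2x - 6| over 0 <= x <= 10, linearized as an LP:
#   minimize t  subject to  2x - 6 <= t  and  -(2x - 6) <= t
# Decision vector is [x, t].
c = [0, 1]                       # objective: 0*x + 1*t
A_ub = [[2, -1],                 #  2x - t <=  6
        [-2, -1]]                # -2x - t <= -6
b_ub = [6, -6]
bounds = [(0, 10), (None, None)]  # x in [0, 10], t free

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x)  # optimum at x = 3, t = 0 (where |2x - 6| vanishes)
```

The same trick extends to a sum of absolute values (one auxiliary variable per term), which is exactly the shape of the remodeled objective in the question.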

To sum up -- they don't imply the same meaning, they imply similar meanings; and whether or not they are similar enough depends upon your purposes.
