The most important factor that accounts for the variation of y

Discussion in 'Scientific Statistics Math' started by Mike, May 27, 2009.

  1. Mike

    Mike Guest

    Hi

    I need to pick out the variable that best explains y in an equation like:

    y = x1*x2/(x3*x4), or more generally y = f(x1, x2, x3, x4).

    In fact, x1 through x4 are all functions of some angles and other parameters.
    I have run the model for many conditions, and I notice variation in y.
    I'd like to account for that variation: which factor is the most
    important, i.e. accounts for the variation of y the most? How do I do this?

    I have computed the standard deviation of y and of x1 through x4, but I
    have doubts about that. I don't know how to write an equation like

    stdev(y) = some form of stdev(x.....)

    and then pick the largest term. Or should I compute the correlation
    coefficient r between y and each xi, and pick the largest r as the most
    important factor? Is that reasonable?

    Any suggestions would be appreciated.

    Mike
     
    Mike, May 27, 2009
    #1

  2. Rich Ulrich

    Rich Ulrich Guest

    I have to assume that you are referring to a regression-like
    setting where the fit is not perfect; otherwise the statistical
    problem, right off, makes no sense at all: everything is perfectly
    determined, symmetrically in the prediction (barring scaling
    as "raw" or "reciprocal"), and you want to know which variable
    matters most?

    I suggest that you read up on the problem for the simpler
    linear regression case, where y is a linear combination of the x's
    and coefficients: y = a + b1*x1 + b2*x2 + ... .

    Since there is no general, accepted solution to "best predictor"
    in this easy case, except where the same variable scores as
    "best" by both the univariate and regression p-values, you
    have some pondering to do.
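
    To make that concrete, here is a rough Python sketch (my own toy data,
    not your model; it assumes numpy and statsmodels are available) comparing
    the univariate correlations with the standardized multiple-regression
    coefficients and their p-values:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 200
    X = rng.normal(size=(n, 4))                    # stand-in for x1..x4
    y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.3, size=n)

    # Univariate view: correlation of each x with y.
    print([round(np.corrcoef(X[:, j], y)[0, 1], 3) for j in range(4)])

    # Regression view: standardize so the coefficients are comparable.
    Xz = (X - X.mean(axis=0)) / X.std(axis=0)
    yz = (y - y.mean()) / y.std()
    fit = sm.OLS(yz, sm.add_constant(Xz)).fit()
    print(fit.params[1:])     # standardized (beta) coefficients
    print(fit.pvalues[1:])    # regression p-values
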
    Function of *angles*? Surely, that makes the situation
    even more complicated.

    "Least squares regression" is what is used to account
    for "variation in y" when that is measured as "variance".

    For your non-linear example, the problem *simplifies* to
    a linear regression case if you are willing to take the logs
    of both sides, so that you minimize the variance of log(y)
    instead of the variance of y.
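
    A rough sketch of that log trick for y = x1*x2/(x3*x4), on made-up
    positive data (the logs only exist if everything is positive):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 200
    x = rng.uniform(0.5, 2.0, size=(n, 4))         # stand-in for x1..x4 (positive)
    y = x[:, 0] * x[:, 1] / (x[:, 2] * x[:, 3])
    y *= np.exp(rng.normal(scale=0.05, size=n))    # small multiplicative noise

    # log(y) = log(x1) + log(x2) - log(x3) - log(x4), linear in the logs
    fit = sm.OLS(np.log(y), sm.add_constant(np.log(x))).fit()
    print(fit.params)    # should come out near [0, 1, 1, -1, -1]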

    Instead of looking at variances by Least Squares, maximum
    likelihood is often the criterion used for non-linear fitting
    (ML methods).
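
    For instance, a minimal maximum-likelihood sketch for a made-up
    non-linear model with Gaussian errors, using scipy.optimize (the sine
    form below is purely illustrative, not your f):

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(2)
    x = rng.uniform(0.1, np.pi / 2, size=200)
    y = 1.5 * np.sin(0.8 * x) + rng.normal(scale=0.1, size=x.size)

    def neg_log_likelihood(params):
        a, b, log_sigma = params
        sigma = np.exp(log_sigma)                  # keeps sigma positive
        resid = y - a * np.sin(b * x)
        return x.size * np.log(sigma) + 0.5 * np.sum(resid**2) / sigma**2

    result = minimize(neg_log_likelihood, x0=[1.0, 1.0, 0.0])
    print(result.x[:2])                            # estimates of a and b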
     
    Rich Ulrich, May 28, 2009
    #2
