Views expressed in these public forums are not endorsed by
Drexel University or The Math Forum.



Characterizing a nondeterministic function
Posted:
Jun 20, 2012 11:30 PM


I have a function f(x) that returns either 0 or 1. The output is nondeterministic, but there is generally a positive correlation between the value of x and the likelihood of f(x) returning 1. What I'd like to do is iterate over a set of x values and come up with some new function g(x) that, for arbitrary x, outputs a value in the range [0,1] approximating the likelihood that f(x) will return 1.
I've taken some statistics courses, and I'm sure that there's some procedure for doing what I want, but I haven't the faintest idea what it's called. Any tips?
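The setup described above can be sketched in code. This is only an illustration, not an answer: f below is a hypothetical stand-in whose true success probability is chosen for the simulation, and g is approximated at each x by the empirical frequency of 1s over repeated calls (the names f, estimate_g, and the sigmoid-shaped probability are all assumptions for the sketch):

```python
import math
import random

def f(x):
    # Hypothetical stand-in for the nondeterministic function:
    # returns 1 with a probability that increases with x.
    return 1 if random.random() < 1 / (1 + math.exp(-x)) else 0

def estimate_g(x, trials=10_000):
    # Approximate the likelihood that f(x) returns 1 by the
    # empirical frequency of 1s over many independent calls.
    return sum(f(x) for _ in range(trials)) / trials

random.seed(0)
for x in (-2.0, 0.0, 2.0):
    print(x, round(estimate_g(x), 2))
```

For a fixed grid of x values this frequency estimate works directly; getting a g(x) defined for arbitrary x additionally requires fitting some curve through those estimates, which is presumably the procedure being asked about.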



