I have a function f(x) that returns either 0 or 1. The output is non-deterministic, but there is generally a positive correlation between the value of x and the likelihood of f(x) returning 1. What I'd like to do is iterate over a set of x-values and come up with some new function g(x) that, for arbitrary x, outputs a value in the range [0,1] approximating the likelihood that f(x) will return 1.
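To make the setup concrete, here's a minimal sketch. The real f is a black box; the `f` below is a hypothetical stand-in that returns 1 with a probability that ramps up with x. The per-x empirical frequency of 1s is the crude estimate I can already compute; what I'm looking for is the principled procedure that turns samples like these into a g(x) defined for arbitrary x.

```python
import random

random.seed(0)  # just for reproducibility of this sketch

def f(x):
    # Hypothetical stand-in for my real black-box function:
    # returns 1 with a probability that increases with x.
    p = min(max(x / 10.0, 0.0), 1.0)
    return 1 if random.random() < p else 0

# Sample f many times at each x-value I care about...
xs = [1, 3, 5, 7, 9]
samples = {x: [f(x) for _ in range(1000)] for x in xs}

# ...and take the empirical frequency of 1s at each x.
# This gives me point estimates at the sampled x-values only;
# I want a g(x) that generalizes to x-values I haven't sampled.
g_hat = {x: sum(s) / len(s) for x, s in samples.items()}
```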
I've taken some statistics courses, and I'm sure that there's some procedure for doing what I want, but I haven't the faintest idea what it's called. Any tips?