The Math Forum


Math Forum » Discussions » sci.math.* » sci.math.num-analysis


Topic: binary search with arbitrary or random split
Replies: 3   Last Post: Jul 26, 2012 12:02 PM


pamela fluente

Posts: 232
Registered: 1/21/06
Re: binary search with arbitrary or random split
Posted: Jul 26, 2012 11:32 AM

On Jul 26, 3:55 pm, Ben Bacarisse <> wrote:

> You did not answer my key question (maybe I forgot to ask it!). Do you
> agree that the cost function (in terms of compares) of your uniform
> splitting algorithm is C(n) = n?  I can say that this is not in
> O(log(n)) (but it is in O(n)) without taking any limits, and without any
> reference to probability.

Sure it is.
You have understood my point perfectly, though.

Considering the formulation with random splitting point n/k (not n-k!) (again: "divide", not subtract!):
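To make the variant under discussion concrete, here is a hypothetical sketch (my illustration, not code from the thread): instead of probing the midpoint, each step probes a uniformly random index in the current range, so the surviving range is a random fraction of the old one.

```python
import random

def random_split_search(a, target, rng=random):
    """Search sorted list a, splitting at a uniformly random index
    instead of the midpoint. This is the "divide, not subtract" case:
    the remaining range shrinks by a random factor, not by a constant.
    Returns the index of target, or None if absent."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = rng.randint(lo, hi)  # random split point in [lo, hi]
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return None  # target not present
```

The degenerate runs the thread mentions are the ones where the random split repeatedly lands at an end of the range, turning the search into a linear scan.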

IF the only cases where I get cost n are the "degenerate" ones, where by mere (insignificant, for large n) chance the binary search happens to become a linear search, I would still object that, in a probabilistic setting, using sure convergence has little meaning.

So, considering that O() is defined using limits, as far as I can see, *in a probabilistic setting* (which is what we are considering here) I would consider *probabilistic forms* of convergence.

You say that this is not the way it is done in CS, and that they always consider "sure" convergence. I can, and have to, accept that for sure. But I also feel I should say that it makes no sense in a probabilistic setting, nor intuitively.

I have in mind large problems, like n bigger than 5000 digits. In this case I only care about asymptotics, and to me, if the cost n reduces to O(log n) asymptotically because the events causing cost n are practically impossible, I think it is right to see it as O(log n), no matter what happens for small n.
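The claim that the cost-n events are practically irrelevant for large n can be checked empirically. The sketch below (my own, with assumed parameters: n = 2^16 keys, 2000 trials, a fixed seed for repeatability) measures the average number of probes of a random-split search; the expected number is known to be Theta(log n), even though the worst case remains n.

```python
import math
import random

def probes_with_random_split(n, target, rng):
    """Count probes made by a random-split search over indices 0..n-1
    until target is found (target is assumed present)."""
    lo, hi = 0, n - 1
    probes = 0
    while lo <= hi:
        mid = rng.randint(lo, hi)  # random split, not the midpoint
        probes += 1
        if mid == target:
            break
        elif mid < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return probes

rng = random.Random(12345)  # fixed seed so the experiment is repeatable
n = 1 << 16                 # 65536 keys
trials = 2000
avg = sum(probes_with_random_split(n, rng.randrange(n), rng)
          for _ in range(trials)) / trials
# The worst case is n probes, but the observed average stays near
# 2*ln(n), i.e. a small constant times log2(n).
```

For n = 2^16 the average comes out around 2 ln n (roughly 20 probes), while log2(n) = 16 and the worst case is 65536.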

Clearly, IF there are other situations which cause the cost n (apart from the linear searches) and which have a probability that does not tend to 0, then ok, I could not, obviously, object to anything.

> O(log(n)) (but it is in O(n)) without taking any limits

This part is not completely clear to me yet. I always understood O() as shorthand for a statement involving a limit, a description of asymptotic behavior. But maybe you are referring to the complexity-class definitions, and those don't use limits (?).
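For what it's worth, the textbook definition Ben seems to be alluding to uses only quantifiers, with no limit operation (my paraphrase of the standard definition):

```latex
f(n) \in O(g(n)) \;\Longleftrightarrow\; \exists\, C > 0,\ \exists\, N:\ \forall n \ge N,\ \ f(n) \le C\, g(n).
```

With the cost function C(n) = n, taking C = 1 shows C(n) is in O(n); and since no constant C satisfies n <= C log(n) for all large n, C(n) is not in O(log n). No limit is ever taken.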

> --
> Ben.


© The Math Forum at NCTM 1994-2018. All Rights Reserved.