Math Forum » Discussions » Software » comp.soft-sys.math.mathematica

Topic: more on vector unions
Replies: 0  

Russell Towle

Posts: 70
Registered: 12/3/04
more on vector unions
Posted: Apr 29, 2000 11:41 PM

David Park supplied a new "vunion" (for "vector union") function which
vastly improved upon my own, and incorporated Carl Woll's curious
OrderedUnion function. Park's vunion, though, operated upon a list of
Round-ed proxies for the real-valued vectors.

This distressed me a little, since if my polyhedra were to grow to a
million miles across, well, Round might lead to problems.
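(Park's actual code is not shown in this thread; purely as a hypothetical sketch of the Round-proxy idea, using the modern GroupBy for brevity, it might look something like this:)

```mathematica
tol = 0.000001;

(* Hypothetical sketch, NOT Park's actual code: vectors that Round to
   the same grid point (after scaling by tol) are treated as duplicates,
   and one representative of each grid point is kept. With very large
   coordinates, the scaled values become huge and near-boundary points
   can round into different bins -- the concern raised above. *)
roundUnion[v_] := Values[GroupBy[v, Round[#/tol] &, First]]

roundUnion[{{1., 2., 3.}, {1. + 1.*^-8, 2., 3.}, {4., 5., 6.}}]
(* {{1., 2., 3.}, {4., 5., 6.}} *)
```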

Carl Woll was kind enough to devise a vector union function too. Here is
what he wrote:

>The vunion function I have in mind would declare two points to be the same
>if the maximum absolute difference of their coordinates is less than some
>tolerance, which I'll abbreviate by tol. First, I'll define a vunion
>function which carries out the above:
>
>
>tol=0.000001;
>
>vunion[v_] := Union[v, SameTest ->
> (Max[Abs[#1 - #2]] < tol &)]
>
>For a large set of points, this vunion function is very slow, since given
>a set of N points, it goes like O(N^2). This is because the current Union
>function works by comparing every possible pair of points to determine if
>they are the same. My idea for speeding up this function is to first split
>up the data set into a disjoint union of smaller sets where the first
>coordinates of these smaller sets are basically (but not really) a
>distance tol apart, which is an O(N log N) process. The "not really" in the
>previous sentence is there because, if the tolerance is large enough, then
>the splitting may not break up the set into smaller pieces after all. For
>example, with a tolerance of .1, the following data set will not be broken
>into smaller pieces:
>
>{{.05,1,1},{.1,1,1},{.15,1,1},{.2,1,1},{.25,1,1}}
>
>even though the first point and the last point are clearly farther than a
>tolerance apart. At any rate, after splitting up into smaller subsets,
>apply the vunion function to each of these smaller subsets. Hence, my
>modified vunion function is
>
>vunion1[v_] := Module[{r},
> r = Split[Sort[v], (Abs[#1[[1]] - #2[[1]]] < tol &)];
> Flatten[vunion /@ r, 1]]
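The two stages can be seen separately on the sample data above (a sketch, with tol = .1 as in Carl's example):

```mathematica
tol = 0.1;
data = {{.05, 1, 1}, {.1, 1, 1}, {.15, 1, 1}, {.2, 1, 1}, {.25, 1, 1}};

(* Stage 1: sort, then split wherever adjacent first coordinates differ
   by tol or more. Here every adjacent gap is .05 < .1, so the whole
   chain remains a single subset, exactly as noted above. *)
subsets = Split[Sort[data], (Abs[#1[[1]] - #2[[1]]] < tol &)];
Length[subsets]
(* 1 *)
```

Stage 2 then hands each subset to the pairwise vunion, which is only O(n^2) in the (usually small) subset size.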



Initial tests of this vunion1 function suggest that it is not only accurate
but very fast; it found the union of a list of 8448 3-vectors in 2.95
seconds. However, I myself believe it should be modified to Sort a Chop-ped
list:

>vunion1[v_] := Module[{r},
> r = Split[Sort[ Chop[v, .000001] ], (Abs[#1[[1]] - #2[[1]]] < tol &)];
> Flatten[vunion /@ r, 1]]


I think this is crucial, because a typical list of vectors (vertices of
polytopes, actually) with many duplicates will have many components close
to zero--which actually should be zero--but which differ in sign. Chop
fixes this and Sort proceeds as it should... right?
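As a small illustration of the sign problem (hypothetical coordinates, not from an actual polytope): two copies of a vertex whose middle coordinate is zero up to roundoff can differ in sign, and Chop collapses them to the same exact vector before Sort and Split ever run:

```mathematica
tol = 0.000001;

(* Two numerically identical vertices whose near-zero coordinate
   differs only in sign *)
v = {{1., 1.*^-9, 1.}, {1., -1.*^-9, 1.}};

(* Chop replaces every component smaller in magnitude than the given
   tolerance with an exact 0, making the two vectors literally equal *)
Chop[v, tol]
(* {{1., 0, 1.}, {1., 0, 1.}} *)
```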

Russell Towle
Box 141
Dutch Flat, CA 95714
(530) 389-2872







