Views expressed in these public forums are not endorsed by
Drexel University or The Math Forum.



Re: EM parameters during GMM classifier training
Posted: Feb 4, 2013 5:08 PM


> Is there a way to have more control over the way EM is operating,
> e.g., define starting points?
The following shows that you can specify the starting values of the means, covariances, and mixing proportions. Here the data come from a mixture of three components. If I fit only two components, two of the clusters get merged; by manipulating the starting values I can change which clusters get merged.
>> x = [mvnrnd([4 0],eye(2),100); mvnrnd([4 0],eye(2),100); mvnrnd([0 6],eye(2),100)];
>> plot(x(:,1),x(:,2),'bx')
>> gmdistribution.fit(x,2)

ans =
Gaussian mixture distribution with 2 components in 2 dimensions
Component 1:
Mixing proportion: 0.333259
Mean:    3.6882    0.0566
Component 2:
Mixing proportion: 0.666741
Mean:    1.9401    2.9808
>> s.mu = [2 3; 4 0];
>> s.Sigma = cat(3,eye(2),eye(2));
>> s.PComponents = [.5 .5];
>> gmdistribution.fit(x,2,'start',s)

ans =
Gaussian mixture distribution with 2 components in 2 dimensions
Component 1:
Mixing proportion: 0.665897
Mean:    1.8197    3.0296
Component 2:
Mixing proportion: 0.334103
Mean:    3.8195    0.0333
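For readers without MATLAB, the same idea can be sketched in Python with scikit-learn's GaussianMixture, which accepts explicit EM starting values through its `means_init` and `weights_init` parameters. This is not the original poster's code, just a rough equivalent under that assumption; the cluster means and sample counts mirror the MATLAB session above.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Sample from the same three Gaussian clusters as in the MATLAB session.
rng = np.random.default_rng(0)
x = np.vstack([
    rng.multivariate_normal([4, 0], np.eye(2), 100),
    rng.multivariate_normal([4, 0], np.eye(2), 100),
    rng.multivariate_normal([0, 6], np.eye(2), 100),
])

# Fit two components, supplying the EM starting values explicitly:
# initial means and mixing proportions (analogous to s.mu and
# s.PComponents in the MATLAB 'start' structure).
gm = GaussianMixture(
    n_components=2,
    means_init=[[2, 3], [4, 0]],
    weights_init=[0.5, 0.5],
    random_state=0,
)
gm.fit(x)

print(gm.weights_)  # fitted mixing proportions
print(gm.means_)    # fitted component means
```

As in the MATLAB example, changing `means_init` changes which of the three true clusters end up merged into a single fitted component.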
 Tom



