Homework

Bayes Classifier:

A Bayes classifier uses a probabilistic model of the features to predict the class of a new example.

It can be used if we know the following information:

the number of classes, the class means (centroids), the covariance matrices, and the a priori probabilities.

Algorithm:

[z]=bayes_classifier(m,S,P,X)
% Bayesian classification rule for c classes, modeled by Gaussian
% distributions (also used in Chapter 2).
%
% INPUT ARGUMENTS:
%   m: lxc matrix, whose j-th column is the mean of the j-th class.
%   S: lxlxc matrix, where S(:,:,j) corresponds to the covariance matrix
%      of the normal distribution of the j-th class.
%   P: c-dimensional vector, whose j-th component is the a priori
%      probability of the j-th class.
%   X: lxN matrix, whose columns are the data vectors to be classified.
%
% OUTPUT ARGUMENTS:
%   z: N-dimensional vector, whose i-th element is the label of the
%      class where the i-th data vector is classified.

1. Assuming the class distributions are Gaussian, compute the class-conditional probability density p(x|wi) for each class.

2. Assign x to class wi iff P(wi)p(x|wi) > P(wj)p(x|wj) for all j != i (equivalently, P(wi|x) > P(wj|x); this is what the sketch and the program below implement).
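The body of bayes_classifier is not listed above. The following is only a minimal sketch consistent with that interface; it assumes that comp_gauss_dens_val(m,S,x) returns the value of the Gaussian density with mean m and covariance S at the point x, as it is used in the program below:

function z = bayes_classifier(m, S, P, X)
% Sketch of the Bayes rule for Gaussian classes: assign each column of X
% to the class that maximizes P(wj)*p(x|wj).
[~, c] = size(m);                    % number of classes
[~, N] = size(X);                    % number of data vectors
z = zeros(1, N);
for i = 1:N
    t = zeros(1, c);
    for j = 1:c
        % a priori probability times the class-conditional density
        t(j) = P(j) * comp_gauss_dens_val(m(:, j), S(:, :, j), X(:, i));
    end
    [~, z(i)] = max(t);              % label of the most probable class
end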

Program:

P1=0.5;                              % a priori probability of class w1
P2=0.5;                              % a priori probability of class w2
m1=[1 1]';                           % mean vector of class w1
m2=[3 3]';                           % mean vector of class w2
S=eye(2);                            % common covariance matrix (identity)
x=[1.8 1.8]';                        % data vector to be classified
p1=P1*comp_gauss_dens_val(m1,S,x)    % P(w1)*p(x|w1)
p2=P2*comp_gauss_dens_val(m2,S,x)    % P(w2)*p(x|w2)

Result:

p1 = 0.0420 p2 = 0.0189

* Since p1 > p2, x is classified to class w1.
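comp_gauss_dens_val is not defined in this document. A minimal sketch, assuming it returns the value of the multivariate Gaussian density with mean m and covariance S evaluated at the point x:

function z = comp_gauss_dens_val(m, S, x)
% Value of the Gaussian density N(m,S) at the point x (sketch).
l = length(m);                                       % dimensionality
z = (1/((2*pi)^(l/2)*sqrt(det(S)))) * exp(-0.5*(x-m)'*(S\(x-m)));

With m1=[1 1]', S=eye(2) and x=[1.8 1.8]', this evaluates to about 0.0839, so p1 = 0.5*0.0839 = 0.0420, in agreement with the result above.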

Euclidean Classifier:

Assumption:

1. All classes have equal a priori probabilities.

2. All classes are Gaussian.

3. The covariance matrix is the same for all classes.

4. The common covariance matrix is of the form s^2*I, i.e. a scaled identity matrix (see the rule below).
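Under these assumptions the Bayes rule reduces to a minimum-distance rule: assign x to class wi iff ||x - mi|| < ||x - mj|| for all j != i, i.e. to the class whose mean is nearest in Euclidean distance.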

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% FUNCTION
%   [z]=euclidean_classifier(m,X)
% Euclidean classifier for the case of c classes.
%
% INPUT ARGUMENTS:
%   m: lxc matrix, whose i-th column corresponds to the mean of the i-th
%      class.
%   X: lxN matrix whose columns are the data vectors to be classified.
%
% OUTPUT ARGUMENTS:
%   z: N-dimensional vector, whose i-th element is the label of the
%      class where the i-th data vector is classified.
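The body of euclidean_classifier is likewise not included; a minimal sketch consistent with the interface above, assigning each data vector to the class with the nearest mean:

function z = euclidean_classifier(m, X)
% Minimum Euclidean distance classifier (sketch).
[~, c] = size(m);                    % number of classes
[~, N] = size(X);                    % number of data vectors
z = zeros(1, N);
for i = 1:N
    d = zeros(1, c);
    for j = 1:c
        d(j) = norm(X(:, i) - m(:, j));   % distance to the j-th class mean
    end
    [~, z(i)] = min(d);              % label of the nearest class mean
end

For example, with the vectors defined in the program above, euclidean_classifier([m1 m2], x) would also assign x=[1.8 1.8]' to class w1, since x is closer to m1=[1 1]' than to m2=[3 3]'.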