Problem 1) (a) State Bayes' theorem for a k-class problem involving d independent features. Discuss the various probabilities arising in Bayes' theorem. Describe the likelihood-based decision rule for classification.
(b) Consider data with features x and y, randomly selected from a population comprising classes A and B, as shown in the table. What is the probability that a new sample with x=1, y=2 belongs to class B? Make only necessary assumptions and list them.
Class   Samples   x=1   x=2   y=1   y=2
A       6         4     2     5     1
B       4         2     2     3     1
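For part (b), a worked sketch of the computation. It assumes naive Bayes (features x and y conditionally independent given the class) and takes the priors as the sample proportions; both assumptions should be stated in the answer.

```python
from fractions import Fraction as F

# Counts from the table. Assumptions: x and y are conditionally independent
# given the class (naive Bayes), and priors equal the sample proportions.
p_A, p_B = F(6, 10), F(4, 10)     # priors: 6 of 10 samples are A, 4 are B
lik_A = F(4, 6) * F(1, 6)         # P(x=1|A) * P(y=2|A)
lik_B = F(2, 4) * F(1, 4)         # P(x=1|B) * P(y=2|B)

# Bayes' theorem: posterior of B given the observation (x=1, y=2)
post_B = p_B * lik_B / (p_A * lik_A + p_B * lik_B)
print(post_B)   # 3/7, i.e. about 0.43
```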
(c) List the various stages of the classifier-design cycle and describe the associated challenges to be addressed.
Problem 2) (a) What is meant by partitional clustering? Justify the following statement with an example: Forgy's algorithm groups a given set of N samples into K clusters.
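A minimal sketch of Forgy's algorithm for reference, under the assumptions that samples are points in Euclidean space and that the K initial centroids are randomly chosen samples (the Forgy initialization); the function name and parameters are illustrative.

```python
import random

def forgy(samples, k, iters=100, seed=0):
    """Sketch of Forgy's algorithm: seed K centroids from the data, then
    alternate nearest-centroid assignment and centroid recomputation."""
    rng = random.Random(seed)
    centroids = rng.sample(samples, k)   # Forgy initialization: K random samples
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in samples:
            # assign each sample to its nearest centroid (squared Euclidean distance)
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])))
            clusters[j].append(p)
        # recompute each centroid as the mean of its cluster (keep old if empty)
        new = [tuple(sum(c) / len(c) for c in zip(*cl)) if cl else centroids[i]
               for i, cl in enumerate(clusters)]
        if new == centroids:             # converged: assignments no longer change
            break
        centroids = new
    return clusters, centroids
```

On two well-separated groups of points with K=2, the procedure recovers one cluster per group after a few iterations, which illustrates the statement in the question.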
(b) Consider the two-class problem with classes A and B. A feature x is normally distributed for class A with μ_A = 0, σ_A = 1 and normally distributed for class B with μ_B = 1, σ_B = 2. What decision boundary will optimally divide the measurement x into decision regions if P(A) = P(B)? Find the decision rule based on the linear discriminant function and test the class of a random sample with x = 4.
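A sketch of the discriminant comparison for this part, using the stated parameters (A: N(0, 1), B: N(1, 2²), equal priors); the constant terms common to both classes are dropped.

```python
import math

# Log-discriminant for a univariate Gaussian class-conditional density;
# the prior term is dropped because P(A) = P(B).
def g(x, mu, sigma):
    return -math.log(sigma) - (x - mu) ** 2 / (2 * sigma ** 2)

def classify(x):
    # decide A wherever g_A(x) > g_B(x), else B
    return 'A' if g(x, 0.0, 1.0) > g(x, 1.0, 2.0) else 'B'

print(classify(4))   # x = 4 lies far in the right tail of A, so class B
```

Note that because the variances differ, setting g_A(x) = g_B(x) actually yields a quadratic boundary, 3x² + 2x − (1 + 8 ln 2) = 0, with roots near x ≈ 1.18 and x ≈ −1.85; the code above simply evaluates which discriminant is larger at a given x.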
Problem 3) (a) What is the need for a biometric system? Describe the various issues to be addressed during the design of a biometric pattern recognition/identification system.
(b) Describe the various stages of Hebb's training algorithm. Design a 2-input AND function using Hebb's rule.
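A minimal sketch of the Hebb-rule design for the 2-input AND gate, assuming bipolar inputs and targets (+1/−1), which is the standard choice because it lets the rule converge in a single pass over the patterns.

```python
# AND truth table in bipolar form: output +1 only for input (+1, +1)
patterns = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]

w1 = w2 = b = 0
for (x1, x2), t in patterns:
    # Hebb update: weight change = input * target, bias change = target
    w1 += x1 * t
    w2 += x2 * t
    b += t

def and_net(x1, x2):
    # hard-limiter activation on the net input
    return 1 if w1 * x1 + w2 * x2 + b >= 0 else -1

print(w1, w2, b)   # 2 2 -2
```

After one pass the net reproduces the AND truth table for all four bipolar input pairs.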
Problem 4) (a) With a neat diagram of the generalized architecture of a back-propagation network (BPN), describe the four major stages of the training algorithm. List the advantages and disadvantages of the algorithm.
(b) List methods for evaluating the error rate of a classifier. Describe any one method with an example.
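A sketch of one such method, leave-one-out cross-validation, using a 1-nearest-neighbour classifier on a toy one-dimensional data set; both the data and the classifier here are illustrative assumptions, not part of the question.

```python
# 1-nearest-neighbour label for x, given (value, label) training pairs
def nearest_label(x, data):
    return min(data, key=lambda d: abs(d[0] - x))[1]

data = [(0.0, 'A'), (0.2, 'A'), (0.4, 'A'),
        (1.0, 'B'), (1.2, 'B'), (1.4, 'B')]

# Leave-one-out: hold out each sample in turn, train on the rest,
# and count how often the held-out sample is misclassified.
errors = 0
for i, (x, label) in enumerate(data):
    held_out = data[:i] + data[i + 1:]
    if nearest_label(x, held_out) != label:
        errors += 1

print(errors / len(data))   # estimated error rate
```

The fraction of misclassified held-out samples is the leave-one-out estimate of the classifier's error rate; it uses nearly all the data for training at the cost of N training runs.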
Problem 5) (a) Describe briefly a statistical pattern recognition system.
(b) Describe the Hopfield net with a neat diagram.
(c) Describe briefly the minimum-variance method of clustering.