Logic minimization is used to design multiclass classifiers for machine learning, offering an alternative to a neural network. A partially defined classification function f is derived from the training set. The resulting multiclass classifier correctly classifies not only all the samples in the training set, but also many of the samples in the unseen test set. To improve the test accuracy, three steps are performed: 1) minimization of the number of variables in f; 2) minimization of the number of products in a ternary sum-of-products (SOP) expression for f; and 3) maximization of the number of literals in a ternary SOP for f. Experimental results on the MNIST and Fashion-MNIST data sets show that logic minimization improves the test accuracy. The resulting classifiers can be easily implemented with LUTs and glue logic.
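As a minimal sketch (not the paper's implementation), a ternary SOP classifier can be evaluated as follows. Each product term is written over {0, 1, -}, where '-' marks a don't care (an absent variable); a class's SOP covers an input when any of its products matches. The representation, function names, and toy terms below are illustrative assumptions.

```python
# Illustrative ternary SOP classifier sketch (names and data are hypothetical).
# A product term is a string over {'0', '1', '-'}: '0'/'1' are literals that
# must equal the input bit; '-' means the variable does not appear.

def product_matches(product: str, x: str) -> bool:
    """Return True if binary input x satisfies every literal of the product."""
    return all(p == '-' or p == b for p, b in zip(product, x))

def classify(sops: dict, x: str):
    """Return the first class whose ternary SOP covers input x, else None."""
    for label, products in sops.items():
        if any(product_matches(p, x) for p in products):
            return label
    return None

# Toy 4-variable example: class A covers inputs starting with 1,
# class B covers inputs ending with 00.
sops = {'A': ['1---'], 'B': ['--00']}
print(classify(sops, '1010'))  # -> 'A'
print(classify(sops, '0100'))  # -> 'B'
print(classify(sops, '0011'))  # -> None (uncovered input)
```

Fewer variables and products shrink the table, while literals in a product control how specifically it covers the input space, which is what the three optimization steps trade off.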