There are several well-known algorithms for minimizing deterministic finite automata (DFAs). Apart from worst-case running time analyses, however, little is known about the average-case behaviour or the practical performance of these algorithms. In this paper we compare three minimization algorithms on the basis of experimental results. The algorithms were chosen because, despite their different worst-case complexities, they are usually considered to be among the best performing in practice. We used a uniform random generator of (initially-connected) deterministic finite automata to produce the input data, so our results are statistically sound. Because one of the algorithms can also minimize non-deterministic finite automata (NFAs), we additionally developed a non-uniform random generator for NFAs. Although the results in this case are not statistically significant, they are nevertheless quite interesting.
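The abstract does not name the three algorithms compared. As one concrete illustration of what DFA minimization involves, the following is a sketch of a Moore-style partition-refinement minimizer; the function name, the dictionary-based DFA encoding, and the choice of algorithm are our own assumptions, not the paper's.

```python
def minimize_dfa(states, alphabet, delta, finals):
    """Moore-style partition refinement (illustrative sketch, not the
    paper's algorithm): repeatedly split blocks of states whose
    transitions lead to different blocks, until the partition is stable.

    states   -- iterable of state identifiers
    alphabet -- ordered list of input symbols
    delta    -- dict mapping (state, symbol) -> state (total function)
    finals   -- set of accepting states

    Returns a dict mapping each state to the index of its equivalence
    class in the minimal DFA.
    """
    # Initial partition: accepting vs. non-accepting states.
    partition = {s: (s in finals) for s in states}
    while True:
        # Signature of a state: its own block plus the blocks reached
        # on each input symbol.
        sig = {s: (partition[s],) + tuple(partition[delta[(s, a)]]
                                          for a in alphabet)
               for s in states}
        # Renumber the distinct signatures to obtain the refined partition.
        blocks = {v: i for i, v in enumerate(sorted(set(sig.values())))}
        refined = {s: blocks[sig[s]] for s in states}
        if len(set(refined.values())) == len(set(partition.values())):
            # No block was split: states sharing a block are equivalent.
            return refined
        partition = refined
```

For example, in a four-state DFA over {'a'} where states 0 and 2 (both accepting) and states 1 and 3 (both rejecting) behave identically, the sketch collapses the automaton to two equivalence classes.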