In this paper, two types of identification are analyzed with respect to the worst-case identification error, which is needed in ordinary robust control system design. We show that the worst-case identification error of the deterministic approach is considerably large when the I/O signals are corrupted by comparatively large noise. We compare it with the error of the least squares method, which converges exponentially to zero as the number of data increases when the noise obeys a stochastic process with a normal or uniform distribution, and show that the worst-case error bound of the least squares method, which is equal in size to the worst-case error of the deterministic approach, is exceedingly conservative. We point out the need for a modified robust control system design that uses multiple weights according to the different sizes of the uncertainties, and present a numerical example.
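As an illustrative sketch of the contrast described above (not the paper's actual experiment), the following simulates least squares identification of a hypothetical 2-tap FIR system under bounded uniform noise: the stochastic estimation error shrinks as the number of data grows, whereas a deterministic worst-case analysis must admit the full noise bound at every data length.

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = np.array([1.0, 0.5])  # hypothetical 2-tap FIR system

def ls_error(n):
    # identify theta from n noisy I/O samples via least squares
    Phi = rng.uniform(-1.0, 1.0, size=(n, 2))   # input regressors
    noise = rng.uniform(-0.2, 0.2, size=n)      # bounded uniform noise
    y = Phi @ theta_true + noise
    theta_hat, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return np.linalg.norm(theta_hat - theta_true)

errors = {n: ls_error(n) for n in (10, 100, 10000)}
# the stochastic LS error decreases with n, while a deterministic
# worst-case bound stays on the order of the noise bound (0.2)
# regardless of n
print(errors)
```

The gap between the shrinking stochastic error and the constant worst-case bound is what makes the deterministic bound conservative when it is used as an uncertainty weight in robust design.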