1998 Volume 11 Issue 10 Pages 585-592
A procedure is developed for designing a D/A converter via sampled-data control theory. A multirate converter with oversampling is designed to reconstruct a delayed copy of the original analog signal using H∞ and mixed H2/H∞ design methods. In spite of the multirate and delay elements, these problems are shown to be reducible to a finite-dimensional discrete-time problem and, furthermore, to linear matrix inequality (LMI) conditions. Numerical examples are presented to illustrate the results.
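As a rough illustration of the multirate oversampling structure the abstract describes, the sketch below upsamples a slow-rate sequence by a factor L and applies an FIR interpolation filter on the fast grid. The triangular (linear-interpolation) kernel used here is only a hypothetical stand-in for the H∞-optimal filter the paper actually designs; note that, as in the paper, the reconstructed signal is a delayed version of the input.

```python
import numpy as np

def oversample_dac(u, L=4, taps=None):
    """Multirate D/A front end: upsample by L, then FIR-filter.

    The default kernel is a triangular (linear-interpolation) FIR filter,
    used here as a placeholder for the H-infinity-optimal filter of the
    paper. It reconstructs the input on the fast grid with a fixed delay
    of (len(taps) - 1) / 2 fast samples.
    """
    if taps is None:
        # Triangular kernel of length 2L - 1: exact linear interpolation.
        taps = np.concatenate([np.arange(1, L + 1),
                               np.arange(L - 1, 0, -1)]) / L
    up = np.zeros(len(u) * L)
    up[::L] = u                    # zero-stuffing upsampler (rate x L)
    return np.convolve(up, taps)   # FIR interpolation/hold filter

# Example: a ramp input is reconstructed exactly (up to the delay).
y = oversample_dac(np.array([0.0, 1.0, 2.0, 3.0]), L=4)
```

For a ramp input the linear-interpolation kernel reproduces the signal exactly between samples, which makes the delayed-reconstruction behavior easy to check; an H∞-designed filter would instead minimize the worst-case analog reconstruction error over a class of band-limited inputs.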