The purpose of this study is to empirically validate course-oriented evaluation within the PDCA (Plan-Do-Check-Act) cycle, focusing in particular on feedback methods for improving online courses. The 2006 cycle (spring to fall semesters) and the 2007 cycle were compared. The 2006 evaluation activities had the following problems: (1) differences in evaluation focus between the course-implementation team and the evaluators, (2) disregard of course characteristics, and (3) ambiguity in the suggestions for course improvement. The 2007 evaluation activities were restructured to address these problems by including (1) evaluators’ firsthand experience of the online course before evaluating it and (2) practitioners’ reflection on their own course implementation. Results showed an increase in concrete suggestions and in actual course improvements under the revised 2007 evaluation activities.