Low-voltage CVT cables are used extensively and in large quantities between secondary substations and other facilities in plants and factories. Upsizing their conductors is expected to improve voltage drop and reduce distribution loss (Joule loss), thereby providing energy savings and peak power reduction. This paper discusses the relationship between the voltage drop rate and the distribution loss rate when the cables are modeled as a single load and the concept of "cable power factor (cosφ)" is introduced. It determines that, compared with the actual voltage drop rate of 3.5% (= 2.0% (main line) + 1.5% (branch line)), the distribution loss rate is slightly higher, at 4.0% (= 2.0% (main line) + 2.0% (branch line)), and demonstrates that upsized conductors could reduce the distribution loss rate from 4.0% to 1.7%, an improvement of 2.3%. It also demonstrates that, in most cases, the higher load input voltage resulting from the improved voltage drop does not increase the power consumption of the loads, because industrial loads are generally inverter-controlled, and that the distribution loss reduction achieved through conductor upsizing therefore leads directly to energy savings. Finally, the paper concludes that a quantitative estimation of the effects of conductor upsizing, using standard low-voltage CVT model circuits based on a field survey of actual cable usage in plants and factories, shows a peak power reduction rate of 2.1% and an energy saving rate of 1.8% for the entire circuit.
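The relationship summarized above can be sketched numerically. The snippet below is a minimal illustration, not the paper's actual model: it uses the standard three-phase approximations for voltage drop and Joule loss with a cable modeled as a single lumped load, and all parameters (current, voltage, length, per-km resistance and reactance for a smaller vs. a larger conductor) are assumed values chosen only to show that upsizing the conductor lowers both the voltage drop rate and the distribution loss rate.

```python
import math

def voltage_drop_rate(I, R, X, L_km, V_line, cos_phi):
    """Approximate three-phase voltage drop rate (%) for a cable
    modeled as a single lumped load at its far end.
    I: load current (A); R, X: ohm/km; L_km: length (km);
    V_line: line-to-line voltage (V); cos_phi: load power factor."""
    sin_phi = math.sqrt(1.0 - cos_phi**2)
    drop = math.sqrt(3) * I * (R * cos_phi + X * sin_phi) * L_km
    return 100.0 * drop / V_line

def loss_rate(I, R, L_km, V_line, cos_phi):
    """Distribution (Joule) loss as a percentage of delivered power."""
    p_loss = 3.0 * I**2 * R * L_km          # 3 * I^2 * R per-phase loss
    p_load = math.sqrt(3) * V_line * I * cos_phi
    return 100.0 * p_loss / p_load

# Illustrative, assumed parameters (not taken from the paper):
# 100 A load, 210 V line, 100 m run, cosφ = 0.8,
# with rough per-km resistances for a smaller vs. larger copper conductor.
for label, R in (("smaller conductor", 0.487), ("larger conductor", 0.187)):
    print(label,
          round(voltage_drop_rate(100, R, 0.09, 0.1, 210, 0.8), 2),
          round(loss_rate(100, R, 0.1, 210, 0.8), 2))
```

With these assumed figures the loss rate falls from roughly 5% to roughly 2% when the conductor is upsized, which mirrors (but does not reproduce) the paper's reported reduction from 4.0% to 1.7%.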