Reducing electric power consumption and speeding up processing have caught the interest of deep learning researchers. Quantization offers distillation mechanisms that substitute integers for floating-point numbers, but little has been suggested about the floating-point numbers themselves. The use of Q-format notation reduces computational overhead, freeing resources for the introduction of more operations. Our experiments, conditioned on varying regimes, introduce automatic differentiation on algorithms like the fast Fourier transform and Winograd minimal filtering to reduce computational complexity (expressed in the total number of MACs) and suggest a path towards the assistive intelligence concept. Empirical results show that, under specific heuristics, the Q-format number notation can overcome the shortfalls of floating-point numbers, especially for embedded systems. Further benchmarks against the FPBench standard provide more detail by comparing our proposals with common deep learning operations.
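To make the idea concrete, here is a minimal sketch of Qm.n fixed-point arithmetic, assuming a Q15 format (1 sign bit, 15 fractional bits); the format choice and helper names are illustrative assumptions, not the paper's exact configuration. It shows how a multiply-accumulate (the MAC operation whose count the abstract uses as a complexity measure) can be carried out with integer operations only.

```python
# Illustrative Q15 fixed-point arithmetic: integers stand in for floats.
# Q15 stores a real value x in [-1, 1) as round(x * 2**15).

Q = 15            # number of fractional bits
SCALE = 1 << Q    # 2**15 = 32768

def to_q15(x: float) -> int:
    """Quantize a float in [-1, 1) to a Q15 integer, with saturation."""
    return max(-SCALE, min(SCALE - 1, round(x * SCALE)))

def from_q15(q: int) -> float:
    """Dequantize a Q15 integer back to a float."""
    return q / SCALE

def q15_mul(a: int, b: int) -> int:
    """Multiply two Q15 values using only integer ops (round-to-nearest)."""
    return (a * b + (1 << (Q - 1))) >> Q

# A multiply-accumulate (MAC), the unit in which complexity is counted:
acc = q15_mul(to_q15(0.5), to_q15(0.25)) + to_q15(0.1)  # ~0.5*0.25 + 0.1
```

On embedded targets this pattern maps to native integer multiply and shift instructions, which is the source of the overhead reduction the abstract refers to.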
Maxwell's equations describe the interaction of electric and magnetic lines of force. However, Heaviside suggested that the interaction of electromagnetic waves is based on power, and that their direction is perpendicular to the two lines of force owing to the role of the Poynting vector. The interaction of electromagnetic waves is therefore analyzed using the expansion theorem derived by Heaviside. The z-transform and image-parameter theory in use today are considered to have developed from this expansion theorem. Heaviside also calculated the electromagnetic pulses (EMPs) of electrons, which gives quantum mechanics a viewpoint different from the conventional concept.