2024, Volume 15, Issue 2, pp. 473-484
A cloudlet is a cluster of computers located within the same Local Area Network (LAN) that all users connected to the network can reach in a single hop. Compared with traditional edge computing systems, in which servers are located at base stations, cloudlets enable real-time communication with lower network latency. On the other hand, their limited computing power remains a problem to be solved. Conventional research examines offloading jobs between cloudlets in order to reduce workload variance whilst ensuring that the total latency of each cloudlet remains below an acceptable level. However, because total latency is calculated as a function of the fraction of offloaded jobs, the resulting offloading decisions can be unfair, with respect to latency, to jobs that make up a smaller fraction of the system. It is therefore necessary to explore fairer offloading decisions for cloudlet systems. In this paper, we present a game-theoretic load-balancing method that prioritises latency reduction. We assess its effectiveness by comparing differences in utilisation between cloudlets.
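The abstract does not specify the game formulation, so the sketch below is only a rough illustration of the kind of interaction involved, not the paper's method. It models each cloudlet as a selfish player that chooses what fraction of its jobs to offload so as to minimise the mean latency of its own jobs, and runs best-response dynamics until no player changes strategy. The M/M/1 latency model, the arrival and service rates, the single-hop penalty `HOP_DELAY`, and the grid search over offload fractions are all illustrative assumptions.

```python
# Hypothetical parameters (not from the paper): three cloudlets with
# given job arrival and service rates; latency follows an M/M/1 model.
ARRIVALS = [8.0, 3.0, 2.0]     # jobs/s originating at each cloudlet
SERVICE = [10.0, 10.0, 10.0]   # jobs/s each cloudlet can process
HOP_DELAY = 0.02               # assumed extra latency (s) per offloaded job

def effective_load(offload):
    """Arrival rate actually served at each cloudlet, given
    offload[i][j] = fraction of i's jobs routed to j."""
    n = len(ARRIVALS)
    load = [0.0] * n
    for i in range(n):
        for j in range(n):
            load[j] += ARRIVALS[i] * offload[i][j]
    return load

def sojourn(load, j):
    """M/M/1 mean latency at cloudlet j; infinite if overloaded."""
    return 1.0 / (SERVICE[j] - load[j]) if load[j] < SERVICE[j] else float("inf")

def player_cost(i, offload):
    """Average latency experienced by cloudlet i's own jobs."""
    load = effective_load(offload)
    cost = 0.0
    for j, frac in enumerate(offload[i]):
        if frac > 0:
            cost += frac * (sojourn(load, j) + (HOP_DELAY if j != i else 0.0))
    return cost

def best_response(i, offload, grid=21):
    """Grid-search the split that minimises player i's own cost,
    holding every other player's strategy fixed."""
    n = len(ARRIVALS)
    best = None
    for j in range(n):                       # one offload target at a time
        for k in range(grid):
            f = k / (grid - 1)
            trial = [row[:] for row in offload]
            trial[i] = [0.0] * n
            trial[i][i] = 1.0 - f
            trial[i][j] += f
            c = player_cost(i, trial)
            if best is None or c < best[0]:
                best = (c, trial[i])
    return best[1]

# Best-response dynamics: players update in turn until no one moves.
n = len(ARRIVALS)
offload = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
for _ in range(50):
    changed = False
    for i in range(n):
        new_row = best_response(i, offload)
        if new_row != offload[i]:
            offload[i], changed = new_row, True
    if not changed:
        break

load = effective_load(offload)
util = [load[j] / SERVICE[j] for j in range(n)]
print("utilisation per cloudlet:", [round(u, 3) for u in util])
print("utilisation spread:", round(max(util) - min(util), 3))
```

The printed utilisation spread mirrors the evaluation criterion named in the abstract: comparing differences in utilisation between cloudlets after the game reaches a stable assignment.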