Human visual inspection still plays a major role in industrial inspection processes. Many studies have focused on the inspection accuracy or visual fatigue of inspectors, but the upper limb load caused by maintaining posture and grasping an object is also a problem in visual inspection. If the inspection object is light in weight but large in size (e.g., a plastic part), inspectors tend to adopt an awkward posture because the object is difficult to handle. Despite this, few studies have elucidated the effect of upper limb load. In this study, we therefore aim to clarify the effect of inspection speed and scanning direction on upper limb load. In the experiment, ten healthy male subjects were asked to inspect objects under combinations of four speed conditions and two scanning direction conditions. The speed conditions were 1.25, 1.00, and 0.75 s per inspection point plus a maximum-effort speed, and the scanning direction conditions were the vertical and horizontal directions. Muscle activity and grasp force were used as evaluation indices, and subjective indices of burden and task difficulty were also investigated. Electromyography was performed on the sternomastoid muscle, the middle and anterior parts of the deltoid muscle, the clavicular part of the pectoralis major muscle, the biceps brachii muscle, the flexor carpi radialis muscle, the extensor carpi ulnaris muscle, and the flexor digitorum superficialis muscle. The results showed that the burden on the muscles operating the cervical and wrist joints is high, and that the finger load due to the grasping motion differs depending on the hand region.
This paper proves that the parameter estimation problem for the generalized nested logit (GNL) models used in marketing science and transportation planning is equivalent to a constrained information minimization problem at the disaggregate level. Specifically, the equivalence between the log-likelihood maximization problem of the GNL model and the information minimization problem is proved via a two-stage optimization problem, in which the parameters of the deterministic utility functions and the allocation parameters correspond to the alternative level, and the similarity parameters correspond to the nest level. In the course of the proof, we show that constraints on the allocation parameters arise naturally in the log-likelihood maximization problem of the GNL model. Using the properties of the allocation parameters, we present new parameter estimation methods for the GNL model that rectify the heuristic methods proposed by Vovsha. First, we propose an estimation method that uses the parameters obtained by two-stage estimation as initial values for simultaneous estimation. Second, we propose a method based on the primal-dual interior point method, which exploits duality in each stage of the two-stage estimation of the GNL model. The GNL model includes the multinomial logit, nested logit, cross-nested logit, and paired combinatorial logit models, and the equivalence between all these models and entropy models is also proved in this paper.
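For concreteness, the GNL choice probability that underlies the log-likelihood discussed above can be sketched as follows. This is a minimal illustration with hypothetical toy utilities, allocation parameters, and similarity parameters, not the paper's estimation code:

```python
import math

def gnl_probs(V, alpha, mu):
    """Generalized nested logit choice probabilities.

    V:     list of J deterministic utilities.
    alpha: J x M allocation parameters (each alternative's row sums to 1).
    mu:    list of M similarity (nest scale) parameters in (0, 1].
    Returns P(i) = sum_m P(m) * P(i | m).
    """
    J, M = len(V), len(mu)
    # Nest-specific weighted utilities: (alpha_im * exp(V_i)) ** (1 / mu_m)
    y = [[(alpha[j][m] * math.exp(V[j])) ** (1.0 / mu[m]) for m in range(M)]
         for j in range(J)]
    S = [sum(y[j][m] for j in range(J)) for m in range(M)]   # nest sums
    denom = sum(S[m] ** mu[m] for m in range(M))
    P_nest = [S[m] ** mu[m] / denom for m in range(M)]       # P(m)
    # P(i | m) = y_im / S_m, marginalized over nests
    return [sum((y[j][m] / S[m]) * P_nest[m] for m in range(M))
            for j in range(J)]
```

With a single nest and mu = 1, the formula reduces to the multinomial logit, consistent with the special cases listed in the abstract.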
The distributed development model has been successfully adopted in open source projects such as the GNU/Linux operating system and the Apache HTTP server. However, poor handling of quality and customer support hinders the progress of open source software (OSS). We focus on the problem of software quality. OSS systems are key components of critical infrastructures in society and continue to expand; many OSS projects, such as Firefox, the Apache HTTP server, Linux, and Android, are developed in all parts of the world. In particular, large-scale open source solutions composed of several OSS components are now attracting attention as a next-generation software development paradigm because of cost reduction, quick delivery, and reduced workload. In this paper, we propose a new approach to software quality assessment based on stochastic differential equations that takes into account the collision status in the binding phase of OSS components. Notably, we derive several quality assessment measures that consider software service performance. In addition, we analyze actual software fault-count data to show numerical examples of software quality assessment considering component collision for several OSS projects. Moreover, we show that the proposed method can assist in improving the quality of large-scale open source solutions.
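SDE-based fault-count modeling of this kind can be illustrated with a generic exponential-type model, dN = b(a − N)dt + σ(a − N)dW, simulated with the Euler-Maruyama scheme. The parameter values below are hypothetical and the model is a common textbook form, not necessarily the exact equations of the paper:

```python
import math
import random

def simulate_fault_count(a=100.0, b=0.1, sigma=0.05, T=50.0, steps=500, seed=0):
    """Euler-Maruyama simulation of a simple SDE fault-detection model:
        dN = b * (a - N) dt + sigma * (a - N) dW
    where a is the total (expected) fault content, b the detection rate,
    and sigma the irregular-fluctuation magnitude. Illustrative only."""
    rng = random.Random(seed)
    dt = T / steps
    N = 0.0
    path = [N]
    for _ in range(steps):
        dW = rng.gauss(0.0, math.sqrt(dt))          # Brownian increment
        N += b * (a - N) * dt + sigma * (a - N) * dW
        N = min(max(N, 0.0), a)                     # keep the count in [0, a]
        path.append(N)
    return path
```

The drift term pulls the cumulative fault count toward the fault content a, while the diffusion term models the irregular fluctuation that a deterministic software reliability growth model cannot capture.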
The objectives of grasping service productivity are (1) to grasp performance quantitatively, (2) to grasp the gap between the service provided and the quality of the service perceived by customers, and (3) to connect the data to the improvement of service productivity. This is one of the roles that industrial engineering should assume. Although improvement in service productivity is required, techniques and guidelines that lead to specific plans for improving service productivity have not been sufficiently studied. In this paper, a technique for deriving a specific improvement plan is proposed; it grasps the causes of low service productivity in each business process of a company or service activity and applies multiple regression analysis to information from survey data. In addition, the proposed technique is verified in the izakaya (Japanese pub) industry as a representative of face-to-face services.
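The multiple regression step can be sketched as follows for a two-predictor case. The variables are hypothetical stand-ins (e.g., a survey-based service-quality score and a staffing level explaining a productivity measure), not the paper's actual data:

```python
def ols_two_predictors(y, x1, x2):
    """Fit y = b0 + b1*x1 + b2*x2 by ordinary least squares, solving the
    3x3 normal equations with Cramer's rule (illustrative; real analyses
    would use a statistics library and check significance of coefficients)."""
    n = len(y)
    X = [[1.0, a, b] for a, b in zip(x1, x2)]
    # Normal equations: (X'X) beta = X'y
    A = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(3)]
         for r in range(3)]
    v = [sum(X[i][r] * y[i] for i in range(n)) for r in range(3)]

    def det3(M):
        return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
              - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
              + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

    D = det3(A)
    coefs = []
    for k in range(3):  # replace column k of A by v and take the determinant
        Ak = [[v[r] if c == k else A[r][c] for c in range(3)] for r in range(3)]
        coefs.append(det3(Ak) / D)
    return coefs  # [b0, b1, b2]
```

The fitted coefficients indicate which business-process factors most strongly explain low productivity, which is what the proposed technique uses to derive a specific improvement plan.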
The role of services in fulfilling various customer needs has become more important. To satisfy these needs with limited resources, cooperation with companies or consumers is considered an effective approach. Meanwhile, few approaches are available to analyze the requirements of stakeholders, including cooperative partners, and to design cooperative services based on these requirements. Therefore, the authors introduce a modeling method that visualizes stakeholders' requirements and the service processes that satisfy them, together with a service evaluation method based on this model, for the design of cooperative services. In addition, the authors propose a design process for cooperative services using these methods.
This paper deals with the problem of evaluating the profitability and safety of manufacturing investment alternatives under uncertainty. A method for comparing the economic superiority of alternatives is discussed under the assumption that the sales price and sales volume (i.e., production volume) of each product, as well as the initial investment cost, variable cost, and annual fixed cost of each investment alternative, are given as uncertain factors. In this research area, the single-item problem, in which several manufacturing alternatives are given for producing one product, has already been discussed, and an analysis chart for identifying the profitability and safety of alternatives has been developed. In order to extend the model to the multi-item problem, this paper first clarifies the property that allows the analysis chart of the previous research to be transformed into another chart on which the analysis can be conducted in the same manner. The paper then proposes an analysis method and analysis chart for the multi-item problem using this property. The proposed analysis chart can visually show which alternative is safer against the uncertain fluctuation of each parameter (sales price, sales volume, variable cost, and annual fixed cost). On this basis, the paper presents a method for analyzing the risk of alternatives against parameter fluctuations in the multi-item problem, introducing the concept of safety. The proposed method can assist decision-making when selecting a manufacturing investment project from several proposals.
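In the single-item case, the kind of comparison behind such an analysis chart can be illustrated with two hypothetical alternatives (the numbers are made up): a low-fixed-cost, high-variable-cost alternative is safer at low sales volumes, and the indifference volume marks where the profit ranking flips:

```python
def profit(price, variable_cost, fixed_cost, volume):
    """Annual profit of one manufacturing alternative for a single product."""
    return (price - variable_cost) * volume - fixed_cost

def indifference_volume(v1, f1, v2, f2):
    """Sales volume at which two alternatives yield equal profit
    (single-item sketch; the paper's method extends such comparisons
    to the multi-item problem)."""
    return (f1 - f2) / (v2 - v1)

# Hypothetical alternatives at a unit price of 100:
# alternative 1: variable cost 60, fixed cost 10,000 (safer at low volume)
# alternative 2: variable cost 40, fixed cost 20,000 (better at high volume)
q_star = indifference_volume(60.0, 10000.0, 40.0, 20000.0)  # 500 units
```

Below q_star the lower-fixed-cost alternative dominates, above it the lower-variable-cost one does, which is the intuition the analysis chart makes visible against each uncertain parameter.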
Publications such as books, magazines, music compact discs, and newspapers in Japan are traded under the resale price maintenance and consignment sale systems. Under these systems, the retail price is set uniformly nationwide, and dead stock is repurchased at the same price as the wholesale price. Therefore, bookstores in Japan have not been required to develop an aggressive sales policy. On the other hand, a main policy of publishing companies is to deliver a large number of books in order to secure sales space in bookstores. As a consequence, in recent years the return rate of books has reached approximately 40 percent of the quantity published, which has become a serious managerial and environmental problem. Under these circumstances, a new sales system is being considered in which the wholesale price is lowered while the buyback price is cut. In this case, both the wholesale price and the buyback price are important contract parameters and should be agreed on appropriately between the publishing company and the bookstore. In this article, the supply chain contract problem between a bookstore and a publishing company is formulated. In particular, we assume that an accurate demand distribution is unknown, because books have long product lifecycles and are unique products. Accordingly, we apply the distribution-free approach, which uses only limited information on the mean and variance of demand, to solve this problem. A new decision procedure for the contract parameters, namely the wholesale price and buyback price, is then discussed and suggested using the conventional coordination approach and game theory.
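A common instantiation of the distribution-free approach is Scarf's ordering rule, which requires only the mean and standard deviation of demand. The sketch below uses hypothetical price parameters and fixes the contract terms as given; it is not the paper's model, which additionally determines the wholesale and buyback prices themselves:

```python
import math

def scarf_order_quantity(mu, sigma, price, wholesale, buyback):
    """Scarf's distribution-free (worst-case-optimal) order quantity, using
    only the mean mu and standard deviation sigma of demand.

    Underage cost cu = price - wholesale (margin lost per unmet demand unit),
    overage cost  co = wholesale - buyback (loss per returned copy)."""
    cu = price - wholesale
    co = wholesale - buyback
    return mu + (sigma / 2.0) * (math.sqrt(cu / co) - math.sqrt(co / cu))

# Hypothetical book contract: mean demand 1000, std. dev. 200,
# retail price 1500, wholesale price 1100, buyback price 900.
q = scarf_order_quantity(1000.0, 200.0, 1500.0, 1100.0, 900.0)
```

The bookstore orders above the mean when the underage cost exceeds the overage cost and below it otherwise, which is exactly the trade-off the wholesale and buyback prices control in the contract negotiation.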
The GRS test is a popular statistical method for evaluating asset pricing models. It evaluates a model by testing for unexplained excess returns of the test portfolios (i.e., whether the alphas are zero or not). Because of a bias that inflates the explanatory power of a model when the test portfolios are highly correlated with the model's factors, it is recommended to adopt test portfolios constructed by cluster analysis in the GRS test for asset pricing models. In this research, we perform the GRS test for asset pricing models using Japanese equity market data and, surprisingly, find that the models tend not to be rejected when we adopt the test portfolios constructed by cluster analysis. This result is totally different from that of the preceding research using US equity market data. We examine the reasons behind this empirical result through simulation analysis. Our simulation suggests that asset pricing models sometimes may not be correctly evaluated when we rely only on the result of the GRS test, and that the magnitude of the average intercept and the coefficient of determination should also be used to supplement the evaluation.
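For a single test portfolio and one factor, the GRS statistic reduces to a scaled squared intercept. The sketch below uses this N = 1, K = 1 simplification (the general statistic replaces the scalar variances with the residual covariance matrix of N portfolios); the return series in the test are made up, not market data:

```python
def grs_single(asset_excess, factor):
    """GRS statistic for one test portfolio (N = 1) and one factor (K = 1):

        GRS = (T - 2) * alpha^2 / (s2 * (1 + fbar^2 / var_f))

    which follows F(1, T-2) under the null alpha = 0. Illustrative
    simplification of the general N-asset, K-factor statistic."""
    T = len(factor)
    fbar = sum(factor) / T
    rbar = sum(asset_excess) / T
    var_f = sum((f - fbar) ** 2 for f in factor) / T              # MLE variance
    cov = sum((f - fbar) * (r - rbar)
              for f, r in zip(factor, asset_excess)) / T
    beta = cov / var_f                                            # OLS slope
    alpha = rbar - beta * fbar                                    # intercept
    resid = [r - alpha - beta * f for r, f in zip(asset_excess, factor)]
    s2 = sum(e * e for e in resid) / T                            # residual var
    return (T - 2) * alpha ** 2 / (s2 * (1 + fbar ** 2 / var_f))
```

A larger absolute intercept relative to the residual variance produces a larger statistic and hence a rejection of the pricing model, which is why the abstract recommends also inspecting the average intercept and the coefficient of determination directly.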
In recent years, information systems have become important in many corporations. The maintenance process of an information system costs more than its development process. One apparent cause is that the system is not delivered by the appointed date, so part of the development process is carried out during the maintenance process, which increases the maintenance cost. However, it is not clear how a development project with a delayed delivery date influences the maintenance process. The main reasons for the increase in maintenance cost appear to be (1) an increasing number of user inquiries about software operation, data validity, and software bugs, and (2) a long processing time for these inquiries. However, it is not obvious what influence these factors have on the increased cost of the maintenance process. In this study, in order to enable proper countermeasures, the tendencies in the occurrence of resolved user inquiries are analyzed, using actual data from a development site.
To realize a sustainable society, there is an urgent need to find solutions and appropriate business models. In this context, services are becoming increasingly important in the manufacturing industry, since a longer life or greater added value of a product can be achieved by offering relevant services combined with the product. The authors carried out research on service design as part of the service engineering forum (SEFORUM), an industry-university cooperative consortium established in 2002 to discuss service design methodology. In the SEFORUM, a project to improve a construction/maintenance service for power supply facilities was carried out from 2008 to 2011. This paper reports the achievements of the project. Through the project, the authors applied two methods, developed by the authors, to support receiver-oriented service improvement. The first is a method that enables service designers to acquire the receiver's requirements structure. The second is a method that identifies the service improvement plans that should be conducted preferentially; its concept is to use an optimum resource allocation method, whereby improvement plans are prioritized from the viewpoint of maximizing customer satisfaction. The application results show that the first method helps acquire the receiver's requirements structure easily in a stepwise manner, and that the second method provides clear criteria that support designers' decision-making in service improvement design.