The Mahalanobis-Taguchi (MT) system is a widely used family of quality engineering methods (Taguchi methods). Among the MT systems, Taguchi's T-method is suited to regression problems, and an improved version of the T-method, called the Ta-method, has also been proposed. This study examined the effect of variable selection on the Ta-method. Specifically, we consider two lasso-based single-regression approaches for variable selection before applying the Ta-method and evaluate their performance using Monte Carlo simulations.
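The per-variable lasso screening step described above can be sketched as follows. This is an illustrative assumption, not the study's actual procedure: a single-variable lasso reduces to soft-thresholding the univariate OLS slope, and the function name and threshold value are hypothetical.

```python
import numpy as np

def lasso_single_regression_select(X, y, lam):
    """Toy variable selection: fit a lasso to each standardized variable
    separately (soft-thresholded univariate OLS slope) and keep the
    variables whose coefficient survives the threshold."""
    n, p = X.shape
    Xs = (X - X.mean(0)) / X.std(0)           # standardize predictors
    yc = y - y.mean()                          # center response
    selected = []
    for j in range(p):
        beta_ols = Xs[:, j] @ yc / n           # univariate OLS slope
        beta = np.sign(beta_ols) * max(abs(beta_ols) - lam, 0.0)  # soft threshold
        if beta != 0.0:
            selected.append(j)
    return selected

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
y = 2.0 * X[:, 0] + 0.1 * rng.standard_normal(200)
sel = lasso_single_regression_select(X, y, lam=0.5)  # only variable 0 survives
```

The surviving subset would then be passed to the Ta-method; the Monte Carlo evaluation in the study repeats this over simulated datasets.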
The Mahalanobis-Taguchi (MT) method is used for pattern recognition and anomaly detection. It defines a population homogeneous with the objective as the unit space and discriminates using the Mahalanobis distance from its center. However, a problem with this method is that the number of samples must be greater than the number of variables in the data; this requirement stems from the calculation of the Mahalanobis distance, which needs an invertible covariance matrix. To address this, the MT-bagging method, which applies feature bagging to the MT method, was proposed. This study proposes two methods that apply ensemble pruning to MT-bagging: one based on ordering and the other on clustering. In the ordering-based method, the signal-to-noise ratio of the training abnormality data is calculated for each weak learner, and only the weak learners with high signal-to-noise ratios are ensembled. In the clustering-based method, weak learners are clustered using the k-means method on the Mahalanobis distances of the training anomaly data, and the centers of the clusters are ensembled. Breast cancer and dry bean datasets are used to verify the performance of the proposed methods. Both methods outperformed the MT and MT-bagging methods in terms of abnormality discrimination accuracy, suggesting that ensemble pruning is effective in some situations.
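A minimal sketch of the ordering-based pruning idea: each weak learner is an MT model on a random feature subset, learners are ranked by the larger-the-better SN ratio of the anomaly-data Mahalanobis distances, and only the top few are kept. Function names, ensemble sizes, and the SN-ratio form −10·log₁₀(mean(1/d²)) are assumptions for illustration, not the paper's exact settings.

```python
import numpy as np

def fit_unit_space(X):
    """Fit a unit space: mean and (pseudo-)inverse covariance of normal data."""
    return X.mean(0), np.linalg.pinv(np.cov(X, rowvar=False))

def mahalanobis_d2(X, mu, cov_inv):
    """Squared Mahalanobis distance, scaled by the number of variables."""
    diff = X - mu
    return np.einsum('ij,jk,ik->i', diff, cov_inv, diff) / X.shape[1]

def mt_bagging_prune(X_normal, X_anomaly, n_learners=20, n_feats=3,
                     keep=5, seed=0):
    """MT-bagging with ordering-based pruning (sketch)."""
    rng = np.random.default_rng(seed)
    p = X_normal.shape[1]
    learners, sn = [], []
    for _ in range(n_learners):
        feats = rng.choice(p, size=n_feats, replace=False)  # feature bagging
        mu, ci = fit_unit_space(X_normal[:, feats])
        d2 = mahalanobis_d2(X_anomaly[:, feats], mu, ci)
        sn.append(-10 * np.log10(np.mean(1.0 / d2)))  # larger-the-better SN ratio
        learners.append((feats, mu, ci))
    order = np.argsort(sn)[::-1]                       # rank by SN ratio
    return [learners[i] for i in order[:keep]]         # keep the best learners
```

Discrimination then averages (or votes over) the retained learners' distances, which is what the clustering-based variant replaces with cluster-center learners.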
Training deep neural networks (DNNs) requires large amounts of data. However, the automotive components targeted in this research suffer from an extreme lack of defective-product data due to rapid model changes and a low defective-product rate in the manufacturing process. Additionally, the anomaly areas are extremely small. Data augmentation (DA), which increases data through image transformations, is a method for addressing data deficiency; in particular, deep convolutional generative adversarial networks (DCGANs) are frequently employed in the medical field. However, DA has been shown to be effective not for small anomalies but for images in which the classification target accounts for a large percentage of the total image.
Therefore, in the present study, we seek a DA method suited to objects with very small anomaly areas. We combine existing DA methods with a DCGAN, because a DCGAN alone cannot be used when target images are few. After increasing the anomaly-area data to some extent using existing methods, we increase it further using the DCGAN and then paste the completed defects onto the component images.
The classification performance using DA alone, which modifies size, shape, and the like based on the inspector's experience, yielded a recall of 76.9%, whereas combining DA with the DCGAN yielded a lower recall of 65.4%.
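The final "paste" step of the pipeline above can be sketched as follows. The grayscale array representation, the blending-free paste, and the function name are assumptions for illustration; a real pipeline would typically blend edges and track the paste location as a label.

```python
import numpy as np

def paste_defect(component, defect, rng):
    """Paste a (previously augmented or DCGAN-generated) defect patch onto
    a clean component image at a random location. Grayscale numpy arrays."""
    H, W = component.shape
    h, w = defect.shape
    y = rng.integers(0, H - h + 1)   # random top-left corner
    x = rng.integers(0, W - w + 1)
    out = component.copy()           # leave the original image intact
    out[y:y + h, x:x + w] = defect   # naive paste; real use may blend edges
    return out

rng = np.random.default_rng(0)
component = np.zeros((32, 32))
defect = np.ones((4, 4))
synthetic = paste_defect(component, defect, rng)
```

Repeating this over many clean component images multiplies the effective number of defective samples available for training.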
This research focused on a single-unit system that deteriorates according to a Wiener process. The deterioration state is completely observed by sensors at equally spaced time intervals. The decision-maker must select one of three actions: do nothing, preventive replacement, or corrective replacement. Most recent research on condition-based maintenance (CBM) has assumed that the deterioration process is the same across the target unit population. However, spare units for replacement may deteriorate at different rates due to population heterogeneity, which is reflected in the assumption that the spare unit population consists of several subpopulations. The deterioration process is characterized by the current type probability vector and deterioration state. On the basis of this population heterogeneity, the optimal decision-making problem of CBM is formulated as a Markov decision process. The objective is to minimize the total expected discounted cost over an infinite horizon. This research proved that the total expected discounted cost is monotonically non-decreasing in the type probability vector and deterioration state, and that the optimal maintenance policy is a control-limit policy. A numerical example demonstrated that, when the spare unit population is heterogeneous, the total expected discounted cost obtained by considering population heterogeneity is lower than that obtained without considering it. Sensitivity analysis is conducted on the diffusion coefficient of the Wiener process and the spare unit population distribution.
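The control-limit structure of the policy can be illustrated with a simple simulation. All parameter values are illustrative assumptions, and the sketch models a single homogeneous unit, not the paper's heterogeneous-population MDP with a type probability vector.

```python
import numpy as np

def simulate_cbm(mu=0.5, sigma=0.3, dt=1.0, failure_level=10.0,
                 pm_limit=7.0, horizon=200, seed=0):
    """Simulate a unit degrading as a Wiener process X(t) with drift mu and
    diffusion sigma, inspected every dt, under a control-limit policy:
    corrective replacement if X >= failure_level, preventive replacement if
    X >= pm_limit, otherwise do nothing. Returns the action counts."""
    rng = np.random.default_rng(seed)
    x, pm, cm = 0.0, 0, 0
    for _ in range(horizon):
        # Wiener increment over one inspection interval
        x += mu * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        if x >= failure_level:
            cm += 1
            x = 0.0        # corrective replacement: as-good-as-new
        elif x >= pm_limit:
            pm += 1
            x = 0.0        # preventive replacement: as-good-as-new
        # else: do nothing
    return pm, cm

pm, cm = simulate_cbm()
```

Lowering `pm_limit` trades more frequent (cheap) preventive replacements against fewer (expensive) corrective ones; the paper's result is that the optimal limit exists and depends monotonically on the type probability vector.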
Control charts are a typical method of statistical process control. They arose in the context of low-mix, high-volume production. However, high-mix, low-volume production has become mainstream as needs diversify, and parameter estimation accuracy has decreased because obtaining sufficient measurements for each product is difficult. Performing process control using multivariate characteristics has therefore become challenging. The Hotelling T² control chart is widely used for managing multivariate control characteristics, and several multivariate control charts based on it have emerged, but these methods work only with a sufficient number of samples, more than in the univariate case. Because conventional control charts are thus not applicable to low-volume production, research on control charts based on Bayesian statistics has been conducted, and their usefulness has been demonstrated. Although previous studies have proposed control charts using hierarchical Bayesian modeling, these charts do not handle multivariate data. Therefore, in this study, we propose multivariate hierarchical Bayesian control charts that can accommodate multivariate characteristics. By developing a hierarchical Bayesian model that accounts for differences among product types, estimation accuracy can be improved even in high-mix, low-volume production. In a simulation analysis, the proposed method outperformed the T² control chart in a high-mix, low-volume production environment, and its performance improved as the number of product types increased.
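A minimal computation of the baseline Hotelling T² statistic is shown below; the Phase I / Phase II split and sample sizes are illustrative, and the F-distribution-based control limit is omitted.

```python
import numpy as np

def t2_statistics(X_phase1, X_phase2):
    """Hotelling T^2 statistics for new (Phase II) observations against a
    Phase I reference sample: T^2 = (x - mu)' S^{-1} (x - mu)."""
    mu = X_phase1.mean(0)
    S_inv = np.linalg.inv(np.cov(X_phase1, rowvar=False))
    diff = X_phase2 - mu
    return np.einsum('ij,jk,ik->i', diff, S_inv, diff)

rng = np.random.default_rng(0)
X1 = rng.standard_normal((100, 3))                       # in-control reference
X2 = np.vstack([rng.standard_normal((5, 3)),             # in-control points
                np.full((1, 3), 5.0)])                   # shifted point
t2 = t2_statistics(X1, X2)                               # last value is largest
```

The sample-size limitation discussed above is visible here: `np.cov` must be invertible, which requires more Phase I samples than variables; the proposed hierarchical Bayesian charts instead pool information across product types.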
Face-to-face interviews suffer from the problem that recruitment evaluations depend on the individual skill of the interviewer. Video interviewing is one potential solution, as it enables quality control over evaluations. However, there has been little research into using information gathered from video interviews to generate recruitment evaluation scores. In this paper, we investigate whether video interviews enable impressions to be formed of interviewees' personalities, as outward impressions are more important than actual personality in recruitment interviews and in the service industry. We use the "Big Five" personality traits and integrate scores using item response theory and the mean of standardized values. We also categorize personality impression items into three groups: items to which raters give different scores, items to which raters give similar scores, and items that cannot be evaluated in a video interview. Regression analysis shows that items of the first two types explain 68.2% of the variance in hiring decisions. Impressions of Extraversion were often observed in the videos and affected the Smile and Voice Tone evaluations used in recruitment. Impressions of Openness and Dutifulness had positive effects on hiring decisions. This research indicates that it is possible to obtain personality scores that support recruitment activities from video interviews via, for example, machine learning.
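The "mean of standardized values" integration mentioned above can be sketched as follows; this is a toy stand-in (the study's item-response-theory integration is not shown), and the array layout is an assumption.

```python
import numpy as np

def integrate_scores(ratings):
    """Integrate raters' impression scores by z-scoring each rater's column
    (removing scale differences between raters) and averaging across raters.
    `ratings` has shape (n_interviewees, n_raters)."""
    z = (ratings - ratings.mean(0)) / ratings.std(0)  # per-rater z-scores
    return z.mean(1)                                   # mean across raters

# Two raters on different scales but with the same ranking
ratings = np.array([[1.0, 10.0],
                    [2.0, 20.0],
                    [3.0, 30.0]])
integrated = integrate_scores(ratings)   # preserves the common ranking
```

Standardizing first prevents a rater who uses a wider numeric scale from dominating the combined score.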