IEICE Transactions on Information and Systems
Online ISSN : 1745-1361
Print ISSN : 0916-8532
Volume E92.D, Issue 1
Displaying 1-14 of 14 articles from this issue
Regular Section
  • Shigero SASAKI, Atsuhiro TANAKA
    Article type: PAPER
    Subject area: Computer Systems
    2009 Volume E92.D Issue 1 Pages 1-9
    Published: January 01, 2009
    Released on J-STAGE: January 01, 2009
    JOURNAL FREE ACCESS
    Cluster systems are prevalent infrastructures for offering e-services because of their cost-effectiveness. The objective of our research is to enhance their cost-effectiveness by reducing the minimum number of nodes needed to meet a given target performance. To achieve this objective, we propose a load balancing algorithm, the Nearest Underloaded algorithm (N algorithm). The N algorithm aims to quickly resolve the load imbalance caused by request departures while also preventing the herd effect. The performance index in our evaluation is the xth percentile capacity, which we define based on throughputs and the xth percentile response times. We measured the capacity of 8- to 16-node cluster systems under the N algorithm and existing Least-Loaded (LL) algorithms, which dispatch or transfer requests to the least-loaded node. We found that the N algorithm can achieve larger capacity, or achieve the target capacity with fewer nodes, than the LL algorithms.
    Download PDF (562K)
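To make the dispatch idea in the abstract above concrete, here is a minimal sketch contrasting Least-Loaded selection with a "nearest underloaded" search. The ring layout, load threshold, and fallback to the home node are illustrative assumptions; the paper defines its own algorithm and its own capacity metric.

```python
# Illustrative sketch only: "nearest underloaded" is modelled here as walking a
# ring of nodes from a request's home node and picking the first node whose
# load is below a threshold. Threshold and ring layout are assumptions.

def least_loaded(loads):
    """Classic LL dispatch: always pick the node with the smallest load."""
    return min(range(len(loads)), key=lambda i: loads[i])

def nearest_underloaded(loads, home, threshold):
    """Pick the closest node (ring distance from `home`) whose load < threshold.

    Falls back to the home node if every node is at or above the threshold,
    which avoids the herd effect of many dispatchers targeting one LL node.
    """
    n = len(loads)
    for dist in range(n):
        cand = (home + dist) % n
        if loads[cand] < threshold:
            return cand
    return home

if __name__ == "__main__":
    loads = [5, 2, 7, 1, 6, 6, 3, 4]                         # outstanding requests per node
    print(least_loaded(loads))                               # -> 3 (global minimum)
    print(nearest_underloaded(loads, home=4, threshold=4))   # -> 6 (first node < 4 near node 4)
```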
  • Jungja KIM, Heetaek CEONG, Yonggwan WON
    Article type: PAPER
    Subject area: Data Mining
    2009 Volume E92.D Issue 1 Pages 10-15
    Published: January 01, 2009
    Released on J-STAGE: January 01, 2009
    JOURNAL FREE ACCESS
    In market-basket analysis, weighted association rule (WAR) discovery can mine rules that include more beneficial information by reflecting item importance for special products. In a point-of-sale database, each transaction is composed of items with similar properties, and item weights are pre-defined and fixed by a factor such as profit. However, when items are divided into more than one group and item importance must be measured independently for each group, traditional weighted association rule discovery cannot be used. To solve this problem, we propose a new weighted association rule mining methodology. The items are first divided into subgroups according to their properties, and the item importance, i.e., the item weight, is defined or calculated only with the items included in the subgroup. Then, the transaction weight is measured by appropriately summing the item weights from each subgroup, and the weighted support is computed as the fraction of the transaction weights that contain the candidate items relative to the weight of all transactions. As an example, the proposed methodology is applied to assessing the vulnerability to threats of computer systems that provide networked services. Our algorithm provides both quantitative risk-level values and qualitative risk rules for the security assessment of networked computer systems using WAR discovery. It can also be widely used for new applications with many data sets in which the data items are distinctly separated.
    Download PDF (807K)
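A minimal sketch of the subgroup-based weighted support described in the entry above. The subgroup names, the item weights, and the plain sum used to combine subgroup weights into a transaction weight are illustrative assumptions; the paper leaves the exact combination to the application.

```python
# Toy weighted-support computation with per-subgroup item weights (assumed values).

def transaction_weight(transaction, weights_by_group):
    """Sum per-subgroup item weights for one transaction (a set of items)."""
    return sum(
        w
        for group in weights_by_group.values()
        for item, w in group.items()
        if item in transaction
    )

def weighted_support(candidate, transactions, weights_by_group):
    """Weight of transactions containing `candidate`, over the total weight."""
    total = sum(transaction_weight(t, weights_by_group) for t in transactions)
    hit = sum(
        transaction_weight(t, weights_by_group)
        for t in transactions
        if candidate <= t            # candidate itemset fully contained
    )
    return hit / total if total else 0.0

if __name__ == "__main__":
    # Two subgroups with independently defined item weights (invented example).
    weights = {
        "network": {"open_port": 0.8, "weak_cipher": 0.6},
        "host":    {"old_kernel": 0.9, "guest_account": 0.4},
    }
    txs = [
        {"open_port", "old_kernel"},
        {"weak_cipher"},
        {"open_port", "guest_account"},
    ]
    print(weighted_support({"open_port"}, txs, weights))   # ~0.83
```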
  • Marat ZHANIKEEV, Yoshiaki TANAKA
    Article type: PAPER
    Subject area: Networks
    2009 Volume E92.D Issue 1 Pages 16-23
    Published: January 01, 2009
    Released on J-STAGE: January 01, 2009
    JOURNAL FREE ACCESS
    Traditional traffic analysis can be performed online only when detection targets are well specified and fairly primitive. Local processing at the measurement point is discouraged, as it would considerably affect the main functionality of a network device. When traffic is analyzed at the flow level, the notion of flow timeout generates differences in flow lifespan and impedes unbiased monitoring, in which only the top-n flows ordered by a certain metric are considered. This paper proposes an alternative manner of traffic analysis based on source IP aggregation. The method uses flows as basic building blocks but ignores timeouts, using short monitoring intervals instead. The multidimensional space of metrics obtained through IP aggregation, however, enhances the capabilities of traffic analysis by facilitating the simultaneous detection of various anomalous conditions in traffic.
    Download PDF (736K)
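The source-IP aggregation step could look roughly like the sketch below, assuming each flow record in one short monitoring interval carries a source IP, packet count, byte count, and destination port; the per-source metrics chosen here are illustrative, not the paper's.

```python
# Collapse per-flow records of one short interval into per-source-IP metrics.
from collections import defaultdict

def aggregate_by_source(flows):
    """flows: iterable of (src_ip, packets, bytes, dst_port) for one interval."""
    stats = defaultdict(lambda: {"flows": 0, "packets": 0, "bytes": 0, "dst_ports": set()})
    for src, packets, nbytes, dport in flows:
        s = stats[src]
        s["flows"] += 1
        s["packets"] += packets
        s["bytes"] += nbytes
        s["dst_ports"].add(dport)
    # Fan-out (distinct destination ports) is one simple anomaly indicator.
    return {src: {**s, "dst_ports": len(s["dst_ports"])} for src, s in stats.items()}

if __name__ == "__main__":
    interval = [
        ("10.0.0.5", 3, 180, 80),
        ("10.0.0.5", 1, 60, 443),
        ("10.0.0.9", 200, 12000, 22),
    ]
    for src, m in aggregate_by_source(interval).items():
        print(src, m)
```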
  • Jong-In LEE, Ho-Jung BANG, Tai-Hyo KIM, Sung-Deok CHA
    Article type: PAPER
    Subject area: Dependable Computing
    2009 Volume E92.D Issue 1 Pages 24-31
    Published: January 01, 2009
    Released on J-STAGE: January 01, 2009
    JOURNAL FREE ACCESS
    Automated static timing analysis methods provide a safe but usually overestimated worst-case execution time (WCET) due to infeasible execution paths. In this paper, we propose a visual language, User Constraint Language (UCL), to obtain a tight WCET estimate. UCL provides intuitive visual notations with which users can easily specify various levels of flow information to characterize the valid execution paths of a program. The user constraints specified in UCL are translated into finite automata. The combined automaton, constructed by a cross-product of the automata for the program and the user constraints, reflects the static structure and possible dynamic behavior of the program and contains only the execution paths satisfying the user constraints. A case study using part of a satellite flight software program demonstrates the effectiveness of UCL and our approach.
    Download PDF (1195K)
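The cross-product of the program automaton and the user-constraint automaton mentioned above is a standard DFA intersection; the tiny example automata below are invented for illustration and are not taken from the paper.

```python
# Intersect two DFAs so that only paths allowed by both survive.

def product(a, b):
    """Each DFA is (states, start, accepting, delta); delta maps (state, symbol) -> state."""
    states_a, start_a, acc_a, delta_a = a
    states_b, start_b, acc_b, delta_b = b
    symbols = {sym for (_, sym) in delta_a} & {sym for (_, sym) in delta_b}
    delta = {}
    for qa in states_a:
        for qb in states_b:
            for sym in symbols:
                if (qa, sym) in delta_a and (qb, sym) in delta_b:
                    delta[((qa, qb), sym)] = (delta_a[(qa, sym)], delta_b[(qb, sym)])
    states = {(qa, qb) for qa in states_a for qb in states_b}
    accepting = {(qa, qb) for qa in acc_a for qb in acc_b}
    return states, (start_a, start_b), accepting, delta

if __name__ == "__main__":
    # Program automaton: block A, then either B or C.
    prog = ({0, 1, 2}, 0, {2}, {(0, "A"): 1, (1, "B"): 2, (1, "C"): 2})
    # User constraint: block C is never executed.
    constr = ({0}, 0, {0}, {(0, "A"): 0, (0, "B"): 0})
    _, _, _, delta = product(prog, constr)
    print(sorted(delta))   # only the A and B transitions remain
```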
  • Takanori ISOBE, Toshihiro OHIGASHI, Hidenori KUWAKADO, Masakatu MORII
    Article type: PAPER
    Subject area: Application Information Security
    2009 Volume E92.D Issue 1 Pages 32-40
    Published: January 01, 2009
    Released on J-STAGE: January 01, 2009
    JOURNAL FREE ACCESS
    In this paper, we propose an effective key recovery attack on the stream ciphers Py and Pypy with chosen IVs. Our method uses an internal-state correlation based on the vulnerability that the randomization of the internal state in the KSA is inadequate, and it improves two previous attacks proposed by Wu and Preneel (the WP-1 attack and the WP-2 attack). For a 128-bit key and a 128-bit IV, the WP-1 attack can recover a key with 2^23 chosen IVs and time complexity 2^72. First, we improve the WP-1 attack by using the internal-state correlation (called the P-1 attack). For a 128-bit key and a 128-bit IV, the P-1 attack can recover a key with 2^23 chosen IVs and time complexity 2^48, which is 1/2^24 of that of the WP-1 attack. The WP-2 attack is another improvement on the WP-1 attack, and it has been known as the best previous attack against Py and Pypy. For a 128-bit key and a 128-bit IV, the WP-2 attack can recover a key with 2^23 chosen IVs and time complexity 2^24. Second, we improve the WP-2 attack by using the internal-state correlation as well as the P-1 attack (called the P-2 attack). For a 128-bit key and a 128-bit IV, the P-2 attack can recover a key with 2^23 chosen IVs and time complexity 2^24, the same capability as the WP-2 attack. However, when the IV size is from 64 bits to 120 bits, the P-2 attack is more effective than the WP-2 attack. Thus, the P-2 attack is the best known attack against Py and Pypy.
    Download PDF (374K)
  • Jaehoon KIM, Seog PARK
    Article type: PAPER
    Subject area: Application Information Security
    2009 Volume E92.D Issue 1 Pages 41-50
    Published: January 01, 2009
    Released on J-STAGE: January 01, 2009
    JOURNAL FREE ACCESS
    Expectations for a more intelligent Web have recently been reflected in the new research field called the Semantic Web. In this paper, in connection with Semantic Web security, we introduce an RDF-triple-based access control model with explicit authorization propagation by inheritance and implicit authorization propagation by inference. In particular, we explain an authorization conflict problem between explicit and implicit authorization propagation, which is an important concept in access control for the Semantic Web. We also propose a novel conflict detection algorithm that uses graph labeling techniques to efficiently find authorization conflicts. Experimental results show that the proposed detection algorithm performs much better than the existing detection algorithm as the data size and the number of specified authorizations grow.
    Download PDF (1065K)
  • Hiroyuki NARITA, Yasumasa SAWAMURA, Akira HAYASHI
    Article type: PAPER
    Subject area: Pattern Recognition
    2009 Volume E92.D Issue 1 Pages 51-58
    Published: January 01, 2009
    Released on J-STAGE: January 01, 2009
    JOURNAL FREE ACCESS
    One of the advantages of the kernel methods is that they can deal with various kinds of objects, not necessarily vectorial data with a fixed number of attributes. In this paper, we develop kernels for time series data using dynamic time warping (DTW) distances. Since DTW distances are pseudo distances that do not satisfy the triangle inequality, a kernel matrix based on them is not positive semidefinite, in general. We use semidefinite programming (SDP) to guarantee the positive definiteness of a kernel matrix. We present neighborhood preserving embedding (NPE), an SDP formulation to obtain a kernel matrix that best preserves the local geometry of time series data. We also present an out-of-sample extension (OSE) for NPE. We use two applications, time series classification and time series embedding for similarity search, to validate our approach.
    Download PDF (486K)
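The paper obtains a positive (semi)definite kernel through an SDP formulation (NPE). As a much simpler stand-in that only illustrates why a correction is needed, the sketch below builds a Gaussian-style kernel from DTW distances and clips negative eigenvalues; the bandwidth parameter and the eigenvalue-clipping step are assumptions, not the authors' method.

```python
import numpy as np

def dtw(x, y):
    """Classic O(len(x)*len(y)) dynamic-time-warping distance of two 1-D series."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def psd_kernel_from_dtw(series, gamma=0.1):
    """exp(-gamma * DTW) Gram matrix, projected onto the PSD cone."""
    K = np.array([[np.exp(-gamma * dtw(a, b)) for b in series] for a in series])
    w, V = np.linalg.eigh(K)               # K is symmetric; DTW may break PSD-ness
    return (V * np.clip(w, 0.0, None)) @ V.T

if __name__ == "__main__":
    data = [np.sin(np.linspace(0, 6, 40)),
            np.sin(np.linspace(0, 6, 55) + 0.3),
            np.cos(np.linspace(0, 6, 50))]
    K = psd_kernel_from_dtw(data)
    print(np.linalg.eigvalsh(K) >= -1e-9)  # all True: corrected kernel is PSD
```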
  • Ju LIU, Hua YAN, Jian-de SUN
    Article type: PAPER
    Subject area: Image Processing and Video Processing
    2009 Volume E92.D Issue 1 Pages 59-68
    Published: January 01, 2009
    Released on J-STAGE: January 01, 2009
    JOURNAL FREE ACCESS
    Considering the inaccuracy of image registration, we propose a new regularization restoration algorithm to solve the ill-posed super-resolution (SR) problem. Registration error is used to obtain cross-channel error information caused by inaccurate image registration. The registration error is treated as the mean of the noise added to the within-channel observation noise, which is modeled as Additive White Gaussian Noise (AWGN). Based on this consideration, two constraints are imposed pixel by pixel within the framework of Miller's regularization. Regularization parameters connect the two constraints to construct a cost function. The regularization parameters are estimated adaptively at each pixel in terms of the registration error and in each observation channel in terms of the AWGN. In the iterative implementation of the proposed algorithm, the sub-sampling operation and sampling aliasing in the detector model are handled separately so that the restored high-resolution (HR) image more closely approaches the original. The transpose of the sub-sampling operation is implemented by nearest interpolation. Simulations show that the proposed regularization algorithm can restore HR images with much sharper edges and greater SNR improvement.
    Download PDF (553K)
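The sub-sampling operator and its nearest-interpolation transpose mentioned at the end of the abstract can be illustrated as follows; the decimation factor and test image are arbitrary placeholders, not the paper's detector model.

```python
import numpy as np

def downsample(img, k):
    """Sub-sampling operator: keep every k-th pixel in both directions."""
    return img[::k, ::k]

def upsample_nearest(img, k):
    """Nearest-neighbour upsampling, used as an approximate transpose of downsample."""
    return np.repeat(np.repeat(img, k, axis=0), k, axis=1)

if __name__ == "__main__":
    hr = np.arange(16, dtype=float).reshape(4, 4)
    lr = downsample(hr, 2)
    print(lr)
    print(upsample_nearest(lr, 2))
```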
  • Dae Hyun KIM, Myoung-Jun KIM
    Article type: PAPER
    Subject area: Computer Graphics
    2009 Volume E92.D Issue 1 Pages 69-77
    Published: January 01, 2009
    Released on J-STAGE: January 01, 2009
    JOURNAL FREE ACCESS
    Pen input is not a new input method for CAD designers, particularly in the concept design phase. Meanwhile, B-Splines are a well-known curve and surface design tool for 3D shape modeling in the final modeling stages, in which neat curves and surfaces should be produced. In this paper, we propose an intuitive B-Spline design method that can be used in CAD systems both in the conceptual modeling phase and in later design phases. Unlike control-point-based interactive modification schemes for B-Spline curves and surfaces, we extend the “touch-and-replace” method used for poly-line modification in the late 1980s to B-Splines; our approach uses successive pen strokes to modify the final shape of existing B-Spline curves and surfaces. We also present user test results as empirical evidence.
    Download PDF (848K)
  • Hongwei DAI, Yu YANG, Cunhua LI, Jun SHI, Shangce GAO, Zheng TANG
    Article type: PAPER
    Subject area: Biocybernetics, Neurocomputing
    2009 Volume E92.D Issue 1 Pages 78-85
    Published: January 01, 2009
    Released on J-STAGE: January 01, 2009
    JOURNAL FREE ACCESS
    The Clonal Selection Algorithm (CSA), based on the clonal selection theory proposed by Burnet, has gained much attention and wide application during the last decade. However, the proliferation process of immune cells is asexual; that is, there is no information exchange among different immune cells. As a result, the traditional CSA is often unsatisfactory and is easily trapped in local optima, leading to premature convergence. To solve this problem, inspired by quantum interference, an improved quantum crossover operator is introduced and embedded in the traditional CSA. Simulation results on the traveling salesman problem (TSP) demonstrate the effectiveness of the quantum crossover-based Clonal Selection Algorithm.
    Download PDF (603K)
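A bare clonal-selection skeleton for the TSP, shown only to make the clone/hypermutate/select loop concrete. The paper's quantum crossover operator is not reproduced here; the clone counts, swap mutation, and rank-based mutation strength are assumptions.

```python
import random

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def hypermutate(tour, strength):
    """Swap `strength` random city pairs; worse (lower-ranked) tours mutate harder."""
    t = tour[:]
    for _ in range(strength):
        i, j = random.sample(range(len(t)), 2)
        t[i], t[j] = t[j], t[i]
    return t

def clonal_selection(dist, pop_size=20, clones=5, generations=200):
    n = len(dist)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: tour_length(t, dist))
        offspring = [hypermutate(t, strength=1 + rank)      # better tours mutate less
                     for rank, t in enumerate(pop)
                     for _ in range(clones)]
        pop = sorted(pop + offspring, key=lambda t: tour_length(t, dist))[:pop_size]
    return pop[0], tour_length(pop[0], dist)

if __name__ == "__main__":
    random.seed(0)
    pts = [(random.random(), random.random()) for _ in range(12)]
    dist = [[((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 for bx, by in pts] for ax, ay in pts]
    print(clonal_selection(dist))
```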
  • Woohyung LIM, Chang Woo HAN, Nam Soo KIM
    Article type: LETTER
    Subject area: Speech and Hearing
    2009 Volume E92.D Issue 1 Pages 86-89
    Published: January 01, 2009
    Released on J-STAGE: January 01, 2009
    JOURNAL FREE ACCESS
    In this letter, we propose a novel approach to feature compensation performed in the cepstral domain. Processing in the cepstral domain has the advantage that the spectral correlation among different frequencies is taken into consideration. By introducing a linear approximation with a diagonal covariance assumption, we modify the conventional log-spectral-domain feature compensation technique to fit the cepstral domain. The proposed approach shows significant improvements on the AURORA2 speech recognition task.
    Download PDF (91K)
  • Fa-Xin YU, Zhe-Ming LU, Zhen LI, Hao LUO
    Article type: LETTER
    Subject area: Image Processing and Video Processing
    2009 Volume E92.D Issue 1 Pages 90-92
    Published: January 01, 2009
    Released on J-STAGE: January 01, 2009
    JOURNAL FREE ACCESS
    In this Letter, we propose a novel method of low-level global motion feature description based on Vector Quantization (VQ) index histograms of motion feature vectors (MFVVQIH) for video shot retrieval. The contribution lies in three aspects: first, we use VQ to eliminate singular points in the motion feature vector space; second, we utilize the global motion feature vector index histogram of a video shot as the global motion signature; third, video shot retrieval based on index histograms instead of the original motion feature vectors guarantees low computational complexity and thus enables real-time video shot retrieval. Experimental results show that the proposed scheme has high accuracy and low computational complexity.
    Download PDF (75K)
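A toy version of the index-histogram signature described above: each motion feature vector of a shot is quantized against a small codebook, and shots are compared by histogram distance. The codebook, the feature dimensionality, and the L1 comparison are assumptions made for illustration.

```python
import numpy as np

def vq_index_histogram(features, codebook):
    """Histogram of nearest-codeword indices over all feature vectors of a shot."""
    idx = np.argmin(
        np.linalg.norm(features[:, None, :] - codebook[None, :, :], axis=2), axis=1
    )
    hist = np.bincount(idx, minlength=len(codebook)).astype(float)
    return hist / hist.sum()

def shot_distance(h1, h2):
    """L1 distance between two index histograms (cheap compared with raw vectors)."""
    return float(np.abs(h1 - h2).sum())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    codebook = rng.normal(size=(8, 4))                        # 8 codewords, 4-D motion features
    shot_a = rng.normal(size=(30, 4))
    shot_b = shot_a + rng.normal(scale=0.1, size=(30, 4))     # near-duplicate shot
    shot_c = rng.normal(size=(30, 4)) + 2.0                   # different motion
    ha, hb, hc = (vq_index_histogram(s, codebook) for s in (shot_a, shot_b, shot_c))
    print(shot_distance(ha, hb), shot_distance(ha, hc))       # hb should be closer to ha
```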
  • Cheon Seog KIM, Hosik SOHN, Wesley De NEVE, Yong Man RO
    Article type: LETTER
    Subject area: Image Processing and Video Processing
    2009 Volume E92.D Issue 1 Pages 93-96
    Published: January 01, 2009
    Released on J-STAGE: January 01, 2009
    JOURNAL FREE ACCESS
    In this paper, we propose an Adaptation Decision-Taking Engine (ADTE) that targets the delivery of scalable video content in mobile usage environments. Our ADTE design relies on an objective perceptual quality metric in order to adapt video according to human visual perception, thus allowing the Quality of Service (QoS) to be maximized. To describe the characteristics of a particular usage environment, as well as the properties of the scalable video content, MPEG-21 Digital Item Adaptation (DIA) is used. Our experimental results show that the proposed ADTE design provides video content with higher subjective quality than an ADTE using the conventional maximum-bit-allocation method.
    Download PDF (421K)
  • Dongil HAN, Hak-Sung LEE, Chan IM, Seong Joon YOO
    Article type: LETTER
    Subject area: Image Processing and Video Processing
    2009 Volume E92.D Issue 1 Pages 97-101
    Published: January 01, 2009
    Released on J-STAGE: January 01, 2009
    JOURNAL FREE ACCESS
    This paper describes a color correction method for low-cost still/video camera images. Instead of using complex non-linear equations, the concept of a three-dimensional reduced-resolution look-up table is used for real-time color gamut expansion of low-cost cameras. The proposed method analyzes the color gamut of low-cost cameras and constructs three-dimensional rule tables in an off-line stage; real-time color correction is then conducted using these rule tables. Experimental results show that the output images have more vivid and natural colors than the originals. The proposed method can be easily implemented with small software and/or hardware resources.
    Download PDF (1117K)
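Applying a reduced-resolution 3-D look-up table with trilinear interpolation might look like the sketch below; the 9x9x9 grid and the identity-plus-red-boost table contents are placeholders, whereas the paper builds its rule tables offline from an analysis of the camera's gamut.

```python
import numpy as np

def apply_lut(rgb, lut):
    """Map one RGB triple (0..1 floats) through an (N,N,N,3) LUT with trilinear blending."""
    n = lut.shape[0] - 1
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * n
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n)
    f = pos - lo
    out = np.zeros(3)
    # Blend the 8 surrounding grid points.
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0]) *
                     (f[1] if dg else 1 - f[1]) *
                     (f[2] if db else 1 - f[2]))
                idx = (hi[0] if dr else lo[0],
                       hi[1] if dg else lo[1],
                       hi[2] if db else lo[2])
                out += w * lut[idx]
    return out

if __name__ == "__main__":
    N = 9
    g = np.linspace(0.0, 1.0, N)
    r, gg, b = np.meshgrid(g, g, g, indexing="ij")
    lut = np.stack([np.clip(r * 1.1, 0, 1), gg, b], axis=-1)   # mild red boost
    print(apply_lut((0.50, 0.25, 0.75), lut))
```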