Transactions of Japanese Society for Medical and Biological Engineering
Online ISSN : 1881-4379
Print ISSN : 1347-443X
ISSN-L : 1347-443X
Volume 48, Issue 4
Displaying 1-6 of 6 articles from this issue
Contributions
  • Shin HIBINO, Masahiro NAKATOCHI, Norihiro UEDA, Mitsuru HORIBA, Kenji ...
    2010 Volume 48 Issue 4 Pages 359-368
    Published: August 10, 2010
    Released on J-STAGE: May 27, 2011
    JOURNAL FREE ACCESS
    The purpose of this study was to establish an accurate automated QT measurement system for detecting drug-induced QT interval prolongation. One hundred and thirty-five electrocardiograms (ECGs) were recorded from fifty healthy men. We applied a Gaussian mixture model to approximate the ECG trace, using the split-and-merge expectation maximization (SMEM) algorithm to obtain optimal model parameters. The lead II ECG trace, from the beginning of a T wave to the end of the subsequent beat's P wave, was approximated by six Gaussian functions with the SMEM algorithm. The end of the T wave on the original ECG was registered by an experienced cardiologist, and the corresponding point on the approximated ECG trace was defined as a reference point. A primary estimated point for the end of the T wave was selected using a threshold, defined as the averaged height difference between the reference point and the point 2 ms preceding it. The averaged absolute interval between the estimated point and the reference point was 5.4 ± 4.3 ms. This was further reduced to 5.0 ± 3.8 ms by multiple regression analysis (dependent variable: the reference point as the end of the T wave; independent variables: the primary estimated point as the end of the T wave and the heights associated with the fourth and sixth Gaussian functions). The final estimated QT interval and the reference QT interval were 399.2 ± 27.2 ms and 399.2 ± 28.0 ms, respectively. We conclude that our system can measure the QT interval accurately.
    Download PDF (737K)
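The core idea of this abstract, approximating an ECG segment as a sum of Gaussian functions, can be illustrated with a minimal sketch. This is not the authors' SMEM implementation: it uses a synthetic two-bump trace, two Gaussians instead of six, and ordinary nonlinear least squares (`scipy.optimize.curve_fit`) in place of split-and-merge EM; all signal parameters below are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss_sum(t, *p):
    # p holds (amplitude, center, width) triples, one per Gaussian
    y = np.zeros_like(t)
    for a, mu, s in zip(p[0::3], p[1::3], p[2::3]):
        y = y + a * np.exp(-(t - mu) ** 2 / (2 * s ** 2))
    return y

# synthetic "T wave to next P wave" segment: two bumps plus mild noise
t = np.linspace(0.0, 1.0, 500)
rng = np.random.default_rng(0)
trace = (0.6 * np.exp(-(t - 0.25) ** 2 / (2 * 0.05 ** 2))   # "T wave"
         + 0.2 * np.exp(-(t - 0.80) ** 2 / (2 * 0.04 ** 2))  # "P wave"
         + 0.005 * rng.standard_normal(t.size))

# initial guesses for 2 Gaussians (the paper fits 6 with SMEM)
p0 = [0.5, 0.3, 0.05, 0.2, 0.75, 0.05]
popt, _ = curve_fit(gauss_sum, t, trace, p0=p0)
approx = gauss_sum(t, *popt)
rmse = np.sqrt(np.mean((trace - approx) ** 2))
```

Once the trace is expressed as fitted Gaussians, landmark points such as the T-wave end can be located on the smooth analytic approximation rather than on the noisy raw signal, which is what makes the subsequent threshold-based detection stable.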
  • Yinlai JIANG, Masanaga IKEGAMI, Hirotaka YANAGIDA, Tatsuhisa TAKAHASHI ...
    2010 Volume 48 Issue 4 Pages 369-376
    Published: August 10, 2010
    Released on J-STAGE: May 27, 2011
    JOURNAL FREE ACCESS
    To quantify a subject's skill at visual interpolation, as in the identification of incomplete letters, we developed a new computer-based system for presenting fragmented letters of the English alphabet. Using this system, we examined the quantitative relationship between the fragmentation of letters and their correct identification in six healthy young male subjects. The 26 letters (black, 72-point, MSP Gothic) were randomly displayed one by one on the screen of a personal computer; each letter appeared once per test. Each letter was presented within a square (128 × 128 pixels) against a white background for 200 ms. The fragmented letters were produced by randomly removing pixels from complete letters using three modes of elimination: removal of single dots, or of small or large rectangles with random rotation. The complete and fragmented letters were evaluated with respect to the density of information provided by the pixels constituting the letters, according to information theory. There was a direct correlation between the percentage of pixels removed from the complete letters and the density of information of the removed pixels. The scores for correct identification of fragmented letters under dot elimination remained at almost 100% regardless of the percentage of pixels removed. In contrast, the scores under elimination of large rectangles decreased with the removal of 70%, 80%, 86%, 90%, and 92% of pixels. The scores under elimination of small rectangles fell between those for the other two modes. These results suggest that the identification of fragmented letters may depend on the recognition of structures in which imaginary lines connect the closest pixels, provided the distance between existing pixels is within some limit. The evaluation of fragmented-letter identification using this system may be useful for quantifying the capability of visual interpolation.
    Download PDF (1108K)
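The stimulus-generation step described here (deleting a given fraction of a letter's foreground pixels, either as single dots or as larger patches) can be sketched as follows. This is an assumption-laden illustration, not the authors' software: the "letter" is a crude synthetic stroke, the rectangle mode erases fixed 3 × 3 blocks without the random rotation the paper uses, and all sizes are invented.

```python
import numpy as np

def fragment(img, mode, removal_frac, rng):
    """Remove foreground pixels from a binary letter image.

    mode 'dot'  -> remove chosen pixels individually
    mode 'rect' -> erase a 3x3 block around each chosen pixel
    """
    out = img.copy()
    ys, xs = np.nonzero(out)
    n_remove = int(removal_frac * ys.size)
    idx = rng.choice(ys.size, size=n_remove, replace=False)
    if mode == 'dot':
        out[ys[idx], xs[idx]] = 0
    else:
        for y, x in zip(ys[idx], xs[idx]):
            out[max(0, y - 1):y + 2, max(0, x - 1):x + 2] = 0
    return out

rng = np.random.default_rng(1)
letter = np.zeros((128, 128), dtype=np.uint8)
letter[20:108, 60:68] = 1                 # a crude vertical stroke ("I")

dots = fragment(letter, 'dot', 0.7, rng)
removed = 1.0 - dots.sum() / letter.sum()  # fraction of pixels deleted
```

The dot mode removes exactly the requested fraction, whereas block-wise erasure deletes contiguous regions, which is what breaks the "imaginary lines between closest pixels" that the abstract argues support interpolation.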
  • Kazushige OSHITA, Sumio YANO
    2010 Volume 48 Issue 4 Pages 377-382
    Published: August 10, 2010
    Released on J-STAGE: May 27, 2011
    JOURNAL FREE ACCESS
    The purpose of this study was to investigate the asymmetry of force fluctuation during low-intensity isometric plantar flexion. Twelve healthy males (21 ± 1 years) performed unilateral force-matching tasks, maintaining isometric plantar flexion for 15 seconds at levels corresponding to 10% and 20% of maximal voluntary contraction (MVC) with visual feedback of force. Force fluctuation during the tasks was quantified as the standard deviation of the force signal. The power spectrum of the force signal was obtained by the fast Fourier transform, and the asymmetries of < 4 Hz power and 8-12 Hz power were calculated. No significant difference in force fluctuation was observed between the right and left legs, and force fluctuation was significantly correlated between the two legs. Although force fluctuation was greater in the 20% MVC task than in the 10% MVC task, the asymmetry of force fluctuation did not differ between the two tasks. Likewise, the asymmetry of < 4 Hz power did not differ between the 10% and 20% MVC tasks. However, the asymmetry of 8-12 Hz power was significantly greater in the 20% MVC task than in the 10% MVC task. These results suggest that force steadiness during low-intensity isometric plantar flexion shows no asymmetry, whereas the asymmetry of physiological tremor in force fluctuation may increase with contraction intensity.
    Download PDF (317K)
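The two quantities this abstract works with (fluctuation as the standard deviation of the force signal, and band power from its FFT, here < 4 Hz and 8-12 Hz) can be computed as below. The force trace is simulated, and the sampling rate, drift, and tremor amplitudes are all assumed values; the study would apply the same computation to each leg's measured signal and then compare the two.

```python
import numpy as np

fs = 1000                       # Hz, assumed sampling rate
t = np.arange(0, 15, 1 / fs)    # 15 s matching task, as in the study
rng = np.random.default_rng(2)

# simulated force: target level + slow drift + 10 Hz "tremor" + noise
force = (100.0
         + 0.5 * np.sin(2 * np.pi * 0.5 * t)
         + 0.2 * np.sin(2 * np.pi * 10.0 * t)
         + 0.05 * rng.standard_normal(t.size))

# force fluctuation: SD of the signal about its mean level
fluctuation = np.std(force - np.mean(force))

# band power from the FFT of the demeaned signal
spec = np.abs(np.fft.rfft(force - np.mean(force))) ** 2
freqs = np.fft.rfftfreq(t.size, 1 / fs)
low_power = spec[freqs < 4.0].sum()                    # slow drift band
tremor_power = spec[(freqs >= 8.0) & (freqs <= 12.0)].sum()  # tremor band
```

An asymmetry index would then be formed from the right- and left-leg values of each quantity, e.g. their difference normalized by their sum, though the abstract does not state the exact index used.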
  • Mamiko FUJII, Reiko ENDOH, Kiyoshi NAKAYAMA
    2010 Volume 48 Issue 4 Pages 383-395
    Published: August 10, 2010
    Released on J-STAGE: May 27, 2011
    JOURNAL FREE ACCESS
    In this study, we propose an inverse algorithm that suppresses the undesirable effects of skin circulation in near-infrared diffuse optical topography; it not only provides 2D images of deep regions but also displays the undesirable changes in skin circulation. In this algorithm, a voxel is placed under each optode in a shallow plane, and a 2D voxel array is prepared as a target-imaging plane in a deep region. A sensitivity matrix, whose elements correspond to the voxels in the shallow plane and the target plane, is calculated on the basis of the optical diffusion approximation. The estimated relative absorption change is obtained from the observation data by the Moore-Penrose inverse of the sensitivity matrix. The performance of two types of regularization is compared: simple regularization and sensitivity-adaptive regularization. Both methods can suppress undesirable noise caused by absorption changes in shallow regions; in particular, the sensitivity-adaptive method provides satisfactory reconstructed images of the target region. The performance of the voxels in the shallow plane is also investigated by interpolating the estimated absorption changes corresponding to these voxels to form a 2D image. The result shows that the voxels can provide distinct information about the skin circulation. A phantom experiment also demonstrates the advantages of the proposed method.
    Download PDF (946K)
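The reconstruction step, estimating voxel absorption changes from observations via a regularized pseudoinverse of the sensitivity matrix, can be sketched in a few lines. Everything here is a stand-in: the sensitivity matrix is random rather than derived from the diffusion approximation, the problem sizes are invented, and plain Tikhonov regularization is used in place of the paper's simple and sensitivity-adaptive schemes.

```python
import numpy as np

rng = np.random.default_rng(3)
n_meas, n_vox = 24, 64            # hypothetical measurement channels, voxels
A = rng.random((n_meas, n_vox))   # stand-in for the diffusion-based sensitivity matrix

# ground truth: a single absorbing voxel in the target plane
x_true = np.zeros(n_vox)
x_true[20] = 1.0
y = A @ x_true + 0.01 * rng.standard_normal(n_meas)  # noisy observations

# Tikhonov-regularized pseudoinverse: x = (A^T A + lam I)^-1 A^T y
lam = 1e-2
x_est = np.linalg.solve(A.T @ A + lam * np.eye(n_vox), A.T @ y)
```

Because there are far fewer measurements than voxels, the unregularized Moore-Penrose inverse is ill-conditioned against noise; the regularization term `lam` trades some resolution for stability. A sensitivity-adaptive scheme, as described in the abstract, would scale this penalty per voxel according to the sensitivity matrix instead of using a single constant.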
Essays