Detailed search results
Showing results for the following conditions:
Query search: "Oracle Database"
Displaying results 1-20 of 46
  • 木村 達也, 金子 静花
    電子情報通信学会 通信ソサイエティマガジン
    2018 Volume 11 Issue 4 Pages 258-263
    Published: 2018/03/01
    Released on J-STAGE: 2018/03/01
    JOURNAL FREE ACCESS
  • Ru-Qiang Wang, Chu-Fu Li, Xiao-Rong He, Bing-Zhen Chen, Ping Wang, Heng-Qiu Wang, Chun-Jian Dong
    JOURNAL OF CHEMICAL ENGINEERING OF JAPAN
    2009 Volume 42 Issue 2 Pages 111-116
    Published: 2009/02/20
    Released on J-STAGE: 2009/02/20
    JOURNAL RESTRICTED ACCESS
    An optimization model for chemical production planning with optimal management of purchase and inventory was developed in this paper, based on the characteristics of the chemical industry. The objective of the model was to maximize the gross profit of a chemical enterprise while considering several constraints, such as material balance, chemical reaction balance, production capacity, product demand, raw material supply, inventory balance and inventory capacity. In the optimization model, a novel inventory management model was proposed to implement optimal management of purchase and inventory of raw and intermediate materials. In order for the above planning model to be applied in chemical enterprises to improve management level, a Graphic I/O Chemical Industry Modeling System (GIOCIMS) was developed. The proposed chemical production planning optimization model with optimal management of purchase and inventory was validated by two cases in which GIOCIMS was applied in a real world chemical enterprise. Results indicated that the proposed model was efficient for production planning optimization in the chemical industry.
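    The abstract above describes a profit-maximizing planning model but does not reproduce its formulation. Purely as a hedged illustration of that class of model (not the authors' GIOCIMS implementation), the following Python sketch maximizes gross profit under capacity, demand, and raw-material constraints with the PuLP library; all products, prices, yields, and capacities are invented.

```python
# Minimal sketch of a profit-maximizing production plan (illustrative only;
# the products, prices, yields, and capacities below are invented, not from the paper).
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, value

products = ["A", "B"]
price = {"A": 120.0, "B": 90.0}          # selling price per ton
raw_cost = {"A": 40.0, "B": 25.0}        # raw-material cost per ton produced
capacity = {"A": 800.0, "B": 600.0}      # production capacity (tons)
demand = {"A": 700.0, "B": 650.0}        # maximum sellable amount (tons)
raw_supply = 1500.0                      # total raw material available (tons)
raw_per_ton = {"A": 1.2, "B": 0.9}       # raw material consumed per ton of product

prob = LpProblem("production_planning", LpMaximize)
x = {p: LpVariable(f"prod_{p}", lowBound=0) for p in products}

# Objective: gross profit = revenue minus raw-material cost
prob += lpSum((price[p] - raw_cost[p]) * x[p] for p in products)

# Constraints: capacity, demand, and raw-material balance
for p in products:
    prob += x[p] <= capacity[p]
    prob += x[p] <= demand[p]
prob += lpSum(raw_per_ton[p] * x[p] for p in products) <= raw_supply

prob.solve()
for p in products:
    print(p, x[p].value())
print("gross profit:", value(prob.objective))
```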
  • 甲木 洋介
    知能と情報
    2012 Volume 24 Issue 3 Pages 100-105
    Published: 2012/06/15
    Released on J-STAGE: 2018/01/11
    JOURNAL FREE ACCESS
  • Xiaogang Li, Jin Gao, Chaofang Dong, Cuiwei Du, Degui Luo, Lin Lu
    Data Science Journal
    2007 Volume 6 Pages S913-S925
    Published: 2007/12/15
    Released on J-STAGE: 2007/12/19
    JOURNAL FREE ACCESS
    This article discusses the key features of a newly developed national online data-sharing network for material environmental corrosion. Written in the Java language and based on Oracle database technology, the central database in the network is supported by two unique series of corrosion failure data, both of which were accumulated over a long period of time. The first category of data, provided by national environmental corrosion test sites, is corrosion failure data for different materials in typical environments (atmosphere, seawater, and soil). The other category is corrosion data from production environments, provided by a variety of firms. This network system enables standardized management of environmental corrosion data, an effective data-sharing process, and research and development support for new products and after-sale services. Moreover, this network system provides a firm base and data-service platform for the evaluation of project bids, safety, and service life. This article also discusses issues including data quality management and evaluation in the material corrosion data-sharing process, access authority for different users, compensation for providers of shared historical data, and finally, the related policy and legal processes required to protect the intellectual property rights of the database.
  • Cuiping Ge, Jun Zhao, Shaoliang Zhang, Lei Shang, Sheng Yin
    Data Science Journal
    2007 Volume 6 Pages S867-S878
    Published: 2007/12/02
    Released on J-STAGE: 2007/12/12
    JOURNAL FREE ACCESS
    The comprehensive database system of the Northeast agro-ecology of black soil (CSDB_BL) is user-friendly software designed to store and manage large amounts of agricultural data. The data were collected in an efficient and systematic way through long-term experiments and observations of black-soil land and from statistical information. The system is based on the ORACLE database management system, and its interface is written in the PB language. The database has the following main facilities: (1) it runs on Windows platforms; (2) it facilitates data entry from *.dbf files into ORACLE or creates ORACLE tables directly; (3) it has a metadata facility that describes the methods used in the laboratory or in the observations; (4) data can be transferred to an expert system, built with Visual C++ and Visual Basic, for simulation analysis and estimation; (5) it can be connected with GIS, making it easy to analyze changes in land use; and (6) it allows metadata and data entities to be shared on the Internet. The following datasets are included in CSDB_BL: long-term experiments and observations of water, soil, climate, and biology; special research projects; a natural resource survey of Hailun County in the 1980s; remote sensing images; vector and grid graphics; and statistics from Northeast China. CSDB_BL can be used in the research and evaluation of agricultural sustainability nationally, regionally, or locally. It can also be used as a tool to assist the government in planning agricultural development. Expert systems connected with CSDB_BL can give farmers guidance on farm planting management.
  • 西部 茂美, 吉田 弘, 油野 民雄
    The Journal of JASTRO
    2000 Volume 12 Issue 2 Pages 115-124
    Published: 2000/06/25
    Released on J-STAGE: 2011/07/11
    JOURNAL FREE ACCESS
    [Background and Purpose] The radiation oncology department at our institution must handle an enormous amount of patient information, and its efficient operation is nearly impossible without a computer system. We therefore developed a system with two aims: 1) to unify the image format by adding tag information to the ACR-NEMA 2.0 format so that all images used are converted to the DICOM format and the data become interchangeable; and 2) to manage image data together with the text and numerical data related to radiation therapy in an integrated fashion. We regard the information network system we developed as an unavoidable choice for managing this kind of information.
    [Materials and Methods] In this network system, the image format was unified to DICOM and the operating procedures on the DICOM server were standardized. For communication between an imaging modality and the DICOM server, the type of service (for example, the Storage Service Class) and the kind of information to be transferred (CT, MR, or CR images, etc.) are proposed to the other side and its acceptance is obtained. In DICOM, the type of service is called a Service Class, and the information content to be transferred is called an Information Object Definition (IOD). The combination of the two is called a Service-Object Pair (SOP) class; the SOP is sent and the other side's response is obtained. Communication between systems was likewise standardized using TCP/IP (FTP/NFS).
    [Results] We were freed from the complicated adjustment work previously required between the individual imaging devices. Image information and text-based information were unified using a general-purpose relational database (Oracle 7), making it possible to manage efficiently the various types of data that had previously been recorded separately.
    [Conclusions] With this system, the various kinds of information related to radiation therapy (including treatment planning images, radiotherapy record data, and irradiation prescription data) can be accessed quickly on the server from any terminal. In the near future, we plan to expand the information network system to cover the entire university. Moreover, in the clinical evaluation of the system, compared in particular with the previous practice (shelf storage of films and other data), the developed network system allows patient information on re-irradiation over the past 20 years to be retrieved quickly, freeing staff from burdensome routine work and proving very useful.
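    For readers unfamiliar with the DICOM terms used above, the Service Class / IOD / SOP pairing maps directly onto what a modern DICOM toolkit exposes. The following is only a minimal, hypothetical sketch of a Storage Service Class (C-STORE) request in Python with pynetdicom, not the system described in the paper; the server address, port, and file name are assumptions.

```python
# Illustrative DICOM Storage Service Class request (hypothetical host, port, and file).
from pydicom import dcmread
from pynetdicom import AE
from pynetdicom.sop_class import CTImageStorage  # SOP class = Storage Service Class + CT Image IOD

ae = AE(ae_title="MODALITY")
ae.add_requested_context(CTImageStorage)         # propose the SOP class to the server

assoc = ae.associate("dicom-server.example", 11112)
if assoc.is_established:
    ds = dcmread("ct_slice.dcm")                 # a CT image dataset to be stored
    status = assoc.send_c_store(ds)              # the server replies with a status
    print("C-STORE status:", status.Status if status else "no response")
    assoc.release()
```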
  • 守口 淑秀, 武市 佳己, 末丸 克矢, 荒木 博陽
    医療薬学
    2004 Volume 30 Issue 8 Pages 511-517
    Published: 2004/08/10
    Released on J-STAGE: 2011/03/04
    JOURNAL FREE ACCESS
    Our pharmacy accepted the challenge of improving management at Ehime University Hospital from the perspective of rationalizing drug use. The following 3 points were considered important and examined: increasing the rate of prescriptions dispensed at pharmacies outside the hospital, switching to generic drugs, and reducing unnecessary drug use by providing information on the proper use of drugs. Information was processed on the pharmacy client terminal computer system using Microsoft ACCESS 2000, which was connected through an open database connectivity (ODBC) interface with an Oracle database. This allowed information to be retrieved from the hospital application system network.
    As a result of our efforts, the Management Improvement Committee made a request to departments and doctors with low rates for prescriptions dispensed outside the hospital to cooperate in increasing them. The rate has now risen to over 90%. Nine injection drugs were switched to generics, which has reduced costs. Our investigation of the use of blood preparations (Antithrombin III drugs) revealed that it was more economical to use one particular drug in this category. Pharmaceutical kits containing antibiotics were switched to separate vials except in cases where there was concern associated with such a switch. High use of albumin preparations and G-CSF pharmaceuticals was reduced through the use of information sheets informing clinical departments and doctors about their proper use. Our contributions to management improvement enabled drug use to be reduced to about 70% of the previous level, even better than the target of about 85% versus the previous year. Our project seemed to have raised awareness of the importance of using drugs properly and drug cost control among the doctors in each clinical department.
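    The abstract describes retrieving hospital data into a client application over ODBC. As a rough sketch of the same idea (not the authors' Microsoft ACCESS 2000 setup), an ODBC query against an Oracle database could look like the following in Python with pyodbc; the DSN, credentials, and table and column names are hypothetical.

```python
# Illustrative ODBC query against an Oracle database (DSN name, credentials,
# and table/column names are hypothetical, not from the paper).
import pyodbc

conn = pyodbc.connect("DSN=HOSPITAL;UID=pharmacy;PWD=secret")
cursor = conn.cursor()

# Count prescriptions dispensed outside the hospital, per department.
cursor.execute(
    "SELECT department, COUNT(*) AS n_outside "
    "FROM prescriptions WHERE dispensed_outside = 1 "
    "GROUP BY department"
)
for department, n_outside in cursor.fetchall():
    print(department, n_outside)

conn.close()
```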
  • 岩崎 将之
    知能と情報
    2013 Volume 25 Issue 6 Pages 190-194
    Published: 2013/12/15
    Released on J-STAGE: 2017/12/14
    JOURNAL FREE ACCESS
  • Fenglin Peng, Xiaoyang Shen, Keyun Tang, Jian Zhang, Qinghua Huang, Yuanfang Xu, Bangyan Yue, Dan Yang
    Data Science Journal
    2007 Volume 6 Pages S404-S407
    Published: 2007
    Released on J-STAGE: 2007/07/19
    JOURNAL FREE ACCESS
    The World Data Center (WDC) for Geophysics, Beijing, was founded in 1988. Supported by the Chinese Academy of Sciences and the Ministry of Science and Technology, our center has made much progress in recent years. The center has not only established a database to store data such as heat flow, geomagnetic, and gravity data, but has also put them on the Internet (http://gp.wdc.cn) to provide a free data service. The center has expended a great deal of effort to rescue the magnetograms observed 100 years ago by the Sheshan Observatory, the earliest geomagnetic observatory in China. The center's geophysical data holdings are abundant, and the data and information can be obtained from the website simply and easily. In the future, the center will compile more data and construct a robust, convenient database in order to provide better service to users.
  • Christopher J. Rusanowski
    Data Science Journal
    2007 Volume 6 Pages S333-S352
    Published: 2007
    Released on J-STAGE: 2007/06/05
    JOURNAL FREE ACCESS
    People believe what they can see. The Poles exist as a frozen dream to most people. The International Polar Year wants to break the ice (so to speak), open up the Poles to the general public, support current polar research, and encourage new research projects.
    The IPY officially begins in March, 2007. As part of this effort, the U.S. Geological Survey (USGS) and the British Antarctic Survey (BAS), with funding from the National Science Foundation (NSF), are developing three Landsat mosaics of Antarctica and an Antarctic Web Portal with a Community site and an online map viewer. When scientists are able to view the entire scope of polar research, they will be better able to collaborate and locate the resources they need. When the general public more readily sees what is happening in the polar environments, they will understand how changes to the polar areas affect everyone.
  • Miki ENOKI, Issei YOSHIDA, Masato OGUCHI
    IEICE Transactions on Information and Systems
    2017 Volume E100.D Issue 4 Pages 776-784
    Published: 2017/04/01
    Released on J-STAGE: 2017/04/01
    JOURNAL FREE ACCESS

    In Twitter-like services, countless messages are being posted in real-time every second all around the world. Timely knowledge about what kinds of information are diffusing in social media is quite important. For example, in emergency situations such as earthquakes, users provide instant information on their situation through social media. The collective intelligence of social media is useful as a means of information detection complementary to conventional observation. We have developed a system for monitoring and analyzing information diffusion data in real-time by tracking retweeted tweets. A tweet retweeted by many users indicates that they find the content interesting and impactful. Analysts who use this system can find tweets retweeted by many users and identify the key people who are retweeted frequently by many users or who have retweeted tweets about particular topics. However, bursting situations occur when thousands of social media messages are suddenly posted simultaneously, and the lack of machine resources to handle such situations lowers the system's query performance. Since our system is designed to be used interactively in real-time by many analysts, waiting more than one second for a query result is simply not acceptable. To maintain an acceptable query performance, we propose a capacity control method for filtering incoming tweets using extra attribute information from the tweets themselves. Conventionally, there is a trade-off between the query performance and the accuracy of the analysis results. We show that the query performance is improved by our proposed method and that our method is better than the existing methods in terms of maintaining query accuracy.

  • Vita Rovite, Yael Wolff-Sagi, Linda Zaharenko, Liene Nikitina-Zake, Elmars Grens, Janis Klovins
    Journal of Epidemiology
    2018 Volume 28 Issue 8 Pages 353-360
    Published: 2018/08/05
    Released on J-STAGE: 2018/08/05
    Advance online publication: 2018/03/24
    JOURNAL OPEN ACCESS
    Supplementary material

    Background: The Genome Database of the Latvian Population (LGDB) is a national biobank that collects, maintains, and processes health information, data, and biospecimens collected from representatives of the Latvian population. These specimens serve as a foundation for epidemiological research and prophylactic and therapeutic purposes.

    Methods: Participant recruitment and biomaterial and data processing were performed according to specifically designed standard protocols, taking into consideration international quality requirements. Legal and ethical aspects, including broad informed consent and personal data protection, were applied according to legal norms of the Republic of Latvia.

    Results: Since its start in 2006, the LGDB has accumulated biosamples and associated phenotypic and clinical information from over 31,504 participants, constituting approximately 1.5% of the Latvian population. The LGDB represents a mixed-design biobank and includes participants from the general population as well as disease-based cohorts. The standard set of biosamples stored in the LGDB consists of DNA, plasma, serum, and white blood cells; in some cohorts, these samples are complemented by cancer biopsies and microbiome and urine samples. The LGDB acts as a core structure for the Latvian Biomedical Research and Study Centre (BMC), representing the national node of Latvia in the Biobanking and BioMolecular resources Research Infrastructure – European Research Infrastructure Consortium (BBMRI-ERIC).

    Conclusions: The development of the LGDB has enabled resources for biomedical research and promoted genetic testing in Latvia. Further challenges of the LGDB are the enrichment and harmonization of collected biosamples and data, the follow-up of selected participant groups, and continued networking and participation in collaboration projects.

  • ローレンソン マシュウ, 大塚 彰, 二宮 正士
    農業気象
    2002 Volume 58 Issue 1 Pages 1-9
    Published: 2002/03/10
    Released on J-STAGE: 2010/02/25
    JOURNAL FREE ACCESS
    The mediation software MetBroker uses a mediation mechanism to give agricultural models, such as crop models, access to a variety of weather databases. This paper discusses three approaches to developing software that uses MetBroker (Java applets, data-bridging Java applications, and Java servlets) and introduces applications built with each approach as references for software development. Except for the Java servlet approach, the methods require the Java 2 runtime environment on the client computer. Applets have the advantage that the software is easy to update, but the drawbacks of requiring a large amount of memory and a long startup time. A Java application running on the client computer can retrieve weather data through MetBroker and write it to a file; this data-bridging function enables agricultural models written in languages other than Java, such as FORTRAN and BASIC, to use the weather data. An agricultural model implemented as a Java servlet exchanges its input and output over the HTTP protocol, so such a model can be used from simple browsers, including Internet-capable mobile phones. A fourth approach, creating ActiveX controls from JavaBeans, is currently being pursued; such controls will allow developers working in visual development environments other than Java, such as Visual Basic, to use MetBroker.
  • Tao Li
    Journal of Advanced Computational Intelligence and Intelligent Informatics
    2019 Volume 23 Issue 4 Pages 775-781
    Published: 2019/07/20
    Released on J-STAGE: 2019/07/20
    JOURNAL OPEN ACCESS

    The enrollment work of higher vocational colleges is an important part of a school’s strategic decision-making. Developing a reasonable enrollment plan is highly important for a school’s development. Previous enrollment information contains extensive valuable information, which should be used by adopting effective methods of data processing. This study used an improved Apriori algorithm to mine the association rules of enrollment information to obtain the factors that affect enrollment. A higher vocational college in Qingdao was taken as the object of study. Three attributes were selected for association rule mining: college entrance exam results, applied majors, and student background. It was found that student registration rates were significantly different under different rules. The data mining results can provide policy support for future enrollment plans.
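    The paper's improved Apriori variant is not detailed in this abstract. As a baseline illustration of the association-rule mining it builds on, the following Python sketch uses the mlxtend library on a few invented one-hot applicant records (exam-score band, applied major, background, and registration outcome).

```python
# Baseline Apriori association-rule mining (illustrative data; the paper uses
# an improved Apriori variant on real enrollment records).
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# Each row is one applicant; columns are one-hot attributes.
records = pd.DataFrame(
    [
        {"score_high": 1, "score_low": 0, "major_IT": 1, "rural": 1, "registered": 1},
        {"score_high": 0, "score_low": 1, "major_IT": 0, "rural": 0, "registered": 0},
        {"score_high": 1, "score_low": 0, "major_IT": 1, "rural": 0, "registered": 1},
        {"score_high": 0, "score_low": 1, "major_IT": 1, "rural": 1, "registered": 1},
        {"score_high": 0, "score_low": 1, "major_IT": 0, "rural": 1, "registered": 0},
    ],
    dtype=bool,
)

frequent = apriori(records, min_support=0.4, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.6)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```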

  • Yutaro BESSHO, Yuto HAYAMIZU, Kazuo GODA, Masaru KITSUREGAWA
    IEICE Transactions on Information and Systems
    2022 Volume E105.D Issue 5 Pages 909-919
    Published: 2022/05/01
    Released on J-STAGE: 2022/05/01
    JOURNAL FREE ACCESS

    Parallel processing is a typical approach to answering analytical queries on a large database. As the size of the database increases, we often try to increase the parallelism by incorporating more processing nodes. However, this approach also increases the possibility of node failure. According to conventional practice, if a failure occurs during query processing, the database system restarts the query processing from the beginning. Such a temporal cost may be unacceptable to the user. This paper proposes a fault-tolerant query processing mechanism, named PhoeniQ, for analytical parallel database systems. PhoeniQ continuously takes a checkpoint for every operator pipeline and replicates the output of each stateful operator among different processing nodes. If a single processing node fails during query processing, another can promptly take over the processing. Hence, PhoeniQ allows the database system to efficiently resume query processing after a partial failure event. This paper presents the key design of PhoeniQ and prototype-based experiments demonstrating that PhoeniQ imposes negligible performance overhead and efficiently continues query processing in the face of node failure.
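    PhoeniQ's actual checkpointing and replication protocol is given in the paper; the following Python fragment is only a conceptual sketch of the general idea of periodically checkpointing a stateful operator so that processing can resume from the last checkpoint instead of restarting from the beginning.

```python
# Conceptual sketch of operator-level checkpointing for partial restart
# (illustrative only; not the PhoeniQ implementation).
import pickle

CHECKPOINT_EVERY = 10_000

def aggregate_with_checkpoints(rows, checkpoint_path="agg.ckpt"):
    """Sum values per key, checkpointing state so a failure can resume mid-stream."""
    try:
        with open(checkpoint_path, "rb") as f:
            state, processed = pickle.load(f)       # resume after a failure
    except FileNotFoundError:
        state, processed = {}, 0                    # fresh start

    for i, (key, value) in enumerate(rows):
        if i < processed:
            continue                                # skip rows already reflected in state
        state[key] = state.get(key, 0) + value
        processed += 1
        if processed % CHECKPOINT_EVERY == 0:
            with open(checkpoint_path, "wb") as f:
                pickle.dump((state, processed), f)  # durable checkpoint

    return state
```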

  • Dafang Zhuang, Wen Yuan, Jiyuan Liu, Dongsheng Qiu, Tao Ming
    Data Science Journal
    2007 Volume 6 Pages S770-S778
    Published: 2007
    Released on J-STAGE: 2007/10/26
    JOURNAL FREE ACCESS
    The data sharing system for resource and environment science databases of the Chinese Academy of Sciences (CAS) has an open three-tiered architecture that integrates the geographical databases of about 9 CAS institutes through mechanisms for distributed unstructured data management, metadata integration, catalogue services, and security control. The data tier consists of several distributed data servers that are located in each CAS institute and support unstructured data formats such as vector files, remote sensing images and other raster files, documents, multimedia files, tables, and other file formats. For spatial data files, a format transformation service is provided. The middle tier involves a centralized metadata server, which stores metadata records for the data on all data servers. The primary function of this tier is the catalogue service, supporting the creation, search, browsing, updating, and deletion of catalogues. The client tier involves an integrated client that provides end users with interfaces to search, browse, and download data, or to create a catalogue and upload data.
  • Lili Meng, Jinlong Wen, Ran Liu, Hongyang Li, Zhi Zheng, Jinxiang Liu, Mingliang Zhi
    ISIJ International
    2024 Volume 64 Issue 14 Pages 1976-1987
    Published: 2024/12/15
    Released on J-STAGE: 2024/12/15
    Advance online publication: 2024/10/23
    JOURNAL OPEN ACCESS HTML

    As a critical parameter in blast furnace production, the coal injection rate is not only related to the stability of the furnace condition but is also a vital index for evaluating production economy. In most blast furnaces, this parameter is determined by the operator's experience. This paper establishes a coal injection rate prediction model based on CatBoost (a categorical gradient boosting algorithm), which can give operators a better basis for controlling the parameter. First, the collected steel production data were processed, the previous-time operational parameters that had the greatest impact on the coal injection rate were selected as the inputs of the model, and the current-time coal injection rate was used as the single output of the model. Next, the CatBoost model was adopted, and the Optuna optimization algorithm, based on the Bayesian principle, was used to optimize it (BO-Catboost), enhancing the model's capability and avoiding over-fitting. Then, the performance of the CatBoost model under different optimization algorithms was compared, and the prediction results of the BO-Catboost model were compared with those of the ordinary CatBoost, BO-Random Forest, and BO-XGBoost (Extreme Gradient Boosting) models. The results show that the BO-Catboost model outperforms the other models. Finally, a blast furnace coal injection monitoring system based on Web technology was established, which can display the coal injection prediction information on a display board; tests show that it provides useful guidance for controlling the coal injection rate.
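    The paper's data and hyperparameter search space are not reproduced in this abstract. As a generic sketch of Bayesian (Optuna) tuning of a CatBoost regressor of the kind described, assuming a feature matrix X and coal-injection-rate target y prepared elsewhere:

```python
# Generic Optuna-tuned CatBoost regression sketch (search space and data handling
# are illustrative; X and y are assumed to be prepared elsewhere).
import optuna
from catboost import CatBoostRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

X_train, X_valid, y_train, y_valid = train_test_split(X, y, test_size=0.2, random_state=0)

def objective(trial):
    params = {
        "depth": trial.suggest_int("depth", 4, 10),
        "learning_rate": trial.suggest_float("learning_rate", 0.01, 0.3, log=True),
        "l2_leaf_reg": trial.suggest_float("l2_leaf_reg", 1.0, 10.0),
        "iterations": trial.suggest_int("iterations", 200, 1000),
    }
    model = CatBoostRegressor(**params, verbose=0, random_seed=0)
    model.fit(X_train, y_train, eval_set=(X_valid, y_valid), early_stopping_rounds=50)
    pred = model.predict(X_valid)
    return mean_squared_error(y_valid, pred)   # minimize validation error

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=50)
print(study.best_params)
```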

  • Nobuyoshi Yabuki, Kincho H. Law
    土木情報システム論文集
    2000 Volume 9 Pages 161-168
    Published: 2000/10/23
    Released on J-STAGE: 2010/06/04
    JOURNAL FREE ACCESS
    A formal and theoretical model for representing design standards is required in order to automate design and to develop design software efficiently. We developed a Hyper-Object-Logic Model, which integrates an Object-Logic Model, in which the object-oriented and logic programming paradigms are unified, with a Hyper Document Model for documenting design standards and their related information. Using this model, one can systematically develop design software, linking provisions of design codes to their programs. The model can be used for conformance checking of designs and for automated design generation. This paper also discusses a prototype system and future research directions.
  • Chunlu Liu, Sung Kin Pun, Yoshito Itoh
    建設マネジメント研究論文集
    2004 Volume 11 Pages 385-396
    Published: 2004/12/07
    Released on J-STAGE: 2010/06/04
    JOURNAL FREE ACCESS
    The demolition of buildings produces enormous amounts of waste materials that are rarely reused or recycled and therefore result in significant waste streams to landfills. This research aims to develop a Web-based information system for promoting management methodologies for demolition projects. Instead of the waste-material exchange found in current online waste management systems, this research explores the possibility of exchanging demolition projects themselves, so that the utilization of demolished materials can be ascertained before demolition actually takes place. With reference to the needs and difficulties of online drawing acquisition in the architecture and building disciplines, an online multimedia data acquisition tool is developed to collect the drawing data of a project to be demolished. Following an introduction to the development of a Web-based distributed database system, a prototype information system is demonstrated in detail through its system environment, structure, and functions.
  • Frank TOUSSAINT, Michael LAUTENSCHLAGER, Hans LUTHARDT
    気象集誌. 第2輯
    2007 Volume 85A Pages 475-485
    Published: 2007
    Released on J-STAGE: 2007/03/30
    JOURNAL FREE ACCESS
    The World Data Center for Climate (WDC-Climate) is hosted at the Max Planck Institute for Meteorology (MPI-M) in Hamburg, Germany. WDC-Climate stores global and regional model output for the CEOP project, and raw model output is available in a number of different data structures. This model output is currently being restructured into a more homogeneous form. This paper provides an overview of the CEOP model output and its overall data structure at WDC-Climate, as well as the different ways that this output can be accessed.