JSAI SIG Technical Reports: Special Interest Group on Fundamental Problems of Artificial Intelligence (SIG-FPAI)
Online ISSN : 2436-4584
97th meeting (2015/3)
Displaying 1-18 of 18 articles from the selected issue
  • 福田 翔士, 岩沼 宏治, 山本 泰生
    Article type: SIG paper
    p. 01-
    Published: 2015/03/18
    Released: 2021/07/01
    Conference proceedings: Free access

    In order to avoid combinatorial explosion in transaction stream processing, we propose a new approximation algorithm that exploits features of closed itemsets. The new algorithm, LC-CloStream, is an online frequent closed itemset mining algorithm obtained by combining the CloStream and Lossy Counting algorithms. We give some theoretical analysis and also show several results of its experimental evaluation.
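
    Lossy Counting is a well-known stream-summarization algorithm; as a minimal sketch of the pruning idea that LC-CloStream builds on, here is the single-item version (an illustration only, not the paper's closed-itemset algorithm):

```python
# Sketch of the classic Lossy Counting algorithm for single items.
# LC-CloStream combines this bucket-based pruning idea with closed-itemset
# maintenance; this simplified version tracks plain item frequencies.
class LossyCounter:
    def __init__(self, epsilon):
        self.epsilon = epsilon          # allowed frequency error
        self.width = int(1 / epsilon)   # bucket width
        self.n = 0                      # items seen so far
        self.counts = {}                # item -> (count, max_error)

    def add(self, item):
        self.n += 1
        bucket = (self.n - 1) // self.width + 1      # current bucket id
        if item in self.counts:
            count, err = self.counts[item]
            self.counts[item] = (count + 1, err)
        else:
            self.counts[item] = (1, bucket - 1)
        if self.n % self.width == 0:                 # bucket boundary: prune
            self.counts = {k: (c, e) for k, (c, e) in self.counts.items()
                           if c + e > bucket}

    def frequent(self, support):
        # items whose true frequency may reach support * n
        return [k for k, (c, e) in self.counts.items()
                if c >= (support - self.epsilon) * self.n]
```

The summary uses O(1/epsilon · log(epsilon · n)) counters in the worst case while guaranteeing no false negatives, which is what makes the combination with closed-itemset mining attractive for streams.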

  • 黒岩 健歩, 岩沼 宏治, 山本 泰生
    Article type: SIG paper
    p. 02-
    Published: 2015/03/18
    Released: 2021/07/01
    Conference proceedings: Free access

    Affiliations: 1. Computer Science and Media Engineering, Interdisciplinary Graduate School of Medicine and Engineering, University of Yamanashi; 2. Interdisciplinary Graduate School of Medicine and Engineering, University of Yamanashi

  • 狩山 和亮, Marco Cuturi, 山本 章博, 久保山 哲二, 福元 健太郎
    Article type: SIG paper
    p. 03-
    Published: 2015/03/18
    Released: 2021/07/01
    Conference proceedings: Free access

    In this research, we propose a new biclustering method for extracting communities from binary matrices that represent a binary relation. A binary relation can be represented as a bipartite graph or a binary matrix. Many effective clustering methods for extracting communities from graphs and matrices have been proposed. In this paper, the target data are bid data, which record the participation of companies in bids. A community in bid data is a set of companies that often participated together in multiple bids. We aim to apply community extraction to finding bid-rigging groups. To achieve this goal, we propose a biclustering method based on the density of bipartite graphs and on feature extraction by nonnegative matrix factorization.
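
    The density measure that a density-based biclustering of a binary company-by-bid matrix would optimize can be sketched as follows (an illustration of the objective, not the paper's algorithm):

```python
# Density of a candidate bicluster (rows = companies, cols = bids) in a
# binary participation matrix: the fraction of 1-entries in the submatrix.
# A dense bicluster corresponds to a group of companies that tend to
# participate in the same bids.
def bicluster_density(matrix, rows, cols):
    if not rows or not cols:
        return 0.0
    ones = sum(matrix[r][c] for r in rows for c in cols)
    return ones / (len(rows) * len(cols))

# Toy participation matrix: companies 0 and 1 co-participate in bids 0, 1.
M = [[1, 1, 0],
     [1, 1, 0],
     [0, 0, 1]]
```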

  • 吉川 和, 平井 広志, 牧野 和久
    Article type: SIG paper
    p. 04-
    Published: 2015/03/18
    Released: 2021/07/01
    Conference proceedings: Free access

    We study representations of an antimatroid by Horn rules, motivated by a recent application to computer-aided educational systems. Since an antimatroid is a special union-closed family, it can be represented by Horn rules in a natural way. This representation, however, is inconvenient, since not every set of Horn rules corresponds to an antimatroid. In this paper, we introduce a useful representation of antimatroids by Horn rules, which associates every set R of Horn rules with an antimatroid A(R). Our representation is computationally implementable. We show that the following basic problems can be solved in linear time, as in the case of the natural representation. Membership problem: given a set R of Horn rules and a set X, is X a member of A(R)? Inference problem: given a set R of Horn rules and a Horn rule (A; q), does A(R) satisfy (A; q)? Our representation is essentially equivalent to the `circuit' representation of antimatroids by Korte and Lovász. We establish their relationship, and give a polynomial-time algorithm to construct the uniquely determined minimal circuit representation from a given set of Horn rules. We explain that our results have potential applications to computer-aided educational systems, where an antimatroid is used as a model of the space of possible knowledge states of learners and is constructed by posing Horn queries to a human expert.
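
    The linear-time flavour of such Horn-rule questions comes from forward chaining: a rule (A; q) fires once all of A is derived. The sketch below shows this standard closure computation (the mechanism only, not the authors' exact construction of A(R)):

```python
# Forward chaining for Horn rules. Each rule is (body, head), read as
# "if every element of body holds, then head holds". The closure of a
# fact set under the rules is computed by firing rules to a fixpoint.
def horn_closure(rules, facts):
    """rules: iterable of (frozenset body, head); returns the closure set."""
    closure = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if head not in closure and body <= closure:
                closure.add(head)
                changed = True
    return closure
```

With the usual counter-based bookkeeping (tracking, for each rule, how many body elements remain unsatisfied), this fixpoint is computable in time linear in the total rule size.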

  • ジェイ 泓杰, 大久保 好章, 原口 誠
    Article type: SIG paper
    p. 05-
    Published: 2015/03/18
    Released: 2021/07/01
    Conference proceedings: Free access

    In the problem of finding pseudo-cliques, namely k-plexes, it becomes harder to enumerate all of them as we allow a larger number of disconnections among their vertices. In particular, there often exists a huge number of possible solutions of medium or small size that cannot be regarded as dense vertex sets. To overcome this difficulty, the authors have proposed two algorithms, J-CoreMaxKPlex and MetaClique, which can exclusively enumerate the dense pseudo-cliques among them. The former takes into account a connectivity constraint based on the notion of j-core, and the latter a structural constraint based on the notion of meta-cliques. In this report, we empirically observe the performance of these algorithms on several graphs and show their characteristics, comparing them with an improved standard maximal k-plex enumerator.
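
    For reference, the k-plex definition underlying this enumeration problem is simple to state in code: every vertex of the set may miss at most k-1 neighbours inside the set, so a 1-plex is exactly a clique. A minimal sketch of the check:

```python
# Check whether vertex set S is a k-plex in the graph given by an
# adjacency map: every v in S must be adjacent to at least |S| - k
# other vertices of S (k = 1 recovers the clique condition).
def is_k_plex(adj, S, k):
    """adj: dict vertex -> set of neighbours; S: candidate vertex set."""
    return all(len(adj[v] & (S - {v})) >= len(S) - k for v in S)
```

The enumeration difficulty the abstract describes comes from how quickly this relaxed condition admits small sparse sets as k grows.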

  • 高畠 嘉将, 田部井 靖生, 坂本 比呂志
    Article type: SIG paper
    p. 06-
    Published: 2015/03/18
    Released: 2021/07/01
    Conference proceedings: Free access

    Edit distance with moves (EDM) is a string-to-string measure that includes substring moves in addition to the ordinary editing operations for turning one string into the other. EDM is applicable to error detection and to suggesting keywords for searches. Online ESP (OESP) is the first online pattern matching algorithm for EDM. OESP builds a parse tree for the whole input text in an online manner and quickly computes the EDM at each position. However, the EDM computation of OESP uses only the subtrees that generate a substring of the query length, so the parse tree for the whole text takes more space than the computation needs. We therefore present a more space-efficient variant of OESP, called improved OESP (OESP-I). OESP-I incrementally transforms the parse tree for the query-length string at position i into the parse tree for the query-length string at position i+1, and quickly computes the EDM. We show the time and space complexity of OESP-I. Additionally, we show the approximation ratio of the EDM computed by OESP-I.

  • 中島 健太, 前田 幸司, 高畠 嘉将, 坂本 比呂志
    Article type: SIG paper
    p. 07-
    Published: 2015/03/18
    Released: 2021/07/01
    Conference proceedings: Free access
  • 申 吉浩, Adrian Pino Angulo, 久保山 哲二
    Article type: SIG paper
    p. 08-
    Published: 2015/03/18
    Released: 2021/07/01
    Conference proceedings: Free access

    We present a method to define feature selection measures based on metrics. We also show that the well-known Bayes risk can be derived from a certain metric, which gives a new characterization of it.
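
    For context (an illustration, not the paper's construction): the Bayes risk of predicting a class from a discrete feature is the expected error of the best possible predictor, which can be estimated directly from a joint count table:

```python
# Empirical Bayes risk of predicting class y from discrete feature x:
#   risk = sum_x p(x) * (1 - max_y p(y | x)),
# i.e. the error rate of always predicting the majority class within
# each feature value. Estimated here from raw co-occurrence counts.
def bayes_risk(joint):
    """joint: dict feature_value -> dict class_label -> count."""
    total = sum(sum(ys.values()) for ys in joint.values())
    errors = sum(sum(ys.values()) - max(ys.values()) for ys in joint.values())
    return errors / total
```

A feature with risk 0 separates the classes perfectly, so lower risk indicates a more useful feature; metric-based measures such as those in the paper refine this kind of score.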

  • 戸田 貴久, 津田 宏治
    Article type: SIG paper
    p. 09-
    Published: 2015/03/18
    Released: 2021/07/01
    Conference proceedings: Free access

    We improve an existing OBDD-based method of computing all total satisfying assignments of a Boolean formula. To do this, we introduce a technique to reduce frequent cache operations in a systematic way. Furthermore, we introduce an alternative cutset caching that uses information provided by unit propagation. We implement our methods on top of a modern SAT solver, and evaluate their efficiency in experiments.
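
    As a reference point for the task being accelerated, here is a brute-force enumeration of all total satisfying assignments of a CNF formula (the paper's contribution is doing this efficiently via OBDDs and caching on top of a SAT solver; this baseline only fixes the semantics):

```python
from itertools import product

# Enumerate all *total* satisfying assignments of a CNF formula.
# Clauses use DIMACS-style literals: positive int i means variable i,
# negative int -i means its negation.
def all_models(n_vars, clauses):
    models = []
    for bits in product([False, True], repeat=n_vars):
        assign = {i + 1: b for i, b in enumerate(bits)}
        if all(any(assign[abs(l)] == (l > 0) for l in c) for c in clauses):
            models.append(assign)
    return models
```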

  • Aolong Zha, Ryuzo Hasegawa
    Article type: SIG paper
    p. 10-
    Published: 2015/03/18
    Released: 2021/07/01
    Conference proceedings: Free access

    SATzilla2012, an automated approach for constructing per-instance algorithm portfolios for SAT that uses cost-sensitive classification models to choose among its constituent solvers, achieved an excellent performance in SAT Challenge 2012. In this paper, we present the parallel portfolio SATzilla2012 (PPSz2012), which parallelizes the phases of the sequential portfolio SATzilla2012 (SPSz2012) by assigning processes to, and running, the component solvers whose performance was predicted by a decision forest (DF) and which received the top two highest numbers of votes.

  • 鈴木 涼介, 橋本 健二, 酒井 正彦
    Article type: SIG paper
    p. 11-
    Published: 2015/03/18
    Released: 2021/07/01
    Conference proceedings: Free access

    We extend a model-counting solver for projected models of a propositional CNF formula. A projected model is a projection of a model of a formula on a given set of variables. We give an extension of a DPLL-based model-counting algorithm for projected models, and extend the model-counting solver sharpSAT to count the exact number of projected models. We experimentally show that the extended solver achieves better performance than existing solvers for projected models.
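
    The semantics of projected model counting can be pinned down with a brute-force reference (what the extended sharpSAT computes efficiently): count the distinct projections of models onto the given variable set, not the models themselves.

```python
from itertools import product

# Brute-force projected model counting: the number of *distinct*
# restrictions of satisfying assignments to the projection variables.
# Clauses use DIMACS-style literals (positive/negative ints).
def count_projected_models(n_vars, clauses, proj_vars):
    projections = set()
    for bits in product([False, True], repeat=n_vars):
        assign = {i + 1: b for i, b in enumerate(bits)}
        if all(any(assign[abs(l)] == (l > 0) for l in c) for c in clauses):
            projections.add(tuple(assign[v] for v in sorted(proj_vars)))
    return len(projections)
```

Note that the projected count can be strictly smaller than the model count: two models that agree on the projection variables contribute only one projected model.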

  • 宋 剛秀, 佐古田 淳史, 番原 睦則, 田村 直之
    Article type: SIG paper
    p. 12-
    Published: 2015/03/18
    Released: 2021/07/01
    Conference proceedings: Free access

    In this paper, we propose a hybrid encoding of Constraint Satisfaction Problems (CSPs) into propositional logic (especially SAT and PB). Several encodings of CSPs into propositional logic have been proposed, such as the direct encoding, the order encoding, and the binary encoding. However, each of these encodings has its pros and cons. For example, the order encoding, which is used in the SAT-based constraint solver Sugar, shows good performance on many problem instances, but it cannot be applied to CSP instances with very large domain sizes. On the other hand, the binary encoding is applicable to such large instances, but the generated SAT instances generally take more time to solve than the order-encoded instances. The hybrid encoding proposed in this paper mixes the order and binary encodings so that it can be applied to very large instances while keeping the good performance of the order encoding on small instances. We evaluated the performance of the proposed hybrid encoding with our implementation, named Hysca. In our evaluation using 1453 instances from the 2009 CSP solver competition, Hysca succeeded in solving 1075 instances, which is 56 more than the order encoding and 53 more than the binary encoding.
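
    To make the trade-off concrete, here is a minimal sketch of the order encoding for one integer variable (variable numbering is illustrative, not Sugar's implementation): it needs one Boolean per domain value, which is exactly what becomes infeasible for very large domains and motivates mixing in the binary encoding.

```python
# Order encoding for an integer variable x with domain {0, ..., d}:
# Boolean p_i means (x <= i) for i = 0..d-1, with axiom clauses
# p_i -> p_{i+1}, written in DIMACS style as [-p_i, p_{i+1}].
def order_encode_var(d, first_var=1):
    """Returns (variables p_0..p_{d-1}, list of axiom clauses)."""
    p = list(range(first_var, first_var + d))   # p[i] <-> (x <= i)
    clauses = [[-p[i], p[i + 1]] for i in range(d - 1)]
    return p, clauses

# A comparison x <= c then becomes the single unit clause [p[c]],
# and x >= c + 1 the unit clause [-p[c]]; in the binary encoding, by
# contrast, x is represented by only ceil(log2(d + 1)) Booleans.
```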

  • 上村 直輝, 越村 三幸, 長谷川 隆三
    Article type: SIG paper
    p. 13-
    Published: 2015/03/18
    Released: 2021/07/01
    Conference proceedings: Free access

    SATELITE is a preprocessor for SAT solvers. It eliminates variables and clauses, decreasing the runtime of the SAT solver. We use it in the MaxSAT solver QMaxSAT, which uses a normal SAT solver as an inference engine together with CNF encodings of Boolean cardinality constraints. We compare QMaxSAT with and without SATELITE by solving MaxSAT instances taken from the MaxSAT Evaluation 2014 while changing the underlying SAT solver. In this comparison, we use minisat2.0, minisat2.2.0, and glucose3.0 as the inference engine, and the CNF encoding ``auto'', which selects an appropriate CNF encoding from three encodings.
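
    The main simplification behind SATELITE-style preprocessing is variable elimination by resolution. A minimal sketch of the idea (illustrative only, not SATELITE's engineered implementation):

```python
# Eliminate variable v from a clause set by replacing every clause
# containing v or -v with the non-tautological resolvents on v, but only
# when this does not grow the clause set (the SATELITE-style heuristic).
def eliminate_var(clauses, v):
    pos = [c for c in clauses if v in c]
    neg = [c for c in clauses if -v in c]
    rest = [c for c in clauses if v not in c and -v not in c]
    resolvents = []
    for cp in pos:
        for cn in neg:
            r = sorted(set(cp) - {v} | set(cn) - {-v})
            if not any(-l in r for l in r):   # drop tautologies
                resolvents.append(r)
    if len(rest) + len(resolvents) <= len(clauses):
        return rest + resolvents              # elimination pays off
    return clauses                            # keep v, not worth it
```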

  • 早田 翔, 長谷川 隆三
    Article type: SIG paper
    p. 14-
    Published: 2015/03/18
    Released: 2021/07/01
    Conference proceedings: Free access

    Weighted Partial MaxSAT (WPMS) is a generalization of the satisfiability problem. Many optimization problems can be reduced to WPMS in polynomial time, so it is important to develop MaxSAT solvers. Cardinality constraints play an important role in solving MaxSAT. In this paper, we propose Weighted Totalizer (WTO) and Partial Encoding (PE). WTO is based on the Totalizer (TO) and uses fewer variables and clauses than TO. PE is a new encoding method, which is optimized for particular WPMS problems. Our experimental results show the effectiveness of these methods.
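
    To see why compact cardinality encodings such as Totalizer and WTO matter, compare with the naive "binomial" encoding of x_1 + ... + x_n <= k (a contrast example, not the paper's encoding): it needs no auxiliary variables but an exponential number of clauses.

```python
from itertools import combinations

# Naive binomial encoding of the cardinality constraint "at most k of
# the literals are true": forbid every (k+1)-subset from being all true.
# It produces C(n, k+1) clauses, which blows up quickly; tree-based
# encodings like the Totalizer trade auxiliary variables for far fewer
# clauses.
def at_most_k(lits, k):
    return [[-l for l in subset] for subset in combinations(lits, k + 1)]
```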

  • 力 規晃, 越村 三幸, 藤田 博, 長谷川 隆三
    Article type: SIG paper
    p. 15-
    Published: 2015/03/18
    Released: 2021/07/01
    Conference proceedings: Free access

    Inductive Logic Programming is a method of inductive learning based on predicate logic. MaxSAT is an optimization version of SAT, which consists in finding an assignment that maximizes the number of satisfied clauses. In this work, we propose a method that converts an inductive logic programming problem into a MaxSAT problem and then solves the MaxSAT problem using a MaxSAT solver.
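
    The target semantics of the reduction can be stated as a brute-force reference solver for (weighted partial) MaxSAT, the problem the converted instances are handed to (a specification sketch, not a practical solver):

```python
from itertools import product

# Reference semantics of weighted partial MaxSAT: satisfy all hard
# clauses and maximize the total weight of satisfied soft clauses.
# Clauses use DIMACS-style literals; returns None if the hard part
# is unsatisfiable.
def max_sat(n_vars, hard, soft):
    """soft: list of (clause, weight) pairs; returns the best weight."""
    best = None
    for bits in product([False, True], repeat=n_vars):
        assign = {i + 1: b for i, b in enumerate(bits)}
        sat = lambda c: any(assign[abs(l)] == (l > 0) for l in c)
        if all(sat(c) for c in hard):
            w = sum(wt for c, wt in soft if sat(c))
            best = w if best is None else max(best, w)
    return best
```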

  • 杉本 拓也, 鍋島 英知
    Article type: SIG paper
    p. 16-
    Published: 2015/03/18
    Released: 2021/07/01
    Conference proceedings: Free access

    In this paper, we propose two dynamic simplification techniques for CDCL (conflict-driven clause learning) solvers. First, we propose an extension of the dynamic subsumption algorithm proposed by Hamadi et al. The clause learning mechanism in CDCL solvers can be formalized as a resolution process. Dynamic subsumption efficiently checks whether a resolvent subsumes its parent clause; the subsumed clause can then be removed. In this study, we extend the subsumption check to ancestor clauses. Second, we propose a method for extracting implicit binary clauses from the resolution process, and show a lightweight subsumption check that uses these binary clauses. Because the clauses involved are binary, the cost of each check is low while the effect is high. The experimental results show that our techniques can improve the performance of a CDCL solver.
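
    The core test behind all of these simplifications is clause subsumption, which is one line to state (a definitional sketch, not the solvers' optimized occurrence-list implementation):

```python
# Clause C subsumes clause D iff C's literals are a subset of D's:
# every assignment satisfying C then satisfies D, so D can be removed.
# Literals are DIMACS-style ints.
def subsumes(c, d):
    return set(c) <= set(d)

# Dynamic subsumption applies this between a resolvent and its parent:
# resolving [1, 2] with [-1, 2, 3] on variable 1 yields [2, 3], which
# subsumes the parent [-1, 2, 3] and allows it to be strengthened.
```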

  • 横前 菜々子, 鍋島 英知
    Article type: SIG paper
    p. 17-
    Published: 2015/03/18
    Released: 2021/07/01
    Conference proceedings: Free access

    In this paper, we propose a new clause management strategy for CDCL solvers based on the depth of learnt clauses. A CDCL solver derives many learnt clauses from conflicts that occur in the search process. These learnt clauses are useful for preventing similar conflicts, but they are periodically reduced to avoid memory overflow and a decrease in the speed of unit propagation. Hence, an evaluation criterion for learnt clauses is important for CDCL solvers. In this study, we propose a new evaluation criterion based on the depth of learnt clauses. Each learnt clause has a depth in the derivation tree. Our approach keeps (1) learnt clauses that may become bottlenecks, that is, clauses with no other clauses at the same depth, and (2) learnt clauses at the deepest part of the tree. The experimental results show that our criterion can help to identify useful learnt clauses compared with existing criteria used in CDCL solvers.

  • 渡辺 大樹, 鍋島 英知
    Article type: SIG paper
    p. 18-
    Published: 2015/03/18
    Released: 2021/07/01
    Conference proceedings: Free access

    In this paper, we propose a high-level unsatisfiable core extraction technique for SAT translation approaches, in which a given instance is encoded into a propositional formula and the formula is then solved by a SAT solver. If a model satisfying the formula is found, it is decoded into a solution of the instance. If the formula is unsatisfiable, this generally means that the instance has no solution. In the latter case, it is often required to find the cause of the unsuccessful result. To extract a high-level unsatisfiable core of the instance, the propositional UNSAT core should be decoded into the format of the instance, but developing the decoding mechanism is not easy when the translation approach consists of multiple layers. We present a high-level unsatisfiable core extraction technique based on the model-decoding process that SAT translation approaches are already equipped with.
