Efficient Transition Adjacency Relation Computation for Process Model Similarity

Many activities in business process management, such as process retrieval, process mining, and process integration, need to determine the similarity between business processes. Along with many other relational behavioral semantics, the Transition Adjacency Relation (abbr. TAR) has been proposed as a kind of behavioral gene of process models and a useful perspective for measuring process similarity. In this article we explain why it remains relevant and necessary to improve the efficiency of TAR and pTAR (i.e., projected TAR) computation, and we put forward a novel approach to TAR computation based on Petri net unfolding. This approach not only improves the efficiency of TAR computation but also enables the long-anticipated combined use of TAR and Behavior Profiles (abbr. BP) in process model similarity estimation.

© 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
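For orientation, the sketch below illustrates the basic TAR notion referred to in the abstract: a pair of transitions (a, b) belongs to the TAR set when a can be directly followed by b in some execution, and two models can be compared via the Jaccard coefficient of their TAR sets. This is a minimal, trace-based illustration only; the activity names and trace enumeration are hypothetical, and the paper's contribution is an unfolding-based computation over the Petri net itself, which this sketch does not implement.

```python
def tar_set(traces):
    """Collect the Transition Adjacency Relation from a set of traces:
    (a, b) is in the TAR set iff a is directly followed by b in some trace."""
    pairs = set()
    for trace in traces:
        pairs.update(zip(trace, trace[1:]))
    return pairs


def tar_similarity(traces_a, traces_b):
    """Jaccard coefficient of the two TAR sets, as in the classic
    trace-based TAR similarity measure."""
    ta, tb = tar_set(traces_a), tar_set(traces_b)
    if not ta and not tb:
        return 1.0
    return len(ta & tb) / len(ta | tb)


# Hypothetical example: two small process variants given by their traces.
m1 = [("register", "check", "approve"), ("register", "check", "reject")]
m2 = [("register", "approve"), ("register", "check", "reject")]
print(tar_similarity(m1, m2))  # shared adjacency pairs / all adjacency pairs
```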

Metadata

Work Title: Efficient Transition Adjacency Relation Computation for Process Model Similarity
Access: Open Access
Creators
  1. Jisheng Pei
  2. Lijie Wen
  3. Xiaojun Ye
  4. Akhil Kumar
License: In Copyright (Rights Reserved)
Work Type: Article
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Publication Date: 2020
Publisher Identifier (DOI): 10.1109/tsc.2020.2984605
Source: IEEE Transactions on Services Computing
Deposited: February 23, 2022
