
Journal articles on the topic 'Reduced ordered binary decision diagram'


Consult the top 50 journal articles for your research on the topic 'Reduced ordered binary decision diagram.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Dvořák, Václav. "Bounds on Size of Decision Diagrams." JUCS - Journal of Universal Computer Science 3, no. 1 (1997): 2–22. https://doi.org/10.3217/jucs-003-01-0002.

Abstract:
Known upper bounds on the number of required nodes (size) in the ordered binary and multiple-valued decision diagram (DD) for representation of logic functions are reviewed and reduced by a small constant factor. New upper bounds are derived for partial logic functions containing don't cares and also for complete Boolean functions specified by Boolean expressions. The evaluation of upper bounds is based on a bottom-up algorithm for constructing efficient ordered DDs developed by the author.
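For orientation (this is the textbook level-by-level counting bound, not the sharpened constants derived in the article), bounds of this kind refine the following estimate for an n-variable Boolean function under a fixed order x_1 < ... < x_n:

% Level i holds at most min(paths arriving from above, subfunctions of
% x_i..x_n that essentially depend on x_i) nodes; the leading 2 counts
% the terminal nodes.
\[
  |\mathrm{ROBDD}(f)| \;\le\; 2 \;+\; \sum_{i=1}^{n} \min\!\left(2^{\,i-1},\; 2^{2^{\,n-i+1}} - 2^{2^{\,n-i}}\right).
\]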
2

Lai, Yong, Dayou Liu, and Shengsheng Wang. "Reduced ordered binary decision diagram with implied literals: a new knowledge compilation approach." Knowledge and Information Systems 35, no. 3 (2012): 665–712. http://dx.doi.org/10.1007/s10115-012-0525-6.

3

Prihozhy, Anatoly A. "Synthesis of quantum circuits based on incompletely specified functions and if-decision diagrams." Journal of the Belarusian State University. Mathematics and Informatics, no. 3 (December 14, 2021): 84–97. http://dx.doi.org/10.33581/2520-6508-2021-3-84-97.

Abstract:
The problem of synthesis and optimisation of logical reversible and quantum circuits from functional descriptions represented as decision diagrams is considered. It is one of the key problems being solved with the aim of creating quantum computing technology and quantum computers. A new method of stepwise transformation of the initial functional specification to a quantum circuit is proposed, which provides for the following project states: reduced ordered binary decision diagram, if-decision diagram, functional if-decision diagram, reversible circuit and quantum circuit. The novelty of the method consists in extending the Shannon and Davio expansions of a Boolean function on a single variable to the expansions of the same Boolean function on another function with obtaining decomposition products that are represented by incompletely defined Boolean functions. Uncertainty in the decomposition products gives remarkable opportunities for minimising the graph representation of the specified function. Instead of two outgoing branches of the binary diagram vertex, three outgoing branches of the if-diagram vertex are generated, which increase the level of parallelism in reversible and quantum circuits. For each transformation step, appropriate mapping rules are proposed that reduce the number of lines, gates and the depth of the reversible and quantum circuit. The comparison of new results with the results given by the known method of mapping the vertices of binary decision diagram into cascades of reversible and quantum gates shows a significant improvement in the quality of quantum circuits that are synthesised by the proposed method.
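For reference, the classical single-variable expansions that the article generalises (to expansions of a Boolean function on another function) are, with f_0 = f|_{x=0} and f_1 = f|_{x=1}:

% Shannon and Davio expansions of f on a single variable x
\begin{align*}
  \text{Shannon:}        \quad & f = \bar{x}\,f_0 \lor x\,f_1 \\
  \text{positive Davio:} \quad & f = f_0 \oplus x\,(f_0 \oplus f_1) \\
  \text{negative Davio:} \quad & f = f_1 \oplus \bar{x}\,(f_0 \oplus f_1)
\end{align*}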
4

Dong, Rongsheng, Yangyang Zhu, Zhoubo Xu, and Fengying Li. "Decision Diagram Based Symbolic Algorithm for Evaluating the Reliability of a Multistate Flow Network." Mathematical Problems in Engineering 2016 (2016): 1–13. http://dx.doi.org/10.1155/2016/6908120.

Abstract:
Evaluating the reliability of Multistate Flow Network (MFN) is an NP-hard problem. Ordered binary decision diagram (OBDD) or variants thereof, such as multivalued decision diagram (MDD), are compact and efficient data structures suitable for dealing with large-scale problems. Two symbolic algorithms for evaluating the reliability of MFN, MFN_OBDD and MFN_MDD, are proposed in this paper. In the algorithms, several operating functions are defined to prune the generated decision diagrams. Thereby the state space of capacity combinations is further compressed and the operational complexity of the decision diagrams is further reduced. Meanwhile, the related theoretical proofs and complexity analysis are carried out. Experimental results show the following: (1) compared to the existing decomposition algorithm, the proposed algorithms take less memory space and fewer loops. (2) The number of nodes and the number of variables of MDD generated in MFN_MDD algorithm are much smaller than those of OBDD built in the MFN_OBDD algorithm. (3) In two cases with the same number of arcs, the proposed algorithms are more suitable for calculating the reliability of sparse networks.
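As background for this and the other reliability-oriented entries below, OBDD-based evaluation ultimately rests on a simple recursion over the diagram. The sketch below is a generic illustration with a hypothetical node layout, not the paper's MFN_OBDD/MFN_MDD algorithms, and it shows the idea for binary-state components only:

from functools import lru_cache

# A node is the terminal True/False or a tuple (var, low, high);
# p[var] is the probability that component `var` works.
def bdd_reliability(root, p):
    @lru_cache(maxsize=None)
    def walk(node):
        if node is True:
            return 1.0                      # system operates on this path
        if node is False:
            return 0.0                      # system fails on this path
        var, low, high = node
        return p[var] * walk(high) + (1.0 - p[var]) * walk(low)
    return walk(root)

# Example: a series system of components 0 and 1 (both must work).
series = (0, False, (1, False, True))
print(bdd_reliability(series, {0: 0.9, 1: 0.8}))   # 0.72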
5

Das, Apangshu, Akash Debnath, and Sambhu Nath Pradhan. "Reduced ordered binary decision diagram-based combinational circuit synthesis for optimising area, power and temperature." International Journal of Nanoparticles 11, no. 2 (2019): 94. http://dx.doi.org/10.1504/ijnp.2019.099181.

6

Pradhan, Sambhu Nath, Akash Debnath, and Apangshu Das. "Reduced ordered binary decision diagram-based combinational circuit synthesis for optimising area, power and temperature." International Journal of Nanoparticles 11, no. 2 (2019): 94. http://dx.doi.org/10.1504/ijnp.2019.10020325.

7

Radmanovic, Milos. "A study of binary decision diagram characteristics of bent Boolean functions." Facta universitatis - series: Electronics and Energetics 36, no. 2 (2023): 285–98. http://dx.doi.org/10.2298/fuee2302285r.

Abstract:
Bent Boolean functions exist only for an even number of variables; moreover, they are unbalanced. They are therefore used in coding theory and in many areas of computer science. The general form of bent functions is still unknown. One way of representing Boolean functions is with a reduced ordered binary decision diagram (ROBDD). The strength of ROBDDs is that they can represent Boolean function data with a high level of redundancy in a compact form, as long as the data is encoded in such a way that the redundancy is exposed. This paper investigates characteristics of bent functions with a focus on their ROBDD parameters. A decision diagram experimental framework has been used to implement a program that calculates the ROBDD parameters. The results presented in this paper are intended to be used to create methods for the construction of bent functions using a ROBDD as a data structure from which the bent functions can be discovered.
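As a concrete illustration (my example, not one taken from the article), the classic four-variable bent function f = x1·x2 ⊕ x3·x4 is easy to check for the defining flat Walsh spectrum:

from itertools import product

# Classic 4-variable bent function f = x1*x2 XOR x3*x4.
def f(x1, x2, x3, x4):
    return (x1 & x2) ^ (x3 & x4)

n = 4
points = list(product((0, 1), repeat=n))

# Walsh coefficient W(a) = sum_x (-1)^(f(x) XOR a.x);
# a function is bent iff |W(a)| = 2^(n/2) for every a.
def walsh(a):
    return sum((-1) ** (f(*x) ^ (sum(ai & xi for ai, xi in zip(a, x)) & 1))
               for x in points)

assert all(abs(walsh(a)) == 2 ** (n // 2) for a in points)   # flat spectrum: bent
print(sum(f(*x) for x in points))   # weight 6 out of 16, i.e. unbalanced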
8

Ali, Muhammad Ali Rushdi, and Mohammad Alturki Alaa. "Computation of k-out-of-n System Reliability via Reduced Ordered Binary Decision Diagrams." British Journal of Mathematics & Computer Science 22, no. 3 (2017): 1–9. https://doi.org/10.9734/BJMCS/2017/33642.

Abstract:
A prominent reliability model is that of the partially-redundant (k-out-of-n) system. We use algebraic as well as signal-flow-graph methods to explore and expose the AR algorithm for computing k-out-of-n reliability. We demonstrate that the AR algorithm is, in fact, both a recursive and an iterative implementation of the strategy of Reduced Ordered Binary Decision Diagrams (ROBDDs). The underlying ROBDD for the AR recursive algorithm is represented by a compact Signal Flow Graph (SFG) that is used to deduce AR iterative algorithms of quadratic temporal complexity and linear spatial complexity. Extensions of the AR algorithm for (single or scalar) threshold, double-threshold, vector-threshold, and k-to-l-out-of-n systems have similar ROBDD interpretations.
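The textbook recursion that such ROBDD treatments of k-out-of-n reliability reduce to can be sketched in a few lines (quadratic time, linear space; this mirrors the standard recurrence, not the paper's AR/SFG derivation itself):

# R(k, n) = p_n * R(k-1, n-1) + (1 - p_n) * R(k, n-1), with R(0, .) = 1.
def k_out_of_n_reliability(k, p):
    """p[j] is the reliability of component j; the system works if at least k components work."""
    R = [1.0] + [0.0] * k              # R[i]: P(at least i of the components seen so far work)
    for pj in p:
        for i in range(k, 0, -1):      # update in place, highest threshold first
            R[i] = pj * R[i - 1] + (1.0 - pj) * R[i]
    return R[k]

print(k_out_of_n_reliability(2, [0.9, 0.9, 0.9]))   # 2-out-of-3 system: 0.972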
9

Wille, Robert, Görschwin Fey, and Rolf Drechsler. "Building free Binary Decision Diagrams using SAT solvers." Facta universitatis - series: Electronics and Energetics 20, no. 3 (2007): 381–94. http://dx.doi.org/10.2298/fuee0703381w.

Abstract:
Free Binary Decision Diagrams (FBDDs) are a data structure for the representation of Boolean functions. In contrast to Ordered Binary Decision Diagrams (OBDDs), FBDDs allow different variable orderings along each path. Thus, FBDDs are the more compact representation while most of the properties of OBDDs are kept. However, how to efficiently build small FBDDs for a given function is still an open question. In this work we propose FBDD construction with the help of SAT solvers. "Recording" the single steps of a SAT solver during the search process leads to an FBDD. Furthermore, by exploiting approaches for identifying isomorphic sub-graphs, i.e. cutlines or cutsets, reduced FBDDs are constructed.
10

Rushdi, Ali, and Alaa Alturki. "Computation of k-out-of-n System Reliability via Reduced Ordered Binary Decision Diagrams." British Journal of Mathematics & Computer Science 22, no. 3 (2017): 1–9. http://dx.doi.org/10.9734/bjmcs/2017/33642.

11

PRADHAN, SAMBHU NATH, and SANTANU CHATTOPADHYAY. "MULTIPLEXER-BASED MULTI-LEVEL CIRCUIT SYNTHESIS WITH AREA-POWER TRADE-OFF." Journal of Circuits, Systems and Computers 21, no. 05 (2012): 1250037. http://dx.doi.org/10.1142/s0218126612500375.

Abstract:
Due to the regularity of implementation, multiplexers are widely used in VLSI circuit synthesis. This paper proposes a technique for decomposing a function into 2-to-1 multiplexers performing area-power tradeoff. To the best of our knowledge this is the first ever effort to incorporate leakage into power calculation for multiplexer-based decomposition. With respect to an initial ROBDD (reduced ordered binary decision diagram)-based representation of the function, the scheme shows more than 30% reduction in area, leakage and switching for the LGSynth91 benchmarks without performance degradation. It also enumerates the trade-offs present in the solution space for different weights associated with these three quantities.
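Each internal node of an ROBDD is, structurally, a 2-to-1 multiplexer selecting between its low and high cofactors, which is what makes decompositions of this kind natural. A minimal netlist-emission sketch (hypothetical node format and naming, not the paper's synthesis flow):

# Map each ROBDD node (var, low, high) to a 2-to-1 multiplexer; terminals are constants.
def bdd_to_mux_netlist(node, names=None, lines=None):
    names = {} if names is None else names
    lines = [] if lines is None else lines
    if node is True or node is False:
        return ("1'b1" if node else "1'b0"), lines
    if id(node) in names:                        # shared node -> shared multiplexer
        return names[id(node)], lines
    var, low, high = node
    lo_name, _ = bdd_to_mux_netlist(low, names, lines)
    hi_name, _ = bdd_to_mux_netlist(high, names, lines)
    out = f"n{len(names)}"
    names[id(node)] = out
    lines.append(f"assign {out} = x{var} ? {hi_name} : {lo_name};")   # out = var ? f1 : f0
    return out, lines

root = (0, False, (1, False, True))              # f = x0 AND x1
_, netlist = bdd_to_mux_netlist(root)
print("\n".join(netlist))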
12

Newton, Jim, and Didier Verna. "A Theoretical and Numerical Analysis of the Worst-Case Size of Reduced Ordered Binary Decision Diagrams." ACM Transactions on Computational Logic 20, no. 1 (2019): 1–36. http://dx.doi.org/10.1145/3274279.

13

K., V. B. V. Rayudu, Jahagirdar, and Srihari Rao P. "Modern design approach of faults (toggling faults,bridge faults and SAT) of reduced ordered binary decision diagram based on combo & sequential blocks." International Journal of Reconfigurable and Embedded Systems 9, no. 2 (2020): 158–68. https://doi.org/10.11591/ijres.v9.i2.pp158-168.

Abstract:
In this research we develop ROBDD (Reduced Ordered Binary Decision Diagram) designs to detect toggling faults, bridge faults and stuck-at faults (SAT). Sequential blocks are built using ROBDDs and applied to a multiplexer to detect stuck-at faults, and the combinational and sequential blocks are connected to find toggling faults with an automatic test pattern generator. Bridges between the blocks of the ROBDD designs are then introduced and converted to AND/OR logic to find the bridge faults of the design. Because bridge and toggle faults are difficult to find in logic designs, an advanced technique is used that calculates the path delays of the individual blocks of the design, concentrating on these path delays and using basic stuck-at-fault methods to identify the toggling and bridge faults at the multiplexer output. The basic design modules, ROBDD circuits of both combinational and sequential blocks, are designed and tested using multiplexer and K-map simplification methods. The main purpose of the research is to find faults at all levels of logic designs involving both combinational and sequential blocks.
14

Vasantha Rayudu, Kurada Veera Bhoga, Jahagirdar Jahagirdar, and P. Rao. "Modern design approach of faults (toggling faults, bridge faults and SAT) of reduced ordered binary decision diagram based on combo & sequential blocks." International Journal of Reconfigurable and Embedded Systems (IJRES) 9, no. 2 (2020): 158. http://dx.doi.org/10.11591/ijres.v9.i2.pp158-168.

Abstract:
In this research we develop ROBDD (Reduced Ordered Binary Decision Diagram) designs to detect toggling faults, bridge faults and stuck-at faults (SAT). Sequential blocks are built using ROBDDs and applied to a multiplexer to detect stuck-at faults, and the combinational and sequential blocks are connected to find toggling faults with an automatic test pattern generator. Bridges between the blocks of the ROBDD designs are then introduced and converted to AND/OR logic to find the bridge faults of the design. Because bridge and toggle faults are difficult to find in logic designs, an advanced technique is used that calculates the path delays of the individual blocks of the design, concentrating on these path delays and using basic stuck-at-fault methods to identify the toggling and bridge faults at the multiplexer output. The basic design modules, ROBDD circuits of both combinational and sequential blocks, are designed and tested using multiplexer and K-map simplification methods. The main purpose of the research is to find faults at all levels of logic designs involving both combinational and sequential blocks.
15

Li, Shunxin, Yinglei Song, and Yaying Zhang. "Combinatorial Test Case Generation Based on ROBDD and Improved Particle Swarm Optimization Algorithm." Applied Sciences 14, no. 2 (2024): 753. http://dx.doi.org/10.3390/app14020753.

Abstract:
In applications of software testing, the cause–effect graph method is an approach often used to design test cases by analyzing various combinations of inputs with a graphical approach. However, not all inputs have equal impacts on the results, and approaches based on exhaustive testing are generally time-consuming and laborious. As a statute-based software-testing method, combinatorial testing aims to select a small but effective number of test cases from the large space of all possible combinations of the input values for the software to be tested, and to generate a set of test cases with a high degree of coverage and high error detection capability. In this paper, the reduced ordered binary decision diagram is utilized to simplify the cause–effect graph so as to reduce the numbers of both the inputs and test cases, thereby saving the testing cost. In addition, an improved particle swarm optimization algorithm is proposed to significantly reduce the computation time needed to generate test cases. Experiments on several systems show that the proposed method can generate excellent results for test case generation.
16

Yan, Zongshuai, Chenhua Nie, Rongsheng Dong, Xi Gao, and Jianming Liu. "A Novel OBDD-Based Reliability Evaluation Algorithm for Wireless Sensor Networks on the Multicast Model." Mathematical Problems in Engineering 2015 (2015): 1–14. http://dx.doi.org/10.1155/2015/269781.

Abstract:
The two-terminal reliability calculation for wireless sensor networks (WSNs) is a #P-hard problem. The reliability calculation of WSNs on the multicast model provides an even worse combinatorial explosion of node states with respect to the calculation of WSNs on the unicast model; many real WSNs require the multicast model to deliver information. This research first provides a formal definition for the WSN on the multicast model. Next, a symbolic OBDD_Multicast algorithm is proposed to evaluate the reliability of WSNs on the multicast model. Furthermore, our research on OBDD_Multicast construction avoids the problem of invalid expansion, which reduces the number of subnetworks by identifying the redundant paths of two adjacent nodes and s-t unconnected paths. Experiments show that the OBDD_Multicast both reduces the complexity of the WSN reliability analysis and has a lower running time than Xing’s OBDD- (ordered binary decision diagram-) based algorithm.
17

Affum, Eric, Xiasong Zhang, Xiaofen Wang, and John Bosco Ansuura. "Efficient Lattice CP-ABE AC Scheme Supporting Reduced-OBDD Structure for CCN/NDN." Symmetry 12, no. 1 (2020): 166. http://dx.doi.org/10.3390/sym12010166.

Abstract:
In line with the proposed 5th Generation network, content centric network/named data networking (CCN/NDN) has been offered as one of the promising paradigms to cope with the communication needs of future realistic network communications. CCN/NDN allows network communication based on content names and also allows users to obtain information from any of the nearest intermediary caches on the network. Due to that, the ability of cached content to protect itself is essential since contents can be cached on any node everywhere, and publishers may not have total control over their own published data. The attribute based encryption (ABE) scheme is a preferable approach, identified to enable cached contents to be self-secured since it has a special property of encryption with policies. However, most of the proposed ABE schemes for CCN/NDN suffer from some loopholes. They are not flexible in the expression of access policy, they are inefficient, they are based on bilinear maps with pairings, and they are vulnerable to quantum cryptography algorithms. Hence, we propose the ciphertext policy attribute based encryption access control (CP-ABE AC) scheme from a lightweight ideal lattice based on ring learning with error (R-LWE) problem, and demonstrated its use in practical applications. The proposed scheme is proved to be secure and efficient under the decision ring LWE problem in the selective set model. To achieve an efficient scheme, we used an efficient trapdoor technique and the access tree representation of access structure describing the access policies was modified into a new structure, based on a reduced ordered binary decision diagram (reduce-OBDD). This access structure can support Boolean operations such as AND, NOT, OR, and threshold gates. The final result showed that the proposed scheme was secure and efficient for applications, thereby supporting CCN/NDN as a promising paradigm.
18

Opara, Adam, and Dariusz Kania. "Decomposition-based logic synthesis for PAL-based CPLDs." International Journal of Applied Mathematics and Computer Science 20, no. 2 (2010): 367–84. http://dx.doi.org/10.2478/v10006-010-0027-1.

Abstract:
The paper presents one concept of decomposition methods dedicated to PAL-based CPLDs. The proposed approach is an alternative to the classical one, which is based on two-level minimization of separate single-output functions. The key idea of the algorithm is to search for free blocks that could be implemented in PAL-based logic blocks containing a limited number of product terms. In order to better exploit the number of product terms, two-stage decomposition and BDD-based decomposition are to be used. In BDD-based decomposition methods, functions are represented by Reduced Ordered Binary Decision Diagrams (ROBDDs). The results of experiments prove that the proposed solution is more effective, in terms of the usage of programmable device resources, compared with the classical ones.
19

Hawkins, P. J., V. Lagoon, and P. J. Stuckey. "Solving Set Constraint Satisfaction Problems using ROBDDs." Journal of Artificial Intelligence Research 24 (July 1, 2005): 109–56. http://dx.doi.org/10.1613/jair.1638.

Abstract:
In this paper we present a new approach to modeling finite set domain constraint problems using Reduced Ordered Binary Decision Diagrams (ROBDDs). We show that it is possible to construct an efficient set domain propagator which compactly represents many set domains and set constraints using ROBDDs. We demonstrate that the ROBDD-based approach provides unprecedented flexibility in modeling constraint satisfaction problems, leading to performance improvements. We also show that the ROBDD-based modeling approach can be extended to the modeling of integer and multiset constraint problems in a straightforward manner. Since domain propagation is not always practical, we also show how to incorporate less strict consistency notions into the ROBDD framework, such as set bounds, cardinality bounds and lexicographic bounds consistency. Finally, we present experimental results that demonstrate the ROBDD-based solver performs better than various more conventional constraint solvers on several standard set constraint problems.
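A small illustration of how a set domain becomes a Boolean function (my sketch of the standard membership encoding, not the paper's propagator): a set variable S over the universe {1, ..., n} is modelled by membership Booleans x_1, ..., x_n, and the set-bounds domain L ⊆ S ⊆ U is the function

\[
  D_{[L,U]}(x_1,\dots,x_n) \;=\; \bigwedge_{i \in L} x_i \;\wedge\; \bigwedge_{i \notin U} \neg x_i ,
\]

whose ROBDD is a chain with one node per fixed element, i.e. at most n internal nodes; richer restrictions such as cardinality bounds still yield compact, highly shared ROBDDs, which is what propagators of this kind exploit.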
20

Xu, Zhiwu, Hongxu Chen, Alwen Tiu, Yang Liu, and Kunal Sareen. "A permission-dependent type system for secure information flow analysis." Journal of Computer Security 29, no. 2 (2021): 161–228. http://dx.doi.org/10.3233/jcs-200036.

Abstract:
We introduce a novel type system for enforcing secure information flow in an imperative language. Our work is motivated by the problem of statically checking potential information leakage in Android applications. To this end, we design a lightweight type system featuring Android permission model, where the permissions are statically assigned to applications and are used to enforce access control in the applications. We take inspiration from a type system by Banerjee and Naumann to allow security types to be dependent on the permissions of the applications. A novel feature of our type system is a typing rule for conditional branching induced by permission testing, which introduces a merging operator on security types, allowing more precise security policies to be enforced. The soundness of our type system is proved with respect to non-interference. A type inference algorithm is also presented for the underlying security type system, by reducing the inference problem to a constraint solving problem in the lattice of security types. In addition, a new way to represent our security types as reduced ordered binary decision diagrams is proposed.
21

Zhou, Feng, Hua Chen, and Limin Fan. "Prover - Toward More Efficient Formal Verification of Masking in Probing Model." IACR Transactions on Cryptographic Hardware and Embedded Systems 2025, no. 1 (2024): 552–85. https://doi.org/10.46586/tches.v2025.i1.552-585.

Abstract:
In recent years, formal verification has emerged as a crucial method for assessing security against Side-Channel attacks of masked implementations, owing to its remarkable versatility and high degree of automation. However, formal verification still faces technical bottlenecks in balancing accuracy and efficiency, thereby limiting its scalability. Former tools like maskVerif and CocoAlma are very efficient but they face accuracy issues when verifying schemes that utilize properties of Boolean functions. Later, SILVER addressed the accuracy issue, albeit at the cost of significantly reduced speed and scalability compared to maskVerif. Consequently, there is a pressing need to develop formal verification tools that are both efficient and accurate for designing secure schemes and evaluating implementations. This paper’s primary contribution lies in proposing several approaches to develop a more efficient and scalable formal verification tool called Prover, which is built upon SILVER. Firstly, inspired by the auxiliary data structures proposed by Eldib et al. and optimistic sampling rule of maskVerif, we introduce two reduction rules aimed at diminishing the size of observable sets and secret sets in statistical independence checks. These rules substantially decrease, or even eliminate, the need for repeated computation of probability distributions using Reduced Ordered Binary Decision Diagrams (ROBDDs), a time-intensive procedure in verification. Subsequently, we integrate one of these reduction rules into the uniformity check to mitigate its complexity. Secondly, we identify that variable ordering significantly impacts efficiency and optimize it for constructing ROBDDs, resulting in much smaller representations of investigated functions. Lastly, we present the algorithm of Prover, which efficiently verifies the security and uniformity of masked implementations in probing model with or without the presence of glitches. Experimental results demonstrate that our proposed tool Prover offers a better balance between efficiency and accuracy compared to other state-of-the-art tools (IronMask, CocoAlma, maskVerif, and SILVER). In our experiments, we also found an S-box that can only be verified by Prover, as IronMask cannot verify S-boxes, and both CocoAlma and maskVerif suffer from false positive issues. Additionally, SILVER runs out of time during verification.
22

Takenaga, Yasuhiko, and Shuzo Yajima. "Hardness of identifying the minimum ordered binary decision diagram." Discrete Applied Mathematics 107, no. 1-3 (2000): 191–201. http://dx.doi.org/10.1016/s0166-218x(99)00226-7.

23

Sistla, Meghana, Swarat Chaudhuri, and Thomas Reps. "Weighted Context-Free-Language Ordered Binary Decision Diagrams." Proceedings of the ACM on Programming Languages 8, OOPSLA2 (2024): 1390–419. http://dx.doi.org/10.1145/3689760.

Abstract:
This paper presents a new data structure, called Weighted Context-Free-Language Ordered BDDs (WCFLOBDDs), which are hierarchically structured decision diagrams, akin to Weighted BDDs (WBDDs) enhanced with a procedure-call mechanism. For some functions, WCFLOBDDs are exponentially more succinct than WBDDs. They are potentially beneficial for representing functions of type B^n → D, when a function’s image V ⊆ D has many different values. We apply WCFLOBDDs in quantum-circuit simulation, and find that they perform better than WBDDs on certain benchmarks. With a 15-minute timeout, the number of qubits that can be handled by WCFLOBDDs is 1-64× that of WBDDs (and 1-128× that of CFLOBDDs, which are an unweighted version of WCFLOBDDs). These results support the conclusion that for this application—from the standpoint of problem size, measured as the number of qubits—WCFLOBDDs provide the best of both worlds: performance roughly matches whichever of WBDDs and CFLOBDDs is better. (From the standpoint of running time, the results are more nuanced.)
24

Wang, Ruiwei, and Roland H. C. Yap. "Encoding Multi-Valued Decision Diagram Constraints as Binary Constraint Trees." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 4 (2022): 3850–58. http://dx.doi.org/10.1609/aaai.v36i4.20300.

Abstract:
The Ordered Multi-valued Decision Diagram (MDD) is a compact representation used to model various constraints, such as regular constraints and table constraints. It can be particularly useful for representing ad-hoc, problem-specific constraints. Many algorithms have been proposed to enforce Generalized Arc Consistency (GAC) on MDD constraints. In this paper, we introduce a new compact representation called the Binary Constraint Tree (BCT). We propose tree binary encodings to transform any MDD constraint into a BCT constraint. We also present a specialized algorithm enforcing GAC on the BCT constraint resulting from an MDD constraint. Experimental results on a large set of benchmarks show that the BCT GAC algorithm can significantly outperform state-of-the-art MDD as well as table GAC algorithms.
25

Devadas, S. "Comparing two-level and ordered binary decision diagram representations of logic functions." IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems 12, no. 5 (1993): 722–23. http://dx.doi.org/10.1109/43.277617.

26

Khan, Hafiz Anwar Ullah, Mohamed Al Hosani, and Hatem Zeineldin. "Topology planning for autonomous MMGs: an ordered binary decision diagram‐based approach." IET Smart Grid 3, no. 1 (2020): 60–68. http://dx.doi.org/10.1049/iet-stg.2019.0083.

27

Heap, Mark. "On the exact ordered binary decision diagram size of totally symmetric functions." Journal of Electronic Testing 4, no. 2 (1993): 191–95. http://dx.doi.org/10.1007/bf00971647.

28

Ross, Don E., Kenneth M. Butler, and M. Ray Mercer. "Exact ordered binary decision diagram size when representing classes of symmetric functions." Journal of Electronic Testing 2, no. 3 (1991): 243–59. http://dx.doi.org/10.1007/bf00135441.

29

Wei, Guan. "Reliability Analysis of the Internet of Things Based on Ordered Binary Decision Diagram." International Journal of Online Engineering (iJOE) 14, no. 08 (2018): 20. http://dx.doi.org/10.3991/ijoe.v14i08.9185.

Abstract:
The reliability of the Internet of Things (IoT) system is analyzed and studied through the ordered binary decision diagram (OBDD) to improve its design, application, and development. Based on the OBDD analysis, a reliability evaluation method named enhanced node expansion (ENE) is proposed. This method provides an effective solution for the reliability assessment of IoT systems with a large scale and complex network structure. A link importance assessment method based on OBDD analysis is also established. The proposed method can accurately and effectively evaluate the reliability of the IoT network and is practical for discussing the reliability, design, and development of this system.
30

Li, Long, Tianlong Gu, Liang Chang, Zhoubo Xu, Yining Liu, and Junyan Qian. "A Ciphertext-Policy Attribute-Based Encryption Based on an Ordered Binary Decision Diagram." IEEE Access 5 (2017): 1137–45. http://dx.doi.org/10.1109/access.2017.2651904.

31

Schmidt, Tim, and Rong Zhou. "Representing Pattern Databases with Succinct Data Structures." Proceedings of the International Symposium on Combinatorial Search 2, no. 1 (2021): 142–49. http://dx.doi.org/10.1609/socs.v2i1.18195.

Abstract:
In this paper we describe novel representations for precomputed heuristics based on Level-Ordered Edge Sequence (LOES) encodings. We introduce compressed LOES, an extension to LOES that enables more aggressive compression of the state-set representation. We evaluate the novel representations against the respective perfect-hash and binary decision diagram (BDD) representations of pattern databases in a variety of STRIPS domains.
32

Kapernaum, Nadia, Friederike Knecht, C. Scott Hartley, Jeffrey C. Roberts, Robert P. Lemieux, and Frank Giesselmann. "Formation of smectic phases in binary liquid crystal mixtures with a huge length ratio." Beilstein Journal of Organic Chemistry 8 (July 19, 2012): 1118–25. http://dx.doi.org/10.3762/bjoc.8.124.

Abstract:
A system of two liquid-crystalline phenylpyrimidines differing strongly in molecular length was studied. The phase diagram of these two chemically similar mesogens, with a length ratio of 2, was investigated, and detailed X-ray diffraction and electrooptical measurements were performed. The phase diagram revealed a destabilization of the nematic phase, which is present in the pure short compound, while the smectic state was stabilized. The short compound forms smectic A and smectic C phases, whereas the longer compound forms a broad smectic C phase and a narrow higher-ordered smectic phase. Nevertheless, in the mixtures, the smectic C phase is destabilized and disappears rapidly, whereas smectic A is the only stable phase observed over a broad concentration range. In addition, the smectic translational order parameters as well as the tilt angles of the mixtures are reduced. The higher-ordered smectic phase of the longer mesogen was identified as a smectic F phase.
33

Choi, Arthur, and Adnan Darwiche. "Dynamic Minimization of Sentential Decision Diagrams." Proceedings of the AAAI Conference on Artificial Intelligence 27, no. 1 (2013): 187–94. http://dx.doi.org/10.1609/aaai.v27i1.8690.

Abstract:
The Sentential Decision Diagram (SDD) is a recently proposed representation of Boolean functions, containing Ordered Binary Decision Diagrams (OBDDs) as a distinguished subclass. While OBDDs are characterized by total variable orders, SDDs are characterized more generally by vtrees. As both OBDDs and SDDs have canonical representations, searching for OBDDs and SDDs of minimal size simplifies to searching for variable orders and vtrees, respectively. For OBDDs, there are effective heuristics for dynamic reordering, based on locally swapping variables. In this paper, we propose an analogous approach for SDDs which navigates the space of vtrees via two operations: one based on tree rotations and a second based on swapping children in a vtree. We propose a particular heuristic for dynamically searching the space of vtrees, showing that it can find SDDs that are an order-of-magnitude more succinct than OBDDs found by dynamic reordering.
34

Rauzy, Antoine, and Yang. "Decision Diagram Algorithms to Extract Minimal Cutsets of Finite Degradation Models." Information 10, no. 12 (2019): 368. http://dx.doi.org/10.3390/info10120368.

Abstract:
In this article, we propose decision diagram algorithms to extract minimal cutsets of finite degradation models. Finite degradation models generalize and unify combinatorial models used to support probabilistic risk, reliability and safety analyses (fault trees, attack trees, reliability block diagrams…). They formalize a key idea underlying all risk assessment methods: states of the models represent levels of degradation of the system under study. Although these states cannot be totally ordered, they have a rich algebraic structure that can be exploited to extract minimal cutsets of models, which represent the most relevant scenarios of failure. The notion of minimal cutsets we introduce here generalizes the one defined for fault trees. We show how algorithms used to calculate minimal cutsets can be lifted up to finite degradation models, thanks to a generic decomposition theorem and an extension of the binary decision diagram technology. We discuss implementation and performance issues. Finally, we illustrate the interest of the proposed technology by means of a use case stemming from the oil and gas industry.
35

Wang, Ruiwei, and Roland H. C. Yap. "On the Modelling of Constraints with Tractable Logical Operators." Proceedings of the AAAI Conference on Artificial Intelligence 39, no. 11 (2025): 11381–89. https://doi.org/10.1609/aaai.v39i11.33238.

Abstract:
Solving a Constraint Satisfaction Problem (CSP) usually requires a model typically using existing basic constraints. The most flexible form of constraint, ad-hoc (generic) constraints defined with certain constraint representations such as binary constraint trees (BCT) and decision diagrams, has been proposed for cases where basic constraints in intensional form are insufficient. A modeller may wish to combine basic constraints using logic operators (and, or, negation). However, negation, a key logical operator for expressivity, is not tractable in many existing constraint representations. This creates a dilemma: for modelling we would desire more flexibility, but a model whose operations are intractable may in turn be impractical. In this paper, we give a framework which allows for a tractable negation operator on constraint representations. We apply the framework to the BCT and ordered decision diagram constraints, giving new subforms. These subforms can be strictly more succinct than ordered multi-valued decision diagrams (OMDD), while being as tractable as OMDD for logical combinations. We give applications to show effective propagators from logical combinations and in building large constraint models for configuration problems.
36

Mateescu, R., R. Dechter, and R. Marinescu. "AND/OR Multi-Valued Decision Diagrams (AOMDDs) for Graphical Models." Journal of Artificial Intelligence Research 33 (December 17, 2008): 465–519. http://dx.doi.org/10.1613/jair.2605.

Abstract:
Inspired by the recently introduced framework of AND/OR search spaces for graphical models, we propose to augment Multi-Valued Decision Diagrams (MDD) with AND nodes, in order to capture function decomposition structure and to extend these compiled data structures to general weighted graphical models (e.g., probabilistic models). We present the AND/OR Multi-Valued Decision Diagram (AOMDD) which compiles a graphical model into a canonical form that supports polynomial (e.g., solution counting, belief updating) or constant time (e.g. equivalence of graphical models) queries. We provide two algorithms for compiling the AOMDD of a graphical model. The first is search-based, and works by applying reduction rules to the trace of the memory intensive AND/OR search algorithm. The second is inference-based and uses a Bucket Elimination schedule to combine the AOMDDs of the input functions via the APPLY operator. For both algorithms, the compilation time and the size of the AOMDD are, in the worst case, exponential in the treewidth of the graphical model, rather than pathwidth as is known for ordered binary decision diagrams (OBDDs). We introduce the concept of semantic treewidth, which helps explain why the size of a decision diagram is often much smaller than the worst case bound. We provide an experimental evaluation that demonstrates the potential of AOMDDs.
37

NINOMIYA, Hiroshi, Manabu KOBAYASHI, and Shigeyoshi WATANABE. "Reduced Reconfigurable Logic Circuit Design Based on Double Gate CNTFETs Using Ambipolar Binary Decision Diagram." IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences E96.A, no. 1 (2013): 356–59. http://dx.doi.org/10.1587/transfun.e96.a.356.

38

Bibilo, P. N. "Disjunctive and conjunctive decompositions of incompletely defined Boolean functions in a Binary Decision Diagram." Informatics 22, no. 1 (2025): 40–65. https://doi.org/10.37661/1816-0301-2025-22-1-40-65.

Abstract:
Objectives. The problems of minimizing the number of cofactors (subfunctions) of the Shannon expansions located at the same level of the BDD, representing a system of incompletely defined (partial) Boolean functions, are considered. To reduce the number of functions, it is proposed to find a subset of such functions that can be expressed as algebraic decompositions of disjunctions or conjunctions of other predefined partial functions, while the directed graph of function occurrences in decompositions should not contain contours. Methods. Finding disjunctive and conjunctive decompositions requires searching for appropriate additional definitions of partial functions. Finding the largest number of separate algebraic decompositions reduces to the problem of finding a weighted row cover of a Boolean matrix of occurrences of system functions in separate decompositions. The task of finding consistent predefinitions of partial functions for various types of joint decompositions is reduced to the formulation and solution of logical equations. Results. It is shown that the constructed logical equations can be easily transformed to a conjunctive normal form (CNF), and finding the roots of such equations is reduced to the problem of satisfiability of a Boolean formula presented in the form of CNF, for which effective methods and programs are known. Conclusion. The proposed algorithms can be generalized to other types of algebraic decompositions, when, in addition to the logical operations of disjunction and conjunction, negations of these operations can be used. The application of the proposed algorithms and already known algorithms for minimizing multilevel BDD representations of partial function systems allows us to obtain the best results of technologically independent logical optimization, the initial stage of logic circuit synthesis.
39

Layeb, Abdesslem, and Djamel-Eddine Saidouni. "A New Quantum Evolutionary Algorithm with Sifting Strategy for Binary Decision Diagram Ordering Problem." International Journal of Cognitive Informatics and Natural Intelligence 4, no. 4 (2010): 47–61. http://dx.doi.org/10.4018/jcini.2010100104.

Abstract:
In this work, the authors focus on the quantum evolutionary quantum hybridization and its contribution in solving the binary decision diagram ordering problem. Therefore, a problem formulation in terms of quantum representation and evolutionary dynamic borrowing quantum operators are defined. The sifting search strategy is used in order to increase the efficiency of the exploration process, while experiments on a wide range of data sets show the effectiveness of the proposed framework and its ability to achieve good quality solutions. The proposed approach is distinguished by a reduced population size and a reasonable number of iterations to find the best order, thanks to the principles of quantum computing and to the sifting strategy.
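The sifting strategy mentioned here is the classical dynamic-reordering heuristic: each variable in turn is tried in every position of the order (via adjacent swaps in a real BDD package) and left where the diagram was smallest. A minimal sketch, assuming a size oracle bdd_size(order) supplied by whatever BDD package is in use:

def sift(order, bdd_size):
    order = list(order)
    for var in list(order):
        best_size, best_pos = bdd_size(order), order.index(var)
        order.remove(var)
        for pos in range(len(order) + 1):          # try every position for `var`
            candidate = order[:pos] + [var] + order[pos:]
            size = bdd_size(candidate)
            if size < best_size:
                best_size, best_pos = size, pos
        order.insert(best_pos, var)                # keep the best position found
    return order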
40

Banov, Reni, Zdenko Šimić, and Davor Grgić. "A new heuristics for the event ordering in binary decision diagram applied in fault tree analysis." Proceedings of the Institution of Mechanical Engineers, Part O: Journal of Risk and Reliability 234, no. 2 (2019): 397–406. http://dx.doi.org/10.1177/1748006x19879305.

Abstract:
The fault tree is a common approach in probabilistic risk assessment of complex engineering systems. Since their introduction, binary decision diagrams have proved to be a valuable tool for complete quantification of hard fault tree models. As is known, the size of the binary decision diagram representation is mainly determined by the quality of the selected fault tree event ordering scheme. Finding the optimal event ordering for binary decision diagram representation is a computationally intractable problem, for which reason heuristic approaches are applied to find reasonably good ordering schemes. The existing method for finding optimal ordering schemes related to special types of fan-in 2 read-once formulas is employed in our research to develop a new heuristic for fault trees. Various fault tree simplification methods are used for the sake of reducing fault tree model discrepancy from fan-in 2 read-once formulas. The reduced fault tree is traversed in a depth-first manner, and for every gate the best ordering scheme is chosen from selected sets of input permutations. The quality of the final event ordering scheme is compared to orderings produced with the depth-first leftmost heuristic on a set of fault tree models addressed in the literature as well as on a set of our hard models. Our method proves to be a useful heuristic for finding good static event ordering, and it compares favourably to the known heuristic based on a depth-first leftmost assignment approach.
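For context on why the ordering matters so much (the classical example, not one from the article): for the function

\[
  f \;=\; x_1 y_1 \,\lor\, x_2 y_2 \,\lor\, \dots \,\lor\, x_n y_n ,
\]

the interleaved order x_1 < y_1 < ... < x_n < y_n gives an ROBDD with O(n) nodes, whereas the separated order x_1 < ... < x_n < y_1 < ... < y_n forces on the order of 2^n nodes, so a good static ordering heuristic can make the difference between a tractable and an intractable fault tree quantification.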
41

Xue, Yexiang, Arthur Choi, and Adnan Darwiche. "Basing Decisions on Sentences in Decision Diagrams." Proceedings of the AAAI Conference on Artificial Intelligence 26, no. 1 (2021): 842–49. http://dx.doi.org/10.1609/aaai.v26i1.8221.

Abstract:
The Sentential Decision Diagram (SDD) is a recently proposed representation of Boolean functions, containing Ordered Binary Decision Diagrams (OBDDs) as a distinguished subclass. While OBDDs are characterized by total variable orders, SDDs are characterized by dissections of variable orders, known as vtrees. Despite this generality, SDDs retain a number of properties, such as canonicity and a polytime apply operator, that have been critical to the practical success of OBDDs. Moreover, upper bounds on the size of SDDs were also given, which are tighter than comparable upper bounds on the size of OBDDs. In this paper, we analyze more closely some of the theoretical properties of SDDs and their size. In particular, we consider the impact of basing decisions on sentences (using dissections as in SDDs), in comparison to basing decisions on variables (using total variable orders as in OBDDs). Here, we identify a class of Boolean functions where basing decisions on sentences using dissections of a variable order can lead to exponentially more compact SDDs, compared to OBDDs based on the same variable order. Moreover, we identify a fundamental property of the decompositions that underlie SDDs and use it to show how certain changes to a vtree can also lead to exponential differences in the size of an SDD.
42

Xu, Zhou Bo, Tian Long Gu, Liang Chang, and Feng Ying Li. "A Novel Symbolic OBDD Algorithm for Generating Mechanical Assembly Sequences Using Decomposition Approach." Advanced Materials Research 201-203 (February 2011): 24–29. http://dx.doi.org/10.4028/www.scientific.net/amr.201-203.24.

Abstract:
The compact storage and efficient evaluation of feasible assembly sequences is one crucial concern for assembly sequence planning. The implicitly symbolic ordered binary decision diagram (OBDD) representation and manipulation technique has been a promising way. In this paper, Sharafat’s recursive contraction algorithm and the cut-set decomposition method are symbolically implemented, and a novel symbolic algorithm for generating mechanical assembly sequences is presented using OBDD formulations of the liaison graph and translation function. The algorithm has the following main procedures: choosing any one of the vertices in the liaison graph G as the seed vertex and scanning all connected subgraphs containing the seed vertex by breadth-first search; transforming the problem of enumerating all cut-sets in the liaison graph into the problem of generating all partitions into two subsets V1 and V2 of the set of vertices V where the induced graphs of both V1 and V2 are connected; and checking the geometrical feasibility of each cut-set. Some applicable experiments show that the novel algorithm can generate feasible assembly sequences correctly and completely.
43

Bibilo, P. N., and V. I. Romanov. "Experimental Study of Algorithms for Minimization of Binary Decision Diagrams using Algebraic Representations of Cofactors." Programmnaya Ingeneria 13, no. 2 (2022): 51–67. http://dx.doi.org/10.17587/prin.13.51-67.

Abstract:
BDD (Binary Decision Diagram) is used for technology-independent optimization, performed as the first stage in the synthesis of logic circuits in the design of ASIC (application-specific integrated circuit). BDD is an acyclic graph defining a Boolean function or a system of Boolean functions. Each vertex of this graph is associated with the complete or reduced Shannon expansion formula. Binary decision diagrams with mutually inverse subfunctions (cofactors) are considered. We have developed algorithms for finding algebraic representations of cofactors of the same BDD level in the form of a disjunction or conjunction of other inverse or non-inverse cofactors of the same BDD level. The algorithms make it possible to reduce the number of literals by replacing the Shannon expansion formulas with simpler logical formulas and to reduce the number of literals in the description of a system of Boolean functions. We propose to use the developed algorithms for an additional logical optimization of the constructed BDD representations of systems of Boolean functions. Experimental results of the application of the corresponding programs in the synthesis of logic circuits in the design library of custom VLSI CMOS circuits are presented.
44

Bibilo, P. N., and V. I. Romanov. "Minimization of binary decision diagrams for systems of completely defined Boolean functions using Shannon expansions and algebraic representations of cofactors." Informatics 18, no. 2 (2021): 7–32. http://dx.doi.org/10.37661/1816-0301-2021-18-2-7-32.

Abstract:
In the systems of digital VLSI design (Very Large Scale Integrated Circuits), the BDD (Binary Decision Diagram) is used for VLSI verification, as well as for technologically independent optimization as the first stage in the synthesis of logic circuits in various technological bases. The BDD is an acyclic graph defining a Boolean function or a system of Boolean functions. Each vertex of this graph corresponds to the complete or reduced Shannon expansion formula. When the BDD representation for a system of Boolean functions is constructed, it is possible to perform additional logical optimization based on the proposed method of searching for algebraic representations of cofactors (subfunctions) of the same BDD level in the form of a disjunction, conjunction, or exclusive-or of cofactors of the same level or lower levels of the BDD. A directed BDD graph for a system of functions is constructed on the basis of the Shannon expansion of all component functions of the system by the same permutation of variables. The method makes it possible to reduce the number of literals by replacing the Shannon expansion formulas with simpler formulas that are disjunctions or conjunctions of cofactors, and to reduce the number of literals in specifying a system of Boolean functions. The number of literals in algebraic multilevel representations of systems of fully defined Boolean functions is the main optimization criterion in the synthesis of combinational circuits from library logic elements.
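A toy illustration of the optimisation described in these two entries (my example, not the authors'): each BDD level applies the Shannon expansion

\[
  s \;=\; \bar{x}_i\, s_0 \;\lor\; x_i\, s_1 ,
\]

but if some cofactor of a level happens to satisfy an algebraic relation over cofactors that already have nodes, say s_3 = s_1 ∨ s_2 (or s_3 = s_1 ∧ s_2, or an exclusive-or), then s_3 need not be expanded at all: it is realised as a single gate over the existing nodes, which removes one Shannon expansion and the literals it would have contributed.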
45

Pimenov, Viktor, Mikhail Voronov, and Iliya Pimenov. "The cognitive visualization of classifying rules extracted from data based on binary solver matrix model." Information and Control Systems, no. 6 (January 16, 2020): 2–11. http://dx.doi.org/10.31799/1684-8853-2019-6-2-11.

Abstract:
Introduction: Huge volumes of data are generated in cyberspace or from internal information of various organizations. In order to obtain a set of output data with a clear structure, divide it into significant parts and develop rules of classification, machine learning methods are used. Most inductive methods simulate intermediate and high-level abstract categories in multidimensional space which are difficult to interpret. Purpose: Developing a model of machine learning in the form of a “white box” which explains the chosen solution using conventional production rules, along with cognitive visualizers for characterizing classes of objects. Methods: Formation of a binary decision matrix containing information about a combination of the selected informative sign values which imply the specified classes. Results: A binary decision matrix is formed automatically according to the results of cluster and discriminant analyzes. The learning procedure is reduced to setting interval thresholds and matrix elements, which makes it easy to implement a semantic interpretation of a solving rule. The object is recognized by elementwise conjunction of the matrix cells to which the values of the attributes are pointing, and by selection of a single cell corresponding to the class code. To interpret a rule, a universal algorithm for processing a binary matrix has been developed, which applies user-entered attribute values. The dimension of the viewed space is specified by adjustment rings on the recognition visualizer. The azimuth of an initiated diagram cell with the greatest dimensionality indicates the belonging of an object with the set features to a target class. For the characterization of classes, visualizers have been developed, demonstrating both the distinctive properties of a class and properties that several classes share. In many cases, the object type recognition stops when the depth of the scanned features space is significantly less than with a full search. Practical relevance: The proposed methods of cognitive analysis and data visualization provide not only the classification of data, determination of the significance of features, their ranking and selection, but also the development of rules which reveal the cause-and-effect relationship between the combination of factors and the type of a made decision.
46

Afzaal, T., R. AlRamdan, H. Bualbanat, et al. "A67 REDUCING INAPPROPRIATE GAMMA-GLUTAMYL TRANSFERASE TESTING FOR INPATIENTS: A QUALITY IMPROVEMENT INITIATIVE IN LAB WASTE REDUCTION APPLYING THE MODEL FOR CONTINUOUS IMPROVEMENT." Journal of the Canadian Association of Gastroenterology 7, Supplement_1 (2024): 45–46. http://dx.doi.org/10.1093/jcag/gwad061.067.

Abstract:
Background: Review of the literature identifies a rising trend in laboratory testing, with over 30% of tests estimated to be inappropriately repeated. Laboratory overutilization increases healthcare costs, and can lead to overdiagnosis, overtreatment and negative health outcomes. Indications for repeat Gamma Glutamyl Transferase (GGT) testing in adults are limited, particularly repeat testing within the same admission. Aims: Our aim was to reduce the inappropriate ordering of repeat GGT testing by 25% for all inpatients at the London Health Sciences Centre (LHSC) over a one-year study period. Methods: An interprofessional team was created to help engage relevant stakeholders, collect baseline data and reassess the indications for GGT testing. A combination of root cause analysis tools, specifically the Ishikawa diagram and Pareto chart, were employed to identify potential factors contributing to the overutilization of GGT testing. After prioritizing potential solutions, intervention bundles were developed, and Plan-Do-Study-Act (PDSA) cycles were created to target correctable factors. In PDSA cycle #1, the process started by eliminating GGT as a laboratory testing option in the three most commonly used admission order care sets. Considering the hierarchy of intervention effectiveness, PDSA cycle #2 involved implementing a computerized Clinical Decision Support (CDS) system to restrict the reordering of GGT tests within 72 hours of the same admission. Results: Baseline data showed that in 2022, a total of 62,542 GGT tests were ordered, with an average of approximately 5,200 GGT tests ordered per month. Of these, 16.4% were ordered through the top 3 most prevalent admission order care sets, and around 25% of all GGT tests were repeats within 72 hours of admission. Referring to Figure 1, PDSA cycle #1 yielded no significant reduction in GGT testing. PDSA cycle #2 successfully reduced the proportion of repeat GGT tests ordered by 12% within two months of implementation, leading to an estimated annualized cost savings of approximately $37,440. Conclusions: Our results establish the effectiveness of CDS systems in reducing laboratory testing overutilization, suggesting their superiority to individual care set targeting interventions, and emphasize the potential for cost-effective CDS development in contemporary healthcare. Funding Agencies: None.
47

Sadchenko, A. V., O. A. Kushnirenko, and E. K. Koshelev. "Anti-interference pulsed laser ranging system." Технология и конструирование в электронной аппаратуре, no. 1-2 (2020): 8–14. http://dx.doi.org/10.15222/tkea2020.1-2.08.

Full text
Abstract:
Pulsed laser rangefinders are cost-effective and practical devices at distances of several tens of kilometers owing to their compactness, portability and energy efficiency. However, measurement accuracy is significantly degraded by pulsed interference arriving at the input of the optical receiver both during the sensing period and while the reflected signal is being received. Algorithms that accumulate and then process the results of several successive measurements slow down decision-making and do not guarantee that the results converge to the true distance. The paper proposes a structural diagram of a laser rangefinder capable of detecting pulsed interference within the range interval and correcting errors that arise in the structure of the signal reflected from the target. The core of the rangefinder circuit is a logical matched filter whose structure contains multipliers (multiplication operations). The following requirements were formulated for the structure of the probe signal: the first element is always set to +1 to synchronize the decision circuit of the receiver; the weight of the coding sequence equals half its length; and the length of the coding sequence is even. Based on these requirements, optimal structures of binary probing signals of length 8 were found that provide the best error-correcting ability. A comparison of the correlation properties of the found sequences with sequences constructed from Walsh functions showed the advantage of the optimal sequences by the criterion of the minimum level of autocorrelation (ACF) side lobes. Simulation of the rangefinder under pulsed noise has shown that the logical filter is advisable when the duration of the interference does not exceed 1/3 of the duration of the probing signal.
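As a rough illustration of the sequence-design constraints quoted above (first element +1, weight equal to half the length, even length 8), the sketch below enumerates all candidate sequences and ranks them by the peak aperiodic autocorrelation sidelobe. The bipolar (±1) representation and the peak-sidelobe criterion are assumptions made for the example; the paper's exact optimality criterion and filter structure are not reproduced here.

```python
from itertools import combinations
import numpy as np

N = 8  # even sequence length, as required in the abstract

def aperiodic_acf_sidelobes(seq: np.ndarray) -> np.ndarray:
    """Aperiodic autocorrelation values for all non-zero lags."""
    full = np.correlate(seq, seq, mode="full")   # lags -(N-1)..(N-1)
    return np.delete(full, N - 1)                # drop the zero-lag peak

candidates = []
# First element fixed to +1; exactly N/2 elements equal to +1 overall,
# so choose the positions of the remaining N/2 - 1 entries set to +1.
for plus_positions in combinations(range(1, N), N // 2 - 1):
    seq = -np.ones(N)
    seq[0] = 1.0
    seq[list(plus_positions)] = 1.0
    peak_sidelobe = np.max(np.abs(aperiodic_acf_sidelobes(seq)))
    candidates.append((peak_sidelobe, seq.astype(int)))

best_level = min(level for level, _ in candidates)
best = [seq for level, seq in candidates if level == best_level]
print(f"minimum peak sidelobe: {best_level:.0f}")
for seq in best:
    print(seq)
```

The search space is tiny (35 candidates), so exhaustive enumeration is enough to list every sequence that attains the minimum peak sidelobe under these constraints.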
APA, Harvard, Vancouver, ISO, and other styles
48

Bagde, Vandana, and C. G. Dethe. "Performance improvement of space diversity technique using space time block coding for time varying channels in wireless environment." International Journal of Intelligent Unmanned Systems 10, no. 2/3 (2020): 278–86. http://dx.doi.org/10.1108/ijius-04-2019-0026.

Full text
Abstract:
Purpose: A recent innovative technology in wireless communication, the multiple input multiple output (MIMO) communication system, has become popular because of the higher data transmission speeds it offers, and it is being examined and implemented for the latest broadband wireless connectivity networks. Although high-capacity wireless channels have been identified, better techniques are still required to increase data transmission speed with acceptable reliability. Two types of systems use multiple antennas at the transmitting and receiving sides: diversity techniques and spatial multiplexing methods. Diversity techniques improve the reliability of the transmitted signal; the fundamental idea of diversity is to transform a wireless channel subject to Rayleigh fading into a steady additive white Gaussian noise (AWGN)-like channel free of catastrophic signal fading. The maximum transmission speed achievable by spatial multiplexing methods is nearly equal to the MIMO channel capacity, whereas for diversity methods the maximum transmission speed is much lower than the MIMO channel capacity. With the advent of the space–time block coding (STBC) antenna diversity technique, higher-speed data transmission is achievable for a spatially multiplexed multiple input multiple output (SM-MIMO) system. At the receiving end, signal detection is a complex task for an SM-MIMO system. Additionally, a link adaptation method is implemented to decide on an appropriate coding and modulation scheme, such as the space diversity technique STBC, so that two-way radio resources are used efficiently. The proposed work attempts to improve signal detection at the receiving end by employing the STBC diversity technique with linear detection methods such as zero forcing (ZF), minimum mean square error (MMSE), ordered successive interference cancellation (OSIC) and maximum likelihood detection (MLD). The performance of MLD has been found to be better than that of the other detection techniques. Design/methodology/approach: Alamouti's STBC uses two transmit antennas regardless of the number of receiver antennas. The encoding and decoding operation of STBC is shown in the earlier cited diagram. In the coding matrix, the rows represent different time instants while the columns represent the symbols transmitted through each antenna; the first and second rows correspond to transmission at the first and second time instant, respectively. At time t, symbols s1 and s2 are transmitted from antenna 1 and antenna 2, respectively. Assuming each symbol has duration T, at time t + T the symbols −s2* and s1*, where (.)* denotes the complex conjugate, are transmitted from antenna 1 and antenna 2, respectively. Case of one receiver antenna: the reception and decoding of the signal depend on the number of receiver antennas available. For one receiver antenna, the received signals are observed at antenna 1 and expressed in terms of hij, the channel transfer function from the jth transmit antenna to the ith receiver antenna, n1, a complex random variable representing the noise at antenna 1, and x(k), which denotes x at time instant k, that is, at time t + (k − 1)T. Findings: The results obtained for maximal ratio combining (MRC) with the 1 × 4 scheme show that the BER curve drops to 10⁻⁴ at a signal-to-noise ratio (SNR) of 10 dB, whereas for the MRC 1 × 2 scheme the BER drops to 10⁻⁵ at an SNR of 20 dB.
Results in Table 1 show that when STBC is employed for MRC with the 1 × 2 scheme (one antenna at the transmitter node and two antennas at the receiver node), the BER comes down to 0.0076 at Eb/N0 = 12. Similarly, when the MRC 1 × 4 antenna scheme is implemented, the BER drops to 0 at Eb/N0 = 12. It can therefore be concluded from the obtained graphs that MRC with STBC gives improved results. When the STBC technique is used with the 3 × 4 scheme, the BER comes close to 10⁻⁶ at an SNR of 10 dB (figure 7.3). Comparing the AWGN and Rayleigh fading channels, the BER for the AWGN channel reaches 0 at an SNR of 13.5 dB, whereas for the Rayleigh fading channel the BER is near 10⁻³ at Eb/N0 = 15. Simulation results (figure 7.2) show the BER dropping to 0 at an SNR of 12 dB. Research limitations/implications: Optimal design and successful deployment of high-performance wireless networks present a number of technical challenges. These include regulatory limits on usable radio-frequency spectrum and a complex time-varying propagation environment affected by fading and multipath. The effect of multipath fading in wireless systems can be reduced by using antenna diversity. Previous studies show the performance of transmit diversity with narrowband signals using linear equalization, decision feedback equalization and maximum likelihood sequence estimation (MLSE), and with spread spectrum signals using a RAKE receiver. The available interference cancellation (IC) techniques compatible with STBC schemes at the transmitter require multiple antennas at the receiver. While this is not a strong constraint at the base station level, it remains a challenge at the handset level because of cost and size limitations. For this reason, the SAIC technique, an alternative to the complex ML multiuser demodulation technique, is still of interest for 4G wireless networks using MIMO technology and STBC in particular. In a system with characteristics similar to the North American digital mobile radio standard IS-54 (24.3 k symbols per second with an 81 Hz fading rate), adaptive retransmission with time deviation is not practical. Practical implications: The evaluation of performance in terms of bit error rate and convergence time indicates that the MLD technique outperforms the others in terms of received SNR and low decoding complexity. MLD performs well, but when a higher number of antennas is used it requires more computational time, resulting in increased hardware complexity. When the MRC scheme is implemented for a single input single output (SISO) system, the BER drops to 10⁻² at an SNR of 20 dB. Therefore, when MIMO systems are employed with the MRC scheme, improved BER-versus-SNR results are obtained and are used for detecting the signal; a comparative study of the different techniques is carried out. Initially the ZF detection method is utilized, which is then modified to ZF with successive interference cancellation (ZFSIC). When the successive interference cancellation scheme is employed in ZFSIC, better performance is observed compared with the ML and MMSE estimates. For the 2 × 2 scheme with QPSK modulation, ZFSIC requires more computational time than the ZF, MMSE and ML techniques. From the obtained results, the conclusion is that ZFSIC gives improved results compared with ZF in terms of BER.
ZF-based decision statistics can be produced by the detection algorithm for a desired sub-stream from the received vector, which contains interference from previously transmitted sub-streams. Consequently, a decision on the secondary stream is made, and the contribution of that stream is regenerated and subtracted from the received vector. Without interference cancellation, system performance is reduced but computational cost is saved. When cancellation is used, H is deflated and the MMSE coefficients are recalculated at each iteration; when cancellation is not used, the MMSE coefficients are computed only once because H remains unchanged. For the MMSE 4 × 4 BPSK scheme, a bit error rate of 10⁻² at 30 dB is observed. In general, the most demanding part of the detection algorithm is the computation of the MMSE coefficients, and this complexity grows as the number of transmitting antennas increases. However, when adaptive MMSE receivers are implemented on slowly fading channels, it is possible to recover the signal with complexity that is linear in the number of transmitter antennas. The performance of MMSE and of MMSE with successive interference cancellation is observed for 2 × 2 and 4 × 4 BPSK and QPSK modulation schemes. The drawback of the MMSE SIC scheme is that the first detected signal observes noise and interference from (NT − 1) signals, while signals processed later observe less interference as the cancellation process progresses. This difficulty can be overcome by the OSIC detection method, which processes the layers successively in order of decreasing signal power, or by allocating power to the transmitted signals according to the processing order. With the successive scheme, a computation of NT delay stages is needed to carry out this process. The work also compares BER for various modulation schemes and numbers of antennas. MLD computes the Euclidean distance between the received signal vector and each possible transmitted signal vector passed through the given channel H, and selects the candidate with the minimum distance. The estimated results show that a higher diversity order is obtained by employing more antennas at both the receiving and transmitting ends. MLD with the 8 × 8 binary phase shift keying (BPSK) scheme offers a bit error rate near 10⁻⁴ at an SNR of 16 dB. By using Alamouti space ti. Social implications: It should come as no surprise that companies everywhere are pushing to get products to market faster. Missing a market window or a design cycle can be a major setback in a competitive environment. It should be equally clear that this pressure is coming at the same time that companies are pushing towards “leaner” organizations that can do more with less. The trends mentioned earlier are not well supported by current test and measurement equipment in this increasingly high-pressure design environment: in order to measure signals across multiple domains, multiple pieces of measurement equipment are needed, increasing capital or rental expenses. The methods available for making cross-domain, time-correlated measurements are inefficient, reducing engineering efficiency.
When the equipment is only used occasionally, the learning curve for logic analysis, time-domain and RF spectrum measurements often requires the operator to re-learn each separate piece of equipment. The equipment needed to measure wide-bandwidth, time-varying spectral signals is expensive, again increasing capital or rental expenses. What is needed is a measurement instrument with a common user interface that integrates multiple measurement capabilities into a single cost-effective tool that can efficiently measure signals in today's wide-bandwidth, time-correlated, cross-domain environments. The market for wireless communication using STBCs has large scope for expansion in India; the proposed work therefore has techno-commercial potential and the product can be patented. The project should in turn be helpful for remote areas of the nearby region, particularly the Gadchiroli district, the Melghat Tiger reserve project of Amravati district, Nagjira and so on, where electricity is not available and network coverage is a constant problem. In some regions where electricity is available, the shortage is such that it cannot be used during peak hours. In such cases, the stand-alone space diversity technique STBC should help meet their requirements by maintaining connections during coverage problems, thereby giving higher data transmission rates with better quality of service (QoS) and fewer dropped connections. The trend towards wireless everywhere is causing a profound change in the responsibilities of embedded designers as they struggle to incorporate unfamiliar RF technology into their designs. Embedded designers frequently find themselves needing to solve problems without the proper equipment for the task. Originality/value: The work is original.
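To make the encoding and combining steps described in the methodology concrete, here is a small simulation sketch of the standard Alamouti 2×1 STBC over a flat Rayleigh fading channel with BPSK symbols, assuming perfect channel knowledge at the receiver. It illustrates the textbook scheme, not the authors' exact simulation setup; the parameters and variable names are chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def alamouti_2x1(num_pairs=50_000, snr_db=10.0):
    """Simulate Alamouti 2x1 STBC with BPSK over flat Rayleigh fading."""
    # BPSK symbol pairs (s1, s2): bit 0 -> +1, bit 1 -> -1.
    bits = rng.integers(0, 2, size=(num_pairs, 2))
    s = 1.0 - 2.0 * bits
    s1, s2 = s[:, 0], s[:, 1]

    # Flat Rayleigh channel coefficients, constant over the two symbol periods.
    h1 = (rng.standard_normal(num_pairs) + 1j * rng.standard_normal(num_pairs)) / np.sqrt(2)
    h2 = (rng.standard_normal(num_pairs) + 1j * rng.standard_normal(num_pairs)) / np.sqrt(2)

    # Complex AWGN; total transmit power is split across the two antennas.
    noise_std = np.sqrt(10 ** (-snr_db / 10) / 2)
    n1 = noise_std * (rng.standard_normal(num_pairs) + 1j * rng.standard_normal(num_pairs))
    n2 = noise_std * (rng.standard_normal(num_pairs) + 1j * rng.standard_normal(num_pairs))

    # Received samples at times t and t + T: antennas send (s1, s2), then (-s2*, s1*).
    r1 = (h1 * s1 + h2 * s2) / np.sqrt(2) + n1
    r2 = (-h1 * np.conj(s2) + h2 * np.conj(s1)) / np.sqrt(2) + n2

    # Alamouti linear combining; for this orthogonal code it is ML-equivalent per symbol.
    y1 = np.conj(h1) * r1 + h2 * np.conj(r2)
    y2 = np.conj(h2) * r1 - h1 * np.conj(r2)

    est_bits = np.stack([np.real(y1) < 0, np.real(y2) < 0], axis=1).astype(int)
    return np.mean(est_bits != bits)

print(f"BER at 10 dB: {alamouti_2x1():.4f}")
```

The combining step collapses the 2×1 code into two independent scalar decisions scaled by |h1|² + |h2|², which is where the second-order diversity gain discussed in the abstract comes from.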
APA, Harvard, Vancouver, ISO, and other styles
49

Paczesny, Sophie, Iztok Hozo, and Benjamin Djulbegovic. "Sinusoidal Obstruction Syndrome (SOS) Biomarkers-Derived Predictive Model to Individualize SOS Preemption in Patients Undergoing Allogeneic Stem Cell Transplant." Blood 142, Supplement 1 (2023): 4895. http://dx.doi.org/10.1182/blood-2023-172813.

Full text
Abstract:
Background: We have previously reported that a predictive model based on 3 biomarkers [L-ficolin, hyaluronic acid (HA), and stimulation 2 (ST2)] can accurately stratify the risk of sinusoidal obstruction syndrome (SOS) into high (&gt;30%) vs low (&lt;5%) risk groups [JCI Insight. 2023;8(10):e168221]. However, improved risk stratification for SOS does not by itself guide the administration of risk-adapted preemptive therapy. Here we set out to assess the clinical utility of the SOS model by integrating its predictions with the consequences (benefits and harms) of preemptive treatment with defibrotide. Methods: We used the data set from the prospective evaluation of the accuracy of the SOS predictive model. We reformulated the SOS predictive model as a simple fast-and-frugal decision tree (FFT). The FFT consisted of sequentially ordered cues on the 3 biomarkers [L-ficolin, HA, and ST2] applied through a series of binary (yes/no) classification decisions according to optimal cutpoints chosen by Youden's index. After assessing the model's performance (discrimination, calibration), we applied a generalized decision curve analysis (gDCA) to integrate the effects of treatment with the predictive accuracy of the model. This, in turn, allowed us to develop individualized recommendations for the preemptive treatment of SOS with defibrotide. We calculated net benefit (NB) to compare 3 management strategies: provide no treatment vs. administer prophylaxis to all patients undergoing allo-transplant vs. use the SOS model to guide preemptive treatment with defibrotide. According to gDCA, the best strategy is the one with the highest net benefit. We informed the gDCA using data from the published randomized trial comparing defibrotide vs. placebo [Lancet. 2012;379:1301]: adverse events (AE) were seen in 2.5% more patients than with placebo, and defibrotide reduced the risk of SOS by 40% (relative risk reduction [RRR]). Results: The model had good performance characteristics. Harrell's C discrimination statistic was 0.77 with good calibration properties (intercept and slope not statistically significantly different from 0 and 1, respectively). Assuming an AE rate of 2.5%, offering defibrotide prophylaxis to all patients represents the best management strategy only for very large treatment effects (RRR &gt;65%). Similarly, for RRR &lt;2.5%, no prophylaxis should be offered. For all other values of treatment effect (RRR between 2.5% and 65%), relying on the SOS model is the best strategy to guide the preemptive use of defibrotide (Figure 1A). Assuming the baseline treatment values [AE = 2.5%, RRR = 40%], the SOS model almost perfectly individualizes the use of prophylaxis: all patients with a risk of SOS &gt;6.25% should be given prophylaxis, otherwise not (Figure 1B). However, the risk threshold was sensitive to the assumptions about defibrotide treatment effects. Conclusion: The SOS biomarker-based model offers a new method to guide risk-adapted prophylactic or preemptive therapy for SOS.
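The quoted 6.25% threshold is consistent with the usual decision-analytic break-even point at which the expected benefit of treating (risk × RRR) equals the expected harm (AE rate): 0.025 / 0.40 = 0.0625. The sketch below reproduces that arithmetic and a simplified expected-benefit comparison of the three strategies. It is an illustration under the stated trial assumptions, not the authors' gDCA code: it weights one adverse event as equally undesirable as one SOS event and uses a hypothetical risk distribution, so the numbers are for orientation only.

```python
import numpy as np

rng = np.random.default_rng(1)

AE = 0.025   # absolute harm rate of defibrotide prophylaxis (from the quoted trial)
RRR = 0.40   # relative risk reduction for SOS

# Treat when expected benefit (risk * RRR) exceeds expected harm (AE).
threshold = AE / RRR
print(f"treatment threshold: {threshold:.4f}")   # -> 0.0625, i.e. 6.25%

# Hypothetical predicted SOS risks for a cohort, and untreated outcomes drawn from them.
risks = rng.beta(a=0.5, b=6.0, size=10_000)      # skewed toward low risk
events = rng.random(risks.size) < risks          # SOS occurs if untreated

def expected_net_benefit(treat_mask):
    """Events prevented minus treatment harm, per patient, on a common scale."""
    prevented = RRR * np.mean(events & treat_mask)   # SOS events avoided by treating
    harm = AE * np.mean(treat_mask)                  # adverse events caused by treating
    return prevented - harm

strategies = {
    "treat none":   np.zeros(risks.size, dtype=bool),
    "treat all":    np.ones(risks.size, dtype=bool),
    "model-guided": risks > threshold,
}
for name, mask in strategies.items():
    print(f"{name:>12s}: net benefit = {expected_net_benefit(mask):+.4f}")
```

Under these assumptions the model-guided rule dominates whenever a meaningful share of patients sits on either side of the threshold, which mirrors the abstract's conclusion that the model-based strategy wins for intermediate treatment effects.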
APA, Harvard, Vancouver, ISO, and other styles
50

Das, Apangshu, Vivek Kumar Singh, and Sambhu Nath Pradhan. "Shared reduced ordered binary decision diagram‐based thermal‐aware network synthesis." International Journal of Circuit Theory and Applications, March 2022. http://dx.doi.org/10.1002/cta.3255.

Full text
APA, Harvard, Vancouver, ISO, and other styles