Boolean Functions Implementation in Perceptron Structures

Authors

  • S. V. Yakovyn, Ivano-Frankivsk National Technical University of Oil and Gas
  • S. I. Melnychuk, Ivano-Frankivsk National Technical University of Oil and Gas

DOI:

https://doi.org/10.31649/1997-9266-2024-176-5-48-55

Keywords:

perceptron structures, binary signals, Boolean functions, signal processing, probabilistic characteristics

Abstract

The authors consider the peculiarities of implementing Boolean functions with perceptron structures. The limitations of the typical perceptron structure are identified, in particular its ability to realize only linearly separable functions, and the main approaches to overcoming these limitations, which rely on complicating the structure, are outlined.
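A minimal sketch of this limitation, with illustrative weights chosen for the example rather than taken from the paper: a single threshold unit realizes the linearly separable functions AND and OR directly, while a brute-force search over a small weight grid finds no single-unit realization of XOR.

```python
# Minimal sketch (illustrative weights, not from the paper): a single
# threshold unit realizes linearly separable Boolean functions such as
# AND and OR, but no weight/bias choice reproduces XOR.

def unit(x1, x2, w1, w2, bias):
    """Threshold (perceptron) unit: outputs 1 when the weighted sum is positive."""
    return 1 if w1 * x1 + w2 * x2 + bias > 0 else 0

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]

AND = [unit(x1, x2, 1.0, 1.0, -1.5) for x1, x2 in inputs]  # -> [0, 0, 0, 1]
OR = [unit(x1, x2, 1.0, 1.0, -0.5) for x1, x2 in inputs]   # -> [0, 1, 1, 1]

# XOR is not linearly separable: an exhaustive search over a small grid of
# weights and biases never matches the target truth table [0, 1, 1, 0].
grid = [i / 2 for i in range(-4, 5)]
xor_found = any(
    [unit(x1, x2, w1, w2, b) for x1, x2 in inputs] == [0, 1, 1, 0]
    for w1 in grid for w2 in grid for b in grid
)
print(AND, OR, xor_found)  # [0, 0, 0, 1] [0, 1, 1, 1] False
```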

Attention is also paid to probabilistic features that can improve perceptron training, including probabilistic learning algorithms and higher-order probabilistic perceptrons built on Bayesian probabilities. The problems of implementing higher-order Boolean functions are considered, including the need for higher-order polynomials, the complexity of training, and computational practicality. Approaches to decomposing higher-order Boolean functions into a series of linearly separable functions are presented.
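One standard way to see the decomposition idea, given here as an illustration rather than as the authors' specific scheme, is to rewrite XOR as AND(OR(a, b), NAND(a, b)): each subfunction is linearly separable, so a two-layer cascade of threshold units realizes the non-separable function.

```python
# Illustrative decomposition (an assumed example, not the paper's method):
# XOR(a, b) = AND(OR(a, b), NAND(a, b)), a cascade of linearly separable
# functions, each realized by a single threshold unit.

def unit(x1, x2, w1, w2, bias):
    return 1 if w1 * x1 + w2 * x2 + bias > 0 else 0

def xor_cascade(a, b):
    h_or = unit(a, b, 1.0, 1.0, -0.5)        # OR:   linearly separable
    h_nand = unit(a, b, -1.0, -1.0, 1.5)     # NAND: linearly separable
    return unit(h_or, h_nand, 1.0, 1.0, -1.5)  # AND of the hidden outputs

print([xor_cascade(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])
# -> [0, 1, 1, 0]
```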

In addition, the capabilities and limitations of single-layer perceptrons for classifying linearly separable objects with binary outputs are analyzed. Options for improving single-layer perceptron structures with power functions and probabilistic signal indicators, extending their classification capabilities, are described.
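As a hedged sketch of the power-function idea (the product feature x1*x2 and the weights below are assumed for the example, not prescribed by the paper), augmenting the inputs of a single-layer unit with one nonlinear term makes XOR linearly separable in the expanded feature space:

```python
# Assumed illustration of extending a single-layer structure with a power
# (product) feature: over (x1, x2, x1*x2) one threshold unit separates XOR,
# which is impossible over (x1, x2) alone.

def augmented_unit(x1, x2, w1, w2, w3, bias):
    s = w1 * x1 + w2 * x2 + w3 * (x1 * x2) + bias
    return 1 if s > 0 else 0

print([augmented_unit(x1, x2, 1.0, 1.0, -2.0, -0.5)
       for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]])
# -> [0, 1, 1, 0]
```

The same trick underlies higher-order (polynomial) perceptrons: the decision surface stays linear in the augmented features while becoming nonlinear in the original binary inputs.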

The results obtained confirm the relevance and promise of further research and of developing new perceptron structures for more effective solution of machine learning problems, including specialized structures and learning algorithms for working with higher-order functions.

Author Biographies

S. V. Yakovyn, Ivano-Frankivsk National Technical University of Oil and Gas

Post-Graduate Student of the Chair of Computer Systems and Networks

S. I. Melnychuk, Ivano-Frankivsk National Technical University of Oil and Gas

Dr. Sc. (Eng.), Professor, Head of the Chair of Computer Systems and Networks



Published

2024-10-31

How to Cite

[1] S. V. Yakovyn and S. I. Melnychuk, “Boolean Functions Implementation in Perceptron Structures”, Вісник ВПІ, no. 5, pp. 48–55, Oct. 2024.

Section

Information technologies and computer sciences
