Entropy and Quantity of Information in Technical Designations

Authors

  • V. G. Kryzhanovskyi, Vasyl Stus Donetsk National University, Vinnytsia

DOI:

https://doi.org/10.31649/1997-9266-2023-167-2-58-65

Keywords:

entropy, information, technical notation, algebraic theory of entropy, entropy of classification, information algebra, theory of hints

Abstract

Conventional designations of integrated microcircuits are considered as an example of the classification and abbreviated naming (coding) of technical products, in order to answer the question: why are some designation systems said to be "more informative"? Do such notations actually contain more information than other systems? These questions are closely related to the tasks of machine learning and the construction of the "semantic web". Based on an algebraic approach and set theory, the entropy characteristics of the classification of designations are examined, and it is shown that the entropy of such a coded designation is lower than that of an arbitrary system for recording technical characteristics. This is explained by the positional structure of the designation and, accordingly, the lower cardinality of the sets that make up a specific designation. Using the approach of information algebra, it is confirmed that imposing an atomic structure on the sets to which the technical characteristics correspond does indeed match the mathematical definition of a more informative structure. Based on the mathematical theory of hints, the structure of the technical designation is analyzed, and the possibility of obtaining additional information, such as relationships between different groups of technical parameters, is indicated; this information is obtained through questions that clarify the interpretation of existing answers. This follows from a property of hint entropy, which has two components, the Shannon entropy and the generalized Hartley measure, corresponding respectively to probabilistic information about the true interpretation of the answer within a set and to relational information about the true answer concerning some type of integrated-circuit parameter.
Technical notation thus proves to be an effective example to which the mathematical theories considered can be applied, and accordingly serves as an example of a code that, on the one hand, can be understood by a human and, on the other, can be used in machine information-processing systems.
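The two-component hint entropy mentioned in the abstract can be sketched numerically. In the generalized information theory for hints (Pouly, Kohlas, and Ryan), a hint assigns probabilities to focal sets of possible interpretations; its entropy is the Shannon entropy over those probabilities plus the expected Hartley measure of the focal sets. The designation example below is hypothetical, chosen only to illustrate the decomposition, and is not taken from the paper:

```python
import math

def hint_entropy(focal_sets):
    """Entropy of a hint, split into its two components.

    focal_sets: list of (probability, set_of_interpretations) pairs.
    Returns (shannon, hartley, total):
      - shannon: Shannon entropy over the focal-set probabilities
      - hartley: expected generalized Hartley measure, sum p_i * log2(|F_i|)
    """
    shannon = -sum(p * math.log2(p) for p, _ in focal_sets if p > 0)
    hartley = sum(p * math.log2(len(s)) for p, s in focal_sets)
    return shannon, hartley, shannon + hartley

# Hypothetical example: a designation either pins down one circuit
# exactly (p = 0.5) or only narrows it to four candidates (p = 0.5).
precise = (0.5, {"KR1533LA3"})
vague = (0.5, {"KR1533LA1", "KR1533LA2", "KR1533LA3", "KR1533LA4"})
s, h, total = hint_entropy([precise, vague])
# Shannon part: 1 bit (which focal set is true);
# Hartley part: 0.5*log2(1) + 0.5*log2(4) = 1 bit (residual ambiguity).
```

A singleton focal set contributes nothing to the Hartley part, which matches the abstract's point that a more atomic (positional) designation structure carries lower residual entropy.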

Author Biography

V. G. Kryzhanovskyi, Vasyl Stus Donetsk National University, Vinnytsia

Dr. Sc. (Eng), Professor, Professor of the Chair of Applied Mathematics and Cybersecurity

References

P. Hitzler, “A review of the semantic web field,” Communications of the ACM, vol. 64, no. 2, pp. 76-83, 2021. https://doi.org/10.1145/3397512.

A. I. Kataieva, “Application of knowledge bases to unstructured text information,” in Proceedings of the Scientific Conference of Faculty, Researchers and Degree Candidates on the Results of Research Work in 2019–2020, Vinnytsia: DonNU, April–May 2021, pp. 324-326.

A. Yu. Berko, O. M. Veres, and V. V. Pasichnyk, Database and Knowledge Base Systems. Book 1. Organization of Databases and Knowledge Bases. Magnolia-2006, 2013, 680 p.

OST 11 073.915-80, Integrated Microcircuits. Classification and System of Conventional Designations. Effective from January 1, 1980.

U. Eco, The Open Work, translated by Anna Cancogni, with an introduction by David Robey. Cambridge, MA: Harvard University Press, 1989, 290 p.

R. M. Gray, Entropy and Information Theory. Springer New York, NY, 2013, 355 p.

K. Baclawski and D. A. Simovici, “A characterization of the information content of a classification,” Information Processing Letters, vol. 57, no. 4, pp. 211-214, February 1996.

P. Fejer, and D. Simovici, Mathematical Foundations of Computer Science, Springer, New York, 1990.

J. Kohlas, Information Algebras: Generic Structures for Inference. Discrete Mathematics and Theoretical Computer Science series, Springer, ISSN 1439-9911, ISBN 978-1-85233-689-9.

A. Janssen and K. Immink, “An Entropy Theorem for Computing the Capacity of Weakly Constrained Sequences,” IEEE Trans. on Information Theory, vol. 46, no. 3, pp. 1034-1038, May 2000.

R. N. Kvietnyi, P. P. Povidaiko, M. M. Kompanets, V. V. Harmash, and Ya. A. Kulyk, Arithmetic Foundations of Microprocessor System Design, textbook. Vinnytsia: VNTU, 2017, 111 p.

J. Kohlas, “The mathematical theory of evidence — A short introduction,” in J. Doležal and J. Fidler, Eds., System Modelling and Optimization. IFIP, Boston, MA: Springer, 1996, pp. 37-53. https://doi.org/10.1007/978-0-387-34897-1.

M. Pouly, J. Kohlas, and P. Y. A. Ryan, “Generalized Information Theory for Hints,” International Journal of Approximate Reasoning, vol. 54, no. 1, pp. 228-251, January 2013. https://doi.org/10.1016/j.ijar.2012.08.004.

TsEOM, Integrated Microcircuits of the KR1533 Series. [Online]. Available: https://ksm.nau.edu.ua/arhitectura/files/ims1533.pdf. Accessed: January 21, 2023.

A. D. Danilova, A. I. Radchenko, and T. M. Yatskiv, Methodological Recommendations for Implementing Digital Identifiers in the Publishing Process for Periodicals of the National Academy of Sciences of Ukraine, PA “Ukrinformnauka”, 3rd ed., revised and supplemented. Kyiv: Akademperiodyka, 2019, 60 p.

A. Saha and N. Manna, Digital Principles and Logic Design. Infinity Science Press LLC, 2007, 505 p. ISBN: 978-1-934015-03-2.


Published

2023-05-04

How to Cite

[1]
V. G. Kryzhanovskyi, “Entropy and Quantity of Information in Technical Designations”, Вісник ВПІ, no. 2, pp. 58–65, May 2023.

Issue

Section

Information technologies and computer sciences
