Norbert Jankowski

Chapter 1 List of Publications

  • [1] N. Jankowski. “A fast and efficient algorithm for filtering the training dataset”. In: Neural Information Processing. Vol. 13623. Cham: Springer International Publishing, 2022, pp. 504–512. doi: 10.1007/978-3-031-30105-6_42.

  • [2] N. Jankowski. “Revdbscan and Flexscan—O(n log n) clustering algorithms”. In: Neural Information Processing. Ed. by Teddy Mantoro et al. Cham: Springer International Publishing, 2021, pp. 642–650. doi: 10.1007/978-3-030-92307-5_75.

  • [3] M. Orliński and N. Jankowski. “Fast t-SNE algorithm with forest of balanced LSH trees and hybrid computation of repulsive forces”. In: Knowledge-Based Systems 206 (2020), pp. 1–16. doi: 10.1016/j.knosys.2020.106318.

  • [4] M. Orliński and N. Jankowski. “O(m log m) instance selection algorithms—RR-DROPs”. In: IEEE World Congress on Computational Intelligence. IEEE Press, 2020, pp. 1–8. doi: 10.1109/IJCNN48605.2020.9207158. url: http://www.is.umk.pl/~norbert/publications/20-FastDROP.pdf.

  • [5] N. Jankowski and M. Orliński. “Fast Encoding length-based prototype selection algorithms”. In: Australian Journal of Intelligent Information Processing Systems, Special Issue: Neural Information Processing 26th International Conference on Neural Information Processing 16.3 (2019), pp. 59–66. url: http://ajiips.com.au/iconip2019/docs/ajiips/v16n3.pdf.

  • [6] N. Jankowski and M. Orliński. “Fast algorithm for prototypes selection—Trust-Margin prototypes”. In: Artificial Intelligence and Soft Computing. Ed. by L. Rutkowski et al. Vol. 11508. Lecture Notes in Computer Science. Springer, 2019, pp. 583–594. doi: 10.1007/978-3-030-20912-4_53.

  • [7] N. Jankowski and R. Linowiecki. “Fast neural networks learning algorithm by approximate singular value decomposition”. In: International Journal of Applied Mathematics and Computer Science 29.3 (2019), pp. 581–594. doi: 10.2478/amcs-2019-0043.

  • [8] N. Jankowski. “Comparison of prototype selection algorithms used in construction of neural networks learned by SVD”. In: International Journal of Applied Mathematics and Computer Science 28.4 (2018), pp. 719–733. doi: 10.2478/amcs-2018-0055.

  • [9] N. Jankowski. “Prototype-based kernels for extreme learning machines and radial basis function networks”. In: Artificial Intelligence and Soft Computing. Ed. by L. Rutkowski et al. Lecture Notes in Computer Science. Springer, 2018, pp. 70–75. url: http://www.is.umk.pl/~norbert/publications/18-nj-protoELM.pdf.

  • [10] N. Jankowski. “The multi-ranked classifiers comparison”. In: Computer Recognition Systems. Advances in Intelligent Systems and Computing. Springer-Verlag, 2016, pp. 111–123. url: http://www.is.umk.pl/~norbert/publications/15-nj-classifierComp.pdf.

  • [11] N. Jankowski. “Complexity-based test task ordering for meta-learning algorithms”. In: (2013). (submitted).

  • [12] N. Jankowski. “Meta-learning and new ways in model construction for classification problems”. In: Journal of Network & Information Security 4.4 (2013), pp. 275–284. url: http://www.is.umk.pl/~norbert/publications/13-mm.pdf.

  • [13] N. Jankowski. “Complexity measures for meta-learning and their optimality”. In: Algorithmic probability and friends. Ed. by D. L. Dowe. Vol. 7070. Lecture Notes in Computer Science. Springer-Verlag, 2013, pp. 198–210. url: http://www.is.umk.pl/~norbert/publications/11-cmplxOptimal.pdf.

  • [14] N. Jankowski. “Fast heterogeneous boosting”. In: IEEE Symposium Series on Computational Intelligence, Computational Intelligence and Ensemble Learning. IEEE Press, 2013, pp. 1–8. url: http://www.is.umk.pl/~norbert/publications/13-JankowskiHBoost.pdf.

  • [15] N. Jankowski. “Graph-based generation of meta-learning search space”. In: International Journal of Applied Mathematics and Computer Science 22.3 (2012), pp. 647–667. url: http://www.is.umk.pl/~norbert/publications/11-njMLSpaces.pdf.

  • [16] W. Duch, N. Jankowski, and T. Maszczyk. “Make it cheap: learning with O(nd) complexity”. In: IEEE World Congress on Computational Intelligence. IEEE Press, 2012, pp. 2520–2527. url: http://www.is.umk.pl/~norbert/publications/12-WCCI-WDNJTM-v2.pdf.

  • [17] N. Jankowski. Meta-uczenie w inteligencji obliczeniowej [Meta-learning in computational intelligence]. 396 pages. Warsaw, Poland: Akademicka Oficyna Wydawnicza EXIT, 2011. url: http://www.exit.pl/meta.htm.

  • [18] N. Jankowski and K. Usowicz. “Analysis of feature weighting methods based on feature ranking methods for classification”. In: Neural information processing. Part II. Vol. 7063. Lecture Notes in Computer Science. Springer-Verlag, 2011, pp. 238–247. url: http://www.is.umk.pl/~norbert/publications/11-featweight.pdf.

  • [19] N. Jankowski, W. Duch, and K. Grąbczewski, eds. Meta-learning in computational intelligence. Studies in Computational Intelligence. Springer, 2011. url: http://www.springer.com/engineering/computational+intelligence+and+complexity/book/978-3-642-20979-6.

  • [20] N. Jankowski, W. Duch, and K. Grąbczewski. “Preface”. In: Meta-learning in computational intelligence. Ed. by N. Jankowski, W. Duch, and K. Grąbczewski. Studies in Computational Intelligence. Springer, 2011. url: http://www.springer.com/engineering/computational+intelligence+and+complexity/book/978-3-642-20979-6.

  • [21] N. Jankowski and K. Grąbczewski. “Universal Meta-learning Architecture and Algorithms”. In: Meta-learning in Computational Intelligence. Ed. by N. Jankowski, W. Duch, and K. Grąbczewski. Studies in Computational Intelligence. Springer, 2011, pp. 1–76. url: http://www.is.umk.pl/~norbert/publications/09-MLbook-njkg.pdf.

  • [22] K. Grąbczewski and N. Jankowski. “Saving time and memory in computational intelligence system with machine unification and task spooling”. In: Knowledge-Based Systems 24.5 (2011), pp. 570–588. url: http://www.is.umk.pl/~norbert/publications/09-GrabczewskiJankowskiIntemi.pdf.

  • [23] N. Jankowski and K. Grąbczewski. “Increasing efficiency of data mining systems by machine unification and double machine cache”. In: Artificial Intelligence and Soft Computing. Vol. 6113. Lecture Notes in Computer Science. Springer, 2010, pp. 380–387. url: http://www.is.umk.pl/~norbert/publications/10-JankowskiGrabczewskiICAISC2010.pdf.

  • [24] K. Grąbczewski and N. Jankowski. “Task Management in Advanced Computational Intelligence System”. In: Artificial Intelligence and Soft Computing. Vol. 6113. Lecture Notes in Computer Science. Springer, 2010, pp. 331–338. url: http://www.is.umk.pl/~norbert/publications/10-GrabczewskiJankowskiICAISC2010.pdf.

  • [25] N. Jankowski and K. Grąbczewski. “Building meta-learning algorithms basing on search controlled by machine’s complexity and machines generators”. In: IEEE World Congress on Computational Intelligence. IEEE Press, 2008, pp. 3600–3607. url: http://www.is.umk.pl/~norbert/publications/08-meta3.pdf.

  • [26] K. Grąbczewski and N. Jankowski. “Meta-learning with machine generators and complexity controlled exploration”. In: Artificial Intelligence and Soft Computing. Vol. 5097. Lecture Notes in Computer Science. Springer-Verlag, 2008, pp. 545–555. url: http://www.is.umk.pl/~norbert/publications/08-meta2.pdf.

  • [27] K. Grąbczewski and N. Jankowski. “Control of complex machines for meta-learning in computational intelligence”. In: Computational Intelligence, Man-Machine Systems and Cybernetics. WSEAS, 2007, pp. 287–293. url: http://www.is.umk.pl/~norbert/publications/07-meta-manip-wseas.pdf.

  • [28] K. Grąbczewski and N. Jankowski. “Meta-learning as scheme-based search with complexity control”. In: International Joint Conference on Neural Networks. Workshop on Meta-Learning. USA: IEEE Press, 2007, pp. 3–8. url: http://www.is.umk.pl/~norbert/publications/07-meta-manip.pdf.

  • [29] N. Jankowski and K. Grąbczewski. “Gained knowledge exchange and analysis for meta-learning”. In: Proceedings of International Conference on Machine Learning and Cybernetics. Hong Kong, China: IEEE Press, 2007, pp. 795–802. url: http://www.is.umk.pl/~norbert/publications/07-IntResRepIEEE.pdf.

  • [30] N. Jankowski and K. Grąbczewski. “Learning machines information distribution system with example applications”. In: Computer Recognition Systems 2. Vol. 45. Lecture Notes in Computer Science. Springer, 2007, pp. 205–215. url: http://www.is.umk.pl/~norbert/publications/07-IntResRepCORES.pdf.

  • [31] K. Grąbczewski and N. Jankowski. “Meta-learning architecture for knowledge representation and management in computational intelligence”. In: International Journal of Information Technology and Intelligent Computing 2.2 (2007), p. 27. url: http://www.is.umk.pl/~norbert/publications/07-Intemi-ITIC.pdf.

  • [32] N. Jankowski and K. Grąbczewski. “Handwritten Digit Recognition — Road to Contest Victory”. In: IEEE Symposium Series on Computational Intelligence. USA: IEEE Press, 2007, pp. 491–498. url: http://www.is.umk.pl/~norbert/publications/07-ocr.pdf.

  • [33] K. Grąbczewski and N. Jankowski. “Toward Versatile and Efficient Meta-Learning: Knowledge Representation and Management in Computational Intelligence”. In: IEEE Symposium Series on Computational Intelligence. USA: IEEE Press, 2007, pp. 51–58. url: http://www.is.umk.pl/~norbert/publications/07-intemi.pdf.

  • [34] N. Jankowski and K. Grąbczewski. “Learning machines”. In: Feature extraction, foundations and Applications. Ed. by Isabelle Guyon et al. Studies in fuzziness and soft computing. Springer, 2006, pp. 29–64. url: http://www.is.umk.pl/~norbert/publications/05-LM-chapter.pdf.

  • [35] K. Grąbczewski and N. Jankowski. “Mining for complex models comprising feature selection and classification”. In: Feature extraction, foundations and Applications. Ed. by Isabelle Guyon et al. Studies in fuzziness and soft computing. Springer, 2006, pp. 473–489. url: http://www.is.umk.pl/~norbert/publications/05-nips-sel.pdf.

  • [36] N. Jankowski and K. Grąbczewski. “Heterogenous Committees with Competence Analysis”. In: Fifth International Conference on Hybrid Intelligent Systems. Ed. by N. Nedjah et al. Rio de Janeiro, Brazil: IEEE Computer Society, Nov. 2005, pp. 417–422. url: http://www.is.umk.pl/~norbert/publications/05-compComm.pdf.

  • [37] K. Grąbczewski and N. Jankowski. “Feature Selection with Decision Tree Criterion”. In: Fifth International Conference on Hybrid Intelligent Systems. Ed. by N. Nedjah et al. Rio de Janeiro, Brazil: IEEE Computer Society, Nov. 2005, pp. 212–217. url: http://www.is.umk.pl/~norbert/publications/05-fsel.pdf.

  • [38] W. Duch, N. Jankowski, and K. Grąbczewski. “Computational intelligence methods for information understanding and information management”. In: The 4th International Conference on Information and Management Sciences (IMS2005). Kunming, China: California Polytechnic State University, 2005, pp. 281–287. url: http://www.is.umk.pl/~norbert/publications/05-CI-info.pdf.

  • [39] N. Jankowski and M. Grochowski. “Instances selection algorithms in the conjunction with LVQ”. In: Artificial Intelligence and Applications. Ed. by M. H. Hamza. Innsbruck, Austria: ACTA Press, Feb. 2005, pp. 453–459. url: http://www.is.umk.pl/~norbert/publications/05-aia-NJMG.pdf.

  • [40] N. Jankowski and M. Grochowski. “Comparison of instances selection algorithms: II. Algorithms survey”. In: Artificial Intelligence and Soft Computing. Ed. by Leszek Rutkowski et al. Vol. 3070. Lecture Notes in Computer Science. Zakopane, Poland: Springer-Verlag, 2004, pp. 598–603. url: http://www.is.umk.pl/~norbert/publications/04-zakopane-NJMG.pdf.

  • [41] M. Grochowski and N. Jankowski. “Comparison of instances selection algorithms: I. Results and comments”. In: Artificial Intelligence and Soft Computing. Ed. by Leszek Rutkowski et al. Vol. 3070. Lecture Notes in Computer Science. Zakopane, Poland: Springer-Verlag, 2004, pp. 580–585. url: http://www.is.umk.pl/~norbert/publications/04-zakopane-NJMGb.pdf.

  • [42] N. Jankowski, K. Grąbczewski, and W. Duch. GhostMiner 3.0 User Guide. 262 pages. FQS Poland, Fujitsu. Kraków, Poland, 2004.

  • [43] N. Jankowski, K. Grąbczewski, and W. Duch. GhostMiner 3.0 Tutorials. 136 pages. FQS Poland, Fujitsu. Kraków, Poland, 2004.

  • [44] N. Jankowski, K. Grąbczewski, and W. Duch. GhostMiner 2.0. FQS Poland, Fujitsu. Kraków, Poland, 2003.

  • [45] N. Jankowski. Ontogeniczne sieci neuronowe. O sieciach zmieniających swoją strukturę [Ontogenic neural networks: on networks that change their structure]. 311 pages. Warsaw: Akademicka Oficyna Wydawnicza EXIT, 2003. url: http://exit.pl/ontogen.htm.

  • [46] N. Jankowski and K. Grąbczewski. “Toward optimal SVM”. In: The Third IASTED International Conference on Artificial Intelligence and Applications. The International Association of Science and Technology for Development. Anaheim, Calgary, Zurich: ACTA Press, Sept. 2003, pp. 451–456. url: http://www.is.umk.pl/~norbert/publications/03-njkg-aia.pdf.

  • [47] N. Jankowski. “Discrete feature weighting & selection algorithm”. In: 2003 International Joint Conference on Neural Networks. Portland, USA: The IEEE Neural Networks Society, July 2003, pp. 636–641. url: http://www.is.umk.pl/~norbert/publications/03-nj-ijcnn.pdf.

  • [48] K. Grąbczewski and N. Jankowski. “Symbolic data transformations for continuous data oriented models”. In: International Conference on Artificial Neural Networks. Vol. 2714. Lecture Notes in Computer Science. Springer, 2003, pp. 359–366. url: http://www.is.umk.pl/~norbert/publications/03-kgnj-Cont.pdf.

  • [49] N. Jankowski. “Discrete quasi-gradient features weighting algorithm”. In: Neural Networks and Soft Computing. Ed. by L. Rutkowski and J. Kacprzyk. Vol. 19. Lecture Notes in Computer Science. Springer-Verlag, 2002, pp. 194–199. url: http://www.is.umk.pl/~norbert/publications/02-nj-zakopane.pdf.

  • [50] N. Jankowski and K. Grąbczewski. “From LaTeX to HTML Help”. In: Proceedings of the XIII European Conference. 2002, pp. 102–105. url: http://www.is.umk.pl/~norbert/publications/02-latex2chm.pdf.

  • [51] N. Jankowski and W. Duch. “Optimal transfer function neural networks”. In: 9th European Symposium on Artificial Neural Networks. Bruges, Belgium, Apr. 2001, pp. 101–106. url: http://www.is.umk.pl/~norbert/publications/01-nj-esann.pdf.

  • [52] W. Duch and N. Jankowski. “Transfer functions: hidden possibilities for better neural networks”. In: 9th European Symposium on Artificial Neural Networks. Bruges, Belgium, Apr. 2001, pp. 81–94. url: http://www.is.umk.pl/~norbert/publications/01-nj-esannB.pdf.

  • [53] W. Duch et al. “Neural methods of knowledge extraction”. In: Control and Cybernetics 29.4 (2000), pp. 997–1018. url: http://www.is.umk.pl/~norbert/publications/00-nj-cc.pdf.

  • [54] W. Duch et al. “Understanding the data: extraction, optimization and interpretation of logical rules”. In: 7th International Conference on Neural Information Processing. Dae-jong, Korea, Nov. 2000, p. 53.

  • [55] N. Jankowski and Jerzy Gomuła. “Simultaneous Differential Diagnoses Basing on MMPI Inventory Using Neural Networks and Decision Trees Methods”. In: Statistics and Clinical Practice. Ed. by L. Bobrowski et al. Warsaw, Poland, June 2000, pp. 89–95. url: http://www.fizyka.umk.pl/publications/kmk/00scp-nj.pdf.

  • [56] N. Jankowski. “Data regularization”. In: Neural Networks and Soft Computing. Ed. by L. Rutkowski and R. Tadeusiewicz. Zakopane, Poland, June 2000, pp. 209–214. url: http://www.fizyka.umk.pl/publications/kmk/00datareg-nj.pdf.

  • [57] N. Jankowski. “Probabilistic intervals of confidence”. In: Neural Networks and Soft Computing. Ed. by L. Rutkowski and R. Tadeusiewicz. Zakopane, Poland, June 2000, pp. 215–220. url: http://www.fizyka.umk.pl/publications/kmk/00pic-nj.pdf.

  • [58] W. Duch and N. Jankowski. “Taxonomy of neural transfer functions”. In: International Joint Conference on Neural Networks. Ed. by Shun-Ichi Amari et al. Vol. III. Como, Italy & Los Alamitos, California: Computer Society and IEEE, July 2000, pp. 477–484. url: http://www.fizyka.umk.pl/publications/kmk/00ijcnn-duchnj.pdf.

  • [59] W. Duch et al. “Optimization and interpretation of rule-based classifiers”. In: Intelligent Information Systems. Advances in Soft Computing. Bystra, Poland: Springer-Verlag, June 2000, pp. 1–14. url: http://www.fizyka.umk.pl/publications/kmk/00iis-r.pdf.

  • [60] W. Duch et al. “Extraction of knowledge from data using Computational Intelligence methods”. In: 7th International Conference on Neural Information Processing. Dae-jong, Korea, Nov. 2000, p. 53.

  • [61] N. Jankowski and W. Duch. “Ontogeniczne Sieci Neuronowe” [Ontogenic Neural Networks]. In: Sieci Neuronowe [Neural Networks]. Ed. by Włodzisław Duch et al. Biocybernetyka i inżynieria biomedyczna [Biocybernetics and Biomedical Engineering]. Warszawa: Akademicka Oficyna Wydawnicza EXIT, 2000, pp. 257–294. url: http://www.fizyka.umk.pl/publications/kmk/99bc-nj.pdf.

  • [62] N. Jankowski. “Ontogenic neural networks and their applications to classification of medical data”. PhD thesis. Toruń, Poland: Department of Computer Methods, Nicholas Copernicus University, 1999, p. 200. url: http://www.fizyka.umk.pl/publications/kmk/99phd-nj.pdf.

  • [63] N. Jankowski. “Neural Turing Machine”. Summer School Conference on Connectionist Modelling, Oxford (slides). July 1999. url: http://www.fizyka.umk.pl/publications/kmk/99turing-nj.pdf.

  • [64] W. Duch and N. Jankowski. “Survey of Neural Transfer Functions”. In: Neural Computing Surveys 2 (1999), pp. 163–212. url: http://www.fizyka.umk.pl/publications/kmk/99ncs.pdf.

  • [65] N. Jankowski. “Approximation and Classification in Medicine with IncNet Neural Networks”. In: Machine Learning and Applications. Workshop on Machine Learning in Medical Applications. Hellenic Artificial Intelligence Society. Chania, Greece, July 1999, pp. 53–58. url: http://www.fizyka.umk.pl/publications/kmk/99acai-nj.pdf.

  • [66] N. Jankowski. Flexible Transfer Functions with Ontogenic Neural Networks. Tech. rep. Toruń, Poland: Computational Intelligence Lab, DCM NCU, 1999. url: http://www.fizyka.umk.pl/publications/kmk/99nj-a.pdf.

  • [67] N. Jankowski. “Approximation with RBF-type Neural Networks using flexible local and semi-local transfer functions”. In: 4th Conference on Neural Networks and Their Applications. Polish Neural Networks Society. Zakopane, Poland, May 1999, pp. 77–82. url: http://www.fizyka.umk.pl/publications/kmk/99zakop-nj.pdf.

  • [68] N. Jankowski. “Controlling the Structure of Neural Networks that Grow and Shrink”. In: Second International Conference on Cognitive and Neural Systems. Boston, USA, May 1998. url: http://www.fizyka.umk.pl/publications/kmk/njboston.pdf.

  • [69] N. Jankowski and V. Kadirkamanathan. “Statistical Control of RBF-like Networks for Classification”. In: 7th International Conference on Artificial Neural Networks. Vol. 1327. Lecture Notes in Computer Science. Lausanne, Switzerland: Springer-Verlag, Oct. 1997, pp. 385–390. url: http://www.fizyka.umk.pl/publications/kmk/icann97nj.pdf.

  • [70] N. Jankowski and V. Kadirkamanathan. “Statistical Control of Growing and Pruning in RBF-like Neural Networks”. In: Third Conference on Neural Networks and Their Applications. Kule, Poland: Polish Neural Networks Society, Oct. 1997, pp. 663–670. url: http://www.fizyka.umk.pl/publications/kmk/kule97.pdf.

  • [71] R. Adamczak, W. Duch, and N. Jankowski. “New developments in the Feature Space Mapping model”. In: Third Conference on Neural Networks and Their Applications. Kule, Poland: Polish Neural Networks Society, Oct. 1997, pp. 65–70. url: http://www.fizyka.umk.pl/publications/kmk/fsm-97.pdf.

  • [72] W. Duch, R. Adamczak, and N. Jankowski. “Initialization of adaptive parameters in density networks”. In: Third Conference on Neural Networks and Their Applications. Kule, Poland, Oct. 1997, pp. 99–104. url: http://www.fizyka.umk.pl/publications/kmk/initfsm.pdf.

  • [73] W. Duch, R. Adamczak, and N. Jankowski. “Initialization and optimization of multilayered perceptrons”. In: Third Conference on Neural Networks and Their Applications. Kule, Poland, Oct. 1997, pp. 105–110. url: http://www.fizyka.umk.pl/publications/kmk/initmlp.pdf.

  • [74] W. Duch, R. Adamczak, and N. Jankowski. New developments in the Feature Space Mapping model. Tech. rep. CIL-KMK-2/97. (long version). Toruń, Poland: Computational Intelligence Lab, DCM NCU, Oct. 1997.

  • [75] W. Duch and N. Jankowski. “New neural transfer functions”. In: Journal of Applied Mathematics and Computer Science 7.3 (1997), pp. 639–658. url: http://www.fizyka.umk.pl/publications/kmk/amcs.pdf.

  • [76] W. Duch et al. “Neural-based classification and visualization methods with applications to psychometry”. In: 34th International Seminar on Statistics and Clinical Practice. Warszawa, 1996.

  • [77] W. Duch, R. Adamczak, and N. Jankowski. “Improved Memory-Based Classification”. In: Proceedings of the International Conference EANN ’96. Ed. by A. B. Bulsari, S. Kallio, and D. Tsaptsinos. June 1996, pp. 447–450. url: http://www.fizyka.umk.pl/publications/kmk/eann96.pdf.

  • [78] W. Duch and N. Jankowski. “Bi-radial Transfer Functions”. In: Second Conference on Neural Networks and Their Applications. Szczyrk, Poland, May 1996, pp. 131–137. url: http://www.fizyka.umk.pl/publications/kmk/biradial.pdf.

  • [79] W. Duch and N. Jankowski. Bi-radial transfer functions. Tech. rep. UMK-KMK-TR 1/96. Toruń, Poland: Department of Computer Methods, Nicholas Copernicus University, 1995.

  • [80] N. Jankowski. MatLab — plusy kontra minusy [MatLab: pros versus cons]. Tech. rep. Department of Computer Methods, Nicholas Copernicus University in Torun, Poland, 1995. url: http://www.fizyka.umk.pl/publications/kmk/krakow95.pdf.

  • [81] W. Duch et al. “Feature Space Mapping: a neurofuzzy network for system identification”. In: Proceedings of the European Symposium on Artificial Neural Networks. Helsinki, Aug. 1995, pp. 221–224. url: http://www.fizyka.umk.pl/publications/kmk/enna95.ps.gz.

  • [82] N. Jankowski. “Applications of Levin’s Universal Optimal Search Algorithm”. In: System Modeling Control’95. Ed. by E. Kącki. Vol. 3. Łódź, Poland: Polish Society of Medical Informatics, May 1995, pp. 34–40. url: http://www.fizyka.umk.pl/publications/kmk/LUS-zakopaneV95.pdf.

  • [83] W. Duch and N. Jankowski. “Complex systems, information theory and neural networks”. In: Proceedings of the first national conference: Neural Networks and Their Applications. Vol. 1. Institute of Electronics and Control Systems. Kule, Poland: Technical University of Częstochowa, Apr. 1994, pp. 224–231. url: http://www.fizyka.umk.pl/publications/kmk/complex.pdf.