Neural Networks with Multidimensional Cross-Entropy Loss Functions
Authors: Eduardo L. Pasiliao, Vladimir Boginski, Alexander Semenov

Subjects: Artificial neural network, Machine translation, Computer science, Binary number, Function (mathematics), Extension (predicate logic), Machine learning, Cross entropy, Pattern recognition, Benchmark (computing), Deep neural networks, Artificial intelligence

Description:
Deep neural networks have emerged as an effective machine learning tool, successfully applied to many tasks such as misinformation detection, natural language processing, image recognition, and machine translation. Neural networks are often applied to binary or multi-class classification problems, where cross-entropy is used as the loss function for training. In this short note, we propose an extension of the concept of cross-entropy, referred to as multidimensional cross-entropy, and its application as a loss function for classification using neural networks. The presented computational experiments on a benchmark dataset suggest that the proposed approach may have the potential to increase the classification accuracy of neural network-based algorithms.
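For context on the loss function the abstract builds on: the standard categorical cross-entropy for a sample with true class $c$ and predicted class probabilities $p$ is $L = -\log p_c$, averaged over a batch. The sketch below implements this baseline in plain NumPy; the `multi_head_cross_entropy` helper is only a hypothetical illustration of applying the same loss across several independent output dimensions and is not the authors' multidimensional cross-entropy, whose exact formulation is not given in this record.

```python
import numpy as np

def cross_entropy(probs, labels, eps=1e-12):
    """Standard categorical cross-entropy, averaged over a batch.

    probs  -- array of shape (batch, num_classes); each row sums to 1
    labels -- integer class indices of shape (batch,)
    """
    batch = probs.shape[0]
    # Probability the model assigned to the true class of each sample.
    true_class_probs = probs[np.arange(batch), labels]
    return -np.mean(np.log(true_class_probs + eps))

def multi_head_cross_entropy(probs_per_head, labels_per_head):
    """Hypothetical illustration only: sum the standard cross-entropy over
    several independent softmax output heads. This is NOT necessarily the
    multidimensional cross-entropy proposed in the paper."""
    return sum(cross_entropy(p, y) for p, y in zip(probs_per_head, labels_per_head))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    logits = rng.normal(size=(4, 3))
    # Softmax over the class axis to obtain valid probabilities.
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    labels = np.array([0, 2, 1, 1])
    print("cross-entropy:", cross_entropy(probs, labels))
```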
year | journal | country | edition | language
---|---|---|---|---
2019-01-01 | | | |