Please use this identifier to cite or link to this item:
http://hdl.handle.net/10609/119146
Title: | Universal approximation results on artificial neural networks |
Author: | Ocáriz Gallego, Jesús |
Tutor: | Fernández Barta, Montserrat |
Abstract: | In this project, we present an overview of universal approximation results for neural networks in function spaces and provide some new results for variable Lebesgue spaces. Artificial neural networks have been one of the most intensively studied topics in mathematics, and in science more broadly, in recent years. Among many other features, they have the notable property that, for certain activation functions, feedforward neural networks with a single hidden layer can approximate functions in various function spaces, which makes their study fundamental in this context. The main purpose of this project is to give a well-motivated and self-contained summary of the existing results on approximation with neural networks in certain function spaces, and to provide some new ones for variable Lebesgue spaces, a class of spaces that generalizes the classical ones. To that end, we introduce the basic concepts needed to understand these approximation results and trace how they have evolved in recent years, leading up to our new results in this field. |
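The single-hidden-layer property mentioned in the abstract is, in its classical form (going back to Cybenko and Hornik), a density statement. As a hedged illustration of the kind of result the thesis surveys (the notation below is generic, not taken from the thesis itself), it can be stated as:

```latex
% Classical universal approximation theorem (illustrative statement):
% for a suitable activation \sigma (e.g. continuous and non-polynomial),
% single-hidden-layer feedforward networks
\[
  N(x) \;=\; \sum_{j=1}^{m} \alpha_j \,\sigma\!\left( w_j \cdot x + b_j \right),
  \qquad \alpha_j, b_j \in \mathbb{R},\; w_j \in \mathbb{R}^d,
\]
% are dense in C(K) for every compact K \subset \mathbb{R}^d:
\[
  \forall f \in C(K)\ \ \forall \varepsilon > 0\ \ \exists N :\quad
  \sup_{x \in K} \bigl\lvert f(x) - N(x) \bigr\rvert < \varepsilon .
\]
```

The thesis extends this type of approximation result from classical function spaces to variable Lebesgue spaces.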
Keywords: | neural networks; universal approximation; variable Lebesgue spaces |
Document type: | info:eu-repo/semantics/masterThesis |
Issue Date: | Jun-2020 |
Publication license: | http://creativecommons.org/licenses/by-nc-nd/3.0/es/ |
Appears in Collections: | Bachelor thesis, research projects, etc. |
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| jocarizTFM0620memory.pdf | Memory of TFM | 1.54 MB | Adobe PDF |