Authors: Ferrero Jaurrieta, Mikel; Pereira Dimuro, Graçaliz; Takáč, Zdenko; Santiago, Regivan; Fernández Fernández, Francisco Javier; Bustince Sola, Humberto
Date issued: 2021 (deposited 2022-03-10)
ISSN: 1613-0073
Handle: https://academica-e.unavarra.es/handle/2454/42468
Title: Fuzzy sets complement-based gated recurrent unit
Abstract: Gated Recurrent Units (GRU) are gated neural network architectures that simplify other ones (such as LSTM), mainly by merging gates. Instead of using two independent gates, if x is the first gate, the standard operation 1 − x is used to generate the second one, reducing the number of parameters. In this work, we interpret this gate information as a fuzzy set, generalize the standard operation using fuzzy negations, and improve the accuracy obtained with the standard operation.
Extent: 9 p.
Format: application/pdf
Language: eng
Rights: © 2021 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
Keywords: Fuzzy set complement; Fuzzy negations; Recurrent neural networks; Gated recurrent unit
Type: info:eu-repo/semantics/conferenceObject
Access: info:eu-repo/semantics/openAccess (Acceso abierto / Sarbide irekia)
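The abstract describes replacing the standard GRU complement 1 − z with a fuzzy negation N(z). A minimal sketch of that idea is shown below, using the Sugeno class of fuzzy negations as an illustrative choice (the paper may use other negation families); the function and parameter names here are hypothetical, not taken from the paper's implementation, and the reset gate is omitted for brevity.

```python
import numpy as np

def sugeno_negation(x, lam=0.5):
    """Sugeno-class fuzzy negation N(x) = (1 - x) / (1 + lam * x).

    For lam = 0 this recovers the standard complement 1 - x used in the GRU.
    """
    return (1.0 - x) / (1.0 + lam * x)

def gru_step_fuzzy(x_t, h_prev, Wz, Uz, bz, Wh, Uh, bh, negation=sugeno_negation):
    """One GRU-like update where the complement gate is a fuzzy negation.

    Standard GRU: h_t = (1 - z) * h_prev + z * h_tilde
    Generalized:  h_t = N(z)    * h_prev + z * h_tilde
    (Reset gate omitted for brevity; names are illustrative.)
    """
    # Update gate z: sigmoid output, interpreted as fuzzy membership degrees in [0, 1]
    z = 1.0 / (1.0 + np.exp(-(Wz @ x_t + Uz @ h_prev + bz)))
    # Candidate hidden state
    h_tilde = np.tanh(Wh @ x_t + Uh @ h_prev + bh)
    # Second "gate" generated from the first via the fuzzy negation
    return negation(z) * h_prev + z * h_tilde
```

With `lam = 0` the cell reduces exactly to the standard GRU convex combination, so the fuzzy negation acts as a parameterized generalization of the usual complement.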