Title
		
		
			Recurrent neural networks for solving matrix algebra problems
		
	
			Creator
		
		
			Živković, Ivan S. 1983- 
					
	
			Copyright date
		
		
			2018
		
	
			Object Links
		
		
	
			Select license
		
		
			Attribution-NonCommercial-NoDerivs 3.0 Serbia (CC BY-NC-ND 3.0)
		
	
			License description
		
		
			You permit only the downloading and distribution of the work, provided the author's name is properly credited, without any modification of the work and without the right to use the work commercially. This is the most restrictive CC license. Basic description of the license: http://creativecommons.org/licenses/by-nc-nd/3.0/rs/deed.sr_LATN. Full text of the agreement: http://creativecommons.org/licenses/by-nc-nd/3.0/rs/legalcode.sr-Latn
		
	
			Language
		
		
			English
		
	
			Cobiss-ID
		
		
	
			Theses Type
		
		
			Doctoral dissertation
		
	
			description
		
		
			 
Defense date: 22 Nov 2018.
		
	
			Other responsibilities
		
		mentor
				Stanimirović, Predrag 1959- 
				committee member
				Milovanović, Gradimir 1948- 
				committee member
				Todorović, Branimir 1967- 
				committee member
				Janković, Dragan 
				committee member
				Petković, Marko 1984- 
				
			Academic Expertise 
		
		
			Natural sciences and mathematics
		
	
			Academic Title
		
		
			-
		
	
			University
		
		
			Univerzitet u Nišu
		
	
			Faculty
		
		
			Prirodno-matematički fakultet
		
	
			Group
		
		
			Odsek za matematiku i informatiku
		
	
				Alternative title
			
			
				Rekurentne neuronske mreže za rešavanje problema linearne algebre 
			
		
				Publisher
			
			
				 [I. S. Živković] 
			
		
				Format
			
			
				VII, 172 pages
			
		
				description
			
			
				Biography: pp. 169-170;
Bibliography: pp. 157-167.
			
		
				description
			
			
				Artificial neural networks, dynamical systems, control systems.
			
		
				Abstract (en)
			
			
				The aim of this dissertation is the application of recurrent neural
networks (RNNs) to solving certain problems of matrix algebra,
with particular reference to the computation of generalized
inverses and to solving matrix equations with constant (time-invariant)
matrices. We examine the ability to exploit the correlation
between the dynamic state equations of recurrent neural networks for
computing generalized inverses and the integral representations of these
generalized inverses. The recurrent neural networks are composed of
independent parts (sub-networks). These sub-networks can work
simultaneously, so parallel and distributed processing can be
achieved. In this way, computational advantages over the
existing sequential algorithms can be attained in real-time
applications. We investigate and exploit an analogy between the
scaled hyperpower family (SHPI family) of iterative methods for
computing the matrix inverse and the discretization of Zhang Neural
Network (ZNN) models. On the basis of the discovered analogy, a class
of ZNN models corresponding to the family of hyperpower iterative
methods for computing generalized inverses is defined. The Matlab
Simulink implementation of the introduced ZNN models is described
for the scaled hyperpower methods of orders 2 and 3. We also
present a Matlab Simulink model of a hybrid recursive neural
implicit dynamics and give a simulation and comparison with the
existing Zhang dynamics for real-time matrix inversion. Simulation
results confirm the superior convergence of the hybrid model compared
to the Zhang model.
			
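The analogy described in the abstract can be sketched numerically: an Euler discretization of a ZNN-style dynamics for matrix inversion, dX/dt = γ(X − XAX), with step size times γ equal to 1, reduces to the order-2 hyperpower (Newton–Schulz) iteration X_{k+1} = X_k(2I − AX_k). The following minimal sketch assumes a small nonsingular test matrix and the standard Pan–Schreiber starting point; it is an illustration of the general technique, not the dissertation's exact models or notation.

```python
import numpy as np

def newton_schulz_inverse(A, iters=50):
    """Order-2 hyperpower iteration for A^{-1}.

    Equivalent to one Euler step of the ZNN-style dynamics
    dX/dt = gamma * (X - X A X) with (step size * gamma) = 1.
    """
    n = A.shape[0]
    # Standard convergent start: X0 = A^T / (||A||_1 * ||A||_inf),
    # which guarantees the spectral radius of (I - A X0) is below 1.
    X = A.T / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
    I = np.eye(n)
    for _ in range(iters):
        # Hyperpower step of order 2 (Newton-Schulz):
        X = X @ (2 * I - A @ X)
    return X

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
X = newton_schulz_inverse(A)
# X now approximates A^{-1}; convergence is quadratic,
# so ~10 iterations already reach machine precision here.
```

Higher-order hyperpower methods follow the same pattern with longer polynomials in the residual AX − I, which is the family the dissertation connects to discretized ZNN models.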
		
				Authors Key words
			
			
				Artificial neural networks, dynamical systems, generalized
inverses
			
		
				Authors Key words
			
			
				Artificial neural networks, dynamical systems, control systems
			
		
				Classification
			
			
				004.832:[512.64+517.98+519.857](043.3)
			
		
				Subject
			
			
				P170
			
		
				Type
			
			
				Text
			
		
	
		
	
					