{{Underlinked|date=December 2012}}
 
A '''modular neural network''' is an [[artificial neural network]] characterized by a series of independent neural networks moderated by some intermediary. Each independent neural network serves as a module and operates on separate inputs to accomplish some subtask of the task the network hopes to perform.{{sfn|Azam|2000}} The intermediary takes the outputs of each module and processes them to produce the output of the network as a whole. The intermediary only accepts the modules' outputs—it does not respond to, nor otherwise signal, the modules. As well, the modules do not interact with each other.
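A minimal Python sketch of this arrangement (the particular module functions and the averaging intermediary are illustrative assumptions, not a standard design):

```python
# Sketch of a modular neural network: independent modules on separate
# inputs, combined by an intermediary. The module functions here are
# hypothetical stand-ins for trained sub-networks.

def colour_module(pixels):
    # Operates only on its own input; never sees the other module's data.
    return sum(pixels) / len(pixels)

def contrast_module(pixels):
    # A second independent module with its own separate input.
    return max(pixels) - min(pixels)

def intermediary(outputs):
    # The intermediary only accepts module outputs; it does not signal
    # the modules, and the modules do not interact with each other.
    return sum(outputs) / len(outputs)

# Each module receives a separate input and runs independently.
colour_out = colour_module([0.2, 0.4, 0.6])
contrast_out = contrast_module([0.1, 0.9, 0.5])
network_output = intermediary([colour_out, contrast_out])
```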
 
==Biological basis==
As [[artificial neural network]] research progresses, it is appropriate that artificial neural networks continue to draw on their biological inspiration and emulate the segmentation and modularization found in the brain. The brain, for example, divides the complex task of visual perception into many subtasks.{{sfn|Happel|Murre|1994}} Within a part of the [[brain]], called the [[thalamus]], lies the [[lateral geniculate nucleus]] (LGN), which is divided into different layers that separately process color and contrast: both major components of [[Visual perception|vision]].{{sfn|Hubel|Livingstone|1990}} After the LGN processes each component in parallel, it passes the result to another region to compile the results.
 
Some tasks that the brain handles, like vision, employ a hierarchy of sub-networks. However, it is not clear whether there is some intermediary which ties these separate processes together on a grander scale. Rather, as the tasks grow more abstract, the isolation and compartmentalization break down between the modules and they begin to communicate with each other, unlike the modular neural network model.
 
== Design ==
Unlike a single large network that can be assigned to arbitrary tasks, each module in a modular network must be assigned a specific task and connected to other modules in specific ways by a designer. In the vision example, the brain evolved (rather than learned) to create the LGN. In some cases, the designer may choose to follow biological models. In other cases, other models may be superior. The quality of the result will be a function of the quality of the design.
 
==Complexity==
Modular neural networks reduce a single large, unwieldy neural network to smaller, potentially more manageable components.{{sfn|Azam|2000}} Some tasks are intractably large for a single neural network as its size increases. Benefits of using modular neural networks over a single all-encompassing neural network include:
 
===Efficiency===
The possible [[neuron]] (node) connections increase quadratically as nodes are added to a network. Computation time depends on the number of nodes and their connections; any increase has drastic consequences for processing time. Assigning specific subtasks to individual modules reduces the number of necessary connections.
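The quadratic growth can be made concrete with a small calculation (the network sizes chosen are illustrative assumptions):

```python
# Compare possible pairwise connections in one large network versus the
# same nodes split into independent modules.

def pairwise_connections(n):
    # Possible connections among n nodes grow quadratically: n*(n-1)/2.
    return n * (n - 1) // 2

n_nodes = 100
monolithic = pairwise_connections(n_nodes)   # one network of 100 nodes

# The same 100 nodes split into 4 independent modules of 25 nodes each;
# no connections run between modules, only within them.
modular = 4 * pairwise_connections(25)
```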
 
===Training===
A large [[neural network]] attempting to model multiple parameters can suffer from interference, as new data can dramatically alter existing connections or simply confuse the model. Each module can be trained independently and more precisely master its simpler task. This means the training [[algorithm]] and the training data for each sub-network can be unique and implemented much more quickly. In large part this is because the possible combinations of interesting factors diminish as the number of inputs decreases.
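Independent training can be sketched as follows (the data sets and the no-intercept least-squares fit are illustrative assumptions standing in for each module's own training algorithm):

```python
# Each module is fitted on its own data, with its own procedure; fitting
# one module cannot interfere with the other, unlike weight updates in a
# single large shared network.

def fit_slope(xs, ys):
    # Closed-form least squares for y = w * x (no intercept).
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Module A is trained only on its own subtask's data...
w_a = fit_slope([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])

# ...and module B on entirely different data, independently.
w_b = fit_slope([1.0, 2.0, 4.0], [3.0, 6.0, 12.0])
```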
 
===Robustness===
Regardless of whether a large neural network is biological or artificial, it remains susceptible to interference and failure at any one of its nodes. By compartmentalizing subtasks, failures and interference are much more readily diagnosed, and their effects on other sub-networks are eliminated, as each module is independent of the others.
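A small sketch of this failure isolation (the failing module and the NaN convention for a fault are illustrative assumptions):

```python
# A fault in one module is easy to localise and does not corrupt the
# others, because modules do not interact with each other.
import math

def healthy_module(x):
    return 2 * x

def failing_module(x):
    return float("nan")  # simulated node failure inside this module

outputs = [healthy_module(3), failing_module(3), healthy_module(5)]

# The intermediary can diagnose exactly which module failed...
failed = [i for i, out in enumerate(outputs) if math.isnan(out)]
# ...while the remaining modules' outputs are unaffected.
working = [out for out in outputs if not math.isnan(out)]
```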
 
== Notes ==
{{reflist|30em}}
 
== References ==
 
*{{cite web |last=Azam |first=Farooq |title=Biologically Inspired Modular Neural Networks |type=PhD dissertation |publisher=Virginia Tech |year=2000 |hdl=10919/27998}}
*{{cite journal |last1=Happel |first1=Bart |last2=Murre |first2=Jacob |year=1994 |title=The Design and Evolution of Modular Neural Network Architectures |url=http://citeseer.comp.nus.edu.sg/cache/papers/cs/3480/ftp:zSzzSzftp.mrc-apu.cam.ac.ukzSzpubzSznnzSzmurrezSznnga1.pdf/the-design-and-evolution.pdf |journal=Neural Networks |volume=7 |issue=6–7 |pages=985–1004 |doi=10.1016/s0893-6080(05)80155-8}}{{Dead link|date=April 2020 |bot=InternetArchiveBot |fix-attempted=yes}}
*{{cite journal |last1=Hubel |first1=DH |last2=Livingstone |first2=MS |year=1990 |title=Color and contrast sensitivity in the lateral geniculate body and primary visual cortex of the macaque monkey |journal=Journal of Neuroscience |volume=10 |issue=7 |pages=2223–2237 |doi=10.1523/JNEUROSCI.10-07-02223.1990 |pmid=2198331 |pmc=6570379 |doi-access=free}}
* {{cite journal |last1=Tahmasebi |first1=P. |last2=Hezarkhani |first2=A. |year=2011 |title=Application of a Modular Feedforward for Grade Estimation |journal=Natural Resources Research |volume=20 |issue=1 |pages=25–32 |doi=10.1007/s11053-011-9135-3 |s2cid=45997840}}
* {{Cite journal|last1=Clune|first1=Jeff|last2=Mouret|first2=Jean-Baptiste|last3=Lipson|first3=Hod|date=2013-01-30|title=The evolutionary origins of modularity|journal=Proceedings of the Royal Society B: Biological Sciences|volume=280|issue=1755|pages=20122863|doi=10.1098/rspb.2012.2863|pmid=23363632|pmc=3574393|issn=0962-8452|arxiv=1207.2743}}
* {{cite journal |last1=Tahmasebi |first1=Pejman |last2=Hezarkhani |first2=Ardeshir |year=2012 |title=A fast and independent architecture of artificial neural network for permeability prediction |journal=Journal of Petroleum Science and Engineering |volume=86 |pages=118–126 |doi=10.1016/j.petrol.2012.03.019 |bibcode=2012JPSE...86..118T}}
 
 
[[Category:Computational neuroscience]]
[[Category:Neural network architectures]]
[[Category:Modularity|Neural network]]