Functional decomposition

 
===Knowledge representation===
Processes related to functional decomposition are prevalent throughout the fields of [[knowledge representation]] and [[machine learning]]. Hierarchical model induction techniques such as [[Logic circuit minimization]], [[decision trees]], [[grammatical inference]], [[hierarchical clustering]], and [[quadtree decomposition]] are all examples of function decomposition. A review of other applications and function decomposition can be found in {{Harvtxt|Zupan|Bohanec|Bratko|Demšar|1997}}, which also presents methods based on [[information theory]] and [[graph theory]].
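As a concrete illustration (a hypothetical toy example, not drawn from the cited review), decomposing a Boolean function into a hierarchy of smaller functions is the core operation behind such hierarchical induction methods: an intermediate concept <code>H</code> is found so that the original function factors through it.

```python
from itertools import product

# Hypothetical target function F(x1, x2, x3) = x1 AND (x2 OR x3).
# It decomposes hierarchically as F(x1, x2, x3) = G(x1, H(x2, x3)).
def F(x1, x2, x3):
    return x1 and (x2 or x3)

# Intermediate concept discovered by the decomposition: H(x2, x3) = x2 OR x3.
def H(x2, x3):
    return x2 or x3

# Top-level function over the smaller input set {x1, H}.
def G(x1, h):
    return x1 and h

# Verify the decomposition reproduces F on every one of the 8 inputs.
for x1, x2, x3 in product([False, True], repeat=3):
    assert G(x1, H(x2, x3)) == F(x1, x2, x3)
print("decomposition matches F on all 8 inputs")
```

Each of the smaller functions has fewer inputs than the original, which is what makes the decomposed representation cheaper to store and easier to learn.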
 
Many [[statistical inference]] methods can be thought of as implementing a function decomposition process in the presence of noise; that is, where functional dependencies are only expected to hold ''approximately''. Among such models are [[mixture models]] and the recently popular methods referred to as "causal decompositions" or [[Bayesian networks]].
However, while perfect functional decomposition is usually impossible, the spirit lives on in a large number of statistical methods that are equipped to deal with noisy systems. When a natural or artificial system is intrinsically hierarchical, the [[joint distribution]] on system variables should provide evidence of this hierarchical structure. The task of an observer who seeks to understand the system is then to infer the hierarchical structure from observations of these variables. This is the notion behind the hierarchical decomposition of a joint distribution, the attempt to recover something of the intrinsic hierarchical structure which generated that joint distribution.
 
As an example, [[Bayesian network]] methods attempt to decompose a joint distribution along its causal fault lines, thus "cutting nature at its seams". The essential motivation behind these methods is again that within most systems (natural or artificial), relatively few components/events interact with one another directly on an equal footing.{{sfnp|Simon|1963}} Rather, one observes pockets of dense connections (direct interactions) among small subsets of components, but only loose connections between these densely connected subsets. There is thus a notion of "causal proximity" in physical systems under which variables naturally precipitate into small clusters. Identifying these clusters and using them to represent the joint provides the basis for great efficiency of storage (relative to the full joint distribution) as well as for potent inference algorithms.
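The storage argument can be made concrete with a minimal sketch (hypothetical structure and numbers, assuming binary variables): a full joint table over ''n'' variables needs 2<sup>''n''</sup> entries, while a Bayesian network stores one conditional table per variable, each sized by its few direct parents.

```python
# Hypothetical sparse causal structure over 10 binary variables:
# each variable depends directly on at most 2 parents.
parents = {
    "A": [], "B": ["A"], "C": ["A"],
    "D": ["B", "C"], "E": ["C"],
    "F": ["D"], "G": ["D", "E"],
    "H": ["F"], "I": ["F", "G"], "J": ["G"],
}

n = len(parents)
full_joint_size = 2 ** n  # one probability per assignment of all 10 variables

# A Bayesian network stores one conditional table P(X | parents(X)) per
# variable: 2^(number of parents) rows, each holding P(X=1 | parent values).
factored_size = sum(2 ** len(ps) for ps in parents.values())

print(full_joint_size)  # 1024
print(factored_size)    # 25
```

The gap widens exponentially with the number of variables, as long as the "causal proximity" assumption keeps the parent sets small.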
 
===Software architecture===
===Systems engineering===
{{main|Functional flow block diagram}}
Functional decomposition in [[systems engineering]] refers to the process of defining a system in functional terms, then defining lower-level functions and sequencing relationships from these higher-level system functions.<ref>{{cite report |url=http://ocw.mit.edu/courses/aeronautics-and-astronautics/16-885j-aircraft-systems-engineering-fall-2005/readings/sefguide_01_01.pdf |title=Systems Engineering Fundamentals |publisher=Defense Acquisition University Press |place=Fort Belvoir, VA |date=January 2001 |page=45}}</ref> The basic idea is to try to divide a system in such a way that each block of a [[Functional flow block diagram|block diagram]] can be described without an "and" or "or" in the description.
 
This exercise forces each part of the system to have a pure [[role|function]]. When a system is composed of pure functions, the functions can be reused or replaced. A usual side effect is that the interfaces between blocks become simple and generic. Since the interfaces usually become simple, it is easier to replace a pure function with a related, similar function.
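The "no ''and''/''or'' in the description" rule can be sketched with a hypothetical example (all names below are invented for illustration): a block described as "validate ''and'' store a record" is split into two pure functions, each with a single role and a simple, generic interface.

```python
# Hypothetical example: a composite block "validate and store" is decomposed
# into two pure functions so each block has exactly one role.

def validate(record: dict) -> bool:
    # Block 1: "validate a record".
    return "id" in record and record.get("value", -1) >= 0

def store(record: dict, db: dict) -> dict:
    # Block 2: "store a record". Pure: returns a new store, mutates nothing.
    return {**db, record["id"]: record}

# Because the interfaces are simple and generic, blocks can be swapped:
# any validator with the same signature can replace the default one.
def pipeline(record, db, validator=validate):
    return store(record, db) if validator(record) else db

db = {}
db = pipeline({"id": "r1", "value": 7}, db)
db = pipeline({"value": -3}, db)  # fails validation, so the store is unchanged
print(db)  # {'r1': {'id': 'r1', 'value': 7}}
```

Replacing <code>validate</code> with a stricter validator requires no change to <code>store</code> or <code>pipeline</code>, which is the reuse benefit the text describes.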
*[[Currying]]
*[[Database normalization]]
*[[Function composition (computer science)]]
*[[Inductive inference]]
*[[Knowledge representation]]
 
==Further reading==
* {{cite conference
| last1=Zupan
| first1=Blaž
| last2=Bohanec
| first2=Marko
| last3=Bratko
| first3=Ivan
| last4=Demšar
| first4=Janez
| date=July 1997
| title=Machine learning by function decomposition
| url=https://dl.acm.org/doi/10.5555/645526.657131
| conference=ICML '97: July 8–12, 1997
| conference-url=https://dl.acm.org/doi/proceedings/10.5555/645526
| editor=Douglas H. Fisher
| book-title=Proceedings of the Fourteenth International Conference on Machine Learning
| publisher=Morgan Kaufmann Publishers |place=San Francisco
| isbn=978-1-55860-486-5
| pages=421–429
}} A review of other applications and function decomposition. Also presents methods based on [[information theory]] and [[graph theory]].
 
==Notes==
*{{Citation | last1=Simon| first1=Herbert A. |year=1996 |chapter=The architecture of complexity: Hierarchic systems |title=The sciences of the artificial |publisher=MIT Press |place=[[Cambridge, Massachusetts|Cambridge]], Massachusetts |pages=183–216}}.
*{{Citation |last1=Tonge |first1=Fred M.|year=1969 |chapter=Hierarchical aspects of computer languages |editor=Whyte, Lancelot Law |editor2=Wilson, Albert G. |editor3=Wilson, Donna |title=Hierarchical Structures |publisher=American Elsevier |place=[[New York City|New York]] |pages=233–251}}.
 
* {{Citation
| last1=Zupan
| first1=Blaž
| last2=Bohanec
| first2=Marko
| last3=Bratko
| first3=Ivan
| last4=Demšar
| first4=Janez
| year=1997
| contribution=Machine learning by function decomposition
| contribution-url=http://citeseer.ist.psu.edu/zupan97machine.html
| title=Proc. 14th International Conference on Machine Learning
| publisher=Morgan Kaufmann
| pages=421–429
}}
{{refend}}