A Bayesian net is a way of modelling a system in which events or states of affairs have probabilistic implications for one another (see conditional probability). The model is a directed graph whose nodes represent variables ranging over the possible states of the system, and whose directed arcs represent the way a change in the probability of one state impinges upon the probabilities of others. A hierarchical net has nodes at different levels, corresponding to a cascade of changing probabilities: as the probability of one state changes, the others update in accordance with the underlying structure of the system. Such networks can model diagnostic procedures, yielding accurate and empirically well-founded probability judgements that reflect the sum total of evidence to date and are easily updated as evidence accumulates. They therefore represent a powerful learning tool.
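A minimal sketch in Python may make the updating concrete. It models a hypothetical two-level diagnostic net, a single 'disease' node with two conditionally independent 'symptom' nodes beneath it, and recomputes the posterior probability of the disease by exact enumeration as evidence arrives; the names and probabilities are illustrative assumptions, not drawn from any real diagnostic system.

    # Hypothetical two-level diagnostic net: disease -> {fever, cough}.
    # All probabilities are illustrative, not empirical.

    P_DISEASE = {True: 0.01, False: 0.99}   # prior over the hidden state
    P_FEVER = {True: 0.90, False: 0.05}     # P(fever | disease)
    P_COUGH = {True: 0.80, False: 0.10}     # P(cough | disease)

    def posterior(evidence):
        """Return P(disease | evidence) by exact enumeration.

        evidence maps 'fever'/'cough' to an observed bool; unobserved
        symptoms are leaf nodes and simply drop out of the calculation.
        """
        weights = {}
        for disease in (True, False):
            w = P_DISEASE[disease]
            for name, table in (("fever", P_FEVER), ("cough", P_COUGH)):
                if name in evidence:
                    p = table[disease]
                    w *= p if evidence[name] else 1.0 - p
            weights[disease] = w
        return weights[True] / (weights[True] + weights[False])

    # The belief cascades as evidence accumulates:
    print(posterior({}))                                # 0.01 (the prior)
    print(posterior({"fever": True}))                   # ~0.15
    print(posterior({"fever": True, "cough": True}))    # ~0.59

Each call conditions the same underlying structure on a different body of evidence, which is what makes such nets straightforward to keep current as observations accumulate.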
The extent to which their power helps to undermine the poverty-of-the-stimulus argument for innate generative grammars is a matter of debate. It is plausible, however, that an infant equipped with the capacity to assign probabilities and to revise them flexibly in the light of experience is far from the passive ‘stimulus–response’ creature supposedly unable to use its experience of language effectively without a built-in system of rules as its birthright.