In a control system, the set of all possible values that the system state can take. The future behaviour of a system depends not only on its inputs but also on its current state. The resulting behaviour consists of both system outputs and changes to the state, usually referred to as state transitions. In effect, the state drives a feedback loop within the system. State spaces can be broadly divided into two categories. The first is that of discrete state systems, in which the state takes one of a fixed, finite set of values, and the inputs and outputs also tend to be discrete. The second is that of dynamical state systems, in which the state space is continuous, and hence infinite, with both inputs and outputs generally being continuous as well. A growing area of systems design, particularly where computers are used to control physical systems, involves mixing discrete and dynamical states and transitions; this results in hybrid state systems.
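As a concrete illustration of a discrete state system, the sketch below models a turnstile as a state machine (the turnstile example and all names in it are illustrative, not part of the definition above). Both the output and the next state depend on the current state and the input, which is exactly the feedback loop described: the same input ("push") produces different outputs depending on the stored state.

```python
# Illustrative sketch of a discrete state system: a coin-operated turnstile.
# The state space is the finite set {"locked", "unlocked"}; inputs and
# outputs are also discrete.

# Transition table: (current state, input) -> (next state, output)
TRANSITIONS = {
    ("locked",   "coin"): ("unlocked", "unlock"),
    ("locked",   "push"): ("locked",   "refuse"),
    ("unlocked", "coin"): ("unlocked", "refund"),
    ("unlocked", "push"): ("locked",   "lock"),
}

def step(state, event):
    """Apply one state transition, returning (next_state, output)."""
    return TRANSITIONS[(state, event)]

# Drive the machine with a sequence of inputs; the output for "push"
# differs according to the state left behind by earlier inputs.
state = "locked"
outputs = []
for event in ["push", "coin", "push"]:
    state, out = step(state, event)
    outputs.append(out)
# outputs is ["refuse", "unlock", "lock"]; state ends as "locked"
```

A dynamical state system would instead carry a continuous state (for example, a position and velocity updated by differential equations), and a hybrid system would combine such continuous dynamics with discrete mode switches like the one shown here.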