(a) Fixed Ratio Schedule;

(b) Fixed Interval Schedule;

(c) Variable Ratio Schedule; and

(d) Variable Interval Schedule.

We will discuss each of these briefly:

#### (a) Fixed Ratio Schedule (FR):

It is one example of a schedule in which the number of responses determines when reinforcement occurs. A certain number of responses must be made before a reinforcer is produced, i.e., there is a fixed ratio of non-reinforced responses to reinforced responses; for example, every third (ratio of 3:1), fourth (4:1), or hundredth (100:1) response might be reinforced. Under a fixed ratio schedule a pause occurs after each reinforcement, but apart from this the rate of responding tends to be quite high and relatively steady.
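The fixed ratio rule described above (every Nth response is reinforced) can be sketched as a short simulation. The function name and parameters below are illustrative, not taken from the original text:

```python
def fixed_ratio(responses, ratio=3):
    """Fixed ratio (FR) schedule: every `ratio`-th response is reinforced.

    Returns the 1-based indices of the responses that produce a reinforcer,
    e.g. with ratio=3 every third response pays off (a 3:1 schedule).
    """
    return [n for n in range(1, responses + 1) if n % ratio == 0]

print(fixed_ratio(10, ratio=3))  # [3, 6, 9]
```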

#### (b) Fixed Interval Schedule (FI):

It is one in which reinforcement is given after a fixed interval of time. No reinforcement is forthcoming, no matter how many responses are made, until a certain interval of time has elapsed.
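The fixed interval rule can be sketched the same way: the first response made after the fixed interval has elapsed is reinforced, and responses made before then earn nothing regardless of their number. Names and units here are assumptions for illustration:

```python
def fixed_interval(response_times, interval=60.0):
    """Fixed interval (FI) schedule.

    `response_times` are the times (in seconds) at which responses occur.
    The first response made after `interval` seconds have elapsed since the
    last reinforcement is reinforced; earlier responses go unreinforced.
    Returns the times of the reinforced responses.
    """
    reinforced = []
    next_available = interval  # reinforcement first becomes available here
    for t in sorted(response_times):
        if t >= next_available:
            reinforced.append(t)
            next_available = t + interval  # clock restarts at reinforcement
    return reinforced

print(fixed_interval([10, 30, 65, 70, 130], interval=60.0))  # [65, 130]
```

Note that the three responses at 10, 30, and 70 seconds earn nothing: only elapsed time, not response count, makes reinforcement available.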

#### (c) Variable Ratio Schedule (VR):

It is one in which subjects are paid off after a variable number of responses. For instance, reinforcement might come once after two responses, again after ten responses, then after six responses, and so on, after varying numbers of responses as decided by the experimenter. A variable ratio schedule can be specified in terms of the average number of responses needed for reinforcement.
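Since a variable ratio schedule is specified by its average requirement, it can be sketched as a simulation in which the number of responses required for each reinforcement is drawn at random around that average. The specific sampling rule below (a uniform draw centred on the mean) is one illustrative choice, not the experimenter's actual procedure:

```python
import random

def variable_ratio(responses, mean_ratio=5, seed=0):
    """Variable ratio (VR) schedule.

    Each reinforcement requires a randomly drawn number of responses whose
    average is `mean_ratio` (drawn uniformly from 1 .. 2*mean_ratio - 1).
    Returns the 1-based indices of the reinforced responses.
    """
    rng = random.Random(seed)
    reinforced = []
    required = rng.randint(1, 2 * mean_ratio - 1)  # responses until next payoff
    count = 0
    for n in range(1, responses + 1):
        count += 1
        if count >= required:
            reinforced.append(n)
            count = 0
            required = rng.randint(1, 2 * mean_ratio - 1)
    return reinforced

# Over many responses, reinforcements arrive about once per `mean_ratio`
# responses on average, though any individual payoff may come sooner or later.
print(len(variable_ratio(1000, mean_ratio=5)))
```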

#### (d) Variable Interval Schedule (VI):

It is one in which the individual is reinforced first after one interval of time, then after a different interval, and so on, with the intervals varying around some average.
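The variable interval rule combines the two ideas above: like FI, reinforcement becomes available only after time has elapsed, but like VR, the required wait varies around an average. A minimal sketch, with the sampling rule and names again chosen for illustration:

```python
import random

def variable_interval(response_times, mean_interval=30.0, seed=0):
    """Variable interval (VI) schedule.

    After each reinforcement, a new waiting interval is drawn at random
    (uniformly from 0 .. 2*mean_interval, averaging `mean_interval`).
    The first response made once that interval has elapsed is reinforced.
    Returns the times of the reinforced responses.
    """
    rng = random.Random(seed)
    reinforced = []
    next_available = rng.uniform(0, 2 * mean_interval)
    for t in sorted(response_times):
        if t >= next_available:
            reinforced.append(t)
            next_available = t + rng.uniform(0, 2 * mean_interval)
    return reinforced

# A steady stream of responses every 7 time units, checked against
# randomly varying intervals averaging 30 units:
print(variable_interval([float(t) for t in range(0, 300, 7)]))
```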

An important consequence of many schedules of positive reinforcement is that, other things being equal, extinction tends to be slower for responses reinforced on a schedule than for continuously reinforced ones.

In other words, if positive reinforcement is stopped, the individual continues to respond for a much longer time after scheduled reinforcement than after continuous reinforcement. In technical language, we say that scheduled reinforcement increases resistance to extinction.