
Psychology Glossary

Variable Interval Schedule



If you understand variable ratio schedules, this will be easy. If not, it might be a little confusing at first, but hang on and it will become clear.

A variable interval schedule (VI) is a type of operant conditioning reinforcement schedule in which reinforcement is given to a response after an unpredictable amount of time has passed. It is almost identical to a Fixed-Interval Schedule, except that the reinforcements are delivered on a variable, changing schedule. Although the schedule changes, there is a pattern: the amount of time that must pass varies from trial to trial, but reinforcement is delivered after an average of N units of time, where N defines the schedule.

For example, suppose you conduct a study in which a rat is put on a VI 10-second schedule (the operant response is pressing a lever). This means the rat is reinforced for pressing the lever after an average of 10 seconds has passed. Because it is an average, the rat may have to wait 30 seconds on one trial, only 2 seconds on the next, then 30, then 50, then 1 second, and so on, as long as it all averages out to reinforcement being delivered after an interval of 10 seconds. In addition, the researcher can sometimes make the time interval start all over again if the organism makes an operant response before the proper time has elapsed. So, if the organism responds before it is supposed to, the interval resets (if it was supposed to wait 30 seconds on that trial, the 30 seconds starts all over again).
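The procedure above can be sketched in a few lines of Python. This is a minimal simulation, not anything from the glossary itself: the function names are made up for illustration, and the exponential distribution is an assumed way to generate unpredictable intervals with a fixed average (the entry does not specify how the intervals are drawn).

```python
import random

def draw_vi_intervals(mean_interval=10.0, trials=5, seed=None):
    """Draw one waiting time per trial for a VI schedule.

    Each interval is unpredictable, but over many trials they
    average out to `mean_interval` (e.g. 10 seconds for VI 10).
    An exponential distribution is an assumed choice here.
    """
    rng = random.Random(seed)
    return [rng.expovariate(1.0 / mean_interval) for _ in range(trials)]

def first_reinforced_response(response_times, interval):
    """Return the time of the first response made at or after
    `interval` has elapsed, or None if no response comes late
    enough. Early responses simply earn nothing in this sketch
    (the interval-reset variant is not modeled)."""
    for t in sorted(response_times):
        if t >= interval:
            return t
    return None

# Over many trials, the drawn intervals average out to 10 seconds.
intervals = draw_vi_intervals(trials=10000, seed=1)
print(round(sum(intervals) / len(intervals), 1))  # close to 10.0

# One trial where the interval happens to be 12 seconds and the rat
# presses at 5s (too early, no reward) and again at 14s (reinforced):
print(first_reinforced_response([5.0, 14.0], 12.0))  # 14.0
```

The key point the sketch makes concrete: the organism cannot predict any single interval, only the average, which is why VI schedules produce steady, moderate response rates.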

 
