The Behavior Analyst, 1989, 12, 221-225 (No. 2, Fall)

Keller, Schoenfeld, Cumming, and Berryman as Instructional Stimuli

John A. Nevin
University of New Hampshire

This paper is based on a talk presented at a symposium on "Columbia University: Discriminative Stimuli and Establishing Operations" at the meetings of the Association for Behavior Analysis, May 1989. All of us who participated in this symposium are indebted to Celia Wolk Gershenson for bringing us together to celebrate the special sense of excitement and common enterprise that Fred Keller gave us during our time at Columbia. Correspondence should be addressed to the author at the Department of Psychology, University of New Hampshire, Durham, NH 03824.
When the Association for Behavior Analysis arranged a symposium series honoring Fred Keller on his 90th birthday, I was delighted to be asked to participate. Thirty years earlier, Keller had guided my entry into the experimental analysis of behavior at Columbia, and he still stands as a model for much of my professional life. Nevertheless, when I was asked for a title, I was at a loss. I had no notion of what I would say at the symposium, but knew that the title would determine the substance, if not the details, of the talk.
As I considered possible titles, I found myself revisiting Schermerhorn Extension in reverie. I had first entered its dark and dingy halls in search of Nat Schoenfeld's office on the third floor. I had no background in psychology, but was interested in psychophysics and color vision, so I had made an appointment to discuss the possibility of graduate study at Columbia. Bill Cumming was with Schoenfeld when I arrived and they interviewed me together, taking turns in probing my background, challenging my half-formed ideas, and explaining Columbia's program. Evidently I passed muster, and Keller, then serving as department chairman, admitted me on condition that sooner or later I take the Graduate Record Exams. (He never specified a deadline, and I never took the exams.) I began taking courses, and gradually became familiar with the cast of characters in Schermerhorn and the ideas that excited them. Within my first semester, I started working for Bob Berryman, building apparatus for his and Cumming's studies of complex discriminated operants. As a result of their analyses, they argued for a separate "instructional" function of stimuli as selectors of discriminations. Could this be a metaphor for the role of my mentors at Columbia? Why not? Hence the title.
Beginning in 1959, the Cumming-Berryman laboratory on the second floor of Schermerhorn Extension was largely devoted to exploring variations of the now-familiar matching-to-sample procedure; the notion that the sample stimuli in that paradigm might exercise a special function was in the air. As Cumming and Berryman (1965) analyzed it, the matching-to-sample paradigm involved two simultaneous discriminations and a superordinate successive discrimination. Consider a pigeon confronting red and green lights defining a simultaneous discrimination on the side keys of a three-key panel. In the first such discrimination, red was SD and green was SΔ; in the second, green was SD and red was SΔ. The superordinate successive discrimination involved the color of the sample on the center key. In a matching contingency, a red sample presented on some trials signaled that the first discrimination was operative, and a green sample on other trials signaled that the second was operative. In an oddity contingency, the color relations were reversed, and in "symbolic" matching, the color relations were arbitrary (for example, if the sample was blue, the first discrimination was operative, and if yellow, the second was operative). Thus, at a procedural level, the sample could be said to select the appropriate discrimination in conjunction with the experimenter's specification of the contingency.
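Since the three contingencies differ only in how the sample color is mapped onto the operative discrimination, the procedural structure can be summarized very compactly. The sketch below is a minimal illustration of that structure, not code from any of the studies described here; the color names and the correct_comparison helper are hypothetical.

```python
# Minimal sketch of the matching, oddity, and "symbolic" matching
# contingencies described by Cumming and Berryman (1965). Each
# contingency maps the sample color on the center key to the
# comparison color that is SD on the side keys.

CONTINGENCIES = {
    "matching": {"red": "red", "green": "green"},
    "oddity":   {"red": "green", "green": "red"},
    # Arbitrary sample-comparison relations: blue selects the
    # red-SD discrimination, yellow the green-SD one.
    "symbolic": {"blue": "red", "yellow": "green"},
}

def correct_comparison(contingency: str, sample: str) -> str:
    """Return the side-key color whose peck will be reinforced, given
    the experimenter's contingency and the sample on this trial."""
    return CONTINGENCIES[contingency][sample]

# The sample "selects" the operative discrimination only in conjunction
# with the contingency specified by the experimenter:
assert correct_comparison("matching", "red") == "red"
assert correct_comparison("oddity", "red") == "green"
assert correct_comparison("symbolic", "yellow") == "green"
```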
A series of transfer experiments, most of them reviewed by Cumming and Berryman (1965), suggested that there was indeed evidence for separate, hierarchical instructional control by the sample. In particular, it appeared that pigeons generalized broadly to novel samples, and then responded to the comparison stimuli on the side keys on the basis of their generalized response rather than the sample color itself. Inspired by Cumming and Berryman's ideas, Eckerman (1970) devised a way to make the response to the sample explicit. In his symbolic matching procedure, pigeons were required to peck at different locations along a strip key, depending on the wavelength of the sample, in order to produce the comparison stimuli (vertical and horizontal lines). When novel wavelengths were presented as samples in a generalization test, the location of the peck on the strip key was a good predictor of which comparison stimulus would be pecked on that trial. In effect, the birds used peck location to name the sample, and then pecked the comparison that corresponded to that name.
These findings and their interpretation have yet to be integrated with the work of Murray Sidman and his associates (e.g., Sidman & Tailby, 1982) on the use of symbolic matching to establish stimulus equivalence relations in which the sample and comparison stimuli are interchangeable members of an emergent stimulus class; but speculation along these lines would take us too far afield. My present purpose is simply to suggest that Keller, Schoenfeld, Cumming, and Berryman functioned as instructional stimuli for their students, signaling discriminations that would lead to reinforcing consequences when we emitted the appropriate behavior.
Keller has characterized Columbia in his and Schoenfeld's time as a special environment, but of course it was the people and the ideas that made it special. It served as the antecedent for behavior initiated there and for consequences that have ensued over the years. Using these terms of the familiar ABC model, together with the notion of the instructional stimulus, I will try to bring it back to life.
Starting with the behavior term, what did we actually do, as graduate students, day after day? We wired relay circuits; deprived and trained pigeons and rats; set timers and counters to arrange reinforcement contingencies; varied conditions systematically, or sometimes on a hunch, to see what would happen; plotted, transformed, and replotted our data; showed them to one another and debated their significance; and eventually presented them in evening research seminars. In sum, we did science.
The consequences were tightly tied to the behavior: the fun of making apparatus work, the thrill of controlling behavior, the intellectual and aesthetic satisfaction of orderly data, the reactions of fellow graduate students, and eventually Keller's comment, "That's very interesting," or Schoenfeld's challenge, "Now what does that mean?" The presumably ultimate consequence, successful defense of the dissertation, was just a part of the package. It was the ongoing enterprise that mattered: the elaboration of what Fred called "the message," the guiding system of reinforcement theory, the touchstone from which one could try to grapple with a wide array of psychological phenomena, including those of daily life.
As graduate students, our behavior had to be shaped and then brought under stimulus control. Relatively little of this took place in formal courses and seminars. For example, in my assistantship, Berryman introduced me to the delights of apparatus: finding some potentially useful junk in the surplus stores on Canal Street, making it work, then (and only then) figuring out something interesting to do with it. He had devised a clever programmer for interlocking schedules of reinforcement, and suggested we do something with it. I had yet to work with even the simplest ratio or interval schedules, but his enthusiasm was infectious, so we started an experiment even though I had no real notion of its point. Without ever telling me what to do, Berryman provided all the cues necessary for a process in which each step was controlled by the data as they accumulated and reinforced by the orderliness they exhibited. The process eventuated in an article describing a procedural and behavioral continuum relating interval and ratio schedules (Berryman & Nevin, 1962). A different piece of apparatus might well have led elsewhere.
While I was setting up the interlocking schedule study with Berryman, Keller remedied my ignorance of operant behavior by arranging a special series of exercises which I and a few equally ignorant graduate students ran through after hours in the Psych 1 lab. We placed our rats in the student chambers, and there it was: raw individual behavior, changing in an orderly way within the space of an hour as it interacted with conditions that we could arrange. We wrote frequent short reports for Keller to comment on, and before the semester ended, he had taken us through Skinner's Behavior of Organisms (1938), with Keller and Schoenfeld's Principles of Psychology (1950) as our guide. In effect, he had progressively introduced discriminative stimuli in the presence of which appropriate scientific behavior (replication of the work of Skinner and others) would be reinforced. I was hooked on the experimental analysis of behavior.
For the next few years, I worked concurrently on reinforcement schedules and complex stimulus control. At the same time, seminars and projects with Clarence Graham and Bill McGill enhanced my initial psychophysical interests and provided me with new experimental and quantitative skills. Much of my work still involves a psychophysical approach to stimulus and schedule control. For example, McGill's instruction in the theory of signal detection not only presented a new approach to sensory psychology, but also offered a new way to look at reinforcement effects. Plotting the response probabilities or rates maintained by one schedule in the presence of different stimuli, or at different times, against the corresponding response probabilities or rates maintained by a different schedule produced the equivalent of an isosensitivity curve (e.g., Nevin, 1965, 1974a), which I later called an isoreinforcement curve (Nevin, 1981). This led me to view both antecedent and consequent stimuli as functionally equivalent determiners of behavior, a view that I am still working to elaborate (Nevin, 1989). The behavioral basis for this persistence will become clear below.
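To make that plotting construction concrete, here is a minimal sketch with invented numbers; the response rates below are hypothetical placeholders standing in for data like those in Nevin (1965, 1974a), and the axis labels are mine.

```python
import matplotlib.pyplot as plt

# Hypothetical response rates (responses/min) maintained by two
# reinforcement schedules, measured in the presence of five different
# stimuli. Pairing the rates point by point and plotting one set
# against the other yields the analogue of an isosensitivity (ROC)
# curve: an "isoreinforcement" curve in Nevin's (1981) terms.
rates_schedule_a = [12, 25, 40, 58, 70]   # e.g., richer schedule
rates_schedule_b = [5, 11, 22, 37, 52]    # e.g., leaner schedule

plt.plot(rates_schedule_a, rates_schedule_b, "o-")
plt.xlabel("Response rate under schedule A (per min)")
plt.ylabel("Response rate under schedule B (per min)")
plt.title("Hypothetical isoreinforcement curve")
plt.show()
```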
Although I took the standard round of courses and seminars, most of my education in general psychology came informally from Cumming. He taught an undergraduate experimental psychology course in the mornings, and after class he would stand in the lab, Schaefer beer and cigarette in hand, discoursing on such cognitive topics as reaction time and "mental chronometry," concept formation, the role of experience in scaling experiments, or the significance of visual "illusions" as opposed to "veridical" perception. Eventually, I recognized that all these topics were related to the idea of the discriminated operant. He never told me how to understand behavior in these situations, but he gave me all the cues that permitted me to identify the relevant relations for myself. The effects of his teaching are still evident in a recent signal-detection analysis of the Müller-Lyer illusion and the Kahneman-Tversky "representativeness" heuristic (Nevin, in press).
The effort to identify continua relating phenomena that seemed to be qualitatively different was very much in the air, and remains for me one of the distinctive features of the Columbia approach. For example, in a graduate seminar on aversive control, Murray Sidman led us to see the procedural continuum that related escape and avoidance behavior, suggesting further that avoidance behavior might be reinforced directly by shock frequency reduction (cf. Herrnstein & Hineline, 1966). He assigned me to report on his own doctoral dissertation (Sidman, 1952), and I was entranced by the striking regularity of his data on the parametric effects of response-shock and shock-shock intervals on response rate. Inspired by McGill's course on stochastic timing processes, I struggled for weeks to elaborate a mathematical model based on shock frequency reduction that would give a good account of the data, and came tantalizingly close; John Gibbon, a fellow graduate student, eventually succeeded (Gibbon, 1972).
All of us who worked with schedules of reinforcement spent hours studying and debating the work of Schoenfeld, Cumming, and their former and current students on the continuum of effects that could be produced by varying only temporal parameters (Schoenfeld & Cumming, 1960; Schoenfeld, Cumming, & Hearst, 1956). One of their goals was to minimize the organism's control over the presentation of reinforcement, so that the schedule would be a true independent variable. More particularly, they showed that ratio-like, interval-like, and various intermediate schedule performances could be generated by appropriate selection of temporal contingencies alone. In retrospect, their approach seems quite different from more recent analyses that stress the feedback relations between responding and reinforcement (e.g., Baum, 1973). The feedback approach takes obtained reinforcer rate as a dependent variable determined jointly by response rate and the scheduled contingencies, with the result that the schedule ceases to be an independent variable. However, the notion that feedback relations themselves lie along a continuum (e.g., Rachlin, 1978) is consistent with the spirit of Schoenfeld and Cumming's approach, as was Berryman's and my 1962 paper on interlocking schedules.
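The feedback relations at issue can be stated as simple molar approximations. The sketch below uses textbook forms (obtained reinforcer rate r = B/n for a ratio schedule with requirement n, and r ≈ 1/(t + 1/B) for an interval schedule with mean programmed interval t, both my additions here) only to illustrate why the schedule ceases to be a true independent variable; these are not the specific functions analyzed by Baum (1973) or Rachlin (1978).

```python
# Illustrative molar feedback functions: how obtained reinforcer rate
# depends on response rate B under two schedule types. Standard
# textbook approximations, not the analyses cited in the text.

def ratio_feedback(b: float, n: float = 20.0) -> float:
    """Fixed-ratio n: every nth response is reinforced, so obtained
    reinforcer rate grows in direct proportion to response rate."""
    return b / n

def interval_feedback(b: float, t: float = 1.0) -> float:
    """Variable-interval with mean interval t (min): reinforcer rate
    approaches 1/t as response rate grows, so responding faster
    buys little extra reinforcement."""
    return 1.0 / (t + 1.0 / b)

for b in (5.0, 20.0, 80.0):  # responses per minute
    print(f"B={b:5.1f}  ratio: {ratio_feedback(b):5.2f}/min"
          f"  interval: {interval_feedback(b):5.2f}/min")
```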
The importance of identifying continuous relations may even have overshadowed the phenomena under consideration. At one point in my graduate career, I recognized that a procedural variable could bridge the gap between two seemingly disparate topics in stimulus control and mentioned this to Schoenfeld. I no longer recall the topics, but I will always remember Nat's response: "Once you have learned to see continuous relations rather than discrete phenomena, your life is changed." There could be no better example of his role as a selector of discriminations.
If it is accepted that Keller, Schoenfeld, Cumming, and Berryman functioned as instructional stimuli, we are left with the problem of characterizing the special environment within which these instructions were arranged. I would like to suggest that the Columbia environment served as a superordinate instructional stimulus, setting the occasion for reinforcement of discriminative operants under subordinate instructional control. To illustrate this notion, I would like to describe part of a study conducted in my lab with Jim Grosch (Nevin & Grosch, in press).
We trained pigeons on a conventional delayed matching-to-sample task with red and green key lights. Delays interposed between offset of the sample and onset of the comparison key lights varied unpredictably between 0 and 21 seconds from trial to trial. Some trials were accompanied by a tone, and others by white noise. If the tone was present, a response to the correct comparison color produced 4.5 s of access to grain, whereas if the noise was present, a correct response produced only 1.5 s of access to grain. (Tone and noise were counterbalanced across birds.) Thus, the auditory cue was a sort of higher-order instructor, signaling not which response was correct, but whether the reinforcer would be large or small. Its effects were clear: All our pigeons responded more accurately at all delays on trials when the auditory cue signaled the larger reinforcer.
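A minimal rendering of the trial structure may help. The reinforcer magnitudes and the 0-21 s delay range come from the description above; the function name, the particular delay values, and the random sampling are illustrative assumptions only.

```python
import random

# Sketch of one trial of the delayed matching-to-sample procedure
# described above (Nevin & Grosch, in press). Parameter values are
# taken from the text; everything else is illustrative.

DELAYS_S = [0, 3, 6, 9, 12, 15, 18, 21]  # assumed spacing within 0-21 s

def run_trial():
    sample = random.choice(["red", "green"])
    delay = random.choice(DELAYS_S)        # unpredictable trial to trial
    cue = random.choice(["tone", "noise"])  # higher-order "instructor"
    # The cue signals reinforcer magnitude, not which response is
    # correct (tone/noise were counterbalanced across birds):
    grain_s = 4.5 if cue == "tone" else 1.5
    return {"sample": sample,
            "delay_s": delay,
            "cue": cue,
            "correct_comparison": sample,   # conventional matching
            "reinforcer_s_if_correct": grain_s}

print(run_trial())
```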
I would like to suggest that the Columbia Psychology Department was like that auditory cue: an environment in which instructional control was especially effective because the reinforcers were so ample.
Much of my present research deals with "behavioral momentum," the tendency for a class of learned behavior to persist under altered conditions. The data on free-operant performances of pigeons, rats, monkeys, and humans in a variety of experimental settings are quite consistent: The persistence of operant behavior increases with the rate or amount of reinforcement occurring in the stimulus situation (e.g., Nevin, 1974b, 1979; Mace et al., in press). Many of those who studied at Columbia during the Keller and Schoenfeld years exemplify this principle, in that ample reinforcement for the scientific behavior acquired there has engendered great persistence. My own continuing preoccupation with stimulus and schedule control is directly traceable to that environment as it was mediated by the instructional stimuli provided by Keller, Schoenfeld, Cumming, and Berryman.
My history of reinforcement at Columbia also determined the title of this paper, which in turn determined its content. The effort to describe that history in the language of complex stimulus control has given me a fresh realization that what we study and what we do are of the same stuff. Our analyses and our actions, professional science and personal experience, all are aspects of behavior, subject to a common set of principles that we grasp more and more surely as the science of behavior grows.
REFERENCES

Baum, W. M. (1973). The correlation-based law of effect. Journal of the Experimental Analysis of Behavior, 20, 137-153.

Berryman, R., & Nevin, J. A. (1962). Interlocking schedules of reinforcement. Journal of the Experimental Analysis of Behavior, 5, 213-223.

Cumming, W. W., & Berryman, R. (1965). The complex discriminated operant. In D. Mostofsky (Ed.), Stimulus generalization (pp. 284-330). Stanford: Stanford University Press.

Eckerman, D. A. (1970). Generalization and response mediation of a conditional discrimination. Journal of the Experimental Analysis of Behavior, 13, 301-316.

Gibbon, J. (1972). Timing and discrimination of shock density in avoidance. Psychological Review, 79, 68-92.

Herrnstein, R. J., & Hineline, P. N. (1966). Negative reinforcement as shock frequency reduction. Journal of the Experimental Analysis of Behavior, 9, 421-430.

Keller, F. S., & Schoenfeld, W. N. (1950). Principles of psychology. New York: Appleton-Century-Crofts.

Mace, F. C., Lalli, J. S., Shea, M. C., Lalli, E. P., West, B. J., & Nevin, J. A. (in press). The momentum of everyday human behavior in a group home. Journal of the Experimental Analysis of Behavior.

Nevin, J. A. (1965). Decision theory in studies of discrimination in animals. Science, 150, 1057.

Nevin, J. A. (1974a). On the form of the relation between response rates in a multiple schedule. Journal of the Experimental Analysis of Behavior, 21, 237-248.

Nevin, J. A. (1974b). Response strength in multiple schedules. Journal of the Experimental Analysis of Behavior, 21, 389-408.

Nevin, J. A. (1979). Reinforcement schedules and response strength. In M. D. Zeiler & P. Harzem (Eds.), Reinforcement and the organization of behaviour (pp. 117-157). Chichester, England: Wiley.

Nevin, J. A. (1981). Psychophysics and reinforcement schedules: An integration. In M. L. Commons & J. A. Nevin (Eds.), Quantitative analyses of behavior: Vol. 1. Discriminative effects of reinforcement schedules (pp. 3-27). Cambridge: Ballinger.

Nevin, J. A. (1989, May). Toward an integrative account of discriminated operant behavior. Invited address at the meeting of the Association for Behavior Analysis, Milwaukee, WI.

Nevin, J. A. (in press). Signal detection analysis of illusions and heuristics. In M. L. Commons, M. Davison, & J. A. Nevin (Eds.), Quantitative analyses of behavior: Vol. 11. Signal detection: Mechanisms, models, and applications. Hillsdale, NJ: Erlbaum.

Nevin, J. A., & Grosch, J. (in press). Effects of reinforcer magnitude on delayed matching performance. Journal of Experimental Psychology: Animal Behavior Processes.

Rachlin, H. (1978). A molar theory of reinforcement schedules. Journal of the Experimental Analysis of Behavior, 30, 345-360.

Schoenfeld, W. N., & Cumming, W. W. (1960). Studies in a temporal classification of reinforcement schedules: Summary and projection. Proceedings of the National Academy of Sciences, 46, 753-758.

Schoenfeld, W. N., Cumming, W. W., & Hearst, E. (1956). On the classification of reinforcement schedules. Proceedings of the National Academy of Sciences, 42, 563-570.

Sidman, M. (1952). Avoidance conditioning with brief shock and no exteroceptive "warning signal." Doctoral dissertation, Columbia University, New York.

Sidman, M., & Tailby, W. (1982). Conditional discrimination vs. matching to sample: An expansion of the testing paradigm. Journal of the Experimental Analysis of Behavior, 37, 5-22.

Skinner, B. F. (1938). The behavior of organisms. New York: Appleton-Century-Crofts.