1     Theoretical and historical issues



‘When one says “a man learns”, you agree that this is the least and first of sentences, do you not?’

Eleatic Stranger, Plato’s Sophist


Innate and acquired knowledge

The most famous and influential pages ever written about learning were designed to show that conventional and obvious theories about learning are wrong. Plato’s dialogue Meno, written more than 2,000 years ago, has the philosopher Socrates conducting an improvised experiment in order to convince Meno, a rich young aristocrat, that the most expensive and expert lessons can teach nothing that is not already innately known, though hidden, and therefore already present in the learner. The example Socrates chose was a simplified version of Pythagoras’ Theorem, which I trust readers will remember from their schooldays. If not, so much the better, since those who believe they do not already know the theorem will be in a good position to judge Plato’s theory, which is that they have learned it all before in a previous life. More knowledgeable readers must make an effort to put themselves in the shoes of Meno’s young slaveboy, the subject of the original experiment, who had been born and bred in his master’s household and, we are assured, had been taught no geometry whatever.


Figure 1.1 The square on the diagonal

The starting point of the experiment is the square ABCD, shown here in Figure 1.1, which Socrates drew in the sand and which has sides 2 feet long. By drawing in the cross lines the slaveboy, who can count, and also knows that 2 x 2 is 4, is able to deduce from the diagram that a square with a 2-foot side has an area of 4 square feet. That is the easy part. The hard question, which Socrates now puts, is how can we draw a square with twice the area of the first one, that is, a square with an area of 8 square feet? That's obvious, says the slaveboy, drawing a square with a side twice as long. But more drawing in the sand quickly demonstrates that a square with a 4-foot side is too big, since its area is 16 square feet. The slaveboy then makes a compromise guess of a square with a 3-foot side, but gives up on seeing that this gives 9 square feet. This is an important stage in the experiment, since Socrates starts the question over again from scratch, asking the slaveboy a series of easy questions which ends with the correct answer to the difficult question. Shortening the series a little, one can say: look at the square ABCD; isn't it cut in half by the line AC? In that case, if the area of ABCD is 4, and it is cut in half by AC, what is the area of the triangle ADC? And wouldn't you say that the big square AEFC is made up of sections just like ADC? How many of these sections are there? So there are 4 of these sections in the big square AEFC, and just 2 of the same sections in ABCD? And what is the relation of 4 to 2? Double, now says the slaveboy. So how many square feet in the big square then? 8 feet, says the slaveboy. But which line do we know is the side of the 8-foot square? The slaveboy points to AC. Ah, then, says Socrates, the technical name for that is the diagonal, so it would be your personal opinion that the square on the diagonal of the original square is double the original area. That is so, Socrates.
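The arithmetic of the dialogue can be restated as a short sketch in modern notation (the variable names are of course mine, not Plato's):

```python
import math

# Square ABCD has sides 2 feet long, so its area is 4 square feet.
side = 2
area = side * side                   # 4

# The slaveboy's two guesses at a square of double the area:
guess_double_side = (2 * side) ** 2  # a 4-foot side gives 16: too big
guess_compromise = 3 * 3             # a 3-foot side gives 9: still too big

# Socrates' construction: the square AEFC on the diagonal AC is made
# of 4 triangles like ADC, while ABCD contains only 2 of them, so
# AEFC has exactly double the area of ABCD.
area_on_diagonal = 4 * (area // 2)   # 8 square feet

# The same answer follows from the length of the diagonal itself:
diagonal = math.hypot(side, side)    # 2 * sqrt(2) feet
assert round(diagonal ** 2) == area_on_diagonal == 2 * area
```

The final assertion is the content of the slaveboy's discovery: the square on the diagonal has double the original area.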

Socrates takes this to mean that the slaveboy has spontaneously blurted out Pythagoras' theorem due to his innate knowledge of it. (It is possible to use similar techniques to prove that the square on the hypotenuse equals the sum of the squares on the other two sides of any right-angled triangle, but it would take a good deal more drawing in the sand: see Figure 1.2.) Since no one had told the slaveboy about geometry in his present life, he must have learned about it in a previous life. But the same experiment could have been performed in any previous life, so the knowledge can always be pushed back one life further. Therefore, the soul is infinitely immortal, at least in a backwards direction, and in fact Socrates generalizes from this argument to conclude that the soul is immortal in the future as well. It was thus a very serious point, and not just a demonstration of educational techniques.
Figure 1.2 Is Pythagoras' theorem innate?

There are various ways in which one can see a general proof of Pythagoras' theorem in this figure, which is more elaborate than that drawn by Socrates in Plato's 'Meno'. The outer square contains the square on the hypotenuse of ABC together with 4 triangles like ABC. The same outer square is made up of the square on AB (ABDE) plus a square on CB (DFGH) plus 4 triangles the size of ABC. Taking away the 4 extra triangles in each case leads one to discover Pythagoras' theorem. But does this mean that everyone is born already knowing the theorem?
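The counting argument in the caption can be written out algebraically. Taking the two shorter sides of ABC to have lengths a and b, and the hypotenuse length c, the outer square has side a + b, and the two dissections give:

```latex
(a+b)^2 \;=\; c^2 + 4\cdot\tfrac{1}{2}ab \;=\; a^2 + b^2 + 4\cdot\tfrac{1}{2}ab,
\qquad\text{and hence}\qquad c^2 = a^2 + b^2 .
```

Subtracting the four triangles (total area 2ab) from each dissection is exactly the 'taking away' step described in the caption.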

There are many other philosophical implications of course, but for our purposes Plato's theory of learning — that all learning consists of the recollection of already known innate ideas — provides an extreme example of a nativist (or rationalist) view, in which the role of information input from personal experience of the environment is minimized. The notion that the soul has learned everything that there is to know in previous lives, including those in a wide variety of animal species, did not remain academically respectable for very long, though it retains a certain popular appeal. But if previous lives are interpreted metaphorically as the influence of past evolution on genetically inherited characteristics, then one could say that there are modern nativist theories, in the field of human sociobiology, which approach Plato's in their emphasis on predetermined factors in psychology.

There has of course never been any shortage of opposition to Socrates' interpretation of the slaveboy experiment, and to the Platonic nativist position more generally. Both Socrates and Plato were in many ways reactionaries whose views happened to stick — the belief that human life was ruled by customs and conventions, and that skills of all kinds could be taught on a commercial basis by professional teachers, was the received wisdom being attacked by Socrates. One of the participants in the discussion of learning with Meno, Anytus, shortly afterwards brought Socrates to trial and extracted the death penalty, though it is likely that this had more to do with politics than with the argument over learning, since Anytus was a politician who had just been involved in the restoration of democracy to Athens after a brief but extremely violent coup by the 'Thirty Tyrants', many of whom were Socrates' relatives or former pupils. (The explicit conclusion at the end of the discussion with Meno is that political rulers obtain their ability to rule by divine right, no matter how annoying this might be to democrats such as Anytus.)

The substantial theoretical opposition to Plato's arguments for innate ideas was supplied soon afterwards by Aristotle, who can safely be counted as an 'empiricist', that is, as a theorist who believes that learning and the acquisition of knowledge depend upon information flowing into the mind from the outside world, during one's lifetime and definitely not before that. In a brief discussion of memory, Aristotle specifically says that when anyone learns something for the first time, or hears or sees something new, no recollection of innate knowledge is involved (De Memoria, 451a, 20). On the contrary, he argued that the mind is capable of thinking a range of thoughts, just as a blank writing tablet or piece of paper is capable of having all sorts of things potentially written on it (De Anima, 430a, 5). This has proved to be a very enduring image for empiricist theories, which are the standard form for learning theories. Obviously there is a world of difference between Plato's theory, that everyone's head comes complete with an encyclopedia of information which needs only to be accessed by an appropriate question, and Aristotle's alternative that although we may eventually acquire an enormous amount of information, the pages of our internal encyclopedia are all blank to start with. Exactly how the information from the outside world can come to be encoded and stored and understood internally is of course something for much further debate. It is very useful for me that in the places where Aristotle discusses these details, he usually refers to animal as well as human psychology. His basic assumptions are that information is gathered from the environment by sense perception, or an innate faculty of discrimination; in some animals sense perceptions may persist as a simple form of memory; in others a more complete memory for events is found; and in people deliberate searching and reconstruction of memories, and inferences from perception by the power of reasoning, allow for the acquisition of knowledge from the experience of life.
It is hardly surprising that with this sort of naturalistic approach Aristotle specifically disagreed with Socrates’ emphasis on innate or universal ideas, even for mathematical concepts. (See Book A, chapter 1 of Aristotle’s Metaphysics for sensation and memory in animals; Books M and N for disagreement with Socrates — 1078b; and Posterior Analytics, Book II, chapter xix for perception and memory.)

This very ancient difference of opinion about the relative importance of internal and external sources of knowledge and influences on behaviour has been the source of many kinds of argument, but the same issue, whether it is put as nature versus nurture, genetics versus culture, or learning versus instinct, is arguably still the single most contentious theoretical problem in the learning theories of today. Before looking at some slightly later developments in this innate encyclopedia versus blank pages contrast, I should perhaps add the cautionary note that, although it is easiest to understand the issue in terms of extremes, even extreme theorists usually modify their position to compromise on difficult cases.

We now move on rather quickly, however, passing over a period of 2,000 years, from the empiricism of Aristotle in the fourth century BC to its development by Locke (1632-1704). To Aristotle's image of the blank tablet, or tabula rasa, Locke added that of the mind as, at birth, an empty cabinet, waiting to be filled with ideas (Locke, 1689/1982, p. 55), and he is primarily responsible for the most popular empiricist model of the mind of a child as plain white paper, devoid of all characters (p. 104). This alone should make his disagreement with Plato's assumptions clear enough, but Locke was extraordinarily thorough in dismissing all claims that the human mind is governed by any innate principles, or primary notions, brought into the world at birth. Children and idiots, he says, plainly have far fewer ideas than a rational adult, and if innate ideas can only be uncovered by mature reason, there does not seem much point in having them innate. Locke particularly disagrees with the idea that if someone accedes to a certain mathematical demonstration, this means that the mathematical principle involved was innate. Children have to learn their numbers gradually before they can even agree that 3 plus 4 equals 7 (and they also have to understand both the name and the concept of equality). More complicated mathematical demonstrations, such as, of course, Socrates' square on the diagonal, simply show that good teaching from another person is far more useful than any innate impression on its own — it is the nature of the things contained in the demonstration which is persuasive. It is much harder to construct the demonstration than just to agree with it, and Locke does not think that many of the mathematicians that he knew would want to agree that all the diagrams that they had drawn were just copies of innate blueprints engraved on everyone's mind (p. 60). For social behaviour, and moral and religious principles, the great diversity of accepted practices during the history of mankind and in various cultures persuaded Locke that there were no common factors. Perhaps he overestimated the prevalence of cannibalism, especially of children, and thus overstated the argument against any parental instinct for preserving and cherishing offspring, but the variousness of both social and intellectual ideas across human cultures and across historical periods remains as a challenge to present-day sociobiologists (e.g. Lumsden and Wilson, 1981).

Few learning theorists today would be as extreme as Locke, in limiting babies to only very faint ideas of hunger, thirst, warmth and pain, which they may have felt in the womb; but whether or not children, of say 5 or 6, have access to innate ideas of impossibility, identity, and a whole being necessarily bigger than one of its parts, is still a subject for investigation in developmental psychology. Locke said children had neither these nor any innate idea of the existence of God or the duty to worship him, keeping about as rigorously as anyone ever has to the blank paper model of human nature. It is difficult to say why Locke opposed innate mental principles so vigorously, but one can hardly ignore the fact that an English king was beheaded while Locke was growing up, and another was run out of the country the year before the publication in 1689 of Locke's major work. Locke came from a Puritan and Parliamentarian family, and was firmly identified with the democratic and Whig side of the political issues of his time. Immediately after his psychological Essay, he published two political treatises; the first attacked the divine right of kings, and the second promoted the ideals of natural equality, the acquisition of property by individual effort, and rationally argued checks and balances in government which, it is sometimes held, eventually became enshrined in the constitution of the United States of America (Russell, 1946). It is not too far-fetched to presume that Locke's distrust of innate principles, and firm belief in the ability of all rational individuals to work things out for themselves, reflect political liberalism as well as psychological empiricism.

By the rigour of his opposition to innate ideas, Locke set himself the problems of a learning theorist, and it is worth looking briefly at his solutions. If the mind begins as a blank, how is it ever filled? As Locke put it, 'Whence comes it by that vast store, which the busy and boundless fancy of man has painted on it, with an almost endless variety? Whence has it all the materials of reason and knowledge?' He gives immediately the learning theorist's answer: 'To this I answer, in one word, from Experience.' That is all very well, but a number of details have to be specified, and there have always been difficulties in saying exactly what should go where, and in what order, on the blank paper. Like Aristotle, Locke said that the original source of all ideas was sensation, that is, information received from seeing, hearing, touching, tasting and smelling the outside world. This is one fountain of experience, and the second is an internal sense, or reflection, since perceiving the operation of our own minds in doubting, believing, thinking, reasoning and willing is an equally important source of mental life. Human reflection is largely dependent on language, of course, but Locke followed in the tradition of allowing animals a share in the earlier stages of learning. Sensation, or the faculty of perception, he thought was common to all animals, and indeed was what separated animals from vegetables, but many members of the animal kingdom could be assumed to have very obscure and dull perceptions, since their sensory organs are wisely adapted to their ecological state and condition: we may reasonably conclude, says Locke, that the quickness and variety of the sensations of an oyster or a cockle do not match those of people, or of other higher animals. However, the prime importance given to perception in Locke's theory is illustrated by his (surely extreme) suggestion that a human being whose sensory input was restricted to the small dull level of a cockle or an oyster would be prevented from exceeding these molluscs in any other form of intelligence — 'notwithstanding all that is boasted of innate principles' (Locke, 1689/1982, pp. 148-9).

Perception alone could lead no further, and could certainly not produce learning, unless it was combined with some kind of retention of perceived information. Habituation and conditioned reflexes can be the simplest forms of retention, but in Locke's theory the main form of retention considered is a full-blown model of human memory. In this, a simple perceived idea can first of all be kept 'in view', by contemplation, for some period of time but, as in modern theories of short-term memory, only a small number of items can be kept in view at once, and thus all ideas are at some point put out of view, in a long-term repository. Once they are there, they may be revived or retrieved for further contemplation in the future. All ideas in the long-term store are subject to 'constant decay': those which make the deepest impression, and last longest, are those given motivational significance by pleasure or pain, but repetition, and attention, also help to fix ideas in memory in Locke's theory. Retrieval of an idea from memory is often an active and willed process, but at other times, Locke notes (pp. 150-3), items in memory return to mind 'of their own accord', or are 'tumbled out of their dark cells' and brought to light by tempestuous passions. Taken together, memory processes are essential to all other cognitive processes, and in fact Locke attributes stupidity in general to extreme slowness in the retrieval of ideas from the memory.
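Locke's scheme maps quite directly onto the vocabulary of modern two-store models, and can be caricatured in a few lines of code. This is only an illustrative sketch: the class name, the capacity, and the decay and threshold parameters are my inventions, not anything to be found in Locke.

```python
VIEW_CAPACITY = 4  # only a few ideas can be contemplated at once

class LockeanMemory:
    """A caricature of Locke's model: ideas held 'in view' for a
    while, then put out of view into a long-term repository whose
    traces are subject to 'constant decay'."""

    def __init__(self):
        self.in_view = []      # short-term contemplation: (idea, significance)
        self.repository = {}   # long-term store: idea -> trace strength

    def perceive(self, idea, significance=1.0):
        # A new idea enters contemplation; once capacity is exceeded,
        # the oldest idea is displaced into the long-term repository.
        self.in_view.append((idea, significance))
        if len(self.in_view) > VIEW_CAPACITY:
            old_idea, old_significance = self.in_view.pop(0)
            # Pleasure or pain (significance) makes a deeper impression,
            # and repetition accumulates trace strength.
            self.repository[old_idea] = (
                self.repository.get(old_idea, 0.0) + old_significance
            )

    def decay(self, rate=0.1):
        # All ideas in the long-term store are subject to constant decay.
        for idea in self.repository:
            self.repository[idea] *= (1.0 - rate)

    def retrieve(self, idea, threshold=0.5):
        # Willed retrieval succeeds only while the trace is strong enough.
        return self.repository.get(idea, 0.0) >= threshold
```

On this sketch, an idea fixed by repetition or by strong feeling survives many rounds of decay, while a once-glimpsed idea soon becomes irretrievable — which parallels Locke's account of forgetting, and of stupidity as extreme slowness of retrieval.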

This is fairly typical as an empiricist model of memory; a peculiarity of Locke's is that he assumes that it applies to many other animals, as well as to the human species. This is peculiar partly because it minimizes the role of verbal coding in human memory, but much recent experimental research supports the view that birds and mammals have well-developed memory systems (see chapters 8 and 9). Locke himself was impressed with the memory of imitating birds for previously heard sounds, the more so because in the case of artificial sounds imitation does not seem to be of any use in 'the bird's preservation' (Locke, 1689/1982, p. 155).

It is worth noting that a more sophisticated treatment of the problem of retrieval of information from memory was considered and rejected by Plato. One of the theories discussed by Socrates was that of an imprint-receiving lump of wax, which could be bigger in some people than in others, of harder or softer consistency, and a source of fallibility in memory if the wax is too small or too soft, leading imprints to become crowded on top of one another or smudged (Theaetetus, 191-94). But this does not allow for the difference between stored and retrieved memories. We can in some sense have memories without using them, in the same way that one can have clothes without wearing them; the metaphor which Socrates develops from this is that of an aviary, which starts off empty. Someone goes out and catches large numbers of wild birds and puts them in his aviary — in this sense he has got them. But if he wants to hold and look at a particular one he has the new problem of retrieving it from within the aviary, and he may find that in some cases this is very difficult, or that if some of the birds fly in groups he catches the wrong one, and so on. In this context Socrates says that the aviary/memory store is empty at birth — during life one fills one's memory store with pieces of knowledge as one might fill one's aviary with birds, but there are analogous problems with retrieving the items from the store. The most interesting point made by Plato was that the identification of items used in retrieval may be different from the identification used when catching them in the first place — but he eventually rejects the metaphor altogether on the unrealistic grounds that if one knew enough about a memory to be able to retrieve it from the store there would be no need to carry the retrieval process any further.

Whatever the other details of a theory of perception and memory, it must include the assumption that representations perceived or remembered can be distinguished as same or different: Locke calls this discernment, and says it depends on, among other things, attention and the acuity of the sense organs — in modern theories the concept of discrimination learning in some ways corresponds to this (see chapter 8). A separate cognitive operation in Locke's theory is comparison of ideas with respect to time, place, and other individual features — this is rather like relational as opposed to absolute discrimination, and is the point, curiously, where Locke begins to distinguish human intellectual processes from those of animals, since he believed that animals only compared very concrete features of objects. After that, according to Locke, even foxes and dogs begin to fall far short of human cognition, because they do not compose large complex ideas by combining together simple ones. Children first slowly get ideas fixed in their memories by repeated sensations, then gradually learn the verbal signs of speech, and in order not to have an endless list of names for each slightly different sensory experience they abstract general ideas from particular experiences and objects. Abstraction, according to Locke, is completely absent in all animals. He was not necessarily right about this, but the important point here is that both people and animals are supposed to gradually build up experience, based on sensation, with human cognition achieved by the addition of extra stages of gradual experience over and above purely animal learning.

This would seem to be an empiricist learning theory of a particularly extreme type, but even Locke surreptitiously includes some nature along with nurture, as is made clear in his discussion of the association of ideas. For Locke, most learning and thinking took place because certain ideas have 'natural correspondence and connexion' with others: physical similarity would be the simplest reason why two sensations could be connected together, but it was an essential function of human rationality to produce inherently reasonable relationships among ideas. However, says Locke, if you observe other people closely, you will almost always find some particular thing about them that is rather odd, and in fact many other people are completely unreasonable. This is because some ideas can become connected together by accident or custom, and once this process has started, it becomes very difficult to stop (1689/1982, p. 395). The examples include adults making themselves sick by eating too much honey, and afterwards feeling ill even when just hearing the name of honey; children learning to fear the dark by being told stories which connect darkness to goblins; and the rare case of a young man who practised complicated dance-steps in a room with a large trunk in it, and afterwards found that he could only dance properly either in that same room, or in a room that had a trunk in a similar position. But random mental associations between ideas were thought by Locke to be very common, especially in perverse sects of religion or philosophy, and the foundation of almost all the errors in the world.

This is a very unusual treatment of the association of ideas, and not one much adhered to after Locke, but the general belief that random conjunctions of stimuli may have unexpected and unwanted after-effects is common in several recent theories of the role of conditioning experiences in the origins of neurosis. In Greek theories association of ideas was, on the contrary, a generally beneficial mental technique. In Aristotle's treatment (e.g. Sorabji, 1972) the association of ideas is discussed as a process of recollection, for use, for instance, in memorizing and remembering speeches. Recollection is a matter of a train of ideas in constant succession, and in getting to the next idea in the sequence we are likely to move to something that is similar, opposite or in some sense close to the current one. But arbitrariness of associations can be made use of by the mnemonic method of loci: Greeks memorizing speeches often deliberately associated ideas in the speech with positions in a well-known room, so that when giving the speech they could imagine moving round the room, and be reminded of the point they wanted to make.

The theory of how one is reminded of something was not just a matter of practical mnemonics for Plato, but a crucial part of his theory of innate ideas. First, he said, one could be reminded of something despite dissimilarity — a lover who saw a lyre might visualize the boy he loved who normally played it. Second, one could be reminded by similarity — when seeing a picture of person X, one will recollect the person X themselves. But in this case one will be just as aware of the difference between the cue and the recollection — one may see a very bad caricature of Mrs Thatcher, and contemplate the real Mrs Thatcher, without being tempted to think that the cartoon is Mrs Thatcher. Socrates goes on from this to the innateness of abstract concepts. If we look at two almost equal sticks, we are reminded of the concept of equality, but we do not think the sticks themselves are identical to the concept — among other things, someone else might say the sticks were not equal (the Platonic version of the social psychological experiment of Asch, 1956). We must have had prior knowledge of equality in order to be reminded of it by the almost equal sticks, and since all sense perceptions of equality must be inferior to the ideal concept of equality, in order ever to be reminded of absolute equality we must have had knowledge of it before we were born; thus our soul had to exist before birth, and, Socrates concludes, after death as well. Deducing immortality from the association of ideas is of course far-fetched, and the argument appears rushed, and in a sense this is understandable, as the discussion is supposed to take place just prior to Socrates' execution: but the importance given to recollection in Platonic theory is probably one reason for Aristotle's emphasis on the topic (Sorabji, 1972).

After Locke, the principle of the association of ideas was developed in a quite different direction by Hume (1711-76), who was an extremist on all psychological and philosophical issues, but as a political historian an amiable Tory. Like Locke, Hume assumed that the origin of ideas was in sense impressions, and like Aristotle he assumed that progress from one idea to another in a train of thought would be based on connections due to similarity, contiguity and habitual sequence. But, taking a very radical step of his own, he denied that any particular kind of relation between ideas is more natural or inevitable than another, or that it is possible for human reason to provide logical relations between experienced ideas in any way independent of the quite arbitrary associations between ideas arising from individual circumstances. Thus, in Hume, learning from experience is second to nothing else in its influence on mental life, and, in this sense, he can be regarded as the progenitor of many other learning theories. Another very modern feature of Hume's theorizing was his reduction of all psychological mechanisms to the sequence of one event being followed by another, which he usually referred to as 'cause and effect'. He achieved considerable notoriety by arguing that we have no logical justification for reasoning that bread will be nourishing in the future, because it has been so in the past — what matters for present purposes is not whether he was right about this, but Hume's alternative, psychological explanation, that we expect things to happen in the future as we have experienced them in the past, not through any process of reasoning but because of an irrational process of habit: 'All inferences from experience, therefore, are the effects of custom, not of reasoning' (1777/1970, p. 43).

Hume's only concession to any innately accurate process of reasoning was to divide human thought into two categories: 'relations of ideas', which are completely independent of empirical evidence, and thus confined, in his view, to pure mathematics; and 'matters of fact', which include everything else. All matters of fact can be reduced to one sense impression having followed another, which is described as cause and effect. This allows Hume to give a one-word answer to virtually all psychological questions: 'What is the nature of all our reasonings concerning matter of fact? — the relation of cause and effect. What is the foundation of all our reasonings and conclusions concerning that relation? It may be replied in one word, Experience' (1777/1970, p. 32, original italics).

Perhaps the most significant aspect of Hume’s writing for twentieth-century learning theories was that the key feature of learning from experience was applied universally, and no hard and fast line was drawn between its effects on the human species and its effects on other animals. Thus, ‘It is certain that the most ignorant and stupid peasants — nay infants, nay even brute beasts — improve by experience, and learn the qualities of natural objects, by observing the effects which result from them’ (1777/1970, p. 39). It seemed obvious to Hume that domestic animals such as horses or greyhounds learned from experience, and became more cunning and less impetuous as they got older; and that systematic training with reward and punishment could induce animals to behave in an arbitrary and unnatural manner. He did not pretend that these animals should therefore be assumed to be capable of argument or reasoning, but since he believed that neither children, nor mankind in general, nor even philosophers in their non-professional moments, were guided by pure reasoning, there was less cause for Hume to specify animal inadequacies in this respect than there was for Locke or Aristotle.

There were, however, some exceptions to the universal rule of unreasoning custom and habit, even for Hume — these were the abstract schemes of quantity and number, which did not require empirical evidence. It was worthwhile, Hume believed, to reason out Pythagoras' theorem, but he did not therefore suppose that there was anything innate about it. 'That the square of the hypoteneuse is equal to the squares of the other two sides, cannot be known, let the terms be ever so exactly defined, without a train of reasoning and enquiry' (1777/1970, p. 163). Hume remained, however, extremely sceptical about the value of what he called 'a priori reasoning', if it was not combined with matter-of-fact evidence. Nothing could be learned, he believed, from purely logical manipulations, such as saying 'where there is no property, there is no injustice', with injustice defined as a violation of property, since this was nothing more than an imperfect kind of definition. And for matter-of-fact, realistic knowledge of the world, Hume suggested that thinking things out from first principles in the traditional philosophical way — his 'a priori' reasoning — was completely useless: one could never know whether a stone would fall up or down, or whether one billiard ball hitting another would stop, fly off at an angle, or return in a straight line, on purely a priori grounds, unless one had practical or experimental evidence to go on. Thus he felt that the only possible sources of knowledge were abstract reasoning with a numerical basis, or experimental evidence based on experience. This seems to be a reasonable approximation to modern scientific practice, and therefore should not have caused much of a stir on methodological grounds, but both its philosophical necessity and its psychological truth have been repeatedly questioned.

Both the philosophical and the psychological reactions against Hume were begun by Kant (1724-1804), who as a consequence is frequently referred to as one of the most influential of modern thinkers. Although he is as famous for the obscurity of his writing as he is for the weight of his conclusions, Kant’s elaboration of the whole nature/nurture question had a considerable influence in the early development of experimental psychology (Boring, 1950), and distinctions not unlike those he used have become increasingly important in modern learning theory. The starting point is the distinction between Hume’s ‘relations of ideas’ and ‘matters of fact’. Kant’s terms were ‘analytic’ and ‘synthetic’. Roughly speaking, the former are logical necessities, such as ‘a rose is a rose’, ‘a tall man is a man’, or ‘something large takes up a lot of room’, and possibly also 2 + 2 = 4 and Pythagoras’ theorem; while the other category is of experienced facts such as ‘today is Thursday’ or ‘the present prime minister’s first name is Margaret’. The distinction is partly based on the amount of new factual content: if a sports commentator says ‘Liverpool have just won the European Cup on penalties’, a good deal of new information has been conveyed to the listener; on the other hand, if, as often happens, the commentator has nothing new to say, he may fill in time with analytical judgments,


such as ‘but this game isn’t over until the final whistle’s blown’ or ‘this race is going to be won by the runner who runs fastest from start to finish’.

To anyone with empiricist sympathies, it seems rather unlikely that even logical necessities should be in any way innate, and almost inconceivable that experienced or reported matters of fact, such as may be found in daily newspapers or The Guinness Book of Records, should be influenced at all by knowledge that is inherited. But, if anything, it seems more likely that logical necessities might contain some germ of an innate principle than that unlearned expectations are contained in matters of fact. It was the contribution of Kant to stress the importance of innate first principles, even for the perception of matters of fact. Plato came close to claiming that everyone was influenced by innate concepts of such things as tables and beds, and even of individuals, the argument being that if you are able to perceive Margaret Thatcher, you have to know what it is that you are perceiving before you can perceive it, and therefore the concept of Margaret Thatcher is innate; but this sort of view is rarely defended (though see Fodor, 1980). The Kantian position seems to be much more of a compromise, since he has no doubt that ‘all knowledge begins from experience’; it is just that there are innate categories which are ‘keys to possible experiences’ and which might reasonably be regarded as part of the innate organization of the perceptual apparatus, such as the distinction between one and many objects, and the experience of succession in time (Kant, 1781/1979, pp. 25, 219). Even in Aristotle’s or Hume’s theory of learning from experience, there has to be some intuitive means of first understanding sense perceptions, or of recording that one sensation has followed another in time. Thus to rely only on experience, and nothing else at all by way of innate mental organization, is unrealistic, even for experienced perceptual facts.
However, in terms of modern learning theories, which, unlike Kant’s, but like Hume’s, attempt to account for learning as it occurs in species other than our own, it is necessary to take into account even stronger kinds of innate matters of fact. Bertrand Russell’s instructive example was that a dog may become excited either because it observes its master putting on his


hat (in anticipation of going out for a walk) or (instinctively) because it smells a rabbit. Russell’s belief was that if the dog became capable of talking like a philosopher it would say as regards the smell of rabbit, ‘wherever there is this smell, there is something edible’, and as regards the hat, ‘my master’s putting on his hat is an invariable antecedent to his going out’. Moreover, if the dog was asked how it knew these things ‘he would say, in the latter case, that he had observed it, and in the former, that it was a synthetic a priori intuition’ (Russell, 1940, p. 248).

Clearly, the greater the evidence for instinctive organization of behaviour, the more plausible it is to assume that perceived events correspond to innate principles, and that animals recognize their natural foods, or fellow species-members, as conforming to some kind of ‘innate releasing mechanism’, as present-day ethologists have suggested (Tinbergen, 1951). Even David Hume recognized this, and, in order to defend the continuity between human and other species, said that learning from experience was itself an instinct, and that avoidance of being burned by fire in man was as much an instinct as that by which a bird incubates eggs (1777/1970, p. 108). This is probably not true, but illustrates the fact that even an arch-empiricist like Hume could not avoid getting instinctive processes mixed up in his learning theory. In subsequent chapters it will become apparent that the degree to which any species brings innate dispositions to bear on its reactions to experience is still a matter of intense experimental and theoretical investigation.

Darwinian evolution

Questions of both learning and instinct and, above all, of the relation between human and animal psychology, were given a new impetus by the rapid growth of biological knowledge in the nineteenth century, and in particular by the widely discussed Darwinian theory of evolution. As it is now understood, evolutionary theory tends to emphasize first and foremost genetically inherited characteristics, and therefore innate and instinctive determinants of both animal and human psychology, as is illustrated by the ‘selfish gene’


axioms of modern sociobiology (Hamilton, 1964; Dawkins, 1976; Wilson, 1975). But, increasingly, the study of behavioural ecology and particularly concern with the foraging techniques of individual species is leading to theories in which the biological utilities of the learning process are fully recognized (Krebs and Davies, 1983; Kamil and Roitblat, 1985; Lea, 1981, 1984a).

It is important to establish that one conceivable and initially popular relation between learning and evolution is now known to be biologically impossible. Darwin himself incorporated into his own speculations (especially in discussing emotional expression) Lamarck’s suggestion that useful habits acquired by parents could be passed on to their offspring by a biological form of inheritance. Thus, Darwin suggested that ‘some instincts have been developed simply through long-continued and inherited habit’, and that this explained why some instinctive behaviours appeared now to be superfluous or counter-productive. Dogs which now scratch the ground without burying anything might be repeating a movement ‘originally followed by some remote progenitor of the dog-genus for a definite purpose’ (Darwin, 1872/1965, pp. 42-4). Darwin was mistaken about this. Behavioural evolution must take place by the other mechanism he suggested — ‘through the preservation of variations of pre-existing instincts — that is, through natural selection’ (Darwin, 1872/1965, pp. 42 and 44). There is a certain appeal to the notion that the personal achievements of one generation can be passed on to the next, perhaps because a parent who had arduously acquired a knowledge of statistics, or a foreign language, might wish that this could be transferred to their offspring. But there is no biological mechanism by which information can be moved from the brain to the testicles or the ovaries of the parent. This was suspected by Weismann at the turn of the century, and is confirmed by all known biochemistry (Maynard-Smith, 1975; Dawkins, 1982). It applies not only to complex psychological information, but equally to habits discussed by Darwin such as the laying back of the ears by animals when fighting (which serves a protective function), and to purely physical results of the environment on growth and maturation.


Thus all instinctive programming of behaviour, and all innate influences on learning, must result from the selection of individuals most favoured by random variation in the instinctive and innate influences themselves. This allows, however, for a spectrum of behaviour-controlling mechanisms, with many unexpected combinations of inherited predisposition and learned individual initiative. At one extreme, perception can be regarded as equivalent to the opening of a lock by a certain innate key — all that can be perceived in this case being whether or not the innate key is present. At the other extreme, quite artificial and novel stimulus information can be correctly categorized, though no doubt this process depends to some degree on built-in capacities. Most of human perception is surely of the latter kind — we have no difficulty in recognizing an electric typewriter or a television screen when we see one, though no genetic assistance for this particular task could possibly have been inherited from the effects of selection on previous generations. It is also the case that domestic and laboratory animals can learn to respond appropriately to stimuli from human artifacts, which they equally are in no sense genetically programmed to deal with. However, in many species, and not only in behaviourally simple animals, it is possible to find cases where sensory perception seems to correspond more closely to the lock-and-key model.
In a very similar way, using the keyboard of a typewriter is a motor behaviour that cannot have been inherited or directly affected by evolution (and, although other animals do not type, they may be trained to ride bicycles and so on), but from the human knee jerk to the wriggling of worms it is possible to find patterns of movement that are undoubtedly inherited, and some of these, especially those required for the construction of nests or shelters (in either wasps or weaver-birds), may be both complex in themselves and complex in their relation to environmental stimulation.

Ticks and toads

Ticks are arachnid arthropods, which have a very simple style of life, even by comparison with their close relatives the spiders, since they are specialized as parasitic bloodsuckers.


A tick may stay clinging to a twig, perhaps for months, until exposed to a low concentration of butyric acid, a component of the smell of a mammal, released from skin glands. It then drops, falling on to the mammal, attaches itself to the mammal’s skin, and gorges itself with blood. When fully fed, it may have a strong tendency to climb upwards when on vertical surfaces, but this would be largely due to the fact that its body is swollen with blood behind the region of the legs, so that the mechanical force of gravity will keep the tick’s body oriented in an upward direction (Fraenkel and Gunn, 1940). Thus the climbing up to a high position in a tree, and the falling off on to a mammal below, are greatly facilitated, if not wholly determined, by two rudimentary innate influences (namely, the reflexive response to butyric acid, and the physical tendency to crawl upwards when full).

Somewhat more complicated lock-and-key systems appear to control the feeding responses of frogs and toads. They do not have tracking eye movements, but respond to moving visual stimuli of a particular size and shape with a stereotyped sequence of responses. In the common toad Bufo bufo, studied by Ewert (1982), this consists of first orienting the whole body towards the stimulus, approaching it, and then snapping, swallowing and gulping after binocular fixation. By putting toads inside glass cylinders and presenting standard visual stimuli outside the cylinder, Ewert and others have isolated certain innate stimulus preferences. For instance, a stripe moving in the direction of its long axis is likely to elicit prey-catching activities, in proportion to its length, but the same stripe, moving perpendicularly to its long axis, is not. Ewert (1976, 1982) refers to this as the ‘worm/antiworm’ phenomenon. Since it occurs immediately after metamorphosis, and can be traced to the selective response of cells in the ganglion layer of the retina and further on in the toad’s visual pathway, it seems undeniable that this form of perceptual categorization is inherited and is not learned from individual experience of worms. Many other forms of response to visual stimuli by frogs and toads, including the direction and distance of jumps away from threat, appear to be innately programmed (Ingle, 1982). A large South American toad, Ceratophrys ornata, sits hidden in vegetation until a moving stimulus the size of a


small rodent appears, whereupon it leaps towards the stimulus with open mouth, ready to swallow a whole mouse or lizard. In captivity it has been known to eat whole rats, and since it operates according to the rule of the larger the approaching stimulus the better, and fails to distinguish between its normal prey and the human hand, it has to be approached with caution by the human observer (Grzimek, 1974).
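Ewert’s worm/antiworm result amounts to a simple classification rule, which can be sketched as follows (purely as an illustration: the function name and the numerical scale are invented for this sketch, not taken from Ewert’s papers):

```python
def toad_prey_response(stripe_length, motion_axis):
    """Sketch of the 'worm/antiworm' rule reported for Bufo bufo.

    A stripe moving along its long axis (the 'worm' configuration)
    elicits prey-catching, with a tendency that grows with its length;
    the same stripe moving perpendicular to its long axis (the
    'antiworm' configuration) does not.  The numeric response scale
    and the cap are invented for illustration only.
    """
    if motion_axis == "along":            # 'worm': releases prey-catching
        return min(stripe_length, 30)     # response proportional to length, capped
    elif motion_axis == "perpendicular":  # 'antiworm': no prey-catching
        return 0
    raise ValueError("motion_axis must be 'along' or 'perpendicular'")
```

The point of the sketch is only that the stimulus filter is fixed in advance of experience: the same physical stripe falls on one side or the other of the rule depending solely on its direction of motion.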

Learned modifications to innate releasing mechanisms

The chemical stimulus which results in the falling of the tick, and the visual stimulus which elicits worm-catching behaviour in the common toad, could well be used to support the concept of the innate releasing mechanism, or IRM, introduced by Tinbergen (1951) in his book The Study of Instinct. An example used by Tinbergen himself was the reaction of infant herring gulls to cardboard models which bore varying degrees of resemblance to a real herring gull profile (see Figure 1.3). Adult herring gulls have yellow beaks with a prominent red dot at the end, to which infant ‘begging’ responses are directed: grey bills with a prominent dot are preferred to grey bills with no dot, and there is a preference for the dot to be red in addition to this. Since other gull species with slightly different adult appearance have roughly corresponding infant preferences, it seems reasonable to assume that innate templates or filters are responsible for the species-specific differences (Hailman, 1962; Weidman, 1961). The very first responses of all newborns are not likely to be much influenced by learning, despite empiricist arguments like that of Kuo (1924, 1932) which claimed that the embryo has learned what is necessary; and in the case of nestling birds, the cuckoo provides an undeniable example of innate behaviour patterns. The infant cuckoo, as soon as it is hatched, laboriously manoeuvres all other eggs and nestlings out over the brim of the nest. It must then make begging responses, consisting partly of gaping of the mouth, which impel parental feeding by the unwitting host species. But the later and more impressive example of its unlearned perceptual abilities is that it must, when adult, be able to


recognize a fellow member of its own species, of the opposite sex. It would therefore be foolish to deny the possibility of a very large measure of inherited control over the reactions of individual birds, in certain species. But this does not mean that all bird species employ the same measure of innate control, or that individual learning cannot be added on to initial innate preferences.


Figure 1.3 Innate visual preferences in herring gulls.


Tinbergen’s own experimental work provides ample evidence of learned modifications to such initial preferences. For instance Tinbergen and Kuenen (1957), in their experiments on the reactions to artificial stimuli of nestling thrushes, observed that if a very natural, adult thrush-like stimulus was presented with no following feeding experience, the infants very soon refused to respond to it, whereas if an artificial non-preferred stimulus was followed by food, the young birds began to respond to it as if it was the natural


releaser. It is undoubtedly the case that a great deal of perceptual learning takes place during the early life of many nestling (‘altricial’) bird species. Both herring gulls (Tinbergen, 1953) and terns (Stevenson et al., 1970) learn to recognize their own parents’ vocal calls after only a few days. Other social learning is more gradual (Klinghammer, 1967), and several passerine species, for instance bullfinches (Nicolai, 1959) and canaries (Marler and Waser, 1977) have their song production strongly influenced by their own auditory experience.

Imprinting and social learning

It is well known that non-nestling, ‘precocial’ birds such as ducks, geese, or domestic chicks, undergo a very rapid form of social learning, and become ‘imprinted’ on the visual impression of moving objects present in the first few days of their experience, whether or not these correspond closely to the appearance of their natural parents. This is apparently a specialized learning process, since it is time-constrained, not easily reversed, and not influenced in the expected direction by external rewards and punishments, but clearly, if young birds become ‘imprinted’ on the human form (Lorenz, 1967) or on an experimental football (Hess, 1959; Bateson, 1966, 1979), then perceptual information from the environment has become inserted into the animal’s cognitive system, in a thoroughly empiricist way.

Although this kind of early social learning is most obvious in birds, and possibly more specialized, a combination of early innate preferences and more protracted individual learning of social recognition is typical of mammals. Experiments on the taming of wild animals and observations of domesticated species make it clear that, while there may be innate components of social recognition, visual, auditory and olfactory, the social experiences of individuals strongly determine their social behaviour. This is best illustrated by the experiments of Harlow (1958; Harlow and Suomi, 1970) who demonstrated that deprivation of normal interaction with a natural mother led to thoroughly abnormal behaviour in rhesus monkeys, only partly alleviated by the substitution of a cloth ‘surrogate mother’. Similarly, for human infants, though


human ethologists would expect that response to eye contact and response to the sounds of the human voice are to some extent innate in human babies, no one would suppose that the experiences of a human infant are not a profound influence on all aspects of its future behaviour.

Individual learning and foraging

For the naturalistic study of animal behaviour, an increasingly significant theoretical base is provided by mathematical analysis of the efficiency of various patterns of food-seeking, or foraging, and the attempt to relate these to ecological factors such as the amount of variability in food distribution over space and time, the number of different types of food available, and so on (Krebs and Davies, 1983; Kamil and Sargent, 1981; Charnov, 1976). It goes almost without saying that there is massive innate influence (and perhaps here it is appropriate to call it innate determination) on what foods a particular species will seek, and how it will go about seeking them. Swallows will scythe through the air for insects, sparrow-hawks will catch sparrows, antelope will graze, and lions will hunt antelopes. The swallow does not have to learn to be a swallow, and it is quite possible that this particular species does not have to learn a great deal about its foraging strategies. However, there are some bird species which apparently have to learn even the basics of their trade, and a great many other species in which minute-to-minute and day-to-day knowledge about food locations and types is essential to the success of their foraging forays.

A species which seems to be born ill-prepared for its main task in life is the oyster-catcher. This may be connected with the fact that this bird, in Britain, does not eat oysters, which are in short supply and expensive, but mussels, which are relatively abundant. Although young oyster-catchers are mobile soon after birth, like ducklings and chicks, they are not able to feed themselves for at least six weeks, and sometimes not for six months, after hatching (Norton-Griffiths, 1967, 1969). By the time the young are fledged, at six weeks, they are fairly good at probing for worms in sand or mud with their long beaks, but they are still ‘hopelessly inefficient’


at opening up mussels or other molluscs, or at dealing with crabs. If the main available food is mussels or crabs, they thus have to follow their parents to the feeding ground for the food to be presented to them ready-to-eat.

During the months that this continues, the young are not merely growing stronger but are also gradually learning the techniques necessary for feeding themselves. The evidence for this is partly that some never learn, and die, and partly that, with mussels, two quite distinct techniques are used in foraging by adult birds. For mussels covered by shallow water, the bird searches for one which is open, and stabs into it with its beak, cutting through the muscle which would otherwise close the shell. But for mussels fully uncovered by the retreating tide, and therefore tightly closed, the adult pulls one loose, carries it to a hard patch of sand, and hammers it open at the weakest point, which is the flat undersurface. The point is that individual adults specialize in one or other of these methods, and that their offspring appear to learn from example, and adopt the particular technique most favoured by their parents (Norton-Griffiths, 1969).

Clearly skills in bill use might be refined by practice, even if the rudiments are innately given, as is true for the motor performance of flying in birds. Other kinds of learning are involved in perceptual search and identification. An animal has to be able to recognize when it has found what it is looking for, and it may search more effectively if it knows what it is looking for: the term ‘search image’ is sometimes used to explain selective response to certain types of prey (Tinbergen, 1981; Croze, 1970; Drent, 1982). The observable facts are ‘that animals may show selective responsiveness to particular stimulus patterns and that the effective patterns may be determined by previous experience with them or related patterns’ (Hinde, 1970, p. 125). In part, the concept of a search image overlaps with more general hypotheses about pattern recognition (see chapter 9), but the natural utilities of searching behaviour are most directly reflected in the way that expectancies of food-finding may vary with induction from recent experience. Croze (1970) simply showed that crows, which normally show little interest in half-shells of mussels lying on sand, immediately began


turning them over after he had laid out a row of shells with pieces of meat alongside them, and then had hidden pieces of meat under the shells. Krebs et al. (1972) gave great tits the semi-natural task of foraging for meal worms hidden in an aviary which contained 18 each of the following types of hiding place: large cups filled with torn-up paper; half pingpong balls filled with sawdust; wooden poles with a small hole drilled in the top, covered with masking tape. Only one meal worm was hidden for a particular test, and whichever bird in a group found this had a strong tendency to try other hiding places of the same type.

Apart from search images of a particular kind of prey, or a particular kind of prey location, a number of other decisions made during foraging involve short- or long-term learning about features of the environment. Drent (1982) discusses the different time scales involved in decisions made by parent starlings feeding their young, which were studied by J. M. Tinbergen (1981). On setting out from the nest these birds had to aim either close to the colony, where they could easily find leatherjacket grubs, or much further away, where they could less easily find much more palatable caterpillars. On the whole they preferred to go for the higher-quality but difficult-to-get caterpillars, unless their large brood was very hungry, when they more often brought back the easy-to-get but less desirable leatherjackets. For either type of prey individual birds had to select a ‘macro-patch’, that is, a general region to fly to in order to begin searching, and also a ‘micro-patch’ — not looking for too long on one particular leaf, for instance, before moving on to the next. A final decision, which also might be determined to some extent by trial and error, is how much food to collect before flying back to the nest. This sort of psychological problem, as set to individuals of a particular species by its behavioural ecology, is a very active area of current research, and even the briefest examination of it emphasizes how much room is left for learning from experience in behaviours that at first sight look largely instinctive (Boice, 1984; Fragaszy and Mason, 1983; Leger et al., 1983). Lea (1981, 1984a) has gone so far as to suggest that learning as a process is a device evolved by animals as an aid to their foraging activities.
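The starling decision Drent describes can be caricatured as a two-branch rule, sketched below; the function name, the hunger scale and the threshold value are all invented for the illustration, and are not parameters estimated from J. M. Tinbergen’s data:

```python
def choose_prey(brood_hunger, hunger_threshold=0.8):
    """Crude sketch of the parent starling's prey-choice decision.

    Normally the bird flies far from the colony for the more palatable
    caterpillars; when its brood is very hungry it switches to the
    easy-to-find but less desirable leatherjackets close to the colony.
    brood_hunger is an invented 0-1 scale; the threshold is likewise
    illustrative, not a measured quantity.
    """
    if brood_hunger > hunger_threshold:
        return "leatherjackets (near colony, easy to find)"
    return "caterpillars (far away, harder to find, more palatable)"
```

Even so crude a rule makes the point of the passage: the macro-patch decision depends on a state variable (brood hunger) that can only be set by the bird’s recent experience, not by instinct alone.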


Ideas versus reflexes and explanations versus descriptions

The current emphasis in naturalistic and evolutionary analyses of animal learning may be in terms of decision-making, information-processing, and cognition, but this has not always been so. Darwin and his collaborator Romanes (1883, 1886) tended to get carried away by anecdotal evidence for very highly intelligent behaviour in parrots, sheepdogs and farm animals, and the desire for a sounder scientific base for theories of learning in animals prompted Morgan (1894) to propose what is now known as ‘Lloyd Morgan’s canon’: ‘In no case may we interpret an action as the outcome of the exercise of a higher psychical faculty, if it can be interpreted as the exercise of one which stands lower on the psychological scale’. This implies a rule of always choosing the simplest possible explanation — the scientific law of parsimony — which has much to recommend it. There is a problem, however, in defining simplicity, and in deciding exactly what range of evidence must be included under any particular explanation — the universal application of Lloyd Morgan’s canon does not solve all theoretical issues. Interest in the phenomena of learning, both human and animal, has been closely connected with a running argument about the level of explanation to be applied to the phenomena, which cuts across the nature/nurture issue, and which is not over yet.

The course of the argument has gone roughly like this: at the turn of the century Pavlov in Russia and Thorndike in America were independently trying to establish a rigorous and scientific approach to learning, which would make use of laboratory experiments with animals, but which could also be widely applied to human behaviour, particularly in the psychology of education and in the treatment of mental illness. Shortly afterwards J. B. Watson formally proposed that the study of learning and conditioning, and indeed the rest of psychology as well (which he took to be largely determined by learning and conditioning), should embrace the programme of behaviourism, and thus become ‘a purely objective experimental branch of natural science’ (Watson, 1914/1967, p. 1). In all these developments, the emphasis was


on what could be directly and quantitatively measured, with explanations for the measurements kept as parsimonious as possible, and therefore differing as little as possible from the measurements themselves. Thus theories were couched in terms of conditioned reflexes, physiological processes closely tied to reflexes, and simple habits, or stimulus-response connections (stimuli and responses both being directly observable: see Boakes, 1984, From Darwin to Behaviourism). A considerable amount of progress was made in finding experimental techniques to test such theories, and the mathematical principles of behaviour put forward by Hull (1943, 1952) were extremely influential in their day. However, the behaviourist and stimulus-response approach has fairly obvious limitations, especially, but not only, as applied to human learning. In particular, the theoretical issues associated with attention, perception and memory were largely ignored by Hull. In terms of the Darwinian and evolutionary explanation of animal psychology, the extreme empiricism of behaviourism meant that all species of animal were taken to be equal as collections of spinal reflexes. Differences between species were therefore ignored, or denied as a matter of principle, and thus the role of learning processes in the natural life of a species was not often considered. Even for the experimental evidence from a single domesticated species — the laboratory rat — the strictly stimulus-response theorists were vulnerable to criticism, and the work of Tolman (1932, 1959) probed at these weak points.

Tolman himself advocated a ‘purposive behaviourism’, which sounds like a contradiction in terms. What he did in practice was to seek soundly based experimental evidence for explanations of animal learning which invoke inner and unobservable psychological processes, such as the mental inspection of ‘cognitive maps’ and the formation of ‘expectancies’ as to future events (Tolman, 1948). By and large, the study of animal learning after about 1970 follows either Tolman or Darwin, or both. Laboratory research on domesticated species seeks to support theoretical models of animal learning which presume that internal representations of outer events are encoded, sorted, retrieved and rehearsed in the animal’s memory (Hulse et al., 1978; Spear and Miller, 1981;


Roitblat et al., 1984), while more naturalistic studies attempt to relate the cognitive capacities and proclivities of a given species to the behavioural ecology of its conditions of life — a thoroughly Darwinian enterprise.

Human and animal learning

Both the emphasis on cognition and the emphasis on the specializations of particular species bring up the question of the relation between human and animal learning. Few apart from Hume (1777/1970), Hull (1943, 1952) and Skinner (1953, 1972) have ever proposed that the relation should be one of identity. If nothing else, the ordinary human use of spoken language means that the content of what a person is potentially able to learn has little in common with the content of what is learned in any other species’ natural life, or in the most carefully contrived animal experiment. Most modern learning theorists would therefore accept that the acquisition of knowledge by human individuals must differ in some very striking ways from the learning processes available to any sub-human species. There are, however, a number of reasons why questions about animal learning should continue to be asked.

1 The evolution of learning

Many of us will remain interested in the abilities of other species, whether or not these abilities are of importance in the context of a specifically human psychology. But, on the basis of Darwinian evolution, it seems reasonable to assume that specifically human abilities are not entirely unrelated to those of our animal relatives, and of course there are many aspects of human behaviour that are by no means specifically human. When we are pricked we bleed, when we are hungry we want food, and when we are angry we are aggressive — in talking or thinking about these events we may enter into the realms of exclusively human cognition, but the processes which bring them about are not always so rarefied. Arguably, to fully understand human learning, if that is our goal, we must know exactly how and to what extent it differs from


animal learning, and we must therefore examine animal learning as well.

2 Animal cognition

If it were true that, on the one hand, all animal behaviour was controlled like simple clockwork, and, on the other, that all human behaviour was controlled by biologically unique reasoning processes, then, although we would still want to know how the reasoning evolved from the clockwork, in practice there would be no overlap between human and animal psychology. However, from Freud to modern cognitive science (e.g. Johnson-Laird, 1983; Sulloway, 1979) there is agreement that the rationality of men and women is frail and suspect, and that various quite unconscious motivations and methods of inference underlie some of our simplest subjective impressions of reason. More recently, theories of animal learning have stressed that processes more complex than clockwork-like stimulus-response connections are implied by the results of laboratory experiments, and by natural patterns of foraging (Mackintosh, 1974, 1983; Dickinson, 1980; Hulse et al., 1978; Spear and Miller, 1981; Griffin, 1984; Walker, 1983a; Kamil and Sargent, 1981; Roitblat et al. 1984). In particular, basic aspects of the perception, identification and recognition of objects, and coding, storage and retrieval of remembered events, are studied in animal experiments (see chapters 8 and 9). This strengthens considerably the argument above, that some of the elements of human cognitive abilities must be understood with reference to the much more general evolution of methods of information processing in animal species, some of which may be very widespread. The complete panoply of human psychology did not arise in splendid isolation from the biological world. Some parts of it undoubtedly did, and therefore the gains from the study of the basics may be rather fewer in psychology than they are in biochemistry and genetics, but there are many points of contact between theories of human and animal learning.


3 The importance of learning in human psychology

One of the main ways in which human learning differs from animal learning is that there is simply more of it. Even without taking the extreme empiricist view that everyone’s mind can be considered completely blank at birth, waiting to be filled in with all the details of individual personality and temperament, as well as those of vocabulary, social knowledge and higher education, it has always been obvious that upbringing, training and apprenticeship, and the informal explorations of everyday life, are aspects of personal experience that mould and shape character and skill. Plato, with whom we began as the best example of someone who does not believe in the theoretical basis of learning from experience, was second to none in emphasizing the practical realities of learning in upbringing and training. His recommendations for a properly organized form of human society, in the Republic, are usually criticized precisely because of the extremes to which Plato appeared willing to go in order to regulate individual experience. The recommendations begin, for instance, with the necessity of abolishing all then-current Greek myths and nursery rhymes, as told to children by their nurses and mothers, and the issuing of specially fabricated approved versions. Since the Greek myths, of which the Oedipus legend is a fairly run-of-the-mill example, seem frequently to involve violence and/or sex between parent and child, of an explicitness well up to modern video-nasty standards, one can perhaps be sympathetic to this particular kind of censorship; but Plato intended to be far more thoroughgoing, since all literature and poetry, for all ages, was to be inspected and bowdlerized, the originals being retained for inspection by the rich on payment of a stiff fee. The same was to apply to children’s games, and to all drama, painting and music, since exposure to any of these could affect character development.
For instance, novel fashions in music would be banned by Plato as dangerous to the whole fabric of society. On a more positive note, high-minded civil servants would have their noblest sentiments schooled by habit, since their environment was to be carefully controlled from birth. They were to be given rigorous physical training in late adolescence, and a thorough academic education between the ages of 20 and 35, practical experience of life being gained between 35 and 50, at which age the best of them might be fit to divide their time between further study and high-level administration. As this suggests, the ancient Athenians were fanatical about education, physical training, and the practising of music and crafts of all kinds. The rare scepticism of the Meno (suggesting that there are some things which are inborn and which cannot be taught), with which this chapter began, was argued partly on the basis of Socrates’ surprise and disappointment at the fact that several famous and successful Greek leaders had had sons who had turned into bad characters, against the best intentions of their fathers. It is of course not very difficult to think of empiricist reasons why this might happen: in Plato’s Republic the problem is solved at a stroke by the abolition of normal family life for the ruling classes.

Thus, although it is possible to find theorists who claim that, in principle, important aspects of human psychology are inborn and innate (the linguist Chomsky (1957, 1980) being the modern equivalent of Plato in this respect), absolutely no one can deny that, in practice, culture, schooling, relations with parents and peers and, indeed, exposure to nursery rhymes and legends, or, in modern times, the impact of videos, advertising and the myths of the media, are all the stuff and substance of real-life human psychology. And all involve the notion of learning, and the acquisition by the individual of knowledge from the environment. An adequate theory of learning should therefore provide a scientific basis for empiricism; that is, it should help explain how human culture is biologically possible. The biological side of this will involve animal learning, for the reasons given in the preceding two sections. A further reason for animal experimentation is that it allows for the investigation of the effects of motivationally significant events on learning processes. Subjects in experiments on human cognition may vary in their interest, arousal and commitment, but rarely does their performance in the experiment represent a goal of an emotional intensity equivalent to that of even everyday personal social interactions or personal life ambitions. By contrast, in experiments on animal learning, such as those which will occupy most of the pages of the remaining chapters in this book, the experimenter is free to manipulate an animal’s hunger or level of physical discomfort in ways which would be completely unacceptable if the subjects were people.1


1     Voices are now often raised with the claim that anything unacceptable in the treatment of a person should be equally unacceptable in the treatment of any other species — striking examples, in their way, of belief in the relation of identity between human and animal psychology. A more considered view is usually taken by psychologists with specialized knowledge of the subject, and an example can be found in a report by a working party set up by the British Psychological Society (1979). Anyone conducting research with animals in English-speaking countries has both a moral and a legal responsibility to keep the animals involved in appropriate conditions and to minimize suffering and discomfort.

