• Morgan, A. (2014), 'Representations Gone Mental', Synthese, 191(2): 213–244 [ pdf ] [ doi ] Abstract:
    Many philosophers and psychologists have attempted to elucidate the nature of mental representation by appealing to notions like isomorphism or abstract structural resemblance. The ‘structural representations’ that these theorists champion are said to count as representations by virtue of functioning as internal models of distal systems. In his 2007 book, Representation Reconsidered, William Ramsey endorses the structural conception of mental representation, but uses it to develop a novel argument against representationalism, the widespread view that cognition essentially involves the manipulation of mental representations. Ramsey argues that although theories within the ‘classical’ tradition of cognitive science once posited structural representations, these theories are being superseded by newer theories, within the tradition of connectionism and cognitive neuroscience, which rarely, if ever, appeal to structural representations. Instead, these theories seem to explain cognition by invoking so-called ‘receptor representations’, which, Ramsey claims, aren’t genuine representations at all: despite being called representations, these mechanisms function more as triggers or causal relays than as genuine stand-ins for distal systems. I argue that when the notions of structural and receptor representation are properly explicated, there turns out to be no distinction between them. There only appears to be a distinction between receptor and structural representations because the latter are tacitly conflated with the ‘mental models’ ostensibly involved in offline cognitive processes such as episodic memory and mental imagery. While structural representations might count as genuine representations, they aren’t distinctively mental representations, for they can be found in all sorts of non-intentional systems such as plants. Thus, to explain the kinds of offline cognitive capacities that have motivated talk of mental models, we must develop richer conceptions of mental representation than those provided by the notions of structural and receptor representation.
  • Morgan, A. & Piccinini, G. (2017), 'Towards a Cognitive Neuroscience of Intentionality', Minds & Machines [ pdf ] [ doi ] Abstract:
    We situate the debate on intentionality within the rise of cognitive neuroscience and argue that cognitive neuroscience can explain intentionality. We discuss the explanatory significance of ascribing intentionality to representations. We first focus on views that attempt to render such ascriptions naturalistic by construing them in a deflationary or merely pragmatic way. We then contrast these views with staunchly realist views that attempt to naturalize intentionality by developing theories of content for representations in terms of information and biological function. We echo several other philosophers in arguing that these theories over-generalize unless they are constrained by a theory of the functional role of representational vehicles. This leads to a discussion of the functional roles of representations, and of how representations might be realized in the brain. We argue that work remains to be done to identify a distinctively mental kind of representation. We close by sketching a way forward for the project of naturalizing intentionality. This will not be achieved simply by ascribing the content of mental states to generic neural representations, but by identifying specific neural representations that explain the puzzling intentional properties of mental states.

In Progress

  • Morgan, A. (submitted), 'Mindless Accuracy' Abstract:
    Following Brentano (1874), many philosophers have held that intentionality is the mark of the mental. Most philosophers of mind these days think of intentionality as representational content, where content is understood in terms of the conditions under which a state is accurate. Thus a contemporary reformulation of Brentano’s view holds that ‘underived’ accuracy conditions are psychological in nature. Call this the Content View. The Content View is tacitly held by many, and is presupposed by a variety of projects in the philosophy of mind. In this paper I argue that the Content View is false; nature abounds with mindless representations. I begin by pointing out that the most mature and widely-endorsed theories of content in contemporary philosophy of mind, namely the ‘tracking’ theories developed within the naturalizing semantics tradition, are widely acknowledged to encompass states of all sorts of mindless systems such as plants or thermostats. Tracking theories thus result in a ‘dementalized’ conception of representational content, and entail that the Content View is false. Philosophers of mind must therefore look elsewhere for a theory of content that will ground the Content View. An apparently promising place to look is Tyler Burge’s (2010) theory of perceptual content, which is explicitly intended to avoid the dementalizing consequences of tracking theories. Burge holds that accuracy conditions play a crucial explanatory role in perceptual psychology, and therefore demarcate a distinctively psychological kind of representation. However, I argue that Burge’s theory dementalizes representation for precisely the same reason tracking theories do: ascriptions of accuracy conditions play crucial explanatory roles in various sciences other than psychology, as in explanations of how plants are able to synchronize their activities with the day-night cycle. Indeed, the notion of explanatory accuracy plays the same essential role in tracking theories as it does in Burge’s theory, so there’s a sense in which Burge’s theory just is a tracking theory. Although tracking theories pick out a legitimate notion of representation, they don’t pick out a notion of distinctively mental representation. I close by suggesting that if we are to illuminate the nature of mental representation, we must move beyond the bare notion of explanatory accuracy and take seriously the mentalistic nature of genuine intentionality. We must provide an account of how an informational state is integrated within the capacities of a psychological subject such that it is intelligible as a state of a subject’s inner perspective on an outer, objective world.


Dissertation

  • Morgan, A. (2014), 'On the Matter of Memory: Neural Computation and the Mechanisms of Intentional Agency', PhD dissertation, Rutgers University [ pdf ] [ abstract ]

Dissertation Committee:

Frances Egan (co-chair)
Robert Matthews (co-chair)
Charles Randy Gallistel
Gualtiero Piccinini (external)