SoxProspects News
© 2003-2024 SoxProspects, LLC
The nature of consciousness
Post by incandenza on Apr 15, 2021 10:15:13 GMT -5
So here's my basic approach to these issues (and basically what I meant by my comment about Nagel): modern thinking about mind is basically stuck in the dualistic ontology emblematized by Descartes. Of course, in our day and age we are much too sophisticated to think anything like a "spirit" or "soul" exists, and the naive position is scientific materialism. But we tend to essentially equate the mind and the brain. The problem with doing this is that the mind and the brain are quite clearly not the same thing. No matter how many facts you know about the brain, you will not get to the mental experience which depends upon (but is not reducible to) it.
This is what people usually think Nagel is saying, but like I said, Nagel actually leaves open the possibility that we can get to subjective experience scientifically. I think this is wrong. And it's not because the mind is some mysterious substance beyond understanding or anything like that; it's just that science works (to be horribly reductive) by treating the world as a world of objects, and experience cannot be made into an object.
In other words, there is a whole ontology that undergirds our naive thinking about the mind, as pervasive in our age as Christian theological conceptions of the soul in the middle ages. Most cognitive scientists accept this ontology without even realizing it.
And unfortunately, so do a lot of analytic philosophers (like Schwitzgebel). This is silly of them, because phenomenology in the Continental tradition (Husserl, Heidegger, Merleau-Ponty, etc.) is much more attuned to these ontological assumptions; the method is to bracket all such assumptions, and simply start from experience itself, on the premise that we have to have a clear understanding of what mental experience is, and eliminate our unwarranted ontological assumptions as much as possible, before we can start speculating on stuff like the hard problem of consciousness.
Analytic philosophy of mind has sort of been re-inventing the wheel in recent years by recognizing the importance of phenomenological reflection, something that Schwitzgebel seems to be into. But a quick glance at his references makes it clear he's not engaged with the century-old tradition of this in Continental philosophy. I have no doubt that he has some interesting observations, and maybe some enlightening things to say about the easy problem, but I think there are more promising directions to go in exploring the nature of consciousness.
If you're interested in people who are trying to meld traditional phenomenological insights with modern cognitive science, check out the journal Phenomenology and the Cognitive Sciences. You might already be familiar with the term, but if not, google 'enactivism.' Or for an overview of phenomenology in general, just start here.
ericmvan
Veteran
Supposed to be working on something more important
Posts: 9,027
Post by ericmvan on Apr 16, 2021 3:46:58 GMT -5
Is it really true that Nagel "leaves open the possibility that we can get to subjective experience scientifically?" His entire argument is based on the inherently first person nature of phenomenality. I've read his controversial Mind and Cosmos (and agree with a decent amount of it) and there's nothing to suggest he's softened his position on that. Otherwise, I agree with your first two paragraphs (and probably the third). One of my happiest moments reading the literature was discovering that Wilfrid Sellars, one of the top guys in the field, not only labeled mind-brain identity theory "absurd" as early as 1961, but he did so in one of his two most famous papers ("Philosophy and the Scientific Image of Man").
I hadn't heard of "enactivism," but I see that it's an extension of embodied cognition. I'm thinking of the old putdown formula "this is original and [other positive attribute], but the part that's original isn't [attribute] and the part that's [attribute] isn't original." I think the fact that sufficiently advanced organisms select their environments rather than occupy them by chance or passively is a nice insight, but I'm very dubious that it explains much of anything; it would rather be something to keep in mind when developing a complete picture.
Quoting Wikipedia ... "Organisms do not passively receive information from their environments, which they then translate into internal representations." Umm, straw man? Of course the information from the environment is received via action; I would regard the insight that it isn't passive as trivial at best.
As far as representations are concerned, let me begin by pointing out that something that resembles connectionism may be going on at low levels, so the opposition with representationalism is somewhat of a false dichotomy. But the notion that the brain doesn't have representations is a non-starter. I got into neuroscience out of self-defense c. 1992, after developing a baroque sleep disorder full of previously unknown symptoms. When I started thinking about how the brain worked, I eventually decided that representations are one of the two fundamental cognitive components (the second being associations between representations). Well, you can explain huge swaths of stuff with that (which I did, starting at Harvard as a non-degree grad student in 1998-2002).
So I'm a nuts and bolts guy when it comes to the Easy Problems. I own a great book, Doing Without Concepts, by Edouard Machery, which is all about the different proposed solutions to the puzzle of how we instantly tell that A is a desk but B is a table. The notion that switching to an enactivist viewpoint would accomplish more than squat in solving that problem is, to me, laughable. (Machery, BTW, essentially argues that "concept" as used is an incoherent concept (in one sense of the word!).) You explain that by outlining the flow of information among representational neural groups.
And lest it seem that I'm rejecting the notion that we need some kind of Big Idea to make progress in cognitive science, here's one that is actually both much simpler and explains stuff: the brain treats thought as a variety of action, or, more succinctly, cognition is (a form of) action. That's not some vague philosophical principle whose implications are subject to lengthy debate and which spawns collections of broadly ruminative papers; it says that specific neural mechanisms that evolved to control action have been recruited to control thought.
In terms of phenomenology, I've got some good books that try to examine it rigorously, e.g. Barry Dainton's Stream of Consciousness. I've got my own taxonomy of the contents of experience which I think is pretty good, and includes a whole class that I don't think has been previously identified, cognitive state feelings, such as the feeling of being puzzled, confused ... there are tons of them, and they all have very clear functions in terms of meta-cognition. All of that is the product of simple introspection; I think that a lot can be said on this topic that's simple and universal.
The notion that enactivism has anything to do with the Hard Problem is even less tenable. I can argue that it's essentially a physics problem, i.e., that reductive materialism is false, at least four different ways. (So is dualism, of course, and, less obviously, panpsychism; as is often the case with seemingly impossible problems, people are assuming that all the extant brands of solution exhaust the possibilities.)
If you know Chalmers' taxonomy, I'm a Type C Materialist. Searle seems to have become one, and I think Joe Levine is, too. Chalmers' argument that all the sort-of-Type C positions we know of reduce to others of his types is ironclad, but ... duh? If there's an unknown Type C solution, it wouldn't reduce, by definition. And this position is argued for in The Only Book Yet Written About Subjective Experience That Argues a True Thesis.
Post by incandenza on Apr 16, 2021 13:56:59 GMT -5
I had misremembered this slightly (it's been a long time since I read the bat essay!). What I had in mind was his discussion of "objective phenomenology" in the last few paragraphs, but in looking at it again I don't think it's right to assume he was invoking a scientific method here; it actually might be something more like a Husserlian approach, but he's so vague about the idea that it's hard to say. Really, he admits as much, saying we need to think about what we mean by 'subjective' and 'objective,' which, yes, we very much do! And there's this whole phenomenological tradition that does just this...!
In its best versions, enactivism (indeed a component of the embodied cognition paradigm) emphasizes the recursive nature of organism-environment interactions: the organism acts within and toward an environment, but the environment also solicits these actions. I think you're totally right when you say, further down, that thought is a kind of action. But, like all actions and perceptions, it has to be thought of as an interaction. So: consciousness is an interactive process between an organism and its environment.
Haha, well, take it up with the empiricists! Granted, there are probably not a lot of contemporary philosophers who would subscribe to a totally naive empiricism. But the point that is not trivial, I think, is that actions (including acts of perception) construe the environment in terms of interests, affordances, gestalts, whatever - i.e., the "world" is a product of a dynamic interactive process, not something just waiting to be perceived or acted on.
'Representation' is such a fraught term. The criticism of it (coming from a phenomenological perspective) is its heavy association with subject/object dualism. I'm sure there are ways of construing 'representation' that aren't necessarily dependent on this ontology, but let me just nip the semantics in the bud and say that as it has traditionally been used, 'representation' tends to imply this dualism, and that is the real concern. (For instance, it is self-evidently the case that there are correlations between brain activity and mental states, as well as the external world, and if that's all that's meant, then fine. But are these "representations"? Hard to call them that if we are construing consciousness in terms of a model of dynamic interaction as described above.)
Can you be quite certain that a view is "laughable" when you've only just learned that it exists? Again, there's plenty of literature on phenomenology and the cognitive sciences (starting with the eponymous journal; see also the work of Hubert Dreyfus and Sean Dorrance Kelly, for starters). Lots of recent stuff on predictive coding, the hot new thing in theories of mind, too.
Totally agree with you here! Are you familiar with conceptual metaphor theory? IMO it is, in broad strokes, the best paradigm for understanding the nature of abstract thought. It also happens to fit pretty much hand-in-glove with embodied cognition.
Don't know this one, but again I'd really suggest diving into the phenomenological tradition in continental philosophy. Looks like Dainton at least has some passing engagement with Husserl, but that's about it. I think Merleau-Ponty would be especially relevant to the questions you're interested in, but as a major M-P stan I am very biased!
Well, again, it's not a question of "solving" the hard problem, but of getting to a clear understanding of what consciousness is in the first place. That's the value of the concept, and of the broader embodied cognition paradigm.
I would argue it's a philosophical problem, and that physics is based on an ontology that we have to recognize and be clear about if we're even going to understand what the philosophical question is. To wit:
I was not familiar with Chalmers' typology and had to look it up, but the idea appears to be that if we knew "all physical truths about the universe" then it would not be possible that we wouldn't know all facts about a given subject's conscious experience? (Forgive me if this is wrong, it's just based on some quick googling.) But this looks like exactly what I mean by the sort of ideas that are undergirded by an unrecognized ontology, namely, the basic subject/object dichotomy that underlies modern science. According to this ontology, it's possible to stand outside of the world of "facts" as a mere observer - if not for this hidden premise it would be totally incoherent to say that we could, even in theory, know "all physical truths about the universe."
The issue is that all "truths" are situated; there is no truth that is not dependent on some perspective. To even imagine having access to "all physical truths" means forgetting this perspectival nature of truth.
Furthermore, since we are not outside of the universe but participants in it, our very determination of truth itself alters the hypothetical set of all physical truths. (Even modern physics itself entails this - Heisenberg uncertainty principle, the observer effect, and all that - despite the ontological underpinnings of scientific materialism.)
Again, just to be clear, I'm not saying consciousness is spooky spiritual stuff or anything, or even that it's necessarily non-material. (One definition of 'material' could just be 'exists in nature,' and I certainly think consciousness exists in nature.) I'm just saying that the ontology that undergirds scientific materialism cannot suffice to explain it. And ultimately, if you follow the phenomenological rabbit down the hole, I think you end up having to undo that ontology.
Post by voiceofreason on Apr 16, 2021 14:22:17 GMT -5
Not to sound rude or anything, as I have great respect for both you guys' intelligence, but trying to read this and comprehend (or at least keep up), I laughed at a thought that popped into my head.
Donald Sutherland in Animal House... stoned and talking about a solar system in an atom in a fingernail.
Post by incandenza on Apr 16, 2021 14:53:14 GMT -5
Quoting voiceofreason: "Not to sound rude or anything as I have great respect for both of you guys intelligence but. Trying to read this and comprehend or keep up I laughed at a thought that popped into my head. Donald Sutherland in Animal House... stoned and talking about a solar system in an atom in a fingernail."
Vibin' like
Post by ericmvan on Sept 20, 2021 13:45:28 GMT -5
Quoting incandenza: "I'm not saying consciousness is spooky spiritual stuff or anything, or even that it's necessarily non-material. (One definition of 'material' could just be 'exists in nature,' and I certainly think consciousness exists in nature.) I'm just saying that the ontology that undergirds scientific materialism cannot suffice to explain it. And ultimately, if you follow the phenomenological rabbit down the hole, I think you end up having to undo that ontology."
I forgot we were having this conversation until I came here to talk about citrus fruit! I'd say more today if it didn't hurt to type. In random order, and very quickly:
A full understanding of phenomenology is not necessary to solve the Hard Problem. Your solution must include an ontology capable of generating a taxonomy of experiences (of the sort Kriegel gives in The Varieties of Consciousness), but the actual taxonomy is a bonus feature. More in this review.
I'm using "representations" loosely. I used to call them "information constructs" and probably need to invent a term, for the very reasons you cite.
I may be lukewarm to the enactivism thing because my paradigm for the principal neuromodulators (serotonin, etc.) has their purpose as tuning the brain to the environment (both external and cognitive / emotional). E.g., serotonin tunes the brain to the level of novelty and norepinephrine to salience. That we then sometimes act to alter the external environment is interesting (and part of psychology), but ceases to be neuroscience. That the state of our cognitive / emotional environment is continually driving further alterations to it is neuroscience, of course, but it's not enactivism, is it?
I'm really looking forward to Anil Seth's forthcoming book with its prediction-error theory of consciousness, if only because I want to get fully up to speed with the whole predictive-brain idea. I already know that he doesn't have a "science of consciousness" in the Hard Problem sense, and I'm a big enough believer in global workspace theory to be skeptical that he has one in the Easy Problem sense, too. I'm really curious to see how he explains the production of intense emotion from prediction-fulfillment, which of course does happen, e.g., at the end of every good romantic comedy. In general the prediction thing seems to me like another oversold and somewhat "duh!" idea, but I'm open-minded and would actually be excited if there's any unexpected explanatory power there.
I define a materialist solution to the Hard Problem as one where experience is shown to be metaphysically necessary, where we see that p-zombies are actually impossible. This is not incompatible with new physics. The trick is that the new physics of C. has to help complete the standard model of the rest of physics. To use a non-serious example, souls survive death ... and are dark matter! No consciousness, no galaxies as we know them, either. *
No idea for C. that makes it metaphysically necessary has ever been published.
* This is separate from a mathematical physics argument that free will is metaphysically necessary, and hence so is conscious experience (the connection of the latter two being a John Searle insight). That argument is not purely logical; it relies on our sense of what might be credible about reality.