Winning Bet: Consciousness Still a Mystery

In 1998, after a day lecturing at a conference on consciousness, neuroscientist Christof Koch (Allen Institute) and philosopher David Chalmers made a bet.

They were in “a smoky bar in Bremen,” reported Per Snaprud, “and they still had more to say. After a few drinks, Koch suggested a wager. He bet a case of fine wine that within the next 25 years someone would discover a specific signature of consciousness in the brain. Chalmers said it wouldn’t happen, and bet against.”

It has now been 25 years, and Mariana Lenharo, writing in Nature, reports that both of the researchers “agreed publicly on 23 June, at the annual meeting of the Association for the Scientific Study of Consciousness (ASSC) in New York City, that it is still an ongoing quest—and declared Chalmers the winner.”

One thing that helped settle the bet, Lenharo writes, was the recent testing of two different theories about “the neural basis of consciousness”:

Integrated information theory (IIT) and global network workspace theory (GNWT). IIT proposes that consciousness is a ‘structure’ in the brain formed by a specific type of neuronal connectivity that is active for as long as a certain experience, such as looking at an image, is occurring. This structure is thought to be found in the posterior cortex, at the back of the brain. On the other hand, GNWT suggests that consciousness arises when information is broadcast to areas of the brain through an interconnected network. The transmission, according to the theory, happens at the beginning and end of an experience and involves the prefrontal cortex, at the front of the brain.

Six labs tested both of the theories, but the results did not “perfectly match” either of them.

Koch reportedly purchased “a case of fine Portuguese wine” for Chalmers.


The Rigor of Philosophy & the Complexity of the World (guest post)

“Analytic philosophy gradually substitutes an ersatz conception of formalized ‘rigor’ in the stead of the close examination of applicational complexity.”

In the following guest post, Mark Wilson, Distinguished Professor of Philosophy and the History and Philosophy of Science at the University of Pittsburgh, argues that a kind of rigor that helped philosophy serve a valuable role in scientific inquiry has, in a sense, gone wild, tempting philosophers to the fruitless task of trying to understand the world from the armchair.

This is the third in a series of weekly guest posts by different authors at Daily Nous this summer.


[Roy Lichtenstein, “Bull Profile Series”]

The Rigor of Philosophy & the Complexity of the World
by Mark Wilson

In the course of attempting to correlate some recent advances in effective modeling with venerable issues in the philosophy of science in a new book (Imitation of Rigor), I realized that under the banner of “formal metaphysics,” recent analytic philosophy has forgotten many of the motivational considerations that had originally propelled the movement forward. I have also found that like-minded colleagues have been similarly puzzled by this paradoxical developmental arc. The editor of Daily Nous has kindly invited me to sketch my own diagnosis of the factors responsible for this thematic amnesia, in the hopes that these musings might inspire alternative forms of reflective appraisal.

The Promise of Rigor

Let us return to beginnings. Although the late nineteenth century is often characterized as a staid period intellectually, it actually served as a cauldron of radical reconceptualization within science and mathematics, in which familiar subjects became strongly invigorated through the application of unexpected conceptual adjustments. These transformative innovations were often resisted by the dogmatic metaphysicians of the time on the grounds that the innovations allegedly violated sundry a priori strictures with respect to causation, “substance” and mathematical certainty. In defensive response, physicists and mathematicians eventually determined that they could placate the “howls of the Boeotians” (Gauss) if their novel proposals were accommodated within axiomatic frameworks able to fix precisely how their novel notions should be utilized. The unproblematic “implicit definability” provided within these axiomatic containers should then alleviate any a priori doubts with respect to the coherence of the novel conceptualizations. At the same time, these same scientists realized that explicit formulation within an axiomatic framework can also serve as an effective tool for ferreting out the subtle doctrinal transitions that were tacitly responsible for the substantive crises in rigor that had bedeviled the period.

Pursuant to both objectives, in 1894 the physicist Heinrich Hertz attempted to frame a sophisticated axiomatics to mend the disconnected applicational threads that he correctly identified as compromising the effectiveness of classical mechanics in his time. Unlike his logical positivist successors, Hertz did not dismiss terminologies like “force” and “cause” out of hand as corruptly “metaphysical,” but merely suggested that they represent otherwise useful vocabularies that “have accumulated around themselves more relations than can be completely reconciled with one another” (through these penetrating diagnostic insights, Hertz emerges as the central figure within my book). As long as “force” and “cause” remain encrusted with divergent proclivities of this unacknowledged character, methodological strictures naively founded upon the armchair “intuitions” that we immediately associate with these words are likely to discourage the application of more helpful forms of conceptual innovation through their comparative unfamiliarity.

There is no doubt that parallel developments within symbolic logic sharpened these initial axiomatic inclinations in vital ways that have significantly clarified a wide range of murky conceptual issues within both mathematics and physics. However, as frequently happens with an admired tool, the value of a proposed axiomatization depends entirely upon the skills and insights of the workers who employ it. A superficially formalized housing in itself guarantees nothing. Indeed, the annals of pseudo-science are profusely populated with self-proclaimed geniuses who fancy that they can easily “out-Newton Newton” simply by costuming their ill-considered proposals within the haberdashery of axiomatic presentation (cf. Martin Gardner’s delightful Fads and Fallacies in the Name of Science).

Inspired by Hertz and Hilbert, the logical empiricists subsequently decided that the inherent confusions of metaphysical thought could be eliminated once and for all by demanding that any acceptable parcel of scientific theorizing must eventually submit to “regimentation” (Quine’s term) within a first-order logical framework, possibly supplemented with a few additional varieties of causal or modal appeal. As just noted, Hertz himself did not regard “force” as inherently “metaphysical” in this manner, but merely held that it comprised a potentially misleading source of intuitions to rely upon in attempting to augur the methodological requirements of an advancing science.

Theory T Syndrome

Over analytic philosophy’s subsequent career, these logical empiricist expectations with respect to axiomatic regimentation gradually solidified into an agglomeration of strictures upon acceptable conceptualization that have allowed philosophers to criticize rival points of view as “unscientific” through their failure to conform to favored patterns of explanatory regimentation. I have labelled these logistical predilections the “Theory T syndrome” in other writings.

A canonical illustration is provided by the methodological gauntlet that Donald Davidson thrusts before his opponents in “Actions, Reasons, and Causes”:

One way we can explain an event is by placing it in the context of its cause; cause and effect form the sort of pattern that explains the effect, in a sense of “explain” that we understand as well as any. If reason and action illustrate a different pattern of explanation, that pattern must be identified.

In my estimation, this passage supplies a classic illustration of Theory T-inspired certitude. In fact, a Hertz-like survey of mechanical practice reveals many natural applications of the term “cause” that fail to conform to Davidson’s methodological reprimands.

As a result, “regimented theory” presumptions of a confident “Theory T” character equip such critics with a formalist reentry ticket that allows armchair speculation to creep back into the philosophical arena with sparse attention to the real-life complexities of effective concept employment. Once again we witness the same dependencies upon a limited range of potentially misleading examples (“Johnny’s baseball caused the window to break”), rather than vigorous attempts to unravel the entangled puzzlements that naturally attach to a confusing word like “cause,” occasioned by the same developmental processes that make “force” gather a good deal of moss as it rolls forward through its various modes of practical application. Imitation of Rigor attempts to identify some of the attendant vegetation that likewise attaches to “cause” in a bit more detail.

In this way, a methodological tactic (axiomatic encapsulation) that was originally championed in the spirit of encouraging conceptual diversity eventually develops into a schema that favors methodological complacency with respect to the real-life issues of productive concept formation. In doing so, analytic philosophy gradually substitutes an ersatz conception of formalized “rigor” in the stead of the close examination of applicational complexity that distinguishes Hertz’ original investigation of “force”’s puzzling behaviors (an enterprise that I regard as a paragon of philosophical “rigor” operating at its diagnostic best). Such is the lesson from developmental history that I attempted to distill within Imitation of Rigor (whose contents have been ably summarized within a recent review by Katherine Brading in Notre Dame Philosophical Reviews).

But Davidson and Quine scarcely qualified as warm friends of metaphysical endeavor. The modern adherents of “formal metaphysics” have continued to embrace most of their “Theory T” structural expectations while simultaneously rejecting positivist doubts with respect to the conceptual unacceptability of the vocabularies that we naturally employ when we wonder about how the actual composition of the external world relates to the claims that we make about it. I agree that such questions represent legitimate forms of intellectual concern, but their investigation demands a close study of the variegated conceptual instruments that we actually employ within productive science. But “formal metaphysics” typically eschews the spadework required and rests its conclusions upon Theory T-inspired portraits of scientific method.

Indeed, writers such as David Lewis and Ted Sider commonly defend their formal proposals as simply “theories within metaphysics” that organize their favored armchair intuitions in a manner in which temporary infelicities can always be pardoned as useful “idealizations” in the same provisional manner in which classical physics allegedly justifies its temporary appeals to “point masses” (another faulty dictum with respect to actual practice in my opinion).

Philosophy’s Prophetic Telescope

These “Theory T” considerations alone can’t fully explicate the unabashed return to armchair speculation that is characteristic of contemporary effort within “formal metaphysics.” I have subsequently wondered whether an additional factor doesn’t trace to the particular constellation of doctrines that emerged within Hilary Putnam’s writings on “scientific realism” in the 1965-1975 period. Several supplementary themes there coalesce in an unfortunate manner.

(1) If a scientific practice has managed to obtain a non-trivial measure of practical capacity, there must be underlying externalist reasons that support these practices, in the same way that external considerations of environment and canvassing strategy help explicate why honey bees collect pollen in the patterns that they do. (This observation is sometimes called Putnam’s “no miracles argument”).

(2) Richard Boyd subsequently supplemented (1) with the restrictive dictum, which Putnam accepted, that “the terms in a mature scientific theory typically refer,” a developmental claim that strikes me as factually incorrect and as supportive of the “natural kinds” doctrines that we should likewise eschew as descriptively inaccurate.

(3) Putnam further aligned his semantic themes with Saul Kripke’s contemporaneous doctrines with respect to modal logic, which eventually led to the strong presumption that the “natural kinds” that science will eventually reveal will also carry with them enough “hyperintensional” ingredients to ensure that these future terminologies will find themselves able to reach coherently into whatever “possible worlds” become codified within any ultimate enclosing Theory T (whatever it may prove to be like otherwise). This predictive postulate allows present-day metaphysicians to confidently formulate their structural conclusions with little anxiety that their armchair-inspired proposals run substantive risk of becoming overturned in the scientific future.

Now I regard myself as a “scientific realist” in the vein of (1), but firmly believe that the complexities of real-life scientific development should dissuade us from embracing Boyd’s simplistic prophecies with respect to the syntactic arrangements to be anticipated within any future science. Direct inspection shows that worthy forms of descriptive endeavor often derive their utilities from more sophisticated forms of data registration than thesis (2) presumes. I have recently investigated the environmental and strategic considerations that provide classical optics with its astonishing range of predictive and instrumental successes, but the true story of why the word “frequency” functions as such a useful term within these applications demands a far more complicated and nuanced “referential” account than any simple “‘frequency’ refers to X” slogan adequately captures (the same criticism applies to “structural realism” and allied doctrines).

Recent developments within so-called “multiscalar modeling” have likewise demonstrated how the bundle of seemingly “divergent relations” connected with the notion of classical “force” can be more effectively managed by embedding these localized techniques within a more capacious conceptual architecture than Theory T axiomatics anticipates. These modern tactics provide fresh exemplars of novel reconceptualizations in the spirit of the innovations that had originally impressed our philosopher/scientist forebears (Imitation of Rigor examines some of these new techniques in greater detail). I conclude that “maturity” in a science needn’t eventuate in simplistic word-to-world ties but often arrives at more complex varieties of semantic arrangement whose strategic underpinnings can usually be decoded after a considerable expenditure of straightforward scientific examination.

In any case, Putnam’s three supplementary theses, taken in conjunction with the expectations of standard “Theory T” thinking, outfit armchair philosophy with a prophetic telescope that allows it to peer into a hypothesized future in which all of the irritating complexities of renormalization, asymptotics and cross-scalar homogenization will have happily vanished from view, having appeared along the way only as evanescent “Galilean idealizations” of little metaphysical import. These futuristic presumptions have convinced contemporary metaphysicians that detailed diagnoses of the sort that Hertz provided can be dismissed with an airy wave of the hand: “The complications to which you point properly belong to epistemology or the philosophy of language, whereas we are only interested in the account of worldly structure that science will eventually reach in the fullness of time.”

Science from the Armchair

Through such tropisms of lofty dismissal, the accumulations of doctrine outlined in this note have facilitated a surprising reversion to armchair demands that closely resemble the constrictive requirements on viable conceptualization against which our historical forebears had originally rebelled. As a result, contemporary discussion within “metaphysics” once again finds itself flooded with a host of extraneous demands upon science with respect to “grounding,” “the best systems account of laws” and much else that doesn’t arise from the direct inspection of practice in Hertz’ admirable manner. As we noted, the scientific community of his time was greatly impressed by the realization that “fresh eyes” can be opened upon a familiar subject (such as Euclidean geometry) through the exploration of alternative sets of conceptual primitives and the manner in which unusual “extension element” supplements can forge unanticipated bridges between topics that had previously seemed disconnected. But I find little acknowledgement of these important tactical considerations within the current literature on “grounding.”

From my own perspective, I have been particularly troubled by the fact that the writers responsible for these revitalized metaphysical endeavors frequently appeal offhandedly to “the models of classical physics” without providing any cogent identification of the axiomatic body that allegedly carves out these “models.” I believe that they have unwisely presumed that “Newtonian physics” must surely exemplify some unspecified but exemplary “Theory T” that can generically illuminate, in spite of its de facto descriptive inadequacies, all of the central metaphysical morals that any future “fundamental physics” will surely instantiate. Through this unfounded confidence in their “classical intuitions,” they ignore Hertz’ warnings with respect to tricky words that “have accumulated around [themselves] more relations than can be completely reconciled amongst themselves.” But if we lose sight of Hertz’ diagnostic cautions, we are likely to return to the venerable realm of armchair expectations that might have likewise appealed to a Robert Boyle or a St. Thomas Aquinas.


Discussion welcome.

 


