Hamza Tzortzis section:
Very nice presentation, and I can find agreement with much of it. I do find a problem with the idea that theism explains ‘where consciousness comes from’, since God cannot be said to exist prior to God’s consciousness. Can God be unconscious? If not, can we say for sure that consciousness does not create God instead of the other way around?
This goes along with the sentiment expressed earlier in examining the shortcomings of panpsychism, when he asks ‘what is thought without a thinker?’ While this is certainly difficult to imagine in a normal state of mind, I have experienced dream states in which there was a dream, but it was more like a film, and I woke up specifically noticing that I did not find myself in the movie or in the audience. The experience of the movie simply was. In light of this, and the fact that human beings occupy such a minute and idiosyncratic position within the biosphere of this planet, I would not rule out the possibility that our waking experience of human ego might be limited to animals or certain animals, owing to their autonomy of movement. The vast majority of phenomena in the universe may indeed be felt experiences without an experiencer per se.
Even if this is true, and the universe is composed of consciousness (awareness, or to be more technical, aesthetic participation/sensory-motive re-acquaintance) – even proprietary qualities of awareness without an overriding executive – there could still be God-like influences within or through the larger potential of our human consciousness. Retrocausality, synchronicity, intuition, and other exotic metaphenomenal conditions may indeed flirt with the boundary between reality and surreality to produce veridical insights and delusional obsessions alike. God may be no more than a figment of consciousness, just as we are, but that doesn’t mean that human consciousness does not include some kind of meta-human guidance for some people at some times, although such guidance may be indistinguishable from mental illness.
Professor Simons section:
Opens well, setting the stage with criteria of success for explanation in science. I hadn’t heard the origin of the word melancholia (black bile) before. He goes on to look at intentionality and awareness and asserts that there is nothing that it is like to be a stone. I agree superficially, that what we see of a stone does not express any intention, but given the vastly different scale of time between ourselves and geological time, it is conceivable that what we encounter as static minerals are, within another frame of reference, some unfamiliar kind of experience. Rocks don’t have feelings, but they may *be* feelings – slow, or intermittent feelings which perhaps only awaken when there is a significant change in physical state. When rocks collide, something may feel something, even if it is not the rock itself as we experience it.
I like that he embraces being honest about the shortcomings of science in explaining phenomenality (the Hard Problem). His examples of evolved body parts which have been repurposed could apply to consciousness in theory, but I submit that would have to be a very superficial theory that overlooks the completely anti-physical nature of consciousness. Unlike an ear, awareness is not a plausible feature of some unrelated physical system.
I appreciate Professor Simons’ call for modesty in consideration of supernatural explanation, but I’m not sure that alone constitutes a refutation. It may be more reckless to insist upon a naturalistic explanation to the exclusion of all explanations which happen to transcend the spatiotemporal aspects of nature.
It seems to me that rather than presuming God as a literal being, we should try thinking of the various theological concepts as a metaphor for being itself – for consciousness. We can learn about consciousness by our own intuitive accounts, written in the native mytho-poetic language of what I call the ‘metaphenomenal’ layer of awareness. God is something like the local, human-shaped shadow of the infinite potential of the future of consciousness. By itself, the universe is only theological and meaningful on the inside. It is up to consciousness, to conscious beings, to imbue the exterior world with divine shades of care and attention. God is not for being, but for becoming.
I liked Simons’ parting comments about Spinoza and dual-aspect theory. He says (as most would agree) that consciousness is a process, not a thing. I suggest that consciousness is neither; rather, it is the eternal firmament from which processes and things are ‘carved out’ (by time and space… types of entropy or reduced sense). Locally, our human experience is very elaborately enfolded multiple times into spacetime, so that it does actually take on nested process-like characteristics as well. Just as the entire life of our body is the story of a single cell in self-replication/modification, the story of our lives is a single moment stretched out into innumerable sub-moments.
Both critiques of panpsychism, I think, are ultimately rejections of a straw man. Nobody, Leibniz included, I’m sure, thinks that electrons ‘have’ human-like consciousness, but that is completely beside the point, IMO, in considering whether the fabric of nature is more likely sensitive vs. physical, information-theoretic, or theological. In those terms, I find it easy to account for physical, informational, and spiritual phenomena as elaborations of sensory experience, but I find no real way to justify the existence of any of the others in the absence of sense.
I have called science a ‘performance enhancing philosophy’, and like philosophy, it is biased against subjectivity from the start in that all formal writing is publicly directed. When we are doing science or philosophy, we are automatically put into the perspective of being a generic ‘one’ who says this or demonstrates that. The vocabulary and cadence implicitly evoke a style which would be equally well suited for a Classical Greek oration as a 17th century treatise. In all cases, the author is conscious of themselves as a person according to society’s most official protocols. They write as a potentially esteemed public person appealing to other esteemed public persons, and invite them to consider propositions and conclusions which can be esteemed publicly.
This may not be doing us any favors when it comes to considering consciousness itself. With our intimate personal contents neatly tucked away behind dramatic flourishes of prose and persuasion, our impressionable minds soon forget that we could be anything else but fine upstanding members of the human zoo. We speak as if we were one-of-many rather than a unique and unrepeatable image of eternity.
The interior view, which was so much more prominent when we were just waking up this morning, and which we can barely remember surrounding us as young children, now becomes completely transparent to the protocols and politics of the public view. To do philosophy or science, we need not even objectify subjectivity any further than it already has been, because we are already standing three yards behind the toy model of ourselves which has been dressed up for the occasion in a robe, toga, or lab-coat. Suddenly the product of insight and reason alone is not enough to survive the marketplace of ideas. It must be groomed and packaged and toilet trained out of its native poetry in order to fit in with the customers’ expectations. In a way, philosophy seems to compensate for its inability to get out of its own way in considering subjectivity fairly by taking itself too seriously. We wear the disguise of formalism that keeps us pointing to a picture of a mirror rather than taking a look at our own reflection.
While religion differs from both philosophy and science in that it projects a subjective significance onto the public world, both philosophy and science owe their serious demeanor to religion. Ritual and ceremony are public interactions in which the event is made to signify itself – to represent self-consciousness socially as a performance of particulars.
The aesthetics of temples and cathedrals are monumentally pompous, as are those of elite universities, and for good reason:
pomp (n.) c.1300, from Old French pompe “pomp, magnificence” (13c.) and directly from Latin pompa “procession, pomp,” from Greek pompe “solemn procession, display,” literally “a sending,” from pempein “to send.” In Church Latin, used in deprecatory sense for “worldly display, vain show.”
The paradox of religion is that in order to send the message of spiritual other-worldliness into the world, it succeeds in direct proportion to its hypocrisy. The more magnificent its public image, the more popular the religion tends to be, and the more quiet contemplation of private depths becomes a choreographed sporting event tied to military conquest and political control.
Can this state of affairs be improved, even on this internet where time and worldliness are switched on or off at will, or is the public perspective perpetually predisposed to pomposity?
Once upon a time, the belief in witchcraft was about as common as the belief in using soap.
…how pervasive was belief in witchcraft in early modern England? Did most people find it necessary to use talismans to ward off the evil attacks of witches? Was witchcraft seen as a serious problem that needed to be addressed? The short answer is that belief in witchcraft survived well into the modern era, and both ecclesiastical and secular authorities saw it as an issue that needed to be addressed.
In 1486, the first significant treatise on witchcraft, evil, and bewitchment, Malleus Maleficarum, appeared in continental Europe. A long 98 years later, the text was translated into English and quickly ran through numerous editions. It was the first time that religious and secular authorities admitted that magic, witchcraft, and superstition were, indeed, real; it was also a simple means of defining and identifying people who performed actions seen as anti-social or deviant.
Few doubted that witches existed, and none doubted that being a witch was a punishable offense. But through the early modern era, witchcraft was considered a normal, natural aspect of daily life, an easy way for people, especially the less educated, to explain events in the confusing world around them. – (source)
In many parts of the world, the belief in witchcraft is still very common.
“As might be expected, the older and less educated respondents reported higher belief in witchcraft, but interestingly such belief was inversely linked to happiness. Those who believe in witchcraft rated their lives significantly less satisfying than those who did not.
One likely explanation is that those who believe in witchcraft feel they have less control over their own lives. People who believe in witchcraft often feel victimized by supernatural forces, for example, attributing accidents or disease to evil sorcery instead of randomness or naturalistic causes.” (source)
Another poll on beliefs in the U.S.:
What People Do and Do Not Believe in
Many more people believe in miracles, angels, hell and the devil than in Darwin’s theory of evolution; almost a quarter of adults believe in witches
New York, N.Y. — December 15, 2009 —
A new Harris Poll finds that the great majority (82%) of American adults believe in God, exactly the same number as in two earlier Harris Polls in 2005 and 2007. Large majorities also believe in miracles (76%), heaven (75%), that Jesus is God or the Son of God (73%), in angels (72%), the survival of the soul after death (71%), and in the resurrection of Jesus (70%). Less than half (45%) of adults believe in Darwin’s theory of evolution, but this is more than the 40% who believe in creationism. These are some of the results of The Harris Poll of 2,303 adults surveyed online between November 2 and 11, 2009 by Harris Interactive. The survey also finds that: 61% of adults believe in hell; 61% believe in the virgin birth (Jesus born of Mary); 60% believe in the devil; 42% believe in ghosts; 32% believe in UFOs; 26% believe in astrology; 23% believe in witches; 20% believe in reincarnation. (source)
Because there are so many benefits associated with freedom from superstition, it is not much of a tradeoff emotionally to go from a world of mystical phantoms to one of scientific clarity. Many people may never get the opportunity to be exposed to scientific knowledge in the right way, at the right time in their lives, but it seems that those who do seize that opportunity are happy with their decision. Of course, not everyone makes a decision to block out all religious or spiritual beliefs when they accept scientific truths, and even though there are probably more people alive today who believe in sorcery than there were in 1600, there are more people now who also believe in germs, powered flight, and heliocentric astronomy.
There is little argument that scientific knowledge and its use as a prophylactic against the rampant spread of superstition is a ‘good thing’. Is it possible though, to have too much of a good thing? Is there a limit to how much we should insist upon determinism and probability to explain everything?
The City Dark is a recent documentary about the subtle and not-so-subtle effects of light pollution. Besides new and unacknowledged health dangers from changed sleeping habits in people and ecological upheaval in other species, the show makes the case that the inescapable blur of light which obscures our view of the night sky is quietly changing our view of our own lives. The quiet importance of the vast heavens in setting our expectations and limiting the scale of our ego has been increasingly dissolved into a haze of metal halide. In just over a century, night-time illumination has gone from a simple extension of visibility into the evening, into a 24 hour saturation coverage of uninhabited parking lots, residential neighborhoods, and office buildings. The connection between the power to see, do, and know, is embodied literally in our history as the Enlightenment, Industrial Age, and Information Age.
The 20th century was the cusp of the Industrial and Information Ages, beginning with Edison and Einstein redefining electricity, light, and energy, peaking with the midcentury Atomic Age when radiation became a household word and microwave ovens began to cook with invisible light rather than heat. Television became an artificial light source which we used not only as a silent companion with which to see the world, but as a kind of hypnotic signal emitter which we stare directly into – the home version of that earlier invention which came into its own in the 20th century, the motion picture. The century which tracked the spread of electricity and light from urban centers to the suburbs ended with the internet and mobile phones bringing CRT, LED, and LCD light into our personal space. Where once electronic devices were confined to living rooms and cars, we are now surrounded by tiny illuminated dots and numbers, and a satellite connection is hardly ever out of arm’s reach.
A Life Sentence
In Discipline and Punish, Foucault details the history of the prison and the rise of disciplinary culture in Europe as it spread from monasteries through hospitals, the military, police, schools, and industry. He discusses how the concept of justice evolved from the whim of the king to torture and publicly execute whomever he pleased, to kangaroo courts of simulated justice, to the modern expectation of impartiality and evidence in determining guilt.
The shift of punishment style from dismemberment to imprisonment reflected the change in focus from the body to the mind. The Reformation gave Western Europe a taste of irreverence and self-determination; at the same time, the monastic lifestyle was adopted throughout pre-Modernity. To be a hospital patient, student, soldier, prisoner, or factory laborer was to enter a world of strict regulation, immaculate uniforms, and constant inspection. Inspection is a central theme which Foucault examines. He describes how an obsessive regimen of meticulous inspection, monitoring, and standardized testing reached its ultimate expression in the panopticon architecture. Through this central-eye floor plan, the population is exposed and personally vulnerable while the administration retains the option to remain concealed and anonymous.
Tying these themes of inspection, enlightenment, and illumination together with witchcraft is the concept of evidence. What could be more scientific than evidence?
late 14c., from Old French evident and directly from Latin evidentem (nominative evidens) “perceptible, clear, obvious, apparent” from ex- “fully, out of” (see ex-) + videntem (nominative videns), present participle of videre “to see” (see vision)
The Salem Witch Trials famously victimized those who were targeted as witches by subjecting them to what seem to us now as ludicrous tests. This gives us a good picture of the transition from pre-scientific to scientific practices in society. This adolescent point between the two reveals a budding need to rationalize harsh punishments intellectually, but not enough to prevent childish impatience and blame from running the show.
The idea of circumstantial evidence – evidence which is only coincidentally related to a crime, marks a shift in thinking which is echoed in the rise of the scientific method. As the mindset of those in power became more modern, the validity of all forms of intuition and supernatural sources came into question. Where once witchcraft and spirits were taken seriously, now there was a radical correction. It was belief in the supernatural which was revealed to be obsolete and suspicious. The default position had changed from one which assumed spirits and omens to one which assumed coincidence, exaggeration, and mistaken impressions. Beyond even the notion of innocent until proven guilty, it was the notion that proof mattered in the first place which was the Enlightenment’s gift to the cause of human liberation.
Few would deny that this new dis-belief system which brought us out of savagery is a good thing, but also, as Foucault intimates, we cannot assume that it is all good. Is incarceration really the humane and effective way of discouraging crime that we would like to think, or is it a largely hypocritical enactment of a fetish for control? Does the desire to predict and control lead to an insatiable desire to dictate to and invade others?
There have been many exposés of psychics and mediums over the years, in which stage magicians and others have run down the kinds of tricks that can be used to gather unexpected intelligence from an audience and use it to fool them. The cold reading is a way of cheating a mark into thinking that the psychic has supernatural powers, when in fact an assistant has looked through their purse earlier.
Ironically, these techniques are the same techniques used in science, except that they are intended to reveal the truth rather than instigate a fraud. Statistical analyses and reductive elimination are key aspects of the scientific method, giving illumination to hidden processes. In neuroscience, for instance, an fMRI is not really telling us how a person thinks or feels; rather, the physiological changes that we can measure are used to produce a kind of cold reading of the subject’s experience, based on our own familiarity with our personal experience.
This is all fantastic stuff, of course, but there seems to be a point where the method of logical inference from evidence crosses over into its own kind of pathology. The etymology of superstition talks about “prophecy, soothsaying, excessive fear of the gods”. The suffix ‘-stition’ is from the same root as ‘-standing’ in understanding. There is a sense of the mind compulsively over-reaching for explanations, jumping to conclusions, and rendered stupid by naivete.
The converse pathology does not have a popular name like that, although people use the word pseudoskeptical to emphasize a passionately prejudiced attitude toward the unproved rather than a scientifically impartial stance. The neologism I am using here, hypostition, puts the emphasis on the technical malfunction of the scientific impulse run amok. Where superstition is naive, hypostition is cynical. Where superstition jumps to conclusions, hypostition resists any conclusion, no matter how clear and compelling, in which the expectations of the status quo are called into question.
Tests in Life
Much of what is meant by witchcraft can be boiled down to an effort to access secret knowledge and power. The witch uses divination to receive guidance and prophecy intuitively, often by studying patterns of coincidence and invoking a private intention to find its way to a public expression. Superstition swims in the same waters, reading into coincidence and projecting one’s own furtive impulses outwardly. Beyond that, we are talking about herbal medicine, folk psychology, and rituals mythologizing nature.
The goal of science and technology is similarly an effort to extract knowledge and power from nature, but to do so without falling into the trap of magical thinking. Instead of making a pact with occult forces, the scientist openly experiments to expose nature. Along the way, there are often lucky coincidences which lead to breakthroughs, and challenges which seem tailor-made to derail the work. These trials and tribulations, however, are not supported by science. If we adopt the hypostitious frame of mind, there can be no narrative to our experience, no fortunate people, places, or times, beyond the allowable margins of chance.
We have come full circle on coincidence, where we are obliged to doubt even the most life-altering synchronicity as mere statistical inevitability.
In place of superstition we have neuroses. Our triumph over the fear of the unknown has become an insidious phobia of the known. Even to recognize this would be to admit some kind of narrative pattern in human history. Recognition of such a pattern is discouraged. The tests which we face are not allowed to make that kind of sense, unless it can be justified by the presence of a chemical in the body, or a behavior in another species.
A Way Out
For me, the recognition of the two poles of superstition and hypostition is enough to realize that the way forward is to avoid the extremes most of the time. Intuition and engineering both have their place, and the key is not to always try to squeeze one into the other. At this point, the world seems to be nightmarishly extreme in both directions at the same time, but maybe it has always seemed that way?
The challenge I suppose is to try to find a way to escape each other’s insanity, or to contribute in some way toward improving what we can’t escape from. With some effort and luck, our fear of the dark and insensitivity to the light might be transformed into a full range of perception. Nah, probably not.
It occurs to me that it might be easier to explain my view of consciousness and its relation to physics if I begin at the beginning. In this case, I think that the beginning was in asking ‘What if the fundamental principle in the universe were a simple form of awareness rather than something else?’
Our choices in tracing the lineage of consciousness back seem to be limited. Either it ’emerged’ from complexity, at some arbitrary stage of biological evolution, or its complexity evolved without emergence, as an elaboration of a simple foundational panpsychic property.
In considering which of these two is more likely, I suggest that we first consider the odd, unfamiliar option. The phenomenon of contrast is a good place to start in characterizing the theme of awareness. Absolute contrasts are especially compelling: full and empty, black and white, hot and cold, etc. Our language is replete with evidence of this binary hyperbole. Not only does it seem necessary for communication, but there seems also to be an artistic satisfaction in making opposites as robust as possible. Famously, this tendency for exaggeration clouds our thinking with prejudice, but it also clarifies and makes distinctions more understandable. In politics, mathematics, science, philosophy, and theology, a balance of opposites can be found at the heart of the essential concepts.
For this reason alone, I think that we can say with certainty that consciousness has to do with a discernment of contrasts. Beneath the linguistic and conceptual embodiments of absolute contrasts are the more zoological contrasting pairs – hungry and full, alive and dead, tired and alert, sick and healthy, etc. At this point we should ask, is consciousness complex or is it simple? Is the difference between pain and pleasure something that should require billions of cellular interactions over billions of years of evolution to arrive at accidentally, or does that seem like something which is so simple and primordial that nothing could ever ‘arrive’ at it?
Repetition is a special form of contrast, because whether it is an event which repeats cyclically through a sequence or a form which repeats spatially across a pattern, the underlying nature of what repeats is that it is in some sense identical or similar, and in another sense not precisely identical as it can be located in memory or position as a separate instance.
I use the phrase “repeats cyclically through a sequence” instead of “repeats sequentially through time” because if we take our beginning premise of simple qualities and capacities of awareness as preceding even physics, then the idea of time should be grounded in experience rather than an abstract metric. Instead of conceiving of time as a dimension in which events are contained, we must begin with the capacity of events to ‘know’ each other or in some way retain their continuity while allowing discontinuity. An event which repeats, such as a heartbeat or the circadian rhythms of sunlight, is fundamentally a rhythm or cycle. That is the actual sense experience. Regular, frequent, variation. Modulation of regularity.
Likewise, I use the phrase “repeats spatially across a pattern” instead of “repeats as a pattern across space” because again, we must flip the expectation of physics if we are to remain consistent to the premise of sense-first. What we see is not objects in space, it is shapes separated by contrasting negative shapes. What we can touch are solids, liquids, and gases separated from each other by contrasting sense of their densities. Here too, the sense of opposites dominates, separating the substantial from the insubstantial, heavy from light, hard from soft.
An important point to make here is that we are adapted, as human beings with bodies of a particular density and size, to feel the world that relates appropriately to our body. It is only through hard lessons like plague and radiation that we have learned that indeed things which are too small for us to see or feel can destroy our bodies and kill us. The terror of this fact has inspired science to pursue knowledge with an aggressive urgency, and justifiably so. Scientists are heroes, informing medicine, transportation, public safety, etc., as never before in the history of the world, and inspiring a fantastic curiosity for knowledge about reality rather than ideas about God or songs about love. The trauma of that shattering of naive realism haunts our culture as a whole, and has echoes in the lives of each generation, family, and individual. Innocence lost. The response to this trauma varies, but it is hard to remain neutral about. People either adapt to the cold hard world beyond themselves with fear or with anger. It’s an extension of self-consciousness which seems uniquely human and often associated with mortality. I think that it’s more than confronting their own death that freaks out the humans, it’s the chasm of unknowable impotence which frames our entire experience on all sides. We know that we don’t really know.
The human agenda becomes not merely survival and reproduction, but also to fill the existential chasm with answers, or failing answers, to at least feel fulfilled with dramatic feelings – with entertainments, achievements, and discoveries. We want something thrilling and significant to compensate for our now unforgettable discovery of our own insignificance. With modernism came a kind of Stockholm syndrome turn. We learned how to embrace the chasm, or at least to behave that way.
At the same time that Einstein began to call the entire foundation of our assumptions about physics into question, the philosophy of Nietzsche, along with the science of Darwin and Freud, had begun to sink in politically. Revolutions from both the Left and Right rocked the world, followed in some nations by totalitarianism and total war. The arts were transformed by an unprecedented radicalism as well, from Duchamp, Picasso, and Malevich to Stravinsky and Le Corbusier. After all of the pageantry and tradition, all of the stifling politeness and patriarchy, suddenly Westerners stopped giving a shit about the past. All at once, the azimuth of the collective psyche pitched Westward all the way, toward annihilation in a glorious future. If humans could not live forever, then we will become part of whatever does live forever. The human agenda went transhuman, and everyone became their own philosophical free agent. God was indeed dead. For a while. But the body lives on.
The point of this detour was to underscore the importance of what we are in the world – the size and density of our body, to what we think that the world is. Not only do we only perceive a narrow range of frequencies of light and sound, but also of events. Events which are too slow or too fast for us to perceive as events are perceived as permanent conditions. What we experience exists as a perceptual relativity between these two absolutes. Like the speed of light, c, perception has aesthetic boundaries. Realism is personal, but it is more than personal also. We find agreement in other people and in other creatures which we can relate to. Anything which has a face earns a certain empathy and esteem. Anything that we can eat has a significance to us. Sometimes the two overlap, which gives us something to think about. Consciousness, at least the consciousness which is directed outwardly from our body, is all about these kinds of judgment calls or bets. We are betting that animals that we eat are not as significant as we are, so we enjoy eating them, or we are betting that such a thought is immoral so we abstain. Society reflects back these judgments and amplifies them through language, customs, belief systems, and laws. Since the modernist revolution, the media has blanketed the social landscape with mass production of cliches and dramatizations, which seems to have wound up leaking a mixture of vanity and schadenfreude, with endless reenactments, sequels, and series.
It is out of this bubble of reflected self-deflection that the current philosophies rooted in both reductionism and emergentism find their appeal. Beginning with the assumption of mechanism or functionalism as the universal principle, the task of understanding our own consciousness becomes a strictly empirical occupation. Though the daunting complexity of neuroscience cannot be overstated, the idea is that it is inevitable that we eventually uncover the methods and means by which data takes on its fancy experiential forms. The psyche can only be a kind of evolutionary bag of tricks which has developed to serve the agenda of biological repetition. Color, flavor, sound, as well as philosophy and science are all social peacock displays and data-compressing virtual appendages. The show of significance is an illusion, an Eloi veneer of aesthetics over the Morlock machinations of pure function.
To see oneself as a community of insignificance in which an illusion of significance is invested is a win-win for the postmodern ego. We get to claim arbitrary superiority over all previous incarnations, while at the same time claiming absolute humility. It’s a calculated position, and like a game theory simulation, it aims to minimize vulnerability. Facts are immutable and real, experiences are irrelevant. From this voyeuristic vantage point, the holder of mechanist views about free will is free to deny that he has it without noticing the contradiction. The emergent consciousness can speak glowingly out of both sides of its mouth of its great knowledge and understanding in which all knowledge and understanding is rendered void by statistical mechanics. Indeed the position offers no choice, having backed itself into a corner, but to saw off its own limbs with one hand and reattach them with another when it is not looking.
What is gained from this exercise in futility beyond the comfort that comes with conformity to academic consensus is the sense that whatever happens, it can be justified with randomness or determinism. The chasm has been tamed, not by filling it in or denying it, but by deciding that we are simply not present in the way that we think. DNA acts, neurons fire, therefore we are not thinking. Death is no different than life which has paused indefinitely. An interesting side effect is that as people are reduced to emergent machines, machines are elevated to sentient beings, and the circle is complete. We are not, but our products are. It seems to me the very embodiment of suburban neuroses. The vicarious society of invisible drones.
Just as 20th century physics exploded the atom, I would like to see 21st century physics explode the machine. Instead of releasing raw energy and fragmentation, I see that the blasting open of mathematical assumptions will yield an implosion into meaning. Pattern recognition, not information, is the true source of authenticity and significance. They are the same thing ultimately. The authenticity of significance and the significance of authenticity speak to origination and individuation over repetition. Not contrast and dialectic, not forces and fields, but the sense in which all of these facets are yoked together. Sense is the meta-syzygy. It is the capacity to focus multiplicity into unity (as in perception or afference) and the capacity for unity to project into multiplicity (participation or efference).
These are only metaphorical descriptions of function however. What sense really is and what it does can only be experienced directly. You make sense because everything makes sense…in some sense. That doesn’t happen by accident. It doesn’t mean there has to be a human-like deity presiding over all of it, to the contrary, only half of what we can experience makes sense intentionally, the other half (or slightly less) makes sense unintentionally, as a consequence of larger and smaller sequences which have been set in motion intentionally. We are the evidence. Sense is evident to us and there is nothing which can be evident except through sense and sense making.
The difference between intelligence and wisdom (aside from rolling up a D&D character) parallels the distinctions which have been dividing philosophy of mind from the beginning. Intelligence implies a cognitive ability in a technical and literal sense – a talent for understanding factual relations which apply to the public world. The products of intelligence are transformative, but famously amoral. Frankenstein and 2001’s HAL both embody our fear of the monstrous side of technology, of intelligence ‘run amok’ with hubris. This is a rich vein for science fiction. Beings built from the outside in – an inhuman mind from inanimate substance. Zombies, killer robots, aliens. Giant insects or weaponized planetoids. In all cases the impersonal, mechanistic side of consciousness is out of proportion and humanity is dwarfed or under-signified.
Intelligence is supposed to be impersonal and mechanistic though. Its facts and figures are not supposed to be local to human experience. The sophisticated view which developed through Western intelligence not only does not require us to value human subjectivity; it insists, to the contrary, that all human awareness is a contamination of the pristine reality of factual evidence – objects which simply are ‘as they are’ rather than merely ‘seem to be’. All human awareness, that is, except for the reasoning which advances science itself.
Wisdom, while overlapping with intelligence as a set of cognitive talents and skills, is not as clear cut as intelligence. Wisdom does not yield the kind of public results which intelligence is intended to produce, because wisdom is not focused on public objects but on private experiences. Both intelligence and wisdom attempt to step back from the local phenomenal world to seek deeper patterns, but intelligence seeks them from indirect experiences outside of the body, while wisdom seeks within the psyche, within the library of possible personal experiences. The library of wisdom, unlike that of intelligence, is in the language of the personal: characters and stories which work on multiple levels of figurative association. The privacy of wisdom extends to its own forms, leading to a lot of mystery for the sake of mystery, which those minds on the other side of the aisle find deeply offensive. Intelligence only uses symbols as generic pointers – to refer literally to a specific quantifiable variable. Intelligence stacks symbols in sequence as language and formulas. Wisdom uses symbols as poetry and art, evocative images which work on multiple levels of awareness and understanding, but not the kind of fixed understanding of intelligence. The understanding of wisdom can be open ended and elliptical, absurd, poignant, etc.
The deepest kinds of wisdom are said to be ‘timeless’. Unlike the high value that intelligence places on up-to-the-minute information, wisdom seems to appreciate with age. Rather than being seen as increasingly irrelevant, ancient stories and turns of phrase are revered and celebrated for their pedigree. There is an almost palpable weight to the anachronistic language and images. The metaphors are somehow more potent when delivered by a long dead prophet. This favoring of the dead happens with more modern quotations too. Maybe it’s because they are no longer around to put their quotation in context, or maybe it just takes a while for greatness to make itself known against the background of more ephemeral noise.
In any case, the realm of wisdom is a decidedly human realm of human experience. It is a talent for recognizing and encapsulating common sense and long life, with subtlety and significance for all people and all lives. Wisdom is about being able to appreciate our fortunes as individuals and members of society. Wisdom helps us find an objective vantage point within the history of our personal experience from which to see and evaluate the ups and downs of life passing. Through wisdom, we can see the bigger picture in our own trials and tribulations and the rise and fall of civilizations. We can see how every moment can change seeming fiction into seeming fact and back. Wisdom is subjective and mystical, but so too were many phenomena in the natural world before science. The promise of Western intelligence is to de-mystify the world, to remove subjectivity, but in addressing subjectivity itself the intellect meets its match.
Frustrated with the prospect of decomposing its source into objects, intelligence turns to a kind of inside-out subjectivity in some variety of functionalism. Subjectivity, from the Western perspective of public space as reality, is nullified. It can only be disqualified as an ‘emergent property’ or ‘illusion’ of some other deterministic process of matter or ‘information’ – a side effect somehow, of what already seems to know itself perfectly well and function competently without resorting to any fanciful aesthetic ‘feelings’ or ‘flavors’. If arithmetic or the laws of physics work automatically, then they don’t need a special show of aesthetic phenomena to lubricate their own wheels, yet, thinks the left-brained Western mind, there is simply no other possibility. Consciousness must arise as some sort of accident among colliding collections of complex computations.
This is a problem, since it only pushes dualism from the Cartesian center of tolerance into the ghettos of the untouchable. Subjective experience is now confined in science to untouchable, uncountable metaphysical aethers of simulation; epiphenomenal dead ends which had no meaningful beginning.
The Western mind cannot tolerate being put into a box by any phenomena which it cannot put into a box itself. The irony of modern physics of course, is that all of the boxes which have been piled up so far seem to indicate their origin in circularity. A microcosm of disembodied Cheshire Cat smiles…determinable indeterminisms. On the astronomical scale, suddenly the bulk of the matter and energy in the universe has been re-categorized into darkness. The alchemist caps seem to have reappeared in science, but turned inside out. Western intelligence is no longer explaining the universe which we experience directly as participants, but is devoted to pursuing an alternate universe backstage which just so happens to identify human subjectivity as the only thing in the cosmos which is not actually real.
The current battle over TED, Sheldrake and Hancock is on the front line of the war between public-facing thinkers and private-facing thinkers, between body-space visionaries and life-time visionaries. Both sides play out a reflexive antagonism – a shadow projection which extends beyond the personal. Each side hears the other in their own limited terms, and neither one is able to communicate the missing perspective of the other. The argument continues because both fail to understand the missing piece of their own perspective and their mode of thinking has devolved into an aggressive-defensive vicious circle of un-wisdom and un-intelligence.
At this point in our political and in our intellectual life, the midpoint has been skewed so far from the center (to the West in science, and to the Right in politics), that any proposal which engages other perspectives is seen as extremism. Any new information is mistaken for treasonous compromise. Whether that extremism is in the mechanistic or the animistic direction, the result is very similar.
I would love to see a study done comparing the brain activity of so-called ‘militant atheists’ and ‘religious fundamentalists’, to see how far apart they really are. My scientific hypothesis is that no neuroscientist will be able to look at the fMRIs of the two camps in a double blind test and reliably tell the difference. If that were true, what would it mean about protecting science from unscientific ideas if you cannot prove the scientific validity of thoughts from a brain scan?
One of the most significant intellectual errors educated persons make is in underestimating the fallibility of science. The very best scientific theories containing our soundest, most reliable knowledge are certain to be superseded, recategorized from “right” to “wrong”; they are, as physicist David Deutsch says, misconceptions:

I have often thought that the nature of science would be better understood if we called theories “misconceptions” from the outset, instead of only after we have discovered their successors. Thus we could say that Einstein’s Misconception of Gravity was an improvement on Newton’s Misconception, which was an improvement on Kepler’s. The neo-Darwinian Misconception of Evolution is an improvement on Darwin’s Misconception, and his on Lamarck’s… Science claims neither infallibility nor finality.
This fact comes as a surprise to many; we tend to think of science —at the point of conclusion, when it becomes knowledge— as being more or less infallible and certainly final. Science, indeed, is the sole area of human investigation whose reports we take seriously to the point of crypto-objectivism. Even people who very much deny the possibility of objective knowledge step onto airplanes and ingest medicines. And most importantly: where science contradicts what we believe or know through cultural or even personal means, we accept science and discard those truths, often enough wisely.
An obvious example: the philosophical problem of free will. When Newton’s misconceptions were still considered the exemplar of truth par excellence, the very model of knowledge, many philosophers felt obliged to accept a kind of determinism with radical implications. Given the initial state of the universe, it appeared, we should be able to follow all particle trajectories through to the present and account for all phenomena through purely physical means. In other words: the chain of causation from the Big Bang on left no room for your volition:
Determinism in the West is often associated with Newtonian physics, which depicts the physical matter of the universe as operating according to a set of fixed, knowable laws. The “billiard ball” hypothesis, a product of Newtonian physics, argues that once the initial conditions of the universe have been established, the rest of the history of the universe follows inevitably. If it were actually possible to have complete knowledge of physical matter and all of the laws governing that matter at any one time, then it would be theoretically possible to compute the time and place of every event that will ever occur. In this sense, the basic particles of the universe operate in the same fashion as the rolling balls on a billiard table, moving and striking each other in predictable ways to produce predictable results.
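The billiard-ball picture can be made concrete with a toy sketch (a minimal illustration of my own in Python, not a model from the quoted text): a fixed, deterministic update rule applied to the same initial conditions reproduces exactly the same history on every run, which is the whole content of the Laplacean claim.

```python
# Toy Laplacean determinism: same initial state + fixed rules = same history, always.

def step(state, dt=0.1):
    """Advance each (position, velocity) pair one tick; bounce off walls at 0 and 10."""
    new = []
    for x, v in state:
        x += v * dt
        if x < 0 or x > 10:          # elastic reflection off the table edge
            v = -v
            x = max(0.0, min(10.0, x))
        new.append((x, v))
    return new

def history(initial, ticks=1000):
    """Compute the entire future of the system from its initial conditions."""
    state = list(initial)
    states = [state]
    for _ in range(ticks):
        state = step(state)
        states.append(state)
    return states

initial = [(1.0, 0.3), (5.0, -0.7), (9.0, 0.5)]

# Two independent runs from the same initial conditions are indistinguishable:
assert history(initial) == history(initial)
```

Nothing in such a universe leaves room for a second run to come out differently, which is exactly the bind the surrounding paragraphs describe.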
Thus: the movement of the atoms of your body, and the emergent phenomena that such movement entails, can all be physically accounted for as part of a chain of merely physical, causal steps. You do not “decide” things; your “feelings” aren’t governing anything; there is no meaning to your sense of agency or rationality. From this essentially unavoidable philosophical position, we are logically compelled to derive many political, moral, and cultural conclusions. For example: if free will is a phenomenological illusion, we must deprecate phenomenology in our philosophies; it is the closely-clutched delusion of a faulty animal; people, as predictable and materially reducible as commodities, can be reckoned by governments and institutions as though they are numbers. Freedom is a myth; you are the result of a process you didn’t control, and your choices aren’t choices at all but the results of laws we can discover, understand, and base our morality upon.
I should note now that (1) many people, even people far from epistemology, accept this idea, conveyed via the diffusion of science and philosophy through politics, art, and culture, that most of who you are is determined apart from your will; and (2) the development of quantum physics has not in itself upended the theory that free will is an illusion, as the sorts of indeterminacy we see among particles does not provide sufficient room, as it were, for free will.
Of course, few of us can behave for even a moment as though free will is a myth; there should be no reason for personal engagement with ourselves, no justification for “trying” or “striving”; one would be, at best, a robot-like automaton incapable of self-control but capable of self-observation. One would account for one’s behaviors not with reasons but with causes; one would be profoundly divested from outcomes which one cannot affect anyway. And one would come to hold that, in its basic conception of time and will, the human consciousness was totally deluded.
As it happens, determinism is a false conception of reality. Physicists like David Deutsch and Ilya Prigogine have, in my opinion, defended free will amply on scientific grounds; and the philosopher Karl Popper described how free will is compatible in principle with a physicalist conception of the universe; he is quoted by both scientists, and Prigogine begins his book The End of Certainty, which proposes that determinism is no longer compatible with science, by alluding to Popper:

Earlier this century in The Open Universe: An Argument for Indeterminism, Karl Popper wrote, “Common sense inclines, on the one hand, to assert that every event is caused by some preceding events, so that every event can be explained or predicted… On the other hand, … common sense attributes to mature and sane human persons… the ability to choose freely between alternative possibilities of acting.” This “dilemma of determinism,” as William James called it, is closely related to the meaning of time. Is the future given, or is it under perpetual construction?
Prigogine goes on to demonstrate that there is, in fact, an “arrow of time,” that time is not symmetrical, and that the future is very much open, very much compatible with the idea of free will. Thus: in our lifetimes we have seen science —or parts of the scientific community, with the rest to follow in tow— reclassify free will from “illusion” to “likely reality”; the question of your own role in your future, of humanity’s role in the future of civilization, has been answered differently just within the past few decades.
No more profound question can be imagined for human endeavor, yet we have an inescapable conclusion: our phenomenologically obvious sense that we choose, decide, change, perpetually construct the future was for centuries contradicted falsely by “true” science. Prigogine’s work and that of his peers —which he calls a “probabilizing revolution” because of its emphasis on understanding unstable systems and the potentialities they entail— introduces concepts that restore the commonsensical conceptions of possibility, futurity, and free will to defensibility.
If one has read the tortured thinking of twentieth-century intellectuals attempting to unify determinism and the plain facts of human experience, one knows how submissive we now are to the claims of science. As Prigogine notes, we were prepared to believe that we, “as imperfect human observers, [were] responsible for the difference between past and future through the approximations we introduce into our description of nature.” Indeed, one has the sense that the more counterintuitive the scientific claim, the more eager we are to deny our own experience in order to demonstrate our rationality.
This is only degrees removed from ordinary orthodoxies. The point is merely that the very best scientific theories remain misconceptions, and that where science contradicts human truths of whatever form, it is rational to at least contemplate the possibility that science has not advanced enough yet to account for them; we must be pragmatic in managing our knowledge, aware of the possibility that some truths we intuit we cannot yet explain, while other intuitions we can now abandon.
It is vital to consider how something can be both true and not in order to understand science and its limitations, and even more the limitations of second-order sciences (like social sciences). Newton’s laws were incredible achievements of rationality, verified by all technologies and analyses for hundreds of years, before they were unexpectedly exposed as deeply flawed ideas valid only within a limited domain – ideas which, taken as a whole, yield incorrect predictions and erroneous metaphorical structures for understanding the universe.
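The “limited domain” point has a standard textbook illustration, sketched here in Python (my example, not the author's): Newtonian kinetic energy agrees with the relativistic formula to many decimal places at everyday speeds, then fails badly as v approaches c.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def ke_newton(m, v):
    """Newtonian kinetic energy: a valid approximation only when v << c."""
    return 0.5 * m * v**2

def ke_relativistic(m, v):
    """Relativistic kinetic energy: (gamma - 1) * m * c^2."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return (gamma - 1.0) * m * C**2

m = 1.0  # kilogram

# At 3 km/s (a very fast projectile) the two theories agree almost exactly:
slow = 3000.0
rel_err = abs(ke_newton(m, slow) - ke_relativistic(m, slow)) / ke_relativistic(m, slow)

# At 0.9c, Newton's formula understates the energy by roughly a factor of three:
fast = 0.9 * C
ratio = ke_relativistic(m, fast) / ke_newton(m, fast)
```

Within its domain the older theory is observationally indistinguishable from its successor; outside it, the disagreement is not a small correction but a different answer – which is why centuries of verification never revealed the flaw.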
I never tire of quoting Karl Popper’s dictum:

Whenever a theory appears to you as the only possible one, take this as a sign that you have neither understood the theory nor the problem which it was intended to solve.
It is hard but necessary to have this relationship with science, whose theories seem like the only possible answers and whose obsolescence we cannot imagine. A rational person in the nineteenth century would have laughed at the suggestion that Newton was in error; he could not have known about the sub-atomic world or the forces and entities at play in the world of general relativity; and he especially could not have imagined how a theory that seemed utterly, universally true and whose predictive and explanatory powers were immense could still be an incomplete understanding, revealed by later progress to be completely mistaken about nearly all of its claims.
Can you imagine such a thing? It will happen to nearly everything you know. Consider what “ignorance” and “knowledge” really are for a human, what you can truly know, how you should judge others given this overwhelming epistemological instability!
That’s a great article, IMO. He hits a lot of points dead on that I have tried many times to make in many different debates.
The only point of contention I might have is the idea of scientific paradigms being misconceptions rather than conceptions. It makes sense insofar as it provides a good provocation for an audience, but if we were really being absolutely scientific about it, I would say that misconception has too much of a dismissive connotation. If it turns out that the Earth is actually a four dimensional shadow of a 19 dimensional interplanetary being, that doesn’t make our perceptions of the Sol-centric orbiting orb or the Jerusalem-centric flat garden misconceptions…but it does make the latter model a misconception *in comparison to the former*.
But yes, for the purposes of waking up the average humdrum mind, the point is well made that we would all be well advised to keep in mind that odds are that everything we know is wrong, on some level or in some sense. That seems to be more important now than it usually does. Some moments in history appear to be more polarizing than others.
“the chain of causation from the Big Bang on left no room for your volition”
This is still the overwhelmingly popular assumption, in my experience.
“Of course, few of us can behave for even a moment as though free will is a myth; there should be no reason for personal engagement with ourselves, no justification for “trying” or “striving”; one would be, at best, a robot-like automaton incapable of self-control but capable of self-observation. One would account for one’s behaviors not with reasons but with causes; one would be profoundly divested from outcomes which one cannot affect anyway. And one would come to hold that, in its basic conception of time and will, the human consciousness was totally deluded.”
I have tried many times to communicate this exactly. Great way of pulling it together. I think that the fact that what he is saying is true gives us the vantage point from which to see that “trying” or “striving” is an aesthetic quality of intention which is actually perpendicular, as a physical principle, to the axis of the unintentional. As the privacy of physics is a spectrum of effort, courage, tenacity, boldness, surrender, release, etc, the public side of physics is monotonous: anesthetic, automatic qualities which describe a continuum between determinism and ‘accident’ (/error/random/mutation). Of course, it is only because we can alter our own degree of intentionality that we can discern between determinism and accident; were the universe plotted only on that single unintentional axis, that chain of causation, then there could be no conceivable difference between accident and non-accident. It is ironically our own distance from automatism which gives us the impression that automatism is rich enough to exist as a monopole.
“It is vital to consider how something can be both true and not in order to understand science and its limitations,”
I’m generally in full agreement with everything here. He might be a bit more optimistic than I am about the current state of what is accepted by scientists and science buffs at this point. I feel that the mechanistic worldview is not going to die that easily. If this revolution does get off the ground, it could very well be another brief renaissance before being subsumed into the next revival of anesthetic totalism.