The End Of Philosophy - Tales Of Reality by Jan Strepanov - HTML preview


4 – See What Thought Did


Whether or not the peculiarly human form of knowledge is somehow causally linked to identifiable problems in human affairs, it is certainly a well-established evolutionary development.   And even if abstract knowledge is considered to have some negative impacts, it is not readily obvious how the mind might evolve to transcend them; we have no developed principles for distilling that which is considered known.   In any case, with knowledge culturally regarded as beneficial, or at worst harmless, there is little desire to question its value or consider it occasionally troublesome.   Effectively, there is no cultural perspective that imagines factual knowledge could have tangible downsides, and no cultural impetus to even think about such a possibility.   We simply assume our general form of knowledge to be as inherently useful as the air we breathe.   Specific ideas may be judged right or wrong, but the wider process of constructing knowledge-based understanding through apparently proven ideas is generally seen as inherently beneficial and to be pursued without reservation.

Any examination of potential problems rooted in our abstract form of knowledge is therefore exploratory, wrestling against much that is taken for granted and generally considered beyond doubt.   It may involve posing questions about positions otherwise presumed axiomatic.   Stepping outside habitual modes of thinking may prove difficult, with unreasoned defenses of conventionality and reflexive responses based on accepted ideas blocking unfettered critical thinking.   Tackling this largely ignored area that nonetheless concerns the very heart of human culture is tricky; an uncompromising interrogation of the everyday model of objectivity and knowledge threatens to make certain people look foolish for never having undertaken it.

The reluctance to probe these matters can be traced throughout the very area where they are supposed to be the main subject matter: philosophy.   The history of philosophy appears broad, teeming with competing ideas and arguments that initially look to be asking every question the mind could dream up.   But in terms of postulated answers, do philosophers – or intellectuals of any ilk, for that matter – balance their works with an acknowledgment of their own motives and of how such motives might distort their ideas?   Whether those motives be simply to generate income from academic conformity, book sales and occasional speaking events, or whether they are also to promote some political cause or ground-breaking theory, all such intellectuals are flesh and blood, subject to the same worldly conditions and concerns as the rest of us.   In terms of pursuing whatever social goals they seek, tuning the content of their works to meet those goals is an ever-present potential source of bias.   To imagine that anyone pursues truth in a manner completely indifferent to how their ideas are received by the wider world is illogical; statements made can have powerful consequences – both positive and negative.   It is not only in times of social upheaval that mere ideas can be demonized, books burned, or intellectuals persecuted and murdered; even in times of social calm a book or even a single statement can make or break an entire career.   But how many of our great philosophers have ever acknowledged this?   And if few have, why would that be, and what overall cultural distortions have resulted?

From early childhood onward, we are all influenced if not indoctrinated by our education: a mountain of ideas fired at the mind long before anyone might consider doing philosophy or simply questioning the information and worldviews thereby disseminated.   And of course, on the basis that no one does anything for no reason at all, the extent to which anyone pursues degrees and diplomas through conventional education can be regarded as roughly proportional to their reluctance to criticize whatever is taught.   Hence, academia represents a subtle but authoritarian policing of knowledge that neither rewards intellectual rebels nor welcomes their transgressions.   Meanwhile, other minds pick up on the benefits offered by educational conformity and recognition and simply play the game without ever really questioning its rules.   Original thinking is an obvious casualty.

Quite regardless of how anyone views the content within the many fields of knowledge educational institutions dispense, the awarding of stripes by these institutions is based on a conservative model of knowledge, as opposed to an open and self-critical one.   Marks are not awarded for challenging the teachers, schoolmasters and lecturers who are all employed to know better, while failure is arguably just failure to comply with rules enforcing intellectual conformity.   Consequently, the forces opposing any form of thinking differently are formidable.   It is therefore no surprise if philosophers and many others appear either already too indoctrinated or too comfortably indifferent to acknowledge that such forces even exist – never mind the insidious manner in which they pervade and color all areas of human thought.

Just as we are naturally inclined to put our best foot forward, the educated professional philosopher is no more likely to critically undermine thought itself than a doctor is to criticize the medical profession.   And given that people in general, if they bother to think about these issues at all, see philosophy as just so much convoluted cognition of no fruitful outcome, any genuinely critical and incisive thought about thought involves entering mostly uncharted waters.   But at least there are no established rules to follow: a freedom notably missing within conventional views of thought and knowledge.

Simplification as a necessary compromise

The defining characteristic of human thought appears to be its abstract nature – something which can be seen as an imagined symbolic representation of the world, no doubt related to graphic symbology and, less directly, the encoding of thought within written language.

All such formats are by nature highly simplified in their depiction of reality – not only because real-world complexity has no identifiable limits, but also because whatever underlying motivations bring about the mind’s processing of reality, they are presumably primarily focused on achieving goals, and thereby indifferent to extraneous detail of no relevance.

This overall process of simplifying complexity is particularly obvious within the field of graphic representation.   Modern pictographs are often deliberately simplified in the interests of getting a message across without extraneous and potentially distracting information.   At the other extreme, even the highest quality digital photography is but a collection of pixels: flatly-colored squares that drop a degree of real-world detail whilst nonetheless retaining enough to meet the photographer’s goals.

Both these instances involve simplified representations of the world in which a tremendous amount of detail is lost – modern digitization processes being no exception.   Importantly, there is also a measure of unreality introduced – again, even with the most exacting photography.   As anyone who has ever zoomed in close enough on a digital photograph knows, the lattice of colored square pixels composing the image does not marry up well with anything seen in reality.   Similarly, any magnified examination of pre-digital-era photographic prints eventually shows too much grain to represent anything meaningful – much as similar inspection of a portrait painting can eventually reveal little other than brush strokes.

Any model of anything at all inevitably involves compromises: losses in detail plus processing changes which, taken together, should really be seen as distortions, in that the depiction of reality is not just simplified but somewhat altered.   Therefore, whether recognized or not, such a depiction constitutes a more or less flawed model of what it seeks to represent.   Moreover, as reality remains unfathomably complex and inherently other than any model, all models can only be refined by adding detail – but never completed.   And even this idea assumes that the detail being added is an authentic representation of the relevant aspect or feature of reality it signifies.   But as the detail itself will inevitably be somewhat flawed, supposed refinement also means more flaws, plus more overall convolution.   Hence, no model of anything can progressively approach the reality it mimics – albeit it may become progressively more practical and convincing.   Therefore, to the extent that a model is refined to appear more authentic, it arguably becomes correspondingly more illusory and deceptive by remaining other than that which it increasingly appears to be.

If this modeling issue proves relatively easy to discuss by examining graphic reproduction – as opposed to human cognition’s modeling of reality – it seems reasonable to assume the ability of cognition to accurately represent the world must be at least as prone to flaws, given cognition is generally more remote from direct sensory input.   And this is underscored by considering that, whatever its faults, photography can quickly snap a scene complete with many details our mind would struggle to recall, even if it had studied the scene at length.

However, and quite curiously, we cannot easily envisage how to graphically illustrate joy or horror, or any of the rest of our many emotions.   Artists can obviously create something suggesting blissful-and-inviting, or ugly-and-menacing, but whatever emotion they try to depict obviously has no known physical form of itself – a curious observation that highlights how thoroughly the mind can believe in things never seen in the physical world.

Thoughts can obviously draw on everything and anything experienced in any way – thereby mingling received ideas with immediate perceptions, emotions and memories.   Furthermore, everything falling under the heading of imagination also has to be factored in to the mind’s machinations.   Additionally, in terms of influencing thinking, it seems naïve to discount whatever the organism might be doing or experiencing below the radar of any conscious recognition at all.   In short, the mind hardly appears independent.   Even if it be equated with the brain, the brain is hardly an isolated organ.

Without getting mired in the convoluted and highly speculative ideas of cognitive psychology that try to piece together what is in effect the gazillion-piece jigsaw of subjective experience, how are the full workings of human thought to be addressed – if indeed this is at all possible?   To further complicate matters, rational ideas in relation to thought sit alongside countless metaphysical ideas postulated in manners that even their most ardent disbelievers cannot logically discount or fully disprove.   Consequently, any apparent progress in researching this area ought to be treated with extreme circumspection, being inescapably susceptible to distortions and delusions.   Try as we might in the face of whatever appears obviously nonsensical, none of us can wholly escape what conventional sociology labels our conditioning.   Meanwhile, the very notion of theory feels arguably far too simple and dogmatic to address the impenetrable question of exactly why we think as we do.   The temptation to speculate wildly about all this is unlimited, but only because the possibility of reliably confirming or debunking anything at all proves negligible.

But whatever its hidden wellsprings may be, the human evolution of thought together with its use of abstract modeling has allowed our species to envisage and exploit the potential of circumstances such that, as a foundational technology, abstract thought enables other technologies to be built upon it.   Hence, when looking at the surrounding world, we can see possibilities based on its current configuration and our amassed knowledge of how it can be reconfigured to meet our ends.

When for example we see a stone, we are able to think of it as a weight, a step, a weapon, a hammer, a grinding tool and so forth – in general, an object with the potential to help us achieve numerous different ends.   But inasmuch as our different conceptual ideas of the stone do not directly alter the reality that is just the stone, the different uses we envisage for it constitute different ways of seeing the same world: different models or tales of what we otherwise like to consider a singular reality.   And as proven by the possibility of having this discussion without any particular stone being seen by either writer or reader, we are cognitively able to model and remodel the world in complete abstraction.   This is a key attribute of human technology: the mind’s ability to play speculatively with our models of reality in ways that somehow use past experience to plan courses of action that, once implemented, generally produce intended results.

Compared to the uses of a simple stone, those results may be hugely more sophisticated – as seen in, for example, the stunning innovations of computer modeling and virtual reality – but the same principles, of using abstract ideas and models to think through what is and what might be possible, are at the core of even these technologies.   In fact, the extensive reach of technology within our modern age can be seen as simply the long-term outcome of ongoing remodeling of the world, whether for direct results or to make further remodeling faster and easier.   In general, the changes and increasing sophistication of our abstract models roughly parallel similar changes and refinements in our external environment.

Of course, the key role of abstract modeling is somewhat buried beneath the many technologies it has now spawned, and which have continuously developed within a snowball effect spanning thousands of years.   Hence it is too easy to overlook that basic technologies such as graphic representation, writing, simple tool- and weapon-making, and even agriculture, all likely evolved from scratch.   Such things presumably only became shared knowledge through grappling with early forms of the same interpersonal communication techniques we use today – most notably language.   And it is sobering to consider that without the developed social techniques by which we now encode and impart our abstract knowledge to our offspring, civilization would likely revert to some drastically primitive form in just a generation or two.

Although the human brain seems to have evolved specific attributes for managing abstract thought alongside its unique forms of cognition, the actual substance or content of cultural ideas is not generally considered latent within the newborn, and therefore needs revealing afresh to each generation.   Notably, if this were not the case, social conditioning would not be possible – an observation that raises a related question about whether anyone can ever be considered a truly original thinker.   Although we are arguably still very animal-like in many respects, this is heavily disguised by the visible civilizing results of acting upon our technologically communicated knowledge and structuring our societies around whatever was achieved by those who went before us.   True originality of thought can therefore appear an impossibility.

This overall version of human development reaffirms the importance of motive-based behavior, as opposed to some purely intellectual development of our species.   It appears the technology of thought developed not primarily to explain the world, as conventional philosophers and other academics might imagine, but because, at a minimum, it aided survival and, beyond that, allowed us to thrive.   Significantly, the considerable benefits of technology have always stood out in times of war, where one technology was often developed to outsmart the competing technologies of perceived enemies.   In this respect, the everyday simplistic position that technology is intrinsically beneficial – without any qualification – somewhat whitewashes our bloody history, in which the reality was more often that an absence of suitably developed technology proved fatal.

In many respects, and for many minds, the results of technology are nonetheless seen as an unqualified success even if from at least one perspective this is illogical.   When the core technology that is the mind’s modeling of the world is demonstrably flawed, such success should logically be tainted by some degree of failure or problems – and this of course is exactly what history reveals within various forms of what could be considered human madness.

Far from technology working exclusively to humanity’s benefit, it can be argued that we are unwittingly allowing it to frighten us – at least in the sense that its formidable powers are threatening and overpowering the mind’s ability to impartially assess technology’s true nature with a view to taming its worst excesses.   This is arguably the real existential dilemma of our current evolutionary state.   One fear of technology has always been that it will be used against us – and of course, history relates how that potential has often been unleashed.   Hence, a secondary concern emerges – the fear of not developing technology, as arguably demonstrated within the balance of terror approach to avoiding a nuclear holocaust.

However, addressing the real dangers of technology within the modern age is not directly about technologies themselves.   It is in fact questionable whether much can be achieved by, for example, reining in nuclear power, attacking internet surveillance, or refusing to tax-fund hi-tech wars – even if individuals might very understandably make efforts in those directions.   Philosophically, these and other current forms of technology gone crazy should not detract from just how long-standing and elementary this whole issue really is.   The technology at the very heart of matters – abstract thought – remains the most valid focal point, on the basis that many uniquely human problems are founded on this root technology.

An inherent trait of abstract thought is its inescapably divisive way of seeing the world.   Whereas we habitually think of the world as full of things, the reality is that such things, as seen by the mind, are only place-marker inventions that allow the processes of abstract thought to proceed.   As things, they are only abstract things.   But given that without them there simply cannot be any thoughts related to such things, every last idea or thought must employ them.   Moreover, if the mind wants to sell apples according to size, it will likely divide the world of apples into big, medium and small apples – illustrating how the preferred divisions of the mind are unsurprisingly based on its intentions.

Abstract division is everywhere.   For example, the political mind might divide voters into left, right, center, undecided, and so forth.   Alternatively, the same population may be divided into various classes, such as working, middle, upper and ruling.   Yet other labels exist such as the political class, the leisure class, the intellectual class, and even the chattering class; where specific individuals might fit into such classes is less than obvious.   Similarly divisive ideas include white-collar and blue-collar workers, or management, employees and the unemployed – the possible groups available to the mind being effectively unlimited.   But is such rather haphazard choosing of labels to pigeon-hole huge swathes of the population really justified or beneficial to properly understanding anything, considering each person on the planet is actually unique in countless ways?   What motivates the mind to engage in the dubious use of such blanket terms?

Note that these examples do not concern some esoteric philosophical technicality; they actually form the basis of much real-world political thinking, debate and campaigning.   Many popular ideas and debates reflexively divide populations into such groups in manners rarely questioned as regards supposed justifications or possible weaknesses.   And this remains the case even if it is only logical that a corollary of such divisive thinking is the deeply divided societies that humankind has largely come to regard as both endemic and inevitable.

Regardless of the wisdom or foolishness of all this divisive thinking, such reflexive classification of populations is now so integral a part of mainstream culture that challenging it typically meets with a similarly reflexive opposition.   The approach has long been standard practice within academia, having been birthed by minds that sought to theorize about societies without ever thinking to factor in their own conditioned thoughts and personal motivations when creating those theories – never mind the corrosive social consequences of promoting such inherently divisive thinking.

In terms of furthering academic careers, simple and dogmatic theories – other things being equal – have always trumped the intellectual humility of circumspection.   Hence, the class-based approach to supposedly understanding society has become utterly accepted and integral to expected intellectual thinking within many academic circles.   Even some so-called laymen can be seen eagerly embracing such an approach within efforts to partake in supposedly informed debate.

All such accepted scholarly ideas obviously bestow social and academic prestige on both their originators and their adherents.   And they presumably become accepted because they in some way develop human thought in a manner judged useful.   This of course is the history of class succeeding as a political and sociological concept; it proved useful in various ways to different people – from creating academically prestigious theories about how society supposedly works, to writing political manifestos targeting social change.   Hence, anyone who embraced the concept tended to be lauded by certain academics and political fighters.

But however much the adoption of such theories may have benefited certain individuals and academic institutions, and even provided an intellectual backdrop for the understandable political struggles of many, the dubious consequence of all such theory is that we live increasingly within an abstract vision of reality that diminishes our common humanity and replaces it with divisive conceptualizations.   In this instance, supposed intelligence frames us as class members, and thereby as different from those of other classes – as opposed to being their fellow human beings.   But how convenient such thinking proves to those who, at some deep and possibly subliminal level, seek to have themselves intellectually confirmed as superior, and therefore deserving of their privilege – all whilst academia quietly reaps the benefits of its general subservience to those in power who also enjoy privilege.

Despite the truly catastrophic real-world results of trying to construct societies that addressed certain excesses portrayed within class-based theories – particularly under the moniker of communism – class as a concept remains omnipresent and of unquestioned importance within many intellectual circles.   Even so-called right-wing arguments often invoke the concept.   But notably, there are effectively no challenging theories postulating that such widespread reliance on the concept of class might twist our understanding of social reality and seriously distort our cultural views of the human condition.

Given the manifestly atrocious results of social engineering based on accepted theories rooted in class thinking, logic dictates that such theories are at least somewhat flawed – if not open to nefarious manipulation.   If this were rocket science, with devices that exploded on the launch pad and caused mass casualties, would we still be taking such science seriously?   But little or nothing has been done to ask any real questions about the whole paradigm of thought from which concepts such as class emerge.   Such complacency is of course quite general, and reflects an academic reluctance towards self-criticism concerning any possible wrong-thinking.   Hence, it most certainly is not any proven relevance or generally beneficial value of the class concept that has imbued it with lasting popularity; it appears as just one more artifact of an intellectual conservatism so typical of academia.

This example of how and why the mind labels up the world touches on why key weaknesses within abstract thought are extensively ignored, along with their consequences – the overall situation being a feedback loop in which ideas gain academic and cultural inertia in manners having little to do with any demonstrable understanding of reality.   As shown, such weaknesses are actually quite easy to spot in light of the unfathomable complexity of reality – a truth which simplistic thinking boldly and foolishly ignores.   Meanwhile, the clamor for academic recognition and related benefits has people all over the world trying to concoct the next supposedly great theory-based breakthroughs in various fields; it would appear that academic acclaim and insightful thinking often exist in mutual exclusion.

As regards building an understanding of human behavior, we are not dealing with elementary aspects of the material world such as a chemical element or a well-defined phenomenon in physics; the subject matter is so utterly complex and open-ended that any simplistic theory should be seen as inherently suspect before it is even formulated.   Moreover, given the preferred design of scientific theories is to remain parsimonious, it becomes questionable if the use of conventional theoretical thinking is at all appropriate when dealing with human behavior.   Can the simple ever comprehend the complex other than as something that it in fact cannot comprehend?

But there is no academic prestige in highlighting the pitfalls of reducing real-world convolution to theoretical simplicity.   In fact, the opposite is true: uncomplicated one-size-fits-all thinking is often preferred – albeit frequently obfuscated behind concepts and jargon to make it appear disarmingly complicated to any neophyte who might otherwise spot its weaknesses.