Sunday, March 25, 2012

Privatisation

The twentieth century experienced a period of extraordinary growth in the size and power of capitalism, with nation states weakened, economic policy shifting from Keynesianism to Monetarism, and neoliberal governments abdicating their power to control interest rates, exchange rates and the money supply. It saw nation states restricted under compulsory trade arrangements in which corporation-friendly treaties drove out often hard-won, socially-friendly national laws. It saw unprecedented growth in the three pillars of modern capitalism, the neoliberal Holy Trinity: privatisation, deregulation and globalisation. For thirty years the Washington Consensus ruled.
The twentieth century also witnessed social democratic governments defecting to the neoliberal camp – especially in Britain, Australia and the US – and it saw trade unions weakened by the creation of a new global reserve army of labour in China, the Indian sub-continent and the former Soviet Bloc.
It saw the Soviet Union collapse and its rich public resources brutally cannibalised by the Russian capitalist oligarchs. And, last but not least, it saw the Chinese Communist Party become the Chinese Chamber of Commerce.
All of this, of course, was but a prelude to the Global Financial Crisis:
In 2008, corporate psychopaths flew financial weapons of mass destruction (financial derivatives) into the twin towers of our economy, the housing market and the stock market. Ten trillion dollars of wealth imploded in a cloud of dust.

And whilst global capitalism is on the ropes, there are as yet few signs of concerted, viable socialist opposition. Meanwhile, the forces of capitalism press on – especially with privatisation.


Margaret Thatcher, Conservative British prime minister, introduced privatisation in the early 1980s to raise money for the government, to cut back the size of the public sector and to break the power of the public sector unions. Money making, however, was never the real objective, because the sale of public assets entails the loss of ongoing revenues to government, and this inevitably means lower government income in the long run. Privatisation involves the sale of public assets – things like public housing, public utilities and public industries – to private buyers at fire-sale prices. Privatisation quickly made many private fortunes at public expense.
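A back-of-the-envelope sketch of the long-run arithmetic (all figures are invented for illustration): selling a revenue-earning asset for less than the present value of its future income stream must lower government income in the long run.

```python
# Toy net-present-value comparison; every number here is illustrative.
annual_revenue = 1.0      # assumed net revenue of a public utility, $bn/year
discount_rate = 0.05      # assumed government discount (borrowing) rate

# Present value of keeping the asset forever (a perpetuity): R / r.
value_if_kept = annual_revenue / discount_rate          # = $20bn

fire_sale_price = 12.0    # assumed 'bargain basement' sale price, $bn

print(f"Value of retained revenue stream: ${value_if_kept:.0f}bn")
print(f"Fire-sale proceeds:               ${fire_sale_price:.0f}bn")
print(f"Long-run loss to the public:      ${value_if_kept - fire_sale_price:.0f}bn")
```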



The types of assets privatised were often natural monopolies like rail or telecommunications. Once privatised and free of regulation, these former public monopolies quickly become unfettered private monopolies, free to increase prices and reduce supply.  Privatised assets are often very profitable and supporters put this down to increased efficiency, but their profitability stems instead from fire-sale acquisition prices, monopoly market power, cut-backs in repairs and maintenance, and reduced service standards.    



Privatisation diminishes democracy: governments are answerable to the electorate, capitalist corporations aren’t. ‘Commercial-in-confidence’ contracts limit public and parliamentary scrutiny and thereby reduce public control. Privatisation, therefore, changes both the operating principles and the operating environment. Corporations care only for the bottom line, ignoring any wider social responsibilities they might once have had in government hands – like cross-subsidies, welfare concerns, social inclusion or the environment.



Privatisation is based on several false ideas: that governments are intrinsically inefficient; that there is no such thing as the public interest; that government debt causes inflation and higher interest rates; and that government investment crowds out more productive private investment. These ideas are as false as those on which the neoliberal house of cards was built in the first place. Now that that house has blown away, maybe it’s time to rethink privatisation as well.



The Top Ten Problems with Privatisation


1. Privatisation – transferring ownership and/or control of public assets to private buyers – together with deregulation and globalisation makes up the three main pillars of modern corporate capitalism. Corporations have the status and rights of ‘legal persons’ (and in the United States, unbelievably, they also have human rights). Unfortunately, however, such ‘corporate persons’ are psychopaths. They exhibit:

· Callous unconcern for the feelings of others.
· An incapacity for maintaining enduring relationships.
· Reckless disregard for the safety of others.
· Deceitfulness: repeated lying and conning others for profit.
· An incapacity to experience guilt.

(‘The Corporation’, Bakan, J. 2004)


Not surprisingly, therefore, when giant corporations collapse (Enron, Lehman, HIH etc.), subsequent investigations show their callous indifference to shareholder assets and workers’ pension funds.

2. Global financial markets, monopolised by some of the world’s largest financial corporations and out of control through greed, deregulation and market failure, are driving the global privatisation agenda. Witness the sub-prime global financial crisis and the massive government bailouts, after which multimillion-dollar bonuses continue to be paid despite continuing losses running into the billions (e.g. the Royal Bank of Scotland).

3. Public goals are broader than the corporate bottom line, which ‘privatises’ profits but externalises (socialises) costs – the costs the corporation creates but doesn’t have to pay become other people’s problems.

4. Corporations prefer to privatise natural monopolies, which allows them to reduce supply and increase prices, maximising their profits. Everyone except the capitalist knows that natural monopolies must be publicly owned or, at the very least, tightly regulated to safeguard the public interest. In the last three decades, corporations have successfully opposed regulation everywhere.

5. Privatised assets are poorly maintained and poorly developed – witness the UK railways.

6. Privatisation distorts the allocation of resources: a toll put on one road in the middle of an otherwise free but complex road network completely rearranges all subsequent traffic flows (see the sketch at the end of this post).

7. Privatisation diminishes the sphere of democratic control and accountability. Governments are answerable to the electorate; corporations aren’t, and ‘commercial-in-confidence’ contracts prohibit public audit review and parliamentary scrutiny.

8. Privatisation severely constrains government fiscal and financial flexibility: major revenue-earning public assets are handed over to the corporations, and the revenue streams are subsequently lost to government.

9. Privatisation rests on the first false premise, ‘efficiency’: that corporations and markets are efficient but governments aren’t; that corporate debt is somehow better than government debt; and that governments need to shift risk to corporations.

10. Privatisation rests on the second false premise, ‘inflation’: that government debt increases the money supply, which in turn creates inflation, adverse terms of trade and higher interest rates.

The above only skims the surface of the sea of problems with privatisation. Over time this post will grow to expand on each and every one of the ten points above. It might even add a few more.
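As promised under point 6, here is a minimal sketch of how pricing a single link re-routes traffic. The network, travel times and the toll are invented for illustration; the shortest-path routine is a standard Dijkstra search, not anything specific to privatisation policy.

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra over a dict-of-dicts graph; returns (cost, path)."""
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, weight in graph.get(node, {}).items():
            if nxt not in seen:
                heapq.heappush(queue, (cost + weight, nxt, path + [nxt]))
    return float("inf"), []

# Drive from A to B: directly (10 minutes) or the long way round via C (14).
roads = {"A": {"B": 10.0, "C": 7.0}, "C": {"B": 7.0}, "B": {}}
print(shortest_path(roads, "A", "B"))   # (10.0, ['A', 'B'])

# A toll on the direct road, worth 6 minutes to drivers, re-routes everyone.
roads["A"]["B"] += 6.0
print(shortest_path(roads, "A", "B"))   # (14.0, ['A', 'C', 'B'])
```

In a real network every driver’s re-routing changes congestion on every other road, so one toll can rearrange flows across the whole system.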

Friday, March 9, 2012

Anti-Realism versus Realism

The threat to reason
 

Dan Hind, author of ‘The Threat to Reason’ (Verso, 2007), sees two primary threats to Reason: on the one hand, governments and corporations seeking to monopolise Science for their own selfish, covert purposes; on the other, religious faith, postmodernism and New Age quackery seeking to undermine Enlightenment Reason, whether from disillusionment or from revelation – this second threat overt and self-advertising, like a side-show.

Reason posits that the world is orderly – effect always following cause, knowledge accumulating from theories, experience and logic. Religious ideas undermine this philosophy because they posit other forms of knowledge – divine revelation and oracles, for example. Postmodernism undermines it by rejecting the idea of Enlightenment Reason altogether, reflecting disillusionment and claiming instead that all knowledge is socially constructed.

Even so, Hind has some sympathy for postmodernism – a justified, but ultimately mistaken, reaction to the horrors of the 20th century, with Enlightenment Reason spruiking Freedom, Truth and Justice while letting loose global warfare, totalitarianism, imperialism, concentration camps and atomic bombs.

Postmodernists argue that Enlightenment Reason is simply a self-serving rationale for the white, western, male power structure; thereby denying the universal validity of Enlightenment Reason, claiming instead that Reason is relative to specific societies at specific times and places, that knowledge is always and everywhere socially constructed. Such claims are widely supported in some circles; but, according to Hind, not really much of a threat.
 

Anti-Realism

True, the postmodernists themselves are not much of a threat – in any event, their writings are mostly unintelligible. But the philosophy underpinning postmodernism – anti-realism – is a considerable threat. Anti-realism, the idea that facts, beliefs and explanations are mind-dependent, is widespread across many fields and disciplines. Such philosophies are ancient: not postmodern, not even modern, originating as responses to Ancient Greek scepticism. And many anti-realist philosophies are also philosophies of science. The following schematic illustrates this nicely.

[Schematic: the main forms of anti-realism located on the existence and independence axes]
Realists believe that the world exists independently of their conceptions of it. Realism, therefore, has two primary dimensions: existence and independence. Not surprisingly anti-realists attack along both dimensions. The schematic shows the main forms of anti-realism and their locations vis-à-vis the existence and independence axes.
These two axes divide the central part of the schematic into four quadrants:
  • vertical axis: mind-independence/dependence, and
  • horizontal axis: existence/non-existence.
So, for example, instrumentalism as a scientific method holds that its subject matter is mind-independent but not real; instrumental facts and entities are just convenient elements of some theory or other, incorporated only because they work. Conversely, social constructivism holds that its subject matter is real, but not mind-independent. So, whilst Hind might dismiss postmodernism as largely irrelevant, he cannot so easily dismiss social constructivism and relativism. These ideas present a serious challenge to realist philosophies of knowledge: truth, justification, and explanation.
Nowhere is this better illustrated than in Raymond Bradley's paper "How to Lose Your Grip on Reality" which forcefully shows the anti-realist underpinnings of many of the founders of quantum mechanics. Bradley describes anti-realism as a virus of the mind, a condition which led some great physicists into extremely poor metaphysics. He says:

It is no secret that anti-realism – the denial that there is a way the world really is as distinct from our perceptions or conceptions of it, or even the denial that a real world exists at all – is rampant in the social sciences and among those thinkers who call themselves "Postmodernists". What is less well-known is that anti-realism is also endemic among quantum physicists. Needless to say, the literati seize upon quantum antirealism as experimental confirmation of the views. I propose to stop this myth-making in its tracks.

In this paper I don't hide the fact that I regard anti-realism as a virulent form of philosophical error – a veritable "virus of the mind" as Richard Dawkins would put it. My aim is to explain how and why it is that so many quantum theorists have become infected by it.

Boghossian characterises the classical view of knowledge as follows:
  • Objectivism about Facts: the world which we seek to understand and know about is largely independent of us and our beliefs about it. Even if thinking beings had never existed, the world would still have had many of the properties that it currently has.
  • Objectivism about Justification: facts of the form ‘information E justifies belief B’ are society-independent facts. In particular, whether or not some item of information justifies a given belief doesn't depend on the contingent needs and interests of any community.
  • Objectivism about Rational Explanation: under the appropriate circumstances, our exposure to the evidence alone is capable of explaining why we believe what we believe.
('Fear of Knowledge', Boghossian, P. OUP 2006, page 22)
Constructivism and relativism challenge each of these objectivisms, claiming instead that society socially constructs facts, justifications and explanations in ways that reflect its contingent needs and interests.
Social Construction of Facts
Boghossian claims that fact constructionism is the most common form of social construction and yet in some ways the most strange. The world was around for billions of years before humans. Clearly, there were facts about the world long before humans, but only humans can 'socially construct' facts. How can this be? First, realist philosophers readily acknowledge that society does create many facts; facts which depend on and would not exist without society; facts such as money, religion, and job titles. But there seem to be many other facts: mountains, rivers, and birds, for example, which exist independently of humans.
Not so, say the fact-constructionists. Human consciousness and language impose structure on the world. Before humans, the world existed as a formless, primeval substance. When people started to describe the world, they created facts about the world, just as the cookie cutter cuts shapes from the dough. So, ancient Egyptians did not die of tuberculosis, because doctors did not discover that disease until much later. And, according to Foucault, there were no homosexuals until society invented the term to describe certain sorts of activities.
But, according to Boghossian, the description-dependence of facts gets its force from another, far less radical, view: the social relativity of facts. Every society is free to describe the world in any way it sees fit; often these descriptions will differ from one another, but this in no way entails that the world itself depends on those descriptions.
But, say fact-constructionists, society can describe the world one way and get certain outcomes, then describe it another way and get different outcomes; the facts in the world therefore do depend on society’s descriptions. For example, consider a simple scene with three basic objects in it: A, B, C. That is one way of describing it – so it has three objects. But if I now allow that any combination of two objects is also an object, the new conceptual scheme produces a world with six objects in it: A, B, C, AB, AC, BC. And if I allow combinations of three objects to be objects, then I can have up to seven objects in the world: A, B, C, AB, AC, BC, and ABC. So how many objects are there in the world? That clearly depends on how society chooses to describe them.
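The arithmetic of this example can be made explicit with a toy enumeration – a sketch of the counting argument only, not anything from Boghossian’s text:

```python
from itertools import combinations

# Toy enumeration of the conceptual-scheme counting argument.
basic = ["A", "B", "C"]

# Scheme 1: only the basic objects count as objects.
scheme1 = [(x,) for x in basic]                      # 3 objects

# Scheme 2: any pair of basic objects is also an object.
scheme2 = scheme1 + list(combinations(basic, 2))     # 3 + 3 = 6 objects

# Scheme 3: the triple counts as an object too.
scheme3 = scheme2 + list(combinations(basic, 3))     # 6 + 1 = 7 objects

for name, scheme in [("basic only", scheme1),
                     ("plus pairs", scheme2),
                     ("plus triples", scheme3)]:
    print(name, len(scheme), ["".join(s) for s in scheme])
```

The scene never changes; only the rule for what counts as an ‘object’ does – which is exactly the equivocation Boghossian identifies next.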
But this involves a sleight of hand. Each description involving different numbers of objects is perfectly consistent with the way things are in the world; the above account equivocates on the word 'object'. When we refer to three objects and then to six and seven objects, we have craftily shifted the sense of the word 'object' we are using, from basic objects to non-basic objects. Boghossian argues that this is simply the same as saying that eight persons are at a party or, equivalently, that four couples are there.
More damagingly, however, both these fact-constructionist approaches - cookie cutter and conceptual scheme - require some basic, independently existing facts even to get started: primordial dough on the one hand and basic objects on the other. This clearly undermines the idea that society constructs all facts.
But a third approach exists which avoids these problems: the language game theory of fact constructionism. This states that there is no mind-independent way that things are; things are the way they are relative to the language game used to talk about them. Language games simply reflect our contingent needs and interests. We are like characters in a work of fiction: what we say in the novel about the past, the future, the present, about one another or about anything else may or may not be true, but truth in the novel is relative only to the novel. No independent way that things are exists.
The above approach - language game relativism - asserts that no absolute truths exist. But this renders the assertion incoherent: if it asserts an absolute truth, then it contradicts itself; and if it asserts a relative truth, then it has the same status as a claim in a work of fiction - we can safely choose to ignore it.
Social Construction of Justifications

Constructivism depends on relativism, and relativism depends, in turn, on scepticism. Constructivism is ultimately a sceptical doctrine. Scepticism asserts that all attempts to justify knowledge must end in an infinite regress or circular reasoning. Either way, we can never justify knowledge. Societies may, however, choose to halt the regress at some convenient stopping point, defining convenience in terms of their contingent needs and interests. But ultimately all such stopping points are arbitrary, so halting the regress still does not provide a justification. It does, however, set up the conditions whereby different societies can set their cut-offs at different points according to their contingent needs and interests, and we therefore have no objective means of saying one society's cut-off is better than another's. The sceptic's demand for justification, therefore, creates the basis for relativism and social constructivism.
According to the sceptic, when attempting to justify our beliefs, we have three equally bad options: infinite regress, arbitrary assumptions or circular reasoning. Philosophers call this unhappy set of options Agrippa’s Trilemma or the problem of the criterion. On each outcome, the sceptic can claim that we lack justification for our alleged knowledge.
What has this got to do with relativism? In an illuminating series of papers, Howard Sankey (‘Witchcraft, Relativism and the Problem of the Criterion’, Erkenntnis 2010) shows how this Trilemma underlies ‘epistemic relativism’:
The regress may only be avoided by reasoning in a circle or by unjustified adoption of a norm. Neither option yields justification. Hence, the decision to adopt a given epistemic norm is not one that may be made on a rational basis. Nor is it possible for any particular epistemic norm to receive greater justification than any other. For all norms are equally lacking in justification. Instead of being a rationally based decision, the adoption of a norm is rationally unjustified. It may rest upon an irrational leap of faith, a subjective personal commitment or an arbitrary convention. But it cannot be supported by appeal to rational grounds which show one set of epistemic norms to be better justified than an alternative set of such norms.

If no norm is better justified than any other, all norms have equal standing. Since it is not possible to provide an ultimate grounding for any set of norms, the only possible form of justification is justification on the basis of a set of operative norms. Thus, the norms operative within a belief system provide justification within that belief system. Those who adopt a different belief system are justified by the norms operative within their belief system. There is no sense in which the norms operative in one belief system possess a higher degree of justification than the norms employed in another such system. Justification is an entirely internal matter of compliance with norms that are operative within a belief system.

The relativist is now in a position to claim that rational justification is relative to operative norms within a belief system. It is possible for there to be alternative belief systems with alternative sets of epistemic norms. As a result, what one is rationally justified in believing depends upon the belief system that one accepts and the epistemic norms which are operative within that belief system. There is no sense in which it may be said that any belief system possesses a greater degree of rationality than any other (Sankey, pg 5).


Sankey draws on Roderick Chisholm’s response to the sceptic.
Chisholm’s response to the sceptic
Chisholm refers to the problem of the criterion as the diallelus – ‘the wheel’, or the vicious circle. He expresses it this way:
To know whether things really are as they seem to be, we must have a procedure for distinguishing appearances that are true from appearances that are false. But to know whether our procedure is a good procedure, we have to know whether it really succeeds in distinguishing appearances that are true from appearances that are false. And we cannot know whether it does really succeed unless we already know which appearances are true and which ones are false. And so we are caught in a circle.
(Chisholm, R. Ch. 5 ‘The Problem of the Criterion’, ‘Ways of Knowing’ pg 62)
He illustrates the problem by reference to sorting a pile of apples into the good and the bad. This sorting is simple to do, for we have standard criteria to help us make the decision. But what about beliefs? How do we sort the good from the bad? We are now on the wheel; we have shifted the problem from sorting beliefs to sorting methods for sorting beliefs. We could, of course, follow the scientific method, and this might give us good beliefs, but how do we know? And if we know, why did we need a method in the first place?
Chisholm breaks the issues down into two questions:
A) What do we know? What is the extent of our knowledge?
B) How are we to decide whether we know? What are the criteria?
Everything hinges on the starting point, A or B. Assume a hypothetical person H. If H starts from A – from what H knows, or how far H’s knowledge extends – then H has some hope of answering B: how does H know? If, however, H starts from B instead, by having, say, a good set of directions, then H might hope to answer A. But the sceptic argues that H cannot answer B until H has answered A, and H cannot answer A unless H has answered B. No possible way to decide in any particular case exists.
Suppose H needs to sort apples into good and bad heaps. Can H build the heaps without a decision rule? Or can H formulate a decision rule without the heaps? According to the sceptic, the answer is no in each case; the problem of justified belief is ultimately circular.   
But two other possibilities exist: start from A or start from B. Chisholm refers to those who start from A as particularists and those who start from B as methodists. According to Chisholm, empiricists are methodists. They begin with broad and completely arbitrary generalisations about experiential criteria. As a consistent empiricist, what can you know?
All you can know is that there are and have been certain sensations. You cannot know whether there is any you who experiences those sensations--much less whether any other people exist who experience sensations. And I think, if [Hume] had been consistent in his empiricism, he would also have said you cannot really be sure whether there have been any sensations in the past; you can know only that certain sensations exist here and now (ibid pg 68).
Chisholm prefers particularism.
There are many things that quite obviously, we do know to be true. If I report to you the things I now see and hear and feel - or, if you prefer, the things I now think I see and hear and feel - the chances are that my report will be correct; I will be telling you something I know. And so, too, if you report the things that you think you now see and hear and feel. To be sure, there are hallucinations and illusions. People often think they see or hear or feel things that in fact they do not see or hear or feel. But from this fact - that our senses do sometimes deceive us - it hardly follows that your senses and mine are deceiving you and me right now. One may say similar things about what we remember (ibid pg 69).
Having the apples before us, we can look them over and formulate criteria of goodness. This procedure is the answer to the puzzle of the diallelus:
We have then a kind of answer to the puzzle about the diallelus. We start with particular cases of knowledge and then from those we generalize and formulate criteria of goodness - criteria telling us what it is for a belief to be epistemologically respectable.
Chisholm, a foundationalist, defeats the sceptic by accepting the immediate evidence of the senses as knowledge not requiring any further justification, despite the possibility of dreams, hallucinations or evil Cartesian demons. Thus accepted, we can weigh the evidence in particular cases to formulate and justify rational decision criteria.
The particularist response to relativism
But how does this resolve the issues with relativism? Relativists argue that societies justify their beliefs by reference to prevailing ‘epistemic norms’, and that they justify epistemic norms in terms of their contingent needs and interests. And since, according to the relativist, not even the possibility of cross-cultural metanorms exists, we cannot judge cultures by any absolute, non-question-begging standards.
However, irrespective of whether the non-comparable epistemic norm is science, revelation, or the oracle, each norm still has to perform against its expected outcomes in particular cases. And how society's epistemic norms perform against expectations in particular cases will therefore be of great interest to it. Consequently, it might be reasonable to subject each norm to a test of its performance against expected outcomes in particular cases. If any norm were to emerge superior in any such contest, then clearly some epistemic norms would be superior to others, and we would therefore have an absolute yardstick – a meta-norm – which spans the cultural divide. The relativist case evaporates.
Sankey writes:
It is entirely possible for the members of a community to justify their beliefs in terms of a set of norms that they possess. But for such norms to provide the beliefs with genuine epistemic support, the norms must themselves convey epistemic warrant. Where an epistemic norm fails to be a reliable indicator of truth, compliance with the norm fails to provide rational support for beliefs which comply with the norm.
(Sankey, ‘Witchcraft …” pg 13)
Sankey argues that ‘robust common sense’ underpins the above approach.
Boghossian's View
Boghossian takes issue with epistemic pluralism:
There are many different, genuinely alternative epistemic systems, but no facts by virtue of which one of these systems is more correct than any of the others.
(‘Fear of Knowledge’, page 90)
Boghossian goes on to say
Every epistemic system will have a possible alternative that contradicts it. Take any such contradictory pair. If one of them is deemed to say something false, the other will have to be deemed as saying something true. Under these circumstances, it’s hard to see how it could be right to say that there are no facts by virtue of which one epistemic system could be more correct than another.
(ibid. page 91)
This seems to support the view expressed by Sankey that relative performance against stated predictions in particular cases is the best way to evaluate epistemic systems. Clearly such a procedure judges an epistemic system by its own lights, since its users often base crucial decisions on such performance, and outcomes will therefore be of great interest to them. Here, then, is an absolute measure of epistemic systems where the relativist claims that none can exist.
Social Construction of Explanations
Further, social constructivism attacks rational explanation - the third of Boghossian's objectivisms. The attack takes the following form:
It is never possible to explain why we believe what we believe solely on the basis of our exposure to the relevant evidence: our contingent needs and interests must also be involved.
(‘Fear of Knowledge’, page 118)
This argument is usually described as an underdetermination thesis wherein the evidence is never enough to determine the explanation. For example, any number of polynomials will fit the same set of data points; the data points themselves cannot determine which of the polynomials to use.
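A small sketch of the polynomial example (the data points and coefficients are invented for illustration): the same three points are fitted exactly by the unique quadratic through them, and by infinitely many cubics, so the data alone cannot select the explanation.

```python
import numpy as np

# Three data points: the 'evidence'.
x = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 2.0, 5.0])

# The unique quadratic through the points (here x^2 + 1).
p2 = np.polyfit(x, y, 2)

# x(x-1)(x-2) vanishes at every data point, so adding any multiple of it
# gives a different cubic that fits the very same data exactly.
vanishing = np.poly([0.0, 1.0, 2.0])          # x^3 - 3x^2 + 2x

for c in [0.5, -3.0, 10.0]:
    p3 = c * vanishing + np.concatenate(([0.0], p2))
    assert np.allclose(np.polyval(p3, x), y)  # still fits all the points
    print(f"c = {c:5.1f}: cubic {np.round(p3, 2)} fits the same data")
```

Every value of c yields a polynomial agreeing perfectly with the evidence; something beyond the data must decide between them.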

Boghossian posits two forms of ‘explanation-constructivism’: strong constructivism where our interests always decide the issue and evidence never enters the picture, and weak constructivism where evidence enters the picture at least some of the time.

Strong explanation-constructivism contradicts itself: anyone asserting that prejudice underpins all beliefs must accept that prejudice underpins their own beliefs as well. Hence, we have no reason to accept such an idea.

But, according to Boghossian, weak explanation-constructivism seems a little more plausible. For example, Kuhn argued that ‘paradigms’ are the source of all explanations: paradigms are frameworks of accepted propositions about some part of knowledge. Scientific revolutions sweep away old paradigms and replace them with new ones. Kuhn believes that all knowledge is relative to some paradigm or other and that paradigms are incommensurable. This incommensurability stems from the fact that adherents of different paradigms ask different questions, speak different languages and live in different worlds.

Boghossian refers to these descriptions as ‘indefensible rhetorical excess’ (page 123) that conflates a difference in representation with a difference in the thing represented. If paradigms were truly incommensurable then, Boghossian argues, even partial translation between paradigms would be impossible, but even Kuhn admits that translation is possible; for example, Kuhn supplies many ‘examples of shared predictions that provided a basis on which to prefer rationally one theory to another’ (ibid. page 125). Translatability therefore undermines weak explanation-constructivism on grounds similar to the refutation of justification-constructivism in the last section.
Pierre Duhem identified a second form of underdetermination. He claimed that the outcome of an experiment depends not only on the explanatory theory but also on all the auxiliary equipment, methods and procedures used. If the outcome is not as expected, then the problem could lie with any of these factors. If we are stargazing, for example, and spot an anomaly, the problem might lie with the heavens or with the telescope. The theory cannot tell us which.

But in general we do have a way of dealing with such problems. We will rationally begin by identifying the weakest link. For example, unless we specifically think the telescope is at fault, we are far more likely to revise our astronomical theory.

Lessons for Progressives

Why do some progressive movements adopt anti-realist, constructivist views? Boghossian thinks it might be because constructivism supplies the philosophical resources to protect oppressed cultures from the charge of holding false or unjustified views (page 130). Well, it might, but this seems shortsighted. If the powerful cannot criticise the views of the oppressed because those views are reflections of cultural norms, then, by the same token, the oppressed cannot challenge the views of the powerful, for the same reasons. This leads only to the maintenance of the status quo - but in most cases the status quo is precisely what we must change.

Boghossian is correct when he states that things are independent of human opinion, and that we are capable of arriving at beliefs about how things are that are objectively reasonable and therefore binding on anyone capable of appreciating the evidence.



Tuesday, March 6, 2012

Cosmology

Did the universe have a beginning?

The big bang theory postulates that the entire universe originated in a cosmic explosion about 15 billion years ago. Such an idea had no serious constituency until Edwin Hubble discovered the redshift of galaxy light in the 1920s, which seemed to imply an expanding universe. However, our ability to test cosmological theories has vastly improved with modern telescopes covering all wavelengths, some of them in orbit. Despite the widespread acceptance of the big bang theory as a working model for interpreting new findings, not a single important prediction of the theory has yet been confirmed, and substantial evidence has accumulated against it.

Did the Universe Have a Beginning? Dr Tom Van Flandern

And Wikipedia tells us that

Cosmology is the discipline that deals with the nature of the Universe as a whole. Cosmologists seek to understand the origin, evolution, structure, and ultimate fate of the Universe at large, as well as the natural laws that keep it in order. Modern cosmology is dominated by the Big Bang theory, which brings together observational astronomy and particle physics.

In recent times, physics and astrophysics have played a central role in shaping the understanding of the universe through scientific observation and experiment. What is known as physical cosmology shaped through both mathematics and observation the analysis of the whole universe. It is generally understood to begin with the Big Bang, followed almost instantaneously by cosmic inflation - an expansion of space from which the universe is thought to have emerged ~13.7±0.2 × 10⁹ (roughly 13.5–13.9 billion) years ago.

Modern scientific cosmology is usually considered to have begun in 1917 with Albert Einstein's publication of his final modification of general relativity in the paper "Cosmological Considerations of the General Theory of Relativity," (although this paper was not widely available outside of Germany until the end of World War I). General relativity prompted cosmogonists such as Willem de Sitter, Karl Schwarzschild and Arthur Eddington to explore the astronomical consequences of the theory, which enhanced the growing ability of astronomers to study very distant objects. Prior to this (and for some time afterwards), physicists assumed that the Universe was static and unchanging.

Subsequent modelling of the universe explored the possibility that the cosmological constant introduced by Einstein in his 1917 paper may result in an expanding universe, depending on its value. Thus the big bang model was proposed by the Belgian priest Georges Lemaître in 1927 which was subsequently corroborated by Edwin Hubble's discovery of the red shift in 1929 and later by the discovery of the cosmic microwave background radiation by Arno Penzias and Robert Woodrow Wilson in 1964. These findings were a first step to rule out some of many alternative physical cosmologies.
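For reference, the inference from redshift to expansion in the passages above runs through Hubble’s relation – a standard textbook statement, not a quotation from the sources:

$$v = H_0\,d, \qquad z \equiv \frac{\lambda_{\mathrm{obs}} - \lambda_{\mathrm{emit}}}{\lambda_{\mathrm{emit}}} \approx \frac{v}{c} \quad (z \ll 1),$$

where $H_0$ is the Hubble constant, $d$ the distance to the galaxy and $z$ its observed redshift. Run the expansion backwards and everything converges on a dense origin – which is precisely the step the rest of this post questions.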

What is Wrong with This Picture?

Wikipedia also has an entry on Religious Cosmology, and if Eric Lerner is correct then the entry should also include a reference to Big Bang theory. According to Lerner, a ‘cosmological pendulum’ exists that swings through the course of history between a ‘scientific’ and a ‘mythological’ view of the cosmos (all references are taken from ‘The Big Bang Never Happened’, Lerner, E. 1991). On this view, the evolution of cosmology depends on the evolution of society. As society swings between periods of progress and periods of crisis – sometimes lasting centuries – approaches to the cosmos alternate between progress based on empirical observation on the one hand, and regress based on a priori deductive inference from mathematical axioms on the other.

Today Big Bang theorists see a universe much like that envisioned by the medieval scholars - a finite cosmos created ex nihilo, from nothing, whose perfection is in the past, and degenerating to a final close. The perfect principles used to form this universe can be known only by pure reason, guided by authority, independent of observation (Lerner, pg 6).

The first ‘deductive’ swing of the pendulum gave us the static and finite universe of Ptolemy, and the early Christian Church introduced the idea of creation from nothing, ‘decaying from a perfect beginning to an ignominious end’ (pg 7). But the pendulum swung back during the scientific revolution. In this, its empirical phase, cosmologists saw the universe as infinite in space and time, without origin or end; and, by the middle of the 19th century, the concept of universe was that of an unending process of evolution. But, in the early 20th century, the pendulum swung back in a startling return to ‘discredited medieval concepts’. The crises of the 20th century gave credibility to the ‘old philosophical view of a decaying universe, degenerating from its perfect origins, and to the deductive method.’ The new theory grew not from observation but from these pessimistic philosophical underpinnings. Hence, unsurprisingly, observation presents the greatest challenges to Big Bang Theory.

Empirical Challenges to the Big Bang Theory

Dr Tom Van Flandern has conveniently assembled the top thirty problems with the Big Bang Theory on his website. The following text shows the top ten:

1. Static universe models fit observational data better than expanding universe models. Static universe models match most observations with no adjustable parameters. The Big Bang can match each of the critical observations, but only with adjustable parameters, one of which (the cosmic deceleration parameter) requires mutually exclusive values to match different tests. Without ad hoc theorizing, this point alone falsifies the Big Bang. Even if the discrepancy could be explained, Occam’s razor favors the model with fewer adjustable parameters – the static universe model.

2. The microwave “background” makes more sense as the limiting temperature of space heated by starlight than as the remnant of a fireball.

3. Element abundance predictions using the Big Bang require too many adjustable parameters to make them work.

4. The universe has too much large scale structure (interspersed “walls” and voids) to form in a time as short as 10-20 billion years. The average speed of galaxies through space is a well-measured quantity. At those speeds, galaxies would require roughly the age of the universe to assemble into the largest structures (superclusters and walls) we see in space, and to clear all the voids between galaxy walls. But this assumes that the initial directions of motion are special, e.g., directed away from the centres of voids. To get around this problem, one must propose that galaxy speeds were initially much higher and have slowed due to some sort of “viscosity” of space. To form these structures by building up the needed motions through gravitational acceleration alone would take in excess of 100 billion years.

5. The average luminosity of quasars must decrease with time in just the right way so that their average apparent brightness is the same at all redshifts, which is exceedingly unlikely.

6. The ages of globular clusters appear older than the universe.

7. The local streaming motions of galaxies are too high for a finite universe that is supposed to be everywhere uniform.

8. Invisible dark matter of an unknown but non-baryonic nature must be the dominant ingredient of the entire universe. The Big Bang requires sprinkling galaxies, clusters, superclusters, and the universe with ever-increasing amounts of this invisible, not-yet-detected “dark matter” to keep the theory viable. Overall, over 90% of the universe must be made of something we have never detected.

9. The most distant galaxies in the Hubble Deep Field show insufficient evidence of evolution, with some of them having higher redshifts (z = 6-7) than the highest-redshift quasars.

10. If the open universe we see today is extrapolated back near the beginning, the ratio of the actual density of matter in the universe to the critical density must differ from unity by just one part in 10⁵⁹. Any larger deviation would result in a universe already collapsed on itself or already dissipated.




Einstein’s Role

One always has to be careful criticising Einstein since the line between criticism and anti-Semitism (which I abhor) is sometimes a fine one. However, apart from the Wiki entry, it is not obvious from the above what role Einstein played in the development of modern cosmology. First, Einstein’s equations of General Relativity underpin Big Bang Theory. Second, he did not derive his equations from observation. Third, it is not obvious what problem he was trying to solve.

Hilton Ratcliffe argues that ‘[it is] not that easy to find out precisely why – in a practical sense – Einstein objected to Newtonian physics, or whether in the cold light of day he made any real improvement to the situation’ (‘The Virtue of Heresy’, Ratcliffe, H. pg 260). He goes on to say that Einstein:

· Was concerned with the propagation of light in the absence of a physical medium – but that notion had already been answered in Maxwell’s equations.
· Had a big problem with ‘action-at-a-distance’ (especially gravitation) – but his proposed field solution still acts on objects at a distance from each other.
· Was understandably puzzled by wave-particle duality in light, but could not suggest a workable solution; it remains one of the most vexing questions in physics today.
· Raised his crucial objection against the notion of absolute, universal time and the simultaneity of events. We shall shortly see that in this regard Albert Einstein was being idealistically fanciful.

We will deal with Ratcliffe’s illustration of Einstein’s idealism (as opposed to realism) in the Anti-Realism post.

Was Einstein a victim of circumstances? Albeit, in some respects, a fortunate one. His General Relativity model had non-empirical, deductive origins. The equations of the model are non-linear and massively ‘underdetermined’ (more variables than equations; more degrees of freedom than constraints). And whilst there may be an infinity of mathematical solutions to such models, few solutions make sense physically. For example, a simple schoolroom quadratic might give two perfectly good mathematical solutions for a person’s age, one positive and one negative. But common sense tells us to dispense with the negative solution, because people do not have negative ages. Unfortunately, we have no ‘common sense’ about cosmology, so we are without a guide to the correct solutions. Lerner says:

In general, when equations describing physical reality produce singularities – solutions involving either zero or infinity – it is a sign that something is wrong, since scientists assume that only measurable, finite quantities should be predicted (Lerner, pg 134, my emphasis).
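A minimal worked instance of the schoolroom quadratic mentioned above (the particular equation is ours, chosen for illustration):

$$a^2 - 2a - 35 = 0 \;\Longrightarrow\; (a - 7)(a + 5) = 0 \;\Longrightarrow\; a = 7 \ \text{or} \ a = -5.$$

Both roots are perfectly good mathematics, but only $a = 7$ is a possible age; common sense discards the negative root. The point of the Lerner quote is that cosmology supplies no such common sense for pruning the solutions of general relativity.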

For the universe to expand it must be finite. But most scientists from the scientific revolution onwards believed the universe to be infinite both in time and space. Einstein laid the foundation for eliminating this idea. He assumed, against all the evidence then and now, that unstated forces distribute matter homogeneously throughout the universe. Given this and his idea that matter ‘curves’ the space around it, Einstein believed that the uniform distribution of matter would curve space right around itself. Hence, the universe, like the surface of a sphere, was finite but unbounded. So, based on the idea of curved space, Einstein’s equations laid the foundation for a finite, but expanding universe, and thereby also laid the foundation for the Big Bang.       

However, Einstein’s equations are unstable: left to itself, Einstein’s universe would quickly collapse under its own gravity. To offset this, Einstein introduced an ad hoc fudge factor – a repulsive term, the cosmological constant – to neutralise the contractionary effect of gravity. This stabilised Einstein’s universe and made it static. Einstein later admitted that this was the biggest mistake of his professional career.
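In standard notation – a textbook statement of the modification, not a quotation from Lerner – Einstein’s amended field equations read

$$G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu},$$

where $\Lambda$ is the cosmological constant. With $\Lambda = 0$ the homogeneous solutions contract under gravity; Einstein tuned $\Lambda$ so that its repulsion exactly balanced gravitational attraction, yielding a static universe. But the balance is that of a pencil stood on its tip – perturb it and the model collapses or expands.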

But a Belgian Catholic priest, Georges Lemaître – a student of Eddington, the English physicist who led the 1919 solar eclipse expeditions that claimed success for Einstein’s general theory of relativity – realised that the instability of Einstein’s universe was a gift from God. Lemaître showed that

Einstein’s universe is only one special solution [of the equations of general relativity] among infinite possible cosmologies – some expanding, some contracting, depending on the value of the cosmological constant and the ‘initial conditions’ of the universe (Lerner pg 133).

Adjust the value of the cosmological constant and, as if by magic, one has an expanding universe. And if the universe is expanding, then it must have started somewhere and somewhen. The somewhen was the Big Bang. And the somewhere must be the centre of the universe. And since Hubble’s red shifts indicated that everything seemed to be moving away from Earth, the Earth must clearly lie at the centre of the universe. Of course, neither scientific concerns nor the ad hocery of Einstein’s cosmological constant motivated Lemaître; theological concerns were his primary motivation. Lemaître believed that he had ‘scientifically’ confirmed nothing less than the Biblical creation itself.

But the evidence contradicted Lemaître’s Big Bang Mark I, and it subsequently collapsed, as did Gamow’s Big Bang Mark II, which followed at the end of the Second World War. Big Bang Mark III was born of theories about the cosmic microwave background and the relative abundances of three light elements – helium, deuterium, and lithium. Big Bang Mark IV (today’s Standard Model) incorporates an ad hoc theory of inflation which overcomes some of the inconsistencies of the earlier theories (Lerner, pg 156).

For the universe to be finite, which it must be in order to expand, there must be enough matter, sufficiently homogeneously distributed, to cause space to curve back on itself. The problem is that all the matter in the universe amounts to around 0.02% of that needed to cause such curvature, and what there is, is certainly not homogeneously distributed. Standard Model cosmologists therefore, again ad hoc, posit the existence of dark matter and dark energy to account for the missing 99.98%. What could such matter be? Cosmologists have sought help from particle physicists to see if they can identify the portion of the universe that appears to be missing. Particle physicists are busily searching for this strange dark substance, which no one has ever observed and which has none of the properties of ordinary matter, but which Standard Model theorists desperately need to save Big Bang Mark IV from itself.
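The bookkeeping behind the ‘missing matter’ claim runs through the standard critical density – textbook relations, not Lerner’s notation:

$$\rho_c = \frac{3 H_0^2}{8 \pi G}, \qquad \Omega \equiv \frac{\rho}{\rho_c},$$

where $H_0$ is the Hubble constant and $G$ Newton’s gravitational constant. Space curves back on itself only if $\Omega > 1$; the matter actually observed falls far short of that, and the dark components are the quantities posited to close the gap.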

Scientists can transform matter and energy into one another, but they cannot create either from nothing. Consequently, particle physicists are searching for the Higgs Boson, a magical sub-atomic particle, existing in a vacuum, which ‘generates all the needed energy from nothing’ (Lerner, pg 158). Wikipedia has this to say:

The Higgs boson is a small theoretical particle, which (if it exists) is created by a Higgs field. It is necessary for a set of rules in physics that we call the Standard Model, but it has yet to be found in an experiment. If the results of the work at CERN cannot show that the Higgs boson exists, much of our entire understanding of physics will need to be re-written. It is important in the scientific world because many scientists believe that it is responsible for giving mass to all known particles that have mass.

(My emphasis)

Cosmology and particle physics seem to have got themselves in a bit of a tangle. Lerner shows brilliantly how cosmological speculation mirrors the financial speculation of the periods in which it occurs:

Throughout the decade [the 1980s], the rise of financial speculation in Wall Street was shadowed by the rise of cosmologists’ speculation in Princeton, Cambridge, and elsewhere. As Witten and his colleagues were acclaimed by the press as geniuses for theories that produced not a single valid prediction, so men like Michael Milken and Donald Trump earned not only far greater fame but also incomes that peaked, in Milken’s case, at half a billion dollars per year for paper manipulations that added not a single penny to the nation’s production (Lerner, pg 166).

Prescient in the light of the CERN Large Hadron Collider’s failure to find the Higgs Boson, Lerner goes on to say:

If a tower of financial speculation could be built on debt – the promise of future payment – then, similarly, a tower of cosmological speculation could be built on promises of future experimental confirmation (Lerner, ibid.).

When cosmological speculation sits happily in the shadow of contemporary ideology; when it proceeds from the premise that the theory is so mathematically perfect that theorists must make the facts conform to it or ignore them altogether; when cosmologists continually make ad hoc adjustments analogous to Ptolemy’s epicycles; it seems clear that cosmological ‘science’ has indeed retreated from Copernicus back to Ptolemy or even earlier.