Koyaanisqatsi: Is Technology Really So Separate From Nature?

My first exposure to Godfrey Reggio's 1982 time-lapse masterpiece was at an exhibition at the Victoria and Albert Museum on the 'Post Moderns'. It featured the now universally recognisable accelerated footage of taillights pumping through the city to the rhythm of alternating traffic flows, creating an eerily arterial display. What was interesting about the use of this footage in this particular exhibition was that it was shown under the rubric of the death of futurism and the birth of dystopia, sandwiched as it was between clips of the bleak futuristic skyline of Blade Runner (which I must admit has a beguiling beauty all of its own) and chaotic images of the Tokyo Stock Exchange. While footage from Koyaanisqatsi, complete with the stark minimalist composition of Philip Glass, did not feel out of place in this exhibition, I couldn't shake the notion that there was more to it than merely a bleak vision of man's conquest over the Earth. This became more apparent when I watched the film in its entirety. Read More…

Human Nature, Markets and the Misrepresentation of Ayn Rand

Ayn Rand, free markets, cultural Marxism

The late Ayn Rand is one of Western society's most polarizing thinkers. The mere mention of her name is likely to elicit a strong response from either side of the left/right political divide; such was the force of her ideas.

As an outspoken opponent of welfarism and charity, and a staunch defender of free markets, Rand is often characterised (unfairly, in my opinion) as a cruel proponent of ruthless free market capitalism where the strong thrive and the weak wither.

This demonization of Rand means that the value of her work and the lucidity and prescience of her arguments are largely unrecognised. This is a real pity, because her ideas have enormous practical value today. Her treatment of human nature in particular, influenced by Aristotle (a favourite of mine), has always struck me as incredibly perspicacious, especially for someone living in a Western culture suffused with "progressive" political ideologies, which often deny the existence of human nature and, as a result, misconstrue the mechanics of free markets and the morals of those who operate within them.

Central to Rand's concept of human nature is that mankind possesses reason: the capacity to rationally evaluate the world around him. This faculty is absolutely essential, because without it man would not be able to act in accordance with reality, and as such would fail continuously in any endeavour he embarked upon. Those who are able to grasp reality most accurately are able to perform wondrous technological feats, produce penetrating works of art and philosophy, and create viable, prosperous businesses.

This was why she called her philosophy “Objectivism”, as she believed the truth was non-relativistic – and most importantly, objectively comprehensible to man. With our advanced reasoning capabilities, we are able to discern the fundamentals of existence.

Free markets are an essential component of this worldview, because in this vision of the untrammelled marketplace an individual's ability to reason is the primary determinant of his success: he is guided by his own powers of observation and not by the prescriptions of any external governmental authority.

The problem with the latter is that its prescriptions may not conform to what is, but rather to what it believes should be, regardless of what is reasonable or realistic.

This is precisely why communism has failed in every country that has tried to eliminate the free market completely. As someone who left Soviet Russia for the United States at a young age, Ayn Rand experienced this folly firsthand.

In our contemporary Western societies, which usually contain a mixture of free-market capitalism and welfarist socialism, government-funded entities operate as a countervailing force to the private sector. The public sector compensates for inequalities in family life, education, personal wealth and employment opportunities.

Because the pursuit of equality is regarded as a moral imperative, the activities of the private sector, particularly corporations, are often perceived as lacking in morality and are criticised for not being “inclusive” enough. In some countries this has resulted in “diversity” legislation and/or punitive taxation.

More recently, Silicon Valley has come under attack for its supposed “elitism” and there are calls to reform its “corrupt culture”, which apparently deliberately excludes people on the basis of their gender.

What this criticism fails to recognise, however, and what Ayn Rand's philosophy explains so beautifully, is that the success of Silicon Valley depends on its ability to reason, not on bigotry. Its very survival depends on it: ergo, it hires the best technology professionals available and does not consider those who lack the requisite skills. Because we live in cultures whose ideologues are obsessed with "equality", this simple logic is often misunderstood or mistakenly attributed to malice.

I'm certainly not a Randian disciple. Though I agree with aspects of her thought, I do not think that free markets always result in perfect competition, as she did, and my conception of power and the state is different (more on that in a future article). But I do rue the fact that she has become a vilified figure, and that her philosophy, which contains so much wisdom and insight into the human psyche, is perceived as evil. It indicates that the truth has been distorted by political ideology, which proves her argument and leaves our culture the poorer for it.

Featured Image Credit: Portrait of Ayn Rand (Date Unknown) by Phyllis Cerf. Source: Wikicommons. Design by Imagine Athena

Da Vinci, Polymaths and the Art-Science of Innovating the Future

Da Vinci Sketch

Redesigning art, science and mankind.

Leonardo Da Vinci had an impressive CV: painter, sculptor, architect, musician, mathematician, engineer, inventor, anatomist, geologist, cartographer, botanist, and writer. The period in which he flourished, now known as "The Renaissance" ("The Rebirth"), was a time of extraordinary experimentation. Inspired by the intellectual curiosity of their ancient Greek and Roman forebears, Renaissance thinkers combined art and science in novel ways, the boundaries between the disciplines being more fluid than they are today. It was from such voluminous expertise that the ideal of The Renaissance Man arose: an individual with many creative gifts who cultivated a wide range of scholarly interests.

This period of history is analogous to our own, as we enter an increasingly technological era in which the border between man and machine is gradually disappearing.

The Divine Proportion

Da Vinci's iconic Vitruvian Man (1490) has become an everlasting symbol of The Renaissance and of its dual commitment to artful wonder and scientific rigour. In creating Vitruvian Man, Da Vinci synthesized information from anatomy, architecture and physics into what he believed was an overarching theory of the universe.

As the Encyclopaedia Britannica describes it:

“Leonardo envisaged the great picture chart of the human body he had produced through his anatomical drawings and Vitruvian Man as a ‘cosmografia del minor mondo’ (cosmography of the microcosm). He believed the workings of the human body to be an analogy for the workings of the universe.”

 

Vitruvian Man (1490)

Da Vinci was exacting in his approach to aesthetics. He illustrated a book on mathematical proportion in art, De divina proportione (1509).

He also kept detailed notebooks in which he methodically recorded his observations of the natural world. From his notebook:

“The lights which may illuminate opaque bodies are of 4 kinds. These are: diffused light as that of the atmosphere… And Direct, as that of the sun… The third is Reflected light; and there is a 4th which is that which passes through [translucent] bodies, as linen or paper or the like.”

Da Vinci applied this knowledge to art-making. His groundbreaking painting The Lady with an Ermine (1483) contrasted varying degrees of light and shade to create depth of perspective in a way rarely achieved before.

His use of light in later works such as The Mona Lisa (1503-17) forever changed how artists used light in their paintings.

Left: Mona Lisa (1503-17). Right: Lady with an Ermine (1483)

 

He also turned his meticulous hand to cartography, creating maps that were visually detailed and precise, which was unusual for the time. His 1502 plan of the Italian town of Imola, near Bologna, was unparalleled in this regard.

Town Plan of Imola (1502)

Varying Octaves

It was this ability to successfully blend art and science that made Da Vinci the innovator he was.

A true man of The Renaissance, his tune was not monotonous. He played a range of chords during his lifetime, producing novel harmonies and unique melodies.

In the early 21st century, such types are harder to find. In 2009, Edward Carr, writing for the magazine Intelligent Life, argued that polymaths are "an endangered species". This is due in large part to universities worldwide favouring specialisation in one particular area, with generalists regarded as lacking the commitment or depth of knowledge to be considered a real authority in their field.

But what, Carr wonders, about the blow this deals to innovation?

“The question is whether their loss has affected the course of human thought. Polymaths possess something that monomaths do not. Time and again, innovations come from a fresh eye or from another discipline. Most scientists devote their careers to solving the everyday problems in their specialism. Everyone knows what they are and it takes ingenuity and perseverance to crack them. But breakthroughs—the sort of idea that opens up whole sets of new problems—often come from other fields. The work in the early 20th century that showed how nerves work and, later, how DNA is structured originally came from a marriage of physics and biology.”

Indeed, Francis Crick, one of the two men credited with uncovering the double-helix structure of DNA, began his career in science as a physicist. By applying methods he had learned from physics, he was able to approach what was considered the "holy grail of biology" in a new and effective way. He and his research partner, James Watson, focused all their attention on working out the physical configuration – "the physics" – of DNA before ascertaining its purpose. Later scientists were able to do exactly that by building on the valuable work of Crick and Watson.


It is at the intersection of disciplines where creative catalysis happens.

 

Renaissance Woman

However, as the 21st century progresses, and with the recent advent of transformative technologies like 3D printing, polymaths like Neri Oxman may have the opportunity to thrive once more.


 

With a background in both medicine and architecture, MIT-based technologist Neri Oxman has pioneered new methods of designing and manufacturing construction materials.

In a talk she gave at the annual PopTech conference in 2012, she described the philosophical approach behind her work:

“Ask not what science can do for design but what design can do for science.”


Neri Oxman (2012). Source: Wikimedia Commons

Informed by the way nature "designs", Oxman uses 3D printing to create one-of-a-kind artifacts whose material and anatomical structure mimic the biological entities they are modelled upon.

One of her works, Minotaur Head with Lamella, exhibited at the Centre Pompidou in Paris in 2012 as part of the "Design and Mythology" collection, is "a shock absorbing flexible helmet" designed to:

 

"…flex and deform in order to provide comfort and high levels of mechanical compliance. The head shield introduces variable thickness of the shell, informed by anatomical and physiological data derived from real human skull data. Medical scan data of a human head is selected from an open repository. Two sets of data are created and trimmed from the scan using medical imaging software simulating the hard tissue (skull) and the soft tissue (skin and muscle). Combined, these two data sets make up the bone-to-skin threshold informing helmet thickness and material composition according to its biological counterpart such that bony perturbations in the skull are shielded with soft lamellas designed as spatial sutures." (MIT Media Lab)

 

Minotaur Head with Lamella (2012) Source: MIT Media Lab / Neri Oxman Projects
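To make the bone-to-skin idea above a little more concrete, here is a minimal illustrative sketch in Python. It assumes two already-registered point sets sampled from the skull and skin surfaces; the function name, the linear depth-to-thickness mapping and the fabricated demo data are assumptions of mine, not Oxman's actual pipeline or the Media Lab's code.

```python
import numpy as np

def lamella_thickness(skull_pts, skin_pts, min_t=2.0, max_t=12.0):
    """Toy bone-to-skin threshold map (illustrative only).

    skull_pts, skin_pts: (N, 3) arrays of corresponding points on the
    hard-tissue and soft-tissue surfaces, assumed already aligned.
    Returns a per-point padding thickness in millimetres: where bone sits
    close to the skin (a "bony perturbation"), the helmet gets more soft
    material; where natural tissue is already thick, it gets less.
    """
    tissue_depth = np.linalg.norm(skin_pts - skull_pts, axis=1)
    # Normalise depth to [0, 1]; invert so shallow tissue -> thick lamella.
    norm = (tissue_depth - tissue_depth.min()) / np.ptp(tissue_depth)
    return max_t - norm * (max_t - min_t)

# Fabricated stand-in for real medical scan data.
rng = np.random.default_rng(0)
skull = rng.normal(size=(1000, 3)) * 80.0              # rough point cloud (mm)
skin = skull + rng.uniform(3.0, 15.0, size=(1000, 1))  # 3-15 mm of soft tissue

thickness = lamella_thickness(skull, skin)
print(f"thickness range: {thickness.min():.1f}-{thickness.max():.1f} mm")
```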

3D printing presents many collaborative opportunities for art and science.

From now on, we will experience far greater integration between technology and art than ever before: the emergence of art-science. This is already happening with enormous success in digital publishing and game design, where many modern-day Da Vincis are to be found.

…………………………………………………………………………………

See our list: Game Developers at the Philosophical Frontier
…………………………………………………………………………………

Is Mankind the Next Great Design Project?

It is a significant historical period we are living through. New cultural paradigms are being forged as human lives become ever more entwined with technological processes.

The futurist and trend forecaster Ray Kurzweil has long predicted that in the 21st century humanity will "transcend" biology by merging with technology. In his best-selling 2005 book The Singularity Is Near: When Humans Transcend Biology, he predicts that humans will be routinely augmenting their bodies and intelligence with technology by the year 2045.

Though we have yet to develop technology that actually merges with our bodies, we are certainly becoming more reliant on it to mediate and organise our daily lives, with mobile devices serving as intellectual prosthetics. Complete physical integration does seem like the logical next step.

But is this more science fiction than fact? Writer and technology entrepreneur Jaron Lanier has referred to the cultish nature of "The Singularity" concept, which is often treated with religious reverence by its adherents.

It is a vision of the future worth paying attention to, though; as Lanier warns, "these ideas [have] tremendous currency in Silicon Valley; these are guiding principles, not just amusements, for many of the most influential technologists."

Indeed, Ray Kurzweil has now been appointed Director of Engineering at Google, giving him the opportunity to realise many of his prophecies.

Rebirth

As it was during The Renaissance, ours is a highly innovative age, a golden era for those interested in collapsing the historical barriers between art and science.

3D printing, Web 2.0 and mobile technology supply us with a glimpse of our digitally consolidated future.

We are experiencing reinvention as technology fuses with many aspects of everyday life.

The coupling of art and science in The Renaissance gave birth to Da Vinci, its archetypal man.

Our time may even see the human race reborn.

Featured Image Credit: Female Head by Leonardo Da Vinci Source: WikiPaintings

On Netflix Now: Henry Ford: American Experience

On Netflix Now is a new series reviewing dramatic feature films and documentaries currently available on Netflix.

Scientific progress has wrenched us from our shackles to nature. Though human beings are of the earth, we are not entirely subject to it. In fact, we make every effort possible to transcend it.

Over the course of the last one hundred years we have moved ever further from the pastoral lives of our agrarian forebears, whose simplicity appears quaint and remote to today's homo urbanus, his thought processes and moral landscape governed by mechanised systems and structures. Thomas Hobbes's brutal state of nature, where "all battle all", has been carefully concreted over with rational systems of law, commerce, politics, transport and medicine.

However, just as stubborn green weeds occasionally force their way through the cracks in the pavement, so the antediluvian aspects of the human psyche resist all attempts to subdue them.

Mankind may have replaced galloping hooves with faster forms of transport powered by combustion engines, but he can never fully outpace himself. He is forever pursued by his own animal lust, the desire to compete, to kill and to exercise dominion over the weak.

Two films on Netflix now explore this frustrated tussle with nature.

Both set in the early twentieth century, Days of Heaven (1978) and Henry Ford: American Experience (2012) examine the origins of our displacement from a pastoral existence to a metropolitan one.

It was men like the entrepreneur Henry Ford who directed humanity along this inexorable course of action.

PBS's documentary of his illustrious life is both extensive and informative: an honest analysis of a brilliant but complicated man whose creative abilities were matched only by his tyrannical tendencies.

The two-hour film follows a traditional biographical format, combining archival footage and voice-over narration, and tracing his early life and career through to his later successes and failures.

Ford lived to eighty-three, so this is a long, complex story that requires a significant time commitment from the viewer, but it is deftly told, doesn't drag at all, and offers surprising bits of information that keep your attention.

Though forward-thinking in many ways, Ford was primitive in his dealings with others. He thought nothing of draconian displays of pure bestial force.

Implacably convinced of his moral superiority, he attempted to impose his puritanical ideas on society. He published and distributed anti-Semitic material, devised real-life sociological experiments on poor Brazilian villagers in the jungle, and cruelly berated his son Edsel Ford, President of the Ford Motor Company, whose drinking and smoking habits Ford abhorred.

As a business owner, he created a steep pyramid structure at the Ford Motor Company with himself alone at the top. He answered to nobody and everybody was answerable to him. During times of labour unrest at the factory he condoned the violent suppression of strikes and refused to negotiate with unions or employees.

He also believed his iconic Ford Model T to be the pinnacle of automotive innovation and viciously blocked attempts by his son to modernize the company.

This tension between the two Fords, futurist inventor and primal subjugator, is the most fascinating theme explored in the documentary.

It ends with a thoughtful pause in the Brazilian jungle, the camera poring over dense thickets of wild vegetation, the inscrutable natural world that Ford did so much to tame, but was unable to conquer in himself.

Terrence Malick's critically acclaimed drama Days of Heaven (1978) covers roughly the same period of history, but is located at the other end of the economic spectrum: the uncertain world of American labour in 1916… (read more)


By The Rivers of Babylon We Sat Down and Wept

9/11: an event that ripped a hole in the fabric of our history. The time before it seems idyllic.

Whilst pondering the historical significance of 9/11, we came across this rather beautiful a cappella version of Babylon by the American singing trio Mountain Man.

While some may see the choice of song here as a political point ('Zion' may seem a loaded term, after all), it seems appropriate now more than ever, owing to its evocation of past and present, as we look back on a decade of war and take a moment to consider our place in history. So we ask you not to consider Zion in such narrow terms, but to apply it more broadly: as representative of a bright, shining past that will forever elude us. Zion is a lost innocence, this generation's at least.

The twenty-first century has, for the most part, been defined by contradictions between what we thought about our human future and what we have found awaiting us. After a century of seemingly relentless war and the threat of nuclear annihilation, we seemed poised, in the 1990s, for a greater future. The fall of the Soviet Union appeared to bring an end to the age of great-power competition, whereupon Francis Fukuyama declared the 'End of History'. New technology was causing the world to shrink and the march of globalisation swept across the globe.

Then, at the dawn of the new human century, history came crashing down on us again, out of a clear September sky. It seems both tragic and fitting that we faced an enemy that wished to achieve a twisted parody of what some had thought we had achieved already: an end to history, a return to an imagined perfect epoch.

What has been achieved since then? Can the human race claim to be any more unified? The old Western order strains to hold itself together, fractured by economic chaos and endless war. The cradle of civilisation is racked by revolution and counter-revolution and our hatreds and misunderstandings seem more fervent than ever. We gaze, fearful but resigned, into the abyss.

The events of the last decade embody the state of timeless historical perpetuity to which Robin Jones refers in his piece about that blackest of Septembers. What is to become of us? The answers may surprise us. Our idea of history is more realistic now and more conducive to building a future from the fragments of our present. Our civilisation has not fallen; we have limped on in spite of our frailties. New discoveries have in many ways defined the 2010s, our artists and dreamers continue to cultivate great beauty and the human capacity for great love and empathy remains undiminished.

Francis Fukuyama was wrong, but so were the men who sought a return to the past. History cannot be brought to an end nor can it be revived. The goals of those men were impossible and thus their failure inevitable. There is no deterministic force called ‘history’ and the future is an untold story of which we are the narrators. Most of all there is no past, only the memory of what is lost.

Is University a Racket?

How valuable is a University education these days?

In the United States and England, the cost of university has risen sharply over the past sixty years, leading to increased indebtedness among graduates. At the same time, a depressed job market has forced many young graduates into low-paid, non-graduate work, while alternative educational opportunities have proliferated online, generally cheaper and requiring a lower time investment than a traditional education. [1]

The American venture capitalist Peter Thiel has been a particularly vocal critic of the modern higher education system. His criticisms largely align with all of the above; his biggest concern is that higher education is a bubble.

His reasons:

1. Rising costs of education (and the attendant debt) with no discernible improvement in quality.

2. Acquiring a university degree is more about social signaling than actual learning.

What this means is that a large number of graduates pursue degrees for fear of being socially stigmatized as unintelligent and/or lazy. This herd mentality means that people do not think through the value of a university education for themselves and are instead compelled, through social pressure, to pay premium rates for an overpriced product; leaving them, Thiel argues, at the end of a lengthy university education with considerable, paralysing debt – costs which they might never totally recoup.

This runs contrary to the conventional wisdom, which says that graduates earn higher incomes than non-graduates and can expect, over time, for their investment in education to pay off.

The historical data certainly bears this out. The Fusion Network recently published a fantastic interactive chart (factoring in the opportunity costs and debt of a four-year degree) which demonstrates the differences in lifetime earnings in the US between graduates and non-graduates: on average, graduates earn more over time.

Similarly, in the UK, the average graduate can expect to outperform non-graduates in lifetime earnings. In fact, the economic outlook for the average non-graduate with only a GCSE-level qualification (21% of the population) is pretty bleak: by the age of 32 they can expect their annual salary to peak at £19,000. For those whose highest qualification is A-levels (21% of the population), prospects are only marginally better: a maximum of £22,000 a year by age 34.

Compare this to the average graduate (38% of the population), whose annual income peaks at £35,000 at age 38.[2]
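To see how such comparisons work once opportunity cost and debt are factored in, here is a minimal, purely illustrative sketch in Python. The peak salaries are the figures quoted above; the starting salaries, the tuition-and-interest cost, the linear salary ramp and the working lifetime are invented assumptions of mine, not data from the Fusion chart or the UK statistics cited here.

```python
# Toy lifetime-earnings comparison (illustrative only, not real data).

def cumulative_earnings(start_salary, peak_salary, peak_age, start_age,
                        retire_age=65, upfront_cost=0):
    """Ramp salary linearly from start_age to peak_age, hold it flat until
    retire_age, and subtract any upfront cost (e.g. fees plus interest)."""
    total = -upfront_cost
    for age in range(start_age, retire_age):
        if age < peak_age:
            frac = (age - start_age) / (peak_age - start_age)
            salary = start_salary + frac * (peak_salary - start_salary)
        else:
            salary = peak_salary
        total += salary
    return total

# Hypothetical figures (GBP): the peaks come from the article, the rest are guesses.
gcse     = cumulative_earnings(12_000, 19_000, peak_age=32, start_age=18)
a_level  = cumulative_earnings(14_000, 22_000, peak_age=34, start_age=18)
graduate = cumulative_earnings(18_000, 35_000, peak_age=38, start_age=21,
                               upfront_cost=40_000)  # assumed fees + interest

print(f"GCSE only: £{gcse:,.0f}")
print(f"A-level  : £{a_level:,.0f}")
print(f"Graduate : £{graduate:,.0f}")
```

Even in this crude model, the graduate comes out ahead over a working lifetime despite three lost earning years and the upfront cost, which is the conventional wisdom the historical data supports.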

Of course, the biggest problem with this data (one the statisticians openly acknowledge) is that it is backward-looking. It examines the lives of people who made career decisions over twenty years ago, under a very different set of historical circumstances.

Can we always use the past to accurately predict the future?

To emphasize this point, Peter Thiel draws a comparison, in this Intelligence Squared debate, with sub-prime mortgages: in 2005, the historical housing data would have told you that house prices always rise.

That is a sobering thought indeed, and it does seem to be the case that many of today's indebted graduates are struggling to establish themselves, delaying marriage, family and mortgages for longer than previous generations did.

The other important thing to consider, Thiel argues, is that the onerous debt graduates are forced to take on prevents them from participating in risky entrepreneurial ventures, limiting their range of potential career choices.

It also means society, overall, is deprived of high-value innovation.

Of course, many technological innovators don't have degrees. No doubt this partly informs Thiel's view. You don't need a degree in computer science to learn how to code.

But what about the unquantifiable benefits of university?

Is it not valuable simply on its own terms? A university education provides access to history’s great minds and ideas, an intellectual experience that you won’t get anywhere else.

Although this is certainly not a misleading argument, it may have been stronger in the pre-internet era. Nowadays, young people have an unprecedented level of access to information. There is a plethora of educational resources online, most of which are significantly cheaper than the average 3-4 year bachelor's degree, although the actual market value of these qualifications has yet to be conclusively proven. They are too new.

Of course, there are some degrees which will probably always require long periods of skill acquisition and study: medicine or architecture, for instance.

But do you need to make a similar time and financial investment if you’re pursuing a degree in business or graphic design? That’s less certain.

In fact, Charles Murray, co-author of the controversial book The Bell Curve (1994), argues that the bachelor of arts degree is a meaningless qualification that signifies nothing concrete to a potential employer. You're better off, he says in the same Intelligence Squared debate as Peter Thiel, getting a professional diploma of some kind, which can take as little as six months to complete but equips you with practical, recognizable skills for the job market.

……………………………………………………………………………………………………..

Purely Academic?

Robin Gilbert-Jones

It is a common cliché that after each educational milestone, at the start of the next phase, be it middle school, high school, university or postgrad, you are told to forget everything you have hitherto learned about your discipline and start fresh. The previous phase was just an academic, hurdle-jumping exercise to get you to the "real" stuff. I always felt like that to an extent when moving on to a new stage in my education, mainly due to the difference in structure and teaching style, but never so much as when I first attempted to enter the working world.

I spent a year trying to break into the job market after my undergraduate degree before eventually realising that if I was going to get anywhere I needed a higher level of education. During this time I spent six weeks in an unpaid internship at a think tank in London, where the Director asked me about my career aspirations. When I mentioned possibly going into academia, I was told in no uncertain terms: "don't become an academic, the whole bloody thing is so corrupt".

So has it become little more than a racket?

Some degrees lead you to a job (the real world), others just to more academia (the opposite); in which case, if you want to enter the real world, you have to struggle onto the ladder and leave your degree behind. The trope of referring to the career ladder as the "real world" is not entirely unjustified when you consider what the academic world is. When you describe a position or hypothesis as purely "academic", you mean it is conjecture, of no consequence. Academia teaches you, in some sense, how to learn but not how to live and survive. After the first few months of applying for jobs and attending interviews, I began to get the impression (unfairly, I think) that employers saw undergraduate and postgraduate degrees as four years spent out of the real world. That is not to say that the qualification itself is without merit, or that it is not useful as a prerequisite to something else.

I can certainly rise to the defence of my master's as far as its content is concerned: it was one of the most challenging and enjoyable experiences of my life, but the sense in which it was a box-ticking exercise was still ever-present. What I do know is that my undergraduate degree was not sufficient to get my foot in the door for the kind of work I was applying for, so ultimately I had to invest in a bigger foot. I went to university at a time when ex-technical colleges were being converted into universities and Blair's New Labour, in their infinite wisdom ("education, education, education"), had decided that a university degree was something everyone should have. By the time I finished my undergraduate degree in 2006, a humanities degree didn't mean what it used to, and there were all kinds of anecdotes flying around about PhD graduates applying for plumbing internships and the like.

So then the next dilemma presents itself: do I struggle into the working world from the bottom, or do I put it off further to get more qualified, at the risk of another year spent out of the job market? While I don't regret my decision, my timing was appalling: I finished my master's in September 2009, in the bleak nadir of the economic crisis, and spent eight months temping in admin jobs before a research consultancy took a chance on me. And while I am sure my MA added a great deal to my CV, an interview situation is a great playing-field leveller. Having now recruited employees myself, I know that the degree is largely only a preliminary consideration.

And then you have arrived, and that is that. Nobody wants to know about your degree when you have been working in your field for four years – there are much more important considerations. Why is this? Well, this problem is more acute with humanities and other content-based disciplines – if you study nursing you will most likely become a nurse, engineering an engineer, and so on. But the only way to truly follow a humanities degree through to career level is to go into academia. I know many English and History graduates, and none of them are English professors or historians.

……………………………………………………………………………………………………..

Harvard Uber Alles

There is still enormous value, though, in acquiring a degree from a prestigious institution.

Graduates of Harvard University are some of the wealthiest and most influential people in the world. It counts more billionaires amongst its alumni than any other Ivy League institution and has educated more US presidents than any other university.

A powerful selling point, a degree from Harvard confers a great deal of social capital on its graduates, who are perceived to be intelligent, hard-working and competent. It sends an extremely strong signal, as do qualifications from other elite institutions.

Caveat Emptor

Conversely, a degree from a poorly perceived university can send the opposite signal.

This is where the opportunity and financial costs of university are almost impossible to justify. In fact, they may even be harmful.

There is no refund for a degree from a university that may actually damage your prospects in the job market. Undiscerning students, hopeful that merely possessing a degree will be enough irrespective of its quality, need to be made aware of this.

Universities are somewhat culpable in this regard. It is not only that they exploit the hopes and dreams of inexperienced eighteen-year-olds; the enormous rise in fees is also largely down to the fact that universities use them to cross-subsidise research activities. In effect, students are not paying for the product they're being sold.

The universities are behaving rationally, because they are ranked according to their research output, but this presents a huge conflict of interest when selling a very expensive degree to a prospective student. The phrase "ripping off" comes to mind when the actual market value of that qualification is low.

So, is it worth it?

It's usually pretty lame to conclude an essay with a wishy-washy "time will tell" or "it's all relative anyway."

Except in this case.

We don’t know yet whether higher education is a bubble, as Peter Thiel argues it is. That will only become apparent later. Right now, prospective students have to trust historic data. That is their decision. A risk they must be prepared to take.

The value of nascent online opportunities is also not yet clear.

The only reliable advice that can be given to academic hopefuls is to resist social pressure and really examine the product they’re purchasing. Make sure it’s right for them.

Because, unlike any other large purchase, this one cannot be refunded, sold or traded. You are wedded together, for better or worse.

[1]

(Important to mention: this discussion is mainly relevant to people who have studied in the US and the UK, where more people go to university and the positional advantage of a degree is therefore more interesting than in developing countries, where the general level of educational attainment is low and university graduates form a small proportion of the population.)

 

[2]

We won't get into the cost-of-living debate; it is too complex and wide-ranging to go into here, but it might make a good future topic.

Lego, Kurt Vonnegut and the Lost Art of Touching Things

The other morning I awoke on a callous couch with a jolt, a half-empty jar of Nutella in one hand and, in the other, an almost indecipherable communiqué magic-markered across page 56 of my copy of Slaughterhouse Five: "We will never ever, EVER! touch art again", it slurred.

Beguiled but puzzled, I churned another glob of Nutella off a dirty salad ladle into my mouth, and attempted to decode this anomalous phrase. At first I wasn’t too sure Kurt Vonnegut had much to do with it, although for half a moment, I could imagine his spirit exiting the dusty library of the netherworld, and possessing my stupor, taking my sugar-driven hand to heed some enigmatic warning to the world.

Listen: my son and I recently enjoyed that new movie inspired by a famous brand of toys – Lego. People with children will recognise this particular phase of the parent/child dynamic; it comes somewhere between goo-goo, ga-ga, dinosaurs and "I hate you, give me the car keys."

My son and I are well into the cult of the block: we've got the games, the actual blocks themselves and of course the movie. We've had a fair run with the blocks so far, building passable wheeled contraptions and top-heavy aircraft that would make even Howard Hughes balk. We dismembered their famous men, replaced heads with till registers and attached doors to disembodied legs to create beautiful absurdities and other crimes against nature worthy of Dali.

But every cooling-off period with Lego begins, more often than not, with someone – inevitably me – taking a midnight barefoot trip to the kitchen across the sitting-room floor, mined with the most murderous mini-monoliths known to the human race.

Soon enough these Legos are packed away on the highest shelf in the house until the child is a little older and learns to pick up after himself, or at least, until some marketing genius decides to make a movie based around the damn things, and suddenly Legos are cool again, taken down off the highest shelf in the house and once again ending up in the strangest places; like in your shoes, down the couch, down your bum, up your nose: everywhere and anywhere, Legos, by design, fit any space.

But first we have to watch the movie, dad.

I wasn't too sure if I was watching the world's longest toy commercial, some bizarre existential post-modern art installation or some highly sophisticated political propaganda film.

The plot itself is remarkably cohesive for a children's movie, if a bit erratic, and fairly easy to grasp for 4-year-olds and 40-somethings alike. The film is clever, irreverent and works really well as either a Randian parable or a be-yourself-and-be-cool-to-each-other analogy – depending on which side of the schoolyard Rubicon you stand on.

It has a catchy song; Batman shows up in the middle of the movie for no reason at all and it has a grand poignant finale with a wholesome message, before everyone breaks into another chorus of that diabolically catchy song. It stands up remarkably well compared to, say, the average Lady Gaga song or episode of Glee or whatever other current pop culture touchstone is injecting our kids full of self-esteem these days.

Above all, it is a great family event, enjoyed by all and bringing families closer together. It lit up my boy’s eyes with wonder and fascination, inspiring him to search for a greater knowledge of how the basic principles of symmetrical construction work in real life.

“Sure, son” I smile proudly as I head to the highest shelf in the house, “I think you’re now old enough to handle these again.”

“No, dad, I want to play the game.”

The family electronic tablet – it's iPad-adjacent really, kind of like Nickelback is Led Zeppelin-adjacent – is everyone's favourite thing at the moment, not only because it makes us all look like we're on Star Trek, but because you can do so much stuff on it.

Mom pins things of interest in a giant digital scrapbook, dad looks at the latest trends in bikini fashion and the boy likes to play games on it. Apparently you can use it for actual useful things, too, but we’re still stuck on knitted Rasta-coloured tea cosies, Kate Upton and Need For Speed.

“I want to build blocks on the ‘pooter,” he tells me. And I let him, for no other reason than I’m too speechless with confusion to argue otherwise. And there he sits, like some deft urchin at the controls of the Death Star, playing a game that involves connecting blocks of varying sizes and colours together to make some kind of stylistically pleasing structure. All by moving fingers across a screen.

The only difference between this and the actual blocks I still rattle in the box with absentminded incredulity is that at the completion of the computerised construction, it squeaks or honks or unveils some sort of universal truth, along with various remarkable rewards.

Retiring to my solace later that evening, with a spoonful of the brown stuff, I still reel with the astonishment of it all: of how far we’ve come in the world, in our evolution, where we can do everything we’ve ever wanted with a computer, but also: of how far away we’ve drifted from the conventional idea of reality.

We don't touch things anymore. Things like music, film, books, art and almost everything in our everyday lives have become the great intangibles. We listen to music, yet we never feel it. We see moving images on screen, yet we never enjoy the process we used to go through to get to that point. We read, yet we never count the pages nor remember the words.

Music used to be an event, an unwrapping of a vinyl record, the opening of a CD case; the placing of it on and in a player; the unfolding of the words and images of its cover – an art in itself – the reading of it like some undiscovered scroll of knowledge, filled with poetry and identity. Those days are gone. Now all you have to do is punch it into YouTube or iTunes and you have it instantly. No unearthing, no excitement of holding something that is yours and yours alone. Now you share it with millions, it drops out of a chute like a convenient capsule of immediate gratification.

Film was an occasion, too – in the real sense of the word. You had to go out if you wanted to see the latest blockbuster, you had to dress up and drink shitty fake-Coke and buy overpriced chocolate for a girl who might end up letting you put your arm around her, but probably didn’t.

Now, again, all you have to do is point and click and you have it. There is no romance left in a Netflick, nor does popcorn taste quite the same if you have to clean it out of your own couch.

Books, the final bastion of great tangible art – clacked laboriously and industrially by writers of yore onto magnificent lever-driven typewriters or smudged in ink, sweat and tears onto every conceivable surface, and delivered to your fingers in great wedges of enlightenment and dog-eared, spine-cracked knowledge – they, too, have now slimmed down to a single slab that you page through by swiping, to the sound of a manufactured soundbite of "a turning page".

Don’t ask me how Kurt Vonnegut ended up in all of this. I think while rolling all of this around my head, I had to pick up something real just to make myself sure that I was still here and I didn’t turn into an app or something.

And inevitably, in my house anyway, a book is always close by, and you don't get more real than Slaughterhouse Five – one of my favourite books, not because it has aliens and time travel (those great intangible traditions of modern storytelling), but because, like its hero Billy Pilgrim – a man tossing and turning between the bed sheets of time and place – in experiencing the book you slowly begin to realise that you're hurtling so fast through these rapidly changing times that you try desperately to attach yourself to something real and tangible, just so the world won't let go of you.

Reading through random lines of the book, I realised what Uncle Kurt was trying to explain to us all (and he does it not so much in his narrative as with his wordplay): we lose a little of what we are the faster we evolve. Our senses start to fade the faster we travel, and these senses are, no matter how the world changes, still the only connection we have with the world we live in.

They say the mind is the grand central station of the senses, and for the most part it really is that final destination to which all our other senses revert and from which they bounce back again.

But touch is the soul of sense, of being. Corporeal interaction is what amplifies the sight, sound, smell of who we are and what we do. It’s no wonder the blind read with their fingers, the deaf feel vibration; touch enhances everything.

Building blocks with your hands is a lot different from sliding a finger over a virtual element, nudging it into place to the rhythm of an electronic click.

Listening to music without touching its closest point of creation – the groove of a record, the flap of a liner note – is not the same as simply plugging into instant access.

A movie isn’t a movie until you take your seat, and even more vital, a book is not a book and words are not words if you can’t feel yourself turning the page.

Living in a world without touch is like watching alien beings in a glass zoo. Billy Pilgrim taught me that.

Eventually, my son tired from sliding the blocks across a screen and wondered to himself if the box on the highest shelf in the house might promise more satisfaction…and it did.

The jagged monstrosities of real Lego could never compare to the perfect, pre-destined, game-theorised world behind the glass screen, but he felt with his fingers, and jimmy-rigged any challenge that got in his way with his hands. He improvised and experimented, improved and experienced, because he could touch.

Some great jazz musician once said that improvisation was the greatest freedom anyone could experience in anything from changing your underwear to composing a symphony. And it all starts with a touch.

(fin)

Featured Image Credit: “Kurt Vonnegut” on Flickr

……………………………………………………………………………………………………..

Read more about technology, innovation, humanity and the future in our Science and Art ebook, available to purchase from Athena or from Amazon.