A few weeks ago, after I wrote a piece criticising the Uber ban in London, I received a considerable amount of pushback on Twitter, from people I can only assume are affiliated with the black cab industry.
My first exposure to Godfrey Reggio’s 1982 time-lapse masterpiece Koyaanisqatsi was at an exhibition at the Victoria and Albert Museum on the ‘Post Moderns’. It featured the now universally recognisable accelerated footage of taillights pumping through the city to the rhythm of alternating traffic flows, creating an eerily arterial display. What was interesting about the use of the footage in this particular exhibition was that it was shown in the context of the death of futurism and the birth of dystopia, sandwiched as it was between clips of the bleak futuristic skyline of Blade Runner (which, I must admit, has a beguiling beauty all of its own) and chaotic images of the Tokyo Stock Exchange. While the footage, complete with the stark minimalist composition of Philip Glass, did not feel out of place in this exhibition, I couldn’t shake the notion that there was more to it than merely a bleak vision of man’s conquest over the Earth. This became more apparent when I watched the film in its entirety.
Redesigning art, science and mankind.
Leonardo Da Vinci had an impressive CV: painter, sculptor, architect, musician, mathematician, engineer, inventor, anatomist, geologist, cartographer, botanist, and writer. The period in which he flourished, now known as “The Renaissance” (“The Rebirth”), was a time of extraordinary experimentation. Inspired by the intellectual curiosity of their ancient Greek and Roman forebears, Renaissance thinkers combined art and science in novel ways; the boundaries between the disciplines were more fluid than they are today. It was from such voluminous expertise that the ideal of The Renaissance Man arose: an individual with many creative gifts who cultivated a wide range of scholarly interests.
This period of history is analogous to our own, as we enter an increasingly technological era, the border between man and machine gradually disappearing.
The Divine Proportion
Da Vinci’s iconic Vitruvian Man (1490) has become an everlasting symbol of The Renaissance, of its dual commitment to artful wonder and scientific rigour. In creating Vitruvian Man, Da Vinci synthesized information from anatomy, architecture and physics into what he believed was an overarching theory of the universe.
The Encyclopaedia Britannica described it as follows:
“Leonardo envisaged the great picture chart of the human body he had produced through his anatomical drawings and Vitruvian Man as a ‘cosmografia del minor mondo’ (cosmography of the microcosm). He believed the workings of the human body to be an analogy for the workings of the universe.”
Vitruvian Man (1490)
Da Vinci was exacting in his approach to aesthetics. He illustrated a book on mathematical proportion in art, De divina proportione (1509).
He also kept detailed notebooks in which he methodically recorded his observations of the natural world. From his notebook:
“The lights which may illuminate opaque bodies are of 4 kinds. These are: diffused light as that of the atmosphere… And Direct, as that of the sun… The third is Reflected light; and there is a 4th which is that which passes through [translucent] bodies, as linen or paper or the like.”
Da Vinci applied this knowledge to art-making. His groundbreaking painting Lady with an Ermine (c. 1489-90) contrasted varying degrees of light and shade to create depth of perspective in a way rarely achieved before.
His use of light in later works such as The Mona Lisa (1503-17) forever changed how artists used light in their paintings.
Left: Mona Lisa (1503-17) Right: Lady with an Ermine (c. 1489-90)
He also turned his meticulous hand to cartography, creating maps that were visually detailed and precise, which was unusual for the time. His 1502 plan of the Italian town of Imola, near Bologna, was unparalleled in this regard.
It was this ability to successfully blend art and science that made Da Vinci the innovator he was.
A true man of The Renaissance, his tune was not monotonous. He played a range of chords during his lifetime, producing novel harmonies and unique melodies.
In the early 21st century, such types were harder to find. In 2009, Edward Carr, writing for the magazine Intelligent Life, argued that polymaths are “an endangered species”. This was due in large part to universities worldwide favouring specialisation in one particular area, with generalists regarded as lacking the commitment or depth of knowledge to be considered real authorities in their field.
But, Carr wonders, what about the blow this deals to innovation?
“The question is whether their loss has affected the course of human thought. Polymaths possess something that monomaths do not. Time and again, innovations come from a fresh eye or from another discipline. Most scientists devote their careers to solving the everyday problems in their specialism. Everyone knows what they are and it takes ingenuity and perseverance to crack them. But breakthroughs—the sort of idea that opens up whole sets of new problems—often come from other fields. The work in the early 20th century that showed how nerves work and, later, how DNA is structured originally came from a marriage of physics and biology.”
Indeed, Francis Crick, one of the two men credited with uncovering the double helix structure of DNA, had begun his career in science as a physicist. By applying methods he had learned from physics, he was able to approach what was considered the “holy grail of biology” in a new and effective way. He and his research partner, James Watson, focused all their attention on working out the physical configuration – “the physics” – of DNA before ascertaining its purpose. Later scientists were able to do exactly that, by building on the valuable work of Crick and Watson.
It is at the intersection of disciplines where creative catalysis happens.
However, as the 21st century progresses, and with the recent advent of transformative technologies like 3D printing, polymaths like Neri Oxman may have the opportunity to thrive once more.
With a background in both medicine and architecture, MIT-based technologist Neri Oxman has pioneered new methods of designing and manufacturing construction materials.
In a talk she gave at the annual PopTech conference in 2012, she described the philosophy behind her work:
“Ask not what science can do for design but what design can do for science.”
Neri Oxman (2012) Source: Wikimedia Commons
Informed by the way nature “designs”, Oxman uses 3D printing to create one-of-a-kind artifacts, the material and anatomical structures of which mimic the biological entities they are modelled upon.
One of her works, Minotaur Head with Lamella, exhibited at The Centre Pompidou in Paris in 2012 as part of the “Design and Mythology” collection, is “a shock absorbing flexible helmet” designed to:
“…flex and deform in order to provide comfort and high levels of mechanical compliance. The head shield introduces variable thickness of the shell, informed by anatomical and physiological data derived from real human skull data. Medical scan data of a human head is selected from an open repository. Two sets of data are created and trimmed from the scan using medical imaging software simulating the hard tissue (skull) and the soft tissue (skin and muscle). Combined, these two data sets make up the bone-to-skin threshold informing helmet thickness and material composition according to its biological counterpart such that bony perturbations in the skull are shielded with soft lamellas designed as spatial sutures.” (MIT Media Lab)
Minotaur Head with Lamella (2012) Source: MIT Media Lab / Neri Oxman Projects
3D printing presents many collaborative opportunities for art and science.
From now on, we will experience far greater integration between technology and art than ever before: the emergence of art-science. This is already happening with enormous success in digital publishing and game design, where many modern-day Da Vincis are to be found.
See our list: Game Developers at the Philosophical Frontier
Is Mankind the Next Great Design Project?
It is a significant historical period we are living through. New cultural paradigms are being forged as human lives become ever more entwined with technological processes.
The futurist and trend forecaster Ray Kurzweil has long predicted that in the 21st century humanity will “transcend” biology, by merging with technology. In his best-selling 2005 book The Singularity is Near: When Human Beings Transcend Biology he predicts that humans will be routinely augmenting their bodies and intelligence with technology by the year 2045.
Though we have yet to develop technology that actually merges with our bodies, we are certainly becoming more reliant on it to mediate and organise our daily lives, with mobile devices serving as intellectual prosthetics. Complete physical integration does seem like the logical next step.
But is this more science fiction than fact? Writer and technology entrepreneur Jaron Lanier has referred to the cultish nature of “The Singularity” concept, which is often treated with religious reverence by its adherents.
It is, though, a vision of the future worth paying attention to. As Lanier warns, “these ideas [have] tremendous currency in Silicon Valley; these are guiding principles, not just amusements, for many of the most influential technologists.”
Indeed, Kurzweil has since been appointed Director of Engineering at Google, giving him the opportunity to realise many of his prophecies.
As it was during The Renaissance, ours is a highly innovative age, a golden era for those interested in collapsing the historical barriers between art and science.
3D printing, Web 2.0 and mobile technology supply us with a glimpse of our digitally consolidated future.
We are experiencing reinvention as technology fuses with many aspects of everyday life.
The coupling of art and science in The Renaissance gave birth to Da Vinci, its archetypal man.
Our time may even see the human race reborn.
Featured Image Credit: Female Head by Leonardo Da Vinci Source: WikiPaintings
9/11: An event that ripped a hole in the fabric of our history. The time before it seems idyllic.
Whilst pondering the historical significance of 9/11, we came across this rather beautiful a cappella version of Babylon by the American singing trio Mountain Man.
While some may see the choice of song here as a political point (‘Zion’ may seem a loaded term, after all), it seems appropriate now more than ever, owing to its evocation of past and present, as we look back on a decade of war and take a moment to consider our place in history. So we ask you not to consider Zion in such narrow terms, but to apply it more broadly: as representative of a bright, shining past that will forever elude us. Zion is a lost innocence; this generation’s, at least.
The twenty-first century has, for the most part, been defined by contradictions between what we thought about our human future and what we have found awaiting us. After a century of seemingly relentless war and the threat of nuclear annihilation, we seemed poised, in the 1990s, for a greater future. The Soviet Union fell, appearing to bring an end to the age of great power competition, whereupon Francis Fukuyama declared the ‘End of History’. New technology was causing the world to shrink as the march of globalisation swept across it.
Then, at the dawn of the new human century, history came crashing down on us again, out of a clear September sky. It seems both tragic and fitting that we faced an enemy that wished to achieve a twisted parody of what some had thought we had achieved already: an end to history, a return to an imagined perfect epoch.
What has been achieved since then? Can the human race claim to be any more unified? The old Western order strains to hold itself together, fractured by economic chaos and endless war. The cradle of civilisation is racked by revolution and counter-revolution and our hatreds and misunderstandings seem more fervent than ever. We gaze, fearful but resigned, into the abyss.
The events of the last decade embody the state of timeless historical perpetuity to which Robin Jones refers in his piece about that blackest of Septembers. What is to become of us? The answers may surprise us. Our idea of history is more realistic now and more conducive to building a future from the fragments of our present. Our civilisation has not fallen; we have limped on in spite of our frailties. New discoveries have in many ways defined the 2010s, our artists and dreamers continue to cultivate great beauty and the human capacity for great love and empathy remains undiminished.
Francis Fukuyama was wrong, but so were the men who sought a return to the past. History cannot be brought to an end nor can it be revived. The goals of those men were impossible and thus their failure inevitable. There is no deterministic force called ‘history’ and the future is an untold story of which we are the narrators. Most of all there is no past, only the memory of what is lost.
The other morning I awoke off a callous couch with a jolt, a half-empty jar of Nutella in one hand and in the other, an almost indecipherable communiqué magic-markered across page 56 of my copy of Slaughterhouse Five: “We will never ever, EVER! touch art again”, it slurred.
Beguiled but puzzled, I churned another glob of Nutella off a dirty salad ladle into my mouth, and attempted to decode this anomalous phrase. At first I wasn’t too sure Kurt Vonnegut had much to do with it, although for half a moment, I could imagine his spirit exiting the dusty library of the netherworld, and possessing my stupor, taking my sugar-driven hand to heed some enigmatic warning to the world.
Listen: my son and I recently enjoyed that new movie inspired by a famous brand of toys – Lego. People with children will recognise this particular phase of the parent/child dynamic; it comes somewhere in between goo-goo, ga-ga, dinosaurs and “I hate you, give me the car keys.”
My son and I are well into the cult of the block: we’ve got the games, the actual blocks themselves and of course the movie. We’ve had a fair run with the blocks so far; building passable wheeled contraptions and top-heavy aircraft that would make even Howard Hughes balk. We dismembered their famous men, replaced heads with till registers and attached doors to disembodied legs to create beautiful absurdities and other crimes against nature worthy of Dali.
But every cooling-off period with Lego begins, more often than not, with someone – inevitably me – taking a midnight barefoot trip to the kitchen across a sitting room floor mined with the most murderous mini-monoliths known to the human race.
Soon enough these Legos are packed away on the highest shelf in the house until the child is a little older and learns to pick up after himself, or at least, until some marketing genius decides to make a movie based around the damn things, and suddenly Legos are cool again, taken down off the highest shelf in the house and once again ending up in the strangest places; like in your shoes, down the couch, down your bum, up your nose: everywhere and anywhere, Legos, by design, fit any space.
“But first we have to watch the movie, dad.”
I wasn’t too sure if I was watching the world’s longest toy commercial, some bizarre existential post-modern art installation or some highly sophisticated political propaganda film.
The plot itself is remarkably cohesive for a children’s movie, if a bit erratic, and fairly easy to grasp for 4-year-olds and 40-somethings alike. The film is clever, irreverent and works really well as either some kind of Randian parable or a be-yourself-and-be-cool-to-each-other analogy – depending on which side of the schoolyard Rubicon you stand.
It has a catchy song; Batman shows up in the middle of the movie for no reason at all and it has a grand poignant finale with a wholesome message, before everyone breaks into another chorus of that diabolically catchy song. It stands up remarkably well compared to, say, the average Lady Gaga song or episode of Glee or whatever other current pop culture touchstone is injecting our kids full of self-esteem these days.
Above all, it is a great family event, enjoyed by all and bringing families closer together. It lit up my boy’s eyes with wonder and fascination, inspiring him to search for a greater knowledge of how the basic principles of symmetrical construction work in real life.
“Sure, son” I smile proudly as I head to the highest shelf in the house, “I think you’re now old enough to handle these again.”
“No, dad, I want to play the game.”
The family electronic tablet – it’s iPad-adjacent really, kind of like Nickelback is Led Zeppelin-adjacent – is everyone’s favourite thing at the moment, not only because it makes us all look like we’re on Star Trek, but because it can do so much stuff.
Mom pins things of interest in a giant digital scrapbook, dad looks at the latest trends in bikini fashion and the boy likes to play games on it. Apparently you can use it for actual useful things, too, but we’re still stuck on knitted Rasta-coloured tea cosies, Kate Upton and Need For Speed.
“I want to build blocks on the ‘pooter,” he tells me. And I let him, for no other reason than I’m too speechless with confusion to argue otherwise. And there he sits, like some deft urchin at the controls of the Death Star, playing a game that involves connecting blocks of varying sizes and colours together to make some kind of stylistically pleasing structure. All by moving fingers across a screen.
The only difference between this and the actual blocks I still rattle in the box with absentminded incredulity is that at the completion of the computerised construction, it squeaks or honks or unveils some sort of universal truth, along with various remarkable rewards.
Retiring to my solace later that evening, with a spoonful of the brown stuff, I still reel with the astonishment of it all: of how far we’ve come in the world, in our evolution, where we can do everything we’ve ever wanted with a computer, but also: of how far away we’ve drifted from the conventional idea of reality.
We don’t touch things anymore. Things like music, film, books, art and almost everything in our every day have become the great intangibles. We listen to music, yet we never feel it. We see moving images on screen, yet we never enjoy the process we used to get to that point. We read, yet we never count the pages nor remember the words.
Music used to be an event, an unwrapping of a vinyl record, the opening of a CD case; the placing of it on and in a player; the unfolding of the words and images of its cover – an art in itself – the reading of it like some undiscovered scroll of knowledge, filled with poetry and identity. Those days are gone. Now all you have to do is punch it into YouTube or iTunes and you have it instantly. No unearthing, no excitement of holding something that is yours and yours alone. Now you share it with millions, it drops out of a chute like a convenient capsule of immediate gratification.
Film was an occasion, too – in the real sense of the word. You had to go out if you wanted to see the latest blockbuster, you had to dress up and drink shitty fake-Coke and buy overpriced chocolate for a girl who might end up letting you put your arm around her, but probably didn’t.
Now, again, all you have to do is point and click and you have it. There is no romance left in a Netflick, nor does popcorn taste quite the same if you have to clean it out of your own couch.
Books, the final bastion of great tangible art – clacked laboriously and industrially by writers of yore onto magnificent lever-driven typewriters or smudged in ink, sweat and tears onto every conceivable surface, and delivered to your fingers in great wedges of enlightenment and dog-eared, spine-cracked knowledge – they, too, have now slimmed down to a single slab whose pages you turn by swiping, to the sound of a manufactured soundbite of “a turning page”.
Don’t ask me how Kurt Vonnegut ended up in all of this. I think while rolling all of this around my head, I had to pick up something real just to make myself sure that I was still here and I didn’t turn into an app or something.
And inevitably, in my house anyway, a book is always close by, and you don’t get more real than Slaughterhouse Five – one of my favourite books, not because it has aliens and time travel (those great intangible traditions of modern storytelling), but because, like its hero Billy Pilgrim – a man tossing and turning between the bed sheets of time and place – you slowly begin to realise, as you experience the book, that you’re hurtling so fast through these rapidly changing times that you desperately try to attach yourself to something real and tangible, just so the world won’t let go of you.
Reading through random lines of the book, I realised what Uncle Kurt was trying to explain to us all (and he does it not so much in his narrative as with his wordplay): we lose a little of what we are, the faster we evolve. Our senses start to fade the faster we travel, and these senses are, no matter how the world changes, still the only connection we have with the world we live in.
They say the mind is the grand central station of the senses, and for the most part it really is that final destination to which all our other senses return and from which they bounce back again.
But touch is the soul of sense, of being. Corporeal interaction is what amplifies the sight, sound, smell of who we are and what we do. It’s no wonder the blind read with their fingers, the deaf feel vibration; touch enhances everything.
Building blocks with your hands is a lot different to sliding a finger over a virtual element, sliding it into place to a rhythm of an electronic click.
Listening to music without touching its closest point of creation – the groove of a record, the flap of a liner note – is not the same as simply plugging into instant access.
A movie isn’t a movie until you take your seat, and even more vital, a book is not a book and words are not words if you can’t feel yourself turning the page.
Living in a world without touch is like watching alien beings in a glass zoo. Billy Pilgrim taught me that.
Eventually, my son tired of sliding the blocks across a screen and wondered to himself if the box on the highest shelf in the house might promise more satisfaction… and it did.
The jagged monstrosities of real Lego could never compare to the perfect, pre-destined, game-theorised world behind the glass screen, but he felt with his fingers, and jury-rigged any challenge that got in his way with his hands. He improvised and experimented, improved and experienced, because he could touch.
Some great jazz musician once said that improvisation was the greatest freedom anyone could experience in anything from changing your underwear to composing a symphony. And it all starts with a touch.
Featured Image Credit: “Kurt Vonnegut” on Flickr
“People want to be loved, not for what they are, but what they appear to be.”
The recent furore over Facebook’s manipulation of its users’ news feed proves that reason should always be your master and technology your slave.