merit of innovation


Just finished 'examining' some 3rd-semester students. Talking with the censor (the external examiner) about replacing the missing "13" (the recently abolished 'top grade' for an extraordinary performance) with some other kind of 'badge of innovation' or merit, to indicate: "This is something extraordinary and innovative which we quite naturally *weren't* testing for, and which deserves recognition".

So a 4 (or D) with a merit for innovation might be more prestigious and interesting - more telling? - than a plain 7 (or C).

I was a Director developer who tried FutureSplash Animator in 1996 and couldn't take it very seriously. I regarded it as a toy. We had Afterburner and then Shockwave, which could do so much more. Director's scripting language was never highly regarded, with its childish, Latinate genitive constructions and "silly" metaphors like puppets and casts, but it had powerful, dynamic LISP-like features, a command line, acceptable OOP, and it compiled to bytecode. You could do great things with it. (Most of the really great Director stuff - such as the Voyager CD-ROMs - was overlooked, or made for very small audiences such as museum visitors). The early 'action' editors in Flash were a bad joke.


Programming with Oven Gloves on - The Flash 4 Actions Editor
I was amazed to see the younger upstart 'toy' technology bought by Macromedia, and then steadily capture the attention of the producers of cool but gimmicky content, ubiquitous banner ads, and ultimately most of the development budget of Macromedia. It was as late as Flash 5 that a proper scripting language was introduced - a JavaScript dialect now known as ActionScript 1. Verity Stob wrote some hilarious comments about this period on The Register. I advise the keen reader to check out the relevant article.


I was even more amazed that, as ActionScript 2 appeared, offering a classical OOP model, a handful of people started using it to make some quite decent casual games, and, with ActionScript 3, even emulators, synthesisers and some fabulous data visualisation tools (e.g. Gapminder). Slowly, steadily, it became more technically powerful, and certainly more up-to-date than its older stepbrother.


I switched to Flash, even began teaching it, and developed a couple of solid medium-size applications with it, plus many small things. I adopted AS3 and grew to like it. I marvelled at the brio with which sensitive creative types took to "serious" computer science topics like strict typing, protected members and event bubbling.


But there has never been any doubt that Flash - authoring and playback - has been rotten on Apple's systems for almost 10 years. Crashes, hangs, lousy resource management (memory/CPU) and poor OS-integration have been the norm. Multimedia designers - many of whom are Mac users - have always had a love-hate relationship with Flash, and I believe that now they are ready to move on. Steve Jobs' famous letter from 2010 is a fascinating example of rhetoric and technological leadership, to the extent that after barely more than a year, even Adobe has been forced to admit that their flagship technology is sinking.


I confess to a certain amount of schadenfreude. I remember the snooty Flash kids, their tool of choice in the ascendant, looking down their noses at Director devs, just as we looked down our noses at the HyperCard and Authorware community.


And so it becomes increasingly obvious: Closed multimedia authoring systems are always a dead-end, no matter how de facto 'standard' they may temporarily be. Microsoft's never-popular Silverlight - a potential competitor crippled by neurotic strategy - is another relevant example. In each previous case, there was always an obvious proprietary ship to jump to, but that is not so as Flash declines, and that is why Flash and SWF will linger on, way past their sell-by date.


So instead multimedia designers are expected to bank on HTML5 and JavaScript. In theory, a great idea. In particular we are offered Canvas, and a fairly ropey, underdocumented audio/video playback API, with various ideological and technical encumbrances. (Did you know that iOS can only play one piece of HTML5 audio at a time? Or that Firefox doesn't - and will never - support mp3?)
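
A small illustration of what 'ropey' means in practice: the codec-sniffing dance that the standard canPlayType() call obliges you to perform before you dare load a sound. (A hedged sketch; the filenames are hypothetical.)

    // canPlayType() returns "probably", "maybe" or "" -- not even a straight yes/no.
    var audio = document.createElement('audio');
    var source;
    if (audio.canPlayType && audio.canPlayType('audio/mpeg') !== '') {
        source = 'music.mp3';                  // hypothetical filenames, for illustration
    } else if (audio.canPlayType && audio.canPlayType('audio/ogg; codecs="vorbis"') !== '') {
        source = 'music.ogg';
    }
    if (source) {
        audio.src = source;
        audio.play();                          // on iOS this must happen inside a user-triggered event
    }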


Now I see an interesting dilemma for multimedia designers and content / front-end developers, who cut their OOP teeth on Java, ActionScript or C#: a mindset migration from strictly typed classical languages to JavaScript - a language which superficially resembles Java / ActionScript / C#, but in truth is more like an exotic variant of LISP. How will the multimedia designers cope with this? How will they adopt JavaScript when 99% of JavaScript books and web-based tutorials promote dilettante or sub-optimal practice? How will the multimedia design courses teach it? The paradigm shift is going to be interesting to watch, and can only be ugly.
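
To make the dilemma concrete: here, roughly, is how an AS3-minded developer ends up expressing a simple class in today's prototype-based JavaScript. (A sketch only; the Monster class and its members are my own invention, purely for illustration.)

    // A constructor function plus a shared prototype stand in for a class --
    // with no compiler-enforced types and no access control at all.
    function Monster(x, y) {
        this.x = x;                 // nothing stops anyone assigning a string here
        this.y = y;
    }
    Monster.prototype.moveBy = function (dx, dy) {
        this.x += dx;
        this.y += dy;
    };

    var m = new Monster(10, 20);
    m.moveBy(5, -3);                // m.x === 15, m.y === 17
    // 'private' and 'protected' simply don't exist; naming conventions
    // (leading underscores) or closures have to stand in for them.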


Then there is code editing. Flash's ActionScript editor is not the best editor in the world, but it has some very friendly features which really help you learn. It is, quite frankly, ideal for first-time coders with artistic leanings. Is there a similarly accessible JavaScript code editor that offers syntax checking, automatic code indenting, friendly error feedback etc.?


Is there a browser which doesn't just give up silently when your JavaScript has the tiniest error? Does the browser error console offer much help? The error feedback in Flash authoring has never been amazing, but at least it gives you a clue where to look for problems in your code.
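
For what it's worth, about the only hook the browsers volunteer is window.onerror. A minimal sketch of how a beginner might at least surface a message and a line number instead of silence (the 'debug' element is my own assumption, not part of any standard):

    // window.onerror fires when an uncaught error would otherwise fail silently.
    window.onerror = function (message, url, lineNumber) {
        var panel = document.getElementById('debug');    // any visible element will do
        if (panel) {
            panel.innerHTML += message + ' (' + url + ', line ' + lineNumber + ')<br>';
        }
        return false;    // let the browser's own error console see it too
    };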


I know that this is a total non-problem for comp.sci folks, but the audience for the Flash authoring tool is quite different. They have different needs and different expectations when making interactive stuff. And yes, many of them are crap and clueless, but many really want to make good stuff and adopt best practices.


The landscape for content developers, self-taught game devs, arty-nerds, interaction designers, UX/UI designers etc. who want to create JavaScript-based content is fragmented and unfocused. Hardcore coders will always be happy with UltraEdit, or Visual Studio, or Eclipse, or Notepad++, but those tools will never win the hearts of the folks who come from an arts or design background.


I also see no other software with vector drawing tools as friendly and intuitive as those found in Flash since its very first versions, and I see no animation tools that can export lightweight vector-based animations (e.g. SVG) for the web. (By the way, Flash's animated GIF exporter *really* sucks, and animated GIFs are surely not where we want to go in the 21st century.)


Also, Canvas offers no sprite model, no 'movieclips' and no collision detection; ultimately you just have a bitmap which does not distinguish one mouse or touch event from any other, except in pixel coordinates. If you want to keep track of individual visual objects, you have to build your own engine in JavaScript, which means performance overheads. There are a few of these engines out there (see here, here, here and here); some are very good, but it's a terrible shame that some of the features that multimedia designers regard as basic or fundamental ("I want something to happen in the corner of the screen when I click on this moving monster") are simply not available, straight off the HTML5 shelf.
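
To make the point concrete, here is a minimal sketch of the bookkeeping that Canvas leaves entirely to you: one 'monster', tracked and hit-tested by hand. (All names and values are illustrative, not taken from any particular engine; it assumes a canvas element with id "stage" and no CSS scaling.)

    var canvas = document.getElementById('stage');
    var ctx = canvas.getContext('2d');
    var monster = { x: 40, y: 40, width: 32, height: 32 };   // we must remember where it is

    function draw() {
        ctx.clearRect(0, 0, canvas.width, canvas.height);
        ctx.fillStyle = 'green';
        ctx.fillRect(monster.x, monster.y, monster.width, monster.height);
    }

    canvas.addEventListener('click', function (event) {
        // the event only knows pixel coordinates; "which object?" is our problem
        var rect = canvas.getBoundingClientRect();
        var mx = event.clientX - rect.left;
        var my = event.clientY - rect.top;
        if (mx >= monster.x && mx <= monster.x + monster.width &&
            my >= monster.y && my <= monster.y + monster.height) {
            monster.x += 10;      // "something happens" when the monster is hit
            draw();
        }
    }, false);

    draw();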


SWF is undoubtedly on its deathbed, but the Flash authoring tool has no obvious heir. The market is wide open for a "Flash-killer" which offers some or all of the features mentioned above, and which generates HTML5 + JavaScript, preferably in some editable form, so that it can be hacked about with PHP or whatever afterwards.


Adobe Edge looks promising, but I am wary of Adobe's ability to manage multimedia authoring tool development. Their track record is abysmal. Can they manage not to screw up, bloat and hobble their tools with the limitations of their broader strategy for reaching 'internet marketers'? I am sceptical.


And I am training my JS/HTML5 muscles for multimedia teaching and multimedia content production, because as Flash declines, the quality difference between 'the men and the boys' is going to be pretty stark.


I have had the pleasure, recently, of introducing Object-Oriented programming ("OOP") to a bunch of students on a brand new course. My introduction was deliberately incomplete, with most of the terminology deliberately cut away, because I know my colleagues will be bringing out the big UML guns next week. It has been a pleasure because every time anything is taught, the teacher has an opportunity to learn the stuff in a new way. (Whether he will take the opportunity is another matter).

It is fair to say that OOP is where computer science crosses over into biology and philosophy and - if you dare - finds a solid footing in the spiritual, even 'the sacred', where concepts like 'God' begin to mean something quite specific. What is this OOP? How does a programming paradigm have the potential to reveal the mysteries of the universe?

Well, sorry, this is not really very much about OOP. If you want to learn OOP there are thousands of excellent resources available on the net, and I can confidently say that reading about it is not the same as doing it, or as Frank Zappa put it, "writing about music is like dancing about architecture."

Every user of any modern computer system already understands the central concept of OOP - that the software should somehow resemble the 'real' world, or at least reproduce a recognisable fiction. The pictures on your social network are arranged in 'albums' (does anyone under 25 know what a 'photo album' used to be?); the labelled numerical transactions on the netbank screen should convince you that you are looking at 'your bank account'; your files are organised in 'folders'; and so on.

These 'objects' in the digital landscape are simulations of real things, or at least 'common fictions'. It helps if their designs make small concessions - illusory panderings - to your expectations. A 'document' icon should preferably have a dog-eared corner. A 'trash can' or 'recycle bin' should advertise the fact that it is empty or full by displaying a suitable icon. Software objects are metaphors for real and archetypal conceits, and we must recognise the metaphor in order to make use of them.

This is all well and good, and if you have no ambitions to discover or explore the mystical/occult dimensions of the digital landscape, you may stop reading here, secure with the blessed truth that these simulations are no more or less 'real' than the printed paper bank statement that arrives in a paper envelope, or the greetings on a birthday card.

When I learned to make my own software objects, I was very much distracted by the idea that I was doing something 'strange' or even 'difficult'. It was certainly a different way of thinking about writing code. I have been writing code since I was 14, so I have almost forgotten what it means not to know what programming might be. I get a bit of a clue when I have to 'program' my washing machine or my video recorder, where the 'metaphors' fall far short of what happens on my computer screen. (Washing machines are so poorly imagined that they offer no metaphors. You simply choose between 'programs' using a clunky control panel. Haven't the washing machine interface designers heard of brainstorming?)

The fact is that 'programming' is what you want to avoid doing. Even programmers prefer to avoid actual programming. Programmers don't even like to think of themselves as programmers. They prefer other names for their work: coding, developing, modelling, hacking, refactoring, and various other euphemisms. Programming is not just for dummies, it's 'for idiots'. It's a purely mechanical manifestation of something much more refined and interesting and beautiful: the algorithm... the datatype... the class... the model. We make concessions to the rest of humanity, who have no clue, poor dears, and we sigh and say "yes, I am a programmer" as if we are embarrassed about it, but secretly we want to admit that we are really misunderstood philosophers, artists, magicians, messiahs... alas modesty forbids.

Are we just arrogant, deluded fools? There is a trend which would like to marginalise us that way. It can be quite threatening for those with an aversion to maths or technology when they encounter a programmer in person. Prejudices and clichés abound: programmers are into science fiction and pizza, they rarely exercise and don't wash often enough, they have no small talk, they're 'strange' and inscrutable with poor social skills, they are allowed to come to work without wearing a suit and tie, they work at night, hold petty grudges, and probably know how to hack into your private stuff on Facebook. Brrrrr. Creepy.

I am afraid that these clichés are partly true. No, let's be honest, they are often true. So what is it about OOP that breaks the pattern? What happens to the object-oriented programmer that is different from the guy who is setting up his VHS machine to record "Eastenders" while he is on holiday?

The Object-Oriented programmer gets inside. Above all, he understands that there is a boundary between inside and outside, and you can be sure that he wants to spend some time inside and some time outside, and he is exquisitely interested in the boundary, the 'interface', between the two contexts. Even more bizarre: when the Object-Oriented programmer gets inside, his identity begins to shift. He becomes the object that he is modelling. Instead of "what code should I write here?" he asks "what do I know here and now? What do I need to know?" - he becomes the object, with its knowledge and (especially) its limitations, and asks himself what he would do if he fit perfectly into the broader ecology of the system which his human programmer-self was busy with just a few moments ago.
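
In code, that boundary between inside and outside is nothing more exotic than encapsulation. A toy sketch in JavaScript (the example and its names are mine, not from any textbook):

    // 'count' lives inside; the outside world only ever sees the interface
    // the object chooses to expose.
    function makeCounter() {
        var count = 0;                                   // inside: invisible from outside
        return {
            increment: function () { count += 1; },      // the interface
            value: function () { return count; }
        };
    }

    var counter = makeCounter();
    counter.increment();
    counter.increment();
    // counter.count is undefined -- only the interface crosses the boundary
    console.log(counter.value());                        // 2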

Programmers enter trance states, where inputs and outputs are limited and simplified. This is not 'hippie' stuff; it's the noble science of excluding irrelevant variables from consciousness to create a pure, clean, testable hypothesis. "IN THIS SPECIFIC CONTEXT: WHAT DO I KNOW? WHAT CAN I DO?". He excludes, he narrows, he restrains, he focuses, he shifts, he transforms into the very thing he is making - he becomes the 'sub-machine' which is part of the 'super-machine'. He is humble, delirious, anxious and gloriously happy. This is truly part of the shamanistic tradition, which is surely older than humanity - the art of relocating the self into some metaphorical 'other', in order to create tools for transformation and growth.

The broader community - humanity, or 'the user' - is only interested in the tools, the stories, the metaphors, and gives little thought to the fact that these are not random choices. The programmer has arrived at these specific choices through an essentially shamanistic process.

And if you doubt any of this, just consider what might be 'really' happening when you casually 'empty the recycle bin' on your PC.

Creatura and Pleroma

This was originally intended to be a response to this post on the excellent 'Only a Game' blog, but after trying to send my reply several times, with only refusals from the server, I decided that it was strong enough on its own.

I first encountered the distinction between Creatura and Pleroma in Gregory Bateson's writing, although it is Carl Jung who should take the credit for fishing this elegant abstraction out of the mystical soup of early Christian Gnosticism.

Pleroma is the 'physical' world of sand and gold and collisions between billiard balls. Creatura is the 'mental' world of information, knowledge, fantasy, form, tautology, hysteresis, observation and fiction.

As Gregory Bateson put it: Pleroma is evident when you kick a ball; creatura is evident when you kick a dog.

In one case, we have Newton's laws of motion, in the other, we have a living organism with memory and an energy supply which is independent of the energy of the kick. Newton's laws will not tell us much about which way the dog will move, and at what velocity. The dog may even bite us if we repeat the experiment too often.

Creatura is immanent in Pleroma. Immanence is the quality of being inextricably embedded in something else. The 'groovy atmosphere' is immanent in the party. Love is immanent in a loving couple etc.

If creatura is immanent in pleroma, then creatura cannot exist without pleroma. This means that a transcendent God (sitting outside his creation, observing and judging) must be a fiction: God (or 'spirits' or whatever) can only 'really' exist as integrated into the physical universe, not separate from it. In this way, 'primitive' religions which employ totemism can be seen as less superstitious than the more dominant monotheistic faiths of the 'modern' world. In totemism, the spirit lives in the thing(s): Neptune is part of any water, Thor is part of any lightning.

Another example: a software text file requires silicon, or magnetised oxides, or fibre-optic cables, or some such physical medium in order to exist, but the data exists independently of the medium. The data is 'immanent' in the disk. (This is Heraclitus' 'never the same river' - indeed a 'river', like most geographical features - dunes, volcanoes, beaches etc. - may be regarded as a 'mental' object, because it 'remembers' its form, and 'learns' to bend by eroding and depositing sediment. The water is pure pleroma, however. Chemists may study its chemical composition, but a biologist would be obliged to study the way that certain plants 'prefer' certain parts of the river bed.) And what does it mean to 'prefer'?

Science is mostly preoccupied with pleroma, and is suspicious of creatura. Creatura also includes lies, fictions, fantasies, failed hypotheses, contradictions etc. Science wants only truth, but a complete (i.e. 'real') description of [some part of] the universe requires creatura because description itself is creatura. Only creatura can specify the relationships between parts - i.e. the relationship between Newton's model of gravity and that of Einstein can only be described as creatura. Einstein doesn't make Newton obsolete, because Newton's laws are still simpler to use and easier to understand. But we need to know how his theory relates to Einstein's if we wish to choose the more suitable model for the job in hand: Are we attempting to score a goal in a football match, or travel to the nearest galaxy? Only creatura can answer these kinds of questions because pleroma has no goals - no mind.

We must be as rigorous in our understanding of creatura as is possible - which requires humility and parsimony. Some branches of science, e.g. information theory, biology and psychology, make [some aspects of] creatura their central focus. Those sciences also take great pains not to get seduced by false epistemologies. Still it took centuries to arrive at any kind of 'model' of communication, or of evolution, or personality, and we are still unravelling what these phenomena 'really' are. The unravelling itself is a mental process - a virtual one and also a fictional one, because all the evidence indicates that solving the mysteries of science only reveals new mysteries. Our 'truth' is always incomplete and therefore always somehow fictional. (Not to mention that the act of observing is fundamentally a creative act - a 'mapping' of the universe onto a sensory cortex, which then emerges into consciousness with pre-learned names ready to be attached).

Is the virgin different from the bride, or from the wife? What counts? The legal document, the religious ritual, or the biological sacrament?

Does a flock of birds 'really' exist? The materialist sees only the birds, the transcendentalist goes looking for a 'boss' who co-ordinates the flock - perhaps there is a 'boss bird', or perhaps it's God... Clearly both lines of thought are missing the point. We can model flocking behavior with an extremely simple and elegant algorithm which may be expressed in a single sentence: Move in the mean direction of your fellows.
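
And that single sentence really is very nearly the whole program. A hedged sketch in JavaScript - alignment only, with no cohesion or separation rules, and the neighbourhood radius and speed picked arbitrarily for illustration:

    // Each bird is { x, y, angle }. One step: steer towards the mean
    // heading of your neighbours, then move forward.
    function flockStep(birds, radius, speed) {
        birds.forEach(function (bird) {
            var sumX = 0, sumY = 0, neighbours = 0;
            birds.forEach(function (other) {
                var dx = other.x - bird.x, dy = other.y - bird.y;
                if (other !== bird && dx * dx + dy * dy < radius * radius) {
                    sumX += Math.cos(other.angle);       // accumulate the fellows' headings...
                    sumY += Math.sin(other.angle);
                    neighbours++;
                }
            });
            if (neighbours > 0) {
                bird.angle = Math.atan2(sumY, sumX);     // ...and take their mean direction
            }
            bird.x += Math.cos(bird.angle) * speed;
            bird.y += Math.sin(bird.angle) * speed;
        });
    }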

What makes the flocking algorithm simple and elegant is that it applies to structure - the relationship between parts. You have to have fellows to have a flock, otherwise the algorithm is both meaningless and useless. Mies van der Rohe said "God is in the details" - but he wasn't quite right. God is in between the details, or rather in the relationship between details.

"The devotee believes that Krishna appears when his name is called. The guru understands that Krishna appears in the spaces between the name calling."
-Swami Wassermann

In Praise of Slow Learning

In a reply to my last post, Malene raised the example of bilingual children (or children growing up in bilingual environments).

This is very interesting. The child is put under extra stress, but this is not a bad thing. He only 'suffers' when the adults frown at his apparent lack of progress.


Parents are encouraged to look for evidence of learning, so if a child appears to be a little 'slow', they might imagine there is something wrong, when in fact it may just be because the child has more 'work' to do - filtering out the distortions, whether they be sarcastic language, violent family members, or something else. In such a case, the child should be applauded and encouraged just for hanging in there, and integrating what can be had.

In the case of the bilingual family, each language acts as a kind of 'noise', distorting the signal of the other, but not entirely, because both languages will ultimately map the same phenomena, and as Malene points out, this early 'work' carried out by the child has great benefits later in life. He knows that signifier and signified are distinct. A bilingual child is an intuitive semiotician almost from the moment he can name any object in two languages.

Contrary to adult intuition, the 'slow' child may actually end up better equipped for life. There are various examples of 'geniuses' who were late-developers. Maybe these individuals were slow because they were observing and understanding things more completely than we did. Maybe the 'normal' children learned one lesson too early: Swallow your curiosity and keep up with the curriculum. These 'normal' children are in danger of developing a dread of all that is different and unknown.

Perhaps the worst thing is to reach adulthood and to live in fear of the margins - what is often called 'narrow mindedness'. Most of us have epistemological taboos of one sort or another, but there are apparently some people who are so anxious about straying into the margins that 'normalisation' is seen as good medicine, not just for themselves, but also for their fellow humans. This is the root of totalitarianism and indoctrination.

Here in central Copenhagen we are witnessing the gradual erosion of the 'free town' Christiania, a sprawling, beautiful, imperfect jewel, squatted since the early 1970s on unused military land, where misfits of all kinds can meet an important fundamental need: a sense of belonging.

The government plan to close it down, a process they call 'normalisering' (always said with a straight face). The hidden agenda is to erect luxury flats for the bourgeoisie on what must be a very lucrative piece of land, but it is being sold as a way of clamping down on the drug trade, and various other disreputable activities.

Little connection is made between the police raids on the Christiania hash stalls and the gang wars which have since flared up all over town, and no suggestions are offered for what may be done afterwards to cater (for example) for the Greenlanders, the children's theatre, the stables, the shrines, the organic stores, the 'gay house' (where young homosexuals can escape persecution from family members), the women's metal workshop... each of which would have difficulty surviving in 'normal' society, but has its own special place to unfold in this protective margin called Christiania. To unfold, and to give something back to society as a whole, instead of being a forgotten and neglected underclass.


If it were more widely known (widely taught!) that margins like these are the source of all innovation, all creativity and all progress, not just in human culture, but in evolution, design and any other viable system, then we may be rescued from that brand of idiocy that seeks normality at all costs. There is nothing new without randomness (basic information theory), and randomness does not thrive under 'normal' conditions.

So we have a good argument for tolerating 'slow learners' and deviants. Not just tolerating them, but perhaps even encouraging others - and ourselves - to learn more slowly, and to deviate from the syllabus. So that we can explore the richness of the margins along the way.

Hacking Epistemologies

- if you know that what you know and how you know is constructed from nothing but distorted inputs, and that the truth can only be glimpsed by crossing and comparing these with the reflections of someone else who also recognises the distorted nature of their own inputs, then you are doing pretty well.

I am a bit hooked on this word "epistemology" - knowledge about knowledge. We would want those close to us to know things in a similar way to ourselves. Not necessarily to know the same things, but to have a similar (recognisable) way of knowing whatever we do know.


Absolute truth may never be reached, but we can move towards it. Knowledge has a form, a shape, and it has gaps, which have shapes too. An important step is to recognise that you - like everyone else - have an epistemology, i.e. that your knowledge has been formed and shaped by more-or-less distorted sensory inputs from before birth (does the fetus hear its mother's voice more or less truly than the newborn infant?), and that the distortions continue to shape and refine the knowledge we gather in the present in a similar way.


The distortions of earlier inputs will necessarily affect the way the distortions of more recent inputs are perceived and recognised. In many cases, when the distortions do not contradict each other, those distortions will become invisible, which can lead almost 'everybody' to believe that the world is flat, or that the devil is out to get us, or whatever. Anyone who had an alternative epistemology would be marginalised and/or lonely. (The Sufi story 'When the Waters Were Changed' is relevant here).

It is precisely in the margins where we find the best clues about the distortions we accept and reject, and therefore get a better idea about whatever truth may be approached. (Conspiracy theories and heresies will always hold some interest, even if we must guard against being trapped by the restrictive 'armour' which protects them from corruption by the mainstream). For some people the margins are provoking, for others they are a ridiculous waste of time, for still others, it is soothing for the soul to discover: I am not alone in being alone with my epistemology.

Perhaps the only true knowledge we have is the body knowledge which has guided the formation of our body parts in the womb, a program of unfolding ("bootstrapping", as the hackers would call it) which is written into our DNA from the start. Much of this programming is shared with our fellow creatures, and with plants.

The unfolded program is no longer just a program. It is also evident in the forms (body parts) that we end up with. The flower with five petals shares some knowledge with the five-armed starfish, and the five-toed foot. Each has had to 'split in five', which is a technique as well as a description. I believe that shamanic experiences are a way of getting into contact with these shared programs. Trance (achieved through drumming, dancing, drugs, sex, sleep deprivation, fasting and other rituals) allows us to hack our epistemology, so that the 'animal' knowledge can dominate. (Or rather, if we speak cybernetically, we can exclude or restrain those parts of our human knowledge which contradict the animal/vegetable core, and thereby become a wolf or a hawk, or a mushroom).

The 'automatic' processes of puberty can be another site where we can expect humans to be more animal/vegetable, but our awareness of this is partly distorted by consciousness and culture - not least the whole teenage racket - but in spite of this, secondary sexual characteristics do manage to unfold successfully. They must, or the game is over within a single generation.

Too many befuddled and superstitious individuals are constantly on the lookout for a suggestion of a 'sixth sense' - and this 'holy grail' is apparently some kind of explanatory principle for 'wild talents', anomalous occurrences, miraculous apparitions of the Virgin on pancakes etc., which would - if ever identified as a fact - re-establish the authority of the spirit (holy or otherwise) over scientific knowledge. That'll teach those arrogant fools in white coats. Bring back the guys in the luxurious robes!

These folks are so eager to believe that this apparatus exists, that they are quite uninterested in where it might be located in the organism, or how it might work.

Also, I detect a tacit lingering dread that such a discovery would, if made by bona fide scientists - checked and with all its phenomena reproduced in laboratories using formal experiment - somehow cheapen the whole business of 'religion' or 'spirituality', and we would all have to go hunting for some other bit of hocus pocus instead, just to retain our divine fuzzy feelings. Still they seek to 'prove' that there are such 'things' as auras, while fleeing from any attempt to understand what kind of knowledge a 'proof' really is.

Typically, these people are not doing their research very well. Years ago, Rudolf Steiner identified not one extra sense, but seven - making a grand total of twelve. No hocus pocus here:


  • Sense of Touch. (The skin is the largest organ in the body, but you also feel your food moving in your gut. See next item.)

  • Sense of Life ...and therefore also the sense of death - the threshold between the two is Ernst Jentsch's "uncanny" or Masahiro Mori's "uncanny valley". Having mentioned food in the gut, many of us can sense the release of nutritious matter into the bloodstream, a 'sugar rush', the hunger/satisfaction cycle etc. which is all experienced by sensory apparatus too. 'Metabolism' by the way is one of the requirements for defining living things, according to many biologists. (Viruses are usually not considered truly 'alive' because they do not metabolise anything).

  • Sense of Self-Movement (Also motivation in general: Drive, goal orientation etc. Those dishes aren't going to wash themselves.)

  • Sense of Balance - Located in the ears, not just because of their convenient position, but also related to orientation, rhythm, dance, gravity, even somehow to acoustics and spatial awareness etc.

  • Sense of smell (Many have remarked on the peculiar relationship of smell to memory. How do dogs remember people, do you think?)

  • Sense of Taste (similar to smell, it has connections with memory, taste is strongly affected by smell and by texture. Cinnamon, for example, has no taste at all, but is considered to improve flavour).

  • Sense of Sight ...gets most of the attention in our culture. I am rather tired of its predominance and its cheap, easy glamour, but there are a few points of interest. The asymmetric relationship between rod and cone cells remains curious: Van Gogh was fascinated by the way colour perception is affected in low-light conditions, and attempted to capture this on canvas. So how well-lit should his paintings actually be? (Epistemological puzzle).

  • Sense of Temperature/Warmth - Those who live in temperate zones are most likely attuned to this differently from those from the tropics. The latter most likely have an equally well developed 'sense of humidity'. This is actually the proverbial 'multiple Eskimo names for snow' thing. English itself has about a dozen words for frozen water in different states. "The most significant aspect of any sculpture is its temperature," as Joseph Beuys put it.

  • Sense of Hearing. The only one that gets much attention after sight. You can't turn it off, you must filter it instead, using entirely unconscious processes! Again, rhythm and acoustics have an important role to play. Play on!

  • Sense of Language (understanding, communication, formal pattern - and also general semantics, but as semantics becomes less general we have...)

  • Sense of Concept (boundaries and classes, one of my favorite topics. I am 'class conscious'!)

  • Sense of Ego (i.e. 'self' or identity - aummmm - I am one with the blog editor. See 'unity' below)

OK, you might think Rudolf was cheating. Some of these 'senses' seem to overlap, some of them appear to draw only upon the neural input of the classic 'five senses' we were taught at school, but that is not the point. The point is that we develop a 'sensibility' for all of these things through many and various sense organs, some of which, like the ability to sense blood sugar levels, are located in the brain, and for most people, entirely unconscious. Still I believe a child who has had too much cola is experiencing something 'different', which the soft drinks companies like to celebrate and promote. Another epistemological puzzle then: Is a wholly unconscious sense a 'true' sense?

With this in mind, I thought I might add a few more to Steiner's wonderful list:


  • Sense of order (and by extension, clean/dirty, sequence/disorder, justice/revenge)

  • Sense of belonging (home, family, nationalism, territory, sometimes combined with the sense of order, and a strong sense of motivation, then elevated above all else, with unfortunate consequences that we might call 'fascism')

  • Sense of occasion (social intelligence?)

  • Sense of completeness/unity/wholeness. (We have to collect the WHOLE SET of these cheap plastic trinkets to make our lives more meaningful. Confluence. Togetherness (see 'Sense of Belonging', above). The ultimate gestalt: God?)

  • Sense of humour (nothing funny to add here, I am afraid - ah isn't this something to do with fear, actually? The smile is a modified simian fear signal. Say cheese or I'll eat YOU).

any more?


 
