art is

2019.12.04

the synthesis of two or more things, so as to reveal a previously unknown or unnoticed relationship between them

like a painter’s synthesis of a brush technique and a reference. or a director’s synthesis of a camera angle and a scene

or an author’s synthesis of words (e.g. “the happiest sadist”)

to provoke a new or changed understanding

that is to say, art is metaphor

it is the stuff of thought

there is a model of thought (a metaphor) which takes the human experience of the world to be the mapping of external stimuli onto previously-defined patterns (e.g. “red”, “plant”, “compound interest”, or “throw the baby out with the bathwater”). in this model, it is metaphor, or art, by which a new pattern is defined, its positioning against existing patterns giving it a locus and size in that pattern space (or, alternatively, by which existing patterns are shown to have previously unrecognised overlap with one another)

take, for example, the synthesis of patterns like “reinforced” and “bullet-proof” and “car” to produce a new pattern “armoured car”. or, for a more abstract example, synthesise a few mathematical patterns and some intuitions about physical phenomena and the result might be a theoretical model of physical behaviours. that is to say, theoretical physics is an art

the use of the word “pattern” helps us to draw parallels with what is currently called “ai”, and to recognise the remaining distinction between it and the lived experience of a sentient entity. modern “ai” has got very good at the pattern-matching side of this equation: it can take existing patterns and some sensory input, and determine which patterns best match that input (e.g. looking at a picture and tagging all the “dogs”)

what modern “ai” systems do not have, though, is a good system for dynamically defining new patterns or redefining existing ones, as doing so requires both an extremely broad range of defined patterns and a means by which to continuously, asynchronously, and efficiently reference those patterns against one another, to determine when a pattern ought to be defined or deformed. what they tend to do instead is reference a massive database of inputs already “tagged” with a given pattern, synthesising those into an internal representation of that pattern. this has the obvious downside of non-dynamism, which can produce unexpected edge cases (e.g. “this ai ought to recognise that this input matches a sufficiently different set of sub-patterns from what is normally matched as ‘dog’, and so it should define a new pattern ‘cat’ (or reshape the already-existing ‘dog’ and ‘cat’ patterns accordingly). because it can’t do that, however, what is obviously a ‘cat’ to us will instead be tagged as ‘dog’”). there has of course been effort in this regard, but without a broad pattern-base it can’t really work effectively
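to make the non-dynamism point a bit more concrete, here’s a minimal haskell sketch of a fixed-pattern matcher (the labels, prototypes, and nearest-match rule are toy assumptions of mine, not how any real “ai” system works): its pattern-set is fixed up front, so it can only ever force an input into one of its existing labels, never mint a new one

```haskell
-- a toy, fixed-pattern "classifier": every pattern is just a label plus a
-- prototype point in a feature space, and classification is nearest-prototype.
-- the only point being made is that the label set is fixed up front; there is
-- no way for it to notice "this is different enough to deserve a new pattern"
-- and mint one on the fly.
import Data.List (minimumBy)
import Data.Ord (comparing)

type Features = [Double]

data Pattern = Pattern { label :: String, prototype :: Features }

-- squared euclidean distance between two feature vectors
distance :: Features -> Features -> Double
distance a b = sum (zipWith (\x y -> (x - y) * (x - y)) a b)

-- always answers with the nearest existing label, however poor the fit
classify :: [Pattern] -> Features -> String
classify patterns input =
  label (minimumBy (comparing (distance input . prototype)) patterns)

main :: IO ()
main = do
  let known = [ Pattern "dog" [1.0, 0.2], Pattern "tree" [0.1, 0.9] ]
  -- a cat-ish input still gets forced into "dog" or "tree"
  putStrLn (classify known [0.8, 0.4])
```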

the human brain seems very well adapted to this sort of metaphorical processing: specialised regions for similar patterns are formed of, and linked together by, tiny asynchronously-operating processing units, allowing humans to pattern-match stimuli in parallel, and to redefine the locus/size of patterns or create new ones by re-wiring those units into different connections

modern computer architectures, by contrast, are still pretty bad at this sort of task. they are obscenely quick at linear/monolithic/data-dependency-heavy pattern matching (RE: the church-turing model, where “recognising a language” == matching an unambiguous pattern), and they are obscenely quick at purely parallel processing (RE: GPUs, matrix ops, fuzzy pattern matching). the “brain-style” model is a sort of middle ground between these two, though, for which we have some possibly-decent programming models (erlang/elixir maybe?) but still no good physical architectures. there has been a push to make the linear systems better at emulating such systems (RE: ILP, SMT, multiple cores), but a much more fundamental change is, i think, still necessary, and we’ll not see proper artificial intelligences until that point
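for a rough flavour of that middle ground, here’s a haskell sketch of the erlang/elixir-ish idea, with lightweight threads and channels standing in for the tiny units (the units themselves and their wiring are made up purely for illustration, and nothing here claims to resemble an actual neuron):

```haskell
-- a rough sketch of the "lots of tiny units passing messages" middle ground,
-- using haskell's lightweight threads and channels in the spirit of
-- erlang/elixir processes.
import Control.Concurrent (forkIO)
import Control.Concurrent.Chan (Chan, newChan, readChan, writeChan)

-- a unit waits on its inbox, does a little local work, reports, and forwards
unit :: String -> (Int -> Int) -> Chan Int -> Chan Int -> IO ()
unit name f inbox outbox = do
  x <- readChan inbox
  putStrLn (name ++ " saw " ++ show x)
  writeChan outbox (f x)

main :: IO ()
main = do
  a <- newChan
  b <- newChan
  c <- newChan
  -- two independently scheduled units, wired together only by channels
  _ <- forkIO (unit "unit-1" (+ 1) a b)
  _ <- forkIO (unit "unit-2" (* 2) b c)
  writeChan a 10
  result <- readChan c
  print result  -- (10 + 1) * 2 == 22
```

the point is just the shape of the thing: many small, independently scheduled units talking over message queues, with no single global step, which is the part current hardware doesn’t really do natively at scale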

what impact, exactly, quantum computing will have in this area, i don’t want to get into without being better informed about which classes of algorithms it can speed up exponentially, marginally, or not at all. what’s fairly certain, though, is that the cooling and isolation it requires are not compatible with portability or low-power environments, and so this sort of paradigm shift seems necessary regardless

this time’s post was mostly for familiar-with-computers people, so i apologise if it seems a bit abstruse. computers are a big part of how i understand the world (read: synthesise metaphors / create art), and so it’s hard for me to have a more in-depth discussion that doesn’t use them as reference material. if you got to this point and do want one last computer metaphor to chew on, consider the relationship between metaphor and programming. it’s easiest to see in functional programming, where one takes a set of existing functions (i.e. a pattern-set) and synthesises them to create a new function/pattern-set. one can then pass an input (some stimulus) to that new pattern-set and get back a match against it. it’s a bit obfuscated, since the complicated data types returned by a function are… complicated. it becomes a bit clearer, though, on considering that any given resultant data type can be represented by a combination of yes/no boolean values (e.g. “does it match pattern x? 1/0. pattern y? 1/0. …”, giving a string like 100101101001010…)
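and, to make that literal, one last small haskell sketch (the particular patterns here, “long word” and “contains an e”, are toy examples of mine): patterns are predicates, new patterns are synthesised by combining existing ones, and matching an input against a pattern-set reads off exactly that string of 1s and 0s

```haskell
-- the composition metaphor made literal: a "pattern" is a predicate, new
-- patterns are synthesised from existing ones, and matching an input against
-- a pattern-set reads off the 1/0 string described above.
type Pattern a = a -> Bool

-- synthesise a new pattern out of two existing ones
both :: Pattern a -> Pattern a -> Pattern a
both p q = \x -> p x && q x

-- a couple of existing patterns over words
longWord, hasE :: Pattern String
longWord w = length w > 6
hasE w = 'e' `elem` w

-- a new pattern, defined only by positioning against the old ones
longWordWithE :: Pattern String
longWordWithE = both longWord hasE

-- match a stimulus against a whole pattern-set and read off the 1/0 answers
matches :: [Pattern a] -> a -> String
matches ps x = map (\p -> if p x then '1' else '0') ps

main :: IO ()
main = putStrLn (matches [longWord, hasE, longWordWithE] "metaphor")
-- prints "111": eight letters, contains an 'e', hence also the combined pattern
```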

and so programming is also an art

or something

i’m also pretty dumb, so let me know why the above is wrong

thanks ^_^

song of the day:

Man Man - Dark Arts (someone’s auto-gen bullet hell version)