Tuesday, March 03, 2009


Currently reading "Gödel, Escher, Bach" because several people on
slashdot.org suggested it as a good entry-level mathematics book. So,
another non-fiction train-pillow for me. I hope this post will be a
place to put notes, since I don't have much of anywhere else to put
them and seem to have nothing much else to write about. Disclaimer:
as they're notes on the text, I doubt they'll be readable out of
context. And if you are familiar with the text, these notes will
probably tell you what fraction of it I'm actually understanding
(in that formal mathy way).

Preface: Hofstadter's "strange loops" seem well tied to W. B. Yeats's gyres,
Kauffman's "edge of chaos", even Strogatz's "Sync", to name a few.

Ch1

So, Gödel/Turing says there's stuff that can't be computed. But
nevertheless, that stuff is still there. If it can't be computed, and
it's not wrong (error), and it exists, how does it get there? If we
can equate computation with a cause/effect Newtonian understanding of
causality/locality, what's the stuff that transcends that? Quantum
effects seem to occur only in isolation, but it seems like a lot of
these incomputable things happen in our heads. With a nod to Penrose's
ideas linking consciousness to strange-action-at-a-distance, and a nod
to all those who pooh-pooh him because it's too hot inside our brains
for any of that, what's left over? (Think Loop Quantum Gravity, which,
as I understand it, says that gravity lines flow through a dimension
we don't perceive, but one that is still causally connected to ours.) So
we're flatlanders who have to jump off the page to nut out gravity.
Do we need to do the same to nut out the set of NP-complete problems?
We've already made one jump off the number line with the square root of −1. Do
we wade knee-deep into error-space and further irrational functions to
somehow find the same sort of value that's recently been found in
chaos, randomness, noise, and complexity? If so, does that form start to
move away from things that look like strings of symbols, strict and
rigorous, and smell more like a flower or sound like a song? (I think
that's where GEB is leading.)


Hofstadter likes the Escher images, but with regard to Babbage's Analytical Engine I like Chris Ware's self-portrait (at right). It seems to beautifully state the problem with, or the limitations of, axiomatic reasoning.
dB Drag Racing
Gödel's statement that any sufficiently powerful formal system will not be complete, and Hofstadter's image that any sufficiently powerful phonograph will have a record that can break it, remind me of dB drag racing. Yes, it exists.

Ch7

Hitting a wall with the formal system proposed in Ch7. I need some quiet time to write things down while I'm reading. But, and this is a first, afebrile math dreams. Usually if I have math dreams (a.k.a. nightmares) I'm running a fever. But last night I woke up mid-checksum, "removing 9s" or something dreamish.
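
(For my own reference, a sketch of that "removing 9s" trick - casting out nines - in Python. This is my addition, not something from the book. A number is congruent to its digit sum mod 9, so you can spot-check sums by comparing digit roots:)

def digit_root(n: int) -> int:
    # Repeatedly sum the digits; the result equals n mod 9 (with 9 standing in for 0).
    while n > 9:
        n = sum(int(d) for d in str(n))
    return n

# Spot-check 478 + 694 = 1172 by casting out nines:
# digit_root(478) = 1, digit_root(694) = 1, digit_root(1 + 1) = 2, digit_root(1172) = 2 -> consistent.
assert digit_root(digit_root(478) + digit_root(694)) == digit_root(1172)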

ChSSSSSSSSS0
I think this is where we're going: any formal system that is sufficiently powerful is incomplete/inconsistent - so where do we come from? (DNA, after all, is a formal system.)
Just getting to the idea of looking for a TNT (Typographical Number Theory) system that can show "errors" in well-formed strings. Not through it yet, so I don't know the conclusion, but I suspect I'll find that the closer we get to the windmills, the less they'll look like dragons.
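
While I'm stuck, a toy sketch of what "mechanically deriving strings in a formal system" means - my own illustration in Python, using the MIU-system from Ch1 as a stand-in for full TNT:

from collections import deque

def miu_successors(s: str):
    # Yield every string reachable from s by one MIU production rule.
    if s.endswith("I"):            # Rule 1: xI -> xIU
        yield s + "U"
    if s.startswith("M"):          # Rule 2: Mx -> Mxx
        yield s + s[1:]
    for i in range(len(s) - 2):    # Rule 3: replace any III with U
        if s[i:i+3] == "III":
            yield s[:i] + "U" + s[i+3:]
    for i in range(len(s) - 1):    # Rule 4: drop any UU
        if s[i:i+2] == "UU":
            yield s[:i] + s[i+2:]

def first_theorems(limit: int = 20):
    # Breadth-first search outward from the axiom MI.
    seen, queue, found = {"MI"}, deque(["MI"]), []
    while queue and len(found) < limit:
        s = queue.popleft()
        found.append(s)
        for t in miu_successors(s):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return found

print(first_theorems())  # MU is well-formed but never shows up:
                         # not every well-formed string is a theorem.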


ChX


Hofstadter talks about needing to get computers to chunk in the right way - to get them to see the forest for the trees. He poses the question: we look at a TV and see a woman on the screen, but the screen only contains a projection of dots. We freely stitch them together (chunk) into an image and meaning beyond their individual parts. But which one is more real?
This reminded me of recent research showing that squid don't react much to standard TV images of squid or food on the screen, but if you show them HDTV they do react. The idea is that their vision is so acute (or their persistence of vision is so short) that they see standard TV as a series of still images, or even worse, see the NTSC screen drawing the still images (no word on whether they let the squid compare PAL and NTSC) - but HDTV made the difference. (Funny, I usually can't see much of a difference, but if the squid can see it, I'm sold...)

BlooP and FlooP and GlooP


As a postscript to a later poem I wrote: (If Α = μ AND μ = Ω THEN Α = Ω)
it turns out that μ is not only the sound of Joshu's koan "non-answer", but also the symbol for a "search without bounds", the mu-operator.

So it reads: If Alpha equals Mu (a Zen sound and an infinite search) and Mu equals Omega, then Alpha equals Omega. Clearly unintended layers, but a happy confluence of meaning nonetheless.
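
Since the chapter is exactly about this distinction, a quick sketch (mine, in Python rather than Hofstadter's BlooP/FlooP) of a bounded loop versus the mu-operator's search without bounds:

def bounded_search(predicate, bound: int):
    # BlooP-style: loop at most `bound` times, so it always terminates.
    for n in range(bound):
        if predicate(n):
            return n
    return None  # no witness found below the bound

def mu(predicate):
    # FlooP-style mu-operator: smallest n satisfying predicate - may never halt.
    n = 0
    while not predicate(n):
        n += 1
    return n

print(bounded_search(lambda n: n * n > 50, 100))  # 8
print(mu(lambda n: n * n > 50))                   # also 8; but mu(lambda n: False) loops forever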

And I did the "wondrousness" test in Excel, using cell B1 as the seed:
In cell A1, enter "=IF((MOD(B1,2)>0),(3*B1)+1,(B1/2))" and copy it down several hundred rows. Then set B2 = "=A1" and fill down. You can see an interesting example by putting the number 4321 in B1 as the seed. It takes 171 lines, but it has the property of "wondrousness". (Hint: insert a simple line graph to see how the numbers hop around - which is where the wondrousness shows itself.)
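
The same test, sketched in Python for anyone without Excel handy (my translation of the spreadsheet, so treat it as a sketch):

def wondrous_sequence(seed: int):
    # Mirrors the spreadsheet column: odd n -> 3n + 1, even n -> n / 2, until we hit 1.
    seq = [seed]
    while seq[-1] != 1:
        n = seq[-1]
        seq.append(3 * n + 1 if n % 2 else n // 2)
    return seq

seq = wondrous_sequence(4321)
print(len(seq), max(seq))  # how many lines it takes, and how high the numbers hop
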
=================================
Hundreds of pages later, I come to the understanding: "mathematicians study isomorphisms, but there's got to be more to life than isomorphisms". Don't get me wrong, I very much appreciate being hand-held through the subtle points of Quine's paradox and Gödelization.

And mind-moving? The idea of isomorphisms between the image on the back of the retina and "meaning" or "recognition"/"cognition" has been lolling around in my head. Then I saw the Wade Davis TED talk. He talks about the greater meaning of losing languages (we've lost about 3,000 languages in our lifetime) - they're not just ways of talking, but ways of thinking (i.e., the isomorphisms). This brings me back to the question of computer AI - computers are Turing machines, which are, by definition, "incomplete". From such a system, are we going to expect recognition that, for example, a mountain is a personification of God? Or that a mountain is a pile of rock? Or anything in between?

But if you compare human intelligence to a computed isomorphism, couldn't you fault human intelligence for that "anything in between", whereas a computed isomorphism is "exact"?

There seems to be a plasticity to human intelligence that blobs over the holes in number theory. So a paradox or infinitely recursive function that might confound or disable a computer is dealt with "out of hand" by people. "Yields a falsehood when preceded by itself" yields a falsehood when preceded by itself. "The next statement is false. The previous statement is true." We run into these sorts of things every day - we just don't know it because we're so good at handling them. Examples of human behavior being illogical are not difficult to find. But even when we think we're being "reasonable"... see Dan Gilbert's 2004 TED talk: "Why are we happy? Why aren't we happy?"
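
That "preceded by itself" operation is mechanical enough to write down - a sketch (mine, not from the book) of how Quine's paradox assembles a sentence that talks about itself:

def precede_by_itself(phrase: str) -> str:
    # Return the phrase preceded by its own quotation.
    return '"' + phrase + '" ' + phrase

print(precede_by_itself("yields a falsehood when preceded by itself"))
# -> "yields a falsehood when preceded by itself" yields a falsehood when preceded by itself
# The printed sentence asserts its own falsehood - the paradox, fully spelled out.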

So are we putting "accurate" up against "plastic"? The "funny" thing is, in Darwinian terms, math wins. People-thinking is new. Math is old. What persists is math over biology. Is biology "inside" math?
