Robust11.org symposium on inconsistency robustness 2011
actors let us deal with consistency via local arbitration, but we cannot model global consensus.
so how does consciousness work? it's a representation of global consensus, which is pretty much what attention is: a representation of things related to the self. it's a local manifestation, representation, or "assumption" of global consensus.
with many cores, the programmer does not know where the code actually executes, or even what kind of processor runs it. so there needs to be a higher-level abstraction (one that maps from high level down to low level).
in a many-core model, goto is fine; it's assignment that is the problem (which comports with representation theory). to keep an assignment in place, it needs to be "held" across a chain of representation. goto is instead the chain to the assignment, which is easier, because we can just initiate a cascading message to stable actors to elicit the assignment.
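a minimal sketch of this idea, in Python. all the names here are illustrative, not from any actor library: instead of a value being held in a shared mutable cell, a "stable" actor owns it, and another actor elicits it by sending a message, goto-style, rather than reading state.

```python
# Illustrative sketch: a value is not held as an assignment in shared
# state; a "stable" actor owns it, and others elicit it via messages.

class StableActor:
    """Owns a value; answers 'get' messages instead of exposing a variable."""
    def __init__(self, value):
        self._value = value

    def receive(self, msg, reply_to):
        if msg == "get":
            # control transfers by message, like a goto to the next actor
            reply_to.receive(("value", self._value), self)

class Requester:
    """Elicits the value via a cascading message rather than reading state."""
    def __init__(self):
        self.result = None

    def receive(self, msg, sender):
        tag, payload = msg
        if tag == "value":
            self.result = payload

holder = StableActor(42)
asker = Requester()
holder.receive("get", asker)   # 'goto' the holder; the value cascades back
print(asker.result)            # 42
```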
A COMPUTABLE UNIVERSE
Understanding Computation & Exploring Nature As Computation, Hector Zenil (editor)
(member of the Turing Centenary Advisory Committee)
petri nets fail as a model of physics (well, not really, not for chemistry) because two inputs become one output. (20140311: but this is what happens representationally. the failure in chemistry is that we treat small changes to a molecule, such as a protein acquiring some small molecule and changing shape, as a kind of state change: that the resulting molecule is still the same molecule with some small change. this is a category error. the molecules are different. it would be just as correct to say the small molecule is changed but still the same when the big molecule bonds to it. this is representational bias. there are three molecules: a big one (a protein), a small one (like glucose), and, when they combine, a third molecule. in actor terms, the two actors combine into a third actor; the actors take each other as messages. this is the basic conundrum resolved with the basic model and computational chemistry.)
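a sketch of that point, assuming nothing beyond the note itself (the molecule names are just examples): binding is not a state change to either input; two actors take each other as messages and a third actor results.

```python
# Illustrative sketch: two molecule-actors take each other as messages
# and a third actor results; neither input "becomes" the complex.

class Molecule:
    def __init__(self, name):
        self.name = name

    def receive(self, other):
        # Binding creates a new actor rather than mutating this one.
        return Molecule(f"{self.name}+{other.name}")

protein = Molecule("hexokinase")   # the big molecule
glucose = Molecule("glucose")      # the small molecule
complex_ = protein.receive(glucose)

print(complex_.name)                             # hexokinase+glucose
print(complex_ is protein, complex_ is glucose)  # False False
```

the design choice here is that `receive` returns a fresh actor instead of mutating `self`, which is the note's claim: calling the complex "the same protein, slightly changed" is exactly the representational bias being rejected.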
messages get passed nondeterministically. if you want to synchronize messages, you need another actor.
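a minimal sketch of that extra actor, under the assumption that "synchronize" means joining two streams (the `Join` name and the left/right keys are my own): messages arrive in arbitrary order, and the synchronizing actor buffers them until both are present.

```python
# Illustrative sketch: synchronization via an extra actor. Messages are
# delivered in arbitrary order; Join buffers until both have arrived.
import random

class Join:
    """Synchronizes two message streams by buffering until both arrived."""
    def __init__(self, on_both):
        self.pending = {}
        self.on_both = on_both

    def receive(self, key, value):
        self.pending[key] = value
        if {"left", "right"} <= self.pending.keys():
            self.on_both(self.pending.pop("left"), self.pending.pop("right"))

results = []
join = Join(lambda a, b: results.append((a, b)))

# deliver the two messages in a nondeterministic order
msgs = [("left", 1), ("right", 2)]
random.shuffle(msgs)
for key, value in msgs:
    join.receive(key, value)

print(results)   # [(1, 2)] regardless of delivery order
```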
An actor gets a start message, and it sends a go and a stop message. the go message comes back; the actor increments a counter and sends another go message. this recurs until the stop is received, at which time it sends a counter message. the stop can take indeterminately long: the counter may be at 1 or at 99 when the stop is received. THERE IS NO REASON THE STOP MESSAGE MUST EXIST EVEN ON THE SAME ARCHITECTURE! the stop message can be a stigmergic message sent out to an external environment, not received back until it is appropriately returned. note that in this model, messages are sent to other actors' addresses. or, in a biochem setting, the message is sent down the axon, far away from the dendrites. the neuron must receive a stop; the actual atoms may be different, but it is the same message. we can do the same thing with mark-making in the environment, marks that 'produce' stop signals in our own brains. it's message passing, and levels of message passing.
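the start/go/stop actor above can be sketched as follows. one caveat baked into the comments: a scheduler making fair random choices only approximates the point; the actor model's unbounded nondeterminism means no bound on the count can be given at all.

```python
# Illustrative sketch of the start/go/stop counter. A scheduler delivers
# pending messages in arbitrary order; the count when 'stop' lands is
# not determined by the program. (Random choice only approximates true
# unbounded nondeterminism: here the run terminates with probability 1,
# but the count has no fixed bound.)
import random

class Counter:
    """On 'start', sends itself 'go' and 'stop'; each delivered 'go'
    increments the count and re-sends 'go'; 'stop' ends the run."""
    def __init__(self):
        self.count = 0
        self.mailbox = []
        self.stopped = False

    def send(self, msg):
        self.mailbox.append(msg)

    def deliver_one(self):
        # arbitration: the scheduler picks any pending message
        msg = self.mailbox.pop(random.randrange(len(self.mailbox)))
        if msg == "start":
            self.send("go")
            self.send("stop")
        elif msg == "go" and not self.stopped:
            self.count += 1
            self.send("go")
        elif msg == "stop":
            self.stopped = True

actor = Counter()
actor.send("start")
while not actor.stopped:
    actor.deliver_one()

print(actor.count)   # could be 1, could be 99: fixed only when 'stop' arrives
```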
Address != identity.
0) it processes
1) it stores
2) it communicates
actors exist in systems.
0) An actor can create more actors
1) An actor can send messages (to addresses)
2) An actor can designate the behavior it will use for the next message it receives
I would add another axiom.
3) An actor can seek an address. it does this by sending messages to actors it knows to get the addresses they know, and then sending messages to those new addresses, until it finds an address it wants. this is how we look things up on the web too: seeking/making connections.
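the first three axioms can be sketched as below. everything here is illustrative (the `Actor` class, `greeter`/`shouter` behaviors, and the top-level creation are my own framing, not any library's API); "designate next behavior" is modeled as reassigning `self.behavior` before the next message is handled.

```python
# Illustrative sketch of the three classic actor capabilities:
# create an actor, send a message to an address, and designate the
# behavior used for the next message ("become").

class Actor:
    def __init__(self, behavior):
        self.behavior = behavior
        self.mailbox = []

    def send(self, msg):
        self.mailbox.append(msg)

    def step(self):
        if self.mailbox:
            self.behavior(self, self.mailbox.pop(0))

log = []

def greeter(self, msg):
    log.append(f"hello, {msg}")
    self.behavior = shouter   # axiom 2: designate behavior for the next message

def shouter(self, msg):
    log.append(f"HELLO, {msg.upper()}")

a = Actor(greeter)            # axiom 0: create an actor
a.send("world")               # axiom 1: send a message to its address
a.send("again")
a.step()
a.step()
print(log)   # ['hello, world', 'HELLO, AGAIN']
```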
conceptually, an actor handles one message at a time. but there is no reason to treat all messages in toto (an automata model), as if all the messages were treated as one message by the actor.
messages are sent and received in arbitrary order; the address space of an actor (or actors) receives messages nondeterministically.
note that many different actors could have the same "address" (like receptors) and receive messages nondeterministically (arbitrarily); then address is not identity.
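a short sketch of the receptor picture, with made-up names (`SharedAddress`, `Receptor`): several actors sit behind one address, an arbiter hands each message to any one of them, so the address never identifies a particular receiver.

```python
# Illustrative sketch: many actors behind one address. An arbiter picks
# any one of them per message, so address does not equal identity.
import random

class Receptor:
    def __init__(self, name):
        self.name = name
        self.got = []

    def receive(self, msg):
        self.got.append(msg)

class SharedAddress:
    """One address, several possible receivers; delivery is arbitrated."""
    def __init__(self, actors):
        self.actors = actors

    def send(self, msg):
        random.choice(self.actors).receive(msg)

cells = [Receptor("a"), Receptor("b"), Receptor("c")]
addr = SharedAddress(cells)
for i in range(10):
    addr.send(i)

total = sum(len(c.got) for c in cells)
print(total)   # 10: every message delivered, but to an arbitrary receptor
```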
Hewitt, Meijer and Szyperski: The Actor Model (everything you wanted to know, but were afraid to ask)