014-Make a mind

The mind is a representation. It is not a precursor to representation. We do not observe a mind and then, a moment later, see that mind making representations.

We don't see our mind preceding our representational experiences in ourselves, and we do not see the minds of others preceding their representation making. Before we even guess whether a person or a creature or a computer program has a mind, we wait for them to make representations and demonstrate awareness. Only then do we begin to suspect that there is a mind. We encounter representations, in ourselves and in others, before we conceive of a mind as a way to explain the representation making and the awareness.

This doesn't mean that there is no such thing as a mind. But the concept of a mind that generates awareness and representations is flawed. It requires us to assume something that only exists post-representationally. It does not descend from our first principles; instead it requires other assumptions about the nature of the world, a kind of backward way of describing representation and awareness. The proposition of a mind requires us to make certain kinds of representations that we cannot show to be products of first principles.

Maybe the body is the concept we need to consider as the source of awareness and representations: a body that produces awareness and makes representations. But this concept is just a substitute for mind, with a different set of entailed assumptions.

A mind is a good general word to describe what it is we are attempting to make: an object that thinks, reasons, imagines, communicates, creates, expresses, and explores. A man-made object that is aware and creates representations. Where I have written artificial mind, I ALWAYS mean it in this vernacular way.

I never mean that there is really such a thing as a mind, any more than I mean that there is such a thing as a unicorn. It could be that a mind will be a computer program or an operating system. But using that kind of computer lingo to describe the variety of organisms or artificial creations that could have minds is misleading, just as using anthropomorphic words to describe computer behavior is misleading.

I propose that a vernacular mind be understood as a thing which is aware and makes representations. Such a thing creates and works with representations. It may interface with the physical world or a representational world via cameras, touch sensors, nerve endings, auditory cilia, or any of a variety of physical-to-representational transition points. A mind may do all kinds of processing with nerve cells, or with programs and computer chips, or cellular automata, or some molecular processes, or some combination of any or all of these configurations.

However, a vernacular mind is a reference description only. It is not specifically the thing that is aware and makes representations. The name I propose for that kind of thing is a "repper". A repper is something that is aware and creates representations. Or it is an instantiation of awareness and representation making. It is representation making performed by an object.
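
To make the repper idea a little more concrete, here is a minimal sketch in Python. It assumes only what is said above: a repper has some physical-to-representational transition point, makes representations, and is aware of them. The names Repper, sense, represent, and attend are illustrative placeholders, not a claim about how a real repper would be built.

class Repper:
    """Anything that is aware and makes representations (a sketch, not a theory)."""

    def sense(self, physical_input):
        # A physical-to-representational transition point:
        # camera, touch sensor, nerve ending, auditory cilium, ...
        raise NotImplementedError

    def represent(self, sensed_input):
        # Turn sensed input into a representation the repper can work with.
        raise NotImplementedError

    def attend(self, representation):
        # Act on the representation as a representation (awareness),
        # not merely as a physical stimulus.
        raise NotImplementedError

Whether the processing behind these methods is done by nerve cells, chips, cellular automata, or molecules is deliberately left open; the repper is defined by what it does, not by what it is made of.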

A repper does not have to have a mind in the ordinary sense. It could be that the first reppers will be too primitive to have what we would call a mind. But what should be common to all reppers, regardless of their complexity, is that they make representations and thus show awareness. Humans are reppers. A hypothetical AI like HAL 9000 would be a repper. An octopus is a repper. An ant is a repper.

We don't have to propose that an ant has a mind, but we do know ants are aware.  Ants demonstrate awareness in the same functional way I have described here (AW:X = X). 

We know ants produce and deposit chemicals for trail marking and communication. Chemical markings from ants do not produce the same behavior in all ants, or even the same behavior at all times. Ergo, there is a meaning attached to the chemical signals, and different ants do different things with those signals. That is, the chemical markings made by ants are representations. Ants are aware of the chemical signals, not just as chemicals but as representations. Thus ants make representations by marking with chemical signals. (see the notes below)
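
As a toy illustration of that inference, the Python sketch below contrasts a fixed trigger with a signal that functions as a representation: the same chemical leads to different behavior depending on what the receiving ant is doing. The roles and behaviors named here (forager, soldier, follow_trail, and so on) are made up for the example, not drawn from ant biology.

def reflex_response(chemical):
    # A mere trigger: the same chemical always produces the same behavior.
    return "follow_trail"

def representational_response(chemical, ant_role, colony_state):
    # A representation: the same chemical is taken differently depending on
    # the ant's role and the colony's state. The signal carries a meaning the
    # ant attends to, rather than simply causing one fixed reaction.
    if chemical != "trail_pheromone":
        return "ignore"
    if ant_role == "forager" and colony_state == "needs_food":
        return "follow_trail"
    if ant_role == "soldier":
        return "patrol_near_trail"
    return "ignore_trail"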

We see other kinds of representational behavior in homing bugs (see below).

We do not have to look very far in the biological literature to find that at least animals with nervous systems demonstrate awareness and representational activity. That is, they are reppers.



notes: Do representation and awareness extend to creatures that do not have nervous systems? There is some evidence for it.
http://www.sciencedaily.com/releases/2009/06/090617131400.htm
ScienceDaily (June 18, 2009) — Bacteria can anticipate a future event and prepare for it, according to new research at the Weizmann Institute of Science. In a paper that appeared June 17 in Nature, Prof. Yitzhak Pilpel, doctoral student Amir Mitchell and research associate Dr. Orna Dahan of the Institute's Molecular Genetics Department, together with Prof. Martin Kupiec and Gal Romano of Tel Aviv University, examined microorganisms living in environments that change in predictable ways.

http://discovermagazine.com/2009/jan/071

Top 100 Stories of 2008 #71: Slime Molds Show Surprising Degree of Intelligence

A creature with no brain can learn from and even anticipate events.

by Jennifer Barone

From the January 2009 issue; published online December 9, 2008

Single-celled slime molds demonstrate the ability to memorize and anticipate repeated events, a team of Japanese researchers reported in January. The study clearly shows “a primitive version of brain function” in an organism with no brain at all.

In their experiment, biophysicist Toshiyuki Nakagaki of Hokkaido University and colleagues manipulated the environment of Physarum slime-mold amoebas. As the cells crawled across an agar plate, the researchers subjected them to cold, dry conditions for the first 10 minutes of every hour. During these cool spells, the cells slowed down their motion. After three cold snaps the scientists stopped changing the temperature and humidity and watched to see whether the amoebas had learned the pattern. Sure enough, many of the cells throttled back right on the hour in anticipation of another bout of cold weather. When conditions stayed stable for a while, the slime-mold amoebas gave up on their hourly braking, but when another single jolt of cold was applied, they resumed the behavior and correctly recalled the 60-minute interval. The amoebas were also able to respond to other intervals, ranging from 30 to 90 minutes.

The scientists point out that catching on to temporal patterns is no mean feat, even for humans. For a single cell to show such a learning ability is impressive, though Nakagaki admits he was not entirely surprised by the results. After working with the slime mold for years, he had a hunch that “Physarum could be cleverer than expected.” The findings of what lone cells are capable of “might be a chance to reconsider what intelligence is,” he says.



Recruitment trails, colonies, and territory are all examples of representations. It is impossible to read this article in a way that does not suggest ants make representations. The variety of responses by colonies indicates that the ants are attending to different objects of awareness and representation, which leads to a variety of behavior among colonies.
http://atta.labb.usb.ve/Klaus/art15.pdf




A hierarchy is a representation. (A toy sketch of this hierarchical cue use follows the abstract below.)

http://www.sciencedirect.com/science/article/pii/S0003347207000140
Hierarchical use of chemical marking and path integration in the homing trip of a subsocial shield bug

Mantaro Hironaka, Lisa Filippi, Shintaro Nomakuchi, Hiroko Horiguchi and Takahiko Hariyama

Department of Biology, Hofstra University, U.S.A.; Department of Applied Biological Sciences, Faculty of Agriculture, Saga University, Japan; Department of Biology, Faculty of Medicine, Hamamatsu University School of Medicine, Japan

Received 15 February 2006; revised 24 March 2006; accepted 8 June 2006. Available online 6 April 2007.

The female shield bug Parastrachia japonensis provisions its young by dragging fruit to its burrow. Field observations showed that the bug took a winding path when searching for a suitable fruit, but took the shortest route when homing to the burrow. Displaced homing bugs always walked straight towards the fictive burrow, suggesting that they use path integration to orient. After a homing bug neared the entrance of its burrow, it stopped and started beating the surface of fallen leaves with its antennae. To determine whether these bugs use navigational cues other than those used for path integration, when in the vicinity of the burrow, we blocked their sensory organs and presented them with their own burrows in a laboratory experiment. Although nearly all bugs whose eyes had been blocked found their burrows, the antennae-blocked bugs did not. Homing bugs encountering various experimentally manipulated burrows, such as those containing their own nymphs with an alien burrow's substrate or their own burrow's substrate with alien nymphs, entered burrows only if they were made of the original substrate. When we presented bugs with their own burrows along the homing route, they entered their burrows at every homing point, even at the earliest stage of the homing process. These results suggest that the chemical cues marked around a female's burrow can suppress the use of path integration, and that P. japonensis uses cues hierarchically to accomplish precise homing.

Keywords: chemical cue; homing; navigation; orientation; Parastrachia japonensis; path integration; shield bug
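
The hierarchical cue use described in this abstract can be read as a small decision procedure. The Python sketch below is only an interpretation of the summary above, with made-up function and variable names; the paper itself does not present its findings this way.

def choose_heading(burrow_scent_detected, scent_direction, path_integration_vector):
    # Higher-priority cue: chemical marking around the burrow, sensed with the
    # antennae; when detected, it suppresses path integration.
    if burrow_scent_detected:
        return scent_direction
    # Otherwise fall back on path integration toward the (possibly fictive) burrow.
    return path_integration_vector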



