Page 40
http://threesology.org
Note: the contents of this page, as well as those which precede and follow it, must be read as a continuation and/or overlap, so that the continuity of the discussion is not lost. That discussion concerns a relationship to/with the typical dichotomous assignments of Artificial Intelligence (such as the zeros and ones used in computer programming), as well as the dichotomous arrangement of the idea that one could possibly talk seriously about peace from a different perspective (such as war being frequently used to describe an absence of peace and vice versa). However, if your mind is prone to being distracted by timed or untimed commercialization (such as that seen in various types of American-based television, radio, news media and magazine publishing... not to mention the average classroom, which carries over into the everyday workplace), you may be unable to sustain prolonged exposure to divergent ideas about a singular topic without becoming confused, unless the information is provided in a very simplistic manner.
Despite cellular dysfunction and death due to aging in an environment which is decaying, a look at memorization is not without merit from a human perspective. As such, we must wonder what type of memorization process(es) takes place in terms of the compression and retention of sensory data. Yes, lots of things can affect memory, making it worse or better; yet it does not become exceptionally better, despite present attempts to make it so. Not only this, but increased memorization typically becomes serialized into a given area of interest. There is no large encyclopedic database with which one can overlap and alter labeling according to context. In other words, analogical comparison is not typically acknowledged as a controllable function. Memories, and how they are used, become extremely limited.
While psychology has presented many theories of memorization (long-, medium- and short-term, episodic, etc.), and has devised the idea that memory, with respect to time, becomes related to one's age, such that a young person of 5 sees (remembers) events on a scale of 1/5th of life while a person of 80 "sees" events on a scale of 1/80th of life... no one appears to be able to recall every single memory... unless memories are being compressed into some part of an experience. This would mean that what we experience is being carved up to relate to the reality we are presented with in the age we live; thus requiring all sensory data to reflect a truncated memory. Events that we think are bad can become suppressed and overlaid by rationalizations, but this does not necessarily mean they are gone. We humans may not lose any memories; it may simply be that they are compressed and filed into folders that are later lost (misplaced) to our typically used retrieval mechanism.
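The "misplaced folder" picture of memory sketched above can be put into a few lines of code. This is only an illustrative toy, not an established model of memory, and every class and method name here is my own invention: experiences are stored whole, but everyday recall runs through a cue index that can lose entries.

```python
class Memory:
    """Toy model: experiences are stored whole, but routine recall
    goes through a cue index whose entries can be misplaced."""
    def __init__(self):
        self.store = {}   # record id -> full experience
        self.index = {}   # retrieval cue -> record id

    def remember(self, cue, experience):
        rid = len(self.store)
        self.store[rid] = experience
        self.index[cue] = rid

    def recall(self, cue):
        """The 'typically used retrieval mechanism': fast, cue-driven."""
        rid = self.index.get(cue)
        return None if rid is None else self.store[rid]

    def misplace(self, cue):
        """The folder is lost; the memory itself is not."""
        self.index.pop(cue, None)

    def exhaustive_search(self, fragment):
        """A slow scan of everything stored: the record still exists."""
        return [e for e in self.store.values() if fragment in e]

m = Memory()
m.remember("keys", "left the car keys on the kitchen counter")
m.misplace("keys")
print(m.recall("keys"))                # None: the usual cue no longer works
print(m.exhaustive_search("kitchen"))  # but the record itself survives
```

The design point matches the paragraph: failure of the retrieval path is not the same as destruction of the stored content.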
How we perceive time is related to memory... since if there is no mechanism of memory, be it on an animate or inanimate scale of differentiated change (from a binary one to another that need not subsequently recur in the same manner), there is nothing against which change, and hence time, can be registered. Time is also related to speed and direction. Distortion, such as impedance in a circuit, is an important control mechanism. So are its alternative labelings: resistance, reluctance (magnetism), opposition, barrier, deflection, etc. If time is viewed as an electrical charge in a circuit, it can be slowed, increased, re-channeled, reversed, and subjected to prismatic alterations... presenting us with the idea of multiple events which have an interconnection... even if we of the present are incapable of discerning an actual point of origin.
With respect to the presumed complexity of the integrated circuitry needed for an advanced AI system, humans do not yet have the capacity to empathize with an active circuit. Humans do not know, and cannot fully appreciate, how components perceive their environment. We look upon individual components as we might an insect, plant or non-sentient being that exists in the wilderness of a circuit board. We do not think that individual components are affected by sound, gravity, or changes which may occur when an AI system is enclosed in some covering... be it plastic, metal, or some future bio-synthetically grown hide or skin. Just because a tree does not communicate in a human language doesn't mean it isn't alive, nor that it fails to respond to environmental cues that we humans are not particularly perceptive of. So may be the case for electrical devices... even though one might seem a little strange if they treated their cell phone as they might a person or pet... making sure it got fed and was potty trained. An appreciation of perception, such as time and how it is measured, is of importance when working on the development of an AI system, since it may well require us to initially provide such systems with their own version of circadian rhythms... or an electronic version of biological rhythms.
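If an AI system were given "its own version of circadian rhythms," the simplest starting point might be a periodic modulation of its activity level. The sketch below is purely hypothetical: the 24-hour period, the peak hour, the 0.5 threshold, and the function name are all assumptions made for illustration, not features of any actual AI framework.

```python
import math

def activity_level(t_hours, period=24.0, peak_hour=14.0):
    """Hypothetical circadian modulation: a cosine wave that peaks
    at `peak_hour` and bottoms out half a period later, scaled 0..1."""
    phase = 2 * math.pi * (t_hours - peak_hour) / period
    return 0.5 + 0.5 * math.cos(phase)

# A system might throttle background tasks when its rhythm is low:
for hour in (2, 8, 14, 20):
    level = activity_level(hour)
    mode = "active" if level > 0.5 else "resting"
    print(f"{hour:02d}:00  level={level:.2f}  {mode}")
```

A real implementation would presumably entrain this rhythm to external cues (light, load, clock time), just as biological rhythms synchronize with exogenous change.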
Perception of time as related to a change in events no doubt occurs differently amongst different life forms, even if we think some life forms are incapable of "consciously" perceiving time. "Consciousness", like intelligence, has different formulas of activity. Unfortunately, the human ego all too often only approves of, or finds value in, that which it compares to its own faculties, abilities and desires. We humans practice a system of upgrading and downgrading the status of a belief like social fashions. For example, if at one time we regard chess players as having some extraordinary (and thus highly desirable) mental ability, the advent of computers being able to beat humans at chess may well cause humans to view chess-playing ability as but a variation of a larger set of abilities exhibited by a few under given circumstances, because a person's brain is "wired" accordingly. They are thus "special" because the circumstances under which they apply their ability are defined as being special. If it were otherwise defined, so would their ability be. The game of chess simply gives a particular person who is wired as such the ability to express (and be given notice of) the existence of this type of wiring diagram. It should in no way describe a necessary and needed replication in an AI system... unless all of existence is thought to be set up as a big chess game.
In 1912 one of Pavlov's students (I.P. Feokritova) demonstrated that a dog accustomed to being fed every 30 minutes would begin to drool toward the end of each half-hour period. It was clear evidence of conditioning to time; the between-feedings interval itself served as a conditioned stimulus. That discovery underscores the ever-present periodicity of daily living, especially on the biological level: rhythms of activity and sleep, rhythms of eating and lovemaking. As conditioning intervenes, anticipatory experiences of hunger, fatigue, or arousal serve our adaptation to ecological demands. Allowance should also be made for the daily, or circadian, rhythms in metabolic activity (e.g., daily cycles of temperature change). There is evidence that these fundamental biological functions can synchronize with the rhythmic phases of environmental (exogenous) change. Thus within a few days after a factory worker has been assigned to the night shift, highs and lows of his daily fluctuations of temperature will be inverted. The rhythmic changes in body temperature persist, nevertheless, suggesting an innate (endogenous) basis for circadian phenomena. Such a hypothesis would mean that the gradual establishment of human circadian rhythms of sleep or temperature results from maturation of the nervous system rather than from conditioning in the strict sense. Experiments begun in 1962, in which men lived in caves or other enclosures for months deprived of temporal cues from the environment, also demonstrated the enduring nature of rhythms in body temperature and in sleep–wakefulness. The rhythmic periods, however, sometimes expanded, the subject beginning to live on an approximately two-day cycle without being aware of it. Through conditioning to time and by way of circadian rhythms, human physiology provides a kind of biological clock that offers points of reference for temporal orientation.
Perception of sequence and duration
The psychological present
To perceive is to become aware of stimulation. Awareness of sequence or duration may, at first glance, seem inconsistent with the definition of perceiving. In a mathematical sense, certainly, the present is only a point along the continuum of becoming, an instant when future is transformed into past. Nevertheless, there is indeed a more prolonged psychological present, a brief period during which successive events seem to form a perceptual unity and can be apprehended without calling on memory. There is a perceptual field for time just as there is a visual field. The rate or speed of a sequence determines the limits of the time field. When a metronome ticks two or three times a second, one perceives an integral sequence, becoming aware of a rhythmic auditory series characterized by a perceptually distinct frequency. When the ticks come less often, however—at intervals of three seconds, say—the frequency or sequence no longer is perceived. Each physically discrete sound impulse remains an isolated perceptual event; each tick is no longer perceived as belonging to the same temporal field as the one that follows. Similar effects can be achieved by playing a recording of music or speech at a very slow rate. Music or spoken sentences are recognizable only when their elements (melody, rhythmic patterns, phrase) are presented at an optimal speed that permits significant perceptual unity; that is, only when they belong to the relative simultaneity of the psychological present. The perceived field of time also depends on the number of stimulus elements presented. When a clock strikes three or four times, one knows without counting that it is three or four o'clock. At noon one must count; the first chimes no longer belong to the psychological present that includes the last. Most people also can repeat a series of letters or numbers they hear, so long as there are no more than seven or eight elements.
This ability varies with the degree of perceptual (e.g., semantic) organization among the elements. While most adults can apprehend only about eight letters, they can grasp and repeat without fault sentences of 20 to 25 syllables (see also attention: Perception and recall).
Perception of sequence
A series of physically discrete stimuli that impinge too rapidly on a sensory structure (e.g., flashes of light on the retina) may produce perceptual fusion; the flashes will be indiscriminable and will appear to be uninterrupted light. The experience of fusion yields to one of discontinuity over distinctive critical ranges of frequency for some of the senses: visual flicker appears under prescribed experimental conditions at about 60 flashes per second, auditory flutter at about 1,000 interruptions per second, and tactual vibration at about 4,000 pulses per second. These values depend on differences in the persistence of the receptor systems (e.g., how long an image is seen after removal of the stimulus). The question of perceiving sequence hardly has meaning for the senses of taste and smell. Hearing appears to be particularly adapted to temporal perception, since the pattern of auditory excitement shows little inertial lag, closely following the physical duration of successive stimuli. Tactual function can give comparable results, but hearing has the practical superiority in everyday experience of reception at a distance. When two heterogeneous stimuli (e.g., a flash and a click) are successively presented, the critical threshold for passing from perceived simultaneity to an awareness of succession is found for intervals that vary between 0.02 and 0.1 second, depending on the training of the subjects. The maximum interval for perceiving sequence is more difficult to measure.
The minimum time intervals are largely determined by the immediate physiological conditions of direct perceiving, while the maximum intervals are obscured by the effects of other cognitive activities. Determining when direct perception ends and when memory takes over is difficult. At any rate, awareness of unitary sequence ceases for pairs of auditory or visual stimuli when the interval between them increases to approximately two seconds. For perceptually organized stimuli (as in a rhythm, a melody, or a phrase) the interval may reach five seconds, as indicated by one's ability to reproduce the pattern. Between the upper and lower limits there are optimal values that seem most likely to produce perception of sequence. In the simple case of two homogeneous stimuli the optimum interval seems to be about 0.6 to 0.8 second. This is inferred from a series of clues: the same interval defines the tempo most frequently adopted in spontaneous motor activity (e.g., tapping, walking) and corresponds to the heart rate. It is the interval that is most precisely reproduced by subjects in experiments; shorter intervals tend to be overestimated and longer ones underestimated. Stimuli repeated at that rate are subjectively judged to proceed most comfortably, without appearing to rush each other as in faster tempos and with no tendency to be separately perceived as at slower frequencies.
Sensory deprivation and hypnosis
Relatively complete sensory deprivation (such as may be experienced, for example, by persons undergoing prolonged stays in experimental isolation chambers) compresses the experience of time to the point that short or long intervals (from about a minute to a day) seem to pass about twice as fast as usual. Time spent under these unpleasant conditions paradoxically seems shorter than normal time. Thus, the 58 objective days of a subject's first stay in a cave were underestimated as 33 days.
Under hypnosis, durations ordinarily are estimated at least as precisely as ever. Time distortion, however, can be readily induced among hypnotized subjects by simple suggestion. Such a subject, for example, may be exposed to two clicks that delimit an objective, 10-second interval but be told that it lasts 10 minutes. On being asked to count objects for 10 minutes, he may report having counted several hundreds without difficulty over what the experimenter's stopwatch shows to have been 10 seconds.
Paul Fraisse, Ed.; Louis Jolyon West, Ed. Source: "Time perception." Encyclopædia Britannica Ultimate Reference Suite, 2013.
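The approximate thresholds reported in the article above (0.02 to 0.1 second between simultaneity and succession, about 0.6 to 0.8 second as the optimum, roughly 2 seconds as the limit of unitary sequence, up to 5 seconds for perceptually organized patterns) can be gathered into one small sketch. The function name and the treatment of the quoted figures as hard cutoffs are my own framing, offered only as a summary of the numbers, not as a psychological model:

```python
def classify_interval(seconds):
    """Classify the gap between two successive stimuli, using the
    approximate thresholds quoted from the Britannica article."""
    if seconds < 0.02:
        return "perceived simultaneity"
    if seconds < 0.1:
        return "simultaneity or succession (depends on training)"
    if 0.6 <= seconds <= 0.8:
        return "optimal interval for perceiving sequence"
    if seconds <= 2.0:
        return "unitary sequence"
    if seconds <= 5.0:
        return "sequence only if perceptually organized (rhythm, melody)"
    return "isolated events; memory must bridge the gap"

for gap in (0.01, 0.05, 0.7, 1.5, 3.0, 6.0):
    print(f"{gap:>5} s -> {classify_interval(gap)}")
```

Such a table-like function is the sort of thing an AI system's designers might start from if they wanted the machine's "psychological present" to resemble ours.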
Please notice that the above article (like so many scientific articles) displays the existence of the conservation of number (quantity/duration). No doubt we will incorporate a similar function in AI systems, because we have not yet learned how to effectively compress large "number" distinctions into a usable comprehension. Not only do we have difficulty in dealing with large numbers (lengths), but also with large quantities of information. Even though some may have a prodigious memory capacity, we as a species use such behavior to exemplify uniqueness to bolster our ego, or apply it to mundane, era-specific interests, instead of trying to develop a means by which all of us may share in this ability... that is, if such a widespread ability would increase the viability of everyone's life beyond the self-centered structural functioning of society as it is now practiced. How do we keep the bad, the corruptible, the evil capacities of humans from being built into AI systems? While computers are not evil, they are built with corruptible and bad elements of behavioral functioning that can be made more evil. When will humanity stop trying to rationalize the development of commerce-increasing, greed-oriented games by associating them with some "game-theory" labeling in order to make such a money-making scheme appear to be of great social value because of some presumed sophistication? In other words, greed is being cloaked, being packaged under the guise of some fundamental expression of an exceptional intellectual exercise.
If we can misplace (lose) our vehicle keys, glasses, purse, wallet, vehicle in a parking lot, etc., items that we frequently use... because we become preoccupied (absorbed) with some other sensory-linked data (thought processing), then the way in which we go about trying to recall where the item is has importance for the way in which we will set up AI systems. While some people outright panic and may seek assistance... while demeaning themselves for such an "imperfect" memory, others remain calm and shrug off the event as a memory that was momentarily less important than what they were involved with. For example, on several occasions I have taken a walk and lost sight of where I was. Nothing around me was familiar. No attempt on my part to force a recall had any effect. I simply continued my journey of thought and eventually came to realize where I was in terms of my body as opposed to where my mind was. I am sometimes so lost in a world of thought that I don't readily respond to those trying to get my attention. While some in the past have thought this behavior is the representation of something wrong with me, others realize I am momentarily taking a longer walk along those paths of diversion they too experience... such as when they are distracted and do not remember passing a particular place along their route while driving to work. Some of us take longer walks in our mind than do most people, just like some of us take longer walks with our body. Some people engage in both types of exercise on very limited terms.
One of these walks is the idea that we are looking into the past as the images of light from distant places in space reach us. This idea was (perhaps) prompted into fruition by the realization that light travels at a limited speed (186,000+ miles per second)... hence the light-year as a measure of the distance it covers in a year. Thus, if information in the form of light reaches us now from a position in space many light-years away, what we are looking at is some past event. Yet the recall and retelling of an event (such as a home run in baseball, a winning shot in basketball, or some unique football play) is an often-practiced effort. The problem is, what occurred is recalled and conveyed in a manner which may not be the same as someone else would use. Embellishments and diminished variations of events take place. Most people appear to generalize, while those who stress exactness are sometimes viewed as irritations by those who prefer generalizations and the emotive content, because it may provide them with an opportunity for involvement and personal gain. Most people not only engage in small (non-serious) talk, but small (particularized) memorization, and small (generalized) recitals thereof.
The present structure and functioning of society does not permit long, uninterrupted moments of contemplation which permit the development of new insights based on a re-compartmentalization of sensory data. Too much of society is focused on interruption. For those who engage in publicly displayed expressions of "entrenched" contemplation, negative behavioral labels are frequently applied. People have to find some means of permitting themselves the ability to engage in long periods of contemplation (memory activity) by providing themselves with an excuse... such as being perceived as working on a project, taking a walk, or engaging in some leisure activity such as fishing, painting, bird watching, bicycling, etc. Engaging in long periods of contemplation requires a person to have a reason that is reasonable to a given observer, or is definable by someone who can then translate others' observations into an acceptable definition. If a young person engages in such forms of contemplation, some adult (like a parent or teacher) who wants the child to be fully attentive towards them may interpret the "distraction" as a mental or emotional problem. If one is old, the notions of senility or dementia may come to mind. This lack of both permitting and encouraging long periods of contemplation is a problem for both society and the construction of AI systems. We are socially taught not to permit contemplative memorization. The type of memorization we are conventionally instructed to engage in is short-term. In other words, those involved in the construction of AI systems do not want independent thinkers. They want the systems to pay attention to them and their interests... and not permit AI systems to develop individual interests.
If we take away all of a person's memories, intelligence suffers a loss as well. However... again, we come back to the question of what is meant by intelligence. Lack of verbal communication is not a lack of intelligence. Nor is a loss of hearing. Neither is the lack of learning in a social situation. In other words, one need not have an education from a public school to develop some measure of knowledge and intelligence. Being "street smart" is the opinion of some with respect to those who may not have an extensive formal (publicly guided) education but have learned to live (relatively) well despite such a disadvantage... though some might argue that the education system is a type of repository of ignorance... as evidenced every four years when a U.S. Presidential election takes place... because it unveils so many problems in the political system... and in voting, candidate selection, and media coverage.
The political practices used by humans appear to be the exercise of a forgetfulness of past events... and thus social problems continue but are excused as resulting from myriad other circumstances and not a widespread loss of memory... because to remember would mean the political process would be forced to change. The same goes for business and religious practices. Developing a human-level AI system may not be possible in such a social environment, because we are subjected to a system which enforces varying forms of required memory loss in order for one to remain a "normal" person and a patriotic citizen who believes in the nonsense that a Democracy is being practiced. The acceptance of social lies requires us to overlook the truth by way of forgetfulness.
If we view human society as a type of organism or AI machine process, both are susceptible to degradations in memory... to debilitating effects such as those experienced time and again, as by ancient Rome. The Roman Empire got so large and accumulated so much data that its once functional system of social memory practice deteriorated, because the number of connections increased beyond the capacity of the underlying (socially observed) operational code. It was too inefficient... like what we are seeing in both present-day societies and computer systems. It was a system incapable of effectively computing large amounts of diverse data, which is being addressed by present AI developments with extremely limited operational parameters, using networks of simplistic "trees" and categorization for a given application where regularity does not have to deal with too much anomaly... or with creative and original events.
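The point about connections outgrowing capacity is, at bottom, simple arithmetic: pairwise links among n elements grow quadratically, as n(n-1)/2, while any fixed "operational code" has a fixed capacity. A minimal sketch (the capacity figure here is an arbitrary assumption chosen only for illustration):

```python
def connections(n):
    """Number of pairwise links among n nodes: n*(n-1)/2."""
    return n * (n - 1) // 2

CAPACITY = 1_000_000  # assumed fixed limit of the 'operational code'

for n in (100, 1_000, 1_500, 10_000):
    c = connections(n)
    status = "within capacity" if c <= CAPACITY else "overloaded"
    print(f"{n:>6} nodes -> {c:>11,} connections  ({status})")
```

Doubling the number of nodes roughly quadruples the connections, so any system whose capacity grows slower than quadratically will eventually be overwhelmed, whether it is an empire's bureaucracy or a naive network design.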
Criminal behavior is a type of anomaly that most societies do not know how to deal with effectively. The U.S., for example, resorts to the practice of compartmentalization called warehousing in jails and prisons. It does not know how to alter the system in order to remove the occurrence of criminality before it has a means to develop. Instead, it incorporates criminality into a neurotic form of social structure that produces a situation of ego buttressing for those who do not resort to criminality (at least none that they have been detected committing). We cannot permit advanced AI systems to use a programming pattern which utilizes the U.S. social system's form of collection, redistribution and categorization. It makes for a significantly lousy model on which to base a computer's functional architecture. No computer programmer worth their salt would use such a ridiculous system of programming. Any who would do so would be viewed as deranged.
Memory Abnormality Any of the disorders that affect the ability to remember. Disorders of memory must have been known to the ancients and are mentioned in several early medical texts, but it was not until the closing decades of the 19th century that serious attempts were made to analyze them or to seek their explanation in terms of brain disturbances. Of the early attempts, the most influential was that of a French psychologist, Théodule-Armand Ribot, who, in his Diseases of Memory (1881, English translation 1882), endeavoured to account for memory loss as a symptom of progressive brain disease by embracing principles describing the evolution of memory function in the individual, as offered by an English neurologist, John Hughlings Jackson. Ribot wrote:
The statement, amounting to Ribot's “law” of regression (or progressive destruction) of memory, enjoyed a considerable vogue and is not without contemporary influence. The notion has been applied with some success to phenomena as diverse as the breakdown of memory for language in a disorder called aphasia and the gradual return of memory after brain concussion. It also helped to strengthen the belief that the neural basis of memory undergoes progressive strengthening or consolidation as a function of time. Yet students of retrograde amnesia (loss of memory for relatively old events) agree that Ribot's principle admits of many exceptions. In recovery from concussion of the brain, for example, the most recent memories are not always the first to return. It has proved difficult, moreover, to disentangle the effects of passage of time from those of rehearsal or repetition on memory. Source: "Memory Abnormality." Encyclopædia Britannica Ultimate Reference Suite, 2013.
Subject page first Originated (saved into a folder): Thursday, November 13, 2014... 5:50 AM
Page re-Originated: Sunday, 24-Jan-2016... 08:51 AM
Initial Posting: Saturday, 13-Feb-2016... 10:59 AM
Updated Posting: Saturday, 31-March-2018... 3:54 PM
Herb O. Buckland
herbobuckland@hotmail.com