2007/09/17

Taking Notes: 'A Few Notes on the Culture' (1994, 2004) by Iain M. Banks

This essay is Iain M. Banks speaking in his own voice, laying out the various settings of the Culture, the interstellar utopian civilization of his imagination. It was originally posted to the newsgroup rec.arts.sf.written by Ken MacLeod on the author's behalf. From this piece, readers can on the one hand gain a deeper understanding of the Culture's background, and on the other hand test whether Banks's hyper-technological utopian vision holds together.


Owing to my own guilty pleasures, these notes still focus mainly on the political, economic, and philosophical ideas; the techno-babble is left out.



Edition read:

Iain M. Banks, "A Few Notes on the Culture," in The State of the Art (San Francisco: Night Shade Books, 2004), pp. 167-188.


Highlights:

p. 167

The Culture is a group-civilization formed from seven or eight humanoid species, space-living elements of which established a loose federation approximately nine thousand years ago. The ships and habitats which formed the original alliance required each other's support to pursue and maintain their independence from the political power structures—principally those of mature nation-states and autonomous commercial concerns—they had evolved from.

p. 168
The Culture, in its history and its on-going form, is an expression of the idea that the nature of space itself determines the type of civilizations which will thrive there.

The thought processes of a tribe, a clan, a country or a nation-state are essentially two-dimensional, and the nature of their power depends on the same flatness. Territory is all-important; resources, living space, lines of communication; all are determined by the nature of the plane (that the plane is in fact a sphere is irrelevant here); that surface, and the fact the species concerned are bound to it during their evolution, determines the mind-set of a group-living species. The mind-set of an aquatic or avian species is, of course, rather different.

Essentially, the contention is that our currently dominant power systems cannot long survive in space; beyond a certain technological level a degree of anarchy is arguably inevitable and anyway preferable.

To survive in space, ships/habitats must be self-sufficient, or very nearly so; the hold of the state (or the corporation) over them therefore becomes tenuous if the desires of the inhabitants conflict significantly with the requirements of the controlling body. …… In space, a break-away movement will be far more difficult to control, especially if significant parts of it are based on ships or mobile habitats. ……
p. 169
Concomitant with this is the argument that the nature of life in space—that vulnerability, as mentioned above—would mean that while ships and habitats might more easily become independent from each other and from their legally progenitive hegemonies, their crew—or inhabitants—would always be aware of their reliance on each other, and on technology which allowed them to live in space. The theory here is that the property and social relations of long-term space-dwelling (especially over generations) would be of a fundamentally different type compared to the norm on a planet; the mutuality of dependence involved in an environment which is inherently hostile would necessitate an internal social coherence which would contrast with the external casualness typifying the relations between such ships/habitats. Succinctly: socialism within, anarchy without. This broad result is—in the long run—independent of the initial social and economic conditions which give rise to it.

Let me state here a personal conviction that appears, right now, to be profoundly unfashionable, which is that a planned economy can be more productive—and more morally desirable—than one left to
p. 170
market forces.

......

Intelligence, which is capable of looking further ahead than the next aggressive mutation, can set up long-term aims and work towards them; the same amount of raw invention that bursts in all directions from the market can be—to some degree—channeled and directed, so that while the market merely shines (and the feudal gutters), the planned lases, reaching out coherently and efficiently towards agreed-on goals. What is vital for such a scheme, however, and what was always missing in the planned economies of our world’s experience, is the continual, intimate, and decisive participation of the mass of the citizenry in determining these goals, and designing as well as implementing the plans which should lead towards them.

Of course, there is a place for serendipity and chance in any sensibly envisaged plan, and the degree to which this would affect the higher functions of a democratically designed economy would be one of the most important parameters to be set…but just as the information we have stored in our libraries and institutions has undeniably outgrown (if not outweighed) that resident in our genes, and just as we may, within a century of the invention of electronics, duplicate—through machine sentience—a process which evolution took billions of years to achieve, so we shall one day abandon the grossly targeted vagaries
p. 171
of the market for the precision creation of the planned economy.

The Culture, of course, has gone beyond even that, to an economy so much a part of society it is hardly worthy of a separate definition, and which is limited only by imagination, philosophy (and manners), and the idea of minimally wasteful elegance; a kind of galactic ecological awareness allied to a desire to create beauty and goodness.

......

It is, of course, entirely possible that real AIs will refuse to have anything to do with their human creators (or rather, perhaps, the human creators of their non-human creators), but assuming that they do—and the design of their software may be amenable to optimization in this regard—I would argue that it is quite possible they would agree to help further the aims of their source civilization (a contention we’ll return to shortly). At this point, regardless of whatever alterations humanity might impose on itself through genetic manipulation,
p. 172
humanity would no longer be a one-sentience-type species. The future of our species would affect, be affected by, and coexist with the future of the AI life-forms we create.

The Culture reached this phase at around the same time as it began to inhabit space. Its AIs cooperate with the humans of the civilization; at first the struggle is simply to survive and thrive in space; later—when the technology required to do so has become mundane—the task becomes less physical, more metaphysical, and the aims of civilization moral rather than material.

Briefly, nothing and nobody in the Culture is exploited. It is essentially an automated civilization in its manufacturing processes, with human labor restricted to something indistinguishable from play, or a hobby.

......

Where intelligent supervision of a manufacturing or maintenance operation is required, the intellectual challenge involved (and the relative lightness of the effort required) would make such supervision rewarding and enjoyable, whether for human or machine. The precise degree of supervision required can be adjusted to a level which satisfies the demand for it arising from the nature of the civilization’s members. People—and, I’d argue, the sort of conscious machines which would happily cooperate with them—hate to feel exploited, but they also hate to feel useless. One of the most important tasks in setting up and running a stable and internally content civilization is finding an acceptable balance between the desire for freedom of choice in one’s actions (and the freedom from mortal fear in one’s life) and the need to feel that even in a society so self-correctingly Utopian one is still contributing something. Philosophy matters, here, and sound education.

Education in the Culture is something that never ends; it may be at its most intense in the first tenth or so of an individual’s life, but it goes on until death (……). To live in the
p. 173
Culture is to live in a fundamentally rational civilization (this may preclude the human species from ever achieving something similar; our history is, arguably, not encouraging in this regard). The Culture is quite self-consciously rational, skeptical, and materialist. Everything matters, and nothing does. ……

......

An understanding of the place the Culture occupies in the history and development of life in the galaxy is what helps drive the civilization’s largely cooperative and—it would claim—fundamentally benign techno-cultural diplomatic policy, but the ideas behind it go deeper. Philosophically, the Culture accepts, generally, that questions such as “What is the meaning of life?” are themselves meaningless. The question implies—indeed an answer to it would demand—a moral framework beyond the only moral framework we can comprehend without resorting to superstition (and thus abandoning the moral framework informing—and symbiotic with—language itself).

In summary, we make our own meanings, whether we like it or not.
p. 174
The humans of the Culture, having solved all the obvious problems of their shared pasts to be free from hunger, want, disease and the fear of natural disaster and attack, would find it a slightly empty existence only and merely enjoying themselves, and so need the good-works of the Contact section to let them feel vicariously useful. For the Culture’s AIs, that need to feel useful is largely replaced by the desire to experience, but as a drive it is no less strong. The universe—or at least in this era, the galaxy—is waiting there, largely unexplored (by the Culture, anyway), its physical principles and laws quite comprehensively understood but the results of fifteen billion years of the chaotically formative application and interaction of those laws still far from fully mapped and evaluated.

......

This is where I think one has to ask why any AI civilization—and probably any sophisticated culture at all—would want to spread itself everywhere in the galaxy (or the universe, for that matter). It would be perfectly possible to build a Von Neumann machine that would build copies of itself and eventually, unless stopped, turn the universe into nothing but those self-copies, but the question does arise; why? What is the point? To put it in what we might still regard as frivolous terms but which the Culture would have the wisdom to take perfectly seriously, where is the fun in that?

Interest—the delight in experience, in understanding—comes from the unknown; understanding is a process as well as a state, denoting the shift from the unknown to the known, from the random to the ordered… a universe where everything is already understood perfectly and where uniformity has replaced diversity, would, I’d contend, be anathema to any self-respecting AI.

Next the essay describes the beings who live in the Culture. Below is an excerpt on the concept of death:
p. 177
Philosophy, again; death is regarded as part of life, and nothing, including the universe, lasts forever. It is seen as bad manners to try and pretend that death is somehow not natural; instead death is seen as giving shape to life.

......

None of this, of course, is compulsory (nothing in the Culture is compulsory). Some people choose biological immortality; others have their personality transcribed into AIs and die happy feeling they continue to exist elsewhere; others again go into Storage, to be woken in more (or less) interesting times, or only every decade, or century, or aeon, or over exponentially increasing intervals, or only when it looks
p. 178
like something really different is happening…

Culture starships—that is, all classes of ship above interplanetary—are sentient; their Minds (sophisticated AIs working largely in hyperspace to take advantage of the higher lightspeed there) bear the same relation to the fabric of the ship as a human brain does to the human body; the Mind is the important bit, and the rest is a life-support and transport system. Humans and independent drones (the Culture’s non-android individual AIs of roughly human-equivalent intelligence) are unnecessary for the running of the starships, and have a status somewhere between passengers, pets, and parasites.

The rest of the essay goes into more detailed settings, which I won't list here.
