
The origins of the digital infrastructure employed in Asgard colonies would lie in the service bus architecture of the Modular Unmanned Orbital Laboratory - MUOL, which itself borrows from concepts and technology employed in the Aquarian Digital Infrastructure as well as the command and control architectures of the Bifrost program launch systems. This architecture would likely form the basis of vessel bus architectures for all spacecraft and facilities in the Asgard phase. The chief characteristics distinguishing the Asgard architecture from that of the MUOL would be a multiplicity of special-purpose, triply-redundant optical network backbones accommodating the large scale of habitats and a much wider reliance on wireless bridges operating both inside and outside the habitat environment and linking to the solar-system-spanning Deep Space Telemetry and Telecom Network - DST&TN.

As with the MUOL, most functional systems of the EvoHab settlement would employ IP-based ‘web controllers’ using modular code which present their direct control interfaces in the form of virtual control panels based on basic web protocols. (Today such web controllers employ HTML, XML, and Java for this purpose. By the time of Asgard this will likely have been superseded by newer software technology, but the principle would be the same.) Higher-level control would be exercised through ‘sequencer’ control programs running on other computers, which use these same IP interfaces with a byte-code command language to communicate with the individual controllers under them and which present their own set of higher-level virtual control panel and display interfaces, accessed in the same way via the network. In this way every system in the settlement can be accessed –with appropriate security limitations– from any location through a dynamic web of virtual control panels and using command console displays –PADs and consoles– of generic design.
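
As a rough present-day illustration of this scheme, the sketch below (in Python, using only standard-library HTTP) imagines a hypothetical web controller for a life-support fan that serves its own virtual control panel as a web page and accepts one-byte commands from a sequencer. The device, port number, and command codes are illustrative assumptions, not part of the TMP2 design.

```python
# A minimal sketch of the 'web controller' pattern, assuming a hypothetical
# life-support fan as the controlled device. The one-byte command codes and
# the port number are illustrative only.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical byte-code command set a higher-level sequencer might send.
COMMANDS = {0x01: "fan_on", 0x02: "fan_off", 0x03: "report_status"}

class FanController(BaseHTTPRequestHandler):
    state = {"fan": "off"}  # shared device state

    def do_GET(self):
        # Serve the controller's 'virtual control panel' as a plain web page.
        panel = (f"<html><body><h1>Fan Controller</h1>"
                 f"<p>Fan is {self.state['fan']}</p></body></html>")
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(panel.encode())

    def do_POST(self):
        # Accept a single byte-code command from a sequencer over plain IP.
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        action = COMMANDS.get(body[0]) if body else None
        if action == "fan_on":
            self.state["fan"] = "on"
        elif action == "fan_off":
            self.state["fan"] = "off"
        reply = json.dumps({"ack": action or "unknown", "state": self.state})
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(reply.encode())

if __name__ == "__main__":
    # Any PAD or sequencer on the habitat network could reach this
    # controller's panel at http://<controller-address>:8080/
    HTTPServer(("", 8080), FanController).serve_forever()
```

A ‘sequencer’ in this scheme would simply be another program that sends such byte codes to many controllers and composes their individual panels into higher-level displays.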

Computer systems of the Asgard phase would likely be based on the same Distributed Computer architectures using dynamic gate-array-based Homogeneous Processor systems developed from Foundation to Aquarius and on, these also using the IP network as the basis of their systems integration. By the time of Asgard’s large settlements this technology would be quite robust and employed as the basis of the civilization’s most advanced computers, employing a high degree of direct photonic systems integration and the latest in nanofabricated microelectronics –perhaps based on orbital manufacture. However, the hardware architectures of these systems would remain quite minimalist compared to the clockwork Swiss Army knife architectures of today’s personal computers, their systems architecture depending more on software than hardware and facilitating the on-orbit manufacture of a relatively small family of generic digital systems components with a virtually infinite variety of applications.
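
To make the homogeneous processor idea a little more concrete, the toy sketch below models a pool of identical compute nodes that all expose the same generic service over the IP network, so any task can be placed on any node. The XML-RPC transport, port numbers, and arithmetic tasks are present-day stand-ins chosen purely for illustration.

```python
# A toy model of homogeneous, interchangeable compute nodes integrated purely
# through the IP network. Node count, ports, and tasks are illustrative.
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def start_node(port):
    # Every node is identical: it offers one generic 'evaluate' service and
    # its role is defined entirely by the work sent to it, not by its hardware.
    server = SimpleXMLRPCServer(("localhost", port), logRequests=False)
    safe_names = {"__builtins__": {}, "sum": sum, "range": range}
    server.register_function(lambda expr: eval(expr, safe_names), "evaluate")
    threading.Thread(target=server.serve_forever, daemon=True).start()

if __name__ == "__main__":
    ports = [9001, 9002, 9003]            # three interchangeable nodes
    for p in ports:
        start_node(p)
    tasks = ["2**16", "sum(range(100))", "3*7"]
    # A dispatcher can hand any task to any node; losing one node simply
    # shifts its work onto the remaining identical nodes.
    for port, task in zip(ports, tasks):
        node = ServerProxy(f"http://localhost:{port}")
        print(f"node {port}: {task} = {node.evaluate(task)}")
```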

The IP network used for command and control systems would be an exclusive one but would function in an otherwise passive way with respect to the systems it hosts. There would rarely be a direct or ‘hard-wire’ interface between any of the settlement systems –something that tends to result in rather cumbersome and overcomplicated messes of cabling in traditional spacecraft– and most hardware would employ ‘black box’ designs relying largely on their virtual control interfaces and fairly generic designs in a small number of standardized form factors, allowing for more flexibility and potential repurposing. This is a somewhat radical departure from contemporary spacecraft design, but with earlier MUOL and MUOF facilities having employed the same architecture completely free of direct human physical intervention, the engineering of self-configuring and self-healing network architectures would be well established and robust enough to eliminate whole-network failures even with radical systems damage, future generations of optical network systems being able to employ optical power distribution on their data links in parallel to conventional power interfaces. With habitat computers employing Distributed Computer architectures capable of the same kind of dynamic and automatic architectural flexibility, total network and systems failure would be almost impossible. Virtually every digital device in the habitat could function in some way as back-up for others. In effect, the Asgard digital infrastructure would represent an accumulation of systems that simply can never be turned off as a whole and that constantly maintain their own integrity in response to changes.
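
The self-healing behaviour described here can be suggested with a very simple simulation: a service hosted redundantly on several nodes survives as long as any one of them still answers its heartbeat. The node names, the ‘cabin lighting’ service, and the random failure injection below are all assumptions made for the sake of the example.

```python
# A minimal simulation of redundant hosting and automatic fail-over. The
# hypothetical 'cabin_lighting' service and the failure rates are invented
# for illustration only.
import random

class Node:
    def __init__(self, name):
        self.name = name
        self.alive = True

    def heartbeat(self):
        return self.alive

    def handle(self, request):
        return f"{self.name} handled '{request}'"

def route(replicas, request):
    # Try each redundant host in turn; the service survives as long as any
    # single replica still answers its heartbeat.
    for node in replicas:
        if node.heartbeat():
            return node.handle(request)
    raise RuntimeError("all replicas down: total service failure")

if __name__ == "__main__":
    replicas = [Node("primary"), Node("backup-A"), Node("backup-B")]
    for tick in range(5):
        # Randomly knock nodes out to mimic radical systems damage.
        for node in replicas:
            node.alive = random.random() > 0.4
        try:
            print(tick, route(replicas, "cabin_lighting/set 80%"))
        except RuntimeError as err:
            print(tick, err)
```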

With network backbones divided between command and control, ‘domestic’ communications and information systems (i.e. personal data and communications, home appliance communications, and entertainment), and other application areas like industrial control, Asgard settlements would deal with an extremely broad spectrum of systems integration and communication, their networks on par with those of any major terrestrial city of the time. Each of these network backbones would be complemented by a wireless backbone extending from inside the habitat to a vicinity of some miles in radius outside and around it –and then even farther by bridge to the DST&TN. Bandwidth demand on these wireless backbones may be nearly as great as on their optical fiber counterparts, since they would be integrating some of the most sophisticated teleoperated robotics of the age. Indeed, with the emergence of AI and its use of Avatar Remotes and other means of communication, many more wireless systems would come to match telerobotic systems in their communications needs. This is a potential bottleneck issue for Asgard whose solutions cannot be completely anticipated at present.
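
Purely as an illustration of such backbone segmentation, the sketch below assigns hypothetical traffic classes to separate backbones and falls back to the wireless backbone when an optical segment is unavailable. The backbone names, traffic classes, and redundancy figures are assumptions for the example, not a specification.

```python
# An illustrative mapping of traffic classes to segregated network backbones,
# with the wireless backbone doubling as a shared fallback and external bridge.
BACKBONES = {
    "command_control": {"medium": "optical", "redundancy": 3},
    "domestic":        {"medium": "optical", "redundancy": 3},
    "industrial":      {"medium": "optical", "redundancy": 3},
    "wireless":        {"medium": "rf/free-space", "redundancy": 1},
}

TRAFFIC_CLASS = {
    "life_support_telemetry": "command_control",
    "personal_messaging":     "domestic",
    "fabricator_control":     "industrial",
    "external_telerobotics":  "wireless",
}

def select_backbone(traffic, optical_up=True):
    """Pick the usual backbone for a traffic class, bridging to the wireless
    backbone if the optical segment serving it is unavailable."""
    preferred = TRAFFIC_CLASS.get(traffic, "domestic")
    if BACKBONES[preferred]["medium"] == "optical" and not optical_up:
        return "wireless"
    return preferred

if __name__ == "__main__":
    print(select_backbone("life_support_telemetry"))                    # command_control
    print(select_backbone("life_support_telemetry", optical_up=False))  # wireless
```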

At the level of the habitat resident’s experience, reliance on Distributed Computer architectures in most digital systems would lead to a progressive generalization of systems hardware design and its submergence into the infrastructure of the habitat. By the time of Asgard this will be well developed, the ‘personal computer’ having become a quite abstract concept with systems that are at once ubiquitous and largely invisible except during maintenance work. PADs, or ‘personal access displays/devices’, will, from Aquarius on, become the common physical interface for personal computing and on Asgard will represent the basic form of all control, personal communications, and entertainment devices.

Here on Earth today we have become accustomed to a ridiculous proliferation of overspecialized user interface devices intended to lock up market share through exclusivity of design and connectivity. This sort of Industrial Age baloney won’t fly in space, where material efficiency in lifestyle and all activity is paramount. There will only be competition in digital systems platforms and architectures when new iterations of systems or radically new technologies are devised and test-introduced. So, in general, a small family of very generic IT devices will be designed to serve a very large spectrum of uses, with variations based only on functional issues or personal customization.

The most common of all user interface devices would be the tablet touch-display PAD and the similar –usually fixed-mounted– touch console combining display, touch surface, camera, microphone, and speakers. These will vary chiefly in size, in the choice between rigid and flexible displays, and in one-piece versus clam-shell designs, and their range of common applications will vary with the ergonomics associated with their size. This device will likely see such ubiquity first on Aquarius with its pursuit of Distributed Computer development. However, by the time of Asgard PADs will be fabricated with the benefit of more advanced display technology –most likely based on advanced variations of organic LED technology with touch sensor technology deeply integrated into the display matrix electronics– as well as the early generations of new flexible nanomembrane-based elastomerics and rigid diamondoid materials. Thus they will tend toward either large-area flexible displays with a more display-only role or smaller-area rigid monolithic construction of great thinness and strength, perhaps well under 0.5 cm thick yet as rigid as quarter-inch steel.

(Today the notion of flexible pocket displays has been popular among portable computing designers, largely as an attempt to reconcile the desire for display real estate with the desire for smaller form factors. But this overlooks the simple fact that such displays can’t function very well in a touch-control mode and, as has long been understood by more sophisticated designers and is now being demonstrated by the Apple iPhone, multifunctionality with expanding interconnectivity, not a multiplicity of pocket junk, is the more powerful trend in portable device use. The dominant trend underlying portable device evolution is, in fact, pointing toward increasingly generic devices based on the simple PAD model.)

Such PADs will be everywhere in the typical habitat, many fixed-mounted to function as group displays and control consoles. In fact, the entire inner-hull area of the large EvoHab, with its special light-transmitting surface, may be employed as a public display-only PAD. In order to accommodate such ubiquity, PADs are likely to feature common methods of fabrication and common electronics hardware independent of size, calling for manufacture-on-demand techniques that allow displays to be freely varied in size from one unit to the next while using the same machines and processes to make them.

The next most common mode of user interfacing in Asgard would be simple conversational interfaces using microphone and speaker systems distributed about the habitat and integrated with nearby display devices, much like the talking shipboard computers of SciFi. As synthesized speech and speech recognition steadily improve, this will become a very simple and economical technology (because most of its overhead is in software), and thus Asgard is likely to employ conversational computer interfaces as a simple means of hands-free computer control throughout the habitat. With the advent of AI this may become extremely sophisticated, though it is not likely to be as widely used as one might expect, for the simple reason that people tend to prefer more discreet communication for general activity and personal communications. A common error in futurist projections about communications is overlooking the role of privacy. This is why the video-phone –now an icon of paleofuturist folly– remains perpetually in the future even though we have had the ability to make it work for several generations. Though useful in some contexts, the video-phone is simply not discreet enough for interrupt-driven communication and may never –even as far in the future as the Solaria phase– become ubiquitous in use except in the much more passive mode of the Virtual Window-Wall Display. Similarly, routine use of conversational user interfaces will suffer the same limitation. In a work setting it would be as invaluable as having an extra set of hands. In a personal communications setting, it would tend to be cumbersome and annoying. It is a common mistake of computer designers to think that people treat their computers as personalities. In fact, routine computer users regard their machines as extensions of themselves. Conversational user interfaces can be a barrier to this if crudely designed. We don’t need or want our tools exhibiting and asserting their own personality in competition with our own intentions. Thus as AI is implemented, in any of its forms, it will most practically be implemented in rather passive modes of interface and discreet modes of autonomy –machine intelligence with the sensibility and sensitivity of the classic English butler.
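
A crude sense of how such a conversational front-end might behave, once speech has already been recognized as text, is given by the sketch below. The command patterns, subsystem names, and the deliberately deferential fallback reply are purely illustrative assumptions.

```python
# A simple command-interpretation sketch for a conversational interface.
# Recognized utterances arrive as text; speech recognition and synthesis are
# assumed to happen elsewhere. Patterns and subsystem names are invented.
import re

COMMAND_PATTERNS = [
    (re.compile(r"lights? (?P<state>on|off) in (?:the )?(?P<zone>\w+)", re.I), "lighting"),
    (re.compile(r"show (?P<panel>[\w ]+) panel", re.I), "display"),
]

def interpret(utterance):
    """Map a recognized spoken phrase to a habitat command, or defer politely."""
    for pattern, subsystem in COMMAND_PATTERNS:
        match = pattern.search(utterance)
        if match:
            return {"subsystem": subsystem, "args": match.groupdict()}
    # In the spirit of the 'English butler': stay passive when unsure.
    return {"subsystem": None, "reply": "Shall I show the available panels instead?"}

if __name__ == "__main__":
    for phrase in ("lights off in the galley", "show air recycling panel",
                   "play something relaxing"):
        print(phrase, "->", interpret(phrase))
```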

However, where conversational computer interfaces are more likely to become ubiquitous is in combination with another communications technology now emerging and likely to become well developed over the Aquarius phase: sub-lingual speech recognition. Employed in the form of neck bands combined with earphones like those of today’s Bluetooth headsets, this may become as ubiquitous a communications device as the cell phone is today, allowing for silent –and thus discreet– conversations with both computers and other human beings based on synthesized duplication of the user’s actual speech and live language translation with passive AI assistance. Such systems will almost certainly be implemented with the Asgard BioSuit and may be a ubiquitous personal communications device of the age, possibly combining with HUD-type eye displays based on small flexible gradient-index optic arms attached to the earphone unit and hosting eye-tracking retinal projection displays to allow more full PAD-like functionality.

Another likely common user interface device would be the Personal Satellite Assistant type of Remote, now being experimented with by NASA. Initially based on beach-ball- and softball-sized microgravity-mobile robots of spherical form, these are likely to evolve into pocket-sized machines and larger panel-like units that function very much like any other PAD but can be used in a PSA mode to serve as a self-mobile camera and communications device and as a self-mobile information display able to hold position in free space. They would function much like a video-phone and feature a conversational user interface. They may also operate in a teleoperated mode in conjunction with another PAD and with small robotic tools, allowing them to be used to enter, navigate, and perform some limited activities in areas inaccessible to a human technician. Such devices are also likely to become the basis of robotic pets, fashioned in the forms of fish, jellyfish, fantasy animals, or more abstract and luminous forms. And they are likely to be the first form of Avatar Remote used by sentient AI personalities residing in the larger habitats.
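
The station-keeping behaviour such a Remote would need, holding its place in free space against small disturbances, amounts to a simple feedback control loop. The sketch below shows a one-axis proportional-derivative controller with made-up gains, mass, and time step, purely to illustrate the idea.

```python
# A toy sketch of the free-space "hold position" behaviour of a PSA-type
# Remote, using a one-axis proportional-derivative (PD) controller. Gains,
# mass, thrust model, and time step are illustrative values only.
def hold_position(target, pos, vel, dt=0.1, kp=2.0, kd=3.0, mass=1.5, steps=50):
    """Return the simulated trajectory of a small free-flyer nudging itself
    back toward its target station."""
    trajectory = []
    for _ in range(steps):
        thrust = kp * (target - pos) + kd * (0.0 - vel)   # PD control law
        accel = thrust / mass
        vel += accel * dt
        pos += vel * dt
        trajectory.append(round(pos, 3))
    return trajectory

if __name__ == "__main__":
    # The Remote starts 0.5 m off station with a slight drift and settles back.
    print(hold_position(target=0.0, pos=0.5, vel=0.05))
```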

Virtual Environment Interfaces are also a likely common form of user interface device for Asgard communities. Initially developing from computer entertainment technology and telerobotic control systems, these are likely to be commonly used in similar roles on Asgard settlements. As noted in the articles on Asgard dwelling design, one-person workstations with a tubular or prismatic geometry are likely in microgravity facilities, and this will likely carry over to the design of teleoperation workstations for external robot and Remote operation based on hemispherical or cylindrical ‘halo’ displays using flexible display technology with hand-held controls at their base –a variation on the CAVE (CAVE Automatic Virtual Environment) concept of immersive virtual environment interface. These are likely to evolve into progressively more generic virtual environment interfaces with progressively more minimalist hardware designs, used both for operating remote machines and for operating in the Virtual Habitat which, along with the Internet in general, will be just as much a part of the space dweller’s lifestyle as it will be for those on Earth. At the same time, wearable retinal display and sub-lingual user interface devices will expand in capability and in their peripheral sensory and rudimentary neural interface options. A far cry from the goofy VR helmets of the late 20th century, these will compete with the CAVE approach for convenience while being especially useful in a ‘merged’ environment mode where wearable displays function in a HUD-like fashion. In addition, immersive interface rooms may become common based on total-surround display systems –a concept likely to be explored with facilities even in the Aquarius phase, most likely for situation rooms and group entertainment facilities. And by the time of Asgard we may even see the first comprehensive passive neural interface systems allowing a VEI to be based on a direct link to the neural processors of our natural senses, though this is likely to be a very complex –and potentially massive in its system hardware– technology for some time. Overall, many variations on these basic approaches to virtual environment interfacing will be explored, with a general competition in technology developing between convenience of casual use (since entertainment will dominate virtual habitat applications) and sensory performance.

Though it’s still difficult for many to imagine today, Virtual Habitats may play an increasingly significant role in future culture, though they are not likely to parallel the structure of the Internet in the way envisioned by past ‘cyberpunk’ science fiction writers. These environments will be hosted by, but be largely independent of, the Internet as a whole and will play a primarily social role, the Virtual Habitat providing a bridge across space and time for the general society, a new medium of mass entertainment, and, ultimately, a native habitat for future generations of artificially intelligent beings. Communications latency may impede interconnectivity for space-based extensions of the Virtual Habitat, resulting in a fracturing of its virtual geography and slower paces of growth for these more remote virtual domains due to their dependence on user populations for scale. However, should AI evolution move as swiftly as many expect, sentient AI populations –likely emerging later in the Asgard phase– in space habitats may grow rapidly compared to those on Earth. This is because these individuals would have distinct advantages over organic humans in space (they can travel by telecom and have very low life-support overhead), while the physical isolation of space (and sea) settlements could serve as security against possible violent persecution by neo-Luddite and pseudo-religious anti-AI movements on Earth that might emerge in parallel with AI technology –especially given the anti-science, anti-technology, and anti-reason trends in the contemporary religious right-wing sub-culture.

Considering this, Asgard residents (as well as Aquarian residents by this time) are likely to be very comfortable navigating between two distinct but parallel worlds of equal social and cultural importance: the physical habitat of their colonies and the Virtual Habitats hosted by them, eventually with their own special communities of ‘colonists.’ In effect, by this stage of development TMP may become a program of colonial expansion into, and integration between, two universes: one physical and one virtual. We will discuss this in more depth later in articles concerning Transhumanism and TMP.

ASGARD
Phases: Foundation • Aquarius • Bifrost • Asgard • Avalon • Elysium • Solaria • Galactia
Cultural Evolution: Transhumanism • Economics, Justice, and Government • Key Disruptive Technologies
References
Life In Asgard: Modular Unmanned Orbital Laboratory - MUOL • Modular Unmanned Orbital Factory - MUOF • Manned Orbital Factory - MOF • Valhalla • EvoHab • Asgard SE Upstation • Asteroid Settlements • Inter-Orbital Way-Station • Solar Power Satellite - SPS • Beamship Concept • Inter-Orbital Transport • Cyclic Transport • Special Mission Vessels • Orbital Mining Systems • The Ballistic Railway Network • Deep Space Telemetry and Telecom Network - DST&TN
Asgard Supporting Technologies: Urban Tree Housing Concepts • Asgard Digital Infrastructure • Inchworms • Remotes • Carrier Pallets • WristRocket Personal Mobility Unit • RocShaw Personal Mobility Units • Pallet Truck • ZipLine Tether Transport System • MagTrack Transport System • BioSuit • SkyGarden and SkyFarm Systems • Meat Culturing • Microgravity Food Processors • Pools and Baths in Orbit • Solar Sails • Plasma and Fusion Propulsion