The Millennial Project 2.0
Image: NASA AERCam, exploded view

Remotes are a class of microgravity robots that would be used for a large variety of tasks on the Asgard settlement. They would be divided into two groups: those designed for use within a pressurized environment and those designed for use in the ambient space environment. There may be some cross-over between these groups, though generally there would be significant differences in scale between systems designed for the harsher environment of ambient space and those for the more benign environment inside a habitat. Remotes would also be employed terrestrially, usually as heavy industrial and excavation telerobotic systems and as a variety of personal robots intended primarily for some degree of autonomous control but capable of teleoperation as a design option.

Remotes would be an evolution of the telerobotics technology cultivated in the early stages of Asgard with the development of MUOL and MUOF facilities. There, predominantly tracked robots –typified by the Inchworm telerobot based on a single multi-jointed arm– would be employed. The key innovation of the more advanced Remotes of larger Asgard settlements would be free space mobility based on nitrogen thruster and/or fan propulsion, modular gyroscopic stabilization, and modular flight/attitude microcontrollers with chip-based inertial sensors. The term ‘remote’ derives from the most common role of these systems as telerobots providing a ‘remote point of awareness’ for a human operator some distance away. This, in fact, would be how most of these systems are used in the ambient space environment. However, as computer-based intelligence advances, these robots would be increasingly used alternately under human and computer control, relying on remote teleoperation by computer in the same way –and by the same interfaces– as human teleoperation. (They may also increase in on-board intelligence, but in general it is more efficient to employ teleoperation from centralized computers, which can coordinate the activities of robots in groups and afford much greater intelligence performance for the cost.) With the eventual advent of sentient artificial intelligence, Remotes would become the ‘space suits’ or ‘real-world avatars’ of these new beings –their means of reaching out from the confines of their virtual habitats to interact directly in the physical domain. As we’ll discuss, a special class of Remotes for this purpose may be developed with more emphasis on aesthetics, communication, and emotional expression.
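
The stabilization scheme described above (chip-based inertial sensing feeding a flight/attitude microcontroller that commands nitrogen thrusters) can be sketched as a simple feedback loop. The following is a minimal single-axis illustration, assuming a PD control law with hypothetical gains and a hypothetical rigid-body model; it is an illustrative sketch, not a specified design.

```python
import math

class AttitudeController:
    """Single-axis attitude-hold loop for a free-flying Remote.

    A gyro/inertial sensor supplies the current angle and angular rate;
    the output is a torque command for paired cold-gas (nitrogen)
    thrusters. Gains and names here are hypothetical.
    """

    def __init__(self, kp: float = 4.0, kd: float = 2.5):
        self.kp = kp  # proportional gain on attitude error (per rad)
        self.kd = kd  # derivative gain on angular rate (per rad/s)

    def torque_command(self, target_rad, angle_rad, rate_rad_s):
        # Wrap the attitude error into (-pi, pi] so the controller
        # always slews the short way around.
        error = (target_rad - angle_rad + math.pi) % (2 * math.pi) - math.pi
        return self.kp * error - self.kd * rate_rad_s

# Minimal simulation: a 1 kg*m^2 body slewing to a 90-degree target.
ctrl = AttitudeController()
angle, rate, dt, inertia = 0.0, 0.0, 0.01, 1.0
for _ in range(2000):  # 20 simulated seconds of Euler integration
    torque = ctrl.torque_command(math.pi / 2, angle, rate)
    rate += (torque / inertia) * dt
    angle += rate * dt

print(abs(angle - math.pi / 2) < 0.01)  # settled near the target
```

In practice each axis of a real Remote would run such a loop, with the thrusters or fans providing the commanded torque in discrete pulses and the gyroscopic stabilizers absorbing fine disturbances.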

The Asgard Remote would be typified in design by the NASA Personal Satellite Assistants of today: a series of small spherical robots designed to serve primarily as self-mobile camera platforms and communications devices. More sophisticated and smaller later-generation PSAs may be common personal tools on Asgard and may supersede spherical forms with small block shapes dominated by their stereo camera and other sensor systems –looking rather like self-mobile goggles or binoculars with fold-out or microprojector displays. These could become quite tiny and easily carried in a pocket when not in use. Other personal-robotics applications would include self-mobile smart-positioning displays and small CarrierBots serving in the role of package delivery. Toys and robotic pets based on this technology are also quite likely. These small Remotes would tend to have limited physical capabilities owing to their low mass and small size. In most cases their interaction with the surrounding environment would rely on wireless communication with other machines rather than any direct physical contact, and their designs would tend to focus more on communication and information gathering.
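
The contact-free, machine-to-machine interaction described above can be illustrated with a toy message format: a pocket PSA asks a CarrierBot to fetch a package without either device touching anything. This sketch assumes a JSON-serialized message on a habitat wireless bus; all field and device names are hypothetical.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class DeliveryRequest:
    """Hypothetical request a small PSA sends to a CarrierBot."""
    sender_id: str     # requesting PSA's identifier
    package_id: str    # tagged item to fetch
    dropoff_node: str  # habitat location code for delivery
    priority: int = 0  # 0 = routine

    def to_wire(self) -> bytes:
        # Serialize for transmission over the wireless message bus.
        return json.dumps(asdict(self)).encode("utf-8")

    @classmethod
    def from_wire(cls, payload: bytes) -> "DeliveryRequest":
        # Reconstruct the request on the receiving robot.
        return cls(**json.loads(payload.decode("utf-8")))

msg = DeliveryRequest("psa-0042", "pkg-7731", "hab2/deck3/node17")
assert DeliveryRequest.from_wire(msg.to_wire()) == msg  # round-trips
```

The point of the sketch is the division of labor: the low-mass personal Remote only gathers and relays information, while physical work is delegated wirelessly to machines built for it.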

Remotes designed for the ambient space environment, for more direct physical activity, and for more industrial activities are more likely to assume designs akin to today’s marine Remotely Operated Vehicles, relying on boxy open space-frame chassis whose greater mass and ability to employ larger gyroscopic stabilizers make them better suited to hosting tools and power systems operating for protracted periods in the space environment. Designed as multi-purpose platforms in related ‘series’ of designs at varying scales, these robots would be fitted with a variety of modular tools according to their particular jobs at any given time. They may also include actuators designed to interface with tracked movement systems.
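
The modular tool fitting described above implies a common mounting interface and some negotiation of the chassis’s limited resources. A minimal sketch, assuming a hypothetical power-budget check and made-up module names and figures:

```python
from abc import ABC, abstractmethod

class ToolModule(ABC):
    """Hypothetical common interface letting one Remote chassis
    host interchangeable tools."""

    @abstractmethod
    def power_draw_w(self) -> float: ...

    @abstractmethod
    def engage(self) -> None: ...

class GripperModule(ToolModule):
    def power_draw_w(self) -> float:
        return 15.0

    def engage(self) -> None:
        print("gripper closed")

class RemoteChassis:
    def __init__(self, power_budget_w: float):
        self.power_budget_w = power_budget_w
        self.tools: list[ToolModule] = []

    def mount(self, tool: ToolModule) -> bool:
        # Accept the tool only if the chassis can still power it.
        used = sum(t.power_draw_w() for t in self.tools)
        if used + tool.power_draw_w() > self.power_budget_w:
            return False
        self.tools.append(tool)
        return True

rig = RemoteChassis(power_budget_w=40.0)
assert rig.mount(GripperModule())
assert rig.mount(GripperModule())
assert not rig.mount(GripperModule())  # a third gripper exceeds the budget
```

The same interface idea extends naturally to the tracked-movement actuators mentioned above: a track adapter is simply another module the chassis can accept or refuse.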

Most exterior activity on Asgard habitats would long rely on the much simpler and more robust teleoperated Inchworm robotic arms first developed for MUOL support. These robots would simply be stronger, able to handle heavier items, and far lower in maintenance. However, with free mobility, later-generation Remotes would be capable of operating at distances of many miles from a habitat and able to traverse its surface with much greater speed (indeed, greater than a suited astronaut), affording them very quick access to every spot around the habitat. Applications would include self-mobile tool carriers and power supplies for human technicians, floating debris collectors, plant and farm tenders, heavy maintenance systems, and construction systems. Some advanced forms of these –as has already been explored by space agencies– may assume forms akin to a box-like human torso with multiple actuators and an anchoring ‘tail’ intended to plug into Inchworm interface ports, attaching to the end of Inchworm robotic arms, though for most tasks the full emulation of human ergonomics may represent unnecessary complexity.

Some of the most advanced of all Remotes eventually developed would be those intended as ‘avatars’ for sentient AI beings, designed less for functional performance than for the role of interaction with organic human beings. Often depicted as alien intelligences or comical robots with Pinocchio Syndrome in the classic SciFi media, AI beings, being the products of reverse-engineering human intelligence, are much more likely to be ‘human’ in character than has commonly been imagined. These beings are likely to be gregarious and social, desiring organic human interaction for exactly the same mix of reasons we humans today do. But their reliance on virtual environments as a primary habitat will be a serious obstacle in this respect. It will long be far easier for organic human beings to access the virtual domain than for AIs to access the physical domain, and the limitations of ‘hardware’ bodies compared to the freely expressive software avatars of virtual environments will be a major complication. AIs will likely tend to see the use of robotic avatars as akin to wearing a bulky, cumbersome space suit.

Most casual organic human/AI interaction will be through conventional systems of digital communication and media –computer displays the most dominant medium. Thus the likely evolution of avatar Remotes may begin with the Personal Satellite Assistants and be based simply on the strategy of affording the AI a self-mobile two-way display medium. One interesting possibility would be a series of spherical PSA-style designs we refer to –tongue in cheek– as Leottas. Employing a classic PSA form covered in a dense image array of organic LEDs and pin-hole digital cameras (and later other forms of digital volumetric display), these devices would provide a spherical display that attempts to appear invisible or transparent in the ambient habitat environment while displaying a 3D image of an AI individual’s software avatar. Often displaying only the individual’s head, this form of Remote gets its peculiar name from Madame Leota, the ghostly fortune teller of Disney theme parks’ Haunted Mansion, who appears only as a floating head inside a crystal ball.

Over time, and with the advent of nanofabrication, avatar Remotes will become progressively more sophisticated in design and eventually trade PSA-like displays for robotics capable of biomimicry and an emulation of the human form and human modes of expression. However, given the physically dynamic means of self-expression common in the virtual domain –the ability to spontaneously change appearance and forms as a means of expressing mood and individuality– AIs are still likely to consider these robotic bodies nearly as cumbersome as simpler PSA-style Remotes and will likely make little attempt to wholly mimic human appearance with them. They are more likely to alternate between very mechanistic, utilitarian, and androgynous android forms, shared by many AIs as tools for work like a generic one-size-fits-all space suit, and a large, wild assortment of pseudo-human or anthropomorphic forms custom-fabricated like works of art to approximate the freedom of physical personalization and dynamic self-expression of the virtual avatar.
