New simulators can teach machines.
Modern computing allows developers to create simulations that teach machines. In these models, premade missions are downloaded into computers, and they can help an AI control physical machines in the real environment.
High-powered computers allow astronauts to fly moon landings virtually before the real missions. In those virtual missions, the astronaut controls the moon lander's digital twin in a digitally modeled environment. If we can use virtual or digital twins of environments and actors to operate in those virtual worlds, we can teach computers, and machines like moon landers or aircraft, how to carry out their operations.
The idea is that the operator flies a mission in virtual reality. The system records that mission, and it can then download what the operator did in simulation into the real mooncraft or airplane. Fuzzy logic makes those premade missions more flexible: the system can adjust values like engine power for situations where, for example, the capsule release happens at a different altitude above the Moon than in the simulations.
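The fuzzy adjustment described above can be sketched very simply. This is a minimal illustration, not a flight controller: the membership ranges, the rule factors, and the function names are all hypothetical, and the rule "released low, add power; released high, reduce power" is only a placeholder for whatever rules a real mission would use.

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def adjust_power(nominal_power, altitude_error_m):
    """Scale engine power when the real release altitude deviates
    from the simulated mission (illustrative fuzzy rule base)."""
    # Membership degrees for the altitude error in metres (made-up ranges).
    low  = tri(altitude_error_m, -2000, -1000, 0)   # released too low
    ok   = tri(altitude_error_m, -1000, 0, 1000)    # close to the plan
    high = tri(altitude_error_m, 0, 1000, 2000)     # released too high
    total = low + ok + high
    if total == 0:
        return nominal_power  # outside the modeled range; keep nominal
    # Rules: low -> boost power 20%, ok -> keep, high -> cut power 20%.
    # Weighted-average defuzzification over the rule outputs.
    factor = (1.2 * low + 1.0 * ok + 0.8 * high) / total
    return nominal_power * factor
```

Because the membership functions overlap, an error between the modeled points blends the neighboring rules instead of jumping between fixed settings, which is the flexibility the text describes.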
"NASA’s Artemis missions, aimed at extending lunar exploration, face new challenges with larger landers that pose greater operational risks. These missions must navigate complex lunar landings and liftoffs in an environment with unique challenges, such as low gravity and a dusty surface. Credit: Patrick Moran, NASA Ames Research Center/Andrew Weaver, NASA Marshall Space Flight Center" (ScitechDaily.com, "From Apollo to Artemis: Advancing Moon Landings With NASA Supercomputers")
Digital twins are multi-use tools.
Digital twins can be models of physical or virtual objects. Sometimes missions and operatives themselves can have virtual, or digital, twins. In those cases, the digital twin is a simulation that operators can use to model missions and what they should do. In those simulations, operators can use augmented or virtual reality to rehearse the actions they would take in a real situation.
Operators can use digital twins to teach machines what they should do in certain situations. Almost everything can have a digital twin.
People can apply those Moon-mission models in everyday life, from moon-landing modeling to teaching robots how to clean floors or how to assemble pipes in buildings. Such robots can also operate in highly radioactive areas. The digital twin is the ultimate tool for teaching machines.
In that model, the digital twin is a virtual machine, like a robot, that the operator moves in a virtual room, and the system then transfers those virtual trajectories to the real machine. In the other version, the cleaning robot's operator cleans the floor once using a remote controller, and those trajectories are recorded in the robot's memory.
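The record-and-replay version can be sketched in a few lines. This is a toy example under stated assumptions: the class name, the `(x, y, heading)` pose format, and the `drive` callback are all hypothetical stand-ins for a real robot's motion interface.

```python
class TrajectoryRecorder:
    """Records poses from one teleoperated run and replays them later."""

    def __init__(self):
        self.waypoints = []  # list of (x, y, heading) tuples

    def record(self, x, y, heading):
        """Store one pose sampled while the operator drives the robot."""
        self.waypoints.append((x, y, heading))

    def replay(self, drive):
        """Feed every recorded pose to a drive callback on the real robot."""
        for pose in self.waypoints:
            drive(pose)

# Usage: record a short teleoperated run, then replay it.
rec = TrajectoryRecorder()
for pose in [(0, 0, 0), (1, 0, 0), (1, 1, 90)]:
    rec.record(*pose)

visited = []
rec.replay(visited.append)  # stand-in for a real motor controller
```

The same structure covers both variants in the text: the poses can come from a virtual room in VR or from one remote-controlled run on the real floor.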
An autonomous system can use interactive artificial intelligence for that mission. In that case, the operator just tells the AI to clean the floor. The AI, which can run on a server in the command center, uses the cleaning robot's digital twin to plan the routes the robot must follow. The system recognizes things like furniture, and it knows whether some areas have time restrictions, such as lunch restaurants that the robot must avoid during rush hour. The robot can also act like an interactive "dog" that the operator commands with spoken words.
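A route planner of that kind can be sketched as a grid search over the twin's floor map, where furniture is always blocked and restaurant tiles are blocked only during rush hour. Everything here is illustrative: the map symbols, the rush-hour window, and the function name are assumptions, and a real planner would use the twin's actual geometry rather than a character grid.

```python
from collections import deque

def plan_route(grid, start, goal, hour, rush_hours=range(11, 14)):
    """Breadth-first search; '#' = furniture, 'R' = restaurant zone
    that is off-limits only during the rush-hour window."""
    rows, cols = len(grid), len(grid[0])

    def blocked(r, c):
        cell = grid[r][c]
        return cell == '#' or (cell == 'R' and hour in rush_hours)

    queue, seen = deque([(start, [start])]), {start}
    while queue:
        (r, c), path = queue.popleft()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in seen and not blocked(nr, nc)):
                seen.add((nr, nc))
                queue.append(((nr, nc), path + [(nr, nc)]))
    return None  # no route exists at this hour

# A 3x3 floor: restaurant tiles on the right, one piece of furniture.
floor = ["..R",
         ".#R",
         "..."]
```

At noon the planner routes around the restaurant tiles; in the morning it may cut straight through them, which is exactly the time-dependent restriction the text describes.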
When the operator orders the robot to clean the floor, the robot, or the AI that controls it, searches its memory for the right database and then starts its mission. If those databases are not loaded into the AI's memory, the AI can search databases from the builder or the open internet. It can then build a digital model of the actions it must take and introduce that model to the operator, who accepts it or fixes it.
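That lookup-then-approve flow can be sketched as follows. All the names and mission data here are hypothetical: a local plan runs directly, while a plan fetched from an external source must pass an operator approval step, mirroring the accept-or-fix loop in the text.

```python
# Mission plans already loaded in the robot's own memory (made-up data).
LOCAL_MISSIONS = {"clean_floor": ["map_room", "vacuum", "dock"]}

# Plans available from an external source such as the builder's database
# or the open internet (also made-up data).
EXTERNAL_MISSIONS = {"assemble_pipes": ["fetch_parts", "align", "weld"]}

def get_mission(task, approve):
    """Return the action list for a task. Local plans run as-is;
    an externally fetched plan needs the operator's approval first."""
    if task in LOCAL_MISSIONS:
        return LOCAL_MISSIONS[task]
    plan = EXTERNAL_MISSIONS.get(task)
    if plan is None:
        return None  # no plan found anywhere
    # The operator reviews the external plan and accepts or rejects it;
    # in a real system this callback could also let the operator edit it.
    return plan if approve(plan) else None
```

In practice `approve` would be a user-interface prompt in the command center; here a plain callback keeps the sketch self-contained.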