The new humanoid robots are more fantastic than anyone expected. BMW's new humanoid manufacturing robots are impressive tools. When humanoid robots work on manufacturing platforms under the dome of full-coverage Wi-Fi and the control of the same supercomputer, they can form multi-level, morphing neural networks. They can communicate with the supercomputer or with each other, and that is the impressive thing.
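As a rough illustration of that shared-network idea, here is a minimal Python sketch in which each robot reports its state to a central coordinator over the local network. Everything here, the coordinator address, the publish_state() function, and the message fields, is a hypothetical illustration, not any vendor's real API.

```python
# Minimal sketch: robots on one factory network report their state to a
# shared coordinator ("supercomputer") as UDP datagrams. All names and
# addresses are hypothetical.
import json
import socket

COORDINATOR_ADDR = ("192.168.0.10", 5005)  # assumed coordinator on the LAN

def publish_state(robot_id: str, joint_angles: list[float]) -> None:
    """Send one robot's current state to the coordinator."""
    message = json.dumps({"robot": robot_id, "joints": joint_angles})
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(message.encode("utf-8"), COORDINATOR_ADDR)

# Each robot on the Wi-Fi network could call this from its control loop:
publish_state("arm-01", [0.0, 1.2, -0.4])
```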
Robots that understand accents are easier to control with spoken words. They can understand natural language and people's natural way of communicating. In conventional robotics, the user must use grammatically correct language, but modern robots and computers have started to follow orders given by users who speak with an accent. The first portal in those systems is the speech-to-text application, which transforms spoken words into text and feeds that text to the robot's controller.
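A minimal sketch of that pipeline is below, assuming the simplest possible design: a transcriber turns audio into text, and a keyword match turns the text into a control opcode. The transcribe() stub stands in for whatever speech-to-text engine the system uses, and send_command() is a hypothetical control hook.

```python
# Sketch of the spoken-order pipeline: speech-to-text first, then a
# simple keyword match that drives the robot controller.
def transcribe(audio_bytes: bytes) -> str:
    """Placeholder for a real speech-to-text engine."""
    return "please lift the panel up"  # example transcript

COMMANDS = {"lift": "ARM_UP", "lower": "ARM_DOWN", "stop": "HALT"}

def send_command(opcode: str) -> None:
    print(f"robot <- {opcode}")  # stand-in for the real control bus

def handle_utterance(audio_bytes: bytes) -> None:
    text = transcribe(audio_bytes).lower()
    for keyword, opcode in COMMANDS.items():
        if keyword in text:
            send_command(opcode)
            return

handle_utterance(b"...")  # prints: robot <- ARM_UP
```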
The system requires only an accent wordbook that can translate orders into literal language and into the commands the system must follow. The new ultra-fast processors can drive AI that removes unnecessary words from the text that the speech-to-text application produces.
So the system uses the same method as translation programs, and translation programs make it possible for users to give orders to robots in their own language.
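The wordbook plus filler-removal step could look something like the sketch below. The accent spellings and filler list are invented examples; the point is only the pattern of translating variant words into canonical ones and dropping words the controller does not need.

```python
# Sketch of the "accent wordbook": map dialect or accent variants to
# canonical command words, then drop filler words before the text
# reaches the controller. All word lists are invented examples.
WORDBOOK = {"lif'": "lift", "stahp": "stop", "lowah": "lower"}
FILLERS = {"please", "um", "uh", "the", "a", "now"}

def normalize(transcript: str) -> list[str]:
    """Translate accented spellings and remove filler words."""
    words = []
    for word in transcript.lower().split():
        word = WORDBOOK.get(word, word)  # canonical form if known
        if word not in FILLERS:
            words.append(word)
    return words

print(normalize("um please lif' the panel now"))  # ['lift', 'panel']
```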
If those robots are equipped with a sense of touch and paired with remote VR headsets and systems that bring a sense of touch from virtual reality to the human nervous system, the robot can transmit all of its senses to the human operator, who smells, sees, touches, hears, and even tastes the same things as the robot.
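What gets transmitted, at minimum, is a stream of sensor frames. The sketch below shows one hypothetical frame format; the SenseFrame name and its fields are assumptions, and a real telepresence link would also carry video and audio over a low-latency transport.

```python
# Sketch of a robot packaging its sensor readings into frames for a
# remote VR/telepresence client. Field names are illustrative only.
import json
from dataclasses import dataclass, asdict

@dataclass
class SenseFrame:
    touch_pressure: float        # e.g. fingertip force in newtons
    temperature_c: float
    audio_level_db: float
    odor_channels: list[float]   # raw chemical-sensor readings

def encode_frame(frame: SenseFrame) -> bytes:
    """Serialize one frame for transmission to the operator's headset."""
    return json.dumps(asdict(frame)).encode("utf-8")

frame = SenseFrame(2.5, 21.0, -30.0, [0.1, 0.7, 0.0])
packet = encode_frame(frame)  # would be sent over the network link
print(len(packet), "bytes ready to send")
```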
BCI (brain-computer interface) VR (virtual reality) headsets make that possible. Such a system could transmit even tastes from robots to users. The BCI must only know which brain area a certain signal belongs to, and then it can transmit a sense of touch or smell.
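That "know which brain area" requirement is, in software terms, a routing table. The sketch below is purely hypothetical on the stimulation side; only the region names follow textbook neuroanatomy.

```python
# Sketch of routing each sensory channel to the cortical area where
# that sense is processed. stimulate() is a purely hypothetical hook.
BRAIN_MAP = {
    "touch": "somatosensory cortex",
    "smell": "olfactory cortex",
    "taste": "gustatory cortex",
    "sound": "auditory cortex",
}

def stimulate(region: str, intensity: float) -> None:
    print(f"stimulate {region} at intensity {intensity:.2f}")

def route_signal(modality: str, intensity: float) -> None:
    region = BRAIN_MAP.get(modality)
    if region is not None:
        stimulate(region, intensity)

route_signal("touch", 0.8)  # -> stimulate somatosensory cortex at 0.80
```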
The system can also use similar subsystems that transmit a sense of touch, connected with the tongue. The system can also use bio-printed tongues with living neurons, connected to computers. Researchers can use a bio-printed olfactory bulb connected to microchips for that mission. Maybe in the future we will have laboratories where cloned tongues and olfactory bulbs act as chemical sensors, sending their data through the Internet all over the world.
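Sending that data worldwide is the most ordinary part of the idea: sample the sensor, then post the readings to a server. The endpoint URL and payload fields below are invented; only the pattern matters.

```python
# Sketch of a lab chemical sensor (e.g. a bio-printed tongue wired to a
# microchip) publishing readings over the Internet. The endpoint and
# fields are hypothetical; we only build the request here.
import json
import urllib.request

ENDPOINT = "https://example.org/api/sensor-readings"  # hypothetical URL

def build_reading_request(sensor_id: str, channels: list[float]):
    body = json.dumps({"sensor": sensor_id, "channels": channels})
    return urllib.request.Request(
        ENDPOINT,
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

request = build_reading_request("tongue-01", [0.02, 0.91, 0.40, 0.05])
# A live deployment would send it with urllib.request.urlopen(request).
print(request.full_url)
```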
This is called robot-based augmented reality. In factories, operators can use one robot to manufacture the first car, and then scale that model across the whole fleet over the network.
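Scaling the model could be as simple as recording the operator-guided task once and replaying it on every robot in the fleet. The step list, robot names, and replay() hook below are all hypothetical.

```python
# Sketch of "scale that model over the network": one operator-guided
# robot records a task, and the recording is copied to the whole fleet.
task_recording = [
    ("move_to", {"x": 1.0, "y": 0.5}),
    ("grip", {"force": 3.0}),
    ("weld", {"seam": "door-left"}),
]

FLEET = ["arm-01", "arm-02", "arm-03"]

def replay(robot_id: str, steps: list[tuple]) -> None:
    """Stand-in for dispatching recorded steps to one robot."""
    for action, params in steps:
        print(f"{robot_id}: {action}({params})")

# The first car is built under operator control; the same step list is
# then distributed to the rest of the fleet.
for robot in FLEET:
    replay(robot, task_recording)
```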
https://www.freethink.com/robots-ai/general-purpose-robots
https://www.freethink.com/ar-vr/device-hacks-nervous-system-to-bring-touch-to-virtual-worlds
https://www.freethink.com/ar-vr/galea-beta
https://scitechdaily.com/1000x-faster-ultrafast-photonics-chip-reshapes-signal-processing/