Zander signed the contract with the German Agency for Innovation in Cybersecurity. The deal aims to revolutionize the interaction between humans, machines, and AI, and could eliminate the need for high-risk brain interventions to control machines.
For this initiative, the startup proposed a project called “Neuroadaptivity for Autonomous Systems” (NAFAS). The project employs a passive brain-computer interface (BCI), which lets users perform actions without actively imagining them. Zander researchers hope to decode mental states from brain signals, identifying categories that can be transferred to artificial systems.
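The passive-BCI idea described above — inferring a user's mental state from brain signals rather than from deliberate commands — can be sketched in toy form. Everything below is an illustrative assumption (the synthetic "EEG", the two example states, the alpha-band feature, the nearest-mean classifier); it is not Zander Labs' actual pipeline, which has not been published in this article.

```python
import numpy as np

# Toy passive-BCI sketch: classify a mental state from simulated EEG.
# Assumption: a "relaxed" state shows stronger 10 Hz alpha activity
# than a "focused" state -- a common simplification in BCI tutorials.

rng = np.random.default_rng(0)
FS = 250               # assumed sampling rate in Hz
EPOCH_S = 2            # analysis window length in seconds
N = FS * EPOCH_S       # samples per epoch

def synth_epoch(state):
    """Simulate one single-channel EEG epoch for the given state."""
    t = np.arange(N) / FS
    alpha_amp = 3.0 if state == "relaxed" else 0.5
    return rng.normal(0.0, 1.0, N) + alpha_amp * np.sin(2 * np.pi * 10 * t)

def alpha_power(epoch):
    """Mean spectral power in the 8-12 Hz alpha band, via FFT."""
    spec = np.abs(np.fft.rfft(epoch)) ** 2
    freqs = np.fft.rfftfreq(N, 1 / FS)
    band = (freqs >= 8) & (freqs <= 12)
    return spec[band].mean()

# "Calibration": learn the mean alpha power of each labeled state.
train = {s: np.mean([alpha_power(synth_epoch(s)) for _ in range(50)])
         for s in ("relaxed", "focused")}

def decode(epoch):
    """Nearest-mean classifier over the single alpha-power feature."""
    p = alpha_power(epoch)
    return min(train, key=lambda s: abs(train[s] - p))

# Evaluate on fresh simulated epochs the classifier has not seen.
labels = ["relaxed", "focused"] * 25
correct = sum(decode(synth_epoch(s)) == s for s in labels)
accuracy = correct / len(labels)
print(f"decoding accuracy: {accuracy:.2f}")
```

The key property this illustrates is the "passive" part: the user never issues a command — the system continuously reads a feature of ongoing brain activity and maps it to a state category that a machine could then adapt to.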
Ultimately, Zander said it hopes to explore a new generation of machines capable of real-time adaptation to user cognitive and affective states.
“It is our stated goal to redefine the interaction between humans and technology,” said Thorsten Zander, managing director of Zander Labs. “We aspire to systems that can intuitively adapt to the individual user based on their brain activity and to AI applications that learn directly from the human brain.”
More details on the future of Zander Labs
Over the next four years, Zander plans to develop a neurotechnological prototype. The prototype is intended to extract information from the brain and allow individuals to exchange information with an external system. Through their thoughts, users could guide the system to perform a task or to acquire new skills.
If successful, the company believes humans and machines can collaborate through the passive BCI. With this technology, they could perform actions, pursue goals, and exchange information.
Thorsten Zander highlighted the difference between the approaches in BCI technologies coming out of the U.S. and Europe. He said those out of the U.S. — like Neuralink, Precision Neuro and more — prefer invasive methods and focus more on medical applications.
The Zander Labs approach opts for non-invasive methods aimed at serving users without restrictions. Thorsten Zander wants to “revolutionize human-machine interaction.”
“The revolution lies in enabling machines to capture and interpret brain data in real-time, providing insight into the current, individual perception and interpretation of the user,” he said. “This allows us to transfer the user’s knowledge, values, and goals into the machine, enabling intuitive interaction.”