ARL is part of an Israeli consortium aiming at 'next generation' neurofeedback: i) novel signatures of neural circuits using network analysis, machine learning, and simultaneous fMRI-EEG recordings, and ii) 'intelligent' interfaces based on virtual environments and virtual reality.
ARL contributes to the scientific studies underlying Beam Riders, a startup company developing a cloud-based solution for enhancing the learning process using advanced neurofeedback techniques.
ARL is part of a team, led by Dr. Yotam Luz and Yariv Binder, aiming to provide a new communication technology for ALS patients based on the detection of eye and muscle activity from EEG data. The team was selected in the first stage of the Prize4Life competition.
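Eye and muscle activity leave large-amplitude artifacts in EEG recordings, which is what makes them usable as a communication signal. As a rough illustration only (not the team's actual method), a minimal sketch of detecting blink-like events in a frontal EEG channel might threshold the absolute amplitude and merge nearby crossings into single events; the sampling rate, threshold, and synthetic data below are all assumed values:

```python
import numpy as np

def detect_blinks(signal, fs, threshold_uv=75.0, min_gap_s=0.2):
    """Return sample indices of candidate eye-blink events.

    A blink appears as a large deflection on frontal channels, so we
    flag samples whose absolute amplitude exceeds `threshold_uv` (in
    microvolts) and merge crossings closer than `min_gap_s` seconds
    into one event. Parameters here are illustrative, not calibrated.
    """
    above = np.abs(signal) > threshold_uv
    idx = np.flatnonzero(above)
    if idx.size == 0:
        return []
    # Start a new event wherever the gap between crossings is large.
    new_event = np.diff(idx) > int(min_gap_s * fs)
    starts = np.concatenate(([idx[0]], idx[1:][new_event]))
    return starts.tolist()

# Synthetic frontal-channel trace: low-amplitude noise plus two
# blink-like spikes at 2.0 s and 3.2 s.
fs = 250  # Hz (assumed)
rng = np.random.default_rng(0)
eeg = rng.normal(0.0, 5.0, 4 * fs)  # ~5 uV baseline noise
eeg[500:520] += 150.0
eeg[800:820] += 150.0

blinks = detect_blinks(eeg, fs)
print(len(blinks))  # two detected events
```

In practice, EEG pipelines treat such artifacts far more carefully (e.g. using dedicated EOG/EMG channels or independent component analysis); this sketch only shows the basic amplitude-thresholding idea.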
Video 360 for research on human-machine confluence
Lab member Daniel Landau is working on several projects that use Video 360 productions to study human-machine relationships in the context of the human body.
The VERE project is an Integrated Project aimed at dissolving the boundary between the human body and surrogate representations in immersive virtual reality and physical reality. Dissolving the boundary means that people have the illusion that their surrogate representation is their own body, and act and think accordingly. The work in VERE may be thought of as applied presence research and applied cognitive neuroscience, and it also adds significantly to scientific knowledge in these areas.
BEAMING is the process of instantaneously transporting people (visitors) from one physical place in the world to another (the destination) so that they can interact with the local people there. This project brings today's networking, computer vision, computer graphics, virtual reality, haptics, robotics, and user-interface technology together in a way that has never been tried before, thereby transcending what is possible today. The goal is to produce a new kind of virtual transportation, where a person can be physically embodied while interacting with people who may be thousands of kilometres away. Moreover, this is underpinned by the practical utilisation of recent advances in cognitive neuroscience in understanding how the brain represents our own body.
The ShanghAI Lectures are a global education project presented by the Artificial Intelligence Lab of the University of Zurich. The first lecture series on embodied (natural and artificial) intelligence was broadcast via videoconference from Shanghai Jiao Tong University. The project involves more than 20 universities worldwide and over 250 students who participate every year. The students work together in interdisciplinary and multicultural teams on group exercises in a virtual world. In cooperation with an international team of researchers headed by Dr. Béatrice Hasler, we use the ShanghAI Lectures as a research platform for large-scale international field studies on avatar-based collaboration in virtual worlds.
Fostering Free Will
Research and Enhancement of Self-Initiation of Deliberate Behavior through Gaming Environments (led by Dr. Son Perminger; sponsored by the Templeton Foundation)
Presence in Virtual Reality
Presenccia - an EU project on understanding the neurophysiological correlates of being present (having a place illusion) in virtual reality
Virtual Worlds as a public sphere
Collaboration with the University of Nevada, Las Vegas
Using physiological signals to evaluate task-performance in HCI
Vered Shachaf, Israeli Department of Defense project
Research bots in Second Life
Brain & Body Control Interfaces