2D/3D Virtual Cognitive Paradigms

GIF references: Video clips taken by H Muzart, showing a 3D simulation made by H Muzart using Unity developer tools, Unity Platform assets, and CC-BY images.


Over the last 100 years, cognitive stimulus paradigms have been devised to test all sorts of cognitive functions. These range from simple to complex (for example: static visual bar lines, audio-visual autobiographical videos of emotional faces, through to real environments and 3D virtual games). They are designed to test and isolate specific functions under controlled conditions.


There are reasons why I focus on visual stimuli rather than other senses (e.g. auditory, olfactory, gustatory, tactile), including my more extensive experience with vision. I am also interested in its integration with auditory information, motivated partly by my own hearing impairment and by having studied audition extensively.


Since 2014, I have been very interested in how paradigms are designed. Since the early 2000s, my (non-professional) experience in competitive video game playing, 3D CAD/CAM, conceptual game design and development, web design, and experimental psychology (see www.Harry-Muzart.info and my shared Private Cloud Drive Folders) has inspired me to develop novel paradigms. Indeed, two decades of immersion in advances in video game GUI technology and the industry's international community have strongly influenced my thinking on how games are made more immersive, interactive, educational, entertaining, and potentially addictive.


Also see the Babylon.JS #60 simulation, Apps for Human Wellbeing/Development, and Online Human Experimenting for the use of PsychoPy, Google tools, and others. Since 2016, I have been working on an environment (currently in R&D beta mode), built mainly with Unity3d.dev and MS OS-based C/C++/C# script elements, that could, at the UI level, be used to test various cognitive functions:


  • vision

  • reward-based learning

  • episodic memory

  • decision making



I need to benchmark-test it for validation before full deployment, and I have data-collection forms ready for feedback.
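To illustrate the kind of trial logic such an environment exercises (reward-based learning and decision making in particular), here is a minimal Python sketch of a two-armed bandit trial loop with a simple simulated learner. All names, parameters, and reward probabilities are illustrative assumptions for this sketch, not part of the actual Unity implementation.

```python
import random

def run_bandit_session(n_trials=200, reward_probs=(0.8, 0.2),
                       alpha=0.1, epsilon=0.1, seed=0):
    """Simulate a two-armed bandit session with an epsilon-greedy learner.

    Returns the per-trial log of (choice, reward) pairs plus the learned
    value estimates. Purely illustrative of reward-based learning logic.
    """
    rng = random.Random(seed)
    values = [0.0, 0.0]              # learned value estimate per option
    log = []
    for _ in range(n_trials):
        # epsilon-greedy choice: mostly exploit, occasionally explore
        if rng.random() < epsilon:
            choice = rng.randrange(2)
        else:
            choice = max(range(2), key=lambda a: values[a])
        # probabilistic reward delivery, as in a typical bandit paradigm
        reward = 1 if rng.random() < reward_probs[choice] else 0
        # incremental (Rescorla-Wagner-style) value update
        values[choice] += alpha * (reward - values[choice])
        log.append((choice, reward))
    return log, values

log, values = run_bandit_session()
accuracy = sum(1 for c, _ in log if c == 0) / len(log)
print(f"chose the better option on {accuracy:.0%} of trials")
```

In a real deployment, the choice would come from a participant's input rather than a simulated learner, and the trial log would feed the data-collection forms mentioned above.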


----------


Future works will include:

  • Translating the Unity simulation (which currently runs on MS Windows OS and in-browser) to Android and AR/VR-compatible versions.

  • Using the Unity and Unreal 2021.x engines. This will also involve integrating my work with the VR 'Metaverse' and Web 3.0 initiatives (e.g. those developed by Facebook, Microsoft, Google, etc.) (2021 onwards).

  • Combining these with my other work in machine learning (see my AI agent models and some of my git repos), and with other tools that explore vision-based reinforcement learning, relational spatial semantic reasoning, etc. (e.g. DeepMind [RL] [RL] [Unity]), and using existing algorithms for various purposes (see Application of Deep Machine Learning).
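As a highly simplified stand-in for the vision-based reinforcement-learning approaches cited above, the sketch below runs tabular Q-learning on a toy 1D corridor. It is an assumption-laden illustration of the core RL update rule only: real vision-based agents would replace the table with a neural network over pixel observations.

```python
import random

def q_learning_corridor(width=4, goal=3, episodes=300,
                        alpha=0.5, gamma=0.9, epsilon=0.2, seed=1):
    """Tabular Q-learning on a 1D corridor: states 0..width-1, actions
    left (-1) / right (+1); reaching `goal` yields reward 1 and ends
    the episode. A toy illustration, not a DeepMind-scale agent."""
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in range(width) for a in (-1, 1)}
    for _ in range(episodes):
        s = 0
        while s != goal:
            # epsilon-greedy action selection
            if rng.random() < epsilon:
                a = rng.choice((-1, 1))
            else:
                a = max((-1, 1), key=lambda act: q[(s, act)])
            s2 = min(max(s + a, 0), width - 1)
            r = 1.0 if s2 == goal else 0.0
            # standard Q-learning bootstrap update
            best_next = 0.0 if s2 == goal else max(q[(s2, b)] for b in (-1, 1))
            q[(s, a)] += alpha * (r + gamma * best_next - q[(s, a)])
            s = s2
    return q

q = q_learning_corridor()
policy = [max((-1, 1), key=lambda a: q[(s, a)]) for s in range(4)]
print("greedy policy per state:", policy)
```

After training, the greedy policy moves right (+1) from every non-goal state, which is the optimal behaviour in this corridor.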




References: TBC


Webpage image captions: TBC


3D simulation using Unity engine

Unity images
2D/3D Virtual Cognitive Paradigms [Images]

3D Simulation using Babylon.JS


Browser-based JavaScript coding:


from https://playground.babylonjs.com/#L92PHY#36





Other apps (2D/other)

For these, see the link Apps for Human Wellbeing/Development, covering works developed for Android, Windows, etc.


Using PsychoPy, etc.

Main gen XTR
CognTech General Spreadsheet Database