I've been experiencing issues getting Unity to mesh with GitHub without pushing thousands of folders into git... but I did manage to get the Vive base stations to sync. :)
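In case it helps anyone hitting the same wall: the usual fix is a Unity-specific .gitignore, so the auto-generated folders never reach the repo in the first place. A minimal sketch (the community version at github.com/github/gitignore is far more complete):

```gitignore
# Unity regenerates all of these; none belong in version control.
[Ll]ibrary/
[Tt]emp/
[Oo]bj/
[Bb]uild/
[Bb]uilds/
[Ll]ogs/
[Uu]serSettings/

# IDE caches and auto-generated project files
.vs/
.idea/
*.csproj
*.sln
```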

In between, Sara has received the bio-sonification module I sent her, and now we are getting closer to looking at data coming from the module into Unity via OSC, and at how (or where) we might attach this data. One idea is to drive a procedural environment and/or shaders working with machine-learning AI. Would it be possible to use the biodata coming in from the fungi and plants to trigger reactions (not sure if trigger is the correct term), grow the forest settings, and send or retrieve microbes? In this way every experience varies.
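As a first sketch of what "attaching" the data could look like, here's roughly how I picture the Unity side. It assumes keijiro's OscJack package, an OSC address of /biodata carrying a single float, and a shader with a _Growth property; all three are my placeholders until Sara and I settle on the actual format:

```csharp
using OscJack;
using UnityEngine;

// Sketch: receives one float of biodata over OSC and eases a shader
// parameter toward it, so plant/fungi activity can "grow" the visuals.
public class BiodataReceiver : MonoBehaviour
{
    public Renderer target;      // object whose material reacts
    public int port = 9000;      // must match the module's OSC output

    public float Smoothed { get; private set; }  // eased value, for other scripts

    float _latest;               // last raw value (written on the OSC thread)
    OscServer _server;

    void Start()
    {
        _server = new OscServer(port);
        // "/biodata" and the single-float payload are placeholders.
        _server.MessageDispatcher.AddCallback("/biodata",
            (string address, OscDataHandle data) =>
                _latest = data.GetElementAsFloat(0));
    }

    void Update()
    {
        // Smooth the raw signal so the environment breathes rather than flickers.
        Smoothed = Mathf.Lerp(Smoothed, _latest, Time.deltaTime * 2f);
        // "_Growth" is a hypothetical property on the procedural shader.
        if (target != null)
            target.material.SetFloat("_Growth", Smoothed);
    }

    void OnDestroy() => _server?.Dispose();
}
```

From there the same Smoothed value could feed anything: shader parameters, spawn rates for microbes, or the ML side below.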
I came across the Unity ML-Agents Toolkit and have been googling like mad, reading up on training ML models for use in a Unity environment. There is this visually unusual (I love it!) game called Source of Madness, created by Carry Castle, where players traverse an ever-changing dynamic world, battling new procedurally generated monsters each playthrough, brought to life by a powerful machine-learning AI. The monsters are very organic and super bizarro looking. I'm now feeling Symbiosis/Dysbiosis needs to be built in this way, but with the fungi determining the environment based upon their response to humans.
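To get my head around how ML-Agents is structured, here is a bare-bones agent sketch: it observes the biodata level (reusing the BiodataReceiver above) and acts on the environment. The class name, the growth action, and the reward are all my placeholders, not anything from the toolkit or from Carry Castle:

```csharp
using Unity.MLAgents;
using Unity.MLAgents.Actuators;
using Unity.MLAgents.Sensors;
using UnityEngine;

// Sketch: an agent that observes the biodata level and responds by
// reshaping the environment, Source-of-Madness style but fungi-driven.
public class FungalAgent : Agent
{
    public BiodataReceiver biodata;  // the OSC sketch above
    public Transform forest;         // stand-in for the procedural forest

    public override void CollectObservations(VectorSensor sensor)
    {
        // One observation: the current smoothed biodata level.
        sensor.AddObservation(biodata != null ? biodata.Smoothed : 0f);
    }

    public override void OnActionReceived(ActionBuffers actions)
    {
        // One continuous action for now: overall growth. More actions
        // (microbe density, spawning) could be added the same way.
        float growth = actions.ContinuousActions[0];
        forest.localScale = Vector3.one * (1f + Mathf.Clamp01(growth));

        // The real design question is the reward: what counts as
        // symbiosis vs. dysbiosis? This placeholder rewards any growth.
        AddReward(growth * 0.01f);
    }
}
```

Training itself runs through the toolkit's Python-side mlagents-learn command with a behaviour config, so a fungal "policy" could be trained offline and then react live to the incoming biodata.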
Starting to look more closely at how to integrate the tactile interface by studying VR gloves and deconstructing a DIY pair, and I came upon an Instructables tutorial, Etextile VR Gloves for Vive Tracker by rachelfreire, that uses a Vive Tracker. InterAccess has loaned me their Vive Tracker for development, so this is the pattern I will work with!
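While the gloves are still in pieces, step one is just getting the tracker's pose onto a hand object in Unity. Here's a sketch using Unity's generic XR input API; whether the tracker enumerates this way depends on the XR plugin in use (with the SteamVR plugin you'd use its pose components instead):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Sketch: copies a tracked device's pose onto this GameObject,
// e.g. the hand model the e-textile glove will be attached to.
public class TrackerFollow : MonoBehaviour
{
    static readonly List<InputDevice> _devices = new List<InputDevice>();

    void Update()
    {
        InputDevices.GetDevicesWithCharacteristics(
            InputDeviceCharacteristics.TrackedDevice, _devices);

        foreach (var device in _devices)
        {
            // Skip the headset and controllers; with a single Vive
            // Tracker connected, whatever remains should be it.
            if ((device.characteristics &
                 (InputDeviceCharacteristics.HeadMounted |
                  InputDeviceCharacteristics.Controller)) != 0)
                continue;

            if (device.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 pos))
                transform.localPosition = pos;
            if (device.TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion rot))
                transform.localRotation = rot;
            break;
        }
    }
}
```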

