When I started thinking about an under-represented part of the body, a few came to mind - the stomach, the jaw, and, most interestingly, the tongue.
I tried thinking of ways the tongue could be an interesting metric for how we feel about a certain situation. The thought was that maybe we have some unconscious tongue reaction to certain stimuli. Does the tongue retract when we are at a loss for words? Does it even get physically smaller so that we taste less with it? All kinds of interesting thoughts about how our body reacts to things.
I talked to Dano about my thoughts and he brought up the “cortical homunculus.”
The pictures above represent a human with each body part scaled to its level of sensitivity. Looking at this image, we can see that our tongue is hypersensitive to a lot of things: it’s very finely tuned to changes in heat, pressure, and taste. It’s also, relative to its size, one of the most powerful muscles in the body!
With that in mind, I thought about grinding my teeth and how at certain times, maybe in times of stress (?), I tend to push my tongue against my teeth. This behavior is apparently called tongue thrusting and can be a sign of nervousness. The thought here was: can I measure the amount of force I’m putting on my front teeth and correlate it to the activity I was doing? E.g., was I pushing while focused on writing this blog, or while watching a stressful episode of The Sopranos? It would be interesting to see whether any meaningful data would come out of this.
So for this week I purchased a couple of different strain gauges and flex sensors to attach to a mouthguard, to try and extract some data on what my tongue is doing when I’m not paying attention to it. This will be interesting too, because I wonder how much I’ll change my behavior while wearing it - it doesn’t exactly blend in.
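The analysis side of this could be pretty simple: convert raw sensor readings into an approximate force, tag each reading with what I was doing at the time, and compare averages per activity. Here’s a minimal sketch in Python - the ADC range, reference voltage, and calibration slope are all placeholder assumptions, not values from any real datasheet, and the readings are made up.

```python
from statistics import mean

# Hypothetical calibration for an analog strain gauge read through a
# 10-bit ADC. These numbers are placeholders, not from a datasheet.
ADC_MAX = 1023
VREF = 5.0               # reference voltage, volts
NEWTONS_PER_VOLT = 12.0  # assumed calibration slope

def adc_to_force(reading):
    """Convert a raw ADC reading to an approximate force in newtons."""
    volts = (reading / ADC_MAX) * VREF
    return volts * NEWTONS_PER_VOLT

def mean_force_by_activity(samples):
    """samples: list of (adc_reading, activity_label) tuples.
    Returns {activity: mean force in newtons}."""
    by_activity = {}
    for reading, activity in samples:
        by_activity.setdefault(activity, []).append(adc_to_force(reading))
    return {a: mean(forces) for a, forces in by_activity.items()}

# Made-up readings: higher tongue pressure while watching TV than writing.
samples = [
    (300, "writing"), (350, "writing"),
    (600, "sopranos"), (650, "sopranos"),
]
print(mean_force_by_activity(samples))
```

A real version would read the gauge over serial and timestamp each sample, but the aggregation step would look roughly like this.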
When first thinking about how I could track my subconscious mind, I remembered Dano making an interesting point: even though we control our eyes, we don’t really control our eyes. They kind of just wander over to wherever they want. NFL RedZone came to mind - it’s a way to watch multiple football games at once. With that kind of multi-content experience, is there a way to selectively listen to whatever my eyes want to watch, without having to pick up a remote and physically choose “I want to listen to game 3”?
I feel like there could be a lot of interesting applications for this - like a situation where you have to be on two Zoom calls at once but only really hear the one you’re actively focused on.
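The core logic here is just a mapping from gaze position to audio gain. A minimal sketch, assuming an eye tracker gives normalized (0–1) screen coordinates and the feeds sit in a 2x2 grid - all the feed names are hypothetical:

```python
# Gaze-driven audio selection: full volume on the feed you're looking at,
# muted everywhere else. Assumes normalized gaze coordinates from some
# eye tracker and a 2x2 grid of video feeds.

FEEDS = [
    ["game_1", "game_2"],  # top row
    ["game_3", "game_4"],  # bottom row
]

def focused_feed(gaze_x, gaze_y):
    """Map a normalized gaze position (0..1, 0..1) to the feed under it."""
    col = 0 if gaze_x < 0.5 else 1
    row = 0 if gaze_y < 0.5 else 1
    return FEEDS[row][col]

def audio_gains(gaze_x, gaze_y):
    """Return a per-feed volume dict: 1.0 for the watched feed, 0.0 otherwise."""
    target = focused_feed(gaze_x, gaze_y)
    return {feed: (1.0 if feed == target else 0.0)
            for row in FEEDS for feed in row}

# Looking at the bottom-left quadrant selects game_3:
print(focused_feed(0.2, 0.8))  # → game_3
```

In practice you’d probably want a dwell threshold (only switch after the gaze has rested on a quadrant for a second or so) so the audio doesn’t flicker every time your eyes dart around.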
Another thought I had: it would be cool to see if we could, using some combination of electrodes, create the sensation of flavor - that took me down a whole rabbit hole into mimicking taste. See examples here and here, which, even though I don’t like VR very much, I could see having useful applications, especially in the world of assistive tech.