Overview/General Discussion #1
10 comments · 22 replies
-
Testing comments to confirm access.
-
Here is the problem statement and scope for this project: I have a number of clients with advanced ALS who have no physical movement except eye movement and atypical eyelid movement (they cannot do a quick blink). ImproveAbility is willing to purchase parts to create prototypes, as well as to test the device with our clients, to find a solution.
-
Let's set some boundaries for the first generation of this.

First, I'd like to focus on the input side: extracting the user's intention, not worrying about what we are going to activate. If we have the trigger in a Raspberry Pi, we can toggle a pin with a relay or SSR and get an AT-switch-style output... OR we can play a sound... OR we can send an SMS message... but we shouldn't focus on what the output is for the first prototype.

Second, I've seen the email comments about different scenarios, like low-light situations and folks who can't open their eyes without help. While we want to keep those in mind, I think our first solution should focus on a "known solved" situation: using eye detection (not eye gaze or directional input, just eye detection) to detect intentional slow opens and closes (with good feedback to the end user) to trigger an alert. One nice feature of this approach is that it doesn't require careful positioning of the device or the user to be able to look in a particular direction: the camera just has to see the eyes well enough to detect them, which is commonly available in open-source computer vision systems.

I believe this is something that will help a great many people (far beyond those with ALS), given that many ATMakers recipients are eye-gaze users with CP, SMA, etc., or they are on an inflated trach at night and have no way to get attention. It's true that this may not help those who can't control eye closure (and opening), and I'm all for finding another solution for them (EMG is probably a good one, and not as cost-prohibitive as it seems), but I think we should start with this as a first pass (a Minimum Viable Product, or MVP) and expand from there.

One pro of this approach is that low-light support should actually be an easy modification with a camera swap, but it's not necessary for many of the users; it's a great addition. It's possible that we could use the same approach with a higher-resolution camera and machine learning to detect eye motion under closed lids, but that seems like a goal that shouldn't be in phase 1. Thoughts?
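To make the "intentional slow open and close" idea concrete, here is a minimal sketch of the timing logic, decoupled from any particular camera or detector. Everything here is an assumption to be tuned: the hold time, the two-closure confirmation, and the eyes_visible/trigger_alert hooks are placeholders, not a finished design.

```python
import time

HOLD_SECONDS = 2.0     # a closure must last this long to count as deliberate
CONFIRM_COUNT = 2      # require this many deliberate closures to trigger
CONFIRM_WINDOW = 6.0   # seconds within which those closures must occur

def watch_for_intent(eyes_visible, trigger_alert):
    """Poll eyes_visible() and call trigger_alert() on a deliberate pattern.

    eyes_visible: callable returning True while eyes are detected in frame.
    trigger_alert: callable for the output side (GPIO pin, sound, SMS...).
    """
    closed_since = None
    closures = []  # timestamps of completed deliberate closures
    while True:
        now = time.monotonic()
        if eyes_visible():
            if closed_since is not None and now - closed_since >= HOLD_SECONDS:
                closures.append(now)  # a deliberate slow close just ended
            closed_since = None
        elif closed_since is None:
            closed_since = now  # eyes just disappeared; start timing
        # Forget stale closures, then check the confirmation pattern.
        closures = [t for t in closures if now - t <= CONFIRM_WINDOW]
        if len(closures) >= CONFIRM_COUNT:
            trigger_alert()
            closures.clear()
        time.sleep(0.05)
```

Requiring two long closures within a short window is one way to keep the random-trigger probability low, since spontaneous blinks are far shorter than HOLD_SECONDS; this is exactly where per-user feedback and tuning would come in.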
-
@ImproveAbility On the kickoff meeting - would you like me to host it on StreamYard? It allows up to 10 participants and streams to Facebook/YouTube natively. That would give the rest of the ATMakers and Assistive Technology FB groups a chance to watch & provide input.
-
Here's a thought on hardware for the first prototype: a BrainCraft HAT + Raspberry Pi 4. Noe & Pedro made a case for housing them plus a camera and speaker. I have a BrainCraft HAT + camera + Pi 4 here and can use it for prototyping (I haven't printed the housing). Obviously, Raspberry Pis of all stripes are hard to come by with the chip shortage, but between the ATMakers community and my friends at Adafruit, I think I can come up with a few sets of these. I don't think that TensorFlow Lite (or any kind of machine learning) is needed for this project at all: simple OpenCV running on the Pi should be completely capable of doing eye detection. I just like this hardware because it has the preview screen, the 5-position input joystick, feedback LEDs, audio output, and expansion for other sensors via STEMMA/QWIIC connectors.
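For what "simple OpenCV running on the Pi" could look like, here is a minimal sketch using the stock Haar cascades that ship with the opencv-python package (no TensorFlow needed). The cascade choices and detectMultiScale parameters are assumptions to tune; the open-eye cascade typically stops matching when the eyes close, which is the signal the timing logic above needs.

```python
import cv2

# Stock Haar cascades bundled with opencv-python; no ML training required.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def eyes_visible_in(frame):
    """Return True if at least one open eye is detected in the frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        upper_face = gray[y:y + h // 2, x:x + w]  # eyes are in the top half
        if len(eye_cascade.detectMultiScale(upper_face, 1.1, 4)) > 0:
            return True
    return False

cap = cv2.VideoCapture(0)  # Pi camera (via V4L2) or any USB webcam

def eyes_visible():
    ok, frame = cap.read()
    return ok and eyes_visible_in(frame)
```

This eyes_visible() function is shaped to plug directly into the watch_for_intent() sketch earlier in the thread; the BrainCraft HAT's screen and LEDs would be natural places to surface the "I can see your eyes" feedback.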
-
Project Activate feedback / Raspberry Pi question

I wanted to respond to the idea that Android's Project Activate "does this already." I have been playing with it, with an old cell phone mounted in my office for a week now. It is a great option for what it is. But twice over the last week, the app quit: the phone stayed on, but when I went to the device, it was on the home screen instead of in the Project Activate app. This is why having a dedicated device is so important for this problem. The users of this device need it to ALWAYS be available; it cannot go to sleep or shut down at any time.

This leads me to a question about the Raspberry Pi. I have never used one myself, so I am a bit ignorant about them. I know it is a computer, so how reliable are they? Do they need to be shut down on a regular basis, or are they OK with being on for weeks or months at a time?
-
Pis are very reliable, and yes, they can run for months and years. If the device remains simple and isn't connected to the internet, then there's no reason for updates that could cause issues.
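On the "always available" requirement: a common way to get that on a Raspberry Pi is to run the detection script as a systemd service, so it starts at boot and restarts itself if it ever crashes. A minimal sketch, with hypothetical names and paths:

```ini
# /etc/systemd/system/eye-switch.service  (name and paths are placeholders)
[Unit]
Description=Eye-detection alert switch
After=multi-user.target

[Service]
ExecStart=/usr/bin/python3 /home/pi/eye_switch.py
# Relaunch automatically if the script ever exits or crashes.
Restart=always
RestartSec=2

[Install]
WantedBy=multi-user.target
```

Enable it once with `sudo systemctl enable --now eye-switch.service` and systemd keeps it running across crashes and reboots, which addresses the "the app quit and the phone sat on the home screen" failure mode described above.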
-
I wanted to throw out some factors/concerns for our first conversation. Although I completely agree with Bill's comment, there may be some factors that take away the simplicity of that solution. I am repeating some of them just to have them all in one place.
-
I'd like to reset this to what Antoinette originally asked for and what we need to focus on; she repeated it in her last comment.

This is not about all of the other things that we'd like to do for people with ALS/SMA/TBI/etc. It's about solving a specific problem: when a user cannot access their primary communication device, they need to alert their caregiver. Specifically, we're talking about folks who, when their primary device is not available, do not have access to switches or other input devices; they have to use their eyes in some way. This doesn't cover everything that Project Activate does, and for some folks Project Activate will provide much more functionality. But for others, having that Android device watch their eyes and face may interfere with their primary communication device (which may also track their eyes or use a head mouse, etc.). Antoinette originally asked for a device that watched for a user to "look in a specific location" long enough to trigger an alert. I asked if that could be replaced by deliberate blink (and wink) patterns that have a low probability of occurring randomly. That's how this started. These discussions of alternatives are great, but there are a few things that should be treated as critical:

I'm also fine with this discussion spawning five separate projects that all try to solve problems in this space. Wouldn't that be great? However, for our initial kickoff, let's try to solve the one Antoinette originally asked for.
-
I have an idea that I wanted to record here. Let's think of this as an eye switch: some sort of eye-driven activation of a switch closure. We sell Smartbox devices, which have a remote power switch that is switch adapted, and I believe other AAC devices have a port that can be set to turn the device on/off with a switch activation. So the point is, there are already ways to restart the computer; the issue is using only eye abilities to create a contact closure. At least that is the way I see it.
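Following the "eye switch" framing, here is a minimal sketch of the contact-closure side, assuming a relay (or optocoupler) module on GPIO 17 with its isolated contacts wired to the AAC device's 3.5 mm switch jack; the pin number and hold time are arbitrary choices, and gpiozero comes preinstalled on Raspberry Pi OS.

```python
from time import sleep
from gpiozero import OutputDevice

# Relay/optocoupler module on GPIO 17 (BCM numbering); its isolated contacts
# connect to the AAC device's switch jack, emulating an AT switch press.
switch_output = OutputDevice(17, active_high=True, initial_value=False)

def pulse_switch(hold_seconds=0.5):
    """Close the contact briefly, like a momentary switch activation."""
    switch_output.on()
    sleep(hold_seconds)
    switch_output.off()
```

Handing pulse_switch to the earlier watch_for_intent() loop as its trigger_alert callable would complete an end-to-end eye-to-contact-closure path.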
-
Here's a place to discuss what we need, start making plans, etc. Anyone can post to the discussions if they have an account. Committing to the code/files section needs team access - ask Bill Binko.