    Survey for a New AAC device

    Hello,

    I am an assistive technology master's student currently doing a design project that would allow dysarthric AAC users with MND to communicate and journal more easily using a transcriber interface. I am curious about how visualising and highlighting parts of conversations in real time might benefit AAC users.

    I have made a short survey asking about people's experiences with eye-tracking AAC devices and with managing communication speed. I would love to hear everyone's feelings and ideas about these technologies and their limitations, and to use them in my research.

    Here is a link:

    https://redcap.slms.ucl.ac.uk/surveys/?s=D3TMXTFETK


    I am continually amazed by the support and activity this forum has to offer (I was particularly inspired by the limericks being shared).

    Let me know your thoughts on the forum as well.

    Kind regards,

    H
    Last edited by benevolent-ucjuhdr; 29 April 2021, 14:22.

  • Hi H,

    The link doesn't work.

    My app is a comprehensive care app and may use a speech service. It is more a management tool than pure AAC but can do both.

    I don't use eye-tracking.

    I think most of us use a QWERTY keyboard but find it rather tedious; however, the standard AAC device is too limited.



      #3
      Hi Graham,

      Thanks for your input about feelings toward keyboard arrangement; it's a really valid point for AAC! Researchers have run many tests on personalised keyboard layouts and found arrangements that are far more ergonomic. However, these would take a minimum of 20 hours of training before people get used to them enough to benefit, and manufacturers are too scared to risk that!

      I've refreshed the survey and it is working in my browser, but I'll relaunch it if it continues to be a problem!
      Let me know if it works,

      Thanks for your response!

      H



        #4
        A question for you, H: have you spoken to AAC (eye gaze users) and identified this need from their feedback or is it coming purely from a development side?

        Diagnosed 03/2007. Sporadic Definite ALS/MND Spinal (hand) Onset.
        Eye gaze user - No functional limbs - No speech - Feeding tube - Overnight NIV.



          #5
          Hi Ellie,

          Before coming onto the forum I conducted interviews with two speech and language therapists working with people with MND. With them we identified that conversation speed is a need and a pain point for eye gaze users. This is when we came up with the idea of bookmarking and highlighting text. Since then I have been thinking up designs for an interface that would allow people with MND to interact better with text through eye gaze.



            #6
            Don't take this the wrong way, but I haven't a scooby what your big idea is from your explanation 😏 and I'm a long-term, very experienced eye gaze user, using a DIY solution of a gaming eye tracker with open source mouse emulation software.

            IDK if you want to explain more? Have you seen people with MND use eye gaze?

            Thanks.
            Diagnosed 03/2007. Sporadic Definite ALS/MND Spinal (hand) Onset.
            Eye gaze user - No functional limbs - No speech - Feeding tube - Overnight NIV.



              #7
              Ellie, I'd love to.

              Put simply, the design is intended to make conversations easier for AAC users, but also to allow them to record and manipulate text.

              I was imagining a device with two screens (one for the AAC user and one for others engaged in the conversation) and potentially glasses for more precise gaze tracking. On each screen is a transcribed version of what is being said (very accurate and rapid AI audio transcribers already exist, e.g. Otter.ai). Using eye tracking, the AAC user can highlight or flag parts of the conversation they would like to draw attention to and present to their conversation partner via the screen. I am curious whether this would improve feedback to conversation partners, making the conversation less frustrating for AAC users. In addition, the highlighted text could better inform the predictive text functions that already exist in AAC devices, so that more accurate and suitable responses are suggested.

              Eye gaze tracking, as you probably know, is used for a lot more than just moving a mouse and clicking on things. Gaze tracking can also tell the computer where one's attention is being drawn (and in the case of the transcriber, the computer could learn in real time which parts of conversations are salient). Therefore, with the right machine learning application, the computer could learn from the user's gaze and highlight the parts of conversations AAC users are paying attention to, without them needing to do it manually, as that would be tiring. I want to explore how this sort of implementation could make conversations easier.
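              To make that concrete, here is a minimal sketch of the dwell-time idea (entirely hypothetical - the names, screen layout and 400 ms threshold are my own illustrative assumptions, not any real AAC product's behaviour):

```python
# Hypothetical sketch: highlight transcript words by cumulative gaze dwell.
# The Word layout and the 400 ms threshold are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Word:
    text: str
    x0: float  # on-screen bounding box of the word
    y0: float
    x1: float
    y1: float
    dwell_ms: float = 0.0
    highlighted: bool = False

def update_highlights(words, gaze_x, gaze_y, dt_ms, threshold_ms=400.0):
    """Add dt_ms of dwell to whichever word the gaze point falls inside,
    and highlight a word once its cumulative dwell passes the threshold."""
    for w in words:
        if w.x0 <= gaze_x <= w.x1 and w.y0 <= gaze_y <= w.y1:
            w.dwell_ms += dt_ms
            if w.dwell_ms >= threshold_ms:
                w.highlighted = True
    return [w.text for w in words if w.highlighted]
```

              Fed 30 ms gaze samples, a word fixated for about half a second would be flagged; the highlighted set could then be mirrored to the partner's screen or passed to the predictive-text model.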

              I was imagining this could be used for conversing, but also to rapidly communicate urgent needs. For instance, if a carer lists out options of what could be wrong, the AAC user could rapidly select the answer with their eye gaze.
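              The option-selection step could be equally simple - something like choosing whichever listed option accumulates the most gaze samples (again a hypothetical sketch of mine, not an existing device's behaviour):

```python
# Hypothetical sketch: a carer's options are shown as screen regions, and the
# person with MND answers by looking at one; pick the most-looked-at region.
from collections import Counter

def select_option(options, gaze_samples):
    """options: list of (label, (x0, y0, x1, y1)) screen regions.
    gaze_samples: list of (x, y) eye-tracker points.
    Returns the most-fixated label, or None if no option was looked at."""
    hits = Counter()
    for x, y in gaze_samples:
        for label, (x0, y0, x1, y1) in options:
            if x0 <= x <= x1 and y0 <= y <= y1:
                hits[label] += 1
    return hits.most_common(1)[0][0] if hits else None
```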

              Another part of the study is understanding AAC users' journaling behaviours and how the interface could facilitate text entry as well.

              I am glad to have someone with such experience be willing to listen.

              Let me know what you think and whether you would like to participate in the work (through a confidential interview or a survey about your experience).

              Thanks,
              H
              Last edited by benevolent-ucjuhdr; 29 April 2021, 19:35.



                #8

                Originally posted by benevolent-ucjuhdr View Post
                I was imagining a device with two screens (one for the AAC user and one for others engaged in the conversation)
                I can see the 2 screens causing problems as many eye gaze tablets/PCs are wheelchair mounted, the lighter and sleeker the better - remember, we transfer in and out of our chairs and move around.


                Originally posted by benevolent-ucjuhdr View Post
                On each screen is a transcribed version of what is being said (very accurate and rapid AI audio transcribers already exist
                This is the part I don’t understand - does the person with MND not need to have good enough speech for it to be transcribed, or am I missing something fundamental?




                Diagnosed 03/2007. Sporadic Definite ALS/MND Spinal (hand) Onset.
                Eye gaze user - No functional limbs - No speech - Feeding tube - Overnight NIV.



                  #9
                  Ahh, great point on the clunkiness of the setup - maybe a flip tablet that can fold over backwards, like the Samsung flex, would get around having too much on one mount.

                  This is the part I don’t understand - does the person with MND not need to have good enough speech for it to be transcribed, or am I missing something fundamental?
                  RE: the transcriber function was thought up for when speech is lost completely. The transcriber would transcribe the people talking to the person with MND and display it to both people on the screens. The text of the conversation could be a visual aid guiding the conversation, helping to predict more accurate answers, and showing where the AAC user's attention is.



                    #10
                    Originally posted by benevolent-ucjuhdr View Post
                    The transcriber function was thought up for when speech is lost completely. The transcriber would transcribe the people talking to the person with MND and display it to both people on the screens. The text of the conversation could be a visual aid guiding the conversation, helping to predict more accurate answers, and showing where the AAC user's attention is.
                    OK, I can't see that being of any use to me - I think you might be trying to solve a problem that does not exist - but maybe someone will say otherwise.

                    I wish you the very best however, it's just not something I'd have an interest in. x

                    Diagnosed 03/2007. Sporadic Definite ALS/MND Spinal (hand) Onset.
                    Eye gaze user - No functional limbs - No speech - Feeding tube - Overnight NIV.



                      #11
                      Thank you for your feedback Ellie,

                      I read a lot about AAC users struggling with the speed of conversation and thought I'd try to come up with something, but if this does not apply to you, I am very happy for you.
                      Last edited by benevolent-ucjuhdr; 29 April 2021, 21:06.



                        #12
                        Originally posted by benevolent-ucjuhdr View Post
                        I read a lot about AAC users struggling with the speed of conversation and thought I'd try to come up with something
                        It's within a group that the speed of chat is a problem and not really in a one-on-one conversation or even one-on-two. How would the transcriber cope with several people chatting and talking over each other?
                        Diagnosed 03/2007. Sporadic Definite ALS/MND Spinal (hand) Onset.
                        Eye gaze user - No functional limbs - No speech - Feeding tube - Overnight NIV.



                          #13
                          H, I wonder, have you spent time with people with MND who use an eye gaze controlled computer all the time? To understand the problems they face, I would suggest you contact your local MND coordinator and maybe spend a few hours with some eye gaze users.

                          Good luck with your research

                          Sarah



                            #14
                            Hi Sarah,

                            I would have loved to this year when the project started, but with COVID my university prevented all contact with people with MND - and anyone else, for that matter.

                            We could only interview speech and language therapists over Zoom, read academic papers and watch videos, which doesn't come close to the lived experience. From there we were to make design portfolios of potential technologies to assist the day-to-day experience of communicating. It was pretty great that we got to meet someone working on Google's Project Euphonia, but it was still very hard not being in contact with real AAC users because of COVID.

                            It's been quite challenging to stay confident in the ideas, so I thought I would read what people were saying on the forum and create a survey. I've had one response though, so maybe forums are not the place for this sort of interaction, since researchers are naturally inexperienced and it might be seen as offensive. But I thought I would try to get in touch anyhow, and it's been nice seeing the support you all give one another.

                            all the best,

                            H
                            Last edited by benevolent-ucjuhdr; 30 April 2021, 09:31.



                              #15
                              Originally posted by benevolent-ucjuhdr View Post
                              ... I thought I would read what people were saying on the forum and create a survey. I've had one response though, so maybe forums are not the place for this sort of interaction, since researchers are naturally inexperienced and it might be seen as offensive.
                              Don't take it personally, H, it is not a rebuff to you in any shape or form, nor is asking for input in any way offensive.

                              It's just that the vast majority of eye gaze users have much more pressing needs than to take the time and effort to complete a survey, which may or may not be relevant to them - that includes the effort and accuracy required in copying the non-clickable web link you gave and pasting it into their browser address bar. MND saps us of energy, so we find ourselves rationing it...

                              Yes, it certainly adds to your project's difficulty that you cannot sit with real eye gaze users - I wonder if there is any way of you borrowing an eye gaze system (maybe through an SLT or the Uni?) and seeing for yourself how conversations really happen in practice (you'd have to tape your mouth, sit on your hands and put a kilo weight on your head though 😉😉)

                              x

                              Diagnosed 03/2007. Sporadic Definite ALS/MND Spinal (hand) Onset.
                              Eye gaze user - No functional limbs - No speech - Feeding tube - Overnight NIV.
