Add "how do you feel" dialog for capturing user emotions #100

Open · barbeau opened this issue Nov 5, 2020 · 6 comments
Labels: enhancement (New feature or request)

barbeau commented Nov 5, 2020

Is your feature request related to a problem? Please describe.
We'd like to add a dialog in the app to capture how the user feels while they are listening to music.

Describe the solution you'd like
We discussed today adding a dialog in the app with the title "How do you feel right now?" that would show 6 faces (neutral/same color), similar to the pain scale:

[image: six neutral faces similar to the pain scale]

...but without any other text.

We would prompt the user at the beginning of a listening session and then every 3rd song (not counting skips). A session is "continuous consumption of songs".

We should stop the music until they answer the question, then resume it.

We should use vector drawables for faces.
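
As a minimal sketch of that prompting cadence (the class and method names here are hypothetical, not existing app code):

```kotlin
// Prompt at the start of a listening session, then after every 3rd completed
// song; skipped songs don't count toward the interval. Names are hypothetical.
class EmotionPromptScheduler(private val promptInterval: Int = 3) {

    private var completedSongs = 0

    // Always prompt at the beginning of a listening session.
    fun onSessionStart(): Boolean {
        completedSongs = 0
        return true
    }

    // Returns true when the "How do you feel right now?" dialog should be shown.
    fun onSongCompleted(): Boolean {
        completedSongs++
        return completedSongs % promptInterval == 0
    }

    // Skips are excluded from the count.
    fun onSongSkipped(): Boolean = false
}
```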

Open questions:

  • Allow a "not now" button on the dialog?
  • Allow the user to choose the interval between prompts in settings?
  • How do we handle events while the music is stopped to answer the dialog? Should we log none at all?
  • Are there Material Design icons for these faces, or do we need to create them?
  • Should the faces be solid blue with white eyes, or more of a blue stroke without any fill (e.g., white/transparent fill)?
  • There are dimensions to emotion other than happy/sad (mad, embarrassed, etc.). Could we use a wheel instead of a row to represent these? Is there other literature on collecting emotional data?

Describe alternatives you've considered
Thumbs up/down dialog

barbeau commented Aug 6, 2021

From today's meeting:

  • We need more than just two dimensions to capture the various emotional states.
  • Need to prompt the user at the start of each listening session to get a baseline measurement before listening.
  • Then, have a control overlaid on the album art unobtrusively to allow the user to provide feedback on how they feel at any point during listening.
  • The user can change the setting at any time. On each change, an event is generated with the emotional state as well as the elapsed time within the song, so we can capture the part of the song they reacted to (see the sketch below).
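
A sketch of what such an event could carry, assuming a hypothetical EmotionEvent payload and emotion_changed event name on top of Firebase Analytics:

```kotlin
import android.os.Bundle
import com.google.firebase.analytics.FirebaseAnalytics

// Hypothetical payload for an emotional-state change during playback.
data class EmotionEvent(
    val emotion: String, // e.g., "tense", "calm"
    val intensity: Int,  // e.g., 3 for "Tense-3" on the wheel's steps
    val songId: String,
    val elapsedMs: Long  // position within the song when the user reacted
)

// Event and parameter names are placeholders; the real schema would mirror
// whatever the app already logs for PLAY/PAUSE events.
fun logEmotionEvent(analytics: FirebaseAnalytics, event: EmotionEvent) {
    analytics.logEvent("emotion_changed", Bundle().apply {
        putString("emotion", event.emotion)
        putInt("intensity", event.intensity)
        putString("song_id", event.songId)
        putLong("elapsed_ms", event.elapsedMs)
    })
}
```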

Some example wheels are below.

From "Human Emotion Recognition: Review of Sensors and Methods" at https://www.mdpi.com/1424-8220/20/3/592/htm:

[image: emotion wheel from the paper]

From "How To Use The Emotion Wheel To Better Understand Your Feelings" at https://www.mindbodygreen.com/articles/emotion-wheel, which includes colors:

[images: colored emotion wheels from the article]

Having a similar colorized control that has various steps for each of the "pieces of pie" (e.g., Tense-3, Calm-5) seems to make sense. The control could be patterned after the behavior of the Android alarm clock interface:

[image: Android alarm clock interface]

barbeau commented Aug 13, 2021

@antcoder007 Per today's meeting, we'll focus on implementing the faces from the pain scale first, but will add another face so there are 7 total (instead of 6 as shown in the image at the top of the issue).

Google Material Design icons, which we can use as a start for the faces, can be found at:
https://fonts.google.com/icons

We'll want the SVG files to be saved in the project in the /icons folder because we'll likely need to edit some of them to get the faces from the pain scale. So we should have 7 SVGs at the end, one for each face. You can use any vector graphics editor - I've used Inkscape in the past - https://inkscape.org/.

The line strokes of the SVGs should be black; we can tint them within the app.

The files should be added to the project resources folder in Android vector format; you can convert from SVG to Android vector using Vector Asset Studio:
https://developer.android.com/studio/write/vector-asset-studio#svg
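
Since the strokes are black in the assets, tinting can happen at runtime. A sketch using AndroidX's DrawableCompat (the face_0 resource and the color parameter are hypothetical placeholders):

```kotlin
import android.content.Context
import android.graphics.drawable.Drawable
import androidx.core.content.ContextCompat
import androidx.core.graphics.drawable.DrawableCompat

// Load a black-stroke vector drawable and tint it at runtime.
// R.drawable.face_0 and colorRes are hypothetical placeholders.
fun tintedFace(context: Context, colorRes: Int): Drawable? {
    val drawable = ContextCompat.getDrawable(context, R.drawable.face_0) ?: return null
    val wrapped = DrawableCompat.wrap(drawable).mutate()
    DrawableCompat.setTint(wrapped, ContextCompat.getColor(context, colorRes))
    return wrapped
}
```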

Implementation can be a simple row of 7 faces on top of the album art, each of which has another view with a filled background behind it (a sketch appears below):

  • All faces start in a deselected state when a song starts playing.
  • When the user taps a face, it should change to a selected state and record a Firebase UI event for that face level, including the normal song metadata and seek position (same as the PLAY or PAUSE events).
  • The user can tap a different face to change their selection at any point.
  • If the user leaves the app for a certain amount of time (TBD), we should reset the state to unselected to avoid biasing their mood or responses.

@antcoder007 Let's try to implement this in Jetpack Compose, which just went stable. You'll need the latest stable version of Android Studio Arctic Fox to work with Compose. There are a lot of resources out there for learning Compose. I found this video helpful - https://www.youtube.com/watch?v=g5-wzZUnIbQ - which also has a GitHub repo at https://github.com/philipplackner/MeditationUIYouTube.

This horizontal layout with the faces can be implemented entirely in Compose, and you should be able to add the composable within the existing XML layouts without needing to convert them to Compose as well.
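
A rough sketch of that face row in Compose, assuming hypothetical drawable resources (face_0 through face_6) and an onFaceSelected callback standing in for the Firebase logging described above:

```kotlin
import androidx.compose.foundation.background
import androidx.compose.foundation.layout.Row
import androidx.compose.foundation.layout.padding
import androidx.compose.foundation.shape.CircleShape
import androidx.compose.material.Icon
import androidx.compose.material.IconButton
import androidx.compose.runtime.*
import androidx.compose.ui.Modifier
import androidx.compose.ui.graphics.Color
import androidx.compose.ui.res.painterResource
import androidx.compose.ui.unit.dp

// Hypothetical list of the 7 vector resources converted from the SVGs.
private val faceDrawables = listOf(
    R.drawable.face_0, R.drawable.face_1, R.drawable.face_2, R.drawable.face_3,
    R.drawable.face_4, R.drawable.face_5, R.drawable.face_6
)

// Row of 7 selectable faces to overlay on the album art. onFaceSelected is a
// hypothetical callback that would log the Firebase UI event with the normal
// song metadata and seek position.
@Composable
fun FaceRow(onFaceSelected: (level: Int) -> Unit) {
    var selected by remember { mutableStateOf(-1) } // -1 = nothing selected yet

    Row {
        for (level in 0 until 7) {
            IconButton(onClick = {
                selected = level
                onFaceSelected(level)
            }) {
                Icon(
                    painter = painterResource(id = faceDrawables[level]),
                    contentDescription = "Face level $level",
                    tint = Color.Black,
                    modifier = Modifier
                        .padding(2.dp)
                        .background(
                            color = if (selected == level) Color.White else Color.Transparent,
                            shape = CircleShape
                        )
                )
            }
        }
    }
}
```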

BEFORE you implement the entire solution in Compose, please try to implement a simple Box in Compose on the album art as a test. Shuttle, which MUSER is based on, still uses some old frameworks, and it's possible that Compose dependencies (or the newer Gradle plugin that is required for Compose) won't play well with them. We hit this type of issue earlier when trying to migrate to AndroidX - see #25.
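
One way to run that smoke test, assuming the player screen's existing XML layout gains a ComposeView (the compose_overlay id is hypothetical):

```kotlin
import androidx.compose.foundation.background
import androidx.compose.foundation.layout.Box
import androidx.compose.foundation.layout.size
import androidx.compose.ui.Modifier
import androidx.compose.ui.graphics.Color
import androidx.compose.ui.platform.ComposeView
import androidx.compose.ui.unit.dp

// Called from the fragment/activity that hosts the album art, with the
// ComposeView looked up from the existing XML layout, e.g.
// findViewById<ComposeView>(R.id.compose_overlay) (id is hypothetical).
fun attachComposeSmokeTest(composeView: ComposeView) {
    composeView.setContent {
        // A bare Box is enough to verify that Compose and the older
        // dependencies/Gradle plugin coexist before building the full face row.
        Box(
            modifier = Modifier
                .size(48.dp)
                .background(Color.Red)
        )
    }
}
```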

If we have trouble using Compose in this project, then let's just implement using the good ole XML layouts instead.

barbeau commented Aug 30, 2021

Example UI for capturing emotion in a mobile app, from another study:

[image: example emotion-capture UI]

m-y-u commented Aug 30, 2021

Graphical representation of the emotion scale we want to use for effective emotion matching:
[screenshot: emotion scale diagram]

barbeau commented Aug 30, 2021

@antcoder007 Thanks! Could you please also add the simpler X/Y-axis graph here from the same document? IIRC from our meeting there was a more traditional square graph that was simpler and more along the lines of what a UI would look like. I have concerns about whether the above would be understandable from a user's perspective when used in a UI.

The simpler 2D graph seemed like it would be easier to map emojis to.
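
For illustration only, a quadrant-based mapping from that 2D graph to emoji might look like the following, assuming the axes are valence (X) and arousal (Y) as in common circumplex emotion models; the normalization and emoji choices are assumptions, not from the study document:

```kotlin
// Hypothetical quadrant-to-emoji mapping on the 2D graph; valence and arousal
// are assumed to be normalized to the -1..1 range.
fun emojiFor(valence: Float, arousal: Float): String = when {
    valence >= 0 && arousal >= 0 -> "😄" // positive, high energy (excited)
    valence >= 0 && arousal < 0 -> "😌" // positive, low energy (calm)
    valence < 0 && arousal >= 0 -> "😠" // negative, high energy (tense)
    else -> "😢" // negative, low energy (sad)
}
```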

m-y-u commented Aug 30, 2021

Here is the other X/Y-axis diagram that illustrates the above:
[screenshot: X/Y-axis emotion diagram]
