Service Design & the Shmoody Community
How I incorporated Social Work, Participatory Design, & UX Methods to develop a set of “V1” processes for the Shmoody team to engage with users in the in-app community space.
Project Information
Employer
Shmoody (Moodworks Inc.)
Year
2022
Contributors
Shmoody’s Head of Product, the development team (4 developers), Shmoody’s CEO, and Shmoody community members (users)
My Role
UX Researcher & Designer; Service Designer
Methods
In-app surveys
Affinity diagramming community posts
User, Subject Matter Expert (SME), and community ambassador interviews
Literature review
Competitor analysis
Participatory design
Goals & values alignment
Iterative process development
Project Summary
Context
The Shmoody community is an in-app space for users to participate in a forum with different topics related to mental health, respond to daily questions, and create new connections with others who may have experienced similar challenges or triumphs in their mental health journey.
The Shmoody app first launched in April 2022, and I worked as the sole UX Designer on a small team that fluctuated between about 6 and 10 people depending on business needs. At the start of my employment, I worked mostly on a project-to-project basis with a UX focus. As I matured in my role, I brought in more of my social work experience and asked for more ownership in establishing processes and in examining the larger mental health app space to inform the research, design, and backstage processes related to the community.
The Challenges
Our “playbook” (processes) for the community space needed to evolve because our team members were spending a lot of time in the app posting, replying to users, and moderating content on a case-by-case basis.
We noticed that some of our users were struggling with unique mental health challenges such as self-harm or eating disorders, or were expressing suicidal thoughts or intentions. We agreed we didn’t know enough about which specific actions we should take (and when to take them) to create a supportive environment and respond appropriately to users. My task was to develop a set of go-to processes my team could use to maintain the positive and vulnerable tone in the community we had worked hard as individuals to establish.
There are not many “go-to” references for how to set up an ethical and engaging community space in a mobile application. We needed to discover what standards did exist and make our best “V1” effort in order to start somewhere.
We Wanted To…
Learn from users to understand how we could create a safe and ethical digital environment without over-moderating them or discouraging vulnerability and connection
Align on how much weight our values, business goals, standard practices we’d observed in competitors, and feedback from users should each carry in shaping our processes
Create a teachable and clear set of “V1” processes for community management that would allow us to reduce uncertainty in our responses and to invest time in other areas of product development
Results
Understood some of the necessary touchpoints of the in-app user experience we needed to address (or create altogether), based on market, Subject Matter Expert (SME), and user research related to community spaces
Understood some of the gaps in our service delivery and created several deliverables to assist the team with the processes around community management
Learned more about how to support users in the community and incorporated findings into our processes and in-app experience
Throughout the page below, I have highlighted a case example of how we addressed the topic of self-harm, showcasing some of the methods I used to develop a “V1” set of processes and other deliverables to guide our team’s engagement with the community. We used many of these same methods in response to different needs that arose as the community grew.
Discovery Research
I needed to better understand self-harm in general and how we might incorporate processes related to this topic in our in-app community space. At the start, I didn’t have a wide-enough lens to ask my team or SMEs targeted questions, clearly state the problem/challenge, or start co-creating solutions. I tried to keep an open mind and avoid jumping to conclusions, despite having a background in social work.
Discovery research methods
Pulling insights from various user research channels, including:
In-app Typeform surveys
Relevant in-app community posts
Ongoing (weekly) user interviews
Reviewing research articles on the subject of self-harm and digital spaces
Conducting several competitor analyses
User Research: In-App Surveys
I incorporated several in-app surveys using Typeform, which allowed me to collect general and feature-specific feedback. As I built out our UX Research repository in Condens.io, I started tagging these survey responses along with quotes from user interviews. This added another dimension to our familiarity with user asks and needs, and gave me an opportunity to connect with users and ask clarifying questions.
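To give a concrete (and purely illustrative) sense of how tagging connected feedback across channels, a minimal sketch of a tagged feedback record might look like the following. The names here are hypothetical and are not Condens’ actual data model:

```typescript
// Hypothetical shape of a tagged feedback item in a research repository.
// These names are illustrative; they are not Condens' actual API.
interface FeedbackItem {
  id: string;
  source: "typeform-survey" | "user-interview" | "community-post";
  quote: string;      // the user's own words
  tags: string[];     // e.g., "self-harm", "feature-request"
  contactable: boolean; // did the user agree to follow-up questions?
}

// Pulling every item that shares a tag lets one insight be traced
// across surveys, interviews, and community posts at the same time.
function byTag(items: FeedbackItem[], tag: string): FeedbackItem[] {
  return items.filter((item) => item.tags.includes(tag));
}
```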
💡 Insights
We received several responses from users expressing concern for other users, which matched the sentiment I found when talking with users in interviews
A few users asked for the ability to flag concerning posts so Shmoody could take action; this signaled that users expect a reporting process that is automated and can be triggered in-app by users themselves (sketched below)
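As a minimal sketch of the kind of in-app flagging flow users were asking for (all names here are hypothetical; this is not Shmoody’s actual implementation):

```typescript
// Hypothetical in-app flagging flow: a user reports a post, and the
// team reviews a queue instead of discovering concerning posts ad hoc.
type FlagReason = "self-harm" | "harassment" | "spam" | "other";

interface PostFlag {
  postId: string;
  reporterId: string;
  reason: FlagReason;
  createdAt: Date;
}

const moderationQueue: PostFlag[] = [];

// Triggered in-app by the reporting user.
function flagPost(postId: string, reporterId: string, reason: FlagReason): void {
  moderationQueue.push({ postId, reporterId, reason, createdAt: new Date() });
  // Sensitive reasons could notify the team immediately rather than
  // waiting for the next scheduled review.
  if (reason === "self-harm") {
    notifyTeam(postId, reason);
  }
}

function notifyTeam(postId: string, reason: FlagReason): void {
  console.log(`Urgent review requested for post ${postId} (${reason})`);
}
```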
User Research: Community Posts
I kept track of posts and replies from users related to their experience in the app and any suggestions or feedback they had. As I worked on design projects, I continuously incorporated text from these posts into our tagging system. I knew that having access to the community posts gave us an advantage in knowing our users better. I was better equipped to develop processes for the community because I was actively participating with my own posts and replies daily.
You can see a few example posts I collected and sorted into a collaborative, work-in-progress board that I shared with team members to show how I was developing processes and how our users were influencing this work. I found that looking at the posts in the context of how they appeared visually in the community was more useful than looking at text samples alone.
💡 Insights
There’s a theme of users marking their own posts with a “trigger warning” or TW. This shows they have concern for other users but also still feel vulnerable enough to share what they’re struggling with in a genuine way.
Some users are using posts as a way of staying accountable or updating the community on their status around self-harming behaviors and other mental health challenges
From a qualitative perspective, our users have created a very supportive community and are doing a lot on their own to respond to other users who are struggling; we should not underestimate users themselves as a resource
User Research: Weekly User Interviews
To get continuous feedback and keep the team updated with the most pressing user needs, I incorporated a way for our users to sign up for paid interviews once they had shared feedback with us in one of our in-app surveys. This way, I was talking with users who had already expressed interest in sharing their thoughts and were actively using the app.
I used an expandable user interview discussion guide each time I spoke with one of our users, following a semi-structured format so I could intentionally ask questions related to current projects while also adapting my questions to the app areas each participant was most engaged with.
I devoted hours to speaking with different types of community users: those who used it daily, those who had specific concerns, and those who were on the fence about sharing. I asked questions about their communication behaviors in the community and questions that would help me learn more about their perceptions of the space.
Project goal addressed ✅
“Learn from users to understand how we could create a safe and ethical digital environment without over-moderating them or discouraging vulnerability and connection”
💡 Insights
The most interesting insights came from asking questions about posts that caused users to pause or posts that caught their attention. For users who had more experience in online spaces, I was able to ask more specific questions about the parts of the experience that made them feel encouraged (and safe) to share with others on personal subjects.
Literature Review
One challenge of working for Shmoody was that there wasn’t always a guide or standard to follow on how to design and support an ethical and engaging community space. Many mental health apps (especially those that aren’t prescription digital therapeutics) are “bound” only by what the Apple App Store or Google Play set as acceptance criteria. Therefore, I felt it was important to wear my social worker hat and explore what the recent peer-reviewed literature had to say about self-harm and about creating safe and engaging digital community spaces.
When I searched for scholarly articles to review, I made sure they were published recently (no earlier than 2013) and included the following keywords in my searches: self-harm, suicidal intentions, felt-safety, community, social media, support forums, youth, internet use, online, mobile, and chat room. I screened several articles until I found five that seemed most relevant to the topics we had questions about.
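As an illustration only, the screening criteria could be expressed roughly like this (the actual screening was done by hand, and these names are hypothetical):

```typescript
// A sketch of the screening criteria applied to search results:
// keep only recent articles that mention at least one keyword.
interface Article {
  title: string;
  abstract: string;
  year: number;
}

const KEYWORDS = [
  "self-harm", "suicidal intentions", "felt-safety", "community",
  "social media", "support forums", "youth", "internet use",
  "online", "mobile", "chat room",
];

function passesScreen(article: Article): boolean {
  const text = `${article.title} ${article.abstract}`.toLowerCase();
  return article.year >= 2013 && KEYWORDS.some((k) => text.includes(k));
}
```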
💡 Insights
According to the literature, certain self-harm language could potentially introduce readers to a new unhealthy coping mechanism. We need to decide as a team which kinds of discussions we can allow, and which we will have to remove because of potential harm.
Articles stressed the importance of community forums emphasizing that they are not crisis resources, and that those experiencing a crisis should contact emergency services or a hotline.
The literature also identified several potential benefits for users in safe community spaces, including:
reducing social isolation & finding informal support
being encouraged to seek additional support when needed
finding coping strategies
connecting with others who’ve had similar lived experiences
Some potentially harmful conversations around self-harm could include:
describing self-harm as a coping mechanism
sharing explicit details about the frequency or nature of the self-harm
discouraging other users from seeking more formal support around self-harm by minimizing the seriousness of self-harm or normalizing self-harm behaviors
Competitor Analysis
Competitor analysis is one of my favorite UX methods because it often surfaces new questions or reveals project opportunities and challenges I hadn’t considered. In the context of the community research, if I found a component of an app’s community user experience that all competitors had implemented, I knew I would recommend it for our app too. This took some of the guesswork out of finding “a place to start” with such a complex design project. I set broad inclusion criteria as a baseline for the community-related research, since there still aren’t many apps that offer features similar to Shmoody’s.
Inclusion criteria:
must offer an in-app community space (where “community space” is defined by users’ ability to create posts or interact with other users’ posts in a public forum)
must have a visible/accessible set of guidelines or rules & privacy policy
Topics explored:
competitors’ “stance” on allowing or denying minors access to in-app community spaces, minor-specific language in the privacy policy, and user experience related to specifying age
language & key elements competitors use in the community guidelines (sometimes referred to as “rules” or “community standards”), as well as any visible actions (processes) they take when their users violate guidelines
competitors’ inclusion (or exclusion) of language & processes around self-harm
💡 Insights
Although other apps and community forums make it clear to users what types of posts or replies are allowed, it’s harder to find information on the “backstage” processes their teams use to moderate their community spaces, or on what in-app interactions happen when users violate the rules
Specific types of guidelines/rules appeared across all of the apps I compared; this made it possible for me to discern what competitors deemed absolutely necessary and gave me a place to start
Other apps stated very clearly in their guidelines that they did not permit discussion of details related to self-harm; self-harm was often grouped under the umbrella of “harm to self or others”
Research Question 🧐
In addition to elements related to user experience, what can I discover about the processes our competitors use to create safe and engaging community spaces?
Market Research
When I explored the regulation of content in mental health app community spaces, I did not find a set of standards we could follow. This is in line with what I discovered in the literature review: much is still unknown about what constraints (if any) should be in place for mental health apps. The benefits associated with digital community spaces (connecting with others over lived experiences, the potential for suicide prevention, and the therapeutic nature of expression) contrast with the concerns (being exposed to unhealthy coping mechanisms or mistaking a forum for a crisis resource).
💡 Insights
Many mental health apps lack disclaimers about the collection of user information and do not declare that they are not crisis resources.
There is currently no governing body to oversee and regulate app development and availability (specifically related to mental health).
Two factors make it difficult to conduct long-term research on the evidence-based effectiveness of mental health apps: the low retention rate of new users & the rate at which new apps are released on the market
Opportunities Revealed by Research
We can differentiate ourselves from competitors by using an informed self-harm strategy: we can provide a place for users to make connections and receive support, while also trying to discourage unhealthy conversations that could potentially harm users. We have an opportunity to include specific self-harm resources.
We have an opportunity to clearly define where the line is for our users, so they feel encouraged to share and express themselves but are also aware of the expectations and consequences around specific topics like self-harm.
Only a small percentage of mental health apps clearly declare that they are not crisis resources or provide resources for users who are in crisis. Few apps in the app store’s “health and wellness” category make it clear exactly how user information is collected or used. If we make clear to users exactly what kind of help our community space seeks to provide, without claiming to be anything we are not, we will already be one step ahead of many competitors.
We have an opportunity to respond to the needs of our community. Since the mental health app space is still largely unregulated, we can monitor the types of conversations happening in the community and continue to talk with users and SMEs about how to improve our service delivery (for example, adding more specific resources or implementing a new process for moderation).
UX Democratization & Collaboration
The insights I gathered from discovery research allowed me to start drafting my deliverables early on to collaborate with stakeholders. I used the participatory design methods discussed below in order to…
1) bring stakeholders into accessible draft spaces, like Google Docs or Whimsical boards (outside of the research repository) where I could encourage them to contribute their thoughts and ask about any roadblocks to their participation
2) help to proactively scope out the ongoing work and communicate with my team about development asks.
I have found several benefits to starting and sharing work-in-progress deliverables (Google Docs, journey maps, wireframes, etc.) early in a project:
journey maps and user flow diagrams help me understand experiences, communicate concepts visually, & engage with stakeholders
early drafts expose the gaps in my knowledge, so I don’t come to the end of the project and realize I have missed something big
making drafts available to my team (and asking for feedback) makes the UXR process visible and encourages collaboration
Participatory Design & Social Work
Participatory design is a method used to intentionally co-create, cooperate, and co-design with stakeholders, instead of presenting a design solution to them without their input and participation. Participatory design exists on a spectrum of stakeholder involvement and can be carried out with a range of structured or semi-structured design activities (like brainstorming sessions, or having stakeholders directly contribute to designs or journey maps).
I was motivated to incorporate two types of participatory design into this project (user-centered design & co-design) to address the vastly different needs of three Shmoody stakeholder groups: the Shmoody team, users themselves (community members), and our Shmoody ambassadors.
User-Centered Design Methods
Case reviews & online, collaborative whiteboard sessions
Goal & vision alignment
Co-Design Methods
Interviews with community ambassadors
Iterative process development
Case Reviews & Collaboration
Our “North Star” vision for the community space:
Create and contribute to an awesome, supportive, & positive community space that fosters human connection by giving users a place to give & get support, feel welcome & encouraged to share, have fun, and stay accountable to each other & themselves.
💡 Insight
There is value in conversations on topics that seem basic to the team. We thought we had a shared understanding of our north star, but this 30-minute task really helped us get aligned.
Project goal addressed ✅
“Align on how much weight our values, business goals, standard practices we’d observed in competitors, and feedback from users should each carry in shaping our processes”
Goals & Values Alignment
We wanted users to feel supported while preventing others from seeing content that could be unsafe for them. We held scheduled meetings to discuss our goals and stay aligned with our values.
Design With: Community Ambassador Interviews
I held special interviews with members of our community who had volunteered to be ambassadors. We wanted a genuine feel in the community that wasn’t “forced,” but I also wanted to make sure we were supporting our ambassadors as they supported others.
💡 Insight
I gained some valuable insights from ambassadors and found real value in being transparent about my intentions with the meetings. Once the ambassadors knew we were looking for ways to better communicate with and support them, the suggestions and feedback started flowing.
This work has resulted in…
A Community Playbook - an expandable document that details processes (including template responses) for the Shmoody team to use in our interactions with the community. (A sketch of how a playbook entry might be structured appears after this list.)
New resource listings - I had previously made a page on our website with resources for users experiencing crisis. This work highlighted the need for an expanded set of resources (including some specific to self-harm). In response to this research, we also added the resources inside the app experience itself and encouraged users to explore their options.
Community Guidelines (user-facing) - a document that lists our values and informs users about what violates the rules (i.e., anything that could cause harm to other users). I included self-harm-specific guidelines since my research had revealed that many users felt directly and negatively impacted by others’ posts on self-harm topics.
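As a hypothetical sketch (the real playbook is a living document, not code), a single playbook entry might be structured like this so that any team member can respond consistently:

```typescript
// Illustrative structure for one playbook entry; field names and the
// example text are hypothetical, not Shmoody's actual playbook.
interface PlaybookEntry {
  topic: string;            // e.g., "self-harm"
  allowed: string[];        // kinds of discussion we keep up
  notAllowed: string[];     // kinds of discussion we remove
  templateResponse: string; // supportive reply the team can adapt
  escalation: string;       // what to do beyond the template
}

const selfHarmEntry: PlaybookEntry = {
  topic: "self-harm",
  allowed: ["asking for support", "accountability updates"],
  notAllowed: [
    "explicit details of frequency or method",
    "framing self-harm as a coping mechanism",
  ],
  templateResponse:
    "Thank you for trusting the community with this. Here are some resources that may help...",
  escalation: "Remove the post, reply privately, and link crisis resources.",
};
```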
Conclusions
Looking forward…
The Shmoody community processes have since evolved as we’ve come across different challenges. We believe it’s necessary to continue to develop these processes to meet the needs of our users.
My Growth
The Shmoody team trusted me, with overwhelming support, to approach this project as both a UX professional and a social worker, and I’m convinced that the work I completed was better for it. The practice of UX Research & Design can be vastly different from direct-practice Social Work, but I believe the social worker’s perspective gives me an edge in thinking about the systems and environments we exist in as individuals.
Incorporating the Social Work Code of Ethics into agile product development has been a big challenge for me because in previous roles I was accustomed to constant evaluation, favoring standards of practice and meticulous work over iterative designs meant to “get out the door.” I have a newfound respect for getting out a first testable version that I can co-create with the users I’m serving, far more than for a polished and perfect document, prototype, or process that will never see the light of day.
I am also growing in my ability to consider methods that are appropriate for project needs while keeping in mind the service-level processes that contribute to those end products. In short, I’m incorporating methods from service design into my UX process.
References
Reddy, N., Rokito, L., & Whitlock, J. (2016). What is the link? The relationship between non-suicidal self-injury and social media. Information Brief Series, Cornell Research Program on Self-Injury and Recovery. Cornell University, Ithaca, NY.
Palmer, K. M., & Burrows, V. (2021). Ethical and Safety Concerns Regarding the Use of Mental Health-Related Apps in Counseling: Considerations for Counselors. Journal of Technology in Behavioral Science, 6(1), 137–150. https://doi.org/10.1007/s41347-020-00160-9
Yates, A., Cohan, A., & Goharian, N. (2017). Depression and self-harm risk assessment in online forums. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing (pp. 2968–2978). Association for Computational Linguistics.
Marchant, A., Hawton, K., Stewart, A., Montgomery, P., Singaravelu, V., et al. (2017). A systematic review of the relationship between internet use, self-harm, and suicidal behaviour in young people: The good, the bad and the unknown. PLOS ONE, 12(8), e0181722. https://doi.org/10.1371/journal.pone.0181722
Daine, K., Hawton, K., Singaravelu, V., Stewart, A., Simkin, S., & Montgomery, P. (2013). The power of the web: A systematic review of studies of the influence of the internet on self-harm and suicide in young people. PLOS ONE, 8(10), e77555. https://doi.org/10.1371/journal.pone.0077555
Mars, B., Heron, J., Biddle, L., Donovan, J. L., Holley, R., Piper, M., Potokar, J., Wyllie, C., & Gunnell, D. (2015). Exposure to, and searching for, information about suicide and self-harm on the Internet: Prevalence and predictors in a population based cohort of young adults. Journal of Affective Disorders, 185, 239–245. https://doi.org/10.1016/j.jad.2015.06.001