Remote Concerts


Project Description

This project explores how online concerts can be conducted and billed so that visitors are satisfied and artists can earn a living. The main objective of the project is to overcome the on-site nature of concerts while keeping all the salient aspects of the real-life experience, for both artists and concert attendees.

Group Collaboration

At the beginning of the project we discussed our strengths and weaknesses and how we wanted to collaborate during the project. Belbin's Team Role model served as the basic concept for defining our roles.

BelbinTeamRoles.PNG

Background knowledge on music industry

We built a shared basic understanding of the music industry to put the lived experiences of artists into a wider context. In doing so, we came to understand some of the complexities of operating a concert platform within that industry.

MusicIndustry.PNG

Contextual Inquiry

Contextual inquiry can be seen as the combined use of different methods with the goal of disclosing work structures, exploring the usage and environment of existing technology, and gathering ideas about the future development of a product or system.
It can usually be accomplished in four steps:

  1. narrowing down the question or purpose
  2. choosing data collection method
  3. investigating in the context
  4. analysing and interpreting the findings

Since music events depend on both visitors and artists, our contextual inquiry is divided along these two subgroups.

We define the users' perspective as the user experience of attending:

  • a virtual music event
  • a virtual non-music event
  • a live music event

We define the artists' perspective as the user experience of performing within the context of:

  • a virtual music event
  • a virtual non-music event
  • a live music event

Objectives

In general, we are interested in learning about the user experience of artists and visitors at a virtual music event.
More specifically, we want to understand:

Users' perspective | Artists' perspective
motives behind attending or not attending a music event (virtually or live) | motives behind taking or not taking a gig (virtually or live)
engagement and interaction with other people at the event, including the artist | impact of audience feedback in virtual and live settings
technological tools behind virtual music events (mostly during the event) | technological tools behind virtual music events (preparation and actual event)
experience of and improvements to virtual music events | experience of and improvements to virtual music events

Data Collection

To learn more about these objectives we use interviews, surveys, and observation.

The questions below are organized by method and by perspective. For users, we distinguish participants who are experienced in virtual music events (a) from those who are not (b).

Surveys (Users' perspective)
1. What factors make live music events enjoyable? What motivates you to attend a live music event?

2. Have you participated in virtual music events?

If users do not have experience with virtual music events, they will be asked about any other (non-music) virtual event.
In addition, they will be asked the following set of questions (3.b) to understand their motives for not attending, or potentially attending, virtual music events.

3.b In the past 18 months, were you offered the opportunity to attend virtually a music event you would normally have attended in real life?
if yes: what are your reasons for not attending the virtual event?
if no: imagine your favorite artist would give a virtual concert. Would you attend?
if no: what are your reasons for not attending the virtual event?

3. Please indicate your level of agreement with the following statements: The virtual music event I attended…
was comparable to live music events (5-point Likert scale)
brought new experiences (5-point Likert scale)
4. Would you please name a few of the virtual events you participated in?
5. What platform was the virtual event hosted on?
6. What did you enjoy about them?
7. What did you not enjoy about them?
8. What can be improved about them, in your opinion?
9. How much did you engage with other participants in virtual music events, in comparison to live settings?
10. How was your quality of engagement with other participants in the virtual music events, in comparison to that in the live settings?
11. Please describe a typical interaction you had with other participants of the same virtual music event.
12. How would you describe the main difference between a virtual music event and a live one?

Surveys (Artists' perspective): no survey was conducted.

Interviews (Users' perspective)

1. What usually motivates you to attend a live music event?
2. Have you participated in virtual music events?

If users do not have experience with virtual music events, they will be asked about any other (non-music) virtual event.
In addition, they will be asked the following set of questions (3.b) to understand their motives for not attending, or potentially attending, virtual music events.

3.b In the past 18 months, were you offered the opportunity to attend virtually a music event you would normally have attended in real life?
if yes: what are your reasons for not attending the virtual event?
if no: imagine your favorite artist would give a virtual concert. Would you attend?
if no: what are your reasons for not attending the virtual event?

3. Would you please name a few of the virtual events you participated in?
Let's talk about your experience with virtual music events...
4. Was it comparable to live music events?
5. Did it bring new experiences?
6. What did you enjoy about those virtual events?
7. What did you not enjoy about them?
8. What can be improved?
9. Compared to live settings, did you...
engage more or less with other participants of the same event?
have better or not as good a quality of engagement with other participants?
10. Please describe a typical interaction you had with other participants of the virtual music event.
11. What makes it (not) enjoyable?
12. To summarize, what do you see as the difference between a virtual music event and a live music event?

Interviews (Artists' perspective)

1. As an artist, what does it mean for you to perform a gig?
2. I’d like to learn about your experience with concerts. Could you please share with me which aspects of live performance make them enjoyable for you?
3. Which criteria do you consider when deciding whether or not to take a gig? What factors play a role in this decision making process?
4. What does having audience feedback in live performance mean to you? How important is the audience feedback for you? (optional maybe: Could you please share a remarkable experience you had?)
5. Can you tell me about your experience with online performance?
5a) How many online gigs have you had?
5b) What does it mean for you to perform an online gig and how is it comparable to performing a live gig?
5c) Which tools do you need to perform a virtual gig?
5d) What does an enjoyable virtual gig mean to you?
6. What does having audience feedback in virtual performance mean to you? How important is the audience feedback for you?
6a) What does it mean for you to not have audience feedback?
7. What do you think needs to be improved for a better gig experience?

Observation

observation domains: 1. Audience Interaction 2. Audience Feedback 3. Technology


observation questions:
audience interaction - How can people interact?
audience interaction - How often do people observably interact?
audience feedback - How can the audience's reactions be fed back to the artist?
technology - How does the technology support the audience experience?
technology - What does the platform allow?
technology - How often do people use functions?
technology - What are disadvantages?

questions to the audience (if possible):
motivation - What brought them here?
experience - What are they (not) enjoying?
forward-thinking - What can be improved?

The observation protocol can be expanded based on learnings from the first few observations.

Results

In the following chapters, participation rates and representative examples of results are presented for each of the data collection methods used. The interpretation section then describes our insight-making process and its final results.

Surveys

Users' Perspective
Survey period: 27th May 2021 - 2nd June 2021
Platform: Google Forms
Distribution: collaboration with Rave the Planet, a company that organizes festivals
Participants: 60

implementation of the user survey on Google Forms

Interview

Users' perspective:
Interview period: 5th May 2021 - 14th May 2021
Participants: 4
Sample description: experienced in virtual music events (4)

Artists' perspective:
Interview period: 8th May 2021 - 25th May 2021
Participants: 4
Sample description: queer performer (3); DJ (1)

implementation of interview protocols in Google Docs

Observation

Overall, five virtual music events were visited and observed:
8th May 2021 - Shanghai Community Radio (SHCR)
10th May 2021 - xJazz festival
11th May 2021 - xJazz festival
14th May 2021 - xJazz festival
15th May 2021 - xJazz festival


SHCR.png XJazz 02.png
screenshots of the observed platforms



Link to the observation protocol google sheet: https://docs.google.com/spreadsheets/d/1OPbP6Pw3i8EA2uU3igCzpzu3MsA5_y9Cde5_2QXqS_4/edit?usp=sharing
ObservationProtocol.PNG
implementation of observation protocol on google sheets

Insight-making

Our insight-making process for the interviews was as follows: each of us read through all the interviews and recorded the most important pieces of information on a virtual notepad.
Subsequently, we grouped all the information in a table into different categories. This table serves as the basis for further work such as personas, scenarios, and the focus group.
A similar insight-making approach was used for the observations: the protocol Google Sheet was read and condensed into cohesive insights, which were then grouped into the three observation domains.

InsightMaking.PNG
implementation of the insight making process



final interpretation of the results

ObservationInsights2.PNG
observation insights


Link to the Miro-table.pdf: https://drive.google.com/drive/u/0/folders/1awNgOwgO7652lOXdgrelSLz9XZ6mpv0W

JTpdf.PNG
interview insights

Focus Group

general information

date: 14th June 2021
participants: Users (2) and artists (3)
observer: Prof. Kolrep


material

  1. script and agenda
  2. Zoom room
  3. Nuudel time slot poll
  4. Miro board


Procedures

Item | Description | Time
Introduction | background of the student project; camera and muting rules; confidentiality for recording | 2 min
Information | presentation of the most important findings so far | 8 min
Ice Breaker | participants are asked to briefly introduce themselves, say what made them happy last week, and then choose the next person to continue | 7 min
Round Robin | the focus group is presented with a question; after 1 min of reflection time, one participant (No. 1) is asked to share their thoughts out loud and then pass on to the next person (No. 2), who has 30 seconds of thinking time before contributing an additional point, idea, or thought building on the previous contribution. Question 1: What makes people want to attend a remote event as an audience? Question 2: What makes artists want to perform a virtual gig? | 25 min
Brainstorming | participants were asked to write on sticky notes on a Miro board what their ideal remote concert platform would look like; there was no restriction on which aspects could be included, but we gave the following hints: features, visuals, functions, and service | 10 min
Wrap Up | expressing our thanks to the participants | 3 min

Results

Results

Everything the participants mentioned is potentially valuable to us, so two of us focused on taking notes. In addition, we recorded the whole session so we can re-watch it.
On top of that, we kept the Miro board with all the ideas the participants provided during the brainstorming activity.

Fg brainstorm.PNG
results of the brainstorming phase

Modeling

User Stories

Based on our data from the contextual inquiry and the focus group, we were able to form user stories. After the initial collection was done, we marked in grey the user stories that addressed the same aspect as another, so we could focus on unique aspects.

Userstories.PNG
user stories

Requirements

Through the data collected via interviews, observation, the survey, and the focus group, we were able to identify requirements for our future platform. For some of them we already noted ideas for future implementation in the form of product qualities.

ArtistR.PNG
Identified requirements of artists.


UserR.PNG
Identified requirements of users.

Personas

We tried to capture and represent the participants from our different methods in these personas.

PersonasE.PNG
Translation of our findings into three personas: one artist and two users.

Use Cases and Scenarios

In both use cases and scenarios certain keywords are marked. This indicates that the respective aspect relates to an identified requirement.


Pino.PNG
Typical use case and scenario of an artist on GiGGD.



Astrid.PNG
Typical use case and scenario of a simple user on GiGGD.



Pablo.PNG
Typical use case and scenario of an engaging user on GiGGD.

Findings

This is a short summary of our main findings. In the next chapter, these findings are translated into tangible ideas for future implementation and a first visualization.

FindingsE.PNG
Summary of our main findings.

Outlook - Future Implementation

ImplementationE.PNG
Future elements of our platform.


Outlook.PNG
First visualization of our platform.

Methods evaluation

At the end of the first part of the project, we evaluated the usefulness of the different methods. Each team member rated all methods on a 5-point scale and elaborated on their decision.
We then calculated a rounded average and summarized the mentioned aspects.
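
The averaging itself is simple; as a minimal illustration with made-up ratings (not our actual data):

    # Hypothetical team ratings (1-5) for one method, e.g. the survey
    ratings = [4, 3, 5, 4]
    print(round(sum(ratings) / len(ratings)))  # rounded average: 4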

Method evaluation.PNG
Internal evaluation of the usefulness of the methods for this particular project.

Prototype

Taking up the results from last semester, we decided to implement and test three main functions of the Giggd platform: feedback to the artist based on lighting colors, an optional chat for engaging users, and a tipping system. For the prototype we used Figma, because it offers a broad range of functionality and unlimited screens; no matter how extensive our prototype becomes, Figma can accommodate it. In the next chapters we describe the implementation of the main functions. In addition, we present a first draft of the research questions we hope to answer.


Giggd main.PNG
main screen of the prototype.

Access to the latest version: https://www.figma.com/proto/h4I6jPtcsDJc4A6xf0CJqo/GiGGD?node-id=55%3A371&scaling=min-zoom&page-id=0%3A1&starting-point-node-id=55%3A371&show-proto-sidebar=1


First research questions

Tipping System

  • Is tipping alone a feasible business model? (highly impactful but not feasible)
  • Will people appreciate there is a tipping function and use it? (impactful and somewhat feasible)

💡The question above may be done in a post-testing interview.

  • Are the amounts we offer appropriate? (really feasible)
  • If there is manual input, how many people will use manual vs clickable options?

💡The two questions above are connected and can be tested with scenarios (e.g. "imagine yourself in a virtual concert...")

  • How to attract people's attention to tipping and cause action without being too intrusive? (really feasible)
  • Is the tipping function visible to the user (i.e. find the tipping function without observed difficulty within a reasonable amount of time)? (really feasible)
  • When users decide to tip, do they find the tipping function easy-to-use (i.e. complete the task of tipping without observed hesitation and/or difficulty)?

Chat function

  • Is the chat function visible to the user?
  • Are users willing to engage in conversations taking place in a chatbox?
  • Do users like the existence of a chatbox?
  • Is the chat function easy to use?

💡 Can be tested via scenario tasks like "go talk with other concert attendee" or having a pop up notification for new chat messages

Color Feedback

  • Considering different colours may have different meanings for different people, do our associations of colour-mood resonate with users?
  • Do users understand the intention behind the colour response functionality?
  • Is the number of colour options we offer "too much" or "enough"?
  • Do people understand the purpose of the colours?
  • What kind of emotions do concert goers usually have?

💡 Ideas for testing: use photoshoots and coloured lightbulbs, with each lightbulb corresponding to one concert participant
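
To make the intended mechanism concrete, here is a minimal Python sketch of the colour feedback idea, in which each participant controls one lightbulb that shows the colour of the mood they picked. The mood names come from our prototype; the hex values, identifiers, and fallback behaviour are illustrative assumptions, not the actual implementation:

    # Illustrative colour-mood mapping; mood names are from the prototype,
    # hex values are assumptions made for this sketch.
    COLOR_MOODS = {
        "cute": "#FF9ECF",
        "sad": "#4A6FB5",
        "chilled": "#7FD8C9",
        "angsty": "#8E44AD",
        "fun": "#FFC94A",
    }

    # Hypothetical state: one lightbulb per concert participant.
    lightbulbs = {}  # participant id -> current hex colour

    def set_mood(participant_id, mood):
        # Switch this participant's lightbulb to the chosen mood's colour;
        # unknown moods fall back to a neutral white.
        lightbulbs[participant_id] = COLOR_MOODS.get(mood, "#FFFFFF")

    set_mood("participant-01", "cute")
    print(lightbulbs)  # {'participant-01': '#FF9ECF'}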

Heuristic Evaluation

In the first part of the testing phase, we took advantage of having two groups and had the prototypes undergo a heuristic evaluation by the respective other team. To support the process, we provided an evaluation package consisting of access to the prototype on Figma, an evaluation scenario, and a table of pre-selected criteria with explanations of their context.

Product Description
Name of the product: giggd
Purpose of the product / intended use: conduct and bill online concerts in a way that visitors are satisfied and artists can earn a living
Usage context: entertainment
Typical users:

  1. engaging users
  2. passive users


Evaluation Scenario
Imagine yourself in a virtual concert. You have the following tasks:

  1. You have a ticket code of "123 XYZ" and wish to enter the virtual concert.
  2. You are impressed by the performance. You know that without having to pay for a ticket, tipping is the only way you can contribute to the earnings of the performer, and therefore, you decide to give a tip by clicking one of the amounts.
  3. You wish to engage with other concert goers, and therefore, you wish to talk to them virtually and start a chat by typing "This is so good!".
  4. You feel engaged by the performance and really hope to give some feedback for the artist to see. Therefore, you find the color response buttons and try them one by one. Note that each color response corresponds to a different mood/reaction, so before you click any of them, hover your mouse over the color button and read what this color stands for in a giggd environment.


Results:

Criteria | Meaning for Giggd | Severity Index* | Comment
Usability | Can you complete your tasks without hesitation and/or difficulty? | 2 | We would expect the tip window to close after we tipped
Usability | | 1 | We would like to see the amount we already tipped somewhere
Usability | | 1 | Has the method of payment been put in? Maybe give the user the choice to confirm again
Usability | | 3 | Chat: we would like to see who wrote which message, maybe a nickname or something
Usability | | 2 | It would be good to see how many other people are watching, for example in a small window in the corner
Usability | | 2 | Being able to see what colours others have chosen, for example when hovering over the colour
Usability | | 3 | The blue square means two different things: on the starting page it means entering the concert, later on it means full screen. On the starting page it would be helpful if you could just click on the image to enter the space instead of having to click the blue square
Visibility | Do you find the functions of tipping, chatting, and color response without difficulty within a reasonable amount of time? | 1 | "Tips" can be ambiguous; maybe a dollar sign would be helpful and reduce the amount of text
User's Language | Considering different colors may have different meanings for different people, do our associations of color and mood resonate with you? Do you find the descriptions of the color responses comprehensible? | 2 | Cute and sad are fitting colours; chilled is not as intuitive. The meaning of angsty is unclear. Simple white icons over the colours could be helpful, for example a hammock for chilled, to make using the colours more intuitive. An additional category we all resonated with could be "Goosebumps"
Self-Descriptiveness | Are the functions of the color response, tipping, and chatbox clear and comprehensible? | 1 | Three dots in the tip section: our expectation is that you can choose an amount, but we are not 100% sure
Self-Descriptiveness | | 2 | Colours: how long do the colours stay? What happens when I click them? Our expectation would be that they change the artist's colour and then disappear again. Potentially clarify
Self-Descriptiveness | | 2 | Chat: have some sort of cue faded in grey in the chat box, like "start chatting to other concert goers"
Appropriateness of options | Is the number of color options appropriate? Are the tipping amount options appropriate? | 1 | Chat: to increase interactivity it could be fun to be able to react to comments in the chat through emojis
Simplicity | Is the interface simple and straightforward? | 1 | Starting page: we would expect the "next gigs" to automatically swipe through after a few seconds (maybe just not possible in the prototype?)
Simplicity | | 1 | Merch: what is meant by merch? Merch from the artists, or from giggd?

* Severity Index: (1) cosmetic (2) minor (3) major (4) disaster

Prototype Adaptations

Based on the heuristic evaluation, we identified 13 suggestions regarding the design and functionality of our prototype. In the following section we present all 13 aspects together with our considerations.
After the table, we present screenshots of our prototype.

Potential adaptation based on HE | Our considerations
The scenario should make clear that each audience member controls only one light bulb | We adapted the scenario accordingly
Change "angsty" to "goosebumps" in the color panel | We changed the emotion accordingly
Adding icons to explain the emotions | This might make things easier to understand, but we would rather test whether it is necessary for understanding; if not, we would prefer a cleaner design without icons
Adding a label (e.g. $) next to "Tips" | We adapted accordingly
Adding a viewer count | This is more of an additional function than a usability issue
Including user names in the chat | The contextual inquiry revealed that only few users really use and want communication in virtual events. We decided to slim down all forms of communication, so usernames are not needed. But we will test this again
Having a hint in the chatbox | We adapted accordingly
Entering the concert by clicking on the image vs. extending the screen | We decided to leave it as is, but to focus on the button in our testing
Being able to see what emotions other users chose | This is more of an additional function than a usability issue. Also, normally you only control one light bulb, so you always see what the others choose
Adding a function to like comments in the chat | This is more of an additional function than a usability issue
Having an automated "next gigs" carousel in the menu | An interesting suggestion, but the "next gigs" section is not our concern at the current stage
Indicating what merch is available | The reason behind this suggestion is that users cannot be sure whether they will find merch from the artist or from Giggd. Here we actually see an advantage, because users will visit to find out
Making the confirmation of the tipping clearer, without the need to click twice | We adapted accordingly

Final prototype for testing

Giggd enter.PNG
entering the concert

Giggd main.PNG
main menu

Giggd fullscreen.PNG
concert expanded view

Giggd tips.PNG
tipping the artist

Giggd chat.PNG
leaving a comment

Giggd color.PNG
using the color feedback

Access to the latest version: https://www.figma.com/proto/h4I6jPtcsDJc4A6xf0CJqo/GiGGD?node-id=55%3A371&scaling=min-zoom&page-id=0%3A1&starting-point-node-id=55%3A371&show-proto-sidebar=1

Usability Testing

To test our new prototype, we decided to take a mainly qualitative approach with the addition of one quantitative measure. Participants received a document containing a scenario that explained the context of usage and four different tasks to solve on the platform. The tasks focused on the main features we wanted to test, as well as the process of entering the platform. The whole test was conducted via the video-call platform Zoom. While participants were interacting with the system, they shared their screen so the researcher could observe the interactions. In addition, participants were instructed to think out loud, i.e. verbalize their thoughts; the researcher made sure that participants continued to do so.
After completion of all tasks, an interview was conducted. Following the interview, participants were asked to fill out the System Usability Scale (Brooke, 1986).

Testing protocol

The whole protocol was also translated into German.

Script

[ ] General introduction + Think out Loud method
[ ] Sending participant handout
[ ] Observation
[ ] Complete interview questions
[ ] Complete SUS


General Introduction + Think out loud:

Thank you for participating in this test. I will now give you a short introduction to what giggd is and what is expected of you today. You will receive a PDF handout with all the necessary information, including my introduction.

So giggd is an online concert platform that rethinks feedback to the artist. With each ticket purchase, you secure a lightbulb placed next to the artist. The lightbulb reacts to your remote commands and speaks to the artist on your behalf in a simple but impactful way.

You will be presented with a prototype of giggd with limited functionality. We ask you to step into the shoes of a concert attendee and go through the following scenarios while thinking out loud. That means verbalizing your thoughts, as much and in as much detail as possible. If needed, I will remind you to think out loud.


Remind participants to think out loud:

What’s going through your mind right now?


If participants ask questions:

We want to understand your genuine experience, so I want to limit my influence on you as much as possible. Unless you are absolutely stuck, I will therefore refrain from answering your questions. Thank you!

Participant handout

Giggd is an online concert platform that rethinks feedback to the artist. With each ticket purchase, you secure a lightbulb placed next to the artist. The lightbulb reacts to your remote commands and speaks to the artist on your behalf in a simple but impactful way.

You will be presented with a prototype of giggd with limited functionality. We ask you to step into the shoes of a concert attendee and go through the following scenarios while thinking out loud. If needed, I will remind you to think out loud.


(Thinking Out Loud = To verbalize one's thoughts, especially when trying to produce a solution or conclusion about something)


Tasks

You have previously purchased a virtual concert ticket. Your ticket code is "123 xyz". Your first task is to enter the concert.

You are impressed by the performance. You know that without having to pay for a ticket, tipping is the only way you can contribute to the earnings of the performer, and therefore, you decide to give a tip.

Next, you wish to engage with other concert attendees. Therefore, you will find the chat box and send “This is so good!” to the chat. Please stick to the exact spelling.

Then, you will give some feedback for the artist to see. Therefore, you find the color response buttons and try them one by one. Each color response corresponds to a different mood/reaction, so before you click any of them, try hovering your mouse over the color button and read what this color stands for in a giggd environment. Please note that, in reality, each audience member controls one light bulb; for testing purposes only, you will find that all lightbulbs react to your commands.


Link to the prototype:

https://www.figma.com/proto/h4I6jPtcsDJc4A6xf0CJqo/GiGGD?node-id=55%3A371&scaling=min-zoom&page-id=0%3A1&starting-point-node-id=55%3A371&show-proto-sidebar=1

Once you are on the prototype you can adjust the zoom by pressing “z”


Observation protocol

Task name | Observations | Notes
Log in | selection: task failed / difficulties / success |
Enter concert | selection: task failed / difficulties / success |
Tipping | selection: task failed / difficulties / success |
Participate in chatbox | selection: task failed / difficulties / success |
Give feedback with light bulb | selection: task failed / difficulties / success |


Interview

Task name | Questions | Notes
Log in / Enter concert | Did entering the concert work as expected? |
Enter concert | How did you understand the functionality of the “maximize/expand/fullscreen” icon? |
Tipping | Do you feel the tipping amounts offered are appropriate? |
Tipping | What would you expect to happen if you press other amounts? |
Tipping | Did the confirmation of the tipping work as expected? |
Tipping | Was the tipping function easy to use? |
Participate in chatbox | Did you like the anonymity of the chatbox or would you rather see the avatar of each user? |
Participate in chatbox | Was the chat function easy to use? |
Give feedback with light bulb | Do you understand the intention behind the colour response functionality? |
Give feedback with light bulb | Do our associations of colour and mood resonate with you? |
Give feedback with light bulb | What kind of emotions do you typically have when you go to concerts? |
General | What would you change in this online concert platform? / What would you wish for in an online concert platform? |


System Usability Scale

Questions | Rating (1 = strongly disagree - 5 = strongly agree)
I think that I would like to use this system frequently.
I found the system unnecessarily complex.
I thought the system was easy to use.
I think that I would need the support of a technical person to be able to use this system.
I found the various functions in this system were well integrated.
I thought there was too much inconsistency in this system.
I would imagine that most people would learn to use this system very quickly.
I found the system very cumbersome/clumsy to use.
I felt very confident using the system.
I needed to learn a lot of things before I could get going with this system.


Examples

Observation.PNG
filled out observation protocol


Interview.PNG
filled out interview protocol

Analysis

quantitative data

The data from the System Usability Scale was analyzed according to the instructions provided by Adobe:
https://xd.adobe.com/ideas/process/user-testing/sus-system-usability-scale-ux/


Based on this calculation you get a score between 0 and 100, which can be compared to the following benchmark.
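
As a sanity check, the scoring can be reproduced in a few lines. The following is a minimal sketch (not the analysis script we actually used), assuming the ten ratings are coded 1-5 and listed in the standard SUS item order:

    def sus_score(ratings):
        # ratings: ten answers (1-5) in standard SUS item order.
        # Odd-numbered items are positively worded and contribute (rating - 1);
        # even-numbered items are negatively worded and contribute (5 - rating).
        # The summed contributions (0-40) are scaled by 2.5 to a 0-100 score.
        assert len(ratings) == 10
        total = sum(
            (r - 1) if i % 2 == 0 else (5 - r)  # 0-based i: even i = odd-numbered item
            for i, r in enumerate(ratings)
        )
        return total * 2.5

    # Participant 01 from the results table below yields 87.5:
    print(sus_score([4, 2, 4, 1, 5, 2, 5, 1, 4, 1]))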


SUS.PNG
SUS benchmark

qualitative data

The qualitative data was analyzed by using a common table with the following format:

Research question | 01 | 02 | 03 | 04 | 05 | 06 | 07 | 08 | Summary
01 | | | | | | | | |
02 | | | | | | | | |
03 | | | | | | | | |
04 | | | | | | | | |

All research questions were listed in a table, together with all participants. We started analyzing the first research question by going through the available data sources (interview and observation) and entering all relevant information in the cells for the respective participants. This process was repeated for every research question. After completing the data entry, we again started at the first research question and worked our way through: for each research question, we read out loud the information in the table and summarized the key insights. When in doubt, the topic was discussed until consensus was reached.
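
For illustration only (our actual table lived in a shared document), the structure of this analysis can be sketched as a nested mapping from research question to participant to notes; all entries below are hypothetical:

    from collections import defaultdict

    # research question -> participant id -> relevant notes taken from the
    # interview and observation data (the cells of the analysis table)
    table = defaultdict(dict)

    q = "Is the tipping function visible to the user?"
    table[q]["01"] = "found the tip button immediately"         # hypothetical
    table[q]["02"] = "scanned the menu briefly, then found it"  # hypothetical

    def summarize(question):
        # Read out all cell entries for one question, the way we read them
        # aloud before agreeing on a key insight.
        cells = table[question]
        return "; ".join(f"P{p}: {note}" for p, note in sorted(cells.items()))

    print(summarize(q))  # P01: found the tip button immediately; P02: ...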

Results

quantitative data

Item | 01 | 02 | 03 | 04 | 05 | 06 | 07 | 08
I think that I would like to use this system frequently. | 4 | 3 | 4 | 4 | 5 | 3 | 2 | 3
I found the system unnecessarily complex. | 2 | 1 | 1 | 1 | 2 | 1 | 1 | 1
I thought the system was easy to use. | 4 | 5 | 5 | 5 | 5 | 5 | 5 | 4
I think that I would need the support of a technical person to be able to use this system. | 1 | 1 | 1 | 1 | 2 | 1 | 1 | 1
I found the various functions in this system were well integrated. | 5 | 4 | 5 | 4 | 5 | 4 | 5 | 4
I thought there was too much inconsistency in this system. | 2 | 2 | 2 | 2 | 1 | 1 | 1 | 1
I would imagine that most people would learn to use this system very quickly. | 5 | 5 | 5 | 5 | 5 | 4 | 5 | 4
I found the system very cumbersome/clumsy to use. | 1 | 1 | 2 | 2 | 1 | 2 | 1 | 2
I felt very confident using the system. | 4 | 5 | 5 | 5 | 5 | 3 | 5 | 4
I needed to learn a lot of things before I could get going with this system. | 1 | 1 | 1 | 1 | 1 | 2 | 1 | 1
SUS Score | 87.5 | 90 | 92.5 | 90 | 95 | 80 | 92.5 | 82.5

* 1 = strongly disagree - 5 = strongly agree

The final SUS scores range from 80 to 95, with an average of 89, indicating good to excellent usability across all participants. These results confirm the finding from the heuristic evaluation that there are no drastic usability issues. However, the qualitative data still shows some room for improvement.

qualitative data

The following table shows the results of our insight-making process for all research questions. Some aspects were investigated additionally and were not part of the initial research questions; in our working documents these questions are written in italics.

Category | Research question | Insights
Log in | Did entering the concert work as expected? | When logging in, some users want to use their mouse and have to figure out that pressing enter is required
Enter the concert | How did you understand the functionality of the “maximize/expand/fullscreen” icon? | Polarized feedback: participants either found the icon for entering the concert easily or they couldn't find it. For future development it should be clearer whether the concert has already started
Tipping | Is tipping alone a feasible business model? | no data
Tipping | Will people appreciate there is a tipping function and use it? | Users appreciate the tipping function
Tipping | Are the amounts we offer appropriate? | Overall the amounts seem to be appropriate
Tipping | If there is manual input, how many people will use manual vs clickable options? | Users usually stick to the offered options
Tipping | What would you expect to happen if you press other amounts? | Participants all agree on what to expect
Tipping | How to attract people's attention to tipping and cause action without being too intrusive? | no data
Tipping | Is the tipping function visible to the user? | No problems with the visibility of the tipping function
Tipping | When users decide to tip, do they find the tipping function easy to use? | Overall it was easy to use; participants had a few suggestions: (1) confirm button; (2) currency options; (3) reset selection after donation
Tipping | Did the confirmation of the tipping work as expected? | Overall it worked as expected; participants had a few suggestions: (1) some users expect the artist to get a notification; (2) visibility can be an issue on some colors; (3) information about previous tips; (4) additional confirmation
Chat | Is the chat function visible to the user? | Participants agree the function is visible
Chat | Are users willing to engage in conversations taking place in a chatbox? | no data
Chat | Do users like the existence of a chatbox? | no data
Chat | Did you like the anonymity of the chatbox or would you rather see the avatar of each user? | There was a wide range of reactions and suggestions; further investigation needed
Chat | Was the chat function easy to use? | Overall participants find it easy to use. In rare cases people might miss a send button or an indicator to press enter
Color Feedback | Considering different colours may have different meanings for different people, do our associations of colour and mood resonate with users? | "Goosebumps", "cute", "sad", and "fun" are all very relevant; "more" is confusing
Color Feedback | Do users understand the intention behind the colour response functionality? | A few participants were confused by its function; users would need more instructions and more hands-on explanations
Color Feedback | Is the number of colour options we offer "too much" or "enough"? | no data
Color Feedback | What kind of emotions do concert goers usually have? | In addition to our moods: dancing (3), joy/happy (3), hyped/energized/excited (6), blissful, wonder/impressed (2), love

Method evaluation

quantitative method
SUS scores indicate good to excellent usability across all participants. This also holds for participants who experienced difficulties during the interaction. Across all 8 participants there were two cases of task failure: participant 05 needed help with the login, and participant 08 needed help to expand the concert view to fullscreen. These task failures are only partly reflected in the SUS scores. While participant 08 scored 82.5, the second-lowest score (mean = 89), participant 05 actually has the highest score in our sample (95). This can be interpreted in two ways: when filling out the questionnaire, participant 05 may not have considered that she needed help to complete the tasks and therefore rated the system based on her newly updated knowledge of it, whereas participant 08 did consider the fact that she needed help and adjusted her rating accordingly. Another interpretation could be that the SUS is too general to capture all nuances.

qualitative method
While the SUS was good for confirming that the system has no drastic usability flaws, it fails to point us to the specific aspects that can still be improved. In that respect, the qualitative part of our research was definitely the richer data source. The combination of observation and interview revealed more detailed findings, specific to certain aspects, and was the main source for answering our research questions. Some participants expressed minor concerns that thinking out loud made them feel awkward; in that light, choosing this method should be justified. In our case, thinking out loud proved to be a very powerful tool for further facilitating our understanding of how participants interact with the system.

Future Development

The future development roadmap is organized along the steps of the giggd user journey. For each journey step we list the action items, the needs (+) and pains (-) identified in testing, the opportunities, and the remaining questions.

Booking a concert ticket
Action items: 1. go to the main website and book a ticket; 2. book a ticket inside a concert
Needs/Pains: future investigation
Opportunities: 1. introduce a ticket booking flow; 2. investigate a subscription model
Remaining questions: What if users get free tickets, are not committed, and do not show up?

Log in
Action items: 1. receive an e-mail with link and login credentials; 2. enter the ticket code
Needs/Pains: (+) entering a concert by clicking with the mouse; (+) notification about concert status (started/not started)
Opportunities: 1. allow both mouse and keyboard interaction; 2. increase visibility of the concert status by changing fonts and removing the loading circle; 3. create an animation showing that the concert is starting, with a hand turning on an amplifier

Payment method
Action items: 1. connect the desired payment method
Needs/Pains: (-) security and data storage
Opportunities: 1. investigate the possibility of a giggd cryptocurrency; 2. investigate the possibility of giggd tokens; 3. in the case of tokens, users need to purchase tokens beforehand
Remaining questions: 1. How to develop a cryptocurrency? 2. How will we be perceived if users see us associated with our own currency? 3. Will people tip more if the tip is already included in a subscription bundle in the form of a token?

Backstage
Action items: 1. expanding the concert view
Needs/Pains: (+) seeing the soundcheck; (-) fullscreen was not intuitive
Opportunities: 1. allow opening the concert both with the expand button and by clicking anywhere on the screen; 2. investigate different locations for the fullscreen button
Remaining questions: Will artists be open to making their soundcheck public?

Entrance Hall
Action items: 1. expanding the concert view
Needs/Pains: (-) fullscreen was not intuitive
Opportunities: 1. allow opening the concert both with the expand button and by clicking anywhere on the screen; 2. investigate different locations for the fullscreen button

Tipping
Action items: 1. find the tipping function; 2. choose an amount; 3. confirm the donation
Needs/Pains: (+) easy to find; (+) appropriate amounts; (+) "other amounts" is self-descriptive; (+) the function has a good location; (-) the confirmation needs clarification; (-) amounts stay selected after tipping
Opportunities: 1. introduce a collective total amount "tipped" by the community; 2. introduce an individual total amount "tipped" by oneself; 3. introduce a visual representation as a tipping notification visible to everyone (users, others, artist), e.g. an LED light next to the artist; 4. introduce different currency options so users can pick which currency (€, $, etc.) they'd like to tip in; 5. introduce color coordination/alignment between the amount chosen and the confirmation message
Remaining questions: 1. Does the visibility of the total amount tipped give people more incentive to tip? 2. How to attract people's attention to tipping and cause action without being too intrusive?

Chat
Action items: 1. find the chat function; 2. type a message; 3. send the message; 4. react to others
Needs/Pains: (+) easy to find; (-) missing send button
Opportunities: 1. investigate the possibility of paid AI-based concert summaries for the artist; 2. introduce a new function for reacting or replying to a specific message; 3. introduce the possibility of private chat in the same chat box, signified by a different color; 4. introduce a send button or an indicator to press enter
Remaining questions: 1. Do people really like the anonymity of the chat box, and does it lead to more engagement? 2. Do people enjoy reacting to messages?

Color Feedback
Action items: 1. find the color panel; 2. choose a mood
Needs/Pains: (+) "goosebumps", "sad", "fun", and "cute" are relevant; (-) "more" is confusing; (-) the function can be misunderstood
Opportunities: 1. introduce additional moods: dancing (3), joy/happy (3), hyped/energized/excited (6), blissful, wonder/impressed (2), love; 2. allow artists to participate in deciding which colours/reactions to include; 3. investigate which further instructions could facilitate understanding of the function; 4. investigate negative feedback

Community events
Action items: many different
Needs/Pains: future investigation
Opportunities: introduce events that are unique (or easier to perform) online, e.g. meet and greets, challenges