Remote Concerts

Project Description

This project explores how online concerts can be conducted and billed so that visitors are satisfied and artists can earn a living. The main objective is to overcome the on-site nature of concerts while preserving the salient aspects of the real-life experience for both artists and concert attendees.

Group Collaboration

At the beginning of the project we discussed our strengths and weaknesses and how we wanted to collaborate during the project. Belbin's team role model serves as the basic concept for how we define our roles.

BelbinTeamRoles.PNG

Background knowledge on the music industry

We built a shared basic understanding of the music industry to put the lived experiences of artists into a wider context. In doing so, we came to understand some of the complexities involved in operating a concert platform within that industry.

MusicIndustry.PNG

Contextual Inquiry

Contextual inquiry can be seen as a combined use of different methods with the goal of disclosing work structures, exploring the usage and environment of existing technology, and gathering ideas for the future development of a product or system.
It is usually carried out in four steps:

  1. narrowing down the question or purpose
  2. choosing data collection method
  3. investigating in the context
  4. analysing and interpreting the findings

Since music events depend on both visitors and artists, our contextual inquiry is divided into these two subgroups.

We define the users' perspective as the user experience of attending:

  • a virtual music event
  • a virtual non-music event
  • a live music event

We define the artists' perspective as the user experience of performing within the context of:

  • a virtual music event
  • a virtual non-music event
  • a live music event

Objectives

In general, we are interested in learning about the user experience of artists and visitors at virtual music events.
More specifically, we want to understand:

Users' perspective:
  • motives behind attending and not attending a music event (virtually or live)
  • engagement and interaction with other people, including the artist, at that event
  • technological tools behind virtual music events (mostly during the event)
  • experience and improvements to virtual music events

Artists' perspective:
  • motives behind taking or not taking a gig (virtually or live)
  • impact of audience feedback in virtual and live settings
  • technological tools behind virtual music events (preparation and the actual event)
  • experience and improvements to virtual music events

Data Collection

To address these objectives, we use interviews, surveys, and observation.

method: survey

Users' perspective - experienced (a) and not experienced (b) in virtual music events

1. What are the factors that make live music events enjoyable? What motivates you to attend a live music event?

2. Have you participated in virtual music events?

In case the users do not have experience with virtual music events, they will be asked about any other (non-music) virtual event.
In addition, they will be asked the following set of questions (3.b) to understand their motives behind not attending or potentially attending virtual music events.

3.b In the past 18 months, were you offered an opportunity to attend a music event virtually that you would have normally attended in real life?
   if yes, what are your reasons for not attending the virtual event?
   if no, imagine your favorite artist would give a virtual concert. Would you attend?
      if no, what are your reasons for not attending the virtual event?

Users experienced in virtual music events (a) continue with the following questions:

3. Please indicate your level of agreement with the following statements (5-step Likert scale): The virtual music event I attended…
   was comparable to live music events.
   brought a new experience.
4. Would you please name a few of the virtual events you participated in?
5. What platform was the virtual event hosted on?
6. What did you enjoy about them?
7. What did you not enjoy about them?
8. What can be improved about them, in your opinion?
9. How much did you engage with other participants in the virtual music events, in comparison to the live settings?
10. How was the quality of your engagement with other participants in the virtual music events, in comparison to that in the live settings?
11. Please describe a typical interaction you had with other participants of the same virtual music event.
12. How would you describe the main difference between a virtual music event and a live one?

Artists' perspective - no survey
method: interview

Users' perspective

1. What usually motivates you to attend a live music event?
2. Have you participated in virtual music events?

In case the users do not have experience with virtual music events, they will be asked about any other (non-music) virtual event.
In addition, they will be asked the following set of questions (3.b) to understand their motives behind not attending or potentially attending virtual music events.

3.b In the past 18 months, were you offered an opportunity to attend a music event virtually that you would have normally attended in real life?
   if yes, what are your reasons for not attending the virtual event?
   if no, imagine your favorite artist would give a virtual concert. Would you attend?
      if no, what are your reasons for not attending the virtual event?

3. Would you please name a few of the virtual events you participated in?
Let's talk about your experience with virtual music events...
4. Was it comparable to live music events?
5. Did it bring new experiences?
6. What did you enjoy about those virtual events?
7. What did you not enjoy about them?
8. What can be improved?
9. Compared to the live settings, did you...
   engage more or less with other participants of the same event?
   have a better or worse quality of engagement with other participants?
10. Please describe a typical interaction you had with other participants of the virtual music event.
11. What makes it (not) enjoyable?
12. To summarize, what do you see as the difference between a virtual music event and a live music event?

Artists' perspective

1. As an artist, what does it mean for you to perform a gig?
2. I’d like to learn about your experience with concerts. Could you please share with me which aspects of live performance make them enjoyable for you?
3. Which criteria do you consider when deciding whether or not to take a gig? What factors play a role in this decision making process?
4. What does having audience feedback in live performance mean to you? How important is the audience feedback for you? (optional maybe: Could you please share a remarkable experience you had?)
5. Can you tell me about your experience with online performance?
5a) How many online gigs have you had?
5b) What does it mean for you to perform an online gig and how is it comparable to performing a live gig?
5c) Which tools do you need to perform a virtual gig?
5d) What does an enjoyable virtual gig mean to you?
6. What does having audience feedback in virtual performance mean to you? How important is the audience feedback for you?
6a) What does it mean for you to not have audience feedback?
7. What do you think needs to be improved for a better gig experience?

method: observation

Observation domains: 1. Audience Interaction   2. Audience Feedback   3. Technology


observation questions:
audience interaction - How can people interact?
audience interaction - How often do people observably interact?
audience feedback - How can the reaction of the audience be fed back to the artist?
technology - How does the technology support the audience experience?
technology - What does the platform allow?
technology - How often do people use the functions?
technology - What are the disadvantages?

questions to the audience (if possible):
motivation - What brought them here?
experience - What are they (not) enjoying?
forward-thinking - What can be improved?

The observation protocol can be expanded based on learnings from the first few observations.

Results

In the following chapters, participation rates and representative examples of results are presented for each of the data collection methods used. The interpretation section then describes our insight-making process and its final results.

Surveys

Users' Perspective
Survey period: 27th May 2021 - 2nd June 2021
Platform: Google Forms
Distribution: collaboration with Rave the Planet, a company that organizes festivals
Participants: 60

implementation of the user survey on Google Forms

Interview
Users' perspective
Interview period: 5th May 2021 - 14th May 2021
Participants: 4
Sample description: experienced in virtual music events (4)

Artists' perspective
Interview period: 8th May 2021 - 25th May 2021
Participants: 4
Sample description: queer performers (3); DJ (1)

implementation of the interview protocols in Google Documents

Observation

Overall, five virtual music events were visited and observed:
8th May 2021 - Shanghai Community Radio (SHCR)
10th May 2021 - xJazz festival
11th May 2021 - xJazz festival
14th May 2021 - xJazz festival
15th May 2021 - xJazz festival


SHCR.png XJazz 02.png
screenshots of the observed platforms



Link to the observation protocol google sheet: https://docs.google.com/spreadsheets/d/1OPbP6Pw3i8EA2uU3igCzpzu3MsA5_y9Cde5_2QXqS_4/edit?usp=sharing
ObservationProtocol.PNG
implementation of the observation protocol in Google Sheets

Insight-making

Our insight-making process for the interviews was as follows: each of us read through all the interviews and recorded the most important bits of information on a virtual notepad.
Subsequently, we grouped all the information in a table into different categories. This table served as the basis for further work such as personas, scenarios, and the focus group.
A similar insight-making approach was used for the observations: the protocol sheet was read and condensed into cohesive insights, which were then grouped into the three observation domains.

InsightMaking.PNG
implementation of the insight making process



final interpretation of the results

ObservationInsights2.PNG
observation insights


Link to the Miro-table.pdf: https://drive.google.com/drive/u/0/folders/1awNgOwgO7652lOXdgrelSLz9XZ6mpv0W

JTpdf.PNG
interview insights

Focus Group

General information

date: 14th June 2021
participants: Users (2) and artists (3)
observer: Prof. Kolrep


Material

  1. script and agenda
  2. Zoom room
  3. Nuudel time slot poll
  4. miro board


Procedures

Introduction (2 min): background of the student project; camera and muting rules; confidentiality regarding the recording.

Information (8 min): presentation of the most important findings so far.

Ice Breaker (7 min): participants are asked to briefly introduce themselves, tell what made them happy last week, and then choose the next person to continue.

Round Robin (25 min): the focus group is presented with a question. After 1 minute of reflection time, one participant (No. 1) is asked to share their thoughts out loud, then pass it on to the next person (No. 2). This person has 30 seconds of thinking time before contributing an additional point, idea, or thought, building on the thought of the person before.

question 1: What makes people want to attend a remote event as an audience?
question 2: What makes artists want to perform a virtual gig?

Brainstorming (10 min): participants are asked to write their thoughts about what their ideal remote concert platform would look like on sticky notes on a Miro board. There is no restriction on which aspects can be included; however, we give the following hints: features, visuals, functions, and service.

Wrap Up (3 min): expressing our thanks to the participants.

Results

Everything the participants mentioned is potentially valuable to us. Therefore, two of us focused on taking notes. In addition, we have a recording of the whole session to re-watch, as well as the Miro board with all the ideas the participants provided during the brainstorming activity.

Fg brainstorm.PNG
results of the brainstorming phase

Modeling

User Stories

Based on our data from the contextual inquiry and the focus group, we were able to form user stories. After the initial collection was done, we marked user stories that addressed the same aspect as another one in grey, so we could focus on the unique aspects.

Userstories.PNG
user stories

Requirements

Through the data collected via interviews, observations, the survey, and the focus group, we were able to identify requirements for our future platform. For some of them we have already noted ideas for future implementation in the form of product qualities.

ArtistR.PNG
Identified requirements of artists.


UserR.PNG
Identified requirements of users.

Personas

We tried to capture and represent the participants from the different methods in these personas.

PersonasE.PNG
Translation of our findings into three personas: one artist and two users.

Use Cases and Scenarios

In both the use cases and the scenarios, certain keywords are marked. This indicates that the respective aspect relates to an identified requirement.


Pino.PNG
Typical use case and scenario of an artist on GiGGD.



Astrid.PNG
Typical use case and scenario of a simple user on GiGGD.



Pablo.PNG
Typical use case and scenario of an engaging user on GiGGD.

Findings

This is a short summary of our main findings. In the next chapter, these findings are translated into tangible ideas for future implementation and a first visualization.

FindingsE.PNG
Summary of our main findings.

Outlook - Future Implementation

ImplementationE.PNG
Future elements of our platform.


Outlook.PNG
First visualization of our platform.

Methods evaluation

At the end of the first part of the project we evaluated the usefulness of the different methods. Each team member rated all methods on a 5-point scale and elaborated on their rating.
We then calculated a rounded average per method and summarized the mentioned aspects.
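
As an illustration of this calculation, the following Python sketch computes a rounded average per method. The ratings shown are placeholder values for illustration only, not our actual scores:

  # Minimal sketch of the rounded-average calculation per method.
  # The ratings below are placeholder values, not the team's actual scores.
  ratings = {
      "survey":      [3, 4, 2, 3, 4],   # one 1-5 rating per team member
      "interview":   [5, 4, 5, 4, 5],
      "observation": [3, 3, 4, 2, 3],
      "focus group": [4, 5, 4, 4, 5],
  }

  for method, scores in ratings.items():
      average = sum(scores) / len(scores)
      # Note: Python's round() rounds .5 values to the nearest even integer.
      print(f"{method}: rounded average {round(average)} (raw {average:.2f})")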

Method evaluation.PNG
Internal evaluation of the usefulness of the methods for this particular project.

Prototype

Taking up the results from last semester, we decided to implement and test three main functions of the Giggd platform: feedback to the artist based on lighting colours, an optional chat for engaging users, and a tipping system. For the prototype we used Figma, because it offers a broad range of functionality and unlimited screens; no matter how extensive our prototype becomes, Figma can accommodate it. In the next chapters we describe the implementation of the main functions. In addition, we present a first draft of the research questions we hope to answer.


screenshot of the main screen


Tipping System

  • Is tipping alone a feasible business model? (highly impactful but not feasible)
  • Will people appreciate that there is a tipping function and use it? (impactful and somewhat feasible)

💡The question above may be addressed in a post-testing interview.

  • Are the amounts we offer appropriate? (really feasible)
  • If there is manual input, how many people will use manual vs. clickable options? (see the sketch after this list)

💡The two questions above are connected and can be tested with scenarios (e.g. "imagine yourself in a virtual concert...")

  • How can we attract people's attention to tipping and prompt action without being too intrusive? (really feasible)
  • Is the tipping function visible to the user (i.e. find the tipping function without observed difficulty within a reasonable amount of time)? (really feasible)
  • When users decide to tip, do they find the tipping function easy-to-use (i.e. complete the task of tipping without observed hesitation and/or difficulty)?
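
The preset-versus-manual question could be explored with a simple input model like the Python sketch below. The preset amounts, currency, and limits are assumptions made for illustration and are not taken from the Figma prototype:

  # Rough sketch of a tipping input model with preset (clickable) amounts
  # and an optional manual amount. All concrete values are illustrative assumptions.
  PRESET_AMOUNTS_EUR = [2, 5, 10]   # hypothetical clickable options
  MIN_TIP_EUR = 1
  MAX_TIP_EUR = 500                 # arbitrary upper bound for the sketch

  def choose_tip(preset_index=None, manual_amount=None):
      """Return the chosen tip amount, or raise ValueError for invalid input."""
      if preset_index is not None:
          return PRESET_AMOUNTS_EUR[preset_index]
      if manual_amount is None:
          raise ValueError("Either a preset option or a manual amount is required.")
      if not (MIN_TIP_EUR <= manual_amount <= MAX_TIP_EUR):
          raise ValueError(f"Manual tips must be between {MIN_TIP_EUR} and {MAX_TIP_EUR} EUR.")
      return round(manual_amount, 2)

  print(choose_tip(preset_index=1))      # clickable option -> 5
  print(choose_tip(manual_amount=7.5))   # manual input -> 7.5

Logging which of the two input paths testers take would directly answer the manual-versus-clickable question.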

Chat function

  • Is the chat function visible to the user?
  • Are users willing to engage in conversations taking place in a chatbox?
  • Do users like the existence of a chatbox?

💡 Can be tested via scenario tasks like "go talk with another concert attendee" or via a pop-up notification for new chat messages

Feedback

  • Considering that different colours may have different meanings for different people, do our colour-mood associations resonate with users? (a minimal mapping sketch follows at the end of this section)
  • If we train the users with our colour-mood associations, will users accept them?
  • Do users understand the intention behind the colour response functionality?
  • Is the number of colour options we offer "too much" or "enough"?
  • How do users understand the feedback?
  • Do people understand the purpose of the colours?
  • What kind of emotions do concert goers usually have?

💡 Ideas for testing: use photoshoots and coloured lightbulbs, with each lightbulb corresponding to one concert participant
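
To make the colour-response idea more concrete, here is a minimal Python sketch of how audience colour choices could be mapped to moods and aggregated into a single light signal for the artist. The colour-mood pairs and the majority-vote aggregation are assumptions for illustration, not the prototype's actual associations:

  from collections import Counter

  # Hypothetical colour-mood associations; the real palette and meanings
  # would have to be validated with users first.
  COLOUR_MOODS = {
      "yellow": "joyful",
      "blue":   "calm",
      "red":    "energetic",
      "purple": "melancholic",
  }

  def aggregate_feedback(selections):
      """Return the most frequently chosen colour and its assumed mood.

      selections: one colour name per audience member, e.g. as if each
      participant controlled one coloured lightbulb.
      """
      valid = [colour for colour in selections if colour in COLOUR_MOODS]
      if not valid:
          return None
      colour, _count = Counter(valid).most_common(1)[0]
      return colour, COLOUR_MOODS[colour]

  # Example: three participants pick colours during a song
  print(aggregate_feedback(["yellow", "red", "yellow"]))  # -> ('yellow', 'joyful')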