
Measuring Engagement of Spectators of Social Digital Games

Published: July 03, 2021
doi: 10.3791/61596

Summary

We propose a methodology for measuring the engagement of spectators of a social digital game by combining physiological and self-reported data. Because this digital game involves a group of freely moving people, the experience is filmed and a synchronization technique links the physiological data to events in the game.

Abstract

The goal of this methodology is to assess explicit and implicit measures of engagement of spectators during social digital games that use motion tracking systems and are played by a group of participants. In the context of games that are not confined to a screen, measuring the different dimensions of engagement, such as physiological arousal, can be challenging. The study focuses on the spectators of the game and on how their engagement differs according to interactivity. Engagement is measured with physiological and self-reported arousal, as well as an engagement questionnaire at the end of the experiment. Physiological arousal is measured with electrodermal activity (EDA) sensors that record the data on a portable device (EDA box). Portability was essential because of the nature of the game, which is akin to a life-size Pong and includes many participants who move. To obtain an overview of the events of the game, three cameras film three angles of the playing field. To synchronize the EDA data with events happening in the game, boxes displaying digital numbers are placed within the camera frames. Signals are sent from a sync box simultaneously to the EDA boxes and to the light boxes. The light boxes show the synchronization numbers to the cameras, and the same numbers are logged in the EDA data file. In this way, it is possible to record the EDA of many people moving freely in a large space and to synchronize these data with events in the game. In our particular study, we were able to assess differences in arousal across the conditions of interactivity. One limitation of this method is that the signals cannot be sent farther than 20 meters. The method is therefore appropriate for recording physiological data in games with an unlimited number of players but is restricted to a limited space.

Introduction

Studying the experience of game spectators helps to better understand the positive and negative aspects of the game and, in turn, can help improve its design1. Recent innovations in the gaming industry have enabled new types of experiences that move beyond traditional console-based gaming2. With digital games that use motion tracking systems and are not confined to a screen, audiences no longer have to remain in a fixed spot. This new reality creates challenges for assessing spectators' experience. The experiment was performed in the studio of the game's creators but could be replicated in a laboratory setting or any other environment with enough space to fit the game.

The purpose of this methodology is to measure spectator engagement during a social digital game. More precisely, arousal, which contributes to engagement, is measured when the spectator has access to a web application that influences the gameplay. This method combines physiological and self-reported data. As this game is social and involves a group of people who move, the experiment is filmed. With the use of cameras and portable physiological devices, we were able to synchronize physiological data with events in the game. The portable devices (EDA boxes) are 3D-printed boxes connected to electrodes that record physiological activity. The boxes have an ON/OFF switch, visual indicators, a microSD card slot, and charging slots. The visual indicators help with troubleshooting: for example, they indicate whether the microSD card is functional, show the state of the Bluetooth and Wi-Fi connections, and signal whether physiological data are being recorded.

The use of physiological measures is a common and validated approach for measuring game engagement3. Physiological valence has been measured in the context of video games4. It has also been used in other research domains such as education5. Because emotional engagement is not observable and self-report can be biased, Charland et al. used physiological arousal to assess emotional engagement in learners who were solving problems5. They used electrodermal activity (EDA) to measure physiological arousal, which is a widely used method6. EDA is the measurement of skin conductance, which varies according to differences in sweat gland activity3. This measurement is an important correlate of real-time emotional variations. EDA is associated with many constructs, such as stress, excitement, frustration, and engagement7. Complementing EDA data with self-report responses is therefore recommended to associate the data with the right construct3. The Self-Assessment Manikin (SAM) is a self-reported pictographic scale that assesses three dimensions of emotion: valence, arousal, and dominance8. The current work used the arousal dimension, assessed with a visual 9-point Likert scale ranging from calm to excited. Perceived arousal has been used in combination with physiological arousal7.

In traditional video game contexts, spectators are seated in a chair and stay more or less in the same position for the duration of the experiment. They are expected to look at a screen where the action takes place. This setting has been used in previous game studies using physiological data9. In this case, it is simple to start the recording of the game at the same time as the recording of the physiological data10.

In the context of new digital games that are played outside of a screen, and in which participants stand and are free to move, traditional EDA recording might not be appropriate. The game used in this study is akin to a life-size Pong11. The game is composed of a ball and two paddles, one at each end of the playing field. Players move their paddle to push the ball from one end of the field to the other. In the version used for this research, the game is projected on the ground and players use their bodies as controllers for the paddles. Movement detection technology allows each paddle to follow its player, the two players being situated at opposite sides of the playing field. An example of how the players prevent the ball from hitting the virtual wall behind them is presented in Figure 1. The game also involves spectators standing on the sides of the playing field, who can use their smartphones to influence the gameplay. Using a mobile web application, spectators can vote for power-ups or obstacles that either help or harm the players (e.g., fewer walls versus more balls, or modulating the speed of the ball). The option with the most votes wins.

In this study, we investigate the influence of interactivity on spectators. The conditions of interactivity are with or without a smartphone. We compared the engagement of the spectators in these two conditions. A within-subject design was used for the interactivity condition in order to assess the difference in arousal, and therefore in engagement. In the current study, groups of 12 people were ideal to promote the ecological validity of the game12: two people as players and 10 as spectators. Only two EDA boxes were available for our study, so the eight groups yielded a total of 16 EDA data sets (two participants with EDA recording per group of 12). Each member of the public was randomly assigned to two games with access to their smartphone to influence the gameplay and one game without access to their smartphone. The game engagement literature suggests that giving many interactive options can lead to higher engagement13. Research in education has found that physiological arousal is a correlate of emotional engagement5. Building on these two bodies of work, we hypothesized that giving the spectators access to interactivity would increase arousal, which would in turn increase their engagement.

Contrary to studies of the player experience, studies of spectators of a digital game rarely use psychophysiological measures. They are mostly done with questionnaires14, observation15, and interviews16. One difficulty of using psychophysiological measures with spectators is that they are often a group and their movements are less predictable than those of the players. This methodology uses multiple cameras to capture the participants, along with light boxes that enable the participants' video and physiological data to be linked.

As we used a within-subject design for the smartphone condition, each subject participated in two games in the interactivity condition, using their smartphone, and one game in the control condition, without their smartphone. Synchronizing the EDA data with the start and end of each game was therefore crucial for assessing the differences between the conditions of interactivity. Because of the dimensions of the room, it would be impossible to start the recording of all three cameras at the same time as the EDA recording of the spectators. To overcome that issue, we used a new synchronization technique, a wireless synchronization protocol for the acquisition of multimodal user data17. Bluetooth Low Energy (BLE) signals are sent from a sync box simultaneously to the EDA boxes and to light boxes (see Figure 2). The sync box is a 3D-printed box with ON/OFF and auto/manual switches and a button; the manual function is used for testing the signals with the button. The signals are incrementing numbers that start at one and are displayed on the 3D-printed light boxes. These numbers are shown to the cameras, and the same numbers are also logged in the EDA data file (see Figure 3). This allows events happening in the game to be synchronized with variations in the EDA recordings. In our case, the events identified were the starts and ends of the three games. We could then link each game to its condition and to the participant number, and thereby identify which data set corresponded to each condition.
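
The idea behind this alignment can be illustrated with a minimal sketch (not the authors' software): a sync pulse that is both logged in the EDA file and visible on camera serves as a shared reference point between camera time and EDA samples. The variable names and timestamps below are illustrative assumptions.

```python
EDA_SAMPLING_RATE_HZ = 100  # the EDA boxes log 100 data points per second

def event_to_eda_row(pulse_row_in_eda, pulse_time_in_video_s, event_time_in_video_s):
    """Convert the video timestamp of a game event into a row index of the EDA file.

    pulse_row_in_eda      -- row at which a given sync pulse was logged in the EDA file
    pulse_time_in_video_s -- video timestamp (s) at which the light box showed that pulse
    event_time_in_video_s -- video timestamp (s) of the game event (e.g., start of game 1)
    """
    offset_s = event_time_in_video_s - pulse_time_in_video_s
    return pulse_row_in_eda + round(offset_s * EDA_SAMPLING_RATE_HZ)

# Hypothetical example: the pulse was logged at EDA row 12000 and seen on video at
# 130 s; game 1 started on video at 205 s.
print(event_to_eda_row(12000, 130.0, 205.0))  # -> 19500
```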

The following section describes the protocol, which applies the technique developed by Courtemanche et al.17, adapted to answer our research question. This protocol received an ethics certificate from our institution's ethics committee. In this protocol, we use physiological devices18 mounted in 3D-printed casings. We refer to the devices as the EDA boxes (boxes used to record the EDA of the participants), the light boxes (boxes with a lighted digital number), and the sync box (the box that sends signals to the EDA boxes and the light boxes to synchronize the data). The synchronization software enabling the wireless synchronization protocol for the acquisition of multimodal user data17 was embedded in the boxes.

Protocol

The following protocol was approved by HEC Montréal's ethics committee prior to the beginning of the data collection.

1. Participant screening for the experiment

  1. Recruit participants 18 years and older. Ensure that participants understand the language of the experiment, can stand for 20 min, own a smartphone that is at most 5 years old, do not have skin allergies or sensitivities, do not have a pacemaker, and do not suffer from epilepsy or any other diagnosed health problem.
  2. Recruit groups of friends and groups of people who do not know each other, in order to control for familiarity. Group size must be determined based on the purpose of the study, the game studied, and the size of the available room.
  3. Schedule the participants. Impose a date and time on each group of people who know each other, and group the people who do not know each other according to their most convenient dates.
  4. Ask participants to charge their smartphones and bring chargers to the data collection session.

2. Conditions and experimental design

  1. Prepare the randomization sheet for the interactivity condition by associating each participant number with the two conditions of interactivity for each game. Also assign numbers to the players and to the spectators who will wear an EDA box (a minimal example of such a sheet is sketched below).
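
The following is a minimal sketch of this randomization step, assuming groups of 12 (two players, 10 spectators, two of whom wear an EDA box) and three games per group; the column names and the seeding are illustrative assumptions, not part of the published protocol.

```python
import random

def make_randomization_sheet(group_id, n_participants=12, seed=0):
    """Build a simple randomization sheet: roles, EDA boxes, and per-game conditions."""
    rng = random.Random(seed + group_id)
    participants = list(range(1, n_participants + 1))
    rng.shuffle(participants)

    players = participants[:2]         # two players per group
    spectators = participants[2:]      # the rest are spectators
    eda_wearers = set(spectators[:2])  # only two EDA boxes are available

    sheet = []
    for p in spectators:
        # Two games with smartphone and one without, in a random order per spectator.
        conditions = ["smartphone", "smartphone", "no_smartphone"]
        rng.shuffle(conditions)
        sheet.append({"group": group_id, "participant": p, "role": "spectator",
                      "eda_box": p in eda_wearers,
                      "game_1": conditions[0], "game_2": conditions[1],
                      "game_3": conditions[2]})
    for p in players:
        sheet.append({"group": group_id, "participant": p, "role": "player",
                      "eda_box": False, "game_1": "-", "game_2": "-", "game_3": "-"})
    return sheet

for row in make_randomization_sheet(group_id=1):
    print(row)
```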

3. Preparation

NOTE: These materials are needed to perform the protocol: the EDA boxes (used to record the EDA of the participants), the light boxes (containing lighted digital numbers), and the sync box (which sends signals to the EDA boxes and the light boxes to synchronize the data). Two armbands, EDA electrodes, EDA sensors, medical tape, and antiseptic wipes are also needed.

  1. Plug the EDA boxes, the three light boxes, and the sync box into the charging station.
  2. Turn on the game in the studio (projector and 3D scanner for movement detection technology) and test the game by running it through a full game.
  3. Place the consent forms, the pre-experiment questionnaire, and jerseys on a table in the greeting area.
  4. Test the Bluetooth connection of the light boxes. Set the sync box to manual.
    1. Turn on the three light boxes, the two EDA boxes, the Bluetooth on the EDA boxes, and the sync box.
    2. Push the pulse button on the sync box. The light boxes will flash the number 01.
    3. Turn off the sync box, the light boxes, and the EDA boxes.
  5. Set the sync box and light boxes in place for the collection. Place the light boxes in view of each camera.
    1. Put the sync box on the tripod, at a height of 6 feet.
    2. Set the sync box to auto.
  6. Unplug the batteries and put them into the cameras.
    1. Check that battery power can record for over an hour.
  7. Place the cameras so that the framing includes all four corners of the game's playing field and a light box. Place the two low-light cameras at opposite corners of the playing field at hip level, and place the GoPro at mid-field on a higher tripod to obtain an overhead shot of the playing field.
  8. Ensure that the framing includes the full playing field, an area of 1 m around its limits, and the light box. Ensure that the sync box is no farther than 20 m from where participants will stand; otherwise, the pulses will not be transmitted.

4. Welcoming participants

  1. Greet the participants at the front door. Tell them to go sit at the table.
  2. Once all the participants have arrived and are seated, describe the tools that will be used to collect data for the present study. This description should be written in the consent form. Then, ask the two randomly chosen participants to follow the researcher to install the EDA equipment. During that time, the other participants can start filling in the pre-experiment questionnaire.
  3. Ask the participants to read and sign the consent forms. Verbatim: "I will ask you to read the consent form. The two copies are identical. One is for you; one is for me. Please answer all the questions and sign both copies."
  4. Go around the table to sign the consent forms, verifying that all questions have been answered; put one copy of each consent form into the folder designated for this purpose and give the second copy to the participant.
  5. Ask the participants to put on the jersey with their participant number.

5. Installation of the physiological device

  1. Ask the participants to remove any jewelry from the non-dominant hand.
  2. Use an antiseptic wipe to clean the area where the electrodes will be placed. Remove the plastic backing from the electrodes and place them on the non-dominant hand of the participant.
  3. Snap the two sensors on the two electrodes. The red wire must be placed on the thumb's side. The black wire must be placed on the other side, under the pinky finger.
  4. Plug the sensor wire to the A3 port of the EDA box. Ask the participant if they tend to have sweaty palms. If they say that they do, wrap medical tape around the electrodes without touching the metal part.
  5. Add an armband over the palm of the hand to secure the sensors and electrodes in place.
  6. Turn on the EDA device. Check that the Bluetooth switch is still on.
  7. Check that the four lights flash.
  8. Note the participant number and the serial number of the EDA box associated with that participant.
  9. Place the EDA box on the belt or in the pocket of the participant. If the participant's clothes do not allow this placement, offer them a belt, and hook the EDA to the belt.
  10. Ask the participants wearing the EDA boxes to return to the table with the others and complete the pre-experimental questionnaire.

6. Record baseline

  1. Go around the table, starting with the participants who do not have the EDA, and check whether all the questions have been answered. If the questionnaire is completed, put it in the folder with the participant's consent form.
  2. Once all participants have completed the pre-experimental questionnaire, walk them to the game studio.
  3. Then, record baseline.
    1. To do so, tell the participants that the tools need to be calibrated and ask them to breathe calmly and fix their gaze on a point in front of them for 2 min.
    2. Simultaneously, turn the EDA devices off and then on.
    3. Start a timer for 2 min. After the 2 min have elapsed, turn the EDA devices off and then on again.

7. Start the experiment

  1. Start the recording of the three cameras and turn on the three light boxes.
  2. Verify that the light boxes and the full playing field are still within the camera frame.
  3. Verify that the sync box is on auto and turn on the sync box.
  4. After 10 s, the numbers on the light boxes will flash.
    NOTE: This indicates that the sync box is automatically sending a pulse every 10 s to both the lights and the EDA boxes.
  5. Explain the game: inform the participants that the game is like Ping-Pong and that they will understand it while playing. To win, a player needs to score 3 points. Some members of the public will use their smartphones to influence the game by visiting the website URL that is projected on the playing field.
  6. Using the randomization sheet with the number of participants for each condition, tell the participants who will play the game, and who will be on the sidelines as spectators.
    NOTE: For the purpose of this study, the participants wearing the EDA boxes cannot be selected as playing participants because the spectator's engagement is being studied.
  7. Tell the participants which spectators will be using their smartphone. Ask the spectators to influence the game. Tell the participants to stay within one meter of the playing field.

8. Start the game

  1. Tell the game technician to start the game by turning on the projectors and the movement detection technology.
  2. Tell the players the scenario. Verbatim: "Here is the context: you are walking in a public space and you see this game. You decide to participate."
  3. While the participants are playing, visually check whether the lights are flashing every 10 s.
  4. Between each game, ask the spectators (not the players) to fill in the Self-Assessment Manikin (SAM) Scale8 questionnaire on their smartphone, and give them the URL of the questionnaire. When the first game is over, ask all the spectators (not the players) to fill out a questionnaire about the experience on their smartphone, answering three questions using three scales. Instruct them not to evaluate the game itself but rather their feelings during their participation.

9. Removal of physiological devices

  1. Read this verbatim: "Thank you very much for participating in the game. The last game is over. Spectators will now fill out two paper questionnaires; players can leave. Please follow me to the greeting room."
    1. Ask all spectators, except the ones with the EDA, to go back to the table. They will answer the UES-SF twice: once thinking about the games in which they had the smartphone and once thinking about the game in which they did not have the smartphone; this is written in the instructions of the questionnaire. Verbatim: "The participants with the physiological tool can wait at the table. The others can fill out the end-of-experiment questionnaire; please answer extensively by explaining clearly what you mean." They can ask questions if they have any.
  2. Ask the participant to return the EDA box; turn off the device and the Bluetooth of the device.
    1. Unplug the sensor from the A3 port, remove the armband, and unsnap the sensor from the electrodes.
    2. Ask the participant to remove the medical tape and electrodes on their hand. Give the participant a tissue to remove the cream from the hand.
    3. Remove the microSD card from the EDA box and repeat steps 9.2 to 9.2.3 with the other EDA participants.

10. Debrief the participants

  1. Bring the EDA participants to the table where the other participants are sitting.
  2. Ask the participants to fill in the end-of-experiment questionnaire. Ask the participants to answer extensively by explaining clearly what they mean. Tell them to ask the experimenter for help if they have questions.
  3. Place the filled out post-experiment questionnaires with the pre-experiment questionnaires and consent forms in the folder.
  4. Debrief the participants. Once they finish, thank them for their participation, tell them about the compensation and walk them out.

11. Cleaning up materials

  1. Turn off the three light boxes.
  2. Stop the recording of the three cameras and remove the batteries and SD cards from the three cameras. Place the camera batteries in the charger.
  3. Turn off the sync box and plug the EDA boxes, light boxes, and sync box into the charging station.

12. Physiological data management

  1. Put the micro SD card from the EDA box in an adaptor. Transfer the data to the computer in a folder named by the number of the participant. Delete the files from the SD card.
  2. Select all the data and put it in a spreadsheet. Hide the columns that are not useful. Select approximately lines 1 to 3,000 and make a scatter plot. If all the data fall between 240 and 550, the data are valid.
  3. Verify that the markers generated by the sync box are present by selecting the event column and sorting it. Press Ctrl+Z to revert the sorting of the markers.
    NOTE: All the markers that were generated will be visible. Sometimes some markers do not appear; this is not a problem, as a single marker provides a point of reference. From this point, the beginnings and ends of the events can be calculated using the time of the camera. There are 100 data points every second.
  4. Add an event_start_end column. Watch the footage; at the beginning of an event, calculate the difference between the time of the event and the time of the last marker. When the row corresponding to the event start has been found, add a marker named event1_start in the spreadsheet file. Do the same for the end of the event (a sketch of this calculation follows this list).
  5. Repeat step 12.4 for the baseline.
  6. When all the markers are added, export the spreadsheet in .txt format (tab delimited text).
    NOTE: There will be two spreadsheets per participant, one with the experiment data and one with the baseline data.
  7. Import these files in the software that was developed for these EDA boxes (see the next section)19. This will generate a file ready for analysis that contains the relative time, absolute time, events, and EDA signal.
  8. Upload the files to the EDA analysis software.
  9. Click on Add Project. Add a title. Add a description. Enter the date of the project and the total number of participants.
  10. Click on the name of the project. Click on Experimental Design. Click on Signals and choose physiological, EDA, Bluebox recorder, Bluebox and version 3.0.
  11. Click on Events and enter the events as they were previously named in the spreadsheet (e.g., event_start_end). Choose Bluebox, version 3.0.
  12. Click on Transformations and choose GSR (galvanic skin response).
  13. Click on Unlocked to change to lock for locking the project. Click on File Import to import the files previously prepared.
  14. Click on the participant profile to give information about the participants by entering their email addresses. Click on Participant is There. Click on Ok Complete.
  15. Upload the data file which needs to be zipped in order for the software to recognize it. Click on the arrow. Click on the pies to upload the file.
  16. Go to Analysis and choose Data Exportation; select the participant and their data. Click on Export Data to create a file for statistical analysis. This can take hours if there are many participants. The file will appear under Filename at the end of the exportation.
    NOTE: To obtain the file ready for analysis, the software generates clean phasic data. Signal preprocessing was executed as follows: data were recorded at 100 Hz and resampled to 25 Hz, before applying a second-order low-pass Butterworth filter with a 50 Hz cut-off. The signal was then decomposed into tonic and phasic components using the convex optimization algorithm described by Greco et al.20. This algorithm filters artifacts and outlier data points.
  17. Use the file generated for physiological data analysis.
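
To make the marker arithmetic of steps 12.3-12.4 concrete, here is a minimal sketch (illustrative only; the file names, column names, and timestamps are assumptions) that adds event_start_end markers to the exported spreadsheet and saves it as tab-delimited text, as in step 12.6.

```python
import pandas as pd

EDA_SAMPLING_RATE_HZ = 100  # 100 data points per second, as noted in step 12.3

def add_event_markers(eda_file, pulse_row, pulse_video_time_s, events, out_file):
    """Insert event markers into the EDA spreadsheet (steps 12.4 and 12.6).

    eda_file           -- tab-delimited EDA export (hypothetical file name)
    pulse_row          -- row index at which a sync pulse was logged in the EDA file
    pulse_video_time_s -- video timestamp (s) at which the light box showed that pulse
    events             -- mapping of marker name to video timestamp (s),
                          e.g. {"event1_start": 205.0, "event1_end": 388.5}
    """
    df = pd.read_csv(eda_file, sep="\t")
    df["event_start_end"] = ""
    for name, video_time_s in events.items():
        offset_rows = round((video_time_s - pulse_video_time_s) * EDA_SAMPLING_RATE_HZ)
        df.loc[pulse_row + offset_rows, "event_start_end"] = name
    df.to_csv(out_file, sep="\t", index=False)

# Hypothetical example for one participant.
add_event_markers("participant_07.txt", pulse_row=12000, pulse_video_time_s=130.0,
                  events={"event1_start": 205.0, "event1_end": 388.5},
                  out_file="participant_07_marked.txt")
```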

13. Analyze the data

  1. Standardize the EDA data by subtracting the EDA mean from each EDA value and dividing by the standard deviation, where the mean and standard deviation are based on the entire dataset21.
  2. Baseline the EDA data by subtracting the mean of the baseline EDA from each standardized EDA value, where the mean is based on the baseline data of the participant in question21 (a sketch of steps 13.1 and 13.2 follows this list).
  3. Calculate the means for each condition of interactivity for the SAM Scale and the post-experiment questionnaire (i.e., UES-SF).
  4. Test two mediation models, one for each type of arousal: physiological and self-reported.
  5. Test the relationship between the independent variable (interactivity) and the mediators (physiological and perceived arousal).
  6. Test the relationship between the independent (interactivity) and dependent variables (engagement assessed in the UES-SF).
  7. Assess the relationship between the combination of the independent variable and the mediators, and the dependent variable.
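
A minimal sketch of the standardization and baseline correction in steps 13.1-13.2, assuming the experiment and baseline signals for one participant have been loaded into NumPy arrays; the interpretation that the baseline is put on the same scale before its mean is subtracted is an assumption.

```python
import numpy as np

def standardize_and_baseline(eda, baseline):
    """Z-score the EDA signal over the whole recording (step 13.1), then subtract
    the mean of the participant's baseline, expressed on the same scale (step 13.2)."""
    mean, sd = eda.mean(), eda.std()
    z = (eda - mean) / sd                 # step 13.1: standardization
    baseline_z = (baseline - mean) / sd   # assumption: baseline scaled with the same parameters
    return z - baseline_z.mean()          # step 13.2: baseline correction

# Hypothetical example with synthetic data.
rng = np.random.default_rng(0)
eda_experiment = rng.normal(400, 30, size=60000)  # ~10 min of EDA at 100 Hz
eda_baseline = rng.normal(390, 20, size=12000)    # 2 min baseline at 100 Hz
corrected = standardize_and_baseline(eda_experiment, eda_baseline)
print(corrected.mean(), corrected.std())
```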

Representative Results

This section describes the representative results of this study. We recruited participants using social media and our institution's panel of participants. Of the 78 participants, 40 were women. The mean age was 22 years old. None of the participants had previously played the game. Other exclusion criteria can be found in step 1 of the protocol.

The descriptive statistics, shown in Table 1, contain the mean per condition for each measure. The mean of the arousal dimension of the Self-Assessment Manikin (SAM) is reported in the second row of the table. The SAM Scale was administered using a visual 9-point Likert scale ranging from calm to excited8 (see Supplementary File 1). Results show that the participants were more excited with the smartphone. The third row shows the mean of the standardized EDA for each condition, again higher in the smartphone condition. The fourth row reports the means for each condition on the User Engagement Scale Short Form (UES-SF), administered as a 5-point Likert scale ranging from Strongly agree to Strongly disagree22. Again, results show that perceived engagement was higher in the smartphone condition. The p-values, reported for each measure, confirm statistical significance. Using the Baron & Kenny procedure, we identified a mediating role of arousal in the relationship between interactivity and spectators' engagement23 (see Figure 4). The self-perceived arousal and self-perceived engagement analyses included 78 participants, and the physiological arousal analysis included 12 participants. These numbers are lower than what we recruited because we had to discard four EDA participants and two SAM Scale and UES-SF participants due to data loss.
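
For illustration, the following is a hedged sketch of how the mediation steps (13.4-13.7) and the random-intercept regressions reported in Table 1 could be run with statsmodels; the long-format table and its column names are assumptions, not the authors' actual analysis files.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format table: one row per spectator per game, with columns
# participant, interactivity (0 = no smartphone, 1 = smartphone), arousal, engagement.
df = pd.read_csv("spectators_long.csv")

# Baron & Kenny-style steps, each fitted as a linear regression with a random
# intercept per participant to account for repeated measures.
iv_to_mediator = smf.mixedlm("arousal ~ interactivity", df,
                             groups=df["participant"]).fit()
iv_to_dv = smf.mixedlm("engagement ~ interactivity", df,
                       groups=df["participant"]).fit()
full_model = smf.mixedlm("engagement ~ interactivity + arousal", df,
                         groups=df["participant"]).fit()

for label, model in [("IV -> mediator", iv_to_mediator),
                     ("IV -> DV", iv_to_dv),
                     ("IV + mediator -> DV", full_model)]:
    print(label)
    print(model.summary())
```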

These results show that this data collection and analysis method provides the necessary data to compare the two conditions of interactivity. As suggested by the player experience literature3, combining lived and perceived arousal measures provides a more robust assessment. Moreover, this method allows for an ecologically valid measurement of both physiological and self-reported arousal, as the wireless EDA devices permit live recording during uninterrupted gameplay. Further, the self-reported arousal questionnaires were completed between games, directly on the spectators' smartphones, which were already being used to play the game. This allowed the participants to stay in the flow of the game.

Figure 1
Figure 1: Visual representation of the game. This figure shows the playing field with one player on each side and six spectators watching from the side of the field. All the participants are wearing a jersey with a number on it.

Figure 2
Figure 2: Visual representation of the syncing devices. This figure shows the devices used to sync the EDA data: the sync box on the left and the light box14, showing a number, on the right.

Figure 3
Figure 3: Visual representation of the camera and light box. This figure shows a light box positioned in front of a camera. The camera is on a tripod, and the light box14 is on a mechanical arm mounted on the tripod.

Figure 4
Figure 4: Relationships between variables. This schema represents the mediating role of arousal in the relationship between interactivity and spectators' engagement.

Measure                        With smartphone    Without smartphone    P-value
Self-perceived arousal         5.54               4.64                  < .001
Physiological arousal (EDA)    0.0295             -0.1262               < .001
Self-perceived engagement      3.49               3.31                  < .001

Table 1: Descriptive statistics per condition. The numbers represent the means of each measure per condition of interactivity. The p-values are shown in the P-value column and were computed using a linear regression with random intercept, with a two-tailed level of significance.

Supplementary File 1: SAM scale.

Discussion

Please note that the steps were performed in the studio of the game's creators but could be replicated in a laboratory setting or another environment with enough space to fit the game. It is important to note that the sync box can only transmit a pulse to light and EDA boxes that are within 20 meters. Therefore, the game room or playing field must not exceed this distance.

Existing laboratory methods have used software to simultaneously begin the recording of the video game screen and of the physiological measurement tools10. In the context of digital games that do not take place within a screen, this method is inadequate. This issue is bypassed by the synchronization method described in our protocol: no matter when the recordings begin, the data can be synced. Our work has demonstrated that the technique proposed by Courtemanche et al. can be applied to game research, specifically to games that take place outside of traditional console-based gaming17. With the combination of synchronized physiological and video data, as well as self-reported measures, we were able to compare the two conditions of interactivity and observe a difference in engagement.

For researchers who wish to use this protocol, there are some important recommendations. The method relies on technology requiring long-lasting battery power, so all the material should be fully charged before the experiment to prevent data loss. The EDA equipment should always be tested prior to the experiment to make sure that it is fully charged, that the Bluetooth reception is working, and that the lights are flashing. Although the light boxes are very important for synchronization, the data can still be used if a light box shows only one signal during the whole game: the events are then calculated according to their camera-time difference from that single signal. If one light box is not showing any signal, it is possible to use the other two to calculate the events. If none of the light boxes are working, it is also possible to turn on the two EDA boxes and the sync box at the same time, make this visible in the camera frame, and rely on that for the synchronization of the data, although this method is less accurate.

EDA measurement can be affected by movement and sweat; this measurement could be compromised if the participants were to engage in intense physical activity. In the context of this game, what is important for spectators is simply to be able to walk around freely and use a smartphone. This level of physical activity was acceptable for our type of measurement. EDA sensors were placed on the non-dominant hand of the spectators, which allowed them to be comfortable using their smartphone with their other hand. Placing an armband on the hand and the arm of the participant is important as it helps ensure the sensor cable and the electrodes do not move. Particular attention must be paid to the movement artifacts during the data analysis process. Some data sets might need to be removed from the study.

It is also recommended to transfer the data after each session to avoid linking a data set to the wrong participant. This process also allows the verification of the data recordings, as the data cannot be visualized in real time. There should be three text files on each microSD card for each session per participant: the first file is the test (recorded when the device was installed on the participant), the second is the baseline, and the third is the recording of the actual games.

The method presented in this work could be used by game designers who wish to understand the lived experience of the audience watching a game being played. As opposed to self-reports or interviews, physiological measures are objective and unobtrusive to both the participants and the game24. Coupled with self-reported measures, they offer a more accurate way of assessing participants' emotional reactions24. A stronger understanding of the users allows for a better design1. Due to its portable equipment, this method could be used outside of a laboratory setting. It could be recreated in the real context of the game, which in our case is a public space; this would further promote ecological validity. Other fields of research, such as education and shopping, could also benefit from the portability of this method and investigate its use. As Charland et al. state, engagement in learning is crucial5. This method could allow the assessment of the multiple dimensions of engagement in the real context of a class. Emotional responses have also been found to lead to important outcomes in the shopping environment25. This method could provide arousal assessment in the context of shopping malls. Further work would be needed to determine whether this methodology can be used in these other fields.

Disclosures

The authors have nothing to disclose.

Acknowledgements

We would like to thank MITACS and its partnership with the company that created the game for funding this research project.

Materials

BITalino (r)evolution Freestyle Kit (PLUX Wireless biosignals S.A.)  BITalino 810121006
Devices (1 syncbox, 3 light boxes, 2 EDA boxes) Developed by Tech3Lab researchers1 n/a
CubeHX2 n/a n/a
Charging station Prime 60W 12A 6-Port Desktop Charger RP-PC028
6 USB3 wires for charging Insignia 3m (10 ft.) Charge-and-Play USB A/ Micro USB Cable NS-GPS4CC101-C2
3D scanner Velodyne LiDAR VLP-16
Projectors Barco F90-W13
Jerseys* (fabric, tape, string) Any Any
2 low light cameras Sony A7S
2 tripods for the A7S Manfrotto MVK500190XV
2 light stands for the go pro and the syncbox Impact  LS-8AI
1 plier for the light stand of the syncbox Neewer  Super Clamp Plier Clip
1 magic arm for the light box of the go pro Magic Arm 143A
1 Go Pro Go Pro 5
1 Microphone Rode  VideoMic Rycote
2 armbands Amyzor Moisture Wicking Sweatband 
*Make them yourself by taping the number on the fabric and perforating two holes to thread the string
Sources:
1. Courtemanche, F. et al. Method of and System for Processing Signals Sensed From a User. US 15/552,788 (2018).
2. Léger, P.M., Courtemanche, F., Fredette, M., Sénécal, S. A cloud-based lab management and analytics software for triangulated human-centered research. In Lecture Notes in Information Systems and Neuroscience. Edited by Thomas Fischer, 93-99, Springer, Cham (2019).

References

  1. Cheung, G., Huang, J. Starcraft from the stands: Understanding the game spectator. Conference on Human Factors in Computing Systems – Proceedings. , 763-772 (2011).
  2. Foxlin, E., Wormell, T., Browne, C., Donfrancesco, M. Motion tracking system and method using camera and non-camera sensors. Google Patents. 2 (12), (2014).
  3. Nacke, L. E., Bernhaupt, R. Games User Research and Physiological Game Evaluation. Game User Experience Evaluation. , 63-86 (2015).
  4. Hazlett, R. L. Measuring emotional valence during interactive experiences: Boys at video game play. Conference on Human Factors in Computing Systems – Proceedings. , 1023-1026 (2006).
  5. Charland, P., et al. Assessing the multiple dimensions of engagement to characterize learning: A neurophysiological perspective. Journal of Visualized Experiments: JoVE. (101), (2015).
  6. Martey, R. M., et al. Measuring game engagement: multiple methods and construct complexity. Simulation and Gaming. 45, 528-547 (2014).
  7. Lang, P. J., Bradley, M. M., Hamm, A. O. Looking at pictures: evaluative, facial, visceral, and behavioral responses. Psychophysiological Research. 30, 261-273 (1993).
  8. Bradley, M. M., Lang, P. J. Measuring emotion: The self-assessment manikin and the semantic differential. Journal of Behavior Therapy and Experimental Psychiatry. 25 (1), 49-59 (1994).
  9. Granato, M., Gadia, D., Maggiorini, D., Ripamonti, L. A. An empirical study of players’ emotions in VR racing games based on a dataset of physiological data. Multimedia Tools and Applications. 79, 33657-33686 (2020).
  10. Ravaja, N., Saari, T., Salminen, M., Laarni, J., Kallinen, K. Phasic emotional reactions to video game events: A psychophysiological investigation. Media Psychology. 8 (4), 323-341 (2006).
  11. Alcorn, A. Pong. Atari. , (1972).
  12. Labonte-LeMoyne, E., Courtemanche, F., Fredette, M., Léger, P. M. How wild is too wild: Lessons learned and recommendations for ecological validity in physiological computing research. PhyCS 2018 – Proceedings of the 5th International Conference on Physiological Computing Systems. , (2018).
  13. Rozendaal, M. C., Braat, B. A. L., Wensveen, S. A. G. Exploring sociality and engagement in play through game-control distribution. AI and Society. 25 (2), 193-201 (2010).
  14. Downs, J., Smith, W., Vetere, F., Loughnan, S., Howard, S. Audience experience in social videogaming. Conference on Human Factors in Computing Systems – Proceedings. , 3473-3482 (2014).
  15. Tekin, B. S., Reeves, S. Ways of spectating: Unravelling spectator participation in Kinect play. Conference on Human Factors in Computing Systems – Proceedings. 2017, 1558-1570 (2017).
  16. Downs, J., Vetere, F., Smith, W. Differentiated participation in social videogaming. OzCHI 2015: Being Human – Conference Proceedings. , 92-100 (2015).
  17. Courtemanche, F., et al. Method of and system for processing signals sensed from a user. US Patent. , (2018).
  18. Batista, D., et al. Benchmarking of the BITalino biomedical toolkit against an established gold standard. Healthcare Technology Letters. 6 (2), 32-36 (2019).
  19. Léger, P. M., Courtemanche, F., Fredette, M., Sénécal, S. A cloud-based lab management and analytics software for triangulated human-centered research. Lecture Notes in Information Systems and Organisation. 29, 93-99 (2019).
  20. Greco, A., Valenza, G., Lanata, A., Scilingo, E. P., Citi, L. A convex optimization approach to electrodermal activity processing. IEEE Transactions on Biomedical Engineering. 63 (4), 797-804 (2015).
  21. Braithwaite, J., Watson, D., Robert, J., Mickey, R. A Guide for Analysing Electrodermal Activity (EDA) & Skin Conductance Responses (SCRs) for Psychological Experiments. Psychophysiology. (49), (2015).
  22. O’Brien, H. L., Cairns, P., Hall, M. A practical approach to measuring user engagement with the refined user engagement scale (UES) and new UES short form. International Journal of Human Computer Studies. (112), 28-39 (2018).
  23. Baron, R. M., Kenny, D. A. The moderator-mediator variable distinction in social psychological research: Conceptual, strategic, and statistical considerations. Journal of Personality and Social Psychology. 51 (6), 1173-1182 (1986).
  24. Nacke, L. E. Game User Experience Evaluation. (2015).
  25. Lam, S. Y. The effects of store environment on shopping behaviors: A critical review. Advances in Consumer Research. 28 (1), 190-197 (2001).


Cite this Article
Brissette, R., Léger, P., Courtemanche, F., Rucco, E., Sénécal, S. Measuring Engagement of Spectators of Social Digital Games. J. Vis. Exp. (173), e61596, doi:10.3791/61596 (2021).
