Warwick's Awesome Speedruns and Demos (or WASD for short) is an annual, live-streamed event run by the University of Warwick Computing Society to raise money for SpecialEffect. After a scaled-down version in the previous year, we returned to the SU Atrium and raised £1345.36, bringing the total raised over the years to more than £9000.
I was part of the five-person team behind WASD 2023, particularly helping with the sound side of our tech. This post aims both to provide an interesting overview of how everything worked behind the scenes and to serve as a reference for future years.
Our main aim for the tech was to ensure that everything looked and sounded great, both in-person and online. The previous event had had some audio issues, so this was an area we were particularly keen to improve this time around.
Additional requirements that we adhered to were:
- There are two places for runners (with both a PC and option to plug in a console), each of which can be streamed independently or together in a race.
- There is a space for commentators to watch the current livestream and provide commentary over gameplay.
- Commentators and runners can communicate with each other and are all audible over the stream.
- Each runner's game is shown to the runner with stereo audio and minimal latency.
- The tech team can communicate with runners and commentators without being heard over the stream.
- The audience in the SU Atrium can both hear and watch the gameplay.
- Runners can perform their runs remotely.
The equipment that we used was provided by the Warwick Students' Union (standard AV equipment and cameras), the Warwick Esports Centre (computers and peripherals) and the University of Warwick Computing Society (everything else).
The Warwick Students' Union provided:
- 2 Mackie SRM450 PA Speakers
- 2 projectors and screens
- ATEM Mini Pro ISO video switcher
- Blackmagic Pocket Cinema Camera 4k with video tripod
- HDMI Minicam
- 1m-high staging for the main table, and 2 pieces of low staging for the runners
- HDMI splitter and very long HDMI cables
The Warwick Esports Centre provided:
- 5 PCs with peripherals (mouse, keyboard, webcam, monitor, headset and cables)
- These PCs were used for 1 Stream PC, 2 Runner PCs and 2 Practice PCs
- 2 extra monitors with peripherals, one for the Stream PC and one for the commentator feed
- Gaming chairs
The University of Warwick Computing Society provided:
- ATEM Mini Pro ISO video switcher with monitor
- Elgato 4K60 Pro Capture Card
- Elgato HD60 X Capture Card
- X Air XR18 Audio Mixer
- RØDE Wireless GO Microphones
- 3 Behringer XM1800S Dynamic Microphones
- 2 NEEWER LED Dimmable Light Panels
- 2 Audio-Technica BPHS1 Broadcast Stereo Headsets
- Networking PC and switches
- Many miscellaneous cables and adapters
We also used the following items, which were provided by various tech team members or their friends:
- X-TOUCH Universal Control Surface for the XR18 audio mixer
- Nikon D5500 Camera with tripod
- Elgato Stream Deck XL for the Stream PC
- Even more miscellaneous cables and adapters
Preparing for the Day
To prepare for WASD, we had an initial introductory meeting and then two tech runs on the weekends leading up to the event. These tech runs were held in the Esports Centre, and allowed us to connect everything together, make sure it all worked, and work out which extra connectors we needed to purchase.
Two runner setups were configured on either side of the table. Each setup was identical, apart from using a different capture card.
The video and audio feed from each PC (or console) was sent via HDMI to a capture card in the Stream PC. The capture card's passthrough output was then sent back to the runner's monitor.
The stereo audio feed sent to the Stream PC (through the capture card) was then forwarded back out to the XR18 Mixer through the XR18's USB interface. This was done using Voicemeeter Banana. We then sent two mono output buses from the mixer into each runner's stereo headphones.
Each runner had a webcam plugged directly into the Stream PC, and a microphone connected to the XR18 mixer via the wireless RØDE microphone packs. We had planned to connect the microphones directly to the mixer via a 3.5mm to 1/4" adapter; however, no audio came through. During the stream, we found that charging the receivers caused some audible interference, but we avoided this by only charging them whilst they were not in use.
Remote runners connected to our Stream PC using the Esports Centre's RTMP server for game capture, and vMix's call function for their microphone, webcam and return audio feed (for interaction with commentators). On the audio side, the microphone and gameplay audio were mixed inside vMix and then sent to the mixer via a vMix audio bus and an XR18 USB input channel. The runners' return audio feeds were sent back through an XR18 USB output channel and another vMix audio bus. Note that this return feed did not contain gameplay audio (unlike for the physical runners).
The commentators were located at the front of the stage, facing the runners with the audience behind them. This was partly due to space on stage and cable length restrictions, both for the microphones and for the camera and monitor. Although we had planned and set up for three commentators, for the majority of the stream only one or two people (Keegan and Samit) were there.
A monitor underneath the table showed the current stream output, with the Blackmagic Pocket Cinema Camera 4K above. Each commentator had a headset and a microphone. The headphones were all connected via a 3.5mm splitter to two output buses from the XR18 (configured as one stereo feed), and the microphones were connected to the mixer with many daisy-chained XLR cables. During the event, we found that having an on/off switch directly on the microphones was very helpful for the commentators, so we used the handheld mics more than expected.
The Control Centre
The control centre was positioned at the back of the table, mainly due to cable length restrictions. The main station was the dual-monitor Stream PC (manned by Ashe) which was also connected to a Stream Deck. This station was used for mixing the stream's video feed and managing the graphics.
On the left was the X-Touch control surface for the audio mixer (with a laptop for more advanced control) and on the right was the camera mixing station using an ATEM Mini.
All video feeds (apart from the runners') were mixed using an ATEM Mini. We mainly used three video feeds throughout the event:
- A front commentator feed, from the Blackmagic Pocket Cinema 4K under the table. This was easy to use, with a lovely depth of field.
- A roaming feed from the Nikon D5500, mainly used for a side view of the commentators. This was connected via the SU's ATEM Mini (a very expensive HDMI extender!) as the camera outputs via a Mini-HDMI port, for which we only had a short cable. This camera was also difficult to focus, as the on-camera screen could not be enabled at the same time as the HDMI output, so it had to be unplugged whenever the camera was moved.
- A top-down feed from the minicam on the first floor. This camera was also fairly difficult to focus, as it had no on-camera screen and we had to look down at the ATEM Mini screen on the ground floor. It had a very impressive zoom but didn't work as well in low-light conditions.
The final feed that we occasionally used was an iPhone's NDI feed being sent to a MacBook. This was used as an additional roaming camera, and was replaced by a Discord call for the Among Us sussycam.
All audio was mixed using the XR18, an 18-channel mixer with a main stereo output and 6 additional mono output buses. We surprisingly ended up using the vast majority of these connections - 14 inputs and all outputs. The XR18 also allowed us to use a USB cable to connect 4 stereo input and output channels to the Stream PC. This was essential for connecting the 2 Runner PC audio inputs, the stream output and the input and return audio feeds for remote runners.
Now is perhaps the time to introduce the diagram that we made for how everything was connected together. It may seem complicated at first, however if you follow each media source step-by-step it makes a bit more sense (thanks to Owen for making a much nicer diagram than my original hand-drawn one!).
Joel surprisingly managed to acquire an X-TOUCH Universal Control Surface, which we connected to the XR18 mixer via a switch. This gave us a fast way to adjust the volumes of each channel and bus, which definitely helped a lot. However, we still needed to use a laptop during setup to provide more control, e.g. when setting channels and buses to stereo or when setting channels to use the USB connection instead of a physical input.
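For anyone wanting to script this in future years: the X Air mixers also accept OSC messages over UDP (port 10024), which is the protocol the control apps use. A rough sketch of building such a message by hand (the mixer IP below is a placeholder, and the fader address follows the commonly documented X Air OSC scheme):

```python
import socket
import struct

def osc_padded(data: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, as OSC requires."""
    data += b"\x00"
    return data + b"\x00" * (-len(data) % 4)

def osc_message(address: str, value: float) -> bytes:
    """Build an OSC message carrying a single 32-bit float argument."""
    return (osc_padded(address.encode())
            + osc_padded(b",f")          # type tag: one float
            + struct.pack(">f", value))  # OSC floats are big-endian

def set_fader(sock: socket.socket, mixer_ip: str, channel: int, level: float) -> None:
    """Set a channel fader (0.0-1.0); X Air mixers listen for OSC on UDP 10024."""
    sock.sendto(osc_message(f"/ch/{channel:02d}/mix/fader", level),
                (mixer_ip, 10024))

# Usage (mixer IP is a placeholder):
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# set_fader(sock, "192.168.1.10", 1, 0.75)  # channel 1 fader to 75%
```

With something like this, the Stream Deck could in principle have driven the mixer too.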
For breaks, Spotify was run on the Stream PC, which was sent to the mixer through an XR18 USB input channel.
Networking-wise, we had two networks for the event:
Our main network was for general use and was connected to the internet via the Computing Society's Network PC and a port that IT Services opened for us. This was used for all of the PCs, and was also connected to an access point to allow NDI communication from an iPhone.
Our secondary network was isolated from the internet and was used for inter-equipment communication. This was used to connect the XR18 mixer with the X-TOUCH control surface and a laptop, and the ATEM Mini to a laptop (although the latter was not used as much).
The graphics were provided using a custom-built NodeCG application on the Stream PC, maintained by Owen. This provided a web dashboard which allowed us to manage the speedrunning timer and current run. During the event, some of the races required our own speedrun of building a new overlay in the hour preceding them!
This was also linked to Tiltify, the service we used to handle the donations and donation incentives. This allowed us to display real-time counts of how far we were to each goal. We had also originally intended this to run donation polls, however Tiltify conveniently decided to release a new version of their API on the weekend of the event, breaking some of the pre-existing API in the process.
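The goal displays themselves reduce to simple arithmetic on the campaign total. A sketch of that logic, in Python for illustration (the real overlay lived in the NodeCG app, and the goal names and targets below are made up):

```python
def goal_progress(total_raised: float, goals: list[tuple[str, float]]) -> list[str]:
    """Format how close the campaign total is to each donation goal."""
    lines = []
    for name, target in goals:
        pct = min(100.0, 100.0 * total_raised / target)
        lines.append(f"{name}: £{total_raised:.2f} / £{target:.2f} ({pct:.0f}%)")
    return lines

# Hypothetical goals for illustration:
goals = [("Hot Sauce Challenge", 500.0), ("Suspicious Campus Lap", 1000.0)]
for line in goal_progress(1345.36, goals):
    print(line)
```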
The Stream Itself
Most of the streaming side was handled by Ashe. The stream was mixed using vMix, with a direct feed from the XR18's master output (through USB) for the audio mix. vMix's output stream was then routed through NDI to OBS, giving us a fallback if vMix crashed or needed to be restarted. This came in useful several times, as vMix had to be restarted whenever we reconfigured its audio monitoring outputs. A Stream Deck XL made switching vMix scenes and aspect ratios easier. If we had more time, we could have explored also controlling the XR18 with it.
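As an aside on how that kind of Stream Deck control works: vMix exposes a Web API, and a button press typically just fires an HTTP request at it. A minimal sketch (function names are from vMix's shortcut function list; the host and port assume a default local install):

```python
from urllib.parse import urlencode
from urllib.request import urlopen

# vMix exposes a Web API over HTTP (port 8088 on a default local install)
VMIX_API = "http://127.0.0.1:8088/api/"

def vmix_url(function: str, **params: str) -> str:
    """Build a vMix Web API request URL for a given shortcut function."""
    query = urlencode({"Function": function, **params})
    return f"{VMIX_API}?{query}"

def vmix_call(function: str, **params: str) -> None:
    """Fire-and-forget call to vMix, like a Stream Deck button press."""
    urlopen(vmix_url(function, **params))

# Usage (requires a running vMix instance):
# vmix_call("Cut")                   # cut preview to programme
# vmix_call("CutDirect", Input="2")  # cut input 2 straight to programme
```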
Keegan handled the organisation of the donation goals, which were an important part of how we were able to raise as much money as we did. These were split into goals and run modifiers.
Goals were certain actions or challenges that we would run after a certain milestone was reached. Although they were great fun, some of these provided extra technical challenges to overcome.
The simplest incentive to stream was Joel's Hot Sauce Challenge, which consisted of Joel eating increasingly hot sauces. From a technical side, this only needed the relocation of some of our cameras. However, one issue we faced was that the iPhone camera (connected through NDI to a MacBook) appeared much redder than it should have. We believe this was due to the MacBook not outputting colours properly to capture cards, but we didn't have enough time to diagnose it further.
The Suspicious Campus Lap, running across Warwick Campus in an Among Us costume, required us to use a Discord video call to get a video feed into the stream. Although we considered alternatives such as the in-built vMix caller, with the chaos behind the scenes of organising remote runners at the same time we decided to just use Discord on a MacBook. This worked quite well, although we did have some problems with Discord messages and UI elements showing on stream. There were also some parts where the video feed was very pixelated or dropped completely, but there wasn't much we could have done about that.
The Binley Mega Trippy trip (visiting the Mega Chippy on foot) was originally intended to be performed live, however we soon realised that this wouldn't have been realistically possible due to the travel time in our already-packed schedule. Instead, we made a video and then played this during a break on the second day.
Run modifiers were also a great way to encourage people to donate; however, these had to be much more limited so as not to overly distract from or inhibit the speedruns. The Celeste Chase at the end of the first day was perhaps one of the best events, with donation incentives to make the runners wear blindfolds, collect strawberries in-game or eat strawberries in real life (big thanks to Kristina, who made an emergency trip to the grocery store)!
The End of the Stream
The event ended with a 'speedrun' of the get-out.
Debrief and Future Improvements
As with everything, there were new things which we learnt and which could be improved in future years. We organised a debrief meeting the following week, and came up with the following tech improvements (in addition to various points mentioned earlier in this blog post):
- Publicise the event more (e.g. on social media) - perhaps assign a dedicated person to do this.
- Assign a dedicated person to create the graphics and the overlay.
- Contact the SU Big Screen in advance to see if we can show the stream in the plaza.
- Consider adding more mid-run donation incentives, and publish these incentives earlier.
- Co-ordinate more with the SU regarding chairs and staging - there seemed to have been some miscommunication, with the setup staff not knowing what we had requested, and we never received the additional chairs we'd asked for. However, this worked out well in the end, as the fancy coloured chairs were sufficient and looked good on stream.
- Consider configuring audio mix presets beforehand so there is less to manage on the day.
- Have more headphones for the tech station, so that each person can listen to the stream.
- Purchase more long HDMI cables.
- Find the batteries for the Computing Society's camera so that it can be used as another video feed.
- Use our own Twitch channel instead of Warwick Esports' channel, as it was unclear what we could change and we only gained access to the channel fairly late into setup.
- Work out why the runner headset mics could not be connected directly to the mixer, so that we don't have to use the RØDE wireless mics as an expensive adapter.
- For remote runs, receive the runner feed through a Runner PC instead of needing to reconfigure vMix each time.
- For remote runs, try and minimise the delay between runners and commentators. For example, use a separate Discord call or explore alternatives to RTMP (e.g. SRT or Parsec).
- Investigate the audio drift issue we discovered, after the event, on the second day's recording, and try to mitigate it (we believe it may have been introduced at the NDI stage between vMix and OBS, but we were unable to test this).
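To get a sense of scale for audio drift: a steady drift is equivalent to a small sample-rate mismatch between two device clocks, and even a few hundred milliseconds over a full stream day corresponds to a clock error of only a few parts per million. A back-of-the-envelope sketch with made-up numbers (not measurements from the event):

```python
def drift_ratio(drift_ms: float, elapsed_minutes: float) -> float:
    """Resampling ratio needed to cancel a steady, linear audio drift."""
    elapsed_ms = elapsed_minutes * 60_000
    return 1 + drift_ms / elapsed_ms

# Hypothetical numbers: 500 ms of drift accumulated over an 8-hour stream day...
ratio = drift_ratio(500, 8 * 60)
# ...is a clock mismatch of under 18 parts per million - a 48 kHz feed would
# need to be resampled to roughly this rate to stay in sync:
effective_rate = 48_000 * ratio
print(f"{effective_rate:.2f} Hz")
```

Mismatches this small are hard to hear in a spot check, which is why they only show up hours in.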
On the Day
- Split up on-the-day production responsibilities more to take the load off any individual person, e.g.:
- Audio mixing
- Video switching (both the camera feed through the ATEM Mini and the stream scenes through the Streamdeck)
- vMix configuration, e.g. setting up remote runners
- Take more pictures during the event - perhaps assign someone to do this.
- Assign someone to chat moderation and to monitor the overall stream from Twitch.
- Be more careful about using copyrighted game music, as one of the tracks in the break music received a copyright strike. Prefer remixes to original soundtracks.
- Do additional soundchecks to ensure that runners can hear commentators and game audio clearly.
Overall, we felt that WASD 2023 went well, and was another great opportunity to raise money for charity. A huge thanks to everyone involved, including the tech and organiser team (Owen, Keegan, Joel, Ashe and me), the additional people that helped with tech on the day (Adam and Joseph), the commentators (Keegan and Samit), the runners, and the audience. And thanks to all the societies and organisations that contributed to WASD: the University of Warwick Computing Society, Warwick Esports Centre, Warwick Students' Union and Warwick Esports.