Livestreaming App Case Study



Intro


Despite the pandemic, a couple of events were still held in recent months. Volunteers from Qualifast recently joined forces with the organizers of one of these nation-wide initiatives, and we helped them bring additional value to everyone involved.


The Challenge


The initiative involved thousands of people across Bulgaria. Orchestrating that many people was a huge endeavour which required a great deal of planning, monitoring and attention. Hats off to the organizers’ core team for this incredible achievement.


Our guys were part of a sub-initiative that was conceptualized just a few days before the event. After some analysis and a thorough legal review, it was decided that the value of the initiative would be further strengthened if the participants were given the ability to do live video streams. However, this idea got a green light just 10 days before the start of the event. With such extremely limited time left for implementation, the mobile applications supporting the stream could only have been developed by the likes of Qualifast(ers). Even the main organizers of the initiative had their doubts about this being achievable in such a short period of time. Honestly, at this point we ourselves deemed it impossible. Quoting Dimitar (our CTO): “This is comparable in complexity to beating Doom on Nightmare difficulty. No way we will make it. I love it! Let’s try!”. So, here is our story of how we ‘got good’ and completed a run on Nightmare.


The Solution


10 days before the live event we headed to the planning board. The good news was that the backend that would support the live stream had long been prepared in anticipation of the final approval, which meant the huge infrastructure struggle was already out of our way. So, our task was to create the UI design and implement iOS and Android apps that would authenticate the people who had enrolled for streaming and allow them to stream live, while respecting the various legally imposed restrictions, e.g. on timeframes and on audio and video quality.
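

As a simplified illustration of what such a restriction can look like in code, here is a minimal Kotlin sketch of a timeframe check. The window values and names are made up for the example and are not the actual rules of the event (java.time needs API 26+ or core library desugaring on Android):

import java.time.LocalTime

// Allowed streaming window; the concrete values here are illustrative only.
val streamWindowStart: LocalTime = LocalTime.of(7, 0)
val streamWindowEnd: LocalTime = LocalTime.of(20, 0)

fun isStreamingAllowed(now: LocalTime = LocalTime.now()): Boolean =
    !now.isBefore(streamWindowStart) && !now.isAfter(streamWindowEnd)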


Up until then we had never implemented live streaming, so the first thing we needed to do was understand how it works. We already knew which streaming protocol we would use, so we started researching libraries and frameworks that would help us use it. Throughout the implementation we experimented with a few, but eventually ended up using rtmp-rtsp-stream-client-java for Android and HaishinKit.swift for iOS. We generally tend to avoid third-party libraries from unfamiliar sources, but given the tight time constraints we did not have much of a choice, so we had to try and hope for the best. Luckily, those two libraries proved mostly stable and, after some reading about the streaming configuration options, we were able to get satisfactory results on both platforms.
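

For readers unfamiliar with these libraries, the sketch below shows roughly what starting and stopping a stream with rtmp-rtsp-stream-client-java looks like in Kotlin. Package names and exact signatures differ between library versions, and the camera object is normally wired to a preview view and a connection callback, so treat this as an illustration rather than the code we shipped:

import com.pedro.rtplibrary.rtmp.RtmpCamera2

// The camera is normally constructed with a preview view and a
// ConnectCheckerRtmp callback; both are omitted here for brevity.
fun startBroadcast(camera: RtmpCamera2, rtmpUrl: String) {
    // prepareVideo/prepareAudio return false if the device cannot satisfy the
    // requested encoder configuration; overloads with explicit resolution,
    // fps and bitrate are where the quality restrictions come into play.
    if (camera.prepareVideo() && camera.prepareAudio()) {
        camera.startStream(rtmpUrl)
    }
}

fun stopBroadcast(camera: RtmpCamera2) {
    if (camera.isStreaming) {
        camera.stopStream()
    }
}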


Obviously, tight schedules and unfamiliar domains were not enough for us, so we decided to make this project the first production trial of the new version of our bootstrapping tool. It is meant to help us set up new applications more efficiently, structuring the components we use the way they should be. As with every first run there were problems, but after some bug fixing we eventually got the output we needed. To be frank, in this particular project we didn’t mind the overhead, because we fixed all these issues in our tool and we knew that the next run would produce completely different results. Our vision has always been: “Better spend some time fixing the automation once, rather than doing all the work manually a dozen times.”


Of course, getting things to mostly work is one thing; sorting out all the more obscure problems and defects is another. For example, we ended up with these lines of code in our Android implementation:


// there used to be a defect related to Pixel 3a, that seems to have reemerged with latest Android version
// https://github.com/pedroSG94/rtmp-rtsp-stream-client-java/issues/381
if (Build.MODEL == "Pixel 3a") {
    localCamera.setForce(CodecUtil.Force.SOFTWARE, CodecUtil.Force.FIRST_COMPATIBLE_FOUND)
}

These lines force a software codec to be used if the device is a Pixel 3a. A very smelly solution, but over the course of this project we came to learn that media processing very often involves such device-specific tweaks. (This is true both for the libraries we used and for the native Android platform itself.) We did our release with the logic above, but after the live event day we got in touch with Pedro Sánchez, one of the creators of the Android library we used. Eventually, we concluded that in this particular case the issue was actually caused by the settings on the backend side.
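

Because more quirks of this kind tend to accumulate, one option is to collect them in a single helper instead of scattering model checks around the code. The helper below is only a sketch of that idea; the function name and model set are ours for illustration, and the import paths depend on the library version:

import android.os.Build
import com.pedro.encoder.utils.CodecUtil
import com.pedro.rtplibrary.rtmp.RtmpCamera2

// Models known to need the software video codec; Pixel 3a was the only one
// we actually hit, the set just keeps future additions in one place.
private val FORCE_SOFTWARE_CODEC_MODELS = setOf("Pixel 3a")

fun applyDeviceQuirks(camera: RtmpCamera2) {
    if (Build.MODEL in FORCE_SOFTWARE_CODEC_MODELS) {
        camera.setForce(CodecUtil.Force.SOFTWARE, CodecUtil.Force.FIRST_COMPATIBLE_FOUND)
    }
}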


The Results


From the very beginning, we knew that we actually had 6 calendar days for development, because the applications would need to pass human review in the respective mobile app stores. Time was running low, so we decided to work through the weekend in order to get the apps ready in time. Luckily (but also skilfully), we managed to submit both applications for store review on the Wednesday before the live event (which was scheduled for Sunday). Breathing heavily and steaming hot air from the overstrain, we proved to everyone, ourselves included, that given the right foundation, things normally considered impossible can be achieved.