An Interview with Phil Cluff, Product Lead for Mux.com


Hi, it’s Ben from Betterpodcast. A few weeks ago I had a chance to talk to Phil Cluff from Mux, which is a video ingest, transcoding, and delivery platform, amongst many other things. We spoke at length about his journey into video delivery. We talked about how COVID has changed events, driving them all online, how the industry has changed since that happened, and what he believes the future holds for events and video online. Please enjoy. Well, thanks very much for joining me on Betterpodcast. I appreciate it.

No problem. Thanks for inviting me to be here. Really excited.

Fantastic. Well, we’ve had a little bit of an introduction now. We’ve had a bit of, I was going to say, intercourse, uh, conversational intercourse.


I’m going to keep that in. So we’ve spoken a little bit over the period since I’ve been dealing with Mux, but for those who don’t know, tell us a little bit about yourself. How about an introduction?

An introduction? Sure, let’s try that.

Yeah, so my name’s Phil; I work for a company called Mux. We’re a San Francisco-based startup, with people in London as well. I’ve been in the video technology space for about 10 years now. My first job was at the BBC, working on BBC iPlayer. I then went and built streaming platforms at Brightcove, one of the first and biggest OVPs, and then thought, oh, I’ll have a little bit of a break, and went to work at a startup with a bunch of my friends. Some break that turned out to be.

Yeah. Well, I could do a whole show just on your experience. I mean, that is a hell of a résumé. How did you get into tech? Because it sounds like you’ve done a lot of different things. Can you give us an understanding of the course you took?

Yeah, sure. I started in tech, and it’s so weird: I never intended to end up in video technology. I often think not a lot of people really did want to end up in video technology. As a kid growing up, I was very into electronics and that sort of thing, but never really fell into computer science; it wasn’t something that was taught particularly in the schools I went to. I grew up in Northeast Lincolnshire, not exactly where you think about top-end science. But I was always around cameras and video cameras and that sort of thing, so I always had a passion for it. I kind of stumbled into my job at the BBC, honestly. When I graduated, well, I’d done a placement year.

I’d been working with a small startup in Manchester at the time called Transitive, which ended up getting bought by IBM, and there’s a lot of history there. Some people I’d worked with there ended up coming down to the BBC, which was at the time a big Perl shop. Yep, I still occasionally write Perl. But my manager from my year in industry was by then working at the BBC and said, hey, you should come work down here, and then worlds collided a bit. Like, hey, yeah, I really like this video thing, so okay, let’s do that, I guess.

So the BBC is a Perl shop, meaning iPlayer is built in Perl?

It was at the time. There’s very little Perl there now, as far as I know; there might be some. When I joined, it was, hang on, 2010-ish, and it was overwhelmingly a Perl shop. Now it’s a big Java shop; lots of Java there for the kind of central media pieces.

Tell us a little bit about Mux. I mean, Mux, I think, is a Y Combinator alum, is that correct? Absolutely. Can you tell us a little bit about Mux?

Yeah, sure. So Mux is a Series D startup at this point. We’ve been going for just coming up on six years now; I’ve been there just coming up on three. We build video for developers. So if you’re a developer who just wants to put video in your app or your service or whatever, and have it just work, that’s what we’re there to do. We’re not there to go after huge media companies that want to do video; someday we’d love to go after that stuff, but there’s a huge number of people who just want to put video in applications. Video is becoming more critical than it ever has been, especially over the last 18 months to two years. It’s a critical part of how people interact, and video is becoming like images were in the early days of the web. That, to us, is really exciting. We want to democratise video: every developer should be able to put video in their own application or service. You could say, hey, we’re like the Stripe of video. You don’t go build your whole payments gateway when you want to accept payments; you just use Stripe. For those people, we want to be the Stripe of video.

So it’s kind of positioned, for those who may know AWS, against their IVS service, which I think is just the opened-up back end of Twitch; they opened that up and call it IVS. Bettercast actually built out its very first version on IVS, purely because it was incredibly simple and relatively low cost. We have since moved to Mux, and I’m really happy to have done that, not only because of the simplicity of what you guys are doing, but because of the level of data feedback that you get from your systems. What was sort of the deciding factor in how you were going to set yourselves apart from something like IVS?

Yeah, totally. Maybe a little bit of the background of the company is super interesting here. You mentioned data being really important, and we completely agree. The first product we went and built was, and still is, a data platform: a platform to understand how well your video is performing. Like a New Relic for video, I think, was one of the very early pitches we used for it. That’s used by some of the biggest streaming platforms in the world; it’s been used on the Super Bowl for the last three years. So: understand the experience people are having when they watch video. Are they seeing buffering? How long does it take for video to start playing? Is the video low quality? All those sorts of things. People put video applications out there and they just don’t monitor them, so the first product we built was a product to enable people to monitor their video applications. But that now informs everything we build in terms of the live streaming platform and the on-demand streaming platform: we use that data to make decisions, we use that data to educate how we build our adaptive bitrate ladders, all those pieces. Data is critical to that feedback loop of building better video. That’s kind of the motto of the company.

It’s a good motto. So, a lot of people, and when I say a lot of people, there are a large number of new entrants into live streaming, especially live streaming for client work in the AV industry. They’re coming off traditional AV, cameras and projectors, and moving to live streaming, and for a number of those people the comprehension is: it gets to my encoder, and then, as we said before, magic happens, and delivery. So Mux is the magic, but can you give us a layman’s understanding of how that cloud delivery and transcoding, the actual process, works?

It is amazing; people kind of think it’s really easy, right? You just send it off, encode it, and give it to people, right?

Easy, I guess.

Yeah. We kind of think about it in three phases. A stream comes into us; for live, that means an RTMP stream. RTMP is a very old protocol from the Flash video days that has stuck around. There are a lot of competitors for it these days, but none have really gained the critical mass to replace it. So you’re sending up one stream, one high-quality copy of your content, from your encoder that’s on-site, and that’s received by our video processing stack. The first thing we do is transcode that: we take that inbound stream and create a bunch of different qualities of that content, anywhere between a couple and five-plus different copies for different qualities.

So anything from, say, half a megabit up to five or six megabits, somewhere around there, at different qualities and vertical resolutions. If you play with the little cogwheel in the corner of YouTube, you’ll see 720p and 1080p and 360p, all those different qualities. We do that because people’s internet connections change all the time. You can’t just measure at the start of a playback session and say, okay, you’ve got enough bandwidth to do a 1080p stream; all sorts of things happen on the internet all the time. Whether that’s a Dropbox sync kicking off on your machine, or a Windows update, or the kid upstairs who’s decided to start streaming to YouTube so you haven’t even got enough bandwidth for the two of you. Through to my favourite one: I genuinely used to live somewhere where the wifi router sat on top of the microwave, so every time someone switched on the microwave, the wifi would just go. It took me ages to work out what was going on; it was people making porridge.
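The rendition-switching idea Phil is describing can be sketched in a few lines of Python. The ladder below is illustrative, not Mux’s actual encoding ladder, and the selection rule is a simplified stand-in for what real adaptive-bitrate players do:

```python
# Illustrative ABR ladder: these bitrate/resolution pairs are made up
# for the example, not Mux's real encoding ladder.
LADDER = [
    (500_000, "360p"),
    (1_200_000, "480p"),
    (2_500_000, "720p"),
    (5_000_000, "1080p"),
]

def pick_rendition(measured_bps, headroom=0.8):
    """Pick the highest rendition that fits within a fraction of the
    currently measured bandwidth. Because connections change constantly,
    players re-run a choice like this as they fetch each chunk."""
    usable = measured_bps * headroom
    best = LADDER[0]  # never go below the lowest rung
    for bitrate, label in LADDER:
        if bitrate <= usable:
            best = (bitrate, label)
    return best

print(pick_rendition(4_000_000))  # plenty of bandwidth -> the 720p rung
print(pick_rendition(600_000))    # constrained -> the lowest rung
```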

So we create all these different qualities; some people call them renditions, or whatever you want to call them. Then the next step is packaging: we put those into segments, or chunks, as people call them. Those chunks are anywhere between two and ten seconds, and they’re a regular cadence, so if you start at two seconds, it’ll always be two seconds; if it’s six seconds, it’s always six seconds. Then we host those and deliver them through content delivery networks, which is a very common strategy. This is how we ensure reach and quality across a large footprint of users: we basically outsource it, right? We don’t want to go build a massive network of tens of thousands of machines in hundreds of geographies around the world; we pay content delivery networks to do that for us. So every request made to Mux goes through one or more content delivery networks, which serve those two-to-six-second chunks of video, and you feed those into your player. The player then pretty much just looks at the chunks, downloads them, gets the video out, and displays it for you. So there’s a lot of steps.
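As a rough sketch of the packaging step, here is how a fixed-cadence chunk list might be generated. The URL shape and chunk naming are hypothetical, since every platform names its chunks its own way:

```python
import math

def chunk_urls(duration_s, chunk_s=2, base="https://cdn.example.com/stream"):
    """Split a stream of duration_s seconds into fixed-cadence chunks and
    return the (hypothetical) CDN URLs a player would request in order."""
    count = math.ceil(duration_s / chunk_s)  # last chunk may be partial
    return [f"{base}/chunk_{i:05d}.ts" for i in range(count)]

urls = chunk_urls(10, chunk_s=2)
print(len(urls))   # a 10-second stream at a 2-second cadence -> 5 chunks
print(urls[0])     # https://cdn.example.com/stream/chunk_00000.ts
```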

Yeah, just a couple. So there are two questions that come up. The first one: I want to talk about RTMP and how legacy that is. But I also want to ask you about latency. You said that the video is chunked into between two and six seconds. Does that mean that when the browser is downloading each chunk, that’s essentially my latency time? Like, as I’m downloading, I’m two seconds behind. Is that how it works?

Yeah, great question. So yeah, general chunk size is between two and six seconds, and generally a player is going to have a few of those in buffer at any particular time. iOS will generally have three in buffer at any one time, so if your chunk size is two seconds, iOS natively, just in the player, adds six seconds on top of that. That’s the minimum buffer you’re going to expect to have. That’s tunable in a lot of players, but not on iOS; iOS is just a great example where we know what the tuning is by default. So if you have a bigger chunk size, say five seconds, and iOS keeps three segments, that’s going to be 15 seconds of latency just in that last little bit of delivery, before you even think about getting a signal in, encoding it, making sure it’s available, those sorts of things. So yeah, that does play a huge part in the latency. And we can really go into the latency story as well if you want to; I’m happy to talk about what we’re doing there and what the industry is doing to improve it.
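The buffer arithmetic here is simple enough to sketch. The three-chunk figure is the iOS default Phil mentions; other players tune it differently:

```python
def player_buffer_latency(chunk_s, buffered_chunks=3):
    """Minimum latency contributed by the player's buffer alone:
    chunk duration times the number of chunks held in buffer.
    Ingest and encode delays come on top of this."""
    return chunk_s * buffered_chunks

print(player_buffer_latency(2))  # 2s chunks, 3 buffered -> 6 seconds
print(player_buffer_latency(5))  # 5s chunks, 3 buffered -> 15 seconds
```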

I’d love to know a little bit more. This isn’t going out for a couple of weeks, so maybe that fits in with, I’m assuming, some development plans and a release schedule you have.

Yes. There is something coming in about a week that will move Mux’s plans on for that.

Can you tell me?

I can talk about it if you promise not to release this within seven days. Okay. So, in terms of improving latency: a lot of latency is, as I just described, very much built into the protocol. If you’re using two-second segments, you’re going to naturally build in about six seconds of latency there. For nearly two years now (I realised it was two years to the day when I was writing a blog post recently), we’ve been talking about low-latency HLS. HLS is one of the protocols generally used to deliver video; it’s Apple’s protocol for delivering video, but it’s also pretty universal. 95% of devices can play HLS in some way or another, whether that’s via a web player or a native player or something else; it doesn’t really matter.

So we’ve been talking for two years now about low-latency HLS. This was something that the author of the HLS spec, Roger Pantos, started working on about two years ago. It’s been a super interesting journey, these last two years of low-latency HLS, because the community had proposed a low-latency extension to HLS. A little bit of history here: it was actually based on the Twitch approach for low latency. Twitch have their own proprietary low-latency HLS, which is just an extension on top of the spec, and John Bartos, who was working at Twitch at the time, proposed that as, hey, here’s a community standard for doing low-latency HLS. So it was very close to what Twitch were doing at the time. And then Apple said, hey, no, we’re going to go do something completely different. Of course. Classic Apple.

So Twitch’s proposal was around using chunked transfer. HTTP/1.1 has a chunked transfer mode, where you can send something back without a Content-Length as it’s being produced, and then a custom player deals with that. So you advertise a segment just as it’s beginning to be created, the player requests it just as it’s beginning to be created, and then it takes you exactly two seconds to download two seconds of video. That enables you to get latency pretty low, but it’s very hard to estimate bandwidth reliably in a low-latency mode built on top of chunked transfer. Apple’s initial proposal was really radically different: it used HTTP/2 push to push segments from the server side. So the segments get pushed from the server to the client.
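The bandwidth-estimation problem Phil mentions with chunked transfer comes down to simple arithmetic: players estimate bandwidth as bytes over download time, but when a live segment trickles in at the encoder’s pace, the download time reflects the media’s duration rather than the link speed. A toy illustration (the segment sizes and link speeds are made up):

```python
def estimated_bandwidth_bps(segment_bytes, download_s):
    """The naive bandwidth estimate a player makes: bits received
    divided by how long the download took."""
    return segment_bytes * 8 / download_s

# Classic HLS: a complete 2-second, 1 Mbps segment (250 kB) fetched over
# a 10 Mbps link downloads in ~0.2s, so the estimate reflects the link.
print(estimated_bandwidth_bps(250_000, 0.2))  # 10000000.0 bits/s

# Chunked transfer of a segment still being produced: the same 2 seconds
# of video takes ~2 seconds to arrive, so the estimate collapses to the
# media bitrate and tells you nothing about the link's real capacity.
print(estimated_bandwidth_bps(250_000, 2.0))  # 1000000.0 bits/s
```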

That hit a lot of resistance in the video developer community, because the availability of HTTP/2 push on most CDNs is very poor. And that was actually really validated recently, because Google are getting rid of HTTP/2 push from Chrome. It’s staying in the protocol, obviously, but push is disappearing from Chrome because it’s virtually unused; I think a tiny fraction of a percent of sites actually used it. So a lot of the community went back to Apple. We actually did two workshops, just the community in general with Apple, onsite in Cupertino, which was a fun experience. You went yourself? Yup, yup, went to San Francisco for it. We actually did it back to back with Demuxed, which was great. We’ll talk a bit more about Demuxed later?

Yes, we will. But yeah, so we did that one back to back with Demuxed, and we did the other one back to back with Mile High Video, which is another video conference. So we gave them a lot of feedback about what worked for us and a lot of problems we were seeing, and so did lots of other people in the industry, and Apple ended up making some significant changes to that specification. It’s now finally getting to a place where it’s getting towards usable. We’re going to be, hmm, let me think about how I’m going to phrase this. So this is going out in two weeks. Okay.

Okay. I’m not very fastidious with my editing, so yeah.

Perfect, that’s good to hear. So: we just announced our low-latency implementation, which is based straight on top of Apple’s low-latency HLS. That’s available to everyone now. There are some caveats. The player ecosystem still isn’t quite there yet; we have changes going into open-source players, and we’re pushing in the community to help improve that stability for web players. But the big one is that we’re also waiting on Apple a little bit. While low-latency HLS does work on iOS 14, iOS 15 has a huge improvement in the low-latency HLS experience. So it’s there, you can try it, you can use it; player support is still not perfect, but we’re pushing hard in the community to improve it. We really want this standard.

So what players are supported on release?

A lot of them. The support is

there, it’s just not quite perfect yet. And we’re hoping, by bringing this thing out and putting it in front of a lot of people, we can say, hey, let’s work as a community and improve the player support, to hopefully get that to an even better place than it is, right?

Yeah. Well, I’m glad that Video.js is supported, because that’s what we use at Bettercast. Currently, I know that when building the live stream you can specify a reduced-latency or low-latency stream, but you lose your reconnect window, the 25 seconds or 60 seconds. Yeah, absolutely. So how’s that going to go? How are the reconnect window and latency going to work now?

Yeah. So let me start by talking a bit about latency, what we’re seeing, what we’re expecting, and where we think it’s going to get to. The current implementation of low-latency HLS on Mux will get you somewhere in the region of four to seven seconds.

That’s exciting. That’s good. Four to seven seconds is very good.

Yeah. There’s caveats; there always will be caveats on this stuff. It depends on geography pretty heavily right now, and we’re working hard to improve that. Mainland US and Western Europe should be pretty good; beyond those geographies, as you get further from us, it starts to get a bit more questionable. We’re working on that actively. We also know there are still some latency improvements we can do, and still some player tuning to do, for sure. But what I’d say is that the protocol as it stands, low-latency HLS, best case, is going to get down to two and a half

seconds, I’d say.

It’s not going to get much below that. I would be very impressed if the protocol can get below two and a half, maybe even three.

Yeah, but two and a half seconds is fantastic. In most of these sorts of conference worlds, 45 seconds is acceptable, 25 seconds, which I think you’re currently at, is good, and five seconds is exceptional, you know what I mean? So do you have a rough idea on Australia and New Zealand latencies? Is it a couple of seconds more, so instead of five it might be six or seven, or is it still going to be 15?

We don’t have a good answer on that, honestly; I’m looking forward to gathering more data on it, obviously. I would expect, in a lot of cases, players to fail off of the low-latency mode as latency increases; that’s one of the things we’re looking at in our open-source contributions. Apple’s player does fail back: if it encounters too much buffering on a low-latency stream, it will fail you back to the normal latency experience. And that’s also one of the big things about the protocol as it stands: if your player can’t speak low latency, you just get normal latency, which will be the kind of seven-to-15-second territory. They’re two-second segments, so it’s going to be six seconds at least, at a fundamental level.

So that’s what we’re expecting. Getting this out and getting it into more people’s hands is an important step in progressing the ecosystem as well. You did also mention reconnects; I did want to mention that. Yes, please. Today it has the same caveats as our reduced latency: at present, with low latency, you do lose the ability to use reconnects. That is a priority for us; we do want to get reconnects into it. It might look a little bit different, though. Slates are one of the things we’re looking at doing there: right now our reconnect behaviour is to hold people in a rebuffering state, which isn’t necessarily great. So we’re looking at, for example, should we give you the option to tell us what slate you want us to show if you hit a reconnect, you know, that classic Simpsons technical-difficulties card. Maybe that’d be a better solution there.

So with the latency change, do developers have to do anything different? Are there additional steps they have to take now that you’re introducing the low latency, or not? No, not really.

It’s a flag: you flag your stream as low latency when you create it, and that’s it. You use a player that supports low latency, which is an easy thing, and that’s getting better as we push on the ecosystem as well.
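As a sketch of what that flag might look like, this builds the request body for creating a low-latency live stream against Mux’s live stream endpoint. The `latency_mode` field name is an assumption on the editor’s part, not confirmed by the interview; check Mux’s current API reference for the exact flag:

```python
import json

# Hypothetical request body for POST https://api.mux.com/video/v1/live-streams
# (authenticated with a Mux access token ID/secret via HTTP basic auth).
# The "latency_mode" flag name is an assumption, not taken from the interview.
payload = {
    "playback_policy": ["public"],
    "new_asset_settings": {"playback_policy": ["public"]},
    "latency_mode": "low",
}

body = json.dumps(payload, indent=2)
print(body)
```

Everything else (the player, the playback URL) stays the same; the stream is simply created with the low-latency flag set.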

Fantastic. So I want to go back to the very start, which is RTMP and RTMPS. As you stated, it’s very old tech; I didn’t quite know how old. I’ve been talking to some people, and hopefully I’ve got a guy coming on the show in a number of weeks; they’re pioneering NDI and NDI cameras, the whole NDI technology, which was explained to me. I don’t know if you know much about it, but if you do, I’d love some more insight. The whole NDI approach is that instead of packaging video, it’s packaging data and posting the data out. Do you think there are viable alternatives to RTMP that are faster, better quality, higher data rates? Is NDI a good option for that, or what do you think?

That world just changes really rapidly; kind of every time I have a conversation, it’s changing. There’s a couple of big protocols: there’s SRT and RIST. I’m not even going to remember what the acronyms stand for; SRT, to me, is a text format for subtitles, right? So SRT and RIST are both error-corrected transport streams for the public internet. They go about how they do retransmission and NACKs a little bit differently from each other, but fundamentally they’re the same approach. One of them, SRT, came out of a commercial product, with the specification then opened up from that product; RIST is an open standard from the start. They have different characteristics, but they’re approaching the same problem: on a lossy network, with a good amount of headroom, they’ll still get a very good video signal out. Right now they’re a lot harder to configure and understand than RTMP; RTMP is really simple for people to use.

And for us, one of the big challenges with both those technologies is that they can’t do multiplexing in the same way we do with RTMP. It’s kind of a double-edged sword. RIST and SRT are both UDP-based, which has headaches; RTMP is TCP-based, which has a whole different set of headaches. That’s one of the fundamental problems with RTMP: because it’s TCP, it’s connection-oriented, so if you drop the connection, well, you’ve got to reconstruct it. So yes, SRT and RIST are both kind of UDP fire-and-forget, with NACKs going back over the control channel. But those technologies are much harder to configure and much harder for users to understand as well. You need to talk very differently about how you size a network for an SRT or RIST stream, because you size for, say, half of your bandwidth; you want to have that second half of your bandwidth available for retransmission, for example.

The amount of extra headroom you have on bandwidth directly translates to how much packet loss you can deal with. And on SRT in particular, you specify the latency that you’re going to build in for the protocol, which gives you the window for how long you can retransmit things, how much buffer there is for retransmission. Both of those, when used well, are great protocols. NDI I’m less familiar with as a contribution protocol, honestly; to me, NDI has always kind of been a be-in-the-same-room-with-the-kit technology. Yeah, I would love to read more into that. But the big challenge is that multiplexing component: it’s really important to us that with RTMP we can give all our customers the same RTMP endpoint, stick it behind a TCP load balancer, and it kind of just works, right?
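The sizing rule Phil gives, leaving half the link free for retransmission, and the SRT latency window can be sketched like this. The 50% figure is his rule of thumb, not a spec requirement:

```python
def max_contribution_bitrate(link_bps, retransmit_headroom=0.5):
    """Cap the stream bitrate so a fraction of the link stays free
    for retransmitted packets (rule of thumb: half the link)."""
    return link_bps * (1 - retransmit_headroom)

def retransmit_buffer_bytes(configured_latency_s, bitrate_bps):
    """SRT's configured latency sets how long a lost packet can still be
    re-requested; that window implies roughly this much buffered data."""
    return configured_latency_s * bitrate_bps / 8

print(max_contribution_bitrate(10_000_000))     # 10 Mbps link -> 5 Mbps stream
print(retransmit_buffer_bytes(1.0, 5_000_000))  # 1s latency at 5 Mbps -> 625000.0 bytes
```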

You’ve got a stateful connection. That’s much harder, and less complete and less understood, in the SRT and RIST world. I also want to mention a protocol called WHIP, which is the WebRTC ingest specification. The idea here is to use WebRTC as a replacement for RTMP. That’s pretty exciting as well; a guy called Ryan Jespersen and a bunch of people at Millicast worked on it. It’s pretty cool; there’s a fork of OBS with compatibility for it.

What’s the data rate of that? Like, whereabouts does it sit; can it handle 4K, or not really?

I mean, it’s complicated. You can kind of send whatever you want; it’s about how you tune it more than anything, how much buffer you put in. Like, we’re on a WebRTC connection now, for those who can’t see us; we’re communicating over WebRTC right now. And the protocol as it stands today is very much built to make sacrifices in quality to maintain latency. WHIP could be more tunable, like: okay, because it’s an ingest protocol, I’m actually expecting slightly more latency, but that gives me higher retransmit windows, and so I can hopefully make sure that more of the video picture gets through. But yeah, I’m excited about that as well. And then there are also all sorts of people doing proprietary things over QUIC; a lot of people are experimenting in that area at the moment. Facebook and Instagram are doing lots of stuff there; they talked about that last year at Demuxed. So that stuff is also really exciting. I think we’re suddenly going to be in a world of different contribution protocols in the next couple of years.

Fantastic. Lots of confusing acronyms in there. So let’s talk a little more high-level, if we can, for a moment. What are the basic differences between AWS and Mux? Why should I choose Mux?

Great question. I’d tie it down to a few things. It’s worth remembering that there’s a bunch of ways you can build video on AWS. Everything from, hey, I’m going to spin up my own box with FFmpeg and nginx-rtmp, use S3, build it yourself, and a lot of people still do that; through to the AWS Media Services division, a lot of which was acquired from Elemental a few years ago, up in Portland; through to IVS, the new live streaming service, which again came via Twitch. We’re very different from that Media Services group: it’s a large, media-oriented, bolt-together thing. You’re picking an encoder, you use that; pick a packager, you use that; maybe tie on the live streaming, use the storage, that sort of thing.

Compared to that, we’re very plug-and-play: the video just works. Compared to IVS, we see ourselves as providing much more of an ecosystem. Any live stream you do on Mux is instantly available as an on-demand asset, for example; it doesn’t get archived to S3 where you have to download it, process it, and deal with it. It’s all just in that ecosystem. But we’re also giving you, if you use Mux Video, Mux Data for free; the same tool that people use to monitor the Super Bowl, you can use to monitor your event streams. And that’s really important. We’ve also worked very hard on giving feedback around the quality of a connection to us, an RTMP connection specifically.

Like, is it coming in at a good quality, and being able to dig in and take a look at those sorts of things. And we’ve built it focusing very aggressively on the developer experience: the API experience, the documentation, and thinking about the flows people use. Yes, your live stream becomes an on-demand asset immediately afterwards, and you can also use us to host your on-demand material with the exact same API. So that’s how we differ, really: a much more complete ecosystem, and a great developer experience alongside that data.

All of the reasons you’ve just listed are why we moved across.

Is there anything I missed?

Well, no, not really. The biggest thing for us on IVS was, as you said, the on-demand: needing an instant, fully transcoded on-demand file. With you, we get it within seconds of stopping the stream. On IVS, like you said, you have to essentially post everything into S3, then run it through a transcoder, which can take an hour or more for an eight-hour stream, and then so much extra work. With you guys, it’s like: there’s the asset ID. It’s fantastic. So everything you’ve just said is exactly why. Personally, I would like to see a little bit more API on the stream data, but I have a specific use case, and I probably won’t get that

For a while. Stream health via API is absolutely on the roadmap, happy to say.

Yes! Push it up in priority and give it to me earlier. You heard it here, folks. I mean, you started out saying that video is being put into practically everything these days. Being on your side of the fence, as the actual technology provider for this, how are you seeing this change playing out in the industry? Is it just an afterthought — "oh, add video now" — or are people really focusing on video?

Great question. I'm seeing both, honestly. It often depends on the vertical we're talking about. A lot of people are retrofitting video as a value-add, that sort of thing. But a lot of people are becoming video native. Like, the transformation — you'll appreciate this, right — in the events industry in the last 12 to 18 months has been crazy. It's completely changed. People are building video-native experiences for events, and it's not a hybrid or an afterthought now; the live event is the primary experience. A great example of that is Hopin, right. I talked to Johnny 18 months ago and he was, you know, two people in his bedroom in London. Um, I —

Can we just take a moment? Hopin, in 18 months, has gone from two people in a bedroom to raising over a billion dollars. It is insane. They're taking all the funding, but it is insane how fast they've grown. It's phenomenal.

Absolutely. Sorry,

Derailed.

Yeah, go on. You know, that business has gone from, yeah, two people in a bedroom to, you know, whatever it's worth now — seven, eight, nine billion dollars —

To many billions,

Many, many billions. And, you know, that's an example of something that was built video first. That wasn't a hybrid events company; that is, right now, a live stream events company. And I think that's very indicative. But taking a few other examples — we do a lot of video with, like, Robinhood. They have video in the newsfeed, and we provide all of that video at Mux, which is super cool. But that's more of a "hey, I just want to add video into my app" story. And that's the sort of story we see a lot: people going, hey, video is really engaging — it's the thing that keeps people in an app, engaged with what people are building and shipping.

So we absolutely see people going that way. And I think the story of 2018, 2019, 2020, 2021 is — you know, 10 years ago we kind of had Netflix, right, and YouTube, I guess. So much of that is now unbundled. We have more streaming services than you can shake a stick at. I've lost count of how many I subscribe to — someone mentioned it the other day and I genuinely had to go and count them, and I think I still missed some. So we had that unbundling of Netflix, and to a certain extent it has now happened with live as well, right — a lot of stuff has come unbundled from the major pieces. The next unbundling is going to be real-time. It's going to be soon. My friend texted me the other day — I've told this story before, and I'm going to tell it again. He's like, "oh, I went to the opera." "Oh, great, how was it?" "Eh, it was okay. It was on Zoom." Like — you watched the opera on Zoom?

Hey, that's a really terrible platform for that, but okay. But yeah, where we were 10 years ago with live streaming and on-demand video is kind of where we are now with real-time video. Real-time video is becoming mainstream, right. You can go and buy WebRTC service platforms and start building real-time applications, and that's starting to produce this unbundling experience. People are like, you know, on a Friday night after dinner: "let's have a beer on Zoom together" — "oh, I spent all day in Zoom, it's Friday. Isn't there some sort of platform that's oriented around just hanging out?" And then you get into products like Gather Town, which is great fun if you haven't used it. You have these avatars, you go hang out, and when you go near people you get a WebRTC connection to them —

— and do real-time video. Really good fun. I really like the guys over there. I went to their virtual office once — they have an office in their own product, which I thought was great. You get there and there's someone sitting behind the front desk you can chat to, and then you go off to a conference room. I was like, okay, you definitely dogfood your product; I appreciate that. Um, I think that unbundling of real-time video services is getting there, and it's crazy — I've been in the real-time video space a lot in the last six months, and the difference between where streaming video is and where real-time video is right now is just a world apart. WebRTC only had V1 ratified this year, which is crazy to think about given how long it's been around. And there's still a world of improvements that can be done there to make that protocol more flexible. So I think that's where the next major growth sector is: doing real-time video as well. Okay, so, a second —

Um, will the next Demuxed be —

Real-time video, or is that going to be streaming? Nice segue. So, Demuxed 2021. Uh, first, for those who don't know what Demuxed is, we should probably start there.

You run an event, apparently, called Demuxed. Can you tell me a little bit about it?

I do. Hey, this is a video podcast — can I show it? I actually have some Demuxed merch conveniently to hand. So, Demuxed is the conference for engineers working in video. This is the seventh year — five was 2019, six was 2020, so seven this year. It started as a terrible drunken idea in a pub, like they all kind of do. I was living in San Francisco at the time, and we'd been doing the San Francisco Video Tech meetups for probably a couple of years at that point, at least, and the conversation kind of went: hey, why don't we get together and do a full day of talks? Yeah, that sounds like a good idea, right?

So we did that the first time around in 2015 — it must have been, yeah, 2015. The websites are still there if you're really into it: 2015.demuxed.com, 2016.demuxed.com, that sort of thing. So that was October 2015. We got together in what at the time was the Crunchyroll event space, and I think it was a couple hundred of us — pretty great. In 2019 we were 800 people in a huge warehouse in South San Francisco. 2020, well over a thousand — 1,100 people — streaming online, with very good uptake and a very good retention rate. 2021, we were hoping to be hybrid this year, but it doesn't happen — things change, in 2021. And, you know, there are talks there from Netflix, from Twitch, from YouTube, from Facebook — from the biggest video tech companies in the world. And it really is video technology engineers, low-level video tech, just talking technically with other video engineers. There's no sales, there's no pay-to-speak, there's no nothing. It is all about the video tech. Wow. That's an

Absolute nerd fest. I love it. It really is.

Um, yeah. And we love it — it really is, as you say. But it's a passion project at this point. We do it alongside our day jobs, and it takes an awful lot of time and work to do, but it is so important, and we love the community that comes out of it as well. So, incredibly important to us.

So maybe you can tell me — I mean, you're doing this alongside your day job. What are some of the key learnings of actually producing these types of events? As a video company, you're putting them online — they're hybrid, or at least virtual, or they're online. So what have you taken away?

So, for the longest time — before 2020, let's say — we really wanted to get people in the room. We think it's very important to have people in a room. And we always ran a single-track conference as well; we don't have multiple rooms. We have spaces people can go to, and they can actually watch the live stream in the breakout room, but it's just the main stream. So up until 2019, that was just how we did it. In 2020, things changed. We were fairly late to cancel, and the live stream historically had just been on Twitch, honestly — we would just stream on Twitch, no payment gateway, no nothing. We wished for you to be there in person, but if you couldn't be there — hey, it's all on Twitch.

It's all live there; just go watch it there. And the team at Twitch were a huge part of that as well — they would come and do the videography for us, do the OBS setup for us and everything, and they would run the stream for us on the day. That was great, but obviously coming into 2020, things changed. So in 2020 we built our own platform, for lack of a better description. We were very lucky — on the day-job Mux side, we'd actually just hired someone who had a bunch of Twitch streaming experience, and with relatively big events there as well, in the fighting game space, which is pretty cool. He joined a year ago this week, and during his first week we were like, hey, how do you fancy this?

He's a big vMix guy — vMix being a very, very impressive tool suite for building live streams — so he actually did a huge amount of work there on the actual live piece. We had to think about format a little bit. What's important to us is that we want it to be well-produced, but also feel human, so we had to find a hybrid there. The way we do it is: talks are pre-recorded. We pick you a couple of months out, and then we say, okay, please pre-record — here's some information on how to record yourself. And we do have a very specific way. We say: don't do any post-processing on it. Literally just get a decent camera, or the best you can, put a light on, yeah —

— don't sit with your back to the window, like Matt did last year, of all things. It didn't even have a blind; it was ridiculous. Lovely. And then either record yourself in a side-by-side view — just the two streams put together, your screen-share and your headshot — or get two recordings, a screen recording and your camera, and just clap at the start of it so we get a sync point. And then we do the editing. We have a great production editor who we pay — a good friend of Matt's from Atlanta, Georgia — and what he does, he's amazing: he'll edit out all the mistakes and make you look amazing, all sorts of things. I mean, when I do the Demuxed podcast, the raw recording is about twice as long as the content you get out of it, with me and the guys flubbing whatever we say.

So yeah, we encourage people: if you fumble, just say "sorry, I'll start that again," and do it again — or "do that one slide again." We don't do anything super advanced there; we do it as jump-cut edits. There's no point going in and being really fancy and pretending it was perfect the first time — there's no benefit there. So we do that, and we produce two versions of that content: one version for the live stream and one for the on-demand archive. The on-demand archive is fully produced, with lower thirds, animated backgrounds and that sort of thing. The version for the live stream is actually very simple — it just has the two video cutouts and the borders around the video.

And that's so we can do the lower thirds live, so we can do all that sort of thing as live content. So on the day, me and Matt do live emceeing on the event — talking to each other like we are now — introduce the event with the opening stuff, then play the talk, and then invite the speaker to come in live for the Q&A. So if a talk's 10 minutes, say, they get five minutes of live Q&A at the end of it. That's great, because it means the talks always run to time, they're always well-produced, the audio is always good — we do whatever post-processing we need, noise reduction on the audio and so on. And if that person's got bad internet on the day, it's only their Q&A — it's not the end of the world. It's more important that the talks are there and well-produced. So we think that's the best of both worlds: the person is there live to answer Q&A in real time, but the talk is still well-produced and well put together. Um, yeah.

That's actually a good point you're making: get as much pre-recorded as possible, because on a live event, or a fully virtual event, you can just stream the pre-record. It still goes out live, but you can control that quality. Using a tool like we're using right now, Riverside, we're just chatting over WebRTC at whatever quality, but the upload is going to be 4K — it's all going to be the best quality it can be, so you're not worried, and you can do post-processing. So I wonder if that's maybe a best practice for technicians: get as much pre-recorded as possible, and then just bring people in live after the thing. Really, yeah.

Interesting. One of the other big pieces for us is that it enables us to have a great accessibility story, which is so important to us right now. By having those pre-records ready a few days early, and edited, we can also give them to our captioners, and our captioners produce perfectly aligned, 99.999% accurate captions for that content. And then on the day, we work with Red Bee Media in London — you know, they do the captions for the BBC. We don't just use them because I used to work in the same office —

— they're just really good at what they do. They provide a great service. So what they do is, during the day, they listen in on our stream, and for the live pieces they'll do live captioning — stenography — over our talking. That will be a few seconds delayed, and there'll be some inaccuracies in it. But when a talk starts, they play in that pre-produced caption file, so the alignment will be perfect, the wording will be perfect. So we can really improve the experience from an accessibility standpoint there. That's huge as well.
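The "perfectly aligned" pre-produced caption files Phil describes are typically sidecar files such as WebVTT, where each cue carries explicit timestamps against the edited recording. As an illustrative sketch (the function names here are my own, not Demuxed's actual tooling), this is how such cues can be generated so the alignment survives into playback:

```python
def vtt_timestamp(seconds: float) -> str:
    """Format a time in seconds as a WebVTT timestamp (HH:MM:SS.mmm)."""
    ms = round(seconds * 1000)
    h, rem = divmod(ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02d}:{m:02d}:{s:02d}.{ms:03d}"

def build_webvtt(cues: list) -> str:
    """Render (start_seconds, end_seconds, text) cues as a WebVTT document."""
    lines = ["WEBVTT", ""]
    for start, end, text in cues:
        lines.append(f"{vtt_timestamp(start)} --> {vtt_timestamp(end)}")
        lines.append(text)
        lines.append("")  # blank line terminates each cue
    return "\n".join(lines)

# Hypothetical cues, timed against the pre-recorded edit of a talk.
doc = build_webvtt([
    (1.0, 3.5, "Welcome to Demuxed!"),
    (3.5, 6.0, "Captions aligned to the pre-recorded edit."),
])
```

Because the cues are authored against the final edit rather than transcribed live, playing the file in sync with the talk gives the frame-accurate alignment described above, with live stenography only needed for the unscripted Q&A.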

Yeah. Actually at Bettercast we're in talks — we're going to be partnering with a company called Rev, and we're going to start offering full captions as default: AI captions free, and human captioning as a paid upgrade, for that level of accessibility after the event. And I've also had on the show a fellow, Tim, from Talent Entertainment, talking about stenography — the technology of live captioning — and how that can be included into platforms. Some players and some services don't support injection of live captioning on the server, which is a shame, but I'm sure it's one of those things that just has to come about with a little more time. But accessibility is one of the key and foundational things that going online and hybrid gives you: it makes this accessible. If you have any physical disability and you can't make it — or not necessarily that, maybe you're just not going — doing it online opens up this information that is usually cloistered in a large warehouse, you know? So accessibility is really exciting as a function of it, for me anyway.

No, absolutely. Couldn't agree more.

So, in a changing world, what do you see as the future of online events and online video? And I want to preface this: there are some platforms trying to do VR and, you know, all of this immersive stuff, which I'm not a big fan of — I think it's very gimmicky. But what do you say? You are right in the middle of it. What do you see as the future — 10 years out, even?

10 years? Wow — brains in jars, then we can have the VR experience. No, no, I kind of agree with you. I think the AR/VR stuff is very immature where it is now. That's not to say it won't get somewhere interesting, but all the experiences I've tried make fairly big sacrifices to get to that AR/VR experience, so I'm not super excited about it. I think the writing's on the wall that the future is hybrid. We have this policy at work, which is "remote equal" — no matter where you are, you're equal with the people who are located in America, or San Francisco specifically. And I think trying to get to that remote-equal experience for online events is going to be critical over the coming years.

As much as I'd like to sit here and say, hey, COVID, it's gone, it's over — it isn't, right? We're going to be stuck with this thing, especially globally, for a good while yet. And I think it's changed how people think about travel as well; people are going to be doing fundamentally less traveling, trying to string together different events in different time zones. So yeah, I think the AR/VR stuff is a bit gimmicky. I think reducing the latency and increasing the interactivity is going to be critical over the short term — getting people's reactions and chat and that sort of thing in as quickly as possible is very important.

And there are people doing super interesting stuff there, using WebRTC for components to really bring the latency down. For example, I was doing a webinar the other day and they had this cool experience where, if someone had a question, they could ask it themselves — "hey, I'm going to unmute this person from the audience, now you can ask." Doing that with larger and larger audiences is really exciting, without it being a jarring experience with changing latencies, because they might have to transition from one protocol to another, those sorts of things. I'm very bullish about seeing that experience change. And — ooh, we talked about WebRTC —

— a bit. WebRTC today has to make quite a lot of sacrifices around perceptual quality to get to a good latency. I think that'll change over the next five years. We'll see more ability to tune WebRTC at the receiver end — to be able to say, hey, actually I'm fine with 500 milliseconds, or, you know, a second of latency, but I still want to be on WebRTC. Because a second of latency at a global scale is actually really, really good. But the technology has to evolve for that, from the receiver perspective, and even the encoder and SFU perspective. And beyond that, there has to be a fundamental change in the cost model for these super-low-latency, ultra-low-latency —

— or, as some people refer to them, sub-second technologies, for them to become financially accessible to people. Those technologies tend to be an order of magnitude more expensive than traditional streaming technologies like HLS and DASH and those sorts of things. So when we get to a point where we can truly do WebRTC on the CDN, things are going to start to change big time — we're going to get ultra-low latency, or whatever you want to call it, at scale for events. So I think that'll be exciting. But really, that interactivity model is going to become really important. Because one of the big things we always loved with Demuxed was the hallway experience, and nobody's been able to recreate that, in my opinion, in the same way.

The hallway experience: I'd wander down the hallway and be like — there's Derek from Vimeo, there's Casey from Netflix — I can chat to those people, you bump into those people, and it's unstructured socializing. A lot of platforms have tried to replicate this, whether it's, just as an example, a chat-roulette-style experience where you're paired with someone, two minutes, off we go. Yep, probably someone I hate. That's fine — it's only two minutes.

I think about how you can do almost discoverable groups. If I see a couple of my friends over there in the hallway between talks, I'll go hang out with them — and that might be three really influential people from three different companies. Have that experience. But also, the hallway has always included sponsorship experiences, and it's really hard in a lot of cases to give really cool, really meaningful sponsorship experiences to the companies who want to sponsor events. I've seen lots of implementations with demo booths that just sit empty most of the day. Whereas in a true hallway at a conference there are always people wandering around, always people looking at freebies, and you kind of have this weird obligation: well, I'll watch your demo, but I really just want that power bank. That'd be nice. Yeah, exactly.

Yeah. I’ve done that myself. I’m not going to lie.

That's where my USB sticks come from. But yeah, I think nailing those two things is really going to change online experiences as well. Because it's not remote-equal — those things don't work as well in a virtual environment. We've been playing around with what we're going to do with, like, breaks this year. That's in less than a month and we're still thinking about it — that's kind of how we work at Demuxed. We really cut it —

Down to the line, huh? Shooting from the hip.

I couldn't tell you how close last year was to the line — there were code changes every night during the conference last year. But that was good fun. And we're wondering about things like the breaks: do we want quizzes? Do we want fun things going on during — we call them ad breaks, but they're also bathroom breaks and coffee breaks and that sort of thing. In a normal conference, you would go and get a coffee, but also walk through the sponsor area, chat to people, maybe grab some food. When it's online, you get to that break and people just kind of switch off and go to the bathroom and get a coffee on their own. They're not engaged in the stream, because maybe you haven't made anything engaging in the stream in that time. But how do you do that? How do you make that engaging? That's going to be good fun. Interesting.

And how do you think — for me, and a lot of the people we work with, the focus on quality of production is less about "it's a live stream" and more about "this is TV." This is broadcast television that events are doing now — taking the core learnings of broadcast television production into the live stream. And when you see what some of these guys — I mean, quite literally children — are doing streaming on Twitch and YouTube, the level of production these kids are getting with a couple of hundred bucks of gear in their bedroom, while they're playing a goddamn game, is phenomenal. So what are you seeing in that sort of shift? Is it generally "we're in TV now," or is there still a long way to go? What do you think?

Oh, great question — very, very near and dear to my heart at the moment. The question of making content production accessible is so, so important. I have this example slide that I occasionally use in talks, about how everyone's becoming a live streamer now, with examples in there. I wrote it during 2019, early 2020, I think: F1 drivers are suddenly Twitch streamers — Lando Norris and people like that, in their bedrooms now, doing live streams. But also, your university professor is now a live streamer. And there's a big difference between those two, right, in their ability to drive OBS, as an example — a university professor might not even be able to install OBS on his laptop.
So there is this huge market evolving for browser-based content production, and it's coming on leaps and bounds. You look at how much Hopin bought StreamYard for — that's a real reinforcement of how that technology is going to become the future of content production. The browser is getting better and better at processing video every year, and Google put something called WebCodecs into the browser this year, which is a really big deal for manipulating low-level audio and video in the browser. Where previously even getting at something like a regular I-frame was virtually impossible, it's now meaningfully possible — manipulating video data in a browser is going to get much better. There's a great talk at Demuxed this year about that.

So the browser will become this tool for creating live video content, and it will get to a point where you can get close to — if not up to — a television-quality broadcast experience from browser-based technologies. That's going to be a hybrid of a bunch of stuff: WebRTC plus maybe canvas or WebGL, those sorts of things. We've seen people produce streams where they just link up their Zoom to us, and that's super unengaging, right — you end up broadcasting a black background with rectangles that jump around when somebody joins or leaves. Something as simple as a nice background and overlaid lower thirds —

— just as simple as that is a huge step up from where a lot of people are now in terms of the content they produce. And I don't really think that tools like — no offense — OBS and vMix and that sort of thing, which are great products and great tools, are accessible in general. The next step of accessible live streaming will be building studio tools in browsers, compositing there, and making those amazing experiences. There's a lot going on in that space — it's a very exciting space. Obviously StreamYard did a lot to set the foundations there, and that's why Hopin went and paid a lot of money to acquire them. The guys at Restream — Restream is now incredibly popular as well.

I use Restream Studio; I think it's a fantastic tool. And there's a small startup in San Francisco called Stream Club doing some really cool stuff as well — very much closer to an OBS in the browser. Rather than just having these predefined layouts, I can pull in a custom mode and drag anything anywhere I want. Really cool product. So that stuff is going to keep growing.

I'm seeing, personally, a split in the market. I believe you're going to have desktop production, which is what you were just talking about — I am producing my own event for X number of people — and there's definitely going to be that. But there's also — and this is where Bettercast really sits, working with AV and audio production teams — event managers who are doing a hybrid event. They're not going to be doing it in a browser; they need a production team to be doing it. And I think there's definitely a split, and a lot of them — like Hopin and, as you said, StreamYard — are putting their eggs into the desktop-produced, only-virtual basket. Whereas I, and quite a lot of people, see that hybrid is actually the long term: that broadcast level of a camera in a venue with a switcher, you know? And a lot of these streamers — I keep saying it, and I reckon it'll start to happen — will go into a career in hybrid event production, you know?

Yeah — like, you've spent the last five, ten years streaming to a million people every single day; just take that exact same skill set and go into a corporate environment and stream five different rooms across six different cameras. It's the same stuff, you know what I mean? But the level of understanding they're going to bring to it — I really think there's a whole new career opening up.

Super interesting. I think that's a genuinely really interesting idea — I mean, that's kind of what we did with our content production guy, yeah. So no, completely agree with that. I hadn't really thought about it like that — hey, these Twitch streamers are going to turn into content producers for the corporate side. That's a super interesting idea.

I think it's fantastic and absolutely a great way of doing it. My son is maybe a little too young, but I'll push him into it. So, I want to wind up — it's been a really, really interesting chat. What I want to wind up with is: you have spoken about stuff that Mux already has in the pipeline. Is there anything you can maybe give us a hint on for the future, or at least plans for what Mux is planning — especially for developers or people trying to

Implement their own? Definitely — I'll have to double-check with the team before you publish this, okay? Um, when developers come to Mux looking to build video, we categorize that market into three slices. People who want to build on-demand video, right — the Netflixes, the YouTubes; everyone knows what on-demand video is. Live streaming video — either like a Twitch, or like what you guys do, live events, those sorts of things. And then real-time. And hey, developers don't always know the difference between live streaming and real-time, because they can look really alike. The only real differences are the communication aspect — the bidirectionality of it — and the latency: real-time is probably at 150 milliseconds, somewhere in that region, versus seconds. We want to service all those markets, so real-time becomes this critical piece of what we want to do at Mux down the road. Interesting. Awesome. Well, thank you so much for the time — it's been a really interesting chat. Of course, thank you so much for having me. This was really good fun; I really appreciate it. Yeah, thank you so much. Awesome.
