@gabek hey, do you have a link to that owncast webhook stuff you were talking about? I was looking for it on GitHub and I can't find it.

One of the biggest problems with my current polling-based approach (git.beta.sequentialread.com/fo): when I get the list of current viewers, the JSON object for each viewer doesn't include their name (the name displayed in the top right) until they send their first chat message.

Can you think of a way to get that username right away when someone joins, or would it require changes to owncast?
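
Roughly what my polling code does, sketched in Go. The /api/integrations/clients path, the bearer-token header, and the JSON field names here are approximations rather than the exact API:

```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"time"
)

// Viewer mirrors the fields I care about from the connected-clients response.
// Field names are assumptions; Username stays empty until the viewer has
// sent their first chat message, which is exactly the problem.
type Viewer struct {
	ClientID string `json:"clientID"`
	Username string `json:"username"`
}

func pollViewers(baseURL, accessToken string) ([]Viewer, error) {
	req, err := http.NewRequest("GET", baseURL+"/api/integrations/clients", nil)
	if err != nil {
		return nil, err
	}
	req.Header.Set("Authorization", "Bearer "+accessToken)

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()

	var viewers []Viewer
	if err := json.NewDecoder(resp.Body).Decode(&viewers); err != nil {
		return nil, err
	}
	return viewers, nil
}

func main() {
	for range time.Tick(10 * time.Second) {
		viewers, err := pollViewers("https://stream.sequentialread.com", "ACCESS_TOKEN")
		if err != nil {
			fmt.Println("poll error:", err)
			continue
		}
		for _, v := range viewers {
			// Username is blank until the viewer has chatted at least once.
			fmt.Printf("client %s, username %q\n", v.ClientID, v.Username)
		}
	}
}
```

That empty username is the gap I'm hoping a webhook or join event could fill.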

stream.sequentialread.com: more ingress controller stream today

@gabek

I just had an idea when I woke up this morning: I was thinking about how owncast can be configured to upload the HLS segments to object storage. What if the entire app, including all of the user-facing static content and user-facing API responses, could be uploaded too? Then the only thing missing would be the chat, and I know there would be ways to fix that.
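
As a rough sketch of what the upload side could look like in Go, using the AWS SDK against any S3-compatible endpoint. The bucket name, endpoint, and webroot path are all placeholders, not owncast's actual config:

```go
package main

import (
	"fmt"
	"mime"
	"os"
	"path/filepath"
	"strings"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/s3/s3manager"
)

// uploadDir walks a local directory (e.g. the static webroot) and uploads
// every file to the bucket, preserving relative paths as object keys.
func uploadDir(uploader *s3manager.Uploader, bucket, localDir string) error {
	return filepath.Walk(localDir, func(path string, info os.FileInfo, err error) error {
		if err != nil || info.IsDir() {
			return err
		}
		f, err := os.Open(path)
		if err != nil {
			return err
		}
		defer f.Close()

		key := filepath.ToSlash(strings.TrimPrefix(path, localDir+string(os.PathSeparator)))

		// ContentType matters so the browser treats .js, .css, .m3u8 etc. correctly.
		_, err = uploader.Upload(&s3manager.UploadInput{
			Bucket:      aws.String(bucket),
			Key:         aws.String(key),
			Body:        f,
			ContentType: aws.String(mime.TypeByExtension(filepath.Ext(path))),
		})
		fmt.Println("uploaded", key)
		return err
	})
}

func main() {
	sess := session.Must(session.NewSession(&aws.Config{
		Region:   aws.String("us-east-1"),
		Endpoint: aws.String("https://s3.example.com"), // any S3-compatible endpoint
	}))
	if err := uploadDir(s3manager.NewUploader(sess), "owncast-frontend", "./webroot"); err != nil {
		panic(err)
	}
}
```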

Another idea: what if owncast could be run in "frontend" and "backend" modes? When it's run in frontend mode it's just a static file server plus the chat; when it's run in backend mode it's just encoding HLS segments and uploading them to the frontend.
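
This isn't how owncast is structured today, just a sketch of what the mode switch could look like at the entrypoint (the --mode flag and the helper functions are invented):

```go
package main

import (
	"flag"
	"log"
	"net/http"
)

func main() {
	// Hypothetical flag; owncast does not currently have a --mode switch.
	mode := flag.String("mode", "standalone", "standalone, frontend, or backend")
	flag.Parse()

	switch *mode {
	case "frontend":
		// Serve the static player/UI and the chat only.
		http.Handle("/", http.FileServer(http.Dir("./webroot")))
		// The chat websocket handler would be registered here too.
		log.Fatal(http.ListenAndServe(":8080", nil))
	case "backend":
		// Ingest RTMP, transcode to HLS, and push segments to the frontend
		// or to object storage; no user-facing HTTP at all.
		runTranscoderAndUploader()
	default:
		// Current behavior: everything in one process.
		runEverything()
	}
}

// Placeholders for the existing owncast internals this sketch hand-waves over.
func runTranscoderAndUploader() { log.Println("backend mode: transcode + upload") }
func runEverything()            { log.Println("standalone mode") }
```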

@gabek according to caniuse, this browser feature is supported on 92% of all clients today: caniuse.com/es6-module

@gabek Hey, I did some first-impression load-time tests with the owncast frontend app, testing pre-loading the JS modules via <script type="module" src="..."> tags. The first test, over HTTP/1.1, took about 10 seconds regardless of whether the script tags were there or not:

picopublish.sequentialread.com

But watch what happens when you turn on HTTP/2:

picopublish.sequentialread.com
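
(Side note on the HTTP/2 part: Go's net/http negotiates HTTP/2 automatically when you serve over TLS, which is what lets the browser fetch all the small ES modules in parallel over one connection instead of queueing them on HTTP/1.1. A minimal test server looks like the sketch below; the webroot and cert paths are placeholders.)

```go
package main

import (
	"log"
	"net/http"
)

func main() {
	// Serve the owncast webroot (path is a placeholder).
	http.Handle("/", http.FileServer(http.Dir("./webroot")))

	// HTTP/2 is negotiated automatically over TLS via ALPN;
	// plain ListenAndServe would fall back to HTTP/1.1.
	log.Fatal(http.ListenAndServeTLS(":443", "cert.pem", "key.pem", nil))
}
```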
