@unlobito Right, but then that doesn't do progressive loading? Since that's what I'm looking for.

depol nlpol 

I’d give Thüringen shit for voting nazi but my entire country did and their party supplied the prime minister so errm yeah

me away from my computer: “I will go post a thing on fedi before i do chores”

me at my computer: “uhhhhhh… hmmm….. what was i going to post again?”

@unlobito Hmm. Assuming you're referring to the resolution selector, how does the browser select the resolution(s) to load?

@unlobito It's been brought up, but I'm unsure how this would apply to a background image - and aside from that, does it actually do progressive loading?

When specifying multiple `background-image`s, for example, Firefox will still just show a flash of white while loading the first one, and the fallback never seems to be used.
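For reference, a minimal sketch of the multi-background pattern in question (the selector and file names are hypothetical). Comma-separated backgrounds are stacked layers, drawn first-on-top, and every supported layer gets fetched - so this is layering, not a real fallback or progressive-loading mechanism:

```css
/* Selector and file names are illustrative. */
.hero {
  /* Paints immediately, before any image bytes arrive. */
  background-color: #223;
  /* Two stacked layers: the WebP is drawn on top of the JPEG.
     Both are downloaded where supported; the JPEG only shows
     through while/where the WebP layer is absent. */
  background-image: url("bg.webp"), url("bg.jpg");
  background-size: cover;
}
```

Which matches the Firefox behaviour described: you get a flash of the plain `background-color` (or white, if none is set) until a layer decodes, rather than a progressively refining image.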

It would be cool to have a meetup or something like that where people that need help with their self hosting infra (or want to learn about self hosting) could get hands-on help. Is there something like that in #portland #pdx ?

Does anyone have any better suggestions for me than "go back to JPEG for my background images", to get reasonable-quality images on a webpage that still load comfortably on slow connections?

Note that anything that requires JS to function is not an option; I will not be implementing the pattern of "lazy-loading images after the fact", unless it can be made to work reasonably well without JS.

Honestly, I'm kind of sad about Wedson leaving RfL. He developed a huge part of the foundation that made Rust for Linux possible.

I'll still work on DRM (except sched) and driver upstreaming when the core stuff is in place, but I don't know about other subsystems.

At the rate things are going, I wouldn't be surprised if upstreaming the drm/asahi driver isn't possible until 2026 at the earliest. I had hopes for things to move much faster, but that's not possible without active cooperation from existing maintainers, and we aren't getting it.

Reading upstreaming mailing list threads is painful. Every second comment is "why is this not like C" or "do it like C". Nobody is putting any effort into understanding why Rust exists and why it works. It's just superficial "this code is scary and foreign" type reactions.

I know the (top level) DRM maintainers are at least somewhat committed to making this work, and even there I ran into the drm_sched guy blocking simple fix patches. Every other subsystem is an unknown.

As far as I can tell, the 'incremental decoding' that WebP provides instead is functionally useless for this purpose: it just gives you back the exact 56k-era line-by-line rendering that progressive decoding was supposed to get rid of (if the client implements it at all, which is optional)

food, making fun of immigration policies 

@marlies @lis I think you do still need to do the exam, which consists of successfully one-handed cycling with a frikandelbroodje (vegan or otherwise) in the other hand

Marilou Schultz, the artist, learned weaving as a child and is part of four generations of weavers. She used wool from the Navajo-Churro sheep along with traditional plant dyes. She worked from an Intel photo of the die (shown below) and used the "raised outline" weaving technique to make the borders of chip regions more visible. The lack of symmetry made the project challenging. 4/6

Whether LLMs work also Does Not Matter. The environmental impact of both training and using them is currently such that it really does not fucking matter whether copilots are useful or not. You shouldn’t be using them, just like you shouldn’t be driving an SUV, flying frequently, or going on cruises. If you do one of these things out of dire necessity, you need to be offsetting that with behavioural changes somewhere else

If that changes in the future, then sure, we can reassess copilots

@jacksonchen666 I was trying to figure out how to prevent an unloaded-image flash on slow networks on something I'm working on and, well

@rogueren I think it was a mistake to ask existing maintainers to review the Rust abstractions for their subsystems.

On paper that makes sense: the subsystems are full of magic undocumented semantics, since the docs are terrible. But that only works if those maintainers are actively helpful, willing to learn new things, and able to see things from a different perspective.

We'd have been better off designating Rust maintainers for each subsystem and not allowing the C maintainers to gatekeep everything. Then they could choose to help review and get the semantics right, or not. But they wouldn't have the power to just block everything, force things to be done "their way", or delay merging until everything goes through an agonizing process that boils down to teaching how Rust is supposed to work to someone who doesn't want to learn.

For subsystems with helpful maintainers, the outcome would have been the same, since they'd cooperate anyway (and maybe even sign up to maintain the Rust side). And for subsystems with unhelpful maintainers, this would have avoided a lot of pain, and not given them the power to derail the project.

"Two of the best ways to optimize images for the web are by using a modern image format (like WebP)"

"WebP does not offer a progressive or interlaced decoding refresh in the JPEG or PNG sense."

.... ok 🙃

Looks like MDN knocked the "AI Help" button off the menu bar, hiding it under "Tools" instead - and the "AI" announcement header has been replaced with an ad for a paid course provider.

I guess the hype is starting to get stale?

@n8chz @eniko In at least one case, I've seen complaints about harassment. It's not just about monetization.
