jmoiron plays the blues
@jmoiron.bsky.social replied: Abolish ICE is an old movement.
Arguably, it predates ICE, starting as "Abolish DHS", a GWOT-era agency designed to codify the government's ability to violate the rights of its citizens.
The problems inherent in ICE and its remit were clear from day one.
@jmoiron.bsky.social replied: The immediate attempt to paint the victim as a "domestic terrorist" is yet another disgusting demonstration of their depravity.
They don't believe in truth, responsibility, anything like that. They only believe in power and their right to wield it.
@jmoiron.bsky.social Abolish ICE.
The stupidity and inevitability of the shooting in Minneapolis make it no less of an outrage.
My heart breaks from a world away for these women, their child, and their families. I can't even begin to imagine the horror.
@jmoiron.bsky.social Oh we're doing "greeted as liberators" again, are we.
@jmoiron.bsky.social replied: To clarify my original post a little, I was also against the Iraq war after it started and while it continued. At no point was I in favor of the Iraq war, and this will be no different.
@jmoiron.bsky.social Guess it was obvious this would come up again.
Obviously we should not be doing this and I wish to register in the strongest possible terms my reaction of "fuck this."
Extremely minor silver lining is the utter beclownment of Infantino and FIFA.
www.bbc.com/news/live/c5...
@jmoiron.bsky.social replied: Pushing past "here's what I read on twitter" into new frontiers of laziness in journalism.
@jmoiron.bsky.social Happy New Year, let's hope for some good news in 2026.
@jmoiron.bsky.social replied: The real question is whether they will backfill the gaps in what AI will teach you, and how much of that is actually load bearing.
There are dozens of tricks and skills from the 70s and 80s that no one my age knows and are completely irrelevant.
@jmoiron.bsky.social replied: Younger developers always know the new hotness better than the olds, and they don't really listen to us.
Old devs rejecting AI for various reasons is pretty widespread, but is "New devs aren't picking up AI" a problem?
@jmoiron.bsky.social replied: This is the context where skill erosion is a worry for me. LLMs are very powerful in the hands of skilled developers, but how do you get there if you're gilding your education with fool's gold?
@jmoiron.bsky.social replied: Compared to writing code yourself, generating code with LLMs feels like empty calories, pedagogically.
You feel like you're getting full, but there is no nutritional value.
@jmoiron.bsky.social replied: On the whole, LLM-oriented software development feels more like managing and reviewing code than writing it.
Reading code is under-practiced, but it's nowhere near as good for learning as writing is.
@jmoiron.bsky.social I'm moderately convinced this argument is wrong, and that LLM tools are different in kind from the other ones mentioned.
The inputs are different and there is no determinism to build upon. Using e.g. Codex to write a program doesn't feel like programming.
@jmoiron.bsky.social replied: Prompt is pretty grungy so figured I'd mix it up.
@jmoiron.bsky.social In many ways, traveling in Asia is landing in a city you've never heard of and finding out it's bigger than Chicago.
@jmoiron.bsky.social replied: Can't see that being soon, but that really depends on the application. For AI coding assistance, it feels a long ways off, but maybe I should take another look at some of the more recent small models!
The enshittification of hosted models has already begun, though they're still...
@jmoiron.bsky.social replied: I should clarify, I think the OP is a good rundown of the parameters of this decision which is why I shared the link. They explicitly highlight Gemini free tier as something that makes the economics of local inference questionable, though I personally think all of the $20 plans d...
@jmoiron.bsky.social replied: This is provided that demand on other components doesn't continue to raise prices on consumer computing equipment. I bought 2x 32GB CORSAIR modules in 2024 for $210 and they are now being sold for $899.
@jmoiron.bsky.social replied: The biggest change to the economics of local inference likely to happen in the next 12-24 months will be an increase in the number of PC "Ryzen AI Max"-style unified-memory chips to compete with the Mac Pro, the current local inference machine of choice.
@jmoiron.bsky.social replied: Google's search index in 2006 was ~850TB of data. This was a very large cluster at the time, but with nearly 20 years of progress, Seagate's latest and greatest is an $800 36TB drive. Replicating the 2006 index "locally" would still be around $20,000 just for storage.
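The back-of-envelope arithmetic behind that $20,000 figure can be sketched as follows (a single copy, ignoring replication, parity, and controller overhead; the index size and drive price are the post's own figures):

```python
import math

# Figures from the post above.
index_tb = 850      # approximate size of Google's 2006 search index, in TB
drive_tb = 36       # capacity of the Seagate drive mentioned, in TB
drive_price = 800   # USD per drive

drives_needed = math.ceil(index_tb / drive_tb)
total_cost = drives_needed * drive_price
print(drives_needed, total_cost)  # 24 drives, $19,200 -> roughly $20,000
```

Any real deployment would need redundancy, so the true cost would be a multiple of this floor.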
@jmoiron.bsky.social replied: For all of the stories of the Apollo computers having fewer transistors than the circuitry in a novelty keychain, it takes forever for consumer hardware to catch up to the capabilities of a distributed system like a modern AI product.
@jmoiron.bsky.social replied: This change in capabilities didn't happen because smaller models ran better, it happened because we've created bigger and more focused models, and they can orchestrate themselves to use more silicon to do more things.
It's scaling out, not optimization.
@jmoiron.bsky.social replied: Small model capability is improving, but not nearly fast enough, and it hasn't kept pace with user expectations on what AI coding agents can do.
In just a year we've gone from "An AI agent can write small scripts badly" to "A swarm of AI agents can collaborate to plan and create...
@jmoiron.bsky.social replied: But if you get a few of them, or you buy access from some of the third party resellers, they're a lot cheaper and more capable than building something yourself.
Part of that is that hardware to run inference is still costly. I don't see this changing in 2026.