Read_934 - The Code Liberation

March 04, 2026 01:01:20
Bitcoin Audible

Hosted By

Guy Swann

Show Notes

"The cypherpunk principle that 'cypherpunks write code' takes on revolutionary meaning when writing code no longer requires programming knowledge. Eric Hughes meant that political freedom comes through building systems without requesting permission, and now that building happens through natural language instruction to AI agents. You achieve sovereignty over computing through direct technical capability.

The code becomes yours in the most fundamental sense: modified by your specifications, compiled on your hardware, serving your purposes exclusively."

~ Max Hillebrand

What if every piece of software you use could be shaped entirely to your will - no bloat, no telemetry, no features you never asked for? Max Hillebrand argues that AI hasn't just changed who can code, it's changed *what code even is*. Could the combination of open source and AI agents finally destroy the vendor lock-in model for good?

Check out the original article: The Code Liberation: How AI Makes Software Infinitely Customizable by Max Hillebrand (Link: https://towardsliberty.com/articles/naddr1qvzqqqr4gupzpdlddzcx9hntfgfw28749pwpu8sw6rj39rx6jw43rdq4pd276vhuqqgrjvesxyurwctp8p3nzefevyuny53hzf6)

References from the episode

Host Links

Check out our awesome sponsors!

View Full Transcript

Episode Transcript

[00:00:00] The cypherpunk principle that cypherpunks write code takes on revolutionary meaning when writing code no longer requires programming knowledge. Eric Hughes meant that political freedom comes through building systems without requesting permission. And now that building happens through natural language instruction to AI agents. The code becomes yours in the most fundamental sense: modified by your specifications, compiled on your hardware, serving your purposes exclusively. [00:00:38] The best in Bitcoin, made audible. I am Guy Swann, and this is Bitcoin Audible. [00:01:01] What is up guys? Welcome back to Bitcoin Audible. I am Guy Swann, the guy who has read more about Bitcoin than anybody else you know. We have got a piece, we talked about this in Max Hillebrand's chat episode, which was really, really good. I will have the link to that in the show notes if you haven't listened to it. But he had a really great little article on Nostr, on his blog, his collection. He does quite a bit of writing, and in fact I had not kept up with it very well. And I will link to that as well so that you can dig into the other stuff. [00:01:38] I will also be digging into some of his other recent writing, because I'm well out of date with what I've been reading from him. [00:01:47] But this was just such a fun little piece. I talked about it in the chat episode, but it was past time to come back and really dig into this. It's called The Code Liberation, and it's about how drastically the environment has changed in such a short span of time, what AI agents are doing to or changing about how we build things, what the landscape, our tools, and the possibilities will look like now, and why this is not something that's bad for sovereignty, or a force for centralization. To the contrary, it is literally going to free code. It is going to completely change the landscape in our favor.
And I love his perspective, and he's always got some really interesting takes on stuff. So we will get right into it. A really quick thank you to the HRF, the Human Rights Foundation. They have the Oslo Freedom Forum going on June 1st to 3rd this year, where activists, artists, and freedom fighters from around the world get together and share their experience, their stories, and their courage about how to fight for freedom and how to protect it around the world. [00:03:07] And not only what they did, but the tools that they used, and bitcoin, and all of these things. And imagine how much more expansive that will be in the era where we can just build and modify our own tools. This is the hope. I can't imagine how many things like that are going to be discussed at this year's conference. So if you want to check that out, the link is down in the show notes. You can get tickets now, and of course the Financial Freedom Report, to stay up to date on all of that news, their newsletter, is also right down in the details. So with that, let's get into today's read. It's titled The Code Liberation: How AI Makes Software Infinitely Customizable, by Max Hillebrand. When AI handles compilation and patching tasks, source-based computing finally escapes the domain of experts, ending decades of binary distribution monopoly. [00:04:11] The binary package model represents computing's greatest betrayal. [00:04:17] You download a pre-compiled executable optimized for some theoretical average machine, bloated with features you'll never use, while missing capabilities you desperately need: that persistent notification you cannot disable, that telemetry silently phoning home, that workflow-breaking UI change in the latest update, all imposed without recourse, because the compiled binary obscures its logic and prevents modification.
[00:04:48] Software vendors discovered they could manufacture dependency through opacity, transforming users from operators into consumers. [00:04:58] Source-based installation inverts this power dynamic completely. When you compile software from source, as Gentoo and FreeBSD users have done for decades, you control every aspect of the build. USE flags determine which features get compiled in, compiler optimizations target your specific CPU architecture, and unwanted functionality simply never gets built. The software becomes truly yours, shaped by your specifications during compilation. [00:05:31] What once required deep technical knowledge now becomes accessible through AI mediation. [00:05:38] The technical barriers that made source-based computing the province of experts have dissolved. [00:05:45] Modern AI coding assistants like Claude, Cursor and Copilot understand code bases at semantic levels, generating patches, resolving dependencies, and handling merge conflicts that would challenge experienced developers. When you tell your AI assistant that a feature bothers you, it reads the source, writes the modification, manages the patch through updates, and recompiles the package. The same AI that helps developers write production code now enables users to customize their entire software stack through conversation. [00:06:20] This transformation embodies Friedrich Hayek's spontaneous order in its purest form. [00:06:27] No software company possesses the distributed knowledge of millions of users about their specific workflows, hardware configurations, and privacy requirements. [00:06:38] The binary distribution model pretends otherwise, imposing centrally planned solutions that serve corporate interests while ignoring user needs. [00:06:48] When individuals modify source code for their purposes and share those modifications, they participate in a discovery process that reveals possibilities no central authority could anticipate.
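For readers who want to try the source-based mechanics Hillebrand is describing, Gentoo's USE flags are the standard lever. A minimal configuration sketch; the package name and flag choices are illustrative assumptions, not taken from the article:

```conf
# /etc/portage/package.use  (per-package feature selection)
# Build a hypothetical media app without telemetry or browser integration,
# but with hardware video acceleration:
media-video/example-app -telemetry -browser-integration vaapi

# /etc/portage/make.conf  (global build settings)
# Optimize every compile for this exact CPU rather than a generic target:
COMMON_FLAGS="-march=native -O2 -pipe"
```

With settings like these in place, the unwanted functionality is simply never compiled in; there is nothing to disable at runtime because the code paths do not exist in the binary you built.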
The AI agent serves as translator between human intent and technical implementation, allowing the knowledge problem to be solved where the information actually exists. [00:07:12] The practical implementation exists today. [00:07:15] Install Gentoo or FreeBSD, where Portage or ports manage source-based packages with dependency resolution and compilation automation. Deploy an AI coding assistant that understands both natural language and code structure. When software behaves unacceptably, describe the problem to your AI: "Remove all telemetry from this application," or "Change this keyboard shortcut to match my muscle memory," or "Optimize this for maximum performance on my AMD processor." The AI reads the source, identifies the relevant code, generates the patch, and manages it through the compilation process. [00:07:53] Your computing environment becomes totally sovereign, shaped by your needs alone. [00:07:59] The cypherpunk principle that cypherpunks write code takes on revolutionary meaning when writing code no longer requires programming knowledge. [00:08:10] Eric Hughes meant that political freedom comes through building systems without requesting permission, and now that building happens through natural language instruction to AI agents. You achieve sovereignty over computing through direct technical capability. [00:08:27] The code becomes yours in the most fundamental sense, modified by your specifications, compiled on your hardware, serving your purposes exclusively. [00:08:39] Critics point to legitimate concerns about AI-generated code quality, noting increased duplication and debugging time in current studies. These critics miss the essential point. Imperfect customization that serves your needs surpasses perfect software that serves corporate interests. The binary packages you install today already contain bugs, vulnerabilities, and unwanted features, but you have zero ability to address them.
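Gentoo already has a hook built for exactly this "AI writes the patch, the package manager carries it forward" workflow: any `*.patch` file placed under `/etc/portage/patches/<category>/<package>/` is applied automatically on every rebuild of that package. A sketch, with a hypothetical package and patch (the `telemetry.c` file and variable are invented for illustration):

```shell
# Gentoo applies user patches from <configroot>/etc/portage/patches/ at build
# time (eapply_user). PORTAGE_CONFIGROOT defaults to / on a real system; the
# "." fallback here just lets you stage the layout anywhere for inspection.
PATCH_DIR="${PORTAGE_CONFIGROOT:-.}/etc/portage/patches/app-misc/example-app"
mkdir -p "$PATCH_DIR"

# A patch an AI assistant might generate from "remove the telemetry":
cat > "$PATCH_DIR/disable-telemetry.patch" <<'EOF'
--- a/src/telemetry.c
+++ b/src/telemetry.c
@@ -1,4 +1,4 @@
-int telemetry_enabled = 1;
+int telemetry_enabled = 0;  /* opted out at compile time */
EOF

# Rebuild; Portage applies the patch during src_prepare and keeps applying it
# across version updates until it no longer applies cleanly:
#   emerge --oneshot app-misc/example-app
```

Because the patch lives in your configuration rather than in the upstream source, updates keep your modification without any manual re-merging until the surrounding code changes.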
With AI-assisted source-based computing, you gain the power to fix what affects you, remove what threatens you, optimize what matters to you. User control now surpasses vendor control. [00:09:21] The economic implications extend beyond individual sovereignty. When users can modify and share software improvements, the artificial scarcity that supports software monopolies collapses. [00:09:34] Why pay for a proprietary solution when the community continuously improves open source alternatives? Why accept vendor lock-in when you can modify any software to integrate with your workflow? The software industry's rent-seeking model depends on users being unable to help themselves. [00:09:53] AI-assisted source modification breaks that dependency permanently. [00:09:59] The convergence happening now between AI capabilities and source-based systems represents more than technical evolution. [00:10:07] Gentoo recently began offering binary packages for convenience while maintaining its source-based foundation, recognizing that users want choice, not ideology. [00:10:19] AI coding assistants grow more capable daily, with tools like Cursor achieving significant improvements in development metrics. The pieces exist today for any motivated user to escape the binary distribution trap. The AI-powered source revolution exists as present reality. [00:10:38] Every moment you accept software limitations, tolerate privacy violations, work around artificial restrictions, you choose dependency over sovereignty. The tools exist now: source-based package managers that have operated reliably for decades, AI assistants that understand code at levels surpassing most programmers, and hardware powerful enough to compile software while you sleep. You now choose whether to continue accepting software as a product or begin shaping it as a tool. [00:11:15] Every year, the Human Rights Foundation puts on the Oslo Freedom Forum.
They bring together human rights defenders, journalists, artists, technologists and policymakers. [00:11:26] The goal: to share stories of courage, to share creativity, to explore ideas and technologies to advance freedom around the world. If this matters to you, this is an event you should not miss. From June 1st to 3rd of this year in Oslo, Norway: the Oslo Freedom Forum. A link to grab your tickets is right down in the description of this episode. [00:11:53] All right, so we had a fantastic conversation with Max Hillebrand, and he brought this up and we talked quite a bit about it. And I've been thinking about this idea a lot, how code has become a tool for everyone. [00:12:08] One that doesn't need the practice of coding, but only the art of preference and judgment, and the simple skill of engineering, of thinking about how a system or a workflow or a process should be put together. And in a lot of ways even that will be offloaded quite a bit to the default of what works, of what working systems that are already out there employ. [00:12:52] But I think a lot of the innovation and optimization will exist in how things are architected, and that will be kind of the major difference between those who are very skilled at this and those who just kind of run the default. But regardless, these will all be customized to workflow. And I think a really good example, actually, is that I'm constantly getting rid of tools that I've used for a single purpose. A great one, actually, that I just dropped was Camo Studio. Camo Studio is a tool that's supposed to add all these features to modify your camera, your webcam, so that you can have a green screen background, you can put in text and have your name down at the bottom, you know, change the lighting and all of this, right?
And they even got some AI tools implemented in it, but I didn't need any of that. In fact, my camera itself, which is an Insta360, just a little webcam that clips onto a stand, actually has a tool that does most of that stuff. It's one of the ones with the little gimbal on it. So it will literally do AI tracking of my face, and I can change the color temperature and the brightness and all of this stuff. But the one thing it doesn't have is the green screen. It doesn't have chroma keying. So I have to boot up an app for my camera to basically control it and work with it. And then I boot up another app for the camera that takes that feed and applies the green screen effect. [00:14:36] And then I boot up Riverside or Streamlabs or whatever it is that I'm actually recording the show in or sharing the show through. [00:14:46] And that talks to the second layer. So I have three layers, sometimes a fourth to capture the audio and grab things ahead of time, before it goes to all of this other stuff, as a backup. And that's done at kind of like the driver level, or I use Audio Hijack or something like that. So there's this whole stack of things that I'm going through. And what's funny is most of it is just something that I have to use because it's native to the camera or something that I'm using for one thing. And in fact, all of its other features get in the way. A good example: Camo Studio also has camera controls, so you can move the camera around and you can zoom in or zoom out. But because of that, it actually resets the camera position anytime I make a change inside of Camo, because Camo is trying to hijack the control of the camera, whereas I'm using Insta360 software to actually control the camera. Because the translation between Camo Studio and the camera itself is not actually perfect.
The native Insta360 application obviously does it a hundred times better. The other one tends to jerk it around or overshoot, kind of like when you're in a remote desktop app and sometimes you scroll and it scrolls like a billion miles an hour when you're actually just trying to scroll a very small amount, or in a very nuanced way, because the controls between one operating system and another don't perfectly translate. And the same thing happens when you're in a mobile environment and you're trying to project something from a desktop environment. Touch controls and normal scrolling and clicking don't perfectly line up. There are nuances and quirks. Well, this same sort of thing happens between my stupid camera apps and all the different things they each control. [00:16:39] And I realized how frustrated I was with Camo Studio. And worst of all are all these other features that I don't even use. It costs like $5 a month, and I can't even remember what it is exactly that I needed the pro version for. But I remember thinking I kept running into some little limitation, and I tried some other ones that worked half decent, but they were just kind of annoying. I didn't like the interface or the setup, and I couldn't save my setup in an easy way. In other words, it would open to the default, and then I would have to go in and make a bunch of changes. I just want it to open and start working. [00:17:17] And so, the other day, I had gotten to the point where I'm building so much custom stuff in my workflow and vibe coding a bunch of things that I almost vibe coded my own chroma key thing that I could attach to opening up my Insta360 app, because I can basically make my own boot icon.
So with the Insta360, obviously if I just click on the icon for that app, it's just going to boot that app. Well, I could make another icon, with the same icon as that app, that is its own little script which also opens the app, but then also opens my chroma key tool. I can modify this so that it is one button, one click, and they always open together, because I'm always going to need them together and I don't need them otherwise. And if I do need them separately, I can easily still just close one of them. [00:18:14] And I found a little chroma key app. It's open source, looks really crappy, nothing special about it. But that's literally all it does: it takes a feed and does a few core modifications to green screen. It's not supported anymore, and for some reason I cannot save my configuration. So I literally just had Claude Code go in and customize the part of the tool that runs on boot, so that I quote unquote save my configuration by hard coding my configuration into the app. There's a default where it just pulls up a background, so I changed it to mine; there are default settings, and I just changed them to my settings. And I could modify this further, or put it in the background if I wanted to. There's a bunch of things I could do with this. I could even update the interface, which, now that I think about it, I might do. I might actually clean it up, make this tool a little bit better, and try to make it quote unquote supported now for the new macOS, because it's literally an old tool. I can't even remember what version it was, but it was no longer supported because Apple requires you to pay yearly fees to keep it on the App Store, and blah, blah, blah.
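The one-click launcher described above can be as small as a two-line script behind a custom icon. A sketch assuming macOS (where `open -a` launches an application by name); both app names here are placeholders for whatever camera and chroma-key apps you actually use:

```shell
# Write a tiny launcher script that boots the camera control app and the
# chroma key tool together, so one click always opens the pair.
cat > launch-camera-stack.sh <<'EOF'
#!/bin/sh
open -a "Insta360 Link Controller"   # native camera control (placeholder name)
open -a "ChromaKeyTool"              # patched open-source chroma key app (placeholder)
EOF
chmod +x launch-camera-stack.sh
```

On macOS you could then wrap `launch-camera-stack.sh` in an Automator application or assign it a custom icon, so the pair behaves like a single app while each program can still be quit independently.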
It's a perfect example of something that, for no good reason, is a casualty of how we think of moats around software delivery and software use. [00:19:44] That is absolutely going to be destroyed, ripped away, because of how ephemeral and customizable all software is going to be. And this is why I think open source is just about to explode. The big reason open source has such a limitation, and essentially an uphill battle against something that's highly curated and very, very carefully made, is the fact that you have this moat around the skill of programming, and explicitly around the time to devote to something. But that is increasingly falling away. There will still be software that is extremely curated and very, very good at what it does. But I think it will be increasingly difficult, because software will change so quickly, to maintain that as a moat if it's not serving an explicit purpose or a very targeted use case, and doing so in a way that can communicate with other things. [00:20:57] Here's another decent example, and this is something where I really think Apple should update their neural engine. [00:21:05] So Apple has their neural engine, their neural network thing, in their chips, and it's super proprietary, super closed off. You can't use it to train anything that is not within their framework, right? You can't really modify the chip or the way they're using their neural engine to do something that Apple doesn't intend for you to do with it. [00:21:29] But one of the interesting things is, and we talked about this in a couple of the Bitcoin and AI group Telegram and Signal groups that I'm in, I'm in like four or five of them, I think, where everybody's just talking about Claude bots and coding up their own stuff and building their own custom workflows and their agents, all this stuff.
It's all fascinating. There's more to keep up with just in those simple conversations these people are having, let alone the whole ecosystem and what's being shared on Twitter or anywhere else. We've already reached the point where software modification is so insane, so broad, so capable. And some of the crazy things that some of these people are building or toying around with are things that I wouldn't even think of, because they're not even in the realm of things I care about. [00:22:24] One of the guys who's in the Raleigh Bitcoin group, actually, because the Raleigh Bitcoin group has kind of become one about Bitcoin and AI, was doing this crazy thing where he's having his agent check all of the local news items and companies and the school district and anything on Facebook being posted where somebody mentioned a specific problem that had to do with a software stack, and basically build out a database of opportunities for potentially selling or working with them on a software fix to move them to a new stack. [00:23:09] I can't remember exactly what it was, but it's some very specific database and spreadsheet problem or something. But he's basically sourcing pain points through random comments and then building a database in his local area of opportunities around problems to fix for people, and building a client base. And his Claude bot just does this, just runs all the time. [00:23:35] I've been back and forth with a media analyzer, and I've built a ton of these little piecemeal tools. A lot of them will be used for Pear Drive and a lot of the Pear tools that we're also working on. But because they're piecemeal, because they're granular to one specific purpose or one specific thing, [00:23:55] they're also just things that I can run locally and independently, like scripts.
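A hedged sketch of what one of these standalone, single-purpose scripts could look like: a categorizer with a one-call API that writes its result next to the file as a JSON "sidecar" an agent can later read. The categories and extension rules here are illustrative assumptions, not the actual tool from the episode:

```python
import json
import sys
from pathlib import Path

# Hypothetical standalone categorizer: guesses a coarse category for a media
# file and writes the result beside it as a JSON sidecar. A real version might
# call a vision model; the extension lookup below is a stand-in.
CATEGORIES = {
    ".jpg": "photo", ".jpeg": "photo", ".png": "screenshot-or-photo",
    ".mp4": "video", ".mov": "video", ".pdf": "document",
}

def categorize(path: Path) -> dict:
    """Return a small machine-readable record an agent can reason over."""
    return {
        "file": path.name,
        "category": CATEGORIES.get(path.suffix.lower(), "unknown"),
        "size_bytes": path.stat().st_size if path.exists() else None,
    }

def write_sidecar(path: Path) -> Path:
    """The tool's whole 'API': one call in, one JSON sidecar out, no shared state."""
    record = categorize(path)
    sidecar = path.with_suffix(path.suffix + ".json")
    sidecar.write_text(json.dumps(record, indent=2))
    return sidecar

if __name__ == "__main__":
    # Usable directly as a script, or imported by a broader application.
    for arg in sys.argv[1:]:
        print(write_sidecar(Path(arg)))
```

Because the script owns nothing but its input file and its sidecar, a larger application can shell it out, swap its internals for a better model later, and keep the surrounding pipeline untouched as long as the JSON shape stays the same.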
And I build an API and a simple way to interact with them so that I can test each one independently as a script, like my image captioner, or my categorizer that says: oh, this is a meme; oh, this is a receipt; this is a business card; this is a family photo. [00:24:16] This was taken by your iPhone, and specifically your iPhone camera, so it's a video or an image that you took yourself. And then I just carry around these little JSON files with all of my different folders and videos and categories. And I can use all of this information to make changes, or I can just talk to my agent, or talk to Claude Code or whatever, and have it look at the JSON file that's associated with all of these things and then make decisions with that knowledge. And I'm slowly being able to organize, and I'm recovering information and context and text itself from images and objects and all of these things. I'm now able to search through 64 terabytes worth of stuff on my Linux machine. Granted, I have different tools for different parts of the hierarchy and I've only run it on some things, but I'm already rediscovering things. It's already becoming incredibly useful. And I'm going to have a stack where I can just boot up a broader application that grabs all of these individual scripts. Because, like I said, each one of them is being built entirely modular, entirely standalone, so that if you run it and it doesn't find the tools that it needs, it will install and download the various tools and pieces. It's agnostic to what operating system you're running it on, and it has a specific API, so it can be called inside a broader application, booted up, run, and then closed out. And if there's an error or something, it doesn't kill the broader application. The broader application just has logic to decide what to do with it. Should we try to reboot it?
Should we assess a couple different options for how to modify it, or what process or flag to run, in order to get around some error? You can build that logic into the broader application. And all of this is built with, you know, a little Docker container, or I guess in Python you'd have venv, little Python environments to build it into, so that it runs reliably no matter what machine you have, because it builds its own little blob, its own little place, installs all the things that it needs, and then runs, with a clear API for how to use it and instructions for how to do so. And then I can take all of these various little tools that I continually build and optimize, and put them into a larger application that [00:26:58] actually does one thing, but uses like nine different modular tools to do that thing. And the really cool thing about this, the reason I really love this architecture, is that I can constantly modify all of these little pieces: which models they use, which prompts I put into them, whether they're using something like a vision language model, or a really slow model that does a lot of analysis, or a really, really fast model like CLIP or YOLO-World that just does object detection and basic framing out of an image, or whether I want to do video analysis versus image analysis. I can modify and fine-tune all of these things and change them later, but keep the API the same, and the broader application just gets better. And I didn't even do anything. I didn't change that application at all. I just updated the tool that's inside of that application. And this is another big point as to why a lot of people were watching a lot of software companies collapse kind of in front of our eyes, like IBM stock dropped 10 or 20% or something when Claude announced that they had an agent that could write and understand COBOL.
Because COBOL is an ancient, ancient programming language. It's a very low-level language, and essentially our entire financial infrastructure basically all falls down to COBOL. And I remember reading or hearing something at some point that there's like 10 programmers that basically know COBOL. It's such an unused and obscure language that it's just completely fallen out of favor. And I can't remember if IBM literally owns it or something, I don't remember, but IBM runs the infrastructure and basically houses all of the main coders that actually understand how to use this language. And because of that, IBM is the only one that can change our financial infrastructure, that can actually update the core tools. And because it's such a high-risk thing to change, IBM has essentially had a veritable monopoly on this. And so Claude comes out with a blog post that says, we can now do COBOL, and IBM stock plummets, because it's like, your moat is gone. A monopoly that has been around for like 40 years on maintaining one type of infrastructure that is core to the entire global financial system more or less just got eviscerated with a single blog post. [00:29:59] Now, a lot of people think that means that software is going away, that you're not going to be able to sell it. I do think, and I've said quite often, that most software is going to be free. And I do believe that. But software isn't going away. [00:30:12] Software is going to integrate into a million, a billion other use cases and things that it did not make financial sense for software to actually tackle. There will be software for extremely specific, isolated use cases, within one segment or one department within one company, that does something specific and customized to their environment, that only 100 people, or maybe even 10 people, use.
Before now, it would never, ever have made any sort of financial or economic sense to create custom software for that environment or for that specific use case. Today it will. I think the better way to think about it is [00:31:05] in the analogy of what the Internet has done so many times before, right? YouTube did not destroy Hollywood. YouTube did not destroy big feature films and all of that. What did it really do? It has certainly hollowed out the monopoly that Hollywood had. And I don't mean just YouTube specifically, like, oh, YouTube's competing with Hollywood. They are and they aren't. It's more about what YouTube represents, right? A technological move to the digital landscape, to streaming, to production that isn't tied to one group or one specific area or one specific copyright monstrosity, right? You have Netflix, you have Spotify, you have YouTube, you have Amazon Prime, you have all of these segmented-out, deviating landscapes of new ways to provide content, new ways to provide media. And there's a whole bunch of independent studios that have actually done exceptionally well in this, like Angel Studios. I don't think you could have made Angel Studios work at scale and be sustainable 15 years ago, but today you can, because of how the landscape of technology, the environment of our technology and our networks, has changed. And Hollywood isn't gone. [00:32:25] Big feature films aren't gone, but they have drastically decreased in importance. And they do not have the moat that they used to have. And that moat is collapsing even further when you have AI tools that can do the image generation and special effects that big budget Hollywood movies were the only source of before. That $100 million, $200 million budget moat is also falling away, because I can make scenes that look like they're from a $200 million budget movie on my own machine.
And that has been increasingly true even just with tools like DaVinci Resolve and Adobe After Effects. [00:33:07] But those required extreme amounts of skill and finesse and tons of work. And those tools are getting easier and easier, and they require less and less skill. And even when the environment requires too much specificity and too much knowledge of the specific editing tools, now you can get AI to code you a new environment, or simplify the tools to recreate a process, or do a thing even easier, because it's using the editing and AI tools to do the job for you. So it's just barriers crumbling all over the place. And what happens when those barriers fall away is you get millions of side use cases. So going back to the Internet, to the YouTube example: what exists on YouTube that never could have existed in Hollywood, that never could have existed in broadcast media? Tutorial videos. Little funny, oh-this-random-dude-just-gets-kicked-in-the-balls videos. MrBeast videos. The most successful YouTube account in history literally may never have existed on broadcast media or in Hollywood. It's not a format or a type or style of media that seems to actually align with those. It's not a sitcom, right? Everything was a sitcom or a CSI thing on TV. You have these kind of styles of things where they can only really find a particular investable category of content. [00:34:39] And what YouTube did is it exposed an audience, and the production capacity, for all these different types and styles and lengths of content. Another great example is YouTube Shorts. YouTube Shorts, and, you know, one-minute, three-minute explainer videos, or just punchy clips, basically conversational stuff. You can't put that on TV. You can't possibly deliver that to the right audience without a social media algorithm and a filtering of who you are talking to and who is your network.
It explicitly only exists as something viable in a place where you are deciding the thousand people that you're interested in, that you want to follow, that you want to hear the conversation of. You couldn't possibly have that kind of granular control or segmentation of topic or content type in broadcast media, on, like, a TV channel. You'd literally need a billion TV channels, and it was already overwhelming when there were like 200. [00:35:48] Just pause for a second and think. [00:35:51] Try to grasp every type of video you've seen on YouTube. The politically incorrect stuff, the conspiracy theory stuff, the little short videos, the tutorials, the long-form three-hour conversations, you name it. The audiobooks. [00:36:13] I have audiobooks and articles just being read aloud on my YouTube. [00:36:18] Think about the sheer breadth of media and specific things. My son watches cars going down plastic racetracks, and they're commentated like they're actual races, and he loves it, and it's actually super entertaining. We watched a video the other day of a dude who turns Matchbox cars into Matchbox monster truck cars by modifying the chassis. And we watched a 30-minute video of him breaking down how he cuts them and modifies them to get them to fit into the chassis of the particular car, and what tools and glue he uses. There's an account we put on every once in a while called Nasmar, which does stop-motion race car races (my son loves cars, if you didn't catch that), and they build these elaborate sets. And it's a kid doing it, too. [00:37:12] Whoever's doing it must be, I don't know, I think he's in like middle school, maybe early high school or something. [00:37:20] But he literally puts all of this together. He's got cotton balls for smoke, and he puts in little trees, and he builds these huge, elaborate sets. And there's only like 4,000 or 5,000 views on some of these things. But it's so cool.
I'm actually wildly impressed by what this random kid is doing on YouTube. How would I have ever been exposed to this in any other way? How could you have ever delivered this? Think about the late 1990s versus the early 2010s: what was possible in the type of content that you could deliver to someone, and the way that you could deliver it. That 15-year zone changed the nature of media and content, media type and content delivery and production in our world as drastically and revolutionarily as possibly any other period in history. [00:38:16] And what happened? [00:38:18] What happened to big-budget Hollywood movies? What happened to high-quality, high-risk, nuanced or niche television series and, you know, book adaptations? What happened to those things? Did they all disappear? [00:38:36] Did they all get replaced with YouTube? [00:38:40] No, quite the contrary. They exploded. [00:38:45] And importantly, there was 10 times or a hundred times the variety of those things. [00:38:54] Every book that's worth anything, or even mildly interesting, is being adapted. Every fantasy that you could ever dream of is getting live-action visual storytelling. Psychotic shows that no one would ever, ever have taken the risk on, let alone the fact that the rating and the content type so drastically shrink the audience. Never ever in a billion years would you have been able to pitch the show Black Mirror to any broadcast company in 1995. But you can get away with it on Netflix, and you can test out some of those things. Because if you can get a niche audience that is actually interested in some kind of crazy or really out-there content, well then they're going to stay on your platform, or stay subscribed, because they can't get that anywhere else.
So it makes more sense to cater to customized interests, to niche audiences, to explore and take far greater risk, especially when you have a streaming subscription that can just batch all of it together. In fact, the more niche audiences you can get, the more worthwhile your singular subscription cost actually is to each of the individual customers. [00:40:16] So here's the thing. [00:40:18] This same thing's going to happen to software. [00:40:21] The whole stack that we're used to is gonna stay, but it's also going to lose its moat. [00:40:30] The reins of control are going to start falling away, because everyone will be able to modify their software. And what would it look like if we wanted to compare it to that analogy, that pattern shift that we've seen in other areas of the Internet, in content creation and distribution? What wins in this new environment, I wonder? An analogy that I think might end up surfacing is different versions of the same application that become popular. [00:41:11] Like, let's say DaVinci Resolve. I'm not saying that this will actually manifest specifically, but I mean it in the sense of, okay, how could you think about what you could do with software now, when the means of modifying this thing is extremely accessible? Well, maybe you have, like, four different tiers of DaVinci Resolve. And not tiers in a pricing sense, but tiers in a customization sense. It's like, how much do you actually want to use this software? Do you need 800 tools sitting in front of your face, or do you need three? And maybe there's a three-tool version. [00:41:54] And you think about it: a lot of software is actually kind of built like this anyway.
And DaVinci Resolve kind of has this layering to it. There's a color grader inside of it, there's Fusion, which is your extensive effects and customization, there's your basic linear editor, there's your renderer. All of these major tabs. Well, maybe people start delivering custom software in different batches or for different use cases. [00:42:23] Like, sometimes I literally just might want to use the renderer. Well, I don't do that with DaVinci Resolve. I don't drop something into a timeline and then process it and do all this stuff to output a different format. I open up HandBrake, or I use FFmpeg. Honestly, FFmpeg is probably the thing underlying DaVinci Resolve too; I know FFmpeg is the thing underlying HandBrake. But before, I used FFmpeg on the terminal almost never. [00:43:00] I had maybe one memorized command and flag that I used enough that I could have just opened up a terminal and used it. Now I use it constantly, constantly, because I just build scripts for the things that I do. I can ask Claude Code, what's the command for FFmpeg to do this thing? But that's an inefficient way to do it. Instead, I will look at the command, because it's easier to read than it is to write. [00:43:34] It's easier to understand or look at the structure of a thing and be like, okay, I understand what this flag is. And then, just for the fun of it, I can ask Claude Code to break down what each individual flag and command in this structure does, just for my own curiosity's sake. And I do that quite a bit, actually. But then I get it to write a little drag-and-drop or simple double-click style script that prompts me for a file, or I can drag a file on top of it, and then it runs the command using that as my input.
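As a concrete sketch of the kind of script being described here, a minimal FFmpeg wrapper might look like the following. The flags, output naming, and H.264/AAC defaults are illustrative assumptions, not the actual settings Guy uses:

```python
#!/usr/bin/env python3
"""Minimal drag-and-drop style FFmpeg wrapper: pass a media file as an
argument (or drop one onto the script on most desktops) and it transcodes
to a default MP4. All defaults here are illustrative."""
import shutil
import subprocess
import sys
from pathlib import Path


def build_command(src: Path) -> list[str]:
    """Construct the ffmpeg invocation: H.264 video, AAC audio, MP4 out."""
    out = src.with_name(src.stem + "_converted.mp4")
    return [
        "ffmpeg",
        "-i", str(src),      # input: the dropped file
        "-c:v", "libx264",   # re-encode video as H.264
        "-crf", "23",        # constant-quality target (lower = better)
        "-c:a", "aac",       # re-encode audio as AAC
        str(out),
    ]


if __name__ == "__main__" and len(sys.argv) > 1:
    if shutil.which("ffmpeg") is None:
        sys.exit("ffmpeg not found on PATH")
    subprocess.run(build_command(Path(sys.argv[1])), check=True)
```

The point isn't the specific flags; it's that an AI agent can generate, explain, and tweak a one-off wrapper like this in seconds, which is exactly the "I just build scripts for the things that I do" workflow.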
[00:44:06] And I have a default set for my output. And I use FFmpeg directly. I don't need a lot of those features, so I don't need DaVinci Resolve or HandBrake to do a lot of those things anymore. Camo Studio is another great example. Camo Studio is trying to do 800 jobs. I needed it for one thing, and just yesterday I realized I didn't need it for that one thing anymore. I can do that myself. [00:44:32] This kind of feels like the subsection of content or media that YouTube exposed, right? If DaVinci Resolve was the, you know, the Friends sitcom of video editing, well, building my own FFmpeg scripts that allow me to drag and drop is the short YouTube explainer or tutorial video. A thing that never could have sustainably existed in the old world, but now makes perfect sense in the new technological environment, and in fact may have a huge audience. [00:45:11] But the craziest thing about this is that all of the negative consequences of the moats, of the vendor lock-in, will start to fall away. [00:45:25] All of the "you can't have any privacy options with a tool that actually works or is actually clean" tradeoffs, all of the limitations of being forced to use this platform for this one specific use case, of being trapped basically in their box, are going to fall away. [00:45:47] As Max explains, and I think this was a perfect title, the liberation of the code will liberate the way the code uses us, because we will modify it to be used by us. [00:46:03] And specifically, all the open source things that don't have, you know, rounded corners and clean edges or a really good style. A great example is the little green screen app that I was just telling you about. Looks terrible. It looks like it was made in 1998. The interface is about as garbage as it gets, but that's probably something I can update with a one-shot with Claude Code.
I would bet money that I could just say, "Give this better margins, clean it up, make it a vertical tool window, kind of like a Mac info page, and give it the glassy minimalist aesthetic." In fact, I'm gonna take this exact prompt, what I just said, and I'm just gonna one-shot it. I'm just gonna see what it looks like. But because of this unlock, software is gonna change dramatically. And specifically, open source software, I genuinely think, is gonna start catching up, because so many people can actually fix things, and we can use those fixes. And there's a great example. Open Claws or Open Clawed or. Yeah, no, OpenClaw. There are so many different names of this thing already, it's still having a hard time sticking in my head. But OpenClaw is such a good example. Some random dude built an agent, and I'm not using anything built by any official, big corporate thing. I'm literally using solutions that are being shared by people who are using OpenClaw in these groups. Like skills and tricks, or prompts for saving memory in an efficient way. I'm not getting anything from Anthropic. They are providing the base model for most of what I do, but not even that. I'm running some of it through PPQ, and I'm probably going to try Maple AI and play around with stuff there too, because I can try out Kimi and, you know, the larger Qwen models, and Llama. Ollama is the interface; Llama is the model. But I can do that privately. And importantly, because of Claude, because of these various tools and AI, people being able to use and install these things has absolutely 100x'd. Like OpenClaw, with the kind of pain of setting it up and getting a few of the things working and kind of tweaking it for my setup and my environment. [00:48:40] I feel like
[00:48:43] the track from release to, you know, 40 million users, which it may have now, I'm not even sure, would have taken 10 times as long if we were not in the age of AI. The fact that we have AI itself is allowing a whole bunch of people, for whom this would normally be over their head or way too much of a pain, and who would get stuck in the middle and never actually complete it. AI is getting them through the process of setting it up, having OpenClaw actually operational in their environment, and getting around all the tweaks. And it was really funny. I was still having a number of problems, but I got the model actually up and running with a few of the problems remaining. I only had like two or three skills, but one of the skills wasn't working or something, and I was getting some sort of an error. But it was working. It was actually coming up. It just wasn't working completely. And as soon as I had it up, I just switched to my Telegram chat, having a conversation with it, and I fixed all of the problems. It fixed all of its own problems. It made itself better and fixed all of the edge cases, by my actually booting the thing. If I could boot it in the simplest sense, it could now look at the entire setup and the computer and the environment and what was wrong with the different things. And a bunch of things were just, "You haven't given the right permissions, go into Settings and do this and this." And I would even be so lazy as to not try to find it in Settings, or it would give me a path in Settings that wasn't perfectly right, because, you know, sometimes the different OS versions are off. [00:50:20] And so I immediately said, actually, instead of that, can you ping the system to try to do something, so that it will deliver me the prompt to give you permission, so I don't have to go looking for it.
And that's what we started doing after that. And it started just popping up, like, "I need permission for this," and it's like, yes. That's a perfect example of the acceleration itself: the tools that enable all of these things accelerating the tools that are enabling these things. But where does all this go? Part of my thesis for where we're going on the Internet is that I think a ton of the value accrues to the protocols, because we're going to have agents talking to each other. And, you know, maybe I'm overanalyzing the value of some of these protocols in particular, or the need for them, because obviously there are API calls everywhere, and maybe we'll still use HTTP API calls, right? But I'm not so sure. I really think that in an age where all software is customized, where software is broken up enough and modular enough and built in as many iterations as there are custom processes and workflows, the king of that world is the protocol that allows them to talk to each other in a way that either (a) provides a common language to translate between one tool and another, or one custom setup and another, or (b) enables that traversal without crashing. How do you optimize or segment out communication so that when there is a compatibility error or a translation error, it doesn't cause a problem? It's simply something that is ignored. Kind of like a backwards-compatible soft fork, right? A soft fork to Bitcoin means that if you didn't have the new version that could read SegWit signatures or understand the Segregated Witness upgrade to Bitcoin, all of the old transactions and everything still worked, and you could still see the new UTXOs; you just didn't understand their signatures. You could still validate everything. [00:52:56] That old node simply didn't get a feature. But they didn't conflict, right? Segregated Witness was completely compatible with the old way of doing things on the Bitcoin network.
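That soft-fork principle, ignore what you don't understand instead of crashing on it, translates directly into how agent-to-agent messages could be parsed. A hedged sketch, where the field names are Nostr-flavored but hypothetical:

```python
import json

# Fields this version of the client understands; newer peers may send more.
KNOWN_FIELDS = {"id", "pubkey", "created_at", "kind", "content"}


def parse_event(raw: str) -> dict:
    """Keep the known fields and tolerate unknown ones (soft-fork style)."""
    msg = json.loads(raw)
    event = {k: v for k, v in msg.items() if k in KNOWN_FIELDS}
    # Unknown fields are recorded, not rejected: an old node "simply
    # doesn't get a feature" but still validates everything it can.
    event["_ignored"] = sorted(set(msg) - KNOWN_FIELDS)
    return event


# A message from a newer peer carrying a field we've never heard of:
newer = '{"id": "abc", "kind": 1, "content": "hi", "sig_v2": "..."}'
parsed = parse_event(newer)
```

The old parser still processes the message; the unrecognized `sig_v2` field is simply set aside rather than causing a crash, which is the compatibility property a protocol for billions of heterogeneous agents would need.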
Then of course, in a world where billions and billions and billions of bots can be spun up, I think a trust system is also critical, and I don't think that is sustainable in a closed way, where your trust is entirely based on Amazon reviews or ratings. Because that's exactly the problem right now: Amazon has to KYC and basically trap their data and their reputations inside that network. I think the only way you can scale to a world with billions of bots and all of these various tools, and what will eventually be, quote-unquote, platforms that are kind of case-specific, is with a universal identity system. And I don't mean universal like everybody's doing the exact same thing. I mean you need a protocol for "who is this person?" Like, this is a public key that I can attach a reputation to, that I can potentially pull reputation data for from many different sources, that I can control and regulate access with, without having to KYC. Because KYC would create such staggering friction from an everybody-creating-bots perspective. Every agent is going to be KYC'd, or you're gonna punch in a phone number and get a text? That's just not gonna happen. You're talking about using a set of tools, a little ecosystem of billions of agents. If you put that friction at every individual place, that would be like having them sprint through five feet of water just to get a simple job done. You want something that just has a reliable API call, or an explicit price. It's also why I think it kind of makes sense to have some sort of a little proof-of-work type thing: pay per API call, and then essentially an open market for provision of that standard API. Like an Airbnb-style market for compute built on Nostr. And with the way some of these hardware things are going, and how much shift there has been there, like the Mac Studio, and Apple in general.
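The pay-per-API-call stamp gestured at here is essentially hashcash: make the caller burn a little compute before the call is honored, so spinning up a billion abusive bots gets expensive while honest agents barely notice. A generic sketch, not any particular protocol's scheme:

```python
import hashlib
import itertools


def mine(challenge: str, difficulty_bits: int) -> int:
    """Find a nonce whose SHA-256 over (challenge:nonce) falls below a
    target -- i.e. has `difficulty_bits` leading zero bits. The caller
    pays this cost once per request."""
    target = 1 << (256 - difficulty_bits)
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce


def verify(challenge: str, nonce: int, difficulty_bits: int) -> bool:
    """Checking a stamp is a single hash -- cheap for the API server."""
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))


# The server issues a per-request challenge; the agent mines a stamp for it.
# (The endpoint string is a made-up example.)
stamp = mine("GET /v1/transcribe?req=42", difficulty_bits=12)
```

Turning `difficulty_bits` up or down tunes the friction, and because the stamp is verified against the request itself, no phone number, account, or KYC is needed; identity and reputation can live separately on a public key.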
This was talked about as well. In fact, I think I brought it up and didn't finish the thought, which was that Apple is a little bit of an unexpected winner in all of this, because their hardware is so well adapted to running significant inference in a local setting. [00:56:00] And even crazier is that Brian Roemmele posted something about an open source dude. Yeah, I did bring up the Neural Engine and never came back to it. It was that somebody basically ran hundreds of processes through the Apple Neural Engine to reverse engineer how it works, and actually utilize it without MLX, to basically utilize the chip itself, the core of their processing engine, and use it for training, which is a completely different process from inference and something that Apple literally tries to close off. They don't want you using it. And in doing so, he only got one thread working, which means there's a limitation, and a lot has to change and further development has to occur to be able to use essentially the entire tool. I guess like 10% efficiency, or 10% of its actual use, is what he was able to pull out of it. [00:57:05] But if you can do it multi-threaded, with transformers, and you can use it directly, it would essentially allow you to do model training and fine-tuning on Apple hardware just sitting idle, just the normal Mac Minis and Mac Studios that are around. [00:57:24] And even crazier is you do it for like a 50th of the cost and a 20th of the energy burn. And this guy just kind of put this together. [00:57:36] Now, there is this notion or thought that, you know, Apple usually shuts this stuff down, but I think they would be stupid to do so. [00:57:44] If you can actually use this hardware for these purposes, sell the shovels, man. [00:57:49] Sell the shovels. Don't try to be the ones digging up all the gold. Sell the freaking shovels. That is what Nvidia is doing, right?
[00:57:59] Nvidia is selling shovels, and they just became the largest company in the world. I mean, I'm not sure how it plays out now, because I know there's been a bunch of volatility at the top. But the Mac has unified memory. On the Mac Studio, you can run Kimi K2, or K5, whatever the heck that model is. [00:58:19] That is a 690-gigabyte model. You can run it on their 512 gigabytes of unified memory. You can't do that on Nvidia graphics cards. Or you can, but it takes literally 50 times the amount of money. You're looking at a graphics card that's $20,000, and you need like 10 of them. [00:58:42] But that's how fast some of these things could change. And if Apple just opened that up and said, okay, our Neural Engine is, you know, open source and fully available, all I can see is their hardware sales doubling practically overnight, maybe tripling. So I can't imagine why they would try to desperately hang on to a moat that they're not going to be able to keep anyway, when they're sitting on a literal potential gold mine that isn't being tapped, because of their incessant need to control every single thing happening on the devices that they give to consumers. Because they still see them as their devices that they control, rather than something that the user gets to customize for themselves. That is going to be an increasingly very, very difficult battle to fight. [00:59:38] This is exactly why, and if you haven't listened to the episode we did with Max, I highly recommend it. It was such a good chat. [00:59:46] But that's why this is the best. [00:59:50] This is the era of the cypherpunk. This is the era of a world of protocols and Nostr and Bitcoin, where open source is not years behind proprietary stuff, but days and weeks behind it, and where everyone is a cypherpunk, because everyone can write code. It's about a vision. It's about wanting to build something.
And code is just an abstraction of how you seek to build a thing, the process of communicating what it is that you want to see in the world. And that is wild. [01:00:31] So anyway, shout out to Max. [01:00:33] Don't forget to listen to that chat episode. Shout out to HRF and the Audionauts for supporting the show. Don't forget to check out the links in the show notes, and I will catch you on the next episode of Bitcoin Audible. And until then, everybody, that's my two sats. [01:01:07] Empires fall when individuals quietly start living as if they were already free.
