<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:sy="http://purl.org/rss/1.0/modules/syndication/" xmlns:admin="http://webns.net/mvcb/" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:fireside="http://fireside.fm/modules/rss/fireside">
  <channel>
    <fireside:hostname>web02.fireside.fm</fireside:hostname>
    <fireside:genDate>Thu, 05 Mar 2026 14:32:24 -0600</fireside:genDate>
    <generator>Fireside (https://fireside.fm)</generator>
    <title>LINUX Unplugged - Episodes Tagged with “Open Source AI”</title>
    <link>https://linuxunplugged.com/tags/open%20source%20ai</link>
    <pubDate>Sun, 25 May 2025 15:00:00 -0700</pubDate>
    <description>An open show powered by community, LINUX Unplugged takes the best attributes of open collaboration and turns it into a weekly show about Linux.
</description>
    <language>en-us</language>
    <itunes:type>episodic</itunes:type>
    <itunes:subtitle>Weekly Linux talk show with no script, no limits, surprise guests and tons of opinion.</itunes:subtitle>
    <itunes:author>Jupiter Broadcasting</itunes:author>
    <itunes:summary>An open show powered by community, LINUX Unplugged takes the best attributes of open collaboration and turns it into a weekly show about Linux.
</itunes:summary>
    <itunes:image href="https://media24.fireside.fm/file/fireside-images-2024/podcasts/images/f/f31a453c-fa15-491f-8618-3f71f1d565e5/cover.jpg?v=3"/>
    <itunes:explicit>no</itunes:explicit>
    <itunes:owner>
      <itunes:name>Jupiter Broadcasting</itunes:name>
      <itunes:email>chris@jupiterbroadcasting.com</itunes:email>
    </itunes:owner>
    <itunes:category text="Technology"/>
    <itunes:category text="News">
      <itunes:category text="Tech News"/>
    </itunes:category>
<item>
  <title>616: From Boston to bootc</title>
  <link>https://linuxunplugged.com/616</link>
  <guid isPermaLink="false">685be1b0-b48e-4f25-84ae-6e340494cd88</guid>
  <pubDate>Sun, 25 May 2025 15:00:00 -0700</pubDate>
  <author>Jupiter Broadcasting</author>
  <enclosure url="https://aphid.fireside.fm/d/1437767933/f31a453c-fa15-491f-8618-3f71f1d565e5/685be1b0-b48e-4f25-84ae-6e340494cd88.mp3" length="86996763" type="audio/mpeg"/>
  <itunes:episodeType>full</itunes:episodeType>
  <itunes:author>Jupiter Broadcasting</itunes:author>
  <itunes:subtitle>Fresh off Red Hat Summit, Chris is eyeing an exit from NixOS. What’s luring him back to the mainstream? Our highlights, and the signal from the noise from open source's biggest event of the year.</itunes:subtitle>
  <itunes:duration>1:30:37</itunes:duration>
  <itunes:explicit>no</itunes:explicit>
  <itunes:image href="https://media24.fireside.fm/file/fireside-images-2024/podcasts/images/f/f31a453c-fa15-491f-8618-3f71f1d565e5/cover.jpg?v=3"/>
  <description>Fresh off Red Hat Summit, Chris is eyeing an exit from NixOS. What’s luring him back to the mainstream? Our highlights, and the signal from the noise from open source's biggest event of the year. Special Guests: Ben Breard, Carl George, Jef Spaleta, Matthew Miller, and The Spectacular AI Wish Machine.
</description>
  <itunes:keywords>Jupiter Broadcasting, Linux Podcast, Linux Unplugged, Red Hat, Fedora, Red Hat Enterprise Linux, RHEL 10, Red Hat Summit 2025, Red Hat Summit, open source, open source AI, AI, LLM, llm-d, Matt Hicks, hybrid cloud, Hugging Face, FIPS, vLLM, image mode, bootc, Podman, artificial intelligence, Red Hat Ansible Automation Platform, Ansible, post-quantum cryptography, quantum computing, Red Hat AI, AI Wish Machine, Fedora Project Leader, MCP, Model Context Protocol, OpenShift, OpenShift Virtualization, RamaLama, crab cakes, TUI Challenge, Podcast completionists, NixOS, Bluefin, atomic updates, immutable, Carl's pocket meat, Yazi, Ollama</itunes:keywords>
  <content:encoded>
    <![CDATA[<p>Fresh off Red Hat Summit, Chris is eyeing an exit from NixOS. What’s luring him back to the mainstream? Our highlights, and the signal from the noise from open source&#39;s biggest event of the year.</p><p>Special Guests: Ben Breard, Carl George, Jef Spaleta, Matthew Miller, and The Spectacular AI Wish Machine.</p><p>Sponsored By:</p><ul><li><a rel="nofollow" href="http://tailscale.com/linuxunplugged">Tailscale</a>: <a rel="nofollow" href="http://tailscale.com/linuxunplugged">Tailscale is a programmable networking software that is private and secure by default - get it free on up to 100 devices!</a></li><li><a rel="nofollow" href="https://1password.com/unplugged">1Password Extended Access Management</a>: <a rel="nofollow" href="https://1password.com/unplugged">Secure every sign-in for every app on every device.</a></li></ul><p><a rel="payment" href="https://jupitersignal.memberful.com/checkout?plan=52946">Support LINUX Unplugged</a></p><p>Links:</p><ul><li><a title="💥 Gets Sats Quick and Easy with Strike" rel="nofollow" href="https://strike.me/">💥 Gets Sats Quick and Easy with Strike</a></li><li><a title="📻 LINUX Unplugged  on Fountain.FM" rel="nofollow" href="https://www.fountain.fm/show/dWiuBeqpDSM86AwXRXov">📻 LINUX Unplugged  on Fountain.FM</a></li><li><a title="LINUX Unplugged TUI Challenge Rules" rel="nofollow" href="https://github.com/JupiterBroadcasting/linux-unplugged/blob/main/challenges/TUI-Challenge.md">LINUX Unplugged TUI Challenge Rules</a> &mdash; Help shape the challenge - what did we miss?</li><li><a title="TUI Challenge Rules Discussion Thread" rel="nofollow" href="https://github.com/JupiterBroadcasting/linux-unplugged/issues/5">TUI Challenge Rules Discussion Thread</a></li><li><a title="Red Hat Summit 2025 Homepage" rel="nofollow" href="https://www.redhat.com/en/summit">Red Hat Summit 2025 Homepage</a> &mdash; May 19-22 2025 in Boston, MA</li><li><a title="Red Hat Summit 2025: Execs Tout Opportunities In Open Source AI, 
Virtualization Migration" rel="nofollow" href="https://www.crn.com/news/ai/2025/red-hat-summit-2025-execs-tout-opportunities-in-open-source-ai-virtualization-migration">Red Hat Summit 2025: Execs Tout Opportunities In Open Source AI, Virtualization Migration</a> &mdash; OpenShift Virtualization has seen almost triple the number of customers, with the number of clusters deployed in production more than doubling and the number of virtual machines managed by the offer more than tripling.</li><li><a title="Agentic AI, LLMs and standards big focus of Red Hat Summit" rel="nofollow" href="https://www.networkworld.com/article/3993622/agentic-ai-llms-and-standards-big-focus-of-red-hat-summit.html">Agentic AI, LLMs and standards big focus of Red Hat Summit</a></li><li><a title="Red Hat Summit: Key Innovations for IT Channel Partners" rel="nofollow" href="https://www.channelfutures.com/cloud/red-hat-summit-innovations-it-channel-partners">Red Hat Summit: Key Innovations for IT Channel Partners</a></li><li><a title="Unlock what’s next: Microsoft at Red Hat Summit 2025" rel="nofollow" href="https://azure.microsoft.com/en-us/blog/unlock-whats-next-microsoft-at-red-hat-summit-2025/">Unlock what’s next: Microsoft at Red Hat Summit 2025</a> &mdash; Red Hat Enterprise Linux (RHEL) is now available for use with Windows Subsystem for Linux (WSL).</li><li><a title="Red Hat Launches the llm-d Community, Powering Distributed Gen AI Inference at Scale" rel="nofollow" href="https://www.businesswire.com/news/home/20250520692917/en/Red-Hat-Launches-the-llm-d-Community-Powering-Distributed-Gen-AI-Inference-at-Scale">Red Hat Launches the llm-d Community, Powering Distributed Gen AI Inference at Scale</a> &mdash; Red Hat’s vision: Any model, any accelerator, any cloud.</li><li><a title="Red Hat Introduces Red Hat Enterprise Linux 10 with Supercharged Intelligence and Security Across Hybrid Environments" rel="nofollow" 
href="https://www.redhat.com/en/about/press-releases/red-hat-introduces-rhel-10">Red Hat Introduces Red Hat Enterprise Linux 10 with Supercharged Intelligence and Security Across Hybrid Environments</a> &mdash; Red Hat Enterprise Linux 10 delivers a paradigm shift in enterprise operating systems with image mode.</li><li><a title="10.0 Release Notes | Red Hat Enterprise Linux | 10 | Red Hat Documentation" rel="nofollow" href="https://docs.redhat.com/en/documentation/red_hat_enterprise_linux/10/html/10.0_release_notes/index">10.0 Release Notes | Red Hat Enterprise Linux | 10 | Red Hat Documentation</a></li><li><a title="Red Hat Enterprise Linux 10 Officially Released, Here&#39;s What&#39;s New" rel="nofollow" href="https://9to5linux.com/red-hat-enterprise-linux-10-officially-released-heres-whats-new">Red Hat Enterprise Linux 10 Officially Released, Here's What's New</a> &mdash; Red Hat Enterprise Linux 10 highlights include Red Hat Enterprise Linux Lightspeed for integrating generative AI directly within the platform to provide users with context-aware guidance and actionable recommendations through a natural language interface.</li><li><a title="RHEL 10: Leading the future with AI, security and hybrid cloud" rel="nofollow" href="https://siliconangle.com/2025/05/23/rhel-10-staying-ahead-ai-security-cloud-rhsummit/">RHEL 10: Leading the future with AI, security and hybrid cloud</a></li><li><a title="Red Hat Enterprise Linux 10 Reaches GA" rel="nofollow" href="https://www.phoronix.com/news/Red-Hat-RHEL-10-GA">Red Hat Enterprise Linux 10 Reaches GA</a></li><li><a title="SiFive Collaborates with Red Hat to Support Red Hat Enterprise Linux for RISC-V" rel="nofollow" href="https://www.sifive.com/press/sifive-collaborates-with-red-hat-support-enterprise-linux-risc-v">SiFive Collaborates with Red Hat to Support Red Hat Enterprise Linux for RISC-V</a> &mdash; The developer preview of Red Hat Enterprise Linux 10 is initially available for use on the SiFive HiFive Premier P550 
platform.</li><li><a title="Red Hat AI on Hugging Face" rel="nofollow" href="https://huggingface.co/RedHatAI">Red Hat AI on Hugging Face</a></li><li><a title="FIPS 203/204/205" rel="nofollow" href="https://www.federalregister.gov/documents/2024/08/14/2024-17956/announcing-issuance-of-federal-information-processing-standards-fips-fips-203-module-lattice-based">FIPS 203/204/205</a> &mdash; These standards specify key establishment and digital signature schemes that are designed to resist future attacks by quantum computers, which threaten the security of current standards.</li><li><a title="Virtualization success stories: Join Red Hat OpenShift Virtualization&#39;s momentum in 2025" rel="nofollow" href="https://www.redhat.com/en/blog/join-red-hat-openshift-virtualizations-momentum-2025">Virtualization success stories: Join Red Hat OpenShift Virtualization's momentum in 2025</a></li><li><a title="llm-d" rel="nofollow" href="https://github.com/llm-d/llm-d">llm-d</a> &mdash; llm-d is a Kubernetes-native high-performance distributed LLM inference framework</li><li><a title="What is vLLM?" rel="nofollow" href="https://www.redhat.com/en/topics/ai/what-is-vllm">What is vLLM?</a> &mdash; vLLM is an inference server that speeds up the output of generative AI applications by making better use of the GPU memory.</li><li><a title="Image mode for Red Hat Enterprise Linux" rel="nofollow" href="https://www.redhat.com/en/technologies/linux-platforms/enterprise-linux-10/image-mode">Image mode for Red Hat Enterprise Linux</a> &mdash; Image mode leverages the bootc tool to build and deploy Red Hat Enterprise Linux. 
Bootc stands for bootable container, and the image will include the kernel, bootloader, and other items typically excluded from application containers.</li><li><a title="Image mode for Red Hat Enterprise Linux Overview" rel="nofollow" href="https://developers.redhat.com/products/rhel-image-mode/overview">Image mode for Red Hat Enterprise Linux Overview</a></li><li><a title="Introducing Fedora Project Leader Jef Spaleta" rel="nofollow" href="https://fedoramagazine.org/introducing-fedora-project-leader-jef-spaleta/">Introducing Fedora Project Leader Jef Spaleta</a></li><li><a title="Bluefin" rel="nofollow" href="https://projectbluefin.io/">Bluefin</a> &mdash; Featuring automatic image-based updates and a simple graphical application store, Bluefin is designed to get out of your way. Get what you want without sacrificing system stability.</li><li><a title="KongrooParadox&#39;s nixfiles" rel="nofollow" href="https://github.com/KongrooParadox/nixfiles">KongrooParadox's nixfiles</a> &mdash; This was my second nixos release since getting into Nix last year (February I think), and this strategy made it really painless. 
No surprises about deprecated options since I saw these cases slowly when these changes hit unstable.</li><li><a title="yazi: Blazing fast terminal file manager written in Rust, based on async I/O" rel="nofollow" href="https://github.com/sxyazi/yazi">yazi: Blazing fast terminal file manager written in Rust, based on async I/O</a></li><li><a title="jira-cli - Feature-rich interactive Jira command line" rel="nofollow" href="https://github.com/ankitpokhrel/jira-cli">jira-cli - Feature-rich interactive Jira command line</a></li><li><a title="browser-use" rel="nofollow" href="https://github.com/browser-use/browser-use">browser-use</a> &mdash; Make websites accessible for AI agents</li><li><a title="Thomato&#39;s TUI Resources" rel="nofollow" href="https://tui.chef-li.eu/">Thomato's TUI Resources</a></li><li><a title="Pick: RamaLama" rel="nofollow" href="https://ramalama.ai/#about">Pick: RamaLama</a> &mdash; Make working with AI boring through the use of OCI containers.</li><li><a title="ramalama on GitHub" rel="nofollow" href="https://github.com/containers/ramalama">ramalama on GitHub</a> &mdash; RamaLama is an open-source developer tool that simplifies the local serving of AI models from any source and facilitates their use for inference in production, all through the familiar language of containers.</li></ul>]]>
  </content:encoded>
  <itunes:summary>
    <![CDATA[<p>Fresh off Red Hat Summit, Chris is eyeing an exit from NixOS. What’s luring him back to the mainstream? Our highlights, and the signal from the noise from open source&#39;s biggest event of the year.</p><p>Special Guests: Ben Breard, Carl George, Jef Spaleta, Matthew Miller, and The Spectacular AI Wish Machine.</p><p>Sponsored By:</p><ul><li><a rel="nofollow" href="http://tailscale.com/linuxunplugged">Tailscale</a>: <a rel="nofollow" href="http://tailscale.com/linuxunplugged">Tailscale is a programmable networking software that is private and secure by default - get it free on up to 100 devices!</a></li><li><a rel="nofollow" href="https://1password.com/unplugged">1Password Extended Access Management</a>: <a rel="nofollow" href="https://1password.com/unplugged">Secure every sign-in for every app on every device.</a></li></ul><p><a rel="payment" href="https://jupitersignal.memberful.com/checkout?plan=52946">Support LINUX Unplugged</a></p><p>Links:</p><ul><li><a title="💥 Gets Sats Quick and Easy with Strike" rel="nofollow" href="https://strike.me/">💥 Gets Sats Quick and Easy with Strike</a></li><li><a title="📻 LINUX Unplugged  on Fountain.FM" rel="nofollow" href="https://www.fountain.fm/show/dWiuBeqpDSM86AwXRXov">📻 LINUX Unplugged  on Fountain.FM</a></li><li><a title="LINUX Unplugged TUI Challenge Rules" rel="nofollow" href="https://github.com/JupiterBroadcasting/linux-unplugged/blob/main/challenges/TUI-Challenge.md">LINUX Unplugged TUI Challenge Rules</a> &mdash; Help shape the challenge - what did we miss?</li><li><a title="TUI Challenge Rules Discussion Thread" rel="nofollow" href="https://github.com/JupiterBroadcasting/linux-unplugged/issues/5">TUI Challenge Rules Discussion Thread</a></li><li><a title="Red Hat Summit 2025 Homepage" rel="nofollow" href="https://www.redhat.com/en/summit">Red Hat Summit 2025 Homepage</a> &mdash; May 19-22 2025 in Boston, MA</li><li><a title="Red Hat Summit 2025: Execs Tout Opportunities In Open Source AI, 
Virtualization Migration" rel="nofollow" href="https://www.crn.com/news/ai/2025/red-hat-summit-2025-execs-tout-opportunities-in-open-source-ai-virtualization-migration">Red Hat Summit 2025: Execs Tout Opportunities In Open Source AI, Virtualization Migration</a> &mdash; OpenShift Virtualization has seen almost triple the number of customers, with the number of clusters deployed in production more than doubling and the number of virtual machines managed by the offer more than tripling.</li><li><a title="Agentic AI, LLMs and standards big focus of Red Hat Summit" rel="nofollow" href="https://www.networkworld.com/article/3993622/agentic-ai-llms-and-standards-big-focus-of-red-hat-summit.html">Agentic AI, LLMs and standards big focus of Red Hat Summit</a></li><li><a title="Red Hat Summit: Key Innovations for IT Channel Partners" rel="nofollow" href="https://www.channelfutures.com/cloud/red-hat-summit-innovations-it-channel-partners">Red Hat Summit: Key Innovations for IT Channel Partners</a></li><li><a title="Unlock what’s next: Microsoft at Red Hat Summit 2025" rel="nofollow" href="https://azure.microsoft.com/en-us/blog/unlock-whats-next-microsoft-at-red-hat-summit-2025/">Unlock what’s next: Microsoft at Red Hat Summit 2025</a> &mdash; Red Hat Enterprise Linux (RHEL) is now available for use with Windows Subsystem for Linux (WSL).</li><li><a title="Red Hat Launches the llm-d Community, Powering Distributed Gen AI Inference at Scale" rel="nofollow" href="https://www.businesswire.com/news/home/20250520692917/en/Red-Hat-Launches-the-llm-d-Community-Powering-Distributed-Gen-AI-Inference-at-Scale">Red Hat Launches the llm-d Community, Powering Distributed Gen AI Inference at Scale</a> &mdash; Red Hat’s vision: Any model, any accelerator, any cloud.</li><li><a title="Red Hat Introduces Red Hat Enterprise Linux 10 with Supercharged Intelligence and Security Across Hybrid Environments" rel="nofollow" 
href="https://www.redhat.com/en/about/press-releases/red-hat-introduces-rhel-10">Red Hat Introduces Red Hat Enterprise Linux 10 with Supercharged Intelligence and Security Across Hybrid Environments</a> &mdash; Red Hat Enterprise Linux 10 delivers a paradigm shift in enterprise operating systems with image mode.</li><li><a title="10.0 Release Notes | Red Hat Enterprise Linux | 10 | Red Hat Documentation" rel="nofollow" href="https://docs.redhat.com/en/documentation/red_hat_enterprise_linux/10/html/10.0_release_notes/index">10.0 Release Notes | Red Hat Enterprise Linux | 10 | Red Hat Documentation</a></li><li><a title="Red Hat Enterprise Linux 10 Officially Released, Here&#39;s What&#39;s New" rel="nofollow" href="https://9to5linux.com/red-hat-enterprise-linux-10-officially-released-heres-whats-new">Red Hat Enterprise Linux 10 Officially Released, Here's What's New</a> &mdash; Red Hat Enterprise Linux 10 highlights include Red Hat Enterprise Linux Lightspeed for integrating generative AI directly within the platform to provide users with context-aware guidance and actionable recommendations through a natural language interface.</li><li><a title="RHEL 10: Leading the future with AI, security and hybrid cloud" rel="nofollow" href="https://siliconangle.com/2025/05/23/rhel-10-staying-ahead-ai-security-cloud-rhsummit/">RHEL 10: Leading the future with AI, security and hybrid cloud</a></li><li><a title="Red Hat Enterprise Linux 10 Reaches GA" rel="nofollow" href="https://www.phoronix.com/news/Red-Hat-RHEL-10-GA">Red Hat Enterprise Linux 10 Reaches GA</a></li><li><a title="SiFive Collaborates with Red Hat to Support Red Hat Enterprise Linux for RISC-V" rel="nofollow" href="https://www.sifive.com/press/sifive-collaborates-with-red-hat-support-enterprise-linux-risc-v">SiFive Collaborates with Red Hat to Support Red Hat Enterprise Linux for RISC-V</a> &mdash; The developer preview of Red Hat Enterprise Linux 10 is initially available for use on the SiFive HiFive Premier P550 
platform.</li><li><a title="Red Hat AI on Hugging Face" rel="nofollow" href="https://huggingface.co/RedHatAI">Red Hat AI on Hugging Face</a></li><li><a title="FIPS 203/204/205" rel="nofollow" href="https://www.federalregister.gov/documents/2024/08/14/2024-17956/announcing-issuance-of-federal-information-processing-standards-fips-fips-203-module-lattice-based">FIPS 203/204/205</a> &mdash; These standards specify key establishment and digital signature schemes that are designed to resist future attacks by quantum computers, which threaten the security of current standards.</li><li><a title="Virtualization success stories: Join Red Hat OpenShift Virtualization&#39;s momentum in 2025" rel="nofollow" href="https://www.redhat.com/en/blog/join-red-hat-openshift-virtualizations-momentum-2025">Virtualization success stories: Join Red Hat OpenShift Virtualization's momentum in 2025</a></li><li><a title="llm-d" rel="nofollow" href="https://github.com/llm-d/llm-d">llm-d</a> &mdash; llm-d is a Kubernetes-native high-performance distributed LLM inference framework</li><li><a title="What is vLLM?" rel="nofollow" href="https://www.redhat.com/en/topics/ai/what-is-vllm">What is vLLM?</a> &mdash; vLLM is an inference server that speeds up the output of generative AI applications by making better use of the GPU memory.</li><li><a title="Image mode for Red Hat Enterprise Linux" rel="nofollow" href="https://www.redhat.com/en/technologies/linux-platforms/enterprise-linux-10/image-mode">Image mode for Red Hat Enterprise Linux</a> &mdash; Image mode leverages the bootc tool to build and deploy Red Hat Enterprise Linux. 
Bootc stands for bootable container, and the image will include the kernel, bootloader, and other items typically excluded from application containers.</li><li><a title="Image mode for Red Hat Enterprise Linux Overview" rel="nofollow" href="https://developers.redhat.com/products/rhel-image-mode/overview">Image mode for Red Hat Enterprise Linux Overview</a></li><li><a title="Introducing Fedora Project Leader Jef Spaleta" rel="nofollow" href="https://fedoramagazine.org/introducing-fedora-project-leader-jef-spaleta/">Introducing Fedora Project Leader Jef Spaleta</a></li><li><a title="Bluefin" rel="nofollow" href="https://projectbluefin.io/">Bluefin</a> &mdash; Featuring automatic image-based updates and a simple graphical application store, Bluefin is designed to get out of your way. Get what you want without sacrificing system stability.</li><li><a title="KongrooParadox&#39;s nixfiles" rel="nofollow" href="https://github.com/KongrooParadox/nixfiles">KongrooParadox's nixfiles</a> &mdash; This was my second nixos release since getting into Nix last year (February I think), and this strategy made it really painless. 
No surprises about deprecated options since I saw these cases slowly when these changes hit unstable.</li><li><a title="yazi: Blazing fast terminal file manager written in Rust, based on async I/O" rel="nofollow" href="https://github.com/sxyazi/yazi">yazi: Blazing fast terminal file manager written in Rust, based on async I/O</a></li><li><a title="jira-cli - Feature-rich interactive Jira command line" rel="nofollow" href="https://github.com/ankitpokhrel/jira-cli">jira-cli - Feature-rich interactive Jira command line</a></li><li><a title="browser-use" rel="nofollow" href="https://github.com/browser-use/browser-use">browser-use</a> &mdash; Make websites accessible for AI agents</li><li><a title="Thomato&#39;s TUI Resources" rel="nofollow" href="https://tui.chef-li.eu/">Thomato's TUI Resources</a></li><li><a title="Pick: RamaLama" rel="nofollow" href="https://ramalama.ai/#about">Pick: RamaLama</a> &mdash; Make working with AI boring through the use of OCI containers.</li><li><a title="ramalama on GitHub" rel="nofollow" href="https://github.com/containers/ramalama">ramalama on GitHub</a> &mdash; RamaLama is an open-source developer tool that simplifies the local serving of AI models from any source and facilitates their use for inference in production, all through the familiar language of containers.</li></ul>]]>
  </itunes:summary>
</item>
<item>
  <title>525: Beating Apple to the Sauce</title>
  <link>https://linuxunplugged.com/525</link>
  <guid isPermaLink="false">3b6e8589-19d1-4f16-893d-1dc3bce41ab1</guid>
  <pubDate>Sun, 27 Aug 2023 19:45:00 -0700</pubDate>
  <author>Jupiter Broadcasting</author>
  <enclosure url="https://aphid.fireside.fm/d/1437767933/f31a453c-fa15-491f-8618-3f71f1d565e5/3b6e8589-19d1-4f16-893d-1dc3bce41ab1.mp3" length="60772071" type="audio/mpeg"/>
  <itunes:episodeType>full</itunes:episodeType>
  <itunes:author>Jupiter Broadcasting</itunes:author>
  <itunes:subtitle>We daily drive Asahi Linux on a MacBook, chat about how the team beat Apple to a major GPU milestone, and an easy way to self-host open-source ChatGPT alternatives.</itunes:subtitle>
  <itunes:duration>1:12:20</itunes:duration>
  <itunes:explicit>no</itunes:explicit>
  <itunes:image href="https://media24.fireside.fm/file/fireside-images-2024/podcasts/images/f/f31a453c-fa15-491f-8618-3f71f1d565e5/cover.jpg?v=3"/>
  <description>We daily drive Asahi Linux on a MacBook, chat about how the team beat Apple to a major GPU milestone, and an easy way to self-host open-source ChatGPT alternatives. Special Guest: Neal Gompa.
</description>
  <itunes:keywords>Jupiter Broadcasting, Linux Podcast, Linux Unplugged, 🦙, Hector Martin, telemetry, Asahi Linux, Fedora, Fedora Asahi Remix, Arm, Apple Silicon, ARM64, macOS, Apple, Arch ARM, Neal Gompa, Davide Cavalca, Gallium3D, OpenGL ES 3.1, GPU, M1, M2, conformant GPU driver, Alyssa Rosenzweig, dual booting, UEFI, thunderbolt, Plasma, GPU acceleration, battery life, KDE, 16k pages, 16k kernel, Mac Mini, SIP, VoIP, Jitsi Meet, Mattermost, XFS, HPC, JBOD, xfs_repair, filesystem, data loss, server temperature, data center, NixOS, RDP, VNC, immutability, impermanence, ZFS, Btrfs, LUKS, OpenStreetMap, StreetComplete, LVM, disk encryption, Organic Maps, openSUSE Tumbleweed, OnePlus 6, Snapdragon 845, KDE Connect, Llama 2, Meta, OpenAI, ChatGPT, LLM, llama-gpt, llama.cpp, AI, ML, Umbrel, self-hosting, open source AI</itunes:keywords>
  <content:encoded>
    <![CDATA[<p>We daily drive Asahi Linux on a MacBook, chat about how the team beat Apple to a major GPU milestone, and an easy way to self-host open-source ChatGPT alternatives.</p><p>Special Guest: Neal Gompa.</p><p>Sponsored By:</p><ul><li><a rel="nofollow" href="http://tailscale.com/linuxunplugged">Tailscale</a>: <a rel="nofollow" href="http://tailscale.com/linuxunplugged">Tailscale is a programmable networking software that is private and secure by default - get it free on up to 100 devices!</a></li><li><a rel="nofollow" href="https://linode.com/unplugged">Linode Cloud Hosting</a>: <a rel="nofollow" href="https://linode.com/unplugged">A special offer for all Linux Unplugged Podcast listeners and new Linode customers, visit linode.com/unplugged, and receive $100 towards your new account. </a></li><li><a rel="nofollow" href="https://1password.com/unplugged">1Password Extended Access Management</a>: <a rel="nofollow" href="https://1password.com/unplugged">Secure every sign-in for every app on every device.</a></li></ul><p><a rel="payment" href="https://jupitersignal.memberful.com/checkout?plan=52946">Support LINUX Unplugged</a></p><p>Links:</p><ul><li><a title="🎉 Alby" rel="nofollow" href="https://getalby.com/">🎉 Alby</a> &mdash; Boost into the show, first grab Alby, top it off, and then head over to the Podcast Index.</li><li><a title="⚡️ LINUX Unplugged on the Podcastindex.org" rel="nofollow" href="https://podcastindex.org/podcast/575694">⚡️ LINUX Unplugged on the Podcastindex.org</a> &mdash; You can boost from the web. 
Once Alby is topped off, visit our page on the Podcast Index.</li><li><a title="Hector Martin&#39;s Controversial Question" rel="nofollow" href="https://social.treehouse.systems/@marcan/110837288605832455">Hector Martin's Controversial Question</a> &mdash; Would you be okay with us adding some really trivial telemetry to the Asahi installer?</li><li><a title="Berlin with Brent" rel="nofollow" href="https://www.meetup.com/jupiterbroadcasting/events/295135448/">Berlin with Brent</a> &mdash; Brent will be back in Berlin for the Nextcloud Conference and can't get enough of Berlin Meetups! Friday, September 8th, 6 PM.</li><li><a title="Fedora Asahi Remix" rel="nofollow" href="https://fedora-asahi-remix.org/">Fedora Asahi Remix</a></li><li><a title="Fedora Asahi Remix Coming For Fedora Linux On Apple Silicon Hardware" rel="nofollow" href="https://www.phoronix.com/news/Fedora-Asahi-Remix-Coming">Fedora Asahi Remix Coming For Fedora Linux On Apple Silicon Hardware</a> &mdash; Fedora Asahi Remix will be their new flagship distribution for providing a polished Linux experience on Apple Silicon.</li><li><a title="Fedora Asahi Remix: bringing Fedora to Apple Silicon Macs (Flock To Fedora 2023)" rel="nofollow" href="https://www.youtube.com/watch?v=bD2R4Yt8m88">Fedora Asahi Remix: bringing Fedora to Apple Silicon Macs (Flock To Fedora 2023)</a></li><li><a title="Our new flagship distro: Fedora Asahi Remix" rel="nofollow" href="https://asahilinux.org/2023/08/fedora-asahi-remix/">Our new flagship distro: Fedora Asahi Remix</a> &mdash; We’re still working out the kinks and making things even better, so we are not quite ready to call this a release yet. We aim to officially release the Fedora Asahi Remix by the end of August 2023. 
Look forward to many new features, machine support, and more!</li><li><a title="Hector Martin: “Okay, I’m going to be honest…”" rel="nofollow" href="https://social.treehouse.systems/@marcan/109971521711413167">Hector Martin: “Okay, I’m going to be honest…”</a> &mdash; I apologize to all Asahi Linux users. You deserve better. When I chose Arch Linux ARM as a base I didn't realize it would have so many basic QA issues.</li><li><a title="Coming soon: Fedora for Apple Silicon Macs! (Fedora Discourse)" rel="nofollow" href="https://discussion.fedoraproject.org/t/coming-soon-fedora-for-apple-silicon-macs/86745">Coming soon: Fedora for Apple Silicon Macs! (Fedora Discourse)</a></li><li><a title="The first conformant M1 GPU driver" rel="nofollow" href="https://rosenzweig.io/blog/first-conformant-m1-gpu-driver.html">The first conformant M1 GPU driver</a> &mdash; Our reverse-engineered, free and open source graphics drivers are the world’s only conformant OpenGL ES 3.1 implementation for M1- and M2-family graphics hardware. 
That means our driver passed tens of thousands of tests to demonstrate correctness and is now recognized by the industry.</li><li><a title="Asahi Linux’s Apple M1/M2 Gallium3D Driver Now OpenGL ES 3.1 Conformant" rel="nofollow" href="https://www.phoronix.com/news/Asahi-Linux-GLES-3.1-AGX-M1-M2">Asahi Linux’s Apple M1/M2 Gallium3D Driver Now OpenGL ES 3.1 Conformant</a> &mdash; It's even more rewarding for the community developers in that Apple doesn't provide any conformant (OpenGL or Vulkan) graphics drivers for their Arm-based platform.</li><li><a title="Feature Support · AsahiLinux/docs Wiki" rel="nofollow" href="https://github.com/AsahiLinux/docs/wiki/Feature-Support">Feature Support · AsahiLinux/docs Wiki</a></li><li><a title="Switch to the kernel-16k variant - Fedora Discussion" rel="nofollow" href="https://discussion.fedoraproject.org/t/switch-to-the-kernel-16k-variant/87711">Switch to the kernel-16k variant - Fedora Discussion</a></li><li><a title="NixOS: Unlocking your LUKS via SSH and Tor" rel="nofollow" href="https://nixos.wiki/wiki/Remote_LUKS_Unlocking">NixOS: Unlocking your LUKS via SSH and Tor</a></li><li><a title="StreetComplete" rel="nofollow" href="https://github.com/streetcomplete/StreetComplete">StreetComplete</a> &mdash; Easy to use OpenStreetMap editor for Android.</li><li><a title="getumbrel/llama-gpt: A self-hosted, offline, ChatGPT-like chatbot. Powered by Llama 2. 100% private, with no data leaving your device." rel="nofollow" href="https://github.com/getumbrel/llama-gpt">getumbrel/llama-gpt: A self-hosted, offline, ChatGPT-like chatbot. Powered by Llama 2. 100% private, with no data leaving your device.</a></li><li><a title="serge-chat/serge: A web interface for chatting with Alpaca through llama.cpp. Fully dockerized, with an easy to use API." rel="nofollow" href="https://github.com/serge-chat/serge">serge-chat/serge: A web interface for chatting with Alpaca through llama.cpp. 
Fully dockerized, with an easy to use API.</a></li><li><a title="liltom-eth/llama2-webui: Run any Llama 2 locally with gradio UI on GPU or CPU from anywhere (Linux/Windows/Mac)." rel="nofollow" href="https://github.com/liltom-eth/llama2-webui">liltom-eth/llama2-webui: Run any Llama 2 locally with gradio UI on GPU or CPU from anywhere (Linux/Windows/Mac).</a></li><li><a title="llama.cpp" rel="nofollow" href="https://github.com/ggerganov/llama.cpp">llama.cpp</a> &mdash; Port of Facebook’s LLaMA model in C/C++</li><li><a title="Llama2.c" rel="nofollow" href="https://github.com/karpathy/llama2.c">Llama2.c</a> &mdash; Inference Llama 2 in one file of pure C</li><li><a title="Koboldcpp" rel="nofollow" href="https://github.com/LostRuins/koboldcpp">Koboldcpp</a> &mdash; A simple one-file way to run various GGML models with KoboldAI’s UI</li><li><a title="lollms-webui" rel="nofollow" href="https://github.com/ParisNeo/lollms-webui">lollms-webui</a> &mdash; Lord of Large Language Models Web User Interface</li><li><a title="LM Studio" rel="nofollow" href="https://lmstudio.ai/">LM Studio</a> &mdash; Discover, download, and run local LLMs</li><li><a title="text-generation-webui" rel="nofollow" href="https://github.com/oobabooga/text-generation-webui">text-generation-webui</a> &mdash; A Gradio web UI for Large Language Models. 
Supports transformers, GPTQ, llama.cpp (ggml/gguf), Llama models.</li><li><a title="A comprehensive guide to running Llama 2 locally" rel="nofollow" href="https://replicate.com/blog/run-llama-locally">A comprehensive guide to running Llama 2 locally</a> &mdash; Code Llama is a state-of-the-art LLM capable of generating code, and natural language about code, from both code and natural language prompts.</li><li><a title="Meta Releases Code Llama, a Coding Version of Llama 2" rel="nofollow" href="https://www.wired.com/story/meta-code-llama/">Meta Releases Code Llama, a Coding Version of Llama 2</a></li><li><a title="Introducing Code Llama, a state-of-the-art large language model for coding" rel="nofollow" href="https://ai.meta.com/blog/code-llama-large-language-model-coding/">Introducing Code Llama, a state-of-the-art large language model for coding</a></li><li><a title="Llama and ChatGPT Are Not Open-Source" rel="nofollow" href="https://spectrum.ieee.org/open-source-llm-not-open">Llama and ChatGPT Are Not Open-Source</a></li><li><a title="Meta launches Llama 2, a source-available AI model that allows commercial applications" rel="nofollow" href="https://arstechnica.com/information-technology/2023/07/meta-launches-llama-2-an-open-source-ai-model-that-allows-commercial-applications/">Meta launches Llama 2, a source-available AI model that allows commercial applications</a> &mdash; A family of pretrained and fine-tuned language models in sizes from 7 to 70 billion parameters.</li><li><a title="Meta’s Llama 2 is not open source" rel="nofollow" href="https://www.theregister.com/2023/07/21/llama_is_not_open_source/">Meta’s Llama 2 is not open source</a> &mdash; Meta's newly released large language model Llama 2 is not open source.</li></ul>]]>
  </content:encoded>
  <itunes:summary>
    <![CDATA[<p>We daily drive Asahi Linux on a MacBook, chat about how the team beat Apple to a major GPU milestone, and an easy way to self-host open-source ChatGPT alternatives.</p><p>Special Guest: Neal Gompa.</p><p>Sponsored By:</p><ul><li><a rel="nofollow" href="http://tailscale.com/linuxunplugged">Tailscale</a>: <a rel="nofollow" href="http://tailscale.com/linuxunplugged">Tailscale is a programmable networking software that is private and secure by default - get it free on up to 100 devices!</a></li><li><a rel="nofollow" href="https://linode.com/unplugged">Linode Cloud Hosting</a>: <a rel="nofollow" href="https://linode.com/unplugged">A special offer for all Linux Unplugged Podcast listeners and new Linode customers, visit linode.com/unplugged, and receive $100 towards your new account. </a></li><li><a rel="nofollow" href="https://1password.com/unplugged">1Password Extended Access Management</a>: <a rel="nofollow" href="https://1password.com/unplugged">Secure every sign-in for every app on every device.</a></li></ul><p><a rel="payment" href="https://jupitersignal.memberful.com/checkout?plan=52946">Support LINUX Unplugged</a></p><p>Links:</p><ul><li><a title="🎉 Alby" rel="nofollow" href="https://getalby.com/">🎉 Alby</a> &mdash; Boost into the show, first grab Alby, top it off, and then head over to the Podcast Index.</li><li><a title="⚡️ LINUX Unplugged on the Podcastindex.org" rel="nofollow" href="https://podcastindex.org/podcast/575694">⚡️ LINUX Unplugged on the Podcastindex.org</a> &mdash; You can boost from the web. 
Once Alby is topped off, visit our page on the Podcast Index.</li><li><a title="Hector Martin&#39;s Controversial Question" rel="nofollow" href="https://social.treehouse.systems/@marcan/110837288605832455">Hector Martin's Controversial Question</a> &mdash; Would you be okay with us adding some really trivial telemetry to the Asahi installer?</li><li><a title="Berlin with Brent" rel="nofollow" href="https://www.meetup.com/jupiterbroadcasting/events/295135448/">Berlin with Brent</a> &mdash; Brent will be back in Berlin for the Nextcloud Conference and can't get enough of Berlin Meetups! Friday, September 8th, 6 PM.</li><li><a title="Fedora Asahi Remix" rel="nofollow" href="https://fedora-asahi-remix.org/">Fedora Asahi Remix</a></li><li><a title="Fedora Asahi Remix Coming For Fedora Linux On Apple Silicon Hardware" rel="nofollow" href="https://www.phoronix.com/news/Fedora-Asahi-Remix-Coming">Fedora Asahi Remix Coming For Fedora Linux On Apple Silicon Hardware</a> &mdash; Fedora Asahi Remix will be their new flagship distribution for providing a polished Linux experience on Apple Silicon.</li><li><a title="Fedora Asahi Remix: bringing Fedora to Apple Silicon Macs (Flock To Fedora 2023)" rel="nofollow" href="https://www.youtube.com/watch?v=bD2R4Yt8m88">Fedora Asahi Remix: bringing Fedora to Apple Silicon Macs (Flock To Fedora 2023)</a></li><li><a title="Our new flagship distro: Fedora Asahi Remix" rel="nofollow" href="https://asahilinux.org/2023/08/fedora-asahi-remix/">Our new flagship distro: Fedora Asahi Remix</a> &mdash; We’re still working out the kinks and making things even better, so we are not quite ready to call this a release yet. We aim to officially release the Fedora Asahi Remix by the end of August 2023. 
Look forward to many new features, machine support, and more!</li><li><a title="Hector Martin: “Okay, I’m going to be honest…”" rel="nofollow" href="https://social.treehouse.systems/@marcan/109971521711413167">Hector Martin: “Okay, I’m going to be honest…”</a> &mdash; I apologize to all Asahi Linux users. You deserve better. When I chose Arch Linux ARM as a base I didn't realize it would have so many basic QA issues.</li><li><a title="Coming soon: Fedora for Apple Silicon Macs! (Fedora Discourse)" rel="nofollow" href="https://discussion.fedoraproject.org/t/coming-soon-fedora-for-apple-silicon-macs/86745">Coming soon: Fedora for Apple Silicon Macs! (Fedora Discourse)</a></li><li><a title="The first conformant M1 GPU driver" rel="nofollow" href="https://rosenzweig.io/blog/first-conformant-m1-gpu-driver.html">The first conformant M1 GPU driver</a> &mdash; Our reverse-engineered, free and open source graphics drivers are the world’s only conformant OpenGL ES 3.1 implementation for M1- and M2-family graphics hardware. 
That means our driver passed tens of thousands of tests to demonstrate correctness and is now recognized by the industry.</li><li><a title="Asahi Linux’s Apple M1/M2 Gallium3D Driver Now OpenGL ES 3.1 Conformant" rel="nofollow" href="https://www.phoronix.com/news/Asahi-Linux-GLES-3.1-AGX-M1-M2">Asahi Linux’s Apple M1/M2 Gallium3D Driver Now OpenGL ES 3.1 Conformant</a> &mdash; It's even more rewarding for the community developers in that Apple doesn't provide any conformant (OpenGL or Vulkan) graphics drivers for their Arm-based platform.</li><li><a title="Feature Support · AsahiLinux/docs Wiki" rel="nofollow" href="https://github.com/AsahiLinux/docs/wiki/Feature-Support">Feature Support · AsahiLinux/docs Wiki</a></li><li><a title="Switch to the kernel-16k variant - Fedora Discussion" rel="nofollow" href="https://discussion.fedoraproject.org/t/switch-to-the-kernel-16k-variant/87711">Switch to the kernel-16k variant - Fedora Discussion</a></li><li><a title="NixOS: Unlocking your LUKS via SSH and Tor" rel="nofollow" href="https://nixos.wiki/wiki/Remote_LUKS_Unlocking">NixOS: Unlocking your LUKS via SSH and Tor</a></li><li><a title="StreetComplete" rel="nofollow" href="https://github.com/streetcomplete/StreetComplete">StreetComplete</a> &mdash; Easy to use OpenStreetMap editor for Android.</li><li><a title="getumbrel/llama-gpt: A self-hosted, offline, ChatGPT-like chatbot. Powered by Llama 2. 100% private, with no data leaving your device." rel="nofollow" href="https://github.com/getumbrel/llama-gpt">getumbrel/llama-gpt: A self-hosted, offline, ChatGPT-like chatbot. Powered by Llama 2. 100% private, with no data leaving your device.</a></li><li><a title="serge-chat/serge: A web interface for chatting with Alpaca through llama.cpp. Fully dockerized, with an easy to use API." rel="nofollow" href="https://github.com/serge-chat/serge">serge-chat/serge: A web interface for chatting with Alpaca through llama.cpp. 
Fully dockerized, with an easy to use API.</a></li><li><a title="liltom-eth/llama2-webui: Run any Llama 2 locally with gradio UI on GPU or CPU from anywhere (Linux/Windows/Mac)." rel="nofollow" href="https://github.com/liltom-eth/llama2-webui">liltom-eth/llama2-webui: Run any Llama 2 locally with gradio UI on GPU or CPU from anywhere (Linux/Windows/Mac).</a></li><li><a title="llama.cpp" rel="nofollow" href="https://github.com/ggerganov/llama.cpp">llama.cpp</a> &mdash; Port of Facebook’s LLaMA model in C/C++</li><li><a title="Llama2.c" rel="nofollow" href="https://github.com/karpathy/llama2.c">Llama2.c</a> &mdash; Inference Llama 2 in one file of pure C</li><li><a title="Koboldcpp" rel="nofollow" href="https://github.com/LostRuins/koboldcpp">Koboldcpp</a> &mdash; A simple one-file way to run various GGML models with KoboldAI’s UI</li><li><a title="lollms-webui" rel="nofollow" href="https://github.com/ParisNeo/lollms-webui">lollms-webui</a> &mdash; Lord of Large Language Models Web User Interface</li><li><a title="LM Studio" rel="nofollow" href="https://lmstudio.ai/">LM Studio</a> &mdash; Discover, download, and run local LLMs</li><li><a title="text-generation-webui" rel="nofollow" href="https://github.com/oobabooga/text-generation-webui">text-generation-webui</a> &mdash; A Gradio web UI for Large Language Models. 
Supports transformers, GPTQ, llama.cpp (ggml/gguf), Llama models.</li><li><a title="A comprehensive guide to running Llama 2 locally" rel="nofollow" href="https://replicate.com/blog/run-llama-locally">A comprehensive guide to running Llama 2 locally</a> &mdash; Code Llama is a state-of-the-art LLM capable of generating code, and natural language about code, from both code and natural language prompts.</li><li><a title="Meta Releases Code Llama, a Coding Version of Llama 2" rel="nofollow" href="https://www.wired.com/story/meta-code-llama/">Meta Releases Code Llama, a Coding Version of Llama 2</a></li><li><a title="Introducing Code Llama, a state-of-the-art large language model for coding" rel="nofollow" href="https://ai.meta.com/blog/code-llama-large-language-model-coding/">Introducing Code Llama, a state-of-the-art large language model for coding</a></li><li><a title="Llama and ChatGPT Are Not Open-Source" rel="nofollow" href="https://spectrum.ieee.org/open-source-llm-not-open">Llama and ChatGPT Are Not Open-Source</a></li><li><a title="Meta launches Llama 2, a source-available AI model that allows commercial applications" rel="nofollow" href="https://arstechnica.com/information-technology/2023/07/meta-launches-llama-2-an-open-source-ai-model-that-allows-commercial-applications/">Meta launches Llama 2, a source-available AI model that allows commercial applications</a> &mdash; A family of pretrained and fine-tuned language models in sizes from 7 to 70 billion parameters.</li><li><a title="Meta’s Llama 2 is not open source" rel="nofollow" href="https://www.theregister.com/2023/07/21/llama_is_not_open_source/">Meta’s Llama 2 is not open source</a> &mdash; Meta's newly released large language model Llama 2 is not open source.</li></ul>]]>
  </itunes:summary>
</item>
  </channel>
</rss>
