{"id":206,"date":"2026-03-11T16:06:01","date_gmt":"2026-03-11T16:06:01","guid":{"rendered":"https:\/\/rebaihamida.com\/?page_id=206"},"modified":"2026-03-11T18:18:33","modified_gmt":"2026-03-11T18:18:33","slug":"are-you-already-running-ai-workloads-in-docker","status":"publish","type":"page","link":"https:\/\/rebaihamida.com\/?page_id=206","title":{"rendered":"Docker&#8217;s New Direction"},"content":{"rendered":"\n<p>How Docker is reinventing itself as the secure backbone of agentic AI development, without abandoning the developers who built the container revolution.<\/p>\n\n\n\n<!DOCTYPE html>\n<html lang=\"en\">\n<head>\n<meta charset=\"UTF-8\">\n<meta name=\"viewport\" content=\"width=device-width, initial-scale=1.0\">\n<title>Docker&#8217;s New Direction: Security, AI, and the Next Generation of Container Development<\/title>\n<style>\n  @import url('https:\/\/fonts.googleapis.com\/css2?family=Merriweather:ital,wght@0,300;0,400;0,700;1,300;1,400&family=Inter:wght@400;500;600;700&display=swap');\n\n  *, *::before, *::after { box-sizing: border-box; margin: 0; padding: 0; }\n\n  :root {\n    --blue: #0066cc;\n    --dark-blue: #003d7a;\n    --light-blue: #e8f1fb;\n    --text: #1a1a2e;\n    --muted: #556070;\n    --border: #d8e4f0;\n    --accent: #00b4d8;\n    --bg: #f9fbff;\n  }\n\n  body {\n    font-family: 'Merriweather', Georgia, serif;\n    background: var(--bg);\n    color: var(--text);\n    line-height: 1.85;\n    font-size: 17px;\n  }\n\n  \/* \u2500\u2500 HEADER \u2500\u2500 *\/\n  header {\n    background: linear-gradient(135deg, #001d40 0%, #003d7a 55%, #005cb8 100%);\n    color: #fff;\n    padding: 64px 24px 56px;\n    text-align: center;\n    position: relative;\n    overflow: hidden;\n  }\n  header::before {\n    content: '';\n    position: absolute; inset: 0;\n    background: radial-gradient(ellipse 80% 60% at 50% 110%, rgba(0,180,216,.25) 0%, transparent 70%);\n    pointer-events: none;\n  }\n  .kicker {\n    font-family: 'Inter', sans-serif;\n    
font-size: 11px;\n    font-weight: 700;\n    letter-spacing: 3px;\n    text-transform: uppercase;\n    color: var(--accent);\n    margin-bottom: 18px;\n  }\n  header h1 {\n    font-family: 'Merriweather', serif;\n    font-size: clamp(26px, 4vw, 44px);\n    font-weight: 700;\n    line-height: 1.2;\n    max-width: 820px;\n    margin: 0 auto 22px;\n    letter-spacing: -0.5px;\n  }\n  .subtitle {\n    font-size: 17px;\n    font-weight: 300;\n    font-style: italic;\n    color: rgba(255,255,255,.75);\n    max-width: 640px;\n    margin: 0 auto 30px;\n  }\n  .meta {\n    font-family: 'Inter', sans-serif;\n    font-size: 12px;\n    color: rgba(255,255,255,.5);\n    letter-spacing: 1px;\n  }\n\n  \/* \u2500\u2500 BODY \u2500\u2500 *\/\n  .article-wrap {\n    max-width: 760px;\n    margin: 0 auto;\n    padding: 60px 24px 80px;\n  }\n\n  \/* drop-cap on first paragraph *\/\n  .lead::first-letter {\n    float: left;\n    font-size: 72px;\n    line-height: 0.75;\n    margin: 6px 10px 0 0;\n    font-weight: 700;\n    color: var(--blue);\n    font-family: 'Merriweather', serif;\n  }\n\n  p { margin-bottom: 1.6em; }\n\n  h2 {\n    font-family: 'Inter', sans-serif;\n    font-size: 22px;\n    font-weight: 700;\n    color: var(--dark-blue);\n    margin: 2.6em 0 0.8em;\n    padding-bottom: 10px;\n    border-bottom: 2px solid var(--border);\n    line-height: 1.3;\n  }\n\n  h3 {\n    font-family: 'Inter', sans-serif;\n    font-size: 17px;\n    font-weight: 600;\n    color: var(--blue);\n    margin: 2em 0 0.6em;\n  }\n\n  \/* pull-quote *\/\n  blockquote {\n    border-left: 4px solid var(--accent);\n    background: var(--light-blue);\n    margin: 2.4em 0;\n    padding: 24px 28px;\n    border-radius: 0 8px 8px 0;\n  }\n  blockquote p {\n    font-size: 19px;\n    font-style: italic;\n    color: var(--dark-blue);\n    margin: 0 0 8px;\n    line-height: 1.6;\n  }\n  blockquote cite {\n    font-family: 'Inter', sans-serif;\n    font-size: 12px;\n    font-weight: 600;\n    color: 
var(--muted);\n    text-transform: uppercase;\n    letter-spacing: 1px;\n    font-style: normal;\n  }\n\n  \/* stat cards *\/\n  .stats-grid {\n    display: grid;\n    grid-template-columns: repeat(auto-fit, minmax(160px, 1fr));\n    gap: 16px;\n    margin: 2.4em 0;\n  }\n  .stat-card {\n    background: #fff;\n    border: 1px solid var(--border);\n    border-radius: 10px;\n    padding: 20px 16px;\n    text-align: center;\n    box-shadow: 0 2px 8px rgba(0,60,140,.06);\n  }\n  .stat-card .num {\n    font-family: 'Inter', sans-serif;\n    font-size: 30px;\n    font-weight: 700;\n    color: var(--blue);\n    line-height: 1;\n    margin-bottom: 6px;\n  }\n  .stat-card .label {\n    font-family: 'Inter', sans-serif;\n    font-size: 12px;\n    color: var(--muted);\n    line-height: 1.4;\n  }\n\n  \/* callout box *\/\n  .callout {\n    background: #fff;\n    border: 1px solid var(--border);\n    border-top: 4px solid var(--blue);\n    border-radius: 0 0 8px 8px;\n    padding: 24px 28px;\n    margin: 2.4em 0;\n    box-shadow: 0 2px 10px rgba(0,60,140,.07);\n  }\n  .callout-title {\n    font-family: 'Inter', sans-serif;\n    font-size: 11px;\n    font-weight: 700;\n    text-transform: uppercase;\n    letter-spacing: 2px;\n    color: var(--blue);\n    margin-bottom: 12px;\n  }\n  .callout p { margin-bottom: 0.8em; font-size: 15.5px; }\n  .callout p:last-child { margin-bottom: 0; }\n\n  \/* timeline *\/\n  .timeline { margin: 2em 0; padding-left: 0; list-style: none; }\n  .timeline li {\n    display: flex;\n    gap: 18px;\n    margin-bottom: 20px;\n    align-items: flex-start;\n  }\n  .tl-dot {\n    flex-shrink: 0;\n    width: 38px; height: 38px;\n    background: var(--blue);\n    color: #fff;\n    border-radius: 50%;\n    display: flex;\n    align-items: center;\n    justify-content: center;\n    font-family: 'Inter', sans-serif;\n    font-size: 11px;\n    font-weight: 700;\n    margin-top: 2px;\n  }\n  .tl-body strong {\n    font-family: 'Inter', sans-serif;\n    font-size: 
14px;\n    font-weight: 700;\n    color: var(--dark-blue);\n    display: block;\n    margin-bottom: 2px;\n  }\n  .tl-body span { font-size: 15px; color: var(--muted); }\n\n  \/* conclusion stripe *\/\n  .conclusion-box {\n    background: linear-gradient(135deg, #001d40, #003d7a);\n    color: #fff;\n    border-radius: 12px;\n    padding: 36px 36px;\n    margin-top: 3em;\n  }\n  .conclusion-box h2 {\n    font-family: 'Inter', sans-serif;\n    color: var(--accent);\n    border-bottom-color: rgba(255,255,255,.15);\n    font-size: 20px;\n    margin-top: 0;\n  }\n  .conclusion-box p { color: rgba(255,255,255,.85); margin-bottom: 1em; }\n\n  \/* footer *\/\n  footer {\n    text-align: center;\n    padding: 32px 24px;\n    font-family: 'Inter', sans-serif;\n    font-size: 12px;\n    color: var(--muted);\n    border-top: 1px solid var(--border);\n  }\n\n  @media (max-width: 560px) {\n    header { padding: 48px 20px 40px; }\n    .stats-grid { grid-template-columns: 1fr 1fr; }\n    .conclusion-box { padding: 28px 22px; }\n  }\n<\/style>\n<\/head>\n<body>\n\n\n<div class=\"article-wrap\">\n\n  <p class=\"lead\">For more than a decade, Docker&#8217;s name has been synonymous with containers \u2014 the lightweight, portable packaging format that quietly transformed how software gets built, shipped, and run. But in 2025, Docker made moves that signal something larger than an incremental product update. The company repositioned itself at the intersection of two of the most pressing concerns in modern software: supply chain security and the explosive rise of agentic AI. 
The result is a Docker that looks familiar on the surface, yet is fundamentally different underneath.<\/p>\n\n  <p>The twin pillars of this transformation \u2014 the free release of Docker Hardened Images and the launch of the Docker MCP Catalog and Toolkit \u2014 represent a deliberate bet that Docker&#8217;s next decade won&#8217;t be defined by containerization alone, but by who controls the secure, trusted infrastructure on which AI-powered development runs.<\/p>\n\n  <div class=\"stats-grid\">\n    <div class=\"stat-card\"><div class=\"num\">20M+<\/div><div class=\"label\">Registered developers on Docker<\/div><\/div>\n    <div class=\"stat-card\"><div class=\"num\">20B+<\/div><div class=\"label\">Monthly Docker Hub pulls<\/div><\/div>\n    <div class=\"stat-card\"><div class=\"num\">1,000+<\/div><div class=\"label\">Hardened images now free<\/div><\/div>\n    <div class=\"stat-card\"><div class=\"num\">95%<\/div><div class=\"label\">Reduction in attack surface vs. traditional base images<\/div><\/div>\n    <div class=\"stat-card\"><div class=\"num\">$60B<\/div><div class=\"label\">Supply chain attack cost (2025, projected)<\/div><\/div>\n    <div class=\"stat-card\"><div class=\"num\">300+<\/div><div class=\"label\">Verified MCP servers in catalog<\/div><\/div>\n  <\/div>\n\n  <h2>The Security Reckoning<\/h2>\n\n  <p>The context for Docker&#8217;s pivot to security is hard to overstate. Supply chain attacks \u2014 exploiting vulnerabilities baked into the open source libraries and base images that power virtually every modern application \u2014 have surged in frequency and cost. By 2025, the damage had tripled from 2021 levels, projected to exceed $60 billion globally. 
Every language, every ecosystem, every build and distribution step became a potential target.<\/p>\n\n  <p>Against this backdrop, Docker launched its Docker Hardened Images (DHI) initiative in May 2025: a curated catalog of security-hardened, enterprise-grade container images designed to address software supply chain vulnerabilities head-on. The images are built on the widely adopted open source distributions Debian and Alpine, minimizing the attack surface by stripping out unnecessary components such as package managers and shells, and running as non-root users by default. Each image includes a complete software bill of materials (SBOM), transparent CVE data, cryptographic proof of authenticity, and SLSA Build Level 3 provenance \u2014 the gold standard of supply chain integrity.<\/p>\n\n  <blockquote>\n    <p>&#8220;Security has to start at the earliest point in development, and needs to be universally available to every developer. By making hardened images freely available and providing tooling that works with today&#8217;s AI coding agents, we&#8217;re giving the entire industry and community the best possible baseline to build on.&#8221;<\/p>\n    <cite>\u2014 Mark Cavage, President and COO, Docker, Inc.<\/cite>\n  <\/blockquote>\n\n  <p>Initially a commercial offering, DHI underwent a dramatic shift in December 2025 when Docker made the entire catalog \u2014 more than 1,000 images \u2014 freely available to all developers under the Apache 2.0 open source license, with no restrictions on use, sharing, or building. The move was described internally as &#8220;a fundamental reset of the container security market.&#8221;<\/p>\n\n  <h3>What &#8220;Hardened&#8221; Actually Means<\/h3>\n\n  <p>The engineering behind Docker Hardened Images goes beyond simply patching known CVEs. 
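<\/p>\n\n  <p>What ships in an image is verifiable from the CLI: the SBOM and CVE data described above travel with the image itself. Assuming the Docker Scout CLI is available, a spot check might look like the following sketch (the image reference is a placeholder, not an actual DHI repository path):<\/p>\n\n  <pre><code># Summarize the image: size, base, and known-CVE counts\ndocker scout quickview myorg\/hardened-python:3.12\n\n# Dump the full software bill of materials\ndocker scout sbom myorg\/hardened-python:3.12\n\n# List any known CVEs with severity ratings\ndocker scout cves myorg\/hardened-python:3.12<\/code><\/pre>\n\n  <p>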
The images eliminate the tools attackers most commonly exploit: package managers that could be used to install malicious software at runtime, shells that allow arbitrary command execution, and unnecessary system utilities that expand the attack surface. By starting from this minimal foundation and enforcing non-root execution, DHI makes entire categories of attacks structurally impossible rather than just temporarily patched.<\/p>\n\n  <p>Docker claims the hardened images achieve up to 95% reductions in attack surface compared to traditional base images \u2014 a figure that has driven adoption among enterprises with strict compliance requirements. Adobe and Qualcomm are among the organizations that have staked their container security strategy on DHI for compliance in highly regulated environments, while startups like Attentive and Octopus Deploy have used it to accelerate their own compliance programs when selling to larger enterprise customers.<\/p>\n\n  <div class=\"callout\">\n    <div class=\"callout-title\">Key Features of Docker Hardened Images<\/div>\n    <p><strong>Minimal footprint:<\/strong> Package managers, shells, and unnecessary utilities removed by default \u2014 entire attack classes eliminated rather than patched.<\/p>\n    <p><strong>Full transparency:<\/strong> Every image ships with a complete SBOM, public CVE data, and cryptographic provenance (SLSA Build Level 3).<\/p>\n    <p><strong>Non-root by default:<\/strong> Containers run without elevated privileges, enforcing least-privilege principles from the first pull.<\/p>\n    <p><strong>Enterprise SLAs:<\/strong> DHI Enterprise offers critical CVE remediation within seven days, with a roadmap toward 24-hour turnaround.<\/p>\n    <p><strong>Extended Lifecycle Support:<\/strong> Vulnerability updates and provenance attestations for up to five years after upstream support ends.<\/p>\n  <\/div>\n\n  <p>The decision to open-source DHI wasn&#8217;t purely altruistic. 
Docker controls Docker Hub, which processes more than 20 billion image pulls each month. Raising the security baseline of those pulls raises the security baseline of the entire software industry \u2014 and positions Docker as the steward of that secure foundation. When Jonathan Bryce, Executive Director of the Cloud Native Computing Foundation, praised the move, he highlighted the broader ecosystem impact: giving developers access to secure, well-maintained building blocks &#8220;helps strengthen the software supply chain together.&#8221;<\/p>\n\n  <h2>The AI Pivot: MCP and the Agent Economy<\/h2>\n\n  <p>If Docker&#8217;s security moves addressed an existing crisis, its AI investments are a bet on where development is heading. The company&#8217;s second major initiative of 2025 \u2014 the Docker MCP Catalog and Docker MCP Toolkit \u2014 arrived in May as a response to the rapid emergence of agentic AI systems and the protocol that is becoming their connective tissue.<\/p>\n\n  <p>Model Context Protocol (MCP), originally developed by Anthropic, has emerged as a de facto open standard for enabling AI agents to communicate with external tools, databases, APIs, and services. Think of it as the USB-C port of AI integration: a universal connector that, once standardized, allows anything to plug into anything else. By mid-2025, the protocol had achieved broad industry alignment, and the pace of agentic AI development accelerated accordingly.<\/p>\n\n  <p>The challenge Docker identified was familiar: a powerful new technology had arrived with massive potential but a fragmented, insecure developer experience. MCP servers \u2014 the services that expose tools and data to AI agents \u2014 were scattered across GitHub repositories, required separate installation for every AI client, ran untrusted code directly on developer machines, and had no centralized security or authentication model. 
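<\/p>\n\n  <p>The pre-Catalog workflow made the problem concrete: every client kept its own copy of a configuration like the sketch below, launching the server process directly on the host with a long-lived token sitting in plain text. (The snippet is representative of the common per-client config pattern; exact file locations and field names vary by client.)<\/p>\n\n  <pre><code>{\n  \"mcpServers\": {\n    \"github\": {\n      \"command\": \"npx\",\n      \"args\": [\"-y\", \"@modelcontextprotocol\/server-github\"],\n      \"env\": { \"GITHUB_PERSONAL_ACCESS_TOKEN\": \"ghp_...\" }\n    }\n  }\n}<\/code><\/pre>\n\n  <p>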
Configuring a GitHub MCP server for Claude meant configuring it again from scratch for Cursor, VS Code, and every other AI client.<\/p>\n\n  <blockquote>\n    <p>&#8220;MCP has the potential to do for agentic AI interaction what containers did for app deployment \u2014 standardize and simplify a complex, fragmented landscape.&#8221;<\/p>\n    <cite>\u2014 Docker Engineering Team<\/cite>\n  <\/blockquote>\n\n  <h3>The MCP Catalog: A Hub for the Agent Era<\/h3>\n\n  <p>The Docker MCP Catalog, integrated directly into Docker Hub, addresses the discovery problem by providing a curated, verified collection of over 300 MCP servers packaged as container images \u2014 complete with versioning, provenance, and security updates. Developers can find, evaluate, and launch tools from partners including Stripe, Grafana Labs, GitHub, MongoDB, Neo4j, Elastic, Pulumi, and Heroku from a single location, rather than hunting across the open web.<\/p>\n\n  <p>Crucially, because each MCP server runs as an isolated Docker container, the classic problems of environment conflicts, dependency collisions, and inconsistent behavior across machines are eliminated from the start. The same container-based isolation that made Docker indispensable for traditional application deployment applies with equal force to AI tool infrastructure.<\/p>\n\n  <h3>The MCP Toolkit: Security and Simplicity for Agents<\/h3>\n\n  <p>Discovery alone wasn&#8217;t sufficient. Docker&#8217;s MCP Toolkit \u2014 integrated into Docker Desktop \u2014 solves the operational complexity of managing MCP servers across projects and clients. Rather than configuring each server for each AI application separately, developers set up profiles once and connect all their clients to a centralized MCP Gateway. A &#8220;web-dev&#8221; profile might include GitHub and Playwright servers; a &#8220;backend&#8221; profile, database tools. 
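<\/p>\n\n  <p>The same workflow is scriptable through the <code>docker mcp<\/code> CLI plugin that ships with the Toolkit. The sketch below assumes a current Docker Desktop install; subcommand and server names are illustrative of the plugin&#8217;s surface and may differ across versions:<\/p>\n\n  <pre><code># Enable the servers a profile needs\ndocker mcp server enable github-official playwright\n\n# Point an AI client at the central gateway instead of per-client config\ndocker mcp client connect claude-desktop\n\n# Run the gateway that brokers every enabled server to every connected client\ndocker mcp gateway run<\/code><\/pre>\n\n  <p>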
Configure once, share across the team.<\/p>\n\n  <p>The security model of the MCP Toolkit reflects hard lessons from the traditional container security world. Each MCP server runs with strict resource limits \u2014 capped at 1 CPU and 2 GB of memory \u2014 with no default access to the host filesystem. All MCP server images are digitally signed and include SBOMs for full transparency. Built-in OAuth support and secure credential storage mean that API keys and tokens never get hardcoded into environment variables or configuration files. The Toolkit also introduced Dynamic MCP \u2014 a capability that allows AI agents to discover, add, and compose MCP servers on-demand during a conversation, without any manual reconfiguration.<\/p>\n\n  <p>One-click connections to leading AI clients \u2014 Claude, Cursor, VS Code, Windsurf, continue.dev, and Goose \u2014 mean developers can go from browsing the catalog to having a running, secured MCP server connected to their AI tools in seconds, not hours.<\/p>\n\n  <div class=\"callout\">\n    <div class=\"callout-title\">MCP Toolkit Security Architecture<\/div>\n    <p><strong>Image signing and attestation:<\/strong> All catalog images are built and digitally signed by Docker, with SBOMs for full supply chain transparency.<\/p>\n    <p><strong>Runtime isolation:<\/strong> Each MCP tool runs in its own container with CPU, memory, and filesystem limits enforced by default.<\/p>\n    <p><strong>Credential management:<\/strong> Built-in OAuth support and secure storage prevent secrets from leaking into environment variables.<\/p>\n    <p><strong>Hardened MCP Servers:<\/strong> Docker extended its DHI methodology to MCP server images, launching hardened versions of popular servers including Grafana, MongoDB, GitHub, and Context7.<\/p>\n  <\/div>\n\n  <h2>A Year of Milestones<\/h2>\n\n  <p>The scope of Docker&#8217;s 2025 transformation is best appreciated through its timeline. 
The company moved with unusual speed, shipping a series of interconnected announcements that each built on the last.<\/p>\n\n  <ul class=\"timeline\">\n    <li>\n      <div class=\"tl-dot\">Q1<\/div>\n      <div class=\"tl-body\">\n        <strong>Docker Model Runner<\/strong>\n        <span>Docker enables developers to run large language models locally via a Docker Desktop extension, establishing a local-first AI development workflow without complex infrastructure setup.<\/span>\n      <\/div>\n    <\/li>\n    <li>\n      <div class=\"tl-dot\">May<\/div>\n      <div class=\"tl-body\">\n        <strong>Docker Hardened Images Launch<\/strong>\n        <span>DHI debuts as a commercial product \u2014 over 1,000 security-hardened container images with full SBOM coverage, CVE transparency, and SLSA Level 3 provenance.<\/span>\n      <\/div>\n    <\/li>\n    <li>\n      <div class=\"tl-dot\">May<\/div>\n      <div class=\"tl-body\">\n        <strong>Docker MCP Catalog and Toolkit Beta<\/strong>\n        <span>Docker launches its MCP ecosystem with 100+ verified servers and one-click integration with major AI clients, establishing a secure, centralized hub for AI agent tooling.<\/span>\n      <\/div>\n    <\/li>\n    <li>\n      <div class=\"tl-dot\">Dec<\/div>\n      <div class=\"tl-body\">\n        <strong>DHI Goes Free and Open Source<\/strong>\n        <span>Docker open-sources its full DHI catalog under Apache 2.0, making over 1,000 hardened images available to every developer at no cost \u2014 a move praised by the CNCF as a watershed for supply chain security.<\/span>\n      <\/div>\n    <\/li>\n    <li>\n      <div class=\"tl-dot\">Dec<\/div>\n      <div class=\"tl-body\">\n        <strong>Hardened MCP Servers<\/strong>\n        <span>DHI methodology extended to MCP server images, applying minimal-footprint hardening to AI infrastructure components including Grafana, MongoDB, GitHub, and Context7.<\/span>\n      <\/div>\n    <\/li>\n  <\/ul>\n\n  <h2>The 
Competitive Landscape<\/h2>\n\n  <p>Docker&#8217;s moves are not made in a vacuum. The container security market, valued at roughly $3 billion in 2025, is projected to exceed $20 billion over the next decade, attracting well-funded competitors. Chainguard, a direct rival in hardened images, offers nearly 500 minimal container images with a similar focus on reducing known vulnerabilities, and provides production images with patch SLAs as a commercial offering. Echo Software, another competitor, uses AI agents to autonomously build and maintain vulnerability-free container images and recently secured significant funding.<\/p>\n\n  <p>By making DHI free under Apache 2.0, Docker applied a classic open source playbook: raise the baseline security floor for the entire ecosystem, make it structurally difficult for competitors to charge for a commodity that Docker now provides at no cost, and compete on the value-add commercial tiers \u2014 DHI Enterprise with SLA-backed CVE remediation and Extended Lifecycle Support for regulated industries.<\/p>\n\n  <p>In the AI tooling space, Docker&#8217;s advantage is its scale. No other company has Docker Hub&#8217;s distribution reach \u2014 20 billion monthly pulls, an established trust relationship with tens of millions of developers, and deep integration into the development toolchain. Applying that infrastructure to MCP servers gives Docker a credible claim to become the equivalent of Docker Hub for the agent era: the place where AI tools get discovered, verified, and run safely.<\/p>\n\n  <h2>What This Means for Developers<\/h2>\n\n  <p>For the individual developer, Docker&#8217;s 2025 initiatives translate into concrete, practical changes. The days of pulling a base image from Docker Hub and hoping it&#8217;s reasonably secure are over \u2014 or should be, now that a hardened alternative is free, minimal-friction, and actively maintained. 
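<\/p>\n\n  <p>Adopting that alternative is usually a base-image swap plus a multi-stage build, because hardened runtime images deliberately lack shells and package managers. The Dockerfile below shows the shape of the change; the runtime repository path is a placeholder rather than an actual DHI image name:<\/p>\n\n  <pre><code># Build stage: full-featured image that still has npm and a shell\nFROM node:22 AS build\nWORKDIR \/app\nCOPY package*.json .\/\nRUN npm ci --omit=dev\nCOPY . .\n\n# Runtime stage: minimal hardened image, non-root, no shell\nFROM myorg\/hardened-node:22\nWORKDIR \/app\nCOPY --from=build \/app \/app\n# exec-form CMD is required: there is no \/bin\/sh to parse a shell form\nCMD [\"node\", \"server.js\"]<\/code><\/pre>\n\n  <p>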
Teams that previously layered security patches on top of unknown upstream risk can instead begin from a verified, minimal baseline. Docker&#8217;s own AI assistant can scan existing containers and automatically recommend or apply equivalent hardened images, reducing the migration burden that would otherwise make adoption impractical.<\/p>\n\n  <p>On the AI side, the MCP Toolkit changes the economics of integrating AI agents into development workflows. What previously required hours of per-client configuration \u2014 and left credential management as an afterthought \u2014 can now be accomplished in minutes from Docker Desktop, with security built in by default rather than bolted on later. As agentic AI systems move from clever prototypes to operational tools that update infrastructure, resolve customer issues, and manage SaaS environments, the blast radius of a compromised agent tool grows correspondingly. Docker&#8217;s approach of container-based isolation and strict resource limits is a direct response to that new risk surface.<\/p>\n\n  <blockquote>\n    <p>&#8220;The real problem wasn&#8217;t what models say. It was what they could do. Once agents can act, blast radius matters more than the prompt.&#8221;<\/p>\n    <cite>\u2014 Docker 2025 Year in Review<\/cite>\n  <\/blockquote>\n\n  <div class=\"conclusion-box\">\n    <h2>Looking Ahead<\/h2>\n    <p>Docker&#8217;s trajectory in 2025 suggests a company that has internalized a lesson from its own history: the developers who won the container era weren&#8217;t the ones with the most technically sophisticated solutions, but the ones who made the right thing the easy thing. Containers became universal not because they were theoretically superior, but because Docker made them frictionless.<\/p>\n    <p>The same logic now applies to security and AI. Docker Hardened Images are free because the cost of security should be zero \u2014 or close enough to zero that no team has an economic excuse to skip it. 
The MCP Catalog and Toolkit are integrated into Docker Desktop because the right developer experience for AI agents shouldn&#8217;t require a manual, error-prone setup process that most teams will cut corners on.<\/p>\n    <p>The next generation of container development will be AI-assisted, continuously hardened against supply chain threats, and built on infrastructure that knows its provenance from the first pull to the final deploy. Docker, betting everything on being that infrastructure, has made its strategy unmistakably clear. Whether it succeeds will depend on whether the ecosystem accepts its latest bid to define the standard \u2014 just as it did more than a decade ago with the original Docker container.<\/p>\n  <\/div>\n\n<\/div>\n\n\n\n<\/body>\n<\/html>\n","protected":false},"excerpt":{"rendered":"<p>How Docker is reinventing itself as the secure backbone of agentic AI development, without abandoning the developers who built the container revolution. Docker&#8217;s New Direction: Security, AI, and the Next Generation of Container Development For more than a decade, Docker&#8217;s name has been synonymous with containers \u2014 the lightweight, portable packaging format that quietly transformed how software gets built, shipped, and run. But in 2025, Docker made moves that signal something larger than an incremental product update. The company repositioned itself at the intersection of two of the most pressing concerns in modern software: supply chain security and the explosive rise of agentic AI. The result is a Docker that looks familiar on the surface, yet is fundamentally different underneath. The twin pillars of this transformation \u2014 the free release of Docker Hardened Images and the launch of the Docker MCP Catalog and Toolkit \u2014 represent a deliberate bet that Docker&#8217;s next decade won&#8217;t be defined by containerization alone, but by who controls the secure, trusted infrastructure on which AI-powered development runs. 
20M+ Registered developers on Docker 20B+ Monthly Docker Hub pulls 1,000+ Hardened images now free 95% Reduction in attack surface vs. traditional base images $60B Supply chain attack cost (2025, projected) 300+ Verified MCP servers in catalog The Security Reckoning The context for Docker&#8217;s pivot to security is hard to overstate. Supply chain attacks \u2014 exploiting vulnerabilities baked into the open source libraries and base images that power virtually every modern application \u2014 have surged in frequency and cost. By 2025, the damage had tripled from 2021 levels, projected to exceed $60 billion globally. Every language, every ecosystem, every build and distribution step became a potential target. Against this backdrop, Docker launched its Docker Hardened Images (DHI) initiative in May 2025: a curated catalog of security-hardened, enterprise-grade container images designed to address software supply chain vulnerabilities head-on. The images are built on the widely adopted open source distributions Debian and Alpine, minimizing the attack surface by stripping out unnecessary components such as package managers and shells, and running as non-root users by default. Each image includes a complete software bill of materials (SBOM), transparent CVE data, cryptographic proof of authenticity, and SLSA Build Level 3 provenance \u2014 the gold standard of supply chain integrity. &#8220;Security has to start at the earliest point in development, and needs to be universally available to every developer. By making hardened images freely available and providing tooling that works with today&#8217;s AI coding agents, we&#8217;re giving the entire industry and community the best possible baseline to build on.&#8221; \u2014 Mark Cavage, President and COO, Docker, Inc. 
Initially a commercial offering, DHI underwent a dramatic shift in December 2025 when Docker made the entire catalog \u2014 more than 1,000 images \u2014 freely available to all developers under the Apache 2.0 open source license, with no restrictions on use, sharing, or building. The move was described internally as &#8220;a fundamental reset of the container security market.&#8221; What &#8220;Hardened&#8221; Actually Means The engineering behind Docker Hardened Images goes beyond simply patching known CVEs. The images eliminate the tools attackers most commonly exploit: package managers that could be used to install malicious software at runtime, shells that allow arbitrary command execution, and unnecessary system utilities that expand the attack surface. By starting from this minimal foundation and enforcing non-root execution, DHI makes entire categories of attacks structurally impossible rather than just temporarily patched. Docker claims the hardened images achieve up to 95% reductions in attack surface compared to traditional base images \u2014 a figure that has driven adoption among enterprises with strict compliance requirements. Adobe and Qualcomm are among the organizations that have staked their container security strategy on DHI for compliance in highly regulated environments, while startups like Attentive and Octopus Deploy have used it to accelerate their own compliance programs when selling to larger enterprise customers. Key Features of Docker Hardened Images Minimal footprint: Package managers, shells, and unnecessary utilities removed by default \u2014 entire attack classes eliminated rather than patched. Full transparency: Every image ships with a complete SBOM, public CVE data, and cryptographic provenance (SLSA Build Level 3). Non-root by default: Containers run without elevated privileges, enforcing least-privilege principles from the first pull. 
Enterprise SLAs: DHI Enterprise offers critical CVE remediation within seven days, with a roadmap toward 24-hour turnaround. Extended Lifecycle Support: Vulnerability updates and provenance attestations for up to five years after upstream support ends. The decision to open-source DHI wasn&#8217;t purely altruistic. Docker controls Docker Hub, which processes more than 20 billion image pulls each month. Raising the security baseline of those pulls raises the security baseline of the entire software industry \u2014 and positions Docker as the steward of that secure foundation. When Jonathan Bryce, Executive Director of the Cloud Native Computing Foundation, praised the move, he highlighted the broader ecosystem impact: giving developers access to secure, well-maintained building blocks &#8220;helps strengthen the software supply chain together.&#8221; The AI Pivot: MCP and the Agent Economy If Docker&#8217;s security moves addressed an existing crisis, its AI investments are a bet on where development is heading. The company&#8217;s second major initiative of 2025 \u2014 the Docker MCP Catalog and Docker MCP Toolkit \u2014 arrived in May as a response to the rapid emergence of agentic AI systems and the protocol that is becoming their connective tissue. Model Context Protocol (MCP), originally developed by Anthropic, has emerged as a de facto open standard for enabling AI agents to communicate with external tools, databases, APIs, and services. Think of it as the USB-C port of AI integration: a universal connector that, once standardized, allows anything to plug into anything else. By mid-2025, the protocol had achieved broad industry alignment, and the pace of agentic AI development accelerated accordingly. The challenge Docker identified was familiar: a powerful new technology had arrived with massive potential but a fragmented, insecure developer experience. 
MCP servers — the services that expose tools and data to AI agents — were scattered across GitHub repositories, required separate installation for every AI client, ran untrusted code directly on developer machines, and had no centralized security or authentication model. Configuring a GitHub MCP server for Claude meant configuring it again from scratch for Cursor, VS Code, and every other AI client.

"MCP has the potential to do for agentic AI interaction what containers did for app deployment — standardize and simplify a complex, fragmented landscape." — Docker Engineering Team

The MCP Catalog: A Hub for the Agent Era

The Docker MCP Catalog, integrated directly into Docker Hub, addresses the discovery problem by providing a curated, verified collection of over 300 MCP servers packaged as container images — complete with versioning, provenance, and security updates. Developers can find, evaluate, and launch tools from partners including Stripe, Grafana Labs, GitHub, MongoDB, Neo4j, Elastic, Pulumi, and Heroku from a single location, rather than hunting across the open web.

Crucially, because each MCP server runs as an isolated Docker container, the classic problems of environment conflicts, dependency collisions, and inconsistent behavior across machines are eliminated from the start. The same container-based isolation that made Docker indispensable for traditional application deployment applies with equal force to AI tool infrastructure.

The MCP Toolkit: Security and Simplicity for Agents

Discovery alone wasn't sufficient. Docker's MCP Toolkit — integrated into Docker Desktop — addresses the operational complexity of managing MCP servers across projects and clients. Rather than configuring each server for each AI application separately, developers set up profiles once and connect all their clients to a centralized MCP Gateway. A "web-dev" profile might include GitHub and Playwright servers; a "backend" profile, database tools. Configure once, share across the team.

The security model of the MCP Toolkit reflects hard lessons from the traditional container security world. Each MCP server runs with strict resource limits — capped at 1 CPU and 2 GB of memory — and has no default access to the host filesystem. All MCP server images are digitally signed and include SBOMs for full transparency. Built-in OAuth support and secure credential storage mean that API keys and tokens never get hardcoded into environment variables or configuration files.

The Toolkit also introduced Dynamic MCP — a capability that allows AI agents to discover, add, and compose MCP servers on demand during a conversation, without manual reconfiguration. One-click connections to leading AI clients — Claude, Cursor, VS Code, Windsurf, continue.dev, and Goose — mean developers can go from browsing the catalog to having a running, secured MCP server connected to their AI tools in seconds, not hours.

MCP Toolkit Security Architecture

- Image signing and attestation: All catalog images are built and digitally signed by Docker, with SBOMs for full supply chain transparency.
- Runtime isolation: Each MCP tool runs in its own container with CPU, memory, and filesystem limits enforced by default.
- Credential management: Built-in OAuth support and secure storage prevent secrets from leaking into environment variables.
- Hardened MCP Servers: Docker extended its DHI methodology to MCP server images, launching hardened versions of popular servers including Grafana, MongoDB, GitHub, and Context7.

A Year of Milestones

The scope of Docker's 2025 transformation is best appreciated through its timeline. The company moved with unusual speed, shipping a series of interconnected announcements that each built on the last.
- Q1: Docker Model Runner. Docker enables developers to run large language models locally via a Docker Desktop extension, establishing a local-first AI development workflow without complex infrastructure setup.
- May: Docker Hardened Images launch. DHI debuts as a commercial product: over 1,000 security-hardened container images with full SBOM coverage, CVE transparency, and SLSA Build Level 3 provenance.
- May: Docker MCP Catalog and Toolkit beta. Docker launches its MCP ecosystem with 100+ verified servers and one-click integration with major AI clients, establishing a secure, centralized hub for AI agent tooling.
- Dec: DHI goes free and open source. Docker open-sources its full DHI catalog under Apache 2.0, making over 1,000 hardened images available to every developer at no cost — a move praised by the CNCF as a watershed for supply chain security.
- Dec: Hardened MCP servers. The DHI methodology is extended to MCP server images, applying minimal-footprint hardening to AI infrastructure components including Grafana, MongoDB, GitHub, and Context7.

The Competitive Landscape

Docker's moves are not made in a vacuum. The container security market, valued at roughly $3 billion in 2025, is projected to exceed $20 billion over the next decade, attracting well-funded competitors. Chainguard, a direct rival in hardened images, offers nearly 500 minimal container images with a similar focus on reducing known vulnerabilities, and provides production images with patch SLAs as a commercial offering. Echo Software, another competitor, uses AI agents to autonomously build and maintain vulnerability-free container images and recently secured significant funding.
By making DHI free under Apache 2.0, Docker applied a classic open source playbook: raise the baseline security floor for the entire ecosystem, make it structurally difficult for competitors to charge for a commodity that Docker now provides at no cost, and compete on value-added commercial tiers — DHI Enterprise with SLA-backed CVE remediation and Extended Lifecycle Support for regulated industries.

In the AI tooling space, Docker's advantage is its scale. No other company has Docker Hub's distribution reach — 20 billion monthly pulls, an established trust relationship with tens of millions of developers, and deep integration into the development toolchain. Applying that infrastructure to MCP servers gives Docker a credible claim to becoming the Docker Hub of the agent era: the place where AI tools get discovered, verified, and run safely.

What This Means for Developers

For the individual developer, Docker's 2025 initiatives translate into concrete, practical changes. The days of pulling a base image from Docker Hub and hoping it's reasonably secure are over — or should be, now that a hardened alternative is free, low-friction, and actively maintained. Teams that previously layered security patches on top of unknown upstream risk can instead begin from a verified, minimal baseline. Docker's own AI assistant can scan existing containers and automatically recommend or apply equivalent hardened images, reducing the migration burden that would otherwise make adoption impractical.

On the AI side, the MCP Toolkit changes the economics of integrating AI agents into development workflows. What previously required hours of per-client configuration — and left credential management as an afterthought — can now be accomplished in minutes from Docker Desktop, with security built in by default rather than bolted on later.
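The per-client cost collapses because every client points at the same gateway rather than at each server individually. A hedged sketch of what that single client-side entry can look like — most MCP clients accept a JSON block naming a command to launch; the entry name and the exact `docker mcp gateway run` invocation are drawn from the Toolkit's documented pattern but may differ by version:

```json
{
  "mcpServers": {
    "MCP_DOCKER": {
      "command": "docker",
      "args": ["mcp", "gateway", "run"]
    }
  }
}
```

With this one stanza in place, any server enabled in Docker Desktop becomes visible to every connected client through the gateway — no per-server, per-client configuration, and credentials stay in the Toolkit's secure storage instead of this file.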
As agentic AI systems move from clever prototypes to operational tools that update infrastructure, resolve customer issues, and manage SaaS environments, the blast radius of a compromised agent tool grows correspondingly. Docker's approach of container-based isolation and strict resource limits is a direct response to that new risk surface.

"The real problem wasn't what models say. It was what they could do. Once agents can act, blast radius matters more than the prompt." — Docker 2025 Year in Review

Looking Ahead

Docker's trajectory in 2025 suggests a company that has internalized a lesson from its own history: the winners of the container era weren't the ones with the most technically sophisticated solutions, but the ones who made the right thing the easy thing. Containers became universal not because they were theoretically superior, but because Docker made them frictionless.

The same logic now applies to security and AI. Docker Hardened Images are free because the cost of security should be zero — or close enough to zero that no team has an economic excuse to skip it. The MCP Catalog and Toolkit are integrated into Docker Desktop because the right developer experience for AI agents shouldn't require a manual, error-prone setup process that most teams will cut corners on.

The next generation of container development will be AI-assisted, continuously hardened against supply chain threats, and built on infrastructure that knows its provenance from the first pull to the final deploy. Docker, betting everything on being that infrastructure, has made its strategy unmistakably clear.
Whether it succeeds will depend on whether the ecosystem accepts its latest bid to define the standard — just as it did more than a decade ago with the original Docker container.<\/p>
Whether it succeeds will depend on whether the ecosystem accepts its latest bid to define the standard \u2014 just as it did more than a decade ago with the original Docker container.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/rebaihamida.com\/?page_id=206\" \/>\n<meta property=\"og:site_name\" content=\"Next-Generation Tech Blogs\" \/>\n<meta property=\"article:modified_time\" content=\"2026-03-11T18:18:33+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/rebaihamida.com\/wp-content\/uploads\/2026\/03\/ChatGPT-Image-11-mars-2026-12-h-04-min-28-s-1024x683.png\" \/>\n\t<meta property=\"og:image:width\" content=\"1024\" \/>\n\t<meta property=\"og:image:height\" content=\"683\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data1\" content=\"10 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/rebaihamida.com\\\/?page_id=206\",\"url\":\"https:\\\/\\\/rebaihamida.com\\\/?page_id=206\",\"name\":\"Docker's New Direction - Next-Generation Tech 
Blogs\",\"isPartOf\":{\"@id\":\"http:\\\/\\\/rebaihamida.com\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/rebaihamida.com\\\/?page_id=206#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/rebaihamida.com\\\/?page_id=206#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/rebaihamida.com\\\/wp-content\\\/uploads\\\/2026\\\/03\\\/ChatGPT-Image-11-mars-2026-12-h-04-min-28-s.png\",\"datePublished\":\"2026-03-11T16:06:01+00:00\",\"dateModified\":\"2026-03-11T18:18:33+00:00\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/rebaihamida.com\\\/?page_id=206#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/rebaihamida.com\\\/?page_id=206\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/rebaihamida.com\\\/?page_id=206#primaryimage\",\"url\":\"https:\\\/\\\/rebaihamida.com\\\/wp-content\\\/uploads\\\/2026\\\/03\\\/ChatGPT-Image-11-mars-2026-12-h-04-min-28-s.png\",\"contentUrl\":\"https:\\\/\\\/rebaihamida.com\\\/wp-content\\\/uploads\\\/2026\\\/03\\\/ChatGPT-Image-11-mars-2026-12-h-04-min-28-s.png\",\"width\":1536,\"height\":1024},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/rebaihamida.com\\\/?page_id=206#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"http:\\\/\\\/rebaihamida.com\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Docker&#8217;s New Direction\"}]},{\"@type\":\"WebSite\",\"@id\":\"http:\\\/\\\/rebaihamida.com\\\/#website\",\"url\":\"http:\\\/\\\/rebaihamida.com\\\/\",\"name\":\"Next-Generation Tech Blogs\",\"description\":\"Next-Generation Tech Blogs for Modern 
Thinkers\",\"publisher\":{\"@id\":\"http:\\\/\\\/rebaihamida.com\\\/#\\\/schema\\\/person\\\/f6dffae6f5fa8098da26264a0b318771\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"http:\\\/\\\/rebaihamida.com\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":[\"Person\",\"Organization\"],\"@id\":\"http:\\\/\\\/rebaihamida.com\\\/#\\\/schema\\\/person\\\/f6dffae6f5fa8098da26264a0b318771\",\"name\":\"Hamida Rebai\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/rebaihamida.com\\\/wp-content\\\/uploads\\\/2025\\\/12\\\/cropped-site-icon.png\",\"url\":\"https:\\\/\\\/rebaihamida.com\\\/wp-content\\\/uploads\\\/2025\\\/12\\\/cropped-site-icon.png\",\"contentUrl\":\"https:\\\/\\\/rebaihamida.com\\\/wp-content\\\/uploads\\\/2025\\\/12\\\/cropped-site-icon.png\",\"width\":512,\"height\":512,\"caption\":\"Hamida Rebai\"},\"logo\":{\"@id\":\"https:\\\/\\\/rebaihamida.com\\\/wp-content\\\/uploads\\\/2025\\\/12\\\/cropped-site-icon.png\"},\"sameAs\":[\"http:\\\/\\\/rebaihamida.com\",\"https:\\\/\\\/www.linkedin.com\\\/in\\\/hamida-rebai-trabelsi\\\/\",\"https:\\\/\\\/www.youtube.com\\\/@RebaHamidaMVP\"]}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Docker's New Direction - Next-Generation Tech Blogs","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/rebaihamida.com\/?page_id=206","og_locale":"en_US","og_type":"article","og_title":"Docker's New Direction - Next-Generation Tech Blogs","og_description":"How Docker is reinventing itself as the secure backbone of agentic AI development, without abandoning the developers who built the container revolution. 
Docker&#8217;s New Direction: Security, AI, and the Next Generation of Container Development For more than a decade, Docker&#8217;s name has been synonymous with containers \u2014 the lightweight, portable packaging format that quietly transformed how software gets built, shipped, and run. But in 2025, Docker made moves that signal something larger than an incremental product update. The company repositioned itself at the intersection of two of the most pressing concerns in modern software: supply chain security and the explosive rise of agentic AI. The result is a Docker that looks familiar on the surface, yet is fundamentally different underneath. The twin pillars of this transformation \u2014 the free release of Docker Hardened Images and the launch of the Docker MCP Catalog and Toolkit \u2014 represent a deliberate bet that Docker&#8217;s next decade won&#8217;t be defined by containerization alone, but by who controls the secure, trusted infrastructure on which AI-powered development runs. 20M+ Registered developers on Docker 20B+ Monthly Docker Hub pulls 1,000+ Hardened images now free 95% Reduction in attack surface vs. traditional base images $60B Supply chain attack cost (2025, projected) 300+ Verified MCP servers in catalog The Security Reckoning The context for Docker&#8217;s pivot to security is hard to overstate. Supply chain attacks \u2014 exploiting vulnerabilities baked into the open source libraries and base images that power virtually every modern application \u2014 have surged in frequency and cost. By 2025, the damage had tripled from 2021 levels, projected to exceed $60 billion globally. Every language, every ecosystem, every build and distribution step became a potential target. Against this backdrop, Docker launched its Docker Hardened Images (DHI) initiative in May 2025: a curated catalog of security-hardened, enterprise-grade container images designed to address software supply chain vulnerabilities head-on. 
The images are built on the widely adopted open source distributions Debian and Alpine, minimizing the attack surface by stripping out unnecessary components such as package managers and shells, and running as non-root users by default. Each image includes a complete software bill of materials (SBOM), transparent CVE data, cryptographic proof of authenticity, and SLSA Build Level 3 provenance — the gold standard of supply chain integrity.

“Security has to start at the earliest point in development, and needs to be universally available to every developer. By making hardened images freely available and providing tooling that works with today’s AI coding agents, we’re giving the entire industry and community the best possible baseline to build on.”
— Mark Cavage, President and COO, Docker, Inc.

Initially a commercial offering, DHI underwent a dramatic shift in December 2025, when Docker made the entire catalog — more than 1,000 images — freely available to all developers under the Apache 2.0 open source license, with no restrictions on use, sharing, or building. The move was described internally as “a fundamental reset of the container security market.”

What “Hardened” Actually Means

The engineering behind Docker Hardened Images goes beyond simply patching known CVEs. The images eliminate the tools attackers most commonly exploit: package managers that could be used to install malicious software at runtime, shells that allow arbitrary command execution, and unnecessary system utilities that expand the attack surface. By starting from this minimal foundation and enforcing non-root execution, DHI makes entire categories of attacks structurally impossible rather than just temporarily patched. Docker claims the hardened images achieve up to 95% reductions in attack surface compared to traditional base images — a figure that has driven adoption among enterprises with strict compliance requirements.
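In practice, adopting a hardened base usually means a multi-stage build: compile and install dependencies in a full-featured image, then copy only the finished artifacts onto the minimal runtime. The Dockerfile below is an illustrative sketch of that pattern; the `myorg/dhi-node` repository name is a placeholder (real DHI images are mirrored from the catalog into your own namespace), and the runtime stage assumes a base with no shell or package manager that runs as a non-root user.

```dockerfile
# Build stage: full-featured image with npm, a shell, and build tools.
FROM node:22 AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY . .

# Runtime stage: minimal hardened base (no shell, no package manager,
# non-root by default). "myorg/dhi-node" is a placeholder name.
FROM myorg/dhi-node:22
COPY --from=build /app /app
WORKDIR /app
# Exec-form CMD is required: there is no shell to interpret a string form.
CMD ["node", "server.js"]
```

Because the runtime image ships without a shell, `docker exec ... sh` no longer works for debugging; tooling such as Docker Desktop's `docker debug`, which attaches an ephemeral toolbox to a running container, fills that gap. That inconvenience is the point: an attacker who lands in the container has nothing to execute either.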
Adobe and Qualcomm are among the organizations that have staked their container security strategy on DHI for compliance in highly regulated environments, while startups like Attentive and Octopus Deploy have used it to accelerate their own compliance programs when selling to larger enterprise customers.

Key Features of Docker Hardened Images

- Minimal footprint: Package managers, shells, and unnecessary utilities removed by default — entire attack classes eliminated rather than patched.
- Full transparency: Every image ships with a complete SBOM, public CVE data, and cryptographic provenance (SLSA Build Level 3).
- Non-root by default: Containers run without elevated privileges, enforcing least-privilege principles from the first pull.
- Enterprise SLAs: DHI Enterprise offers critical CVE remediation within seven days, with a roadmap toward 24-hour turnaround.
- Extended Lifecycle Support: Vulnerability updates and provenance attestations for up to five years after upstream support ends.

The decision to open-source DHI wasn’t purely altruistic. Docker controls Docker Hub, which processes more than 20 billion image pulls each month. Raising the security baseline of those pulls raises the security baseline of the entire software industry — and positions Docker as the steward of that secure foundation. When Jonathan Bryce, Executive Director of the Cloud Native Computing Foundation, praised the move, he highlighted the broader ecosystem impact: giving developers access to secure, well-maintained building blocks “helps strengthen the software supply chain together.”

The AI Pivot: MCP and the Agent Economy

If Docker’s security moves addressed an existing crisis, its AI investments are a bet on where development is heading.
The company’s second major initiative of 2025 — the Docker MCP Catalog and Docker MCP Toolkit — arrived in May as a response to the rapid emergence of agentic AI systems and the protocol that is becoming their connective tissue. Model Context Protocol (MCP), originally developed by Anthropic, has emerged as a de facto open standard for enabling AI agents to communicate with external tools, databases, APIs, and services. Think of it as the USB-C port of AI integration: a universal connector that, once standardized, allows anything to plug into anything else. By mid-2025, the protocol had achieved broad industry alignment, and the pace of agentic AI development accelerated accordingly.

The challenge Docker identified was familiar: a powerful new technology had arrived with massive potential but a fragmented, insecure developer experience. MCP servers — the services that expose tools and data to AI agents — were scattered across GitHub repositories, required separate installation for every AI client, ran untrusted code directly on developer machines, and had no centralized security or authentication model. Configuring a GitHub MCP server for Claude meant configuring it again from scratch for Cursor, VS Code, and every other AI client.

“MCP has the potential to do for agentic AI interaction what containers did for app deployment — standardize and simplify a complex, fragmented landscape.”
— Docker Engineering Team

The MCP Catalog: A Hub for the Agent Era

The Docker MCP Catalog, integrated directly into Docker Hub, addresses the discovery problem by providing a curated, verified collection of over 300 MCP servers packaged as container images — complete with versioning, provenance, and security updates. Developers can find, evaluate, and launch tools from partners including Stripe, Grafana Labs, GitHub, MongoDB, Neo4j, Elastic, Pulumi, and Heroku from a single location, rather than hunting across the open web.
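The per-client duplication problem is concrete. Before a centralized gateway, each AI client carried its own copy of the server wiring, typically a JSON entry like the sketch below. The `mcpServers` shape follows the convention used by clients such as Claude Desktop, and `mcp/github` is an assumed catalog image name; exact details vary by client.

```json
{
  "mcpServers": {
    "github": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-e", "GITHUB_TOKEN", "mcp/github"]
    }
  }
}
```

Multiply an entry like this by every server and every client on a machine, each with its own credential handling, and the appeal of registering each client once against a single gateway becomes obvious.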
Crucially, because each MCP server runs as an isolated Docker container, the classic problems of environment conflicts, dependency collisions, and inconsistent behavior across machines are eliminated from the start. The same container-based isolation that made Docker indispensable for traditional application deployment applies with equal force to AI tool infrastructure.

The MCP Toolkit: Security and Simplicity for Agents

Discovery alone wasn’t sufficient. Docker’s MCP Toolkit — integrated into Docker Desktop — solves the operational complexity of managing MCP servers across projects and clients. Rather than configuring each server for each AI application separately, developers set up profiles once and connect all their clients to a centralized MCP Gateway. A “web-dev” profile might include GitHub and Playwright servers; a “backend” profile, database tools. Configure once, share across the team.

The security model of the MCP Toolkit reflects hard lessons from the traditional container security world. Each MCP server runs with strict resource limits — capped at 1 CPU and 2 GB of memory — with no default access to the host filesystem. All MCP server images are digitally signed and include SBOMs for full transparency. Built-in OAuth support and secure credential storage mean that API keys and tokens never get hardcoded into environment variables or configuration files. The Toolkit also introduced Dynamic MCP — a capability that allows AI agents to discover, add, and compose MCP servers on demand during a conversation, without any manual reconfiguration.

One-click connections to leading AI clients — Claude, Cursor, VS Code, Windsurf, continue.dev, and Goose — mean developers can go from browsing the catalog to having a running, secured MCP server connected to their AI tools in seconds, not hours.
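For teams running a catalog server outside Docker Desktop, the isolation posture the Toolkit applies automatically can be approximated by hand. The Compose fragment below is an illustrative sketch, not the Toolkit's actual mechanism: the `mcp/github` image name and the `GITHUB_TOKEN` variable are assumptions, and it mirrors only the resource and filesystem constraints described above.

```yaml
# Illustrative approximation of the MCP Toolkit's default isolation.
# Image name (mcp/github) and GITHUB_TOKEN are assumptions for this sketch.
services:
  github-mcp:
    image: mcp/github
    stdin_open: true             # many MCP servers speak JSON-RPC over stdio
    read_only: true              # no writable filesystem, no host mounts
    security_opt:
      - no-new-privileges:true
    deploy:
      resources:
        limits:
          cpus: "1.0"            # mirror the Toolkit's 1-CPU cap
          memory: 2g             # mirror the Toolkit's 2 GB cap
    environment:
      GITHUB_TOKEN: ${GITHUB_TOKEN}
```

Note that the Toolkit's secure credential storage replaces the plain environment-variable passing shown here; in the managed setup, tokens never appear in a config file at all.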
MCP Toolkit Security Architecture

- Image signing and attestation: All catalog images are built and digitally signed by Docker, with SBOMs for full supply chain transparency.
- Runtime isolation: Each MCP tool runs in its own container with CPU, memory, and filesystem limits enforced by default.
- Credential management: Built-in OAuth support and secure storage prevent secrets from leaking into environment variables.
- Hardened MCP Servers: Docker extended its DHI methodology to MCP server images, launching hardened versions of popular servers including Grafana, MongoDB, GitHub, and Context7.

A Year of Milestones

The scope of Docker’s 2025 transformation is best appreciated through its timeline. The company moved with unusual speed, shipping a series of interconnected announcements that each built on the last.

- Q1: Docker Model Runner. Docker enables developers to run large language models locally via a Docker Desktop extension, establishing a local-first AI development workflow without complex infrastructure setup.
- May: Docker Hardened Images launch. DHI debuts as a commercial product — over 1,000 security-hardened container images with full SBOM coverage, CVE transparency, and SLSA Level 3 provenance.
- May: Docker MCP Catalog and Toolkit beta. Docker launches its MCP ecosystem with 100+ verified servers and one-click integration with major AI clients, establishing a secure, centralized hub for AI agent tooling.
- Dec: DHI goes free and open source. Docker open-sources its full DHI catalog under Apache 2.0, making over 1,000 hardened images available to every developer at no cost — a move praised by the CNCF as a watershed for supply chain security.
- Dec: Hardened MCP servers. DHI methodology extended to MCP server images, applying minimal-footprint hardening to AI infrastructure components including Grafana, MongoDB, GitHub, and Context7.

The Competitive Landscape

Docker’s moves are not made in a vacuum.
The container security market, valued at roughly $3 billion in 2025, is projected to exceed $20 billion over the next decade, attracting well-funded competitors. Chainguard, a direct rival in hardened images, offers nearly 500 minimal container images with a similar focus on reducing known vulnerabilities, and provides production images with patch SLAs as a commercial offering. Echo Software, another competitor, uses AI agents to autonomously build and maintain vulnerability-free container images and recently secured significant funding.

By making DHI free under Apache 2.0, Docker applied a classic open source playbook: raise the baseline security floor for the entire ecosystem, make it structurally difficult for competitors to charge for a commodity that Docker now provides at no cost, and compete on the value-add commercial tiers — DHI Enterprise with SLA-backed CVE remediation and Extended Lifecycle Support for regulated industries.

In the AI tooling space, Docker’s advantage is its scale. No other company has Docker Hub’s distribution reach — 20 billion monthly pulls, an established trust relationship with tens of millions of developers, and deep integration into the development toolchain. Applying that infrastructure to MCP servers gives Docker a credible claim to become the equivalent of Docker Hub for the agent era: the place where AI tools get discovered, verified, and run safely.

What This Means for Developers

For the individual developer, Docker’s 2025 initiatives translate into concrete, practical changes. The days of pulling a base image from Docker Hub and hoping it’s reasonably secure are over — or should be, now that a hardened alternative is free, minimal-friction, and actively maintained. Teams that previously layered security patches on top of unknown upstream risk can instead begin from a verified, minimal baseline.
Docker’s own AI assistant can scan existing containers and automatically recommend or apply equivalent hardened images, reducing the migration burden that would otherwise make adoption impractical.

On the AI side, the MCP Toolkit changes the economics of integrating AI agents into development workflows. What previously required hours of per-client configuration — and left credential management as an afterthought — can now be accomplished in minutes from Docker Desktop, with security built in by default rather than bolted on later. As agentic AI systems move from clever prototypes to operational tools that update infrastructure, resolve customer issues, and manage SaaS environments, the blast radius of a compromised agent tool grows correspondingly. Docker’s approach of container-based isolation and strict resource limits is a direct response to that new risk surface.

“The real problem wasn’t what models say. It was what they could do. Once agents can act, blast radius matters more than the prompt.”
— Docker 2025 Year in Review

Looking Ahead

Docker’s trajectory in 2025 suggests a company that has internalized a lesson from its own history: the developers who won the container era weren’t the ones with the most technically sophisticated solutions, but the ones who made the right thing the easy thing. Containers became universal not because they were theoretically superior, but because Docker made them frictionless.

The same logic now applies to security and AI. Docker Hardened Images are free because the cost of security should be zero — or close enough to zero that no team has an economic excuse to skip it. The MCP Catalog and Toolkit are integrated into Docker Desktop because the right developer experience for AI agents shouldn’t require a manual, error-prone setup process that most teams will cut corners on.
The next generation of container development will be AI-assisted, continuously hardened against supply chain threats, and built on infrastructure that knows its provenance from the first pull to the final deploy. Docker, betting everything on being that infrastructure, has made its strategy unmistakably clear. Whether it succeeds will depend on whether the ecosystem accepts its latest bid to define the standard — just as it did more than a decade ago with the original Docker container.