{"id":3137,"date":"2026-01-13T06:18:02","date_gmt":"2026-01-13T06:18:02","guid":{"rendered":"https:\/\/yodaplus.com\/blog\/?p=3137"},"modified":"2026-01-13T06:18:02","modified_gmt":"2026-01-13T06:18:02","slug":"open-llms-as-infrastructure-not-products","status":"publish","type":"post","link":"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/","title":{"rendered":"Open LLMs as Infrastructure, Not Products"},"content":{"rendered":"<p data-start=\"347\" data-end=\"403\">Why do so many AI initiatives stall after early success?<\/p>\n<p data-start=\"405\" data-end=\"634\">Often, the problem is not the AI model. It is how the model is treated. Many teams approach large language models as finished products. They expect one model to solve every problem on its own. This mindset limits long-term value.<\/p>\n<p data-start=\"636\" data-end=\"722\">Open LLMs deliver their strongest impact when treated as infrastructure, not products.<\/p>\n<h3 data-start=\"724\" data-end=\"766\">Why product thinking breaks AI systems<\/h3>\n<p data-start=\"768\" data-end=\"989\">When organizations treat AI models as products, they design workflows around the model instead of the problem. Everything revolves around prompts, responses, and model behavior. Over time, this creates fragile AI systems.<\/p>\n<p data-start=\"991\" data-end=\"1127\">Small changes in prompts cause large output differences. Context becomes difficult to manage. Explainability suffers. AI risk increases.<\/p>\n<p data-start=\"1129\" data-end=\"1183\">This approach works for demos. It fails in production.<\/p>\n<p data-start=\"1185\" data-end=\"1325\">Infrastructure thinking changes the focus. The AI model becomes one component inside a larger AI system rather than the center of attention.<\/p>\n<h3 data-start=\"1327\" data-end=\"1362\">What infrastructure means in AI<\/h3>\n<p data-start=\"1364\" data-end=\"1519\">Infrastructure supports many workflows without dictating behavior. 
Databases, networks, and operating systems do not decide business logic. They enable it.<\/p>\n<p>Open LLMs should play the same role in AI systems. They provide reasoning, language understanding, and generative capabilities. They do not control workflow logic, validation rules, or decision boundaries.<\/p>\n<p>Agentic frameworks place open LLMs inside structured AI workflows where intelligent agents coordinate tasks using defined roles.<\/p>\n<h3>Why open LLMs fit infrastructure roles better<\/h3>\n<p>Open LLMs offer flexibility that proprietary models often lack. Teams can inspect behavior, adjust deployment, and manage updates on their own terms.<\/p>\n<p>This flexibility supports reliable AI. When open LLMs run inside controlled workflows, organizations can enforce responsible AI practices and improve explainable AI.<\/p>\n<p>Open models also reduce AI risk by avoiding dependency on opaque systems. AI risk management becomes easier when teams understand how models operate and where they sit in the workflow.<\/p>\n<h3>How agentic AI uses open LLMs<\/h3>\n<p>In agentic AI systems, open LLMs serve specific purposes. One agent may use an LLM for reasoning. Another may use it for summarization. Others may rely on semantic search or knowledge-based systems instead.<\/p>\n<p>Workflow agents decide when and how models are invoked. Vector embeddings support retrieval. AI-driven analytics validate outputs.<\/p>\n<p>This separation of concerns keeps autonomous systems stable. 
Autonomous agents act within clear boundaries rather than improvising endlessly.<\/p>\n<h3>Infrastructure enables multi-agent systems<\/h3>\n<p>Multi-agent systems rely on coordination, not intelligence alone. Each AI agent must understand its role and limitations.<\/p>\n<p>Open LLMs provide cognitive capability, but agentic frameworks provide structure. Together, they create dependable AI workflows.<\/p>\n<p>This design also improves conversational AI. Responses remain consistent because workflows control how context flows between agents. Prompt engineering becomes simpler because prompts align with narrow responsibilities.<\/p>\n<h3>Why infrastructure thinking supports scalability<\/h3>\n<p>AI systems built around products struggle to scale. Each new use case requires new prompts, new tuning, and new fixes.<\/p>\n<p>Infrastructure-based AI systems scale naturally. New workflow agents reuse existing components. New AI models can replace old ones without redesigning logic.<\/p>\n<p>This flexibility accelerates AI innovation while preserving reliability.<\/p>\n<h3>The impact on long-term AI strategy<\/h3>\n<p>Treating open LLMs as infrastructure insulates organizations from rapid model release cycles. As AI models evolve, systems remain stable.<\/p>\n<p>This approach also supports compliance and governance. Clear workflows enable auditing. Explainable AI becomes practical. 
Responsible AI practices become enforceable.<\/p>\n<p>The future of AI systems depends on design discipline, not model novelty.<\/p>\n<h3>FAQs<\/h3>\n<p><strong>Are open LLMs less capable than proprietary models?<\/strong><br \/>Not necessarily. When used inside well-designed workflows, open LLMs deliver strong and predictable results.<\/p>\n<p><strong>Does infrastructure thinking reduce creativity?<\/strong><br \/>No. It channels creativity into repeatable and trustworthy outcomes.<\/p>\n<p><strong>Can open LLMs support autonomous AI systems?<\/strong><br \/>Yes. They work best when autonomous agents operate within structured workflows.<\/p>\n<h3>Conclusion<\/h3>\n<p>Open LLMs unlock their full value when treated as infrastructure rather than standalone products. Inside agentic AI systems, they support reliable reasoning, controlled automation, and long-term scalability. Organizations that adopt this mindset build AI systems that adapt without breaking. <a href=\"https:\/\/bit.ly\/4eHaCP9\"><strong>Yodaplus Automation Services<\/strong><\/a> helps teams design AI architectures where open models power workflows without becoming single points of failure.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Why do so many AI initiatives stall after early success? Often, the problem is not the AI model. It is how the model is treated. Many teams approach large language models as finished products. 
They expect one model to solve every problem on its own. This mindset limits long-term value. Open LLMs deliver their strongest [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":3142,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[86,49],"tags":[],"class_list":["post-3137","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-agentic-ai","category-artificial-intelligence"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v25.0 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Open LLMs as Infrastructure, Not Products | Yodaplus Technologies<\/title>\n<meta name=\"description\" content=\"Why open LLMs work best as infrastructure inside reliable agentic AI systems rather than standalone AI products.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Open LLMs as Infrastructure, Not Products | Yodaplus Technologies\" \/>\n<meta property=\"og:description\" content=\"Why open LLMs work best as infrastructure inside reliable agentic AI systems rather than standalone AI products.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/\" \/>\n<meta property=\"og:site_name\" content=\"Yodaplus Technologies\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/m.facebook.com\/yodaplustech\/\" \/>\n<meta property=\"article:published_time\" content=\"2026-01-13T06:18:02+00:00\" \/>\n<meta property=\"og:image\" 
content=\"https:\/\/yodaplus.com\/blog\/wp-content\/uploads\/2026\/01\/Open-LLMs-as-Infrastructure-Not-Products.png\" \/>\n\t<meta property=\"og:image:width\" content=\"1081\" \/>\n\t<meta property=\"og:image:height\" content=\"722\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Yodaplus\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@yodaplustech\" \/>\n<meta name=\"twitter:site\" content=\"@yodaplustech\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Yodaplus\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"4 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":[\"Article\",\"BlogPosting\"],\"@id\":\"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/\"},\"author\":{\"name\":\"Yodaplus\",\"@id\":\"https:\/\/yodaplus.com\/blog\/#\/schema\/person\/b9d05d8179b088323926de247987842a\"},\"headline\":\"Open LLMs as Infrastructure, Not Products\",\"datePublished\":\"2026-01-13T06:18:02+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/\"},\"wordCount\":669,\"publisher\":{\"@id\":\"https:\/\/yodaplus.com\/blog\/#organization\"},\"image\":{\"@id\":\"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/yodaplus.com\/blog\/wp-content\/uploads\/2026\/01\/Open-LLMs-as-Infrastructure-Not-Products.png\",\"articleSection\":[\"Agentic AI\",\"Artificial 
Intelligence\"],\"inLanguage\":\"en-US\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/\",\"url\":\"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/\",\"name\":\"Open LLMs as Infrastructure, Not Products | Yodaplus Technologies\",\"isPartOf\":{\"@id\":\"https:\/\/yodaplus.com\/blog\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/yodaplus.com\/blog\/wp-content\/uploads\/2026\/01\/Open-LLMs-as-Infrastructure-Not-Products.png\",\"datePublished\":\"2026-01-13T06:18:02+00:00\",\"description\":\"Why open LLMs work best as infrastructure inside reliable agentic AI systems rather than standalone AI products.\",\"breadcrumb\":{\"@id\":\"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/#primaryimage\",\"url\":\"https:\/\/yodaplus.com\/blog\/wp-content\/uploads\/2026\/01\/Open-LLMs-as-Infrastructure-Not-Products.png\",\"contentUrl\":\"https:\/\/yodaplus.com\/blog\/wp-content\/uploads\/2026\/01\/Open-LLMs-as-Infrastructure-Not-Products.png\",\"width\":1081,\"height\":722,\"caption\":\"Open LLMs as Infrastructure, Not Products\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/yodaplus.com\/blog\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Open LLMs as 
Infrastructure, Not Products\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/yodaplus.com\/blog\/#website\",\"url\":\"https:\/\/yodaplus.com\/blog\/\",\"name\":\"Yodaplus Technologies\",\"description\":\"\",\"publisher\":{\"@id\":\"https:\/\/yodaplus.com\/blog\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/yodaplus.com\/blog\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/yodaplus.com\/blog\/#organization\",\"name\":\"Yodaplus Technologies Private Limited\",\"url\":\"https:\/\/yodaplus.com\/blog\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/yodaplus.com\/blog\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/yodaplus.com\/blog\/wp-content\/uploads\/2025\/02\/yodaplus_logo_1.png\",\"contentUrl\":\"https:\/\/yodaplus.com\/blog\/wp-content\/uploads\/2025\/02\/yodaplus_logo_1.png\",\"width\":500,\"height\":500,\"caption\":\"Yodaplus Technologies Private Limited\"},\"image\":{\"@id\":\"https:\/\/yodaplus.com\/blog\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/m.facebook.com\/yodaplustech\/\",\"https:\/\/x.com\/yodaplustech\"]},{\"@type\":\"Person\",\"@id\":\"https:\/\/yodaplus.com\/blog\/#\/schema\/person\/b9d05d8179b088323926de247987842a\",\"name\":\"Yodaplus\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/yodaplus.com\/blog\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/c1309be20047952d3cb894935d9b0c69?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/c1309be20047952d3cb894935d9b0c69?s=96&d=mm&r=g\",\"caption\":\"Yodaplus\"},\"sameAs\":[\"https:\/\/yodaplus.com\/blog\"],\"url\":\"https:\/\/yodaplus.com\/blog\/author\/admin_yoda\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"Open LLMs as Infrastructure, Not Products | Yodaplus Technologies","description":"Why open LLMs work best as infrastructure inside reliable agentic AI systems rather than standalone AI products.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/","og_locale":"en_US","og_type":"article","og_title":"Open LLMs as Infrastructure, Not Products | Yodaplus Technologies","og_description":"Why open LLMs work best as infrastructure inside reliable agentic AI systems rather than standalone AI products.","og_url":"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/","og_site_name":"Yodaplus Technologies","article_publisher":"https:\/\/m.facebook.com\/yodaplustech\/","article_published_time":"2026-01-13T06:18:02+00:00","og_image":[{"width":1081,"height":722,"url":"https:\/\/yodaplus.com\/blog\/wp-content\/uploads\/2026\/01\/Open-LLMs-as-Infrastructure-Not-Products.png","type":"image\/png"}],"author":"Yodaplus","twitter_card":"summary_large_image","twitter_creator":"@yodaplustech","twitter_site":"@yodaplustech","twitter_misc":{"Written by":"Yodaplus","Est. 
reading time":"4 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":["Article","BlogPosting"],"@id":"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/#article","isPartOf":{"@id":"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/"},"author":{"name":"Yodaplus","@id":"https:\/\/yodaplus.com\/blog\/#\/schema\/person\/b9d05d8179b088323926de247987842a"},"headline":"Open LLMs as Infrastructure, Not Products","datePublished":"2026-01-13T06:18:02+00:00","mainEntityOfPage":{"@id":"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/"},"wordCount":669,"publisher":{"@id":"https:\/\/yodaplus.com\/blog\/#organization"},"image":{"@id":"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/#primaryimage"},"thumbnailUrl":"https:\/\/yodaplus.com\/blog\/wp-content\/uploads\/2026\/01\/Open-LLMs-as-Infrastructure-Not-Products.png","articleSection":["Agentic AI","Artificial Intelligence"],"inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/","url":"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/","name":"Open LLMs as Infrastructure, Not Products | Yodaplus Technologies","isPartOf":{"@id":"https:\/\/yodaplus.com\/blog\/#website"},"primaryImageOfPage":{"@id":"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/#primaryimage"},"image":{"@id":"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/#primaryimage"},"thumbnailUrl":"https:\/\/yodaplus.com\/blog\/wp-content\/uploads\/2026\/01\/Open-LLMs-as-Infrastructure-Not-Products.png","datePublished":"2026-01-13T06:18:02+00:00","description":"Why open LLMs work best as infrastructure inside reliable agentic AI systems rather than standalone AI 
products.","breadcrumb":{"@id":"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/#primaryimage","url":"https:\/\/yodaplus.com\/blog\/wp-content\/uploads\/2026\/01\/Open-LLMs-as-Infrastructure-Not-Products.png","contentUrl":"https:\/\/yodaplus.com\/blog\/wp-content\/uploads\/2026\/01\/Open-LLMs-as-Infrastructure-Not-Products.png","width":1081,"height":722,"caption":"Open LLMs as Infrastructure, Not Products"},{"@type":"BreadcrumbList","@id":"https:\/\/yodaplus.com\/blog\/open-llms-as-infrastructure-not-products\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/yodaplus.com\/blog\/"},{"@type":"ListItem","position":2,"name":"Open LLMs as Infrastructure, Not Products"}]},{"@type":"WebSite","@id":"https:\/\/yodaplus.com\/blog\/#website","url":"https:\/\/yodaplus.com\/blog\/","name":"Yodaplus Technologies","description":"","publisher":{"@id":"https:\/\/yodaplus.com\/blog\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/yodaplus.com\/blog\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/yodaplus.com\/blog\/#organization","name":"Yodaplus Technologies Private 
Limited","url":"https:\/\/yodaplus.com\/blog\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/yodaplus.com\/blog\/#\/schema\/logo\/image\/","url":"https:\/\/yodaplus.com\/blog\/wp-content\/uploads\/2025\/02\/yodaplus_logo_1.png","contentUrl":"https:\/\/yodaplus.com\/blog\/wp-content\/uploads\/2025\/02\/yodaplus_logo_1.png","width":500,"height":500,"caption":"Yodaplus Technologies Private Limited"},"image":{"@id":"https:\/\/yodaplus.com\/blog\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/m.facebook.com\/yodaplustech\/","https:\/\/x.com\/yodaplustech"]},{"@type":"Person","@id":"https:\/\/yodaplus.com\/blog\/#\/schema\/person\/b9d05d8179b088323926de247987842a","name":"Yodaplus","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/yodaplus.com\/blog\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/c1309be20047952d3cb894935d9b0c69?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/c1309be20047952d3cb894935d9b0c69?s=96&d=mm&r=g","caption":"Yodaplus"},"sameAs":["https:\/\/yodaplus.com\/blog"],"url":"https:\/\/yodaplus.com\/blog\/author\/admin_yoda\/"}]}},"_links":{"self":[{"href":"https:\/\/yodaplus.com\/blog\/wp-json\/wp\/v2\/posts\/3137","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/yodaplus.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/yodaplus.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/yodaplus.com\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/yodaplus.com\/blog\/wp-json\/wp\/v2\/comments?post=3137"}],"version-history":[{"count":1,"href":"https:\/\/yodaplus.com\/blog\/wp-json\/wp\/v2\/posts\/3137\/revisions"}],"predecessor-version":[{"id":3147,"href":"https:\/\/yodaplus.com\/blog\/wp-json\/wp\/v2\/posts\/3137\/revisions\/3147"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/yodaplus.com\/blog\/wp-json\/wp\/v2\/media\/3142"}],"wp:attachment":[{"href":"ht
tps:\/\/yodaplus.com\/blog\/wp-json\/wp\/v2\/media?parent=3137"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/yodaplus.com\/blog\/wp-json\/wp\/v2\/categories?post=3137"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/yodaplus.com\/blog\/wp-json\/wp\/v2\/tags?post=3137"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}