{"version":"1.0","type":"rich","provider_name":"Acast","provider_url":"https://acast.com","height":250,"width":700,"html":"<iframe src=\"https://embed.acast.com/$/67aa9138f21071868cf73963/67bf91ec3beb1d1463bf7b6e?\" frameBorder=\"0\" width=\"700\" height=\"250\"></iframe>","title":"How does RAG work with KGs?","thumbnail_width":200,"thumbnail_height":200,"thumbnail_url":"https://open-images.acast.com/shows/67aa9138f21071868cf73963/1740607951165-07e680fa-18f9-4494-ae15-fd0a2f424b73.jpeg?height=200","description":"<p>Today I was joined by David Hyland-Wood to discuss how Retrieval-Augmented Generation (RAG) works with Knowledge Graphs. We also talked about LLM bullshit, examined the relationship between RAG systems, knowledge graphs, and open-source large language models, and explored the importance of modeling knowledge graphs for effective output.</p>","author_name":"Jamie McCusker"}