Local GenAI LLMs with Ollama and Docker (Stream 262)


👉 Edited version of this stream:    • Local GenAI LLMs with Ollama and Dock...  
Learn how to run your own local ChatGPT clone and GitHub Copilot clone by setting up Ollama and Docker's "GenAI Stack" to build apps on top of open source LLMs and closed-source SaaS models (GPT-4, etc.). Matt Williams is our guest to walk us through all the parts of this solution and show us how Ollama makes it easier to set up custom LLM stacks on Mac, Windows, and Linux.
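Want to try it before (or while) watching? Here's a rough sketch of the setup we cover, assuming you already have Ollama and Docker Compose installed (the model name below is just an example):

# run an open source model locally with Ollama
ollama pull llama2
ollama run llama2

# bring up Docker's GenAI Stack sample apps
git clone https://github.com/docker/genai-stack
cd genai-stack
docker compose up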

🗞️ Sign up for my weekly newsletter for the latest on upcoming guests and what I'm releasing: https://www.bretfisher.com/newsletter/

Matt Williams
============
  / technovangelist  

Nirmal Mehta
============
  / nirmalkmehta  
  / normalfaults  
https://hachyderm.io/@nirmal

Bret Fisher
============
  / bretefisher  
  / bretfisher  
https://www.bretfisher.com

Join my Community 🤜🤛
================
💌 Weekly newsletter on upcoming guests and stuff I'm working on: https://www.bretfisher.com/newsletter/
💬 Join the discussion on our Discord chat server   / discord  
👨‍🏫 Coupons for my Docker and Kubernetes courses https://www.bretfisher.com/courses/
🎙️ Podcast of this show https://www.bretfisher.com/podcast

Show Music 🎵
==========
waiting music: Jakarta - Bonsaye https://www.epidemicsound.com/track/Y...
intro music: I Need A Remedy (Instrumental Version) - Of Men And Wolves https://www.epidemicsound.com/track/z...
outro music: Electric Ballroom - Quesa https://www.epidemicsound.com/track/K...

