Running Local LLMs

Nick Gerakines will talk about running your own LLM. It's easier than you might think: a wide range of tools and platforms make it possible.
If you have the right hardware and a little elbow grease, you can run your own chatbots and interact with LLMs through session interfaces and APIs.
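As a taste of the API side, here is a minimal sketch of querying a locally running Ollama server using only the Python standard library. It assumes Ollama is serving its default REST API at localhost:11434, and "llama3" stands in for whatever model you have pulled locally.

```python
import json
import urllib.request
import urllib.error

# Ollama serves a REST API at this address by default.
url = "http://localhost:11434/api/generate"

# Request body for a single (non-streaming) completion.
# "llama3" is an example model tag; use any model you have pulled.
payload = {
    "model": "llama3",
    "prompt": "Why is the sky blue?",
    "stream": False,  # return one JSON object instead of a token stream
}

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

try:
    with urllib.request.urlopen(req, timeout=60) as resp:
        # The generated text lives in the "response" field.
        print(json.load(resp)["response"])
except urllib.error.URLError:
    print("Ollama is not running on localhost:11434")
```

The same endpoint backs the `ollama run` command-line experience, so anything you can do interactively you can also script.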

Who am I?

I’m Nick Gerakines, a software engineer and technology enthusiast. I’ve worked in the AI/ML/CS space for a short while at UDRI / AFRL, and in the private sector at companies like PredictAP, DataDog, and Mattel.

Presentation Outline

  • Title
  • Who am I?
  • LLM primer
  • The LLM landscape
  • Sizzle demo – Asking a local LLM some questions
  • Ollama overview
  • Model directory
  • Running Ollama
  • Using the ollama run command
  • Interacting with Ollama from Python
  • Closing
  • Bonus: GitHub

© Gem City Machine Learning (ML)