Day 82 of 100 Days of AI

I have a 7-billion-parameter large language model running locally on my MacBook thanks to Ollama!
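For anyone who wants to try this, getting a model running with Ollama takes only a couple of commands once the app is installed (the model name here is the Mistral 7B entry in Ollama's registry):

```shell
# Download the Mistral 7B model (a few GB on first run)
ollama pull mistral

# Start an interactive chat, or pass a one-off prompt directly
ollama run mistral "Write a simple Tic-Tac-Toe game in Python."
```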

This is great when travelling and there's no Wi-Fi to access ChatGPT or other hosted large language models. What I can run on my MacBook today is not as powerful as GPT-4o, but it's still handy for quick queries.

In the example below, I gave the Mistral 7B model a small trial task: create a simple Tic-Tac-Toe game in Python that I can run from my command-line terminal. Below is the code it provided.

I had to fix two small errors, and then it worked almost immediately. The game logic isn't quite right (wins are announced one move too late), but the program was broadly in the right direction.
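A "win announced one move too late" bug typically comes from checking for a winner before the latest move has been placed (or only at the start of the next turn). The sketch below shows the fix: place the move first, then check the board. The board representation and function names here are illustrative, not the model's actual code.

```python
# All eight winning lines on a 3x3 board, indexed 0-8 left-to-right, top-to-bottom.
WIN_LINES = [
    (0, 1, 2), (3, 4, 5), (6, 7, 8),  # rows
    (0, 3, 6), (1, 4, 7), (2, 5, 8),  # columns
    (0, 4, 8), (2, 4, 6),             # diagonals
]

def has_won(board, player):
    """Return True if `player` occupies any complete line."""
    return any(all(board[i] == player for i in line) for line in WIN_LINES)

def play_move(board, pos, player):
    """Place the move FIRST, then check for a win, so it is announced immediately."""
    board[pos] = player
    return has_won(board, player)

board = [" "] * 9
play_move(board, 0, "X")
play_move(board, 3, "O")
play_move(board, 1, "X")
play_move(board, 4, "O")
print(play_move(board, 2, "X"))  # X completes the top row → True
```

Checking after the move (rather than at the top of the next loop iteration) is the one-line ordering change that makes the win show up on the turn it actually happens.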

Below is the output of the game while it's running.

In the near future we are going to be able to run significantly more powerful large language models locally, all without an internet connection. This will be driven by a variety of model optimisation techniques (see my post about Apple's work, for example) and by improvements in computer hardware.