Private AI assistants#

When: Wednesday 29 May
Presenters: Alexandre Boucaud
Article writers: Alexandre Boucaud and Llama3

TL;DR#

  • Ollama - Service to download and use open source AI models on your system (privately)

  • Open WebUI - Web interface to chat with local AI models in a ChatGPT-like environment

  • Cody - IDE extension to use Ollama models as code assistants - integrates into VSCode and PyCharm

  • llm - Command line interface to chat with any language model (remote or local) in your terminal

Rationale (by Llama3)#

As we navigate the fast-paced world of technology, it’s easy to get bogged down in the daily grind. Between programming, writing, and meeting deadlines, there are only so many hours in a day. That’s where private virtual assistants come in – AI-powered tools designed to help you stay ahead of the curve.

In this post, we’ll be exploring the world of virtual assistants and how they can simplify your workflow. We’ll also take a closer look at Ollama, a cutting-edge platform that lets you harness the power of pre-trained models right from your own computer.

The benefits of virtual assistants for APC members#

Here are a few examples of how virtual assistants help your APC colleagues:

  • Translate sentences and documents accurately, in both directions. Foreign colleagues have used this, for instance, to successfully handle exchanges with the French administration :)

  • Generate ideas on a topic to kick-start the writing and avoid the blank page, or produce templates for emails, letters, etc.

  • Rewrite paragraphs when there is a constraint to meet (e.g. a maximum number of characters).

  • Act as a coding assistant: help with tedious tasks like adding comments or documenting the code, explain how a piece of code works in natural language, or produce code examples from a natural language description of the problem.

Ollama: a private virtual assistant platform#

Ollama is a popular private virtual assistant platform that lets you tap into pre-trained models for natural language processing (NLP), computer vision, and more. With Ollama, you can:

  • Access pre-trained models: download and use the latest open source models for your needs, 100% locally, and in one click.

  • Integrate with local tools: run Ollama on your own machine, integrating it seamlessly with your development environment, your terminal or even your browser.

  • Maintain control over your data: all communications between you and the private assistant remain fully private and are not shared over the internet.

  • Work when there is no access to the internet.

Installation and usage#

Ollama is compatible with Windows, macOS and Linux.

You can install it either via the command line

curl -fsSL https://ollama.com/install.sh | sh

or by downloading the app from the Ollama website.
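Once installed, you can check that the command line tool is available. As a quick sketch (the desktop app usually starts the Ollama server automatically; otherwise `ollama serve` starts it manually):

```shell
# Check that the CLI is installed and print its version
ollama --version

# Start the Ollama server manually if it is not already running
ollama serve
```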

Common usage#

Getting started is then as simple as downloading models from the library

ollama pull mistral

and then running them

ollama run mistral
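Beyond the interactive session, `ollama run` also accepts a one-off prompt, and `ollama list` shows which models are already on your machine. A short sketch, assuming the `mistral` model has been pulled as above:

```shell
# List the models already downloaded locally
ollama list

# Ask a one-off question without entering the interactive prompt
ollama run mistral "Summarise what Ollama does in one sentence."
```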

Leveraging Ollama models#

There is now a plethora of tools that integrate with Ollama. Many are listed at the bottom of the README of Ollama’s GitHub repository.

Below is a subjective selection of these tools.

Open WebUI#

Open WebUI is a popular open source web interface that lets you interact with a local language model in your browser, much as you would with ChatGPT.
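As a rough sketch of getting started (installation methods evolve, and the project also documents a Docker-based setup; check its documentation for current instructions), Open WebUI can be installed as a Python package:

```shell
# Install Open WebUI (requires a recent Python version)
pip install open-webui

# Start the server, then open http://localhost:8080 in your browser;
# it will detect a locally running Ollama instance
open-webui serve
```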

Cody: the coding assistant#

Cody is one of the first coding assistants to provide an API for integrating with Ollama models. It can be installed in VSCode and PyCharm.

The main features of Cody are:

  • code completion, refactoring, and documentation of the code

  • a natural language explanation of a piece of code or an entire repository to help you understand it (e.g. “what does this function do?”); you can also “chat” with a code repository to get more relevant answers

  • code examples produced from a natural language description of the problem (e.g. Stack Overflow style)

  • unit test generation for your code

llm: a command line interface for AI assistants#

Install the llm tool

pip install llm

and the plugin to interact with local models from Ollama

llm install llm-ollama

If Ollama is running on your computer, you can list the available models:

llm models

and see something like (provided you have already downloaded some models):

...
Ollama: mistral:latest (aliases: mistral)
Ollama: llama3:latest (aliases: llama3)
Ollama: aya:latest (aliases: aya)
...

You can then interact with the models using llm, either through a single prompt

llm -m mistral "What is Ollama?"

by inserting the contents of a file into your prompt, or by piping it in

llm -m aya "Please translate into French the text $(cat /path/to/english.txt)"

or by launching a chat

llm chat -m llama3
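The piping mentioned above works because llm also reads its input from standard input; the `-s` flag attaches a system prompt, which is a convenient way to pair an instruction with piped content (a sketch reusing the `aya` model and file path from the example above):

```shell
# Pipe a file into the model, with the instruction as a system prompt
cat /path/to/english.txt | llm -m aya -s "Translate this text into French"
```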

Tip

llm also works with OpenAI models (and most other major commercial models) if you connect it to your account via an API token.