Accelerate Generative AI Inference on Amazon SageMaker AI with G7e Instances

Today, we are thrilled to announce the availability of G7e instances powered by NVIDIA RTX PRO 6000 Blackwell Server Edition GPUs on Amazon…

AWS Machine Learning Blog

Run Llama 2 uncensored locally (Ollama Blog, August 1, 2023)
This post gives some example comparisons of the Llama 2 uncensored model versus its censored counterpart.

Run Code Llama locally (Ollama Blog, August 24, 2023)
Meta's Code Llama is now available on Ollama to try.

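Assuming the Ollama CLI is installed and its daemon is running, trying the model is a one-line affair; the `codellama` tag below is the default variant and stands in for whichever size you choose to pull:

```shell
# Pull Code Llama (on first use) and start an interactive session.
# Assumes the Ollama CLI is installed and the server is running.
ollama run codellama
```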
How to prompt Code Llama (Ollama Blog, September 9, 2023)
This guide walks through the different ways to structure prompts for Code Llama and its variations, including instructions, code completion, and fill-in-the-middle (FIM).

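As a sketch of the fill-in-the-middle idea: Code Llama's code variants accept a prompt that marks the text before and after the insertion point with special tokens, and the model generates the missing middle. The helper below only assembles such a prompt string; verify the exact token spelling and spacing against the guide before relying on it.

```python
# Sketch: assembling a fill-in-the-middle (FIM) prompt for Code Llama's
# code variants. The <PRE>/<SUF>/<MID> markers follow the format shown
# in the prompting guide; check them against the model card you use.
def fim_prompt(prefix: str, suffix: str) -> str:
    """Build a FIM prompt asking the model to generate the code
    that belongs between `prefix` and `suffix`."""
    return f"<PRE> {prefix} <SUF>{suffix} <MID>"

prompt = fim_prompt("def compute_gcd(x, y):", "    return result")
print(prompt)
```

The model's completion would then be the body of the function, to be spliced between the prefix and suffix.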
Leveraging LLMs in your Obsidian Notes (Ollama Blog, September 21, 2023)
This post walks through how you could incorporate a local LLM using Ollama into Obsidian, or potentially any note-taking tool.

Ollama is now available as an official Docker image (Ollama Blog, October 5, 2023)
Ollama can now run with Docker Desktop on the Mac, and inside Docker containers with GPU acceleration on Linux.

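As a sketch of the CPU-only container setup the post describes (the image name and port are Ollama's documented defaults; GPU variants add runtime flags not shown here):

```shell
# Run Ollama in a container, persisting pulled models in a named volume
# and exposing the API on its default port. Assumes Docker is installed.
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# Then run a model inside the container (model name illustrative):
docker exec -it ollama ollama run llama2
```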
Vision models (Ollama Blog, February 2, 2024)
New vision models are now available: LLaVA 1.6, in 7B, 13B, and 34B parameter sizes. These models support higher-resolution images, improved text recognition, and logical reasoning.

OpenAI compatibility (Ollama Blog, February 8, 2024)
Ollama now has initial compatibility with the OpenAI Chat Completions API, making it possible to use existing tooling built for OpenAI with local models via Ollama.

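As a minimal sketch of what that compatibility means in practice: an OpenAI-style Chat Completions request body can be sent to a local endpoint instead of OpenAI's. Only the payload construction is shown, since actually sending it requires a running Ollama server; the model name ("llama2") and the default endpoint URL are illustrative.

```python
# Sketch: an OpenAI-style Chat Completions request body that an
# OpenAI-compatible local server such as Ollama can accept.
# Sending it requires a running server; only construction is shown.
import json

# Ollama's default local endpoint (illustrative).
OLLAMA_CHAT_URL = "http://localhost:11434/v1/chat/completions"

def chat_body(model: str, user_message: str) -> str:
    """Serialize a one-turn Chat Completions request as JSON."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    })

body = chat_body("llama2", "Why is the sky blue?")
print(body)
```

Because the request shape matches OpenAI's, existing client libraries can typically be repointed at the local URL without other changes.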
Ollama now supports AMD graphics cards (Ollama Blog, March 14, 2024)
Ollama now supports AMD graphics cards in preview on Windows and Linux, so all of Ollama's features can now be accelerated by AMD GPUs on both platforms.