Simon Willison's Weblog


That model was trained in part using their unreleased R1 "reasoning" model. Today they've released R1 itself, along with a whole family of new models derived from that base.

There's a lot of stuff in the new release.

DeepSeek-R1-Zero appears to be the base model. It's over 650GB in size and, like most of their other releases, is under a clean MIT license. DeepSeek warn that "DeepSeek-R1-Zero encounters challenges such as endless repetition, poor readability, and language mixing." ... so they also released:

DeepSeek-R1, which "incorporates cold-start data before RL" and "achieves performance comparable to OpenAI-o1 across math, code, and reasoning tasks". That one is also MIT licensed, and is a similar size.

I don't have the ability to run models larger than about 50GB (I have an M2 with 64GB of RAM), so neither of these two models are something I can easily play with myself. That's where the new distilled models come in.

To support the research community, we have open-sourced DeepSeek-R1-Zero, DeepSeek-R1, and six dense models distilled from DeepSeek-R1 based on Llama and Qwen.

This is a fascinating flex! They have models based on Qwen 2.5 (14B, 32B, Math 1.5B and Math 7B) and Llama 3 (Llama-3.1 8B and Llama 3.3 70B Instruct).

Weirdly those Llama models have an MIT license attached, which I'm not sure is compatible with the underlying Llama license. Qwen models are Apache licensed so maybe MIT is OK?

(I also just noticed the MIT license files say "Copyright (c) 2023 DeepSeek" so they may need to pay a bit more attention to how they copied those in.)

Licensing aside, these distilled models are fascinating beasts.

Running DeepSeek-R1-Distill-Llama-8B-GGUF

Quantized versions are already starting to show up. So far I've tried just one of those, unsloth/DeepSeek-R1-Distill-Llama-8B-GGUF released by Unsloth AI, and it's really fun to play with.

I'm running it using the combination of Ollama, LLM and the llm-ollama plugin.

First I fetched and ran the model using Ollama itself:
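Ollama can pull GGUF models directly from Hugging Face, so the command is presumably along these lines, using Unsloth's repo name from above:

```bash
ollama run hf.co/unsloth/DeepSeek-R1-Distill-Llama-8B-GGUF
```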

This downloads around 8.5GB of model data and starts an interactive chat interface.
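If you'd rather grab a different quantization, Ollama's Hugging Face integration lets you name one in the tag, assuming Unsloth published that file:

```bash
# Q4_K_M is a hypothetical pick; check the repo for the quants actually available
ollama run hf.co/unsloth/DeepSeek-R1-Distill-Llama-8B-GGUF:Q4_K_M
```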

Once the model has been fetched, LLM can talk to it too. I prefer using LLM for experiments because it logs everything to SQLite for later exploration.
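Those logs can be poked at from the CLI as well; if I have the interface right, these show where the database lives and replay the most recent response:

```bash
llm logs path  # print the location of the logs.db SQLite file
llm logs -n 1  # show the most recent logged prompt and response
```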

I installed the plugin and ran it like this:
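Assuming Ollama's default :latest tag for the Hugging Face pull, that's:

```bash
llm install llm-ollama
llm -m 'hf.co/unsloth/DeepSeek-R1-Distill-Llama-8B-GGUF:latest' \
  'a joke about a pelican and a walrus who run a tea room together'
```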

Each response starts with a <think> pseudo-XML tag containing the chain of thought used to help generate the response. For "a joke about a pelican and a walrus who run a tea room together" that started like this:

Okay, so I need to come up with a joke about a pelican and a walrus running a tea room together. Hmm, that's an interesting combination. Let me think about how these two characters might interact in a humorous situation.

First, let's consider their characteristics. Pelicans are known for their long beaks and webbed feet, often seen near the beach or water. Walruses have big teeth, thick fur, and they're generally found in colder climates, like icebergs or snowy areas. So, bringing these two into a tea room setting is already a funny image.

[...]
It then thought for 20 paragraphs before outputting the joke! Here's the full transcript and here's the output at the end, after the </think> tag:

The Joke:

A pelican and a walrus decide to open a quaint little tea room together. The walrus, with its big size, struggles to find comfortable chairs, so it sits on the table by accident, knocking over the teapot. Meanwhile, the pelican, trying to help, uses its beak to place saucers on the table, causing a few spills.

After a series of comical mishaps, the walrus looks up and says with a grin, "This isn't so fishy anymore." The pelican smirks and remarks, "Maybe not, but we do have a lot of krill in our tea!"

Explanation:

- Setup: Pelican and Walrus run an unlikely tea room. Their sizes lead to comical chaos.