Integrate LM Studio with Jan
Quick Introduction
With LM Studio, you can discover, download, and run local Large Language Models (LLMs). In this guide, we will show you how to integrate and use your existing LM Studio models with Jan using two methods. The first method is integrating the LM Studio server with the Jan UI. The second method is migrating a model you have already downloaded in LM Studio over to Jan. We will use the Phi 2 - GGUF model from Hugging Face as an example.
Steps to Integrate LM Studio Server with Jan UI
1. Start the LM Studio Server
- Navigate to the Local Inference Server in the LM Studio application.
- Select the model you want to use.
- Start the server after configuring the server port and options.
- Modify the openai.json file in the ~/jan/engines folder to include the full URL of the LM Studio server.
{
  "full_url": "http://localhost:<port>/v1/chat/completions"
}
- Replace <port> with the port number you set in the LM Studio server. The default port is 1234.
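Before moving on, it can help to confirm that the LM Studio server is actually reachable at that URL. The Python sketch below sends a minimal chat completion request to the endpoint configured above; the default port 1234 and the prompt are assumptions, so adjust them to match your setup.

import json
import urllib.request

# Assumed default LM Studio server endpoint; change the port if you configured another one.
URL = "http://localhost:1234/v1/chat/completions"

payload = {
    "messages": [{"role": "user", "content": "Say hello in one short sentence."}],
    "temperature": 0.7,
}

request = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# If this prints a reply, Jan's openai.json can point at the same URL.
with urllib.request.urlopen(request) as response:
    body = json.loads(response.read())
    print(body["choices"][0]["message"]["content"])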
2. Modify a Model JSON
Navigate to the ~/jan/models folder. Create a folder named <lmstudio-modelname>, for example, lmstudio-phi-2, and create a model.json file inside the folder with the following configurations:
- Set the format property to api.
- Set the engine property to openai.
- Set the state property to ready.
{
  "sources": [
    {
      "filename": "phi-2-GGUF",
      "url": "https://huggingface.co/TheBloke/phi-2-GGUF"
    }
  ],
  "id": "lmstudio-phi-2",
  "object": "model",
  "name": "LM Studio - Phi 2 - GGUF",
  "version": "1.0",
  "description": "TheBloke/phi-2-GGUF",
  "format": "api",
  "settings": {},
  "parameters": {},
  "metadata": {
    "author": "Microsoft",
    "tags": ["General", "Big Context Length"]
  },
  "engine": "openai"
}
3. Start the Model
- Restart Jan and navigate to the Hub.
- Locate your model and click the Use button.
4. Try Out the Integration of Jan and LM Studio
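Start a new chat thread in Jan, select LM Studio - Phi 2 - GGUF from the model selector, and send a message. The reply will be generated by the model running on the LM Studio server, so keep that server running while you chat.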
Steps to Migrate Your Downloaded Model from LM Studio to Jan (version 0.4.6 and older)
1. Migrate Your Downloaded Model
- Navigate to My Models in the LM Studio application and reveal the model folder.
- Copy the model folder that you want to migrate into the ~/jan/models folder (see the sketch after this list for a scripted version of this step).
- Ensure the folder name matches the model name in the .gguf filename, changing the folder name if necessary. For example, in this case, we changed the folder name from TheBloke to phi-2.Q4_K_S.
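If you prefer to script the copy-and-rename step, the sketch below is a minimal example assuming the default LM Studio model location shown later in this guide and the default Jan data folder; both paths are illustrative and should be adjusted to your machine.

import shutil
from pathlib import Path

# Illustrative paths; use the folder revealed from "My Models" and your own Jan data folder.
source = Path.home() / ".cache/lm-studio/models/TheBloke/phi-2-GGUF"
destination = Path.home() / "jan/models/phi-2.Q4_K_S"  # folder name matches the .gguf filename

# Copy the whole model folder into ~/jan/models under the new name.
shutil.copytree(source, destination)
print(f"Copied {source} -> {destination}")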
2. Start the Model
- Restart Jan and navigate to the Hub. Jan will automatically detect the model and display it in the Hub.
- Locate your model and click the Use button to try the migrated model.
Steps to Point Jan to a Model Downloaded with LM Studio (version 0.4.7+)
Starting from version 0.4.7, Jan supports importing models using an absolute filepath, so you can directly use the model from the LM Studio folder.
1. Reveal the Model Absolute Path
Navigate to My Models in the LM Studio application and reveal the model folder. Then, you can get the absolute path of your model.
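If you prefer the terminal to the LM Studio interface, the short sketch below lists the absolute path of every downloaded .gguf file. The default models folder used here matches the example path later in this section and may differ on your system.

from pathlib import Path

# Assumed default LM Studio models folder on macOS/Linux; adjust if you moved it.
models_dir = Path.home() / ".cache/lm-studio/models"

# Print the absolute path of each downloaded GGUF file.
for gguf in sorted(models_dir.rglob("*.gguf")):
    print(gguf.resolve())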
2. Modify a Model JSON
Navigate to the ~/jan/models folder. Create a folder named <modelname>, for example, phi-2.Q4_K_S, and create a model.json file inside the folder with the following configurations:
- Ensure the id property matches the folder name you created.
- Ensure the url property points to the model file. Normally this is the direct binary download link ending in .gguf, but from version 0.4.7 you can use the absolute filepath of the model file instead. In this example, the absolute filepath is /Users/<username>/.cache/lm-studio/models/TheBloke/phi-2-GGUF/phi-2.Q4_K_S.gguf.
- Ensure the engine property is set to nitro.
{
  "object": "model",
  "version": 1,
  "format": "gguf",
  "sources": [
    {
      "filename": "phi-2.Q4_K_S.gguf",
      "url": "<absolute-path-of-model-file>"
    }
  ],
  "id": "phi-2.Q4_K_S",
  "name": "phi-2.Q4_K_S",
  "created": 1708308111506,
  "description": "phi-2.Q4_K_S - user self import model",
  "settings": {
    "ctx_len": 4096,
    "embedding": false,
    "prompt_template": "{system_message}\n### Instruction: {prompt}\n### Response:",
    "llama_model_path": "phi-2.Q4_K_S.gguf"
  },
  "parameters": {
    "temperature": 0.7,
    "top_p": 0.95,
    "stream": true,
    "max_tokens": 2048,
    "stop": ["<endofstring>"],
    "frequency_penalty": 0,
    "presence_penalty": 0
  },
  "metadata": {
    "size": 1615568736,
    "author": "User",
    "tags": []
  },
  "engine": "nitro"
}
- If you are using Windows, you need to use double backslashes in the url property, for example: C:\\Users\\username\\filename.gguf.
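Because the double-backslash escaping is easy to get wrong by hand, one option is to generate model.json from a small script and let the JSON encoder handle the escaping. The sketch below mirrors the sample configuration above; the model path is illustrative, and json.dump writes backslashes in Windows paths correctly on its own.

import json
import time
from pathlib import Path

# Absolute path to the downloaded GGUF file (illustrative; use the path revealed from "My Models").
model_file = Path.home() / ".cache/lm-studio/models/TheBloke/phi-2-GGUF/phi-2.Q4_K_S.gguf"
model_id = "phi-2.Q4_K_S"

model_json = {
    "object": "model",
    "version": 1,
    "format": "gguf",
    "sources": [{"filename": model_file.name, "url": str(model_file)}],
    "id": model_id,
    "name": model_id,
    "created": int(time.time() * 1000),
    "description": f"{model_id} - user self import model",
    "settings": {
        "ctx_len": 4096,
        "embedding": False,
        "prompt_template": "{system_message}\n### Instruction: {prompt}\n### Response:",
        "llama_model_path": model_file.name,
    },
    "parameters": {"temperature": 0.7, "top_p": 0.95, "stream": True, "max_tokens": 2048},
    "metadata": {"author": "User", "tags": []},
    "engine": "nitro",
}

# Write ~/jan/models/<modelname>/model.json; json.dump escapes Windows backslashes for you.
target_dir = Path.home() / "jan" / "models" / model_id
target_dir.mkdir(parents=True, exist_ok=True)
with open(target_dir / "model.json", "w", encoding="utf-8") as handle:
    json.dump(model_json, handle, indent=2)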
3. Start the Model
- Restart Jan and navigate to the Hub.
- Jan will automatically detect the model and display it in the Hub.
- Locate your model and click the Use button to try the imported model.