# Connect a model

Your Scorable subscription provides [a set of models](https://scorable.ai/settings/llm-accounts) you can use in your Judges and proxy. You are not limited to that selection, though: you can integrate cloud providers' models or connect locally hosted models via the UI, [through the SDK](https://sdk.rootsignals.ai/en/latest/examples.html#add-a-model), or via the REST API.

{% hint style="info" %}
[Full model list](https://api.scorable.ai/public-models/).
{% endhint %}

### Hugging Face example

To use a [Hugging Face inference endpoint](https://huggingface.co/docs/inference-endpoints/index), add the model endpoint via the SDK or through the REST API:

```bash
curl --request POST \
     --url https://api.scorable.ai/v1/models/ \
     --header "Authorization: Api-Key $SCORABLE_API_KEY" \
     --header 'accept: application/json' \
     --header 'content-type: application/json' \
     --data '{
               "name": "huggingface/meta-llama/Meta-Llama-3-8B",
               "url": "https://my-endpoint.huggingface.cloud",
               "default_key": "'"$HF_KEY"'"
             }'
```

Note the quoting: `$SCORABLE_API_KEY` and `$HF_KEY` must sit outside single quotes, otherwise the shell sends the literal variable names instead of their values.
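The same registration can be scripted from Python. This is a minimal sketch using the third-party `requests` library against the endpoint and payload shown in the curl example above; the helper names (`build_model_payload`, `register_model`) are illustrative, not part of the SDK:

```python
import os

import requests

# Same endpoint as the curl example above.
API_URL = "https://api.scorable.ai/v1/models/"


def build_model_payload(name: str, url: str, default_key: str) -> dict:
    """JSON body for the model-creation request."""
    return {"name": name, "url": url, "default_key": default_key}


def register_model(payload: dict, api_key: str) -> requests.Response:
    """POST the model definition to the Scorable REST API."""
    return requests.post(
        API_URL,
        json=payload,
        headers={"Authorization": f"Api-Key {api_key}"},
    )


payload = build_model_payload(
    name="huggingface/meta-llama/Meta-Llama-3-8B",
    url="https://my-endpoint.huggingface.cloud",
    default_key=os.environ.get("HF_KEY", ""),
)
# response = register_model(payload, os.environ["SCORABLE_API_KEY"])
# response.raise_for_status()
```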

After adding the model, you can use it like any other model in your evaluators.

```python
evaluator = client.evaluators.create(
    name="My model test",
    prompt="Hello, my model!",
    model="huggingface/meta-llama/Meta-Llama-3-8B",
)
```
