---
library_name: transformers
tags:
- time series
- multimodal
- TimeSeries-Text-to-Text
license: apache-2.0
---

# Mists-7B-v01-not-trained

Mists (**Mis**tral **T**ime **S**eries) is a multimodal model that combines a language model with a time series model.
This model is based on the following models:

- [mistralai/Mistral-7B-Instruct-v0.3](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.3)
- [HachiML/MOMENT-1-large-embedding-v0.1](https://huggingface.co/HachiML/MOMENT-1-large-embedding-v0.1) (an embedding model derived from [AutonLab/MOMENT-1-large](https://huggingface.co/AutonLab/MOMENT-1-large))

This is an experimental model.
Since the adapter has not been trained, the model is not yet suitable for use.
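
Because only the adapter between the time series encoder and the language model is untrained, a natural first experiment is to freeze both pretrained backbones and update only the adapter. The snippet below is a minimal sketch of that idea (run it after loading the model as shown in the next section); it assumes a Llava-style layout in which the adapter's parameter names contain `projector`, which you should verify against `model.named_parameters()`.

```Python
# Minimal sketch: train only the untrained adapter, keep both backbones frozen.
# Assumption: the adapter's parameter names contain "projector" (verify with
# [n for n, _ in model.named_parameters()] before relying on this).
for name, param in model.named_parameters():
    param.requires_grad = "projector" in name

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"Trainable parameters: {trainable:,} / {total:,}")
```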
## How to use

```Python
!pip install accelerate
```

```Python
from transformers import AutoProcessor, AutoModel
import torch

model_id = "HachiML/Mists-7B-v01-not-trained"

processor = AutoProcessor.from_pretrained(model_id, trust_remote_code=True)
model = AutoModel.from_pretrained(
    model_id,
    torch_dtype=torch.float32,
    low_cpu_mem_usage=True,
    device_map="auto",
    trust_remote_code=True,
)
```
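
Before running inference, it can be worth confirming what was actually loaded, since the model class and configuration come from remote code. The checks below use only standard `transformers` attributes; `hf_device_map` is set only when `device_map` is used, so it is read defensively.

```Python
# Sanity checks on the loaded checkpoint (standard attributes only).
print(type(model).__name__)                    # model class provided by the remote code
print(model.config)                            # combined text / time series configuration
print(getattr(model, "hf_device_map", None))   # device placement set by accelerate, if any
```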
```Python
import pandas as pd
import torch

# Load a price history CSV and keep the first 512 rows of the OHLCV columns.
hist_ndaq = pd.read_csv("nasdaq_price_history.csv")
time_series_data = hist_ndaq[["Open", "High", "Low", "Close", "Volume"]].iloc[:512]

prompt = "USER: <time_series>\nWhat are the features of this data?\nASSISTANT:"
inputs = processor(prompt, time_series_data, return_tensors='pt')

# Move the processed inputs to the same device as the model.
device = "cuda" if torch.cuda.is_available() else "cpu"
for key, item in inputs.items():
    inputs[key] = item.to(device)

output = model.generate(**inputs, max_new_tokens=200, do_sample=False)
print(processor.decode(output[0], skip_special_tokens=False))
```
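
The CSV path in the example above is a placeholder. If you do not have a price history file at hand, a synthetic OHLCV DataFrame with the same column names can stand in for it so the processor and `generate` call can be exercised end to end; the values below are random, not real market data.

```Python
import numpy as np
import pandas as pd

# Synthetic stand-in for "nasdaq_price_history.csv": 512 rows of random OHLCV values.
rng = np.random.default_rng(seed=0)
close = 100 + rng.normal(0, 1, 512).cumsum()
time_series_data = pd.DataFrame({
    "Open": close + rng.normal(0, 0.5, 512),
    "High": close + rng.uniform(0, 1, 512),
    "Low": close - rng.uniform(0, 1, 512),
    "Close": close,
    "Volume": rng.integers(1_000_000, 5_000_000, 512).astype(float),
})
```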
|