Tags: Text Generation · Transformers · PyTorch · olmo · Inference Endpoints
upiter committed · Commit 37c27fa · 1 Parent(s): a41d48a

Update README.md

Files changed (1):
  1. README.md +2 -1
README.md CHANGED
@@ -5,6 +5,7 @@ datasets:
   - HuggingFaceFW/fineweb
 base_model:
   - upiter/TinyCodeLM-150M
+library_name: transformers
 ---
 
 
@@ -63,4 +64,4 @@ TinyCodeLM models were pretrained from scratch on a single H100 node (four GPUs)
 ```
 
 # Safety
-This work explores data-driven mechanisms for improving the quality of language model-generated code. Our synthetic data generation method relies on open-source data and our experiments leverage open-source software and resources. It is important to acknowledge that all language models for code synthesis have the potential to be misused – whether intentionally or unintentionally – for generation of code with vulnerabilities and/or malicious behaviors. Any and all model generated code has the potential to be harmful and must not be executed without precautions.
+This work explores data-driven mechanisms for improving the quality of language model-generated code. Our synthetic data generation method relies on open-source data and our experiments leverage open-source software and resources. It is important to acknowledge that all language models for code synthesis have the potential to be misused – whether intentionally or unintentionally – for generation of code with vulnerabilities and/or malicious behaviors. Any and all model generated code has the potential to be harmful and must not be executed without precautions.
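For reference, the model card's YAML frontmatter after this commit would look roughly like the sketch below — only the fields visible in the diff hunks are shown, so the full frontmatter may contain additional entries. The added `library_name: transformers` field is standard Hugging Face Hub model-card metadata that tells the Hub which library the model targets, so the model page can surface Transformers-based loading snippets.

```yaml
# Reconstructed from the diff hunks above; fields outside the
# visible hunks are omitted, so this may be incomplete.
datasets:
  - HuggingFaceFW/fineweb
base_model:
  - upiter/TinyCodeLM-150M
library_name: transformers
```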