---
language:
- en
library_name: Pytorch
library_version: 2.0.1+cu118
metrics:
- accuracy
pipeline_tag: text-classification
tags:
- spam detection
- email detection
- text classification
inference: true
model-index:
- name: foduucom/Mail-spam-detection
  results:
  - task:
      type: text-classification
    metrics:
    - type: precision
      value: 0.866
---


# Model Card for Text Classification for Email Spam Detection
This model is a PyTorch text classifier that uses the torchtext library to tokenize and vectorize the input data.
It is intended for corporate and industrial mail screening, and it classifies emails into three labels: Job, Enquiry, and Spam.
It achieves the following results on the evaluation set:
- accuracy: 0.866

## Model architecture for text classification

<p align="center">
<!-- Smaller size image -->
<img src="https://huggingface.co/foduucom/Mail-spam-detection/resolve/main/text%20classification.jpeg" alt="Image" style="width:600px; height:400px;">
</p>
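The pictured architecture can be sketched as a standard EmbeddingBag text classifier. This is a hypothetical reconstruction for illustration only; the vocabulary and embedding sizes below are placeholders, not the released weights.

```python
import torch
import torch.nn as nn

class TextClassifier(nn.Module):
    """Illustrative EmbeddingBag classifier (sizes are placeholders)."""
    def __init__(self, vocab_size: int, embed_dim: int, num_classes: int):
        super().__init__()
        # EmbeddingBag averages the token embeddings of each text into one vector
        self.embedding = nn.EmbeddingBag(vocab_size, embed_dim)
        self.fc = nn.Linear(embed_dim, num_classes)

    def forward(self, token_ids: torch.Tensor, offsets: torch.Tensor) -> torch.Tensor:
        return self.fc(self.embedding(token_ids, offsets))

model = TextClassifier(vocab_size=10_000, embed_dim=64, num_classes=3)

# Two texts packed into one flat tensor; offsets mark where each text begins
tokens = torch.tensor([1, 5, 9, 2, 4], dtype=torch.long)
offsets = torch.tensor([0, 3], dtype=torch.long)
logits = model(tokens, offsets)
print(logits.shape)  # torch.Size([2, 3]) -- one row of 3 class logits per text
```

The `offsets` tensor lets variable-length texts share one flat token tensor, which avoids padding.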

### Labels for text classification
- Enquiry
- Job
- Spam
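A minimal sketch of turning the model's class logits into one of these label names. The label order below is an assumption for illustration; the actual index-to-label mapping ships with the model.

```python
import torch

# Assumed label order (hypothetical); check the model's own mapping
LABELS = ["Enquiry", "Job", "Spam"]

def predict_label(logits: torch.Tensor) -> str:
    """Map a 1-D logit vector over the three classes to its label name."""
    return LABELS[int(torch.argmax(logits))]

print(predict_label(torch.tensor([0.1, 0.2, 2.5])))  # Spam
```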

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.01
- train_batch_size: 64
- optimizer: Adam
- lr_scheduler_type: StepLR(optimizer, step_size=10, gamma=0.1)
- num_epochs: 10
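Under these settings, the optimizer and scheduler wiring might look like the sketch below (a dummy parameter stands in for the model's parameters):

```python
import torch

# Dummy parameter so the optimizer has something to update
param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.Adam([param], lr=0.01)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.1)

for epoch in range(10):
    optimizer.step()   # the training loop over batches would go here
    scheduler.step()   # decay the learning rate once per epoch

# After step_size=10 epochs, lr has been multiplied by gamma=0.1 once
print(scheduler.get_last_lr())
```

With `step_size=10` and `num_epochs=10`, the learning rate decays exactly once, from 0.01 to 0.001, at the end of training.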



### Framework versions
- PyTorch 2.0.1+cu118
- torchtext 0.15.2+cpu

### Citation


```bibtex
@misc{foduucom2023mailspamdetection,
  author = {Nehul Agrawal and Rahul Parihar},
  title  = {Text Classification for Email Spam Detection},
  year   = {2023}
}
```