Latest commit: Upload 2 files (40f3e10)

| File | Size | Last commit |
|------|------|-------------|
| - | 1.52 kB | initial commit |
| - | 1.71 kB | Upload 2 files |
| - | 1.38 kB | Upload 2 files |
| - | 234 Bytes | initial commit |
| - | 6.12 kB | Update app.py |
| bp_model.pkl | 4.3 kB | Upload 12 files |
| bp_tokeniser.pkl | 4.99 MB | Upload 12 files |
| - | 392 MB | Upload 12 files |
| - | 457 kB | Upload 12 files |
| dnn_tokeniser.pkl | 4.53 MB | Upload 12 files |
| - | 41.2 MB | Upload 12 files |
| lstm_tokeniser.pkl | 4.53 MB | Upload 12 files |
| ppn_model.pkl | 2.27 kB | Upload 12 files |
| ppn_tokeniser.pkl | 4.85 MB | Upload 12 files |
| - | 68 Bytes | Create requirements.txt |
| - | 2.24 MB | Upload 12 files |
| rnn_tokeniser.pkl | 287 kB | Upload 12 files |

Detected Pickle imports per file:

- bp_model.pkl (5): `BackPropogation.BackPropogation`, `numpy.core.multiarray._reconstruct`, `numpy.dtype`, `numpy.ndarray`, `numpy.core.multiarray.scalar`
- bp_tokeniser.pkl (4): `builtins.int`, `collections.defaultdict`, `keras.src.preprocessing.text.Tokenizer`, `collections.OrderedDict`
- dnn_tokeniser.pkl (4): `builtins.int`, `keras.src.preprocessing.text.Tokenizer`, `collections.defaultdict`, `collections.OrderedDict`
- lstm_tokeniser.pkl (4): `builtins.int`, `keras.src.preprocessing.text.Tokenizer`, `collections.defaultdict`, `collections.OrderedDict`
- ppn_model.pkl (4): `numpy.core.multiarray._reconstruct`, `numpy.dtype`, `numpy.ndarray`, `Perceptron.Perceptron`
- ppn_tokeniser.pkl (4): `builtins.int`, `collections.defaultdict`, `keras.src.preprocessing.text.Tokenizer`, `collections.OrderedDict`
- rnn_tokeniser.pkl (4): `builtins.int`, `keras.src.preprocessing.text.Tokenizer`, `collections.defaultdict`, `collections.OrderedDict`
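The "Detected Pickle imports" warnings above come from inspecting the pickle opcode stream, not from loading the files. As a minimal sketch (not part of this repo's code), a similar report can be reproduced locally with Python's standard `pickletools`; the file names below are taken from the listing, and it is assumed that unpickling `bp_model.pkl` / `ppn_model.pkl` requires the repo's `BackPropogation.py` and `Perceptron.py` to be importable.

```python
# Minimal sketch: list the module.name globals a pickle references without
# executing it, roughly what the "Detected Pickle imports" report shows.
# Only pickle.load() files you trust -- unpickling can run arbitrary code.
import pickle
import pickletools


def list_pickle_imports(path: str) -> set[str]:
    """Collect "module.name" globals from a pickle's opcode stream."""
    with open(path, "rb") as f:
        data = f.read()
    imports: set[str] = set()
    recent_strings: list[str] = []  # string constants seen so far (used by STACK_GLOBAL)
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":          # protocol <= 3: arg is "module name"
            module, _, name = arg.partition(" ")
            imports.add(f"{module}.{name}")
        elif opcode.name == "STACK_GLOBAL":  # protocol >= 4: module/name were pushed earlier
            if len(recent_strings) >= 2:
                imports.add(".".join(recent_strings[-2:]))
        elif isinstance(arg, str):
            recent_strings.append(arg)
    return imports


if __name__ == "__main__":
    for pkl in ("bp_model.pkl", "ppn_model.pkl", "bp_tokeniser.pkl"):
        print(pkl, sorted(list_pickle_imports(pkl)))

    # Actually loading the custom models only works where the pickled classes
    # are importable, i.e. with BackPropogation.py / Perceptron.py on the path.
    with open("ppn_model.pkl", "rb") as f:
        ppn_model = pickle.load(f)
```

This is only an approximation of the Hub's scanner (memoized string references are not resolved), but it is enough to see why the tokenizer pickles need `keras` installed and why `bp_model.pkl` and `ppn_model.pkl` can only be unpickled in an environment where `BackPropogation` and `Perceptron` can be imported.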