
A DistilBERT model for masked language modelling, trained on a dataset of OpenSSH heap data structures for the purpose of generating representations. This model was created for the thesis "Generating Robust Representations of Structures in OpenSSH Heap Dumps" by Johannes Garstenauer.

Model Description

  • Developed by: Johannes Garstenauer
  • Funded by: Universität Passau

Training Data

  • Training data: https://huggingface.co/datasets/johannes-garstenauer/structs_token_size_4_reduced_labelled_train
  • Validation data: https://huggingface.co/datasets/johannes-garstenauer/structs_token_size_4_reduced_labelled_eval
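A minimal sketch of how the model could be loaded to extract representations with the `transformers` library. The repo id is a placeholder (the card does not state the model id), and the fixed-size hex-chunking helper is an assumption that only illustrates the "token_size_4" naming of the datasets; it is not the thesis's actual preprocessing.

```python
def chunk_hex(hex_str: str, token_bytes: int = 4) -> list[str]:
    """Split a hex dump of heap memory into fixed-size byte tokens.

    Hypothetical preprocessing: token_bytes=4 mirrors the
    "token_size_4" naming of the training dataset.
    """
    step = token_bytes * 2  # two hex characters per byte
    return [hex_str[i:i + step] for i in range(0, len(hex_str), step)]


RUN_MODEL = False  # set True to download the (assumed) checkpoint

if RUN_MODEL:
    import torch
    from transformers import AutoModel, AutoTokenizer

    repo = "<model-repo-id>"  # placeholder: replace with the actual model id
    tokenizer = AutoTokenizer.from_pretrained(repo)
    model = AutoModel.from_pretrained(repo)

    # One heap structure as whitespace-separated 4-byte tokens.
    tokens = " ".join(chunk_hex("00112233445566778899aabbccddeeff"))
    inputs = tokenizer(tokens, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state
    # Mean-pool the token embeddings into one structure representation.
    representation = hidden.mean(dim=1)
```

Mean-pooling the last hidden states is one common way to turn a masked-language model into a fixed-size representation generator; the `[CLS]` embedding is an alternative.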
