Introduction of the HLOB model.
AntoBr96 committed May 31, 2024
1 parent 55ee1f2 commit 8106c95
Showing 7 changed files with 604 additions and 3 deletions.
README.md: 38 additions & 2 deletions
# LOBFrame

We release `LOBFrame` (see the two papers ["Deep Limit Order Book Forecasting"](https://arxiv.org/abs/2403.09267) and ["HLOB - Structure and Persistence of Information in Limit Order Books"](https://arxiv.org/abs/2405.18938)), a novel, open-source codebase that provides a renewed way to process large-scale Limit Order Book (LOB) data. The framework integrates the latest insights from scientific research (see [Lucchese et al.](https://www.sciencedirect.com/science/article/pii/S0169207024000062), [Prata et al.](https://arxiv.org/pdf/2308.01915.pdf)) into a cohesive system. Its strength lies in the comprehensive nature of the implemented pipeline, which covers the data transformation and processing stage, an ultra-fast implementation of the training, validation, and testing steps, and the evaluation of a model's outputs through trading simulations. Moreover, it accommodates the integration of new models, ensuring adaptability to future advancements in the field.

## Introduction

In this tutorial, we show how to replicate the experiments presented in the two papers titled __"Deep Limit Order Book Forecasting: A microstructural guide"__ and __"HLOB - Structure and Persistence of Information in Limit Order Books"__.

Before starting, please remember to **ALWAYS CITE OUR WORK** as follows:

```
@misc{briola2024deep,
title={Deep Limit Order Book Forecasting},
author={Antonio Briola and Silvia Bartolucci and Tomaso Aste},
year={2024},
eprint={2403.09267},
archivePrefix={arXiv},
primaryClass={q-fin.TR}
}
```

```
@misc{briola2024hlob,
title={HLOB -- Information Persistence and Structure in Limit Order Books},
author={Antonio Briola and Silvia Bartolucci and Tomaso Aste},
year={2024},
eprint={2405.18938},
archivePrefix={arXiv},
primaryClass={q-fin.TR}
}
```

## Pre-requisites

Install the required packages:
```bash
pip3 install -r requirements.txt
```

If you are using macOS, please proceed as follows:

```bash
pip3 install -r requirements_mac_os.txt
```
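
The repository does not require it, but if you prefer to keep the dependencies isolated, a standard virtual-environment setup looks like the sketch below (the environment name `lobframe-env` is only an illustrative choice):

```bash
# Optional: create and activate an isolated environment before installing the requirements.
python3 -m venv lobframe-env
source lobframe-env/bin/activate
pip3 install -r requirements.txt   # use requirements_mac_os.txt on macOS
```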

## Data
All the code in this repository uses [LOBSTER](https://lobsterdata.com) data. For an overview of its structure, please refer to the official documentation available at the following [link](https://lobsterdata.com/info/DataStructure.php).
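
As a quick orientation (the LOBSTER documentation linked above remains the authoritative reference), each stock and trading day comes as a pair of CSV files; their names and columns follow this schematic pattern:

```
<TICKER>_<YYYY-MM-DD>_<start>_<end>_message_<levels>.csv      events:    Time, Event Type, Order ID, Size, Price, Direction
<TICKER>_<YYYY-MM-DD>_<start>_<end>_orderbook_<levels>.csv    snapshots: Ask Price 1, Ask Size 1, Bid Price 1, Bid Size 1, ...
```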

To start an experiment from scratch, you need to follow these steps:
- Run the following command to prepare the datasets, including the ones used for backtesting (a multi-horizon variant of this step is sketched right after this list):
```bash
python3 main --training_stocks "CSCO" --target_stocks "CSCO" --stages "torch_dataset_preparation,torch_dataset_preparation_backtest" --prediction_horizon 10
```
- If you are planning to use the HLOB model (see the paper titled ["HLOB - Structure and Persistence of Information in Limit Order Books"](https://arxiv.org/abs/2405.18938)), you must first run the following command:
```bash
python3 main --training_stocks "CSCO" --target_stocks "CSCO" --stages "complete_homological_structures_preparation"
```
- Run the following command to train the model:
```bash
python3 main --training_stocks "CSCO" --target_stocks "CSCO" --stages "training"
```
Please note that the currently available models are:
- deeplob
- transformer
- itransformer
- lobtransformer
- dla
- cnn1
- cnn2
- binbtabl
- binctabl
- axiallob
- hlob
- Run the following command to evaluate the model:
```bash
python3 main --training_stocks "CSCO" --target_stocks "CSCO" --experiment_id "<experiment_id_generated_in_the_training_stage>" --stages "evaluation"
```
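
If you need the datasets for more than one prediction horizon, the preparation stage from the first step can simply be repeated. Below is a minimal sketch; the horizon values 10, 50, and 100 are illustrative choices, not prescribed by the repository:

```bash
# Prepare the datasets (including the backtest ones) for several prediction horizons.
for horizon in 10 50 100; do
    python3 main --training_stocks "CSCO" --target_stocks "CSCO" \
        --stages "torch_dataset_preparation,torch_dataset_preparation_backtest" \
        --prediction_horizon "$horizon"
done
```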

We now provide the typical structure of a folder before an experiment's run:

```
...
├── data_processing
│   ├── data_process.py
│   ├── data_process_utils.py
│   └── complete_homological_utils.py
├── loaders
│   └── custom_dataset.py
├── loggers
...
│   └── tabl_layer.py
│   ├── Transformer
│   └── transformer.py
│   ├── CompleteHCNN
│   └── complete_hcnn.py
├── optimizers
│   ├── executor.py
│   └── lightning_batch_gd.py
...
```
