
How to export predicted result in aneurysm3D example #2

Closed
syusaku625 opened this issue Nov 25, 2023 · 3 comments

Comments

@syusaku625

No description provided.

@rezaakb (Owner) commented Nov 26, 2023

If you use our package directly, you can access predictions as described in the tutorial. However, you raise a good point, and we will add it to the package for use with the Hydra config file.

```python
preds_list = trainer.predict(model=model, datamodule=datamodule)
preds_dict = pinnstorch.utils.fix_predictions(preds_list)
```
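If you then want the predictions on disk, one option is to dump the dictionary with NumPy. This is a minimal sketch, not part of pinnstorch itself: `export_predictions` is a hypothetical helper, and it assumes `preds_dict` maps output names (the keys depend on your config's output variables) to array-like values.

```python
import numpy as np

def export_predictions(preds_dict, path="predictions.npz"):
    """Save each predicted field into one compressed .npz archive."""
    arrays = {name: np.asarray(pred) for name, pred in preds_dict.items()}
    np.savez_compressed(path, **arrays)
    return path

# Example with dummy arrays standing in for real predictions:
dummy = {"u": np.zeros((10, 1)), "p": np.ones((10, 1))}
out = export_predictions(dummy, "aneurysm3D_preds.npz")
loaded = np.load(out)
print(sorted(loaded.files))  # ['p', 'u']
```

An `.npz` archive keeps each field under its own key, so the arrays can be reloaded individually later with `np.load`.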

@syusaku625 (Author) commented Nov 30, 2023

I tried to export the predicted data following your advice, but I got the following error. Could you advise me on how to solve it?

```
Traceback (most recent call last):
  File "/home/shusaku/pinns-torch/examples/aneurysm3D/train.py", line 169, in main
    metric_dict, _ = pinnstorch.train(
  File "/home/shusaku/pinns-torch/pinnstorch/utils/utils.py", line 85, in wrap
    raise ex
  File "/home/shusaku/pinns-torch/pinnstorch/utils/utils.py", line 73, in wrap
    metric_dict, object_dict = task_func(
  File "/home/shusaku/pinns-torch/pinnstorch/train.py", line 174, in train
    preds_list = trainer.predict(
  File "/home/shusaku/pinns-torch/test/lib/python3.10/site-packages/lightning/pytorch/trainer/trainer.py", line 864, in predict
    return call._call_and_handle_interrupt(
  File "/home/shusaku/pinns-torch/test/lib/python3.10/site-packages/lightning/pytorch/trainer/call.py", line 44, in _call_and_handle_interrupt
    return trainer_fn(*args, **kwargs)
  File "/home/shusaku/pinns-torch/test/lib/python3.10/site-packages/lightning/pytorch/trainer/trainer.py", line 903, in _predict_impl
    results = self._run(model, ckpt_path=ckpt_path)
  File "/home/shusaku/pinns-torch/test/lib/python3.10/site-packages/lightning/pytorch/trainer/trainer.py", line 989, in _run
    results = self._run_stage()
  File "/home/shusaku/pinns-torch/test/lib/python3.10/site-packages/lightning/pytorch/trainer/trainer.py", line 1030, in _run_stage
    return self.predict_loop.run()
  File "/home/shusaku/pinns-torch/test/lib/python3.10/site-packages/lightning/pytorch/loops/utilities.py", line 182, in _decorator
    return loop_run(self, *args, **kwargs)
  File "/home/shusaku/pinns-torch/test/lib/python3.10/site-packages/lightning/pytorch/loops/prediction_loop.py", line 122, in run
    self._predict_step(batch, batch_idx, dataloader_idx, dataloader_iter)
  File "/home/shusaku/pinns-torch/test/lib/python3.10/site-packages/lightning/pytorch/loops/prediction_loop.py", line 228, in _predict_step
    batch = call._call_strategy_hook(trainer, "batch_to_device", batch, dataloader_idx=dataloader_idx)
  File "/home/shusaku/pinns-torch/test/lib/python3.10/site-packages/lightning/pytorch/trainer/call.py", line 309, in _call_strategy_hook
    output = fn(*args, **kwargs)
  File "/home/shusaku/pinns-torch/test/lib/python3.10/site-packages/lightning/pytorch/strategies/strategy.py", line 269, in batch_to_device
    return model._apply_batch_transfer_handler(batch, device=device, dataloader_idx=dataloader_idx)
  File "/home/shusaku/pinns-torch/test/lib/python3.10/site-packages/lightning/pytorch/core/module.py", line 333, in _apply_batch_transfer_handler
    batch = self._call_batch_hook("transfer_batch_to_device", batch, device, dataloader_idx)
  File "/home/shusaku/pinns-torch/test/lib/python3.10/site-packages/lightning/pytorch/core/module.py", line 322, in _call_batch_hook
    return trainer_method(trainer, hook_name, *args)
  File "/home/shusaku/pinns-torch/test/lib/python3.10/site-packages/lightning/pytorch/trainer/call.py", line 157, in _call_lightning_module_hook
    output = fn(*args, **kwargs)
  File "/home/shusaku/pinns-torch/pinnstorch/models/pinn_module.py", line 241, in transfer_batch_to_device
    self.copy_batch(batch)
  File "/home/shusaku/pinns-torch/pinnstorch/models/pinn_module.py", line 329, in copy_batch
    spatial, time, solution = self.static_batch[key]
TypeError: unhashable type: 'list'
```

@rezaakb (Owner) commented Dec 4, 2023

Sorry for the inconvenience. We have added a new feature to the package: you can now set `save_pred: true` in the config file, and the predictions will be saved in the output directory. Please let me know if you still have issues with saving predictions.
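For reference, a minimal sketch of what that config change might look like. The placement of the key is an assumption here; check the example's Hydra config for where `save_pred` actually belongs.

```yaml
# Hypothetical excerpt from the aneurysm3D Hydra config
save_pred: true   # save predictions to the run's output directory
```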
