features_at, coords_at function error #100
Thanks for reporting the issue. For now, to get the unique batch indices, use the workaround below.
Hi, thank you for the answer. In the case of continuous batch indices, I think the function features_at works fine, giving out the features with batch index 2. Is there a built-in function to get the same result when some batch indices are skipped? I just want to know if there's a better way to do this. Thanks.
So currently, using the unique batch indices would be the best workaround for now:

```python
import torch

def feats_at_idx(x, idx):
    # Collect the batch indices actually present in the sparse tensor.
    bs = list(x.coords_man.get_batch_indices())
    if idx not in bs:
        # The requested batch index is absent; return an empty feature tensor.
        return torch.FloatTensor([[]])
    # Map the raw batch index to its position among the present indices,
    # which is what features_at expects.
    unique_idx = bs.index(idx)
    return x.features_at(unique_idx)
```
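The remapping the workaround relies on can be illustrated without MinkowskiEngine. In this hypothetical sketch, `batch_indices` stands in for the indices returned by `coords_man.get_batch_indices()`: when the batch indices have gaps, the argument to `features_at` is the *position* of the index in that list, not the index itself.

```python
def unique_position(idx, batch_indices):
    """Map a raw batch index to its position among the present indices,
    returning None when the index is absent (mirroring the empty-tensor
    branch of the workaround above)."""
    if idx not in batch_indices:
        return None
    return batch_indices.index(idx)

# Hypothetical sparse tensor whose batch indices skip 1, 3, and 4.
batch_indices = [0, 2, 5]

print(unique_position(5, batch_indices))  # position 2, not 5
print(unique_position(3, batch_indices))  # None: batch 3 is absent
```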
I'm sorry, but according to your function, the code should work in that case. However, I get the following error. Did I misunderstand your function?
Right, I just pushed the latest commit, which should now work as expected with the batch index. Please let me know if there's a problem.
Hi, I have some trouble using the features_at function. It does not seem to work when there is a skip in the batch indices. Thanks.