I found that you have code for an attention-based Seq2Seq model (Seq2Seq_atn.py) in PEPPER, but it seems that you did not use it. Is there a specific reason you are not using the attention-based one?
Those were exploratory scripts that we either didn't follow through on or that didn't work; I don't remember exactly which was the case. One thing I do remember is that the RNN solution had the highest sensitivity, and that is what we were aiming for in PEPPER.
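For context on what an attention-based Seq2Seq decoder adds over a plain RNN one: at each decoding step it scores every encoder hidden state against the current decoder state and mixes them into a context vector, instead of relying on a single fixed encoding. The sketch below is illustrative only (it is not taken from Seq2Seq_atn.py; all names and the dot-product scoring choice are assumptions) and shows the core attention step in numpy:

```python
import numpy as np

def attention(decoder_state, encoder_states):
    """One dot-product attention step (illustrative, not PEPPER's code).

    decoder_state:  (H,)   current decoder hidden state
    encoder_states: (T, H) hidden states for all T encoder time steps
    Returns the context vector (H,) and the attention weights (T,).
    """
    scores = encoder_states @ decoder_state      # (T,) similarity scores
    weights = np.exp(scores - scores.max())      # numerically stable softmax
    weights /= weights.sum()                     # weights sum to 1 over time
    context = weights @ encoder_states           # (H,) weighted sum of states
    return context, weights

# Toy example: 4 encoder time steps, hidden size 3.
enc = np.array([[1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0],
                [0.0, 0.0, 1.0],
                [1.0, 1.0, 0.0]])
dec = np.array([1.0, 0.0, 0.0])
context, weights = attention(dec, enc)
# Encoder states similar to the decoder state get larger weights,
# so the context vector leans toward rows 0 and 3 here.
```

In a full model the context vector would be concatenated with the decoder state (or input) before predicting the next token; the plain RNN pipeline that PEPPER ended up using skips this step entirely.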