
Fixed a small bug rarely causing type mismatch #24

Merged: 1 commit into astorfi:master from omaraltayyan:patch-1 on Jul 12, 2018

Conversation

@omaraltayyan (Contributor) commented Jul 5, 2018

When you use the derivative_extraction function, you are likely to concatenate its output with your features, and your features' dtype may have been chosen carefully with memory usage in mind.

For example, in my project I have a large dataset (which is normal for the use case of this library) whose values I store as float32, since that precision is sufficient and it halves the memory footprint compared to float64. But derivative_extraction computed its values in float64, so when I concatenated the derivatives with the original features everything was promoted to float64 and the system's memory usage doubled, which wasn't immediately apparent.

This is not critical and could be worked around by converting types outside the library call, but why bother with that inconvenience when there is no need for the derivatives array's dtype to differ from the incoming features' dtype?

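Below is a minimal sketch of the dtype-promotion behaviour described above, using plain NumPy. The `derivatives` array is only a stand-in for derivative_extraction's output, not the library's actual implementation; shapes and names are illustrative.

```python
import numpy as np

# Memory-conscious features stored as float32 (half the footprint of float64).
features = np.random.rand(100000, 39).astype(np.float32)

# If the derivative buffer is allocated with NumPy's default dtype (float64)...
derivatives = np.zeros(features.shape)  # dtype is float64 by default
combined = np.concatenate((features, derivatives), axis=1)
print(combined.dtype)  # float64 -- the float32 features are silently upcast,
                       # roughly doubling the memory used by the combined array

# ...whereas allocating it with the incoming features' dtype keeps the footprint flat.
derivatives = np.zeros(features.shape, dtype=features.dtype)
combined = np.concatenate((features, derivatives), axis=1)
print(combined.dtype)  # float32
```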

codecov bot commented Jul 5, 2018

Codecov Report

Merging #24 into master will not change coverage.
The diff coverage is n/a.

Impacted file tree graph

@@          Coverage Diff          @@
##           master    #24   +/-   ##
=====================================
  Coverage     100%   100%           
=====================================
  Files           1      1           
  Lines          25     25           
=====================================
  Hits           25     25

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update f5689ef...2593279. Read the comment docs.


coveralls commented Jul 5, 2018

Coverage Status

Coverage remained the same at 100.0% when pulling 2593279 on omaraltayyan:patch-1 into f5689ef on astorfi:master.

@astorfi astorfi assigned astorfi and unassigned astorfi Jul 5, 2018
@astorfi (Owner) commented Jul 5, 2018

Thank you for your pull request. I will investigate it ASAP.

@astorfi (Owner) commented Jul 12, 2018

@omaraltayyan Thank you. I investigated it and will be happy to merge it.

@astorfi astorfi closed this Jul 12, 2018
@astorfi astorfi reopened this Jul 12, 2018
@astorfi astorfi merged commit ae9ceec into astorfi:master Jul 12, 2018