[Return-types #4] Parameter shift grad transform #2886

Merged
160 commits merged on Sep 16, 2022
Changes from 1 commit (of 160)
a38477a
draft
antalszava Jul 13, 2022
f3c7f9c
more
antalszava Jul 14, 2022
f9c8767
Merge branch 'master' into refactor_return_shot_vector
antalszava Jul 14, 2022
61b932e
First working version
rmoyard Jul 14, 2022
d6125df
fix a bug; shot vector Counts returns tuple; Autograd interface update
antalszava Jul 14, 2022
6e3b347
Merge branch 'master' into refactor_return_shot_vector
antalszava Jul 14, 2022
7d32c4e
test file
antalszava Jul 14, 2022
8cee2e1
single_measurement
antalszava Jul 14, 2022
e1b7fb1
more tests
antalszava Jul 14, 2022
966cf99
test
rmoyard Jul 15, 2022
39865d4
first multi-meas attempt
antalszava Jul 15, 2022
e5d49b6
multi_measures
antalszava Jul 16, 2022
903e8b3
Update
rmoyard Jul 18, 2022
06579b2
Update
rmoyard Jul 18, 2022
61040ba
Merge branch 'return_types' into refactor_return_shot_vector
rmoyard Jul 18, 2022
8f5053a
merge
rmoyard Jul 18, 2022
005a89f
Update
rmoyard Jul 18, 2022
bf55fe4
Merge branch 'master' into return_types
rmoyard Jul 18, 2022
941da8b
Merge branch 'return_types' into refactor_return_shot_vector
rmoyard Jul 18, 2022
2a615b5
update
rmoyard Jul 19, 2022
9eb4d32
Merge branch 'refactor_return_shot_vector' of https://github.com/Penn…
rmoyard Jul 19, 2022
b86f752
Update
rmoyard Jul 19, 2022
08d9c65
Merge branch 'return_types' of https://github.com/PennyLaneAI/pennyla…
rmoyard Jul 19, 2022
c934fa9
Remove autograd_changes
rmoyard Jul 19, 2022
7f3608c
Comment
rmoyard Jul 20, 2022
465e3d4
Update
rmoyard Jul 20, 2022
57a22d3
Coverage
rmoyard Jul 20, 2022
d31b786
Typo
rmoyard Jul 20, 2022
b5f510c
Add tests
rmoyard Jul 20, 2022
9a1e963
Merge branch 'return_types' into refactor_return_shot_vector
rmoyard Jul 20, 2022
8d8d456
Move tests
rmoyard Jul 20, 2022
8521d50
Revert autograd
rmoyard Jul 20, 2022
48d481a
Unnecessary change
rmoyard Jul 20, 2022
b28fdfd
Solve issue
rmoyard Jul 20, 2022
47fcae9
Add statistics new
rmoyard Jul 20, 2022
392dcd7
Merge branch 'master' into return_types
rmoyard Jul 21, 2022
5a15ee1
Merge branch 'master' into return_types
rmoyard Jul 21, 2022
2f0797f
Update tests/test_new_return_types.py
rmoyard Jul 22, 2022
bec1648
Update pennylane/_qubit_device.py
rmoyard Jul 22, 2022
b3cb3a2
:memo: Update from review
rmoyard Jul 22, 2022
8a08be1
:white_check_mark: Update test
rmoyard Jul 22, 2022
2ec1f53
Merge branch 'return_types' of https://github.com/PennyLaneAI/pennyla…
rmoyard Jul 22, 2022
eb471cc
:sparkles: QNode integration
rmoyard Jul 25, 2022
8e0d25f
:recycle: Remove global
rmoyard Jul 25, 2022
9b49eed
Merge branch 'return_types' into refactor_return_shot_vector
antalszava Jul 25, 2022
c6fcf73
[skip-ci]
antalszava Jul 25, 2022
fbda016
:white_check_mark: Add tests for QNode integration
rmoyard Jul 25, 2022
7d344c0
more test cases for probs and sample multi (proj; tensor product)
antalszava Jul 25, 2022
bca05b7
more test cases for probs and sample multi (proj; tensor product)
antalszava Jul 25, 2022
5c04ab0
multi-measure: Counts tests
antalszava Jul 25, 2022
0a4f288
test docstrings
antalszava Jul 25, 2022
6f20785
:sparkles: Support Autograd Jacobian
rmoyard Jul 26, 2022
3623a98
Merge branch 'refactor_return_shot_vector' into return_types_qnode
rmoyard Jul 26, 2022
2d59341
adjust test
antalszava Jul 26, 2022
c5d842d
resolve
antalszava Jul 26, 2022
05982e3
Merge branch 'master' into return_types
antalszava Jul 26, 2022
07e94d7
Merge branch 'return_types' into refactor_return_shot_vector
antalszava Jul 26, 2022
8d43f30
no more is_sampled; probs sample with obs test
antalszava Jul 26, 2022
2dd9717
probs and sample test
antalszava Jul 26, 2022
bc9009b
:sparkles: All interface backprop Jacobian
rmoyard Jul 26, 2022
4b8b579
:white_check_mark: Update tests
rmoyard Jul 26, 2022
74a356d
more tests
antalszava Jul 26, 2022
1de79a2
probs tests
antalszava Jul 26, 2022
e7f9a2c
more more tests
antalszava Jul 27, 2022
fb394bf
refactor
antalszava Jul 27, 2022
bfe3a65
refactor tests
antalszava Jul 27, 2022
0a30c53
update cov; add vanilla counts test (xfail) for finite shots
antalszava Jul 27, 2022
3993fbd
restore statistics docstring
antalszava Jul 27, 2022
5c249de
Merge branch 'refactor_return_shot_vector' into return_types_qnode
rmoyard Jul 27, 2022
3759fd6
Update tests/test_new_return_types.py
rmoyard Jul 27, 2022
8ece6e0
process counts for finite shots
antalszava Jul 27, 2022
82e42ac
create shot_vec_statistics aux method
antalszava Jul 27, 2022
7219640
Apply suggestions from code review
antalszava Jul 27, 2022
0d86242
fix
antalszava Jul 27, 2022
4a5224e
suggestion
antalszava Jul 27, 2022
77286a3
refactors
antalszava Jul 27, 2022
ae35bc5
revert to have squeeze in expval
antalszava Jul 27, 2022
85a8a80
docstring and more tests
antalszava Jul 27, 2022
5f50c11
Update pennylane/interfaces/execution.py
rmoyard Jul 27, 2022
66b7dfc
:white_check_mark: Add default mixed to tests
rmoyard Jul 27, 2022
73e5e3c
Merge branch 'master' into return_types
rmoyard Jul 27, 2022
3a13558
Merge branch 'refactor_return_shot_vector' into return_types_qnode
rmoyard Jul 27, 2022
fbd3120
:white_check_mark: Update test due to merge issues
rmoyard Jul 27, 2022
3dfd26e
:bug: Bug introduced due to checking backprop
rmoyard Jul 27, 2022
a656083
:bug: Bug introduced due to checking backprop
rmoyard Jul 27, 2022
3541fb1
Merge branch 'master' into return_types
rmoyard Jul 28, 2022
72f7757
Merge branch 'master' into return_types
rmoyard Jul 28, 2022
84ab21f
:white_check_mark: Separate the tape and qnode tests
rmoyard Jul 28, 2022
badddc1
:recycle: Refactor after review
rmoyard Jul 29, 2022
2544b7a
Merge branch 'master' into return_types
antalszava Jul 29, 2022
7a32588
[Return-types #2] Refactor return types (shot vector cases) (#2815)
antalszava Jul 29, 2022
9dccc8c
resolve
antalszava Jul 29, 2022
b407ab7
draft
antalszava Aug 2, 2022
c4f7f6c
tidy
antalszava Aug 2, 2022
c10f136
draft
antalszava Aug 2, 2022
c21b7f3
new gradients test file
antalszava Aug 2, 2022
aa82c16
new test
antalszava Aug 2, 2022
df12082
test
antalszava Aug 2, 2022
2678126
more testing using the original file
antalszava Aug 3, 2022
61fc0f1
no enable_return
antalszava Aug 3, 2022
e415c97
add active_return conditions
antalszava Aug 3, 2022
877374f
[skip-ci]
antalszava Aug 3, 2022
49e0870
draft
antalszava Aug 3, 2022
e05e3ec
update file name
antalszava Aug 3, 2022
f0c1a6b
more tests
antalszava Aug 3, 2022
c0d65a7
get involutory case
antalszava Aug 15, 2022
0253908
more tests and uncomment rest of the execute_new logic (otherwise get…
antalszava Aug 16, 2022
8a34880
no finite diff checks for now
antalszava Aug 16, 2022
da6a524
resolve conflicts
antalszava Aug 19, 2022
9bedc59
counts test
antalszava Aug 20, 2022
9ba18ea
state warnings (merge master remaining conflict resolution step)
antalszava Aug 20, 2022
3e90012
current param shift tests pass
antalszava Aug 20, 2022
9d89935
Hamiltonian tests
antalszava Aug 20, 2022
5950f82
more tests
antalszava Aug 21, 2022
f4eb087
Merge branch 'master' into grad_transforms_new_return
antalszava Aug 21, 2022
425dd28
Remove previous return_types files
antalszava Aug 21, 2022
8f5f69a
Merge branch 'master' into grad_transforms_new_return
AlbertMitjans Aug 31, 2022
2522da1
Merge branch 'master' into grad_transforms_new_return
AlbertMitjans Aug 31, 2022
18be4a1
Merge branch 'master' into grad_transforms_new_return
AlbertMitjans Sep 1, 2022
4e1daab
Merge branch 'master' into grad_transforms_new_return
antalszava Sep 6, 2022
31b902e
Update pennylane/gradients/parameter_shift.py
antalszava Sep 6, 2022
a00e7f2
Merge branch 'grad_transforms_new_return' of github.com:PennyLaneAI/p…
antalszava Sep 6, 2022
30852db
[skip ci]
antalszava Sep 6, 2022
77c6c1a
linting
antalszava Sep 6, 2022
205d8b6
addressing comments
antalszava Sep 7, 2022
b27e048
insert adjoint fix logic
antalszava Sep 7, 2022
5296800
explicit case for JAX
antalszava Sep 7, 2022
39f7156
refactor; [skip ci]
antalszava Sep 7, 2022
5fc22d7
Merge branch 'master' into grad_transforms_new_return
AlbertMitjans Sep 8, 2022
d87436d
test no warning for multi-measure probs and expval
antalszava Sep 8, 2022
81882bb
updates due to change in axes
antalszava Sep 8, 2022
ca90d43
add a comment; [skip-ci]
antalszava Sep 8, 2022
15874b8
Merge branch 'grad_transforms_new_return' of github.com:PennyLaneAI/p…
antalszava Sep 8, 2022
323db3c
resolve
antalszava Sep 8, 2022
c7899f7
condition simplified
antalszava Sep 8, 2022
3184e7c
refactor as suggested
antalszava Sep 8, 2022
776953c
not covered logic removal
antalszava Sep 8, 2022
fa19266
linting, coverage
antalszava Sep 8, 2022
c6d4086
no else required
antalszava Sep 8, 2022
af3e41c
refactor as suggested
antalszava Sep 8, 2022
dd8d66d
some lines not yet covered
antalszava Sep 8, 2022
d5fe208
Merge branch 'master' into grad_transforms_new_return
AlbertMitjans Sep 9, 2022
94f051a
Merge branch 'master' into grad_transforms_new_return
AlbertMitjans Sep 9, 2022
513ef4b
comment test requiring Autograd interface changes; reset interfaces/a…
antalszava Sep 9, 2022
d3fdb8d
Merge branch 'grad_transforms_new_return' of github.com:PennyLaneAI/p…
antalszava Sep 9, 2022
b0a3c62
Merge branch 'master' into grad_transforms_new_return
antalszava Sep 9, 2022
d48cb9b
Merge branch 'master' into grad_transforms_new_return
antalszava Sep 12, 2022
0b8e02c
revert execution.py file changes
antalszava Sep 12, 2022
c1ddd87
update comments
antalszava Sep 12, 2022
dc01746
simplify condition
antalszava Sep 15, 2022
6f091a8
Update using the new convention
antalszava Sep 15, 2022
8b62153
Merge branch 'master' into grad_transforms_new_return
antalszava Sep 15, 2022
9fd25d4
simplify new logic
antalszava Sep 15, 2022
cfa49d3
handle convert_like in aux func manually because we have tuples; no a…
antalszava Sep 15, 2022
241ff0f
Apply review comment wrt. zero_rep
antalszava Sep 15, 2022
ef67d8e
TODO comments for finite diff usages in the tests
antalszava Sep 15, 2022
dfe8672
TODO comments for finite diff usages in the tests
antalszava Sep 15, 2022
4f3a5e6
Merge branch 'master' into grad_transforms_new_return
antalszava Sep 15, 2022
aca7ac2
suggestions applied
antalszava Sep 16, 2022
72f356c
Merge branch 'master' into grad_transforms_new_return
antalszava Sep 16, 2022
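The 160 commits above build toward one change: teaching qml.gradients.param_shift to emit Jacobians in the new return-type format, with one entry per measurement (and per shot-vector bin) instead of a single stacked array. A minimal sketch of the transform's public usage follows; the exact nesting of the output under the new system is an assumption based on the PR title and commit messages.

```python
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev)
def circuit(x):
    qml.RX(x[0], wires=0)
    qml.RY(x[1], wires=1)
    qml.CNOT(wires=[0, 1])
    return qml.expval(qml.PauliZ(0)), qml.expval(qml.PauliZ(0) @ qml.PauliX(1))

x = np.array([0.4, 0.7], requires_grad=True)

# param_shift builds the shifted tapes and post-processes their results; with
# the new return types, each measurement gets its own Jacobian entry rather
# than being stacked into one array over measurements.
jac = qml.gradients.param_shift(circuit)(x)
```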
Commit a38477a5add838ede87ae4f06474b25ae0f249a6 ("draft")
antalszava committed Jul 13, 2022
52 changes: 37 additions & 15 deletions pennylane/_qubit_device.py
@@ -263,8 +263,6 @@ def execute(self, circuit, **kwargs):
if self.shots is not None or circuit.is_sampled:
self._samples = self.generate_samples()

multiple_sampled_jobs = circuit.is_sampled and self._has_partitioned_shots()

# compute the required statistics
if not self.analytic and self._shot_vector is not None:

@@ -277,24 +275,38 @@ def execute(self, circuit, **kwargs):
circuit.observables, shot_range=[s1, s2], bin_size=shot_tuple.shots
)

counts_shot_vector = isinstance(r[0], list) and isinstance(r[0][0], dict)

if qml.math._multi_dispatch(r) == "jax": # pylint: disable=protected-access
r = r[0]
elif not isinstance(r[0], dict):
# Measurement types except for Counts
r = qml.math.squeeze(r)
if isinstance(r, (np.ndarray, list)) and r.shape and isinstance(r[0], dict):
# TODO: what if multi-measure and dict is not the first?

if counts_shot_vector:
r = r
else:
# Measurement types except for Counts
r = qml.math.squeeze(r)

r_is_seq = (isinstance(r, np.ndarray) and r.shape or isinstance(r, list))
has_dict = r_is_seq and (isinstance(r[0], dict) or (isinstance(r[0], list) and isinstance(r[0][0], dict)))

#if ((isinstance(r, np.ndarray) and r.shape) or isinstance(r, list)) and isinstance(r[0], dict):
if isinstance(r, (np.ndarray)) and r.shape and isinstance(r[0], dict):
# This happens when measurement type is Counts
results.append(r)

elif counts_shot_vector:
# TODO: if shot vector: good that we extend?
results.extend(r)

elif shot_tuple.copies > 1:
results.extend(r.T)
else:
results.append(r.T)

s1 = s2

if not multiple_sampled_jobs:
# Can only stack single element outputs
results = qml.math.stack(results)
print('final results after appending', results, type(results))

else:
results = self.statistics(circuit.observables)
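For reference, a Counts measurement returns a dictionary mapping outcomes to occurrences, and a shot tuple with several copies yields a list of such dictionaries; that nested shape is what the counts_shot_vector check above detects, and it is why these results cannot be squeezed into an array. A small illustration with made-up values:

```python
# Illustrative Counts results (the numbers are made up):
single = {"00": 6, "11": 4}                         # a single shot bin
binned = [{"00": 6, "11": 4}, {"00": 5, "11": 5}]   # shot tuple with copies > 1

# In the hunk above, r holds the per-observable statistics of one shot tuple,
# so a list of dicts in r[0] flags the binned Counts case:
r = [binned]
counts_shot_vector = isinstance(r[0], list) and isinstance(r[0][0], dict)  # True
```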
@@ -306,10 +318,19 @@ def execute(self, circuit, **kwargs):
if len(circuit.measurements) == 1:
if circuit.measurements[0].return_type is qml.measurements.State:
# State: assumed to only be allowed if it's the only measurement
results = self._asarray(results, dtype=self.C_DTYPE)

if self._has_partitioned_shots():
#TODO: revisit finite shots and State: do we disallow it?
results = tuple([self._asarray(r, dtype=self.C_DTYPE) for r in results])
else:
results = self._asarray(results, dtype=self.C_DTYPE)

elif circuit.measurements[0].return_type is not qml.measurements.Counts:
# Measurements with expval, var or probs
results = self._asarray(results, dtype=self.R_DTYPE)
if self._has_partitioned_shots():
results = tuple([self._asarray(r, dtype=self.R_DTYPE) for r in results])
else:
results = self._asarray(results, dtype=self.R_DTYPE)

elif all(
ret in (qml.measurements.Expectation, qml.measurements.Variance)
@@ -325,7 +346,8 @@

results = self._asarray(results)
else:
results = tuple(self._asarray(r) for r in results)
results = tuple(r for r in results)
print('still: final results after appending?', results, type(results))

# increment counter for number of executions of qubit device
self._num_executions += 1
@@ -487,9 +509,9 @@ def statistics(self, observables, shot_range=None, bin_size=None):
)

elif obs.return_type is Counts:
results.append(
self.sample(obs, shot_range=shot_range, bin_size=bin_size, counts=True)
)
r = self.sample(obs, shot_range=shot_range, bin_size=bin_size, counts=True)
print("raw result: ", r)
results.append(r)

elif obs.return_type is Probability:
results.append(
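The changes above stop force-stacking partitioned-shot results: with a shot vector, each bin keeps its own entry, and Counts entries stay dictionaries instead of being squeezed into a (possibly ragged) array. A minimal sketch of the kind of workload this targets; the [(shots, copies)] device syntax is standard PennyLane, while the exact shape of the output under the new return system is an assumption.

```python
import pennylane as qml
from pennylane import numpy as np

# Shot vector: three bins of 10 shots followed by one bin of 100 shots.
dev = qml.device("default.qubit", wires=2, shots=[(10, 3), (100, 1)])

@qml.qnode(dev)
def circuit(x):
    qml.RX(x, wires=0)
    qml.CNOT(wires=[0, 1])
    return qml.counts(qml.PauliZ(0))

# Expected structure (assumption): one counts dictionary per shot bin, e.g.
# ({-1: 4, 1: 6}, {-1: 5, 1: 5}, {-1: 3, 1: 7}, {-1: 46, 1: 54}),
# which cannot be stacked into a single NumPy array.
res = circuit(np.array(0.5))
print(len(res))  # 4 bins
```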
3 changes: 3 additions & 0 deletions pennylane/interfaces/autograd.py
@@ -109,6 +109,7 @@ def _execute(
with qml.tape.Unwrap(*tapes):
res, jacs = execute_fn(tapes, **gradient_kwargs)

print("Result now: ", res, type(res))
for i, r in enumerate(res):

if isinstance(r, np.ndarray):
@@ -125,6 +126,8 @@

elif isinstance(res[i], tuple):
res[i] = tuple(np.tensor(r) for r in res[i])
elif tapes[i]._qfunc_output.return_type is qml.measurements.Counts:
continue

else:
res[i] = qml.math.toarray(res[i])
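The new elif above skips the tensor conversion when a tape's measurement is Counts, since a dictionary cannot be wrapped into an autograd array. A standalone sketch of the same rule; it keys off the result's type rather than the tape's return_type, purely for brevity:

```python
import numpy as np

def convert_result(r):
    """Simplified mirror of the conversion rule in the hunk above:
    leave Counts dictionaries untouched, array-ify everything else."""
    if isinstance(r, dict):                  # Counts result
        return r
    if isinstance(r, tuple):                 # multiple measurements
        return tuple(np.asarray(x) for x in r)
    return np.asarray(r)                     # expval / var / probs / sample

print(convert_result({"00": 7, "11": 3}))    # dict passes through unchanged
print(convert_result((0.1, 0.9)))            # tuple of arrays
```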
5 changes: 4 additions & 1 deletion pennylane/interfaces/execution.py
@@ -316,7 +316,10 @@ def cost_fn(params, x):

if isinstance(cache, bool) and cache:
# cache=True: create a LRUCache object
cache = LRUCache(maxsize=cachesize, getsizeof=lambda x: qml.math.shape(x)[0])

# TODO: changed from qml.math.shape to len because shape converts to
# arrays under the hood -> may create ragged array (best solution?)
cache = LRUCache(maxsize=cachesize, getsizeof=lambda x: len(x))
setattr(cache, "_persistent_cache", False)

batch_execute = set_shots(device, override_shots)(device.batch_execute)
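The TODO above replaces qml.math.shape(x)[0] with len(x) in the cache's getsizeof because taking the shape of a tuple of differently shaped results coerces it into an array first, which can go ragged. A small illustration of that failure mode:

```python
import numpy as np

# Mixed measurement results: a scalar expectation value plus a 4-entry
# probability vector do not form a rectangular array.
res = (np.array(0.42), np.array([0.25, 0.25, 0.25, 0.25]))

size = len(res)  # 2; cheap and always defined, which is what the cache now uses
# qml.math.shape(res) / np.shape(res) would first coerce the ragged tuple to an
# array, which raises on recent NumPy (and only warned on older releases).
```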
10 changes: 4 additions & 6 deletions pennylane/qnode.py
@@ -625,6 +625,7 @@ def __call__(self, *args, **kwargs):
override_shots=override_shots,
**self.execute_kwargs,
)
print("result of execute: ", res)

if autograd.isinstance(res, (tuple, list)) and len(res) == 1:
# If a device batch transform was applied, we need to 'unpack'
@@ -646,14 +647,11 @@ def __call__(self, *args, **kwargs):

self._update_original_device()

if isinstance(self._qfunc_output, Sequence) or (
self.tape.is_sampled and self.device._has_partitioned_shots()
):
if isinstance(self._qfunc_output, Sequence) or self.device._has_partitioned_shots() or self._qfunc_output.return_type is qml.measurements.Counts:
# Shot vectors outputs are also sequences
return res
if self._qfunc_output.return_type is qml.measurements.Counts:
# return a dictionary with counts not as a single-element array
return res[0]

# Squeeze arraylike outputs
return qml.math.squeeze(res)


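The widened condition above returns QNode outputs as-is (no squeezing and no res[0] unpacking) whenever the output is a sequence, the device uses partitioned shots, or the measurement is Counts. A minimal sketch of the single-Counts case; the printed numbers are illustrative only.

```python
import pennylane as qml

dev = qml.device("default.qubit", wires=1, shots=100)

@qml.qnode(dev)
def counts_circuit():
    qml.Hadamard(wires=0)
    return qml.counts(qml.PauliZ(0))

# With the change above, the dictionary comes back directly instead of being
# unpacked from a single-element array via the removed `res[0]` branch.
print(counts_circuit())  # e.g. {-1: 47, 1: 53} (illustrative)
```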