Fix flaky REM test / refactor #2464
Changes from 6 commits
07a3d7f
e6c24e7
fb2ce62
df71a39
7570704
f8725eb
65bef0a
@@ -14,38 +14,35 @@
 def sample_probability_vector(
-    probability_vector: npt.NDArray[np.float64], samples: int
-) -> List[Bitstring]:
+    probability_vector: Sequence[float], samples: int
+) -> list[str]:
     """Generate a number of samples from a probability distribution as
     bitstrings.

     Args:
         probability_vector: A probability vector.
         samples: The number of samples to generate.

     Returns:
         A list of sampled bitstrings.
+
+    Example:
+        >>> sample_probability_vector([0, 1/2, 1/4, 1/4], 4)
+        ['01', '10', '11', '11']
     """
-    # sample using the probability distribution given
     num_values = len(probability_vector)
-    choices = np.random.choice(num_values, size=samples, p=probability_vector)
-
-    # convert samples to binary strings
-    bit_width = int(np.log2(num_values))
-    binary_repr_vec = np.vectorize(np.binary_repr)
-    binary_strings = binary_repr_vec(choices, width=bit_width)
-
-    # split the binary strings into an array of ints
-    bitstrings = (
-        np.apply_along_axis(  # type: ignore
-            func1d=np.fromstring,  # type: ignore
-            axis=1,
-            arr=binary_strings[:, None],
-            dtype="U1",  # type: ignore
-        )
-        .astype(np.uint8)
-        .tolist()
-    )
+    if not np.log2(num_values).is_integer():
+        raise ValueError(
+            "The length of the probability vector must be a power of 2."
+        )
+
+    sampled_indices = np.random.choice(
+        num_values, size=samples, p=probability_vector
+    )
+
+    bit_width = int(np.log2(num_values))
+    bitstrings = [format(index, f"0{bit_width}b") for index in sampled_indices]

     return bitstrings

Comment (on the docstring): It could be useful to add an example of input --> output for our future selves.

Reply: Considering this left us a bit confused when first encountering it, I think that's a great idea. Done in f8725eb.
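Aside from the doctest above, the core of the refactor is the fixed-width formatting of each sampled index into a bitstring. A small self-contained illustration of that step (the probability vector below is just the one from the new doctest):

import numpy as np

probability_vector = [0, 1 / 2, 1 / 4, 1 / 4]
num_values = len(probability_vector)
bit_width = int(np.log2(num_values))  # 4 outcomes -> 2 bits per bitstring

# format(index, "02b") zero-pads each index to the register width,
# so index 1 becomes '01' rather than '1'.
print([format(index, f"0{bit_width}b") for index in range(num_values)])
# prints: ['00', '01', '10', '11']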
@@ -60,7 +57,7 @@ def bitstrings_to_probability_vector(
         bitstrings: All measured bitstrings.

     Returns:
-        A probabiity vector corresponding to the measured bitstrings.
+        A probability vector corresponding to the measured bitstrings.
     """
     pv = np.zeros(2 ** len(bitstrings[0]))
     for bs in bitstrings:
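For context, the loop body of bitstrings_to_probability_vector is not shown in this hunk; a minimal sketch of how the empirical distribution is presumably built (the index conversion and normalization below are assumptions, not part of the diff):

import numpy as np


def bitstrings_to_probability_vector_sketch(bitstrings: list[str]) -> np.ndarray:
    # One entry per basis state of the measured register.
    pv = np.zeros(2 ** len(bitstrings[0]))
    for bs in bitstrings:
        # Interpret each bitstring as the integer index of its basis state.
        pv[int(bs, 2)] += 1
    # Normalize counts so the entries sum to one.
    return pv / len(bitstrings)


# Example: two shots of '00' and two of '11' give [0.5, 0, 0, 0.5].
print(bitstrings_to_probability_vector_sketch(["00", "11", "00", "11"]))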
@@ -132,7 +129,7 @@ def generate_tensored_inverse_confusion_matrix(

 def closest_positive_distribution(
     quasi_probabilities: npt.NDArray[np.float64],
-) -> npt.NDArray[np.float64]:
+) -> List[float]:
     """Given the input quasi-probability distribution returns the closest
     positive probability distribution (with respect to the total variation
     distance).
@@ -163,7 +160,7 @@ def distance(probabilities: npt.NDArray[np.float64]) -> np.float64:
         raise ValueError(
             "REM failed to determine the closest positive distribution."
         )
-    return result.x
+    return result.x.tolist()


 def mitigate_measurements(
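The body of closest_positive_distribution is mostly outside these hunks; a rough, self-contained sketch of a constrained-minimization approach consistent with the visible pieces (the exact objective, initial guess, and constraint setup here are assumptions, not the library's confirmed implementation):

from collections.abc import Sequence

import numpy as np
from scipy.optimize import Bounds, LinearConstraint, minimize


def closest_positive_distribution_sketch(
    quasi_probabilities: Sequence[float],
) -> list[float]:
    quasi = np.asarray(quasi_probabilities, dtype=np.float64)
    num_vals = len(quasi)

    # Start from the quasi-distribution clipped to be non-negative.
    init_guess = quasi.clip(min=0)
    total = init_guess.sum()
    init_guess = init_guess / total if total > 0 else np.full(num_vals, 1 / num_vals)

    def distance(probabilities: np.ndarray) -> np.float64:
        # Objective: distance from the input quasi-probabilities.
        return np.linalg.norm(probabilities - quasi)

    result = minimize(
        distance,
        init_guess,
        # Each entry must lie in [0, 1] and the entries must sum to 1.
        bounds=Bounds(np.zeros(num_vals), np.ones(num_vals)),
        constraints=LinearConstraint(np.ones((1, num_vals)), [1], [1]),
    )
    if not result.success:
        raise ValueError(
            "REM failed to determine the closest positive distribution."
        )
    return result.x.tolist()


# Example: a quasi-probability vector with a small negative entry.
print(closest_positive_distribution_sketch([1.1, -0.1, 0.0, 0.0]))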
Comment: I am curious as to what's usually preferred when we specify the return type. I see you have used list[str] here, but you have also used from typing import List before, specifying the return type as List[str] on another line. Searching around shows that we do not need to use the typing module for Python >= 3.9. Do we need to add this as a to-do for other modules in mitiq?

Reply: Didn't we have this exact same discussion with @vprusso in another PR? IIRC there we learned that list[str] is the way to go.

Reply: Yes, we did, which is why I was asking if we need to add this to our to-dos. We still use things like from typing import List elsewhere.

Reply: We don't need to be so proactive. The aliases are deprecated, but they are unlikely to be removed in the foreseeable future.

Reply: Agreed, we should not take action until we need to on this. In the meantime, let's try to use the builtins instead of typing.List, but it's not a big concern since they both achieve the same goal. If we do migrate eventually, we will have to import a large majority of the types from collections.abc, since many of the types we rely on (e.g. typing.Sequence) are deprecated in favor of coming from collections.abc.
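For reference, a small illustration of the two annotation styles being discussed (the function names are only for the example):

# Pre-3.9 style: aliases from the typing module (deprecated but still working).
from typing import List, Sequence


def old_style(probabilities: Sequence[float]) -> List[str]:
    return [str(p) for p in probabilities]


# Python >= 3.9 style: builtin generics plus collections.abc.
from collections.abc import Sequence as AbcSequence


def new_style(probabilities: AbcSequence[float]) -> list[str]:
    return [str(p) for p in probabilities]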