Determine similarity between two sequences of quaternions while allowing a degree of freedom around the $Z$-axis
A person holds his phone and rotates it in space through a sequence of orientations. I am able to obtain a sequence of quaternions from the phone's motion sensors, each representing the rotation from the phone's coordinate system to the world coordinate system.
A second person holds his phone and again rotates it through a sequence. I want to determine whether the second person is rotating it through the same sequence as the first person. I learned that I can use Dynamic Time Warping to compute a similarity score; however, my problem is a little more difficult: I do not know whether the second person is facing the same direction as the first person while rotating his phone.
The following figure depicts the phone's coordinate system.
The following figure depicts the world coordinate system.
As an example with a sequence of length $1$: the first person is facing north. He holds the phone flat (screen pointing up), with the top edge of the phone pointing away from his body. In this case the phone's coordinate system coincides with the world coordinate system, and I get the identity quaternion. The second person is facing west and holds the phone in the same way. Because the phone's coordinate system is off from the world coordinate system by $90$ degrees around $Z$, I get a quaternion of approximately $(0i, 0j, 0.707k, 0.707)$, which is a rotation of $90$ degrees around $Z$.
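For reference, the axis-angle form of a unit quaternion gives this value directly: a rotation by angle $\theta$ about a unit axis $\hat{n}$ is $q = \cos(\theta/2) + \sin(\theta/2)\,(\hat{n}_x i + \hat{n}_y j + \hat{n}_z k)$, so for $\theta = 90^\circ$ about $Z$:
$$q = \cos 45^\circ + \sin 45^\circ \, k \approx 0.707 + 0.707k.$$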
From each person's perspective, both hold the phone in exactly the same orientation. My question is: how do I find the angle of rotation around $Z$ that minimizes the difference between the two quaternion sequences? In this example, I want the algorithm to say this is an exact match.
Even though the example uses a sequence of length $1$, I really want to match longer sequences of rotations.
I am only interested in rotation similarity, and do not care about translation in space. I also consider a phone facing up to be different from a phone facing down.
I thought about decomposing each quaternion into a rotation around $Z$ and another quaternion without the $Z$ component, but I think that decomposition is not unique. Any other ideas for how I can check whether the two sequences of rotations are similar, up to a constant rotation around $Z$?
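The most direct approach I can think of is a brute-force scan over candidate $Z$ offsets: pre-multiply the second sequence by each candidate rotation and keep the angle with the smallest total rotation distance. A rough, untested sketch in Python (quaternions stored as $(x, y, z, w)$; all helper names are my own):

    import numpy as np

    def z_rotation(theta):
        # Unit quaternion (x, y, z, w) for a rotation of theta radians about the world Z axis.
        return np.array([0.0, 0.0, np.sin(theta / 2.0), np.cos(theta / 2.0)])

    def quat_multiply(a, b):
        # Hamilton product of quaternions stored as (x, y, z, w).
        ax, ay, az, aw = a
        bx, by, bz, bw = b
        return np.array([
            aw * bx + bw * ax + ay * bz - az * by,
            aw * by + bw * ay + az * bx - ax * bz,
            aw * bz + bw * az + ax * by - ay * bx,
            aw * bw - ax * bx - ay * by - az * bz,
        ])

    def sequence_distance(seq_a, seq_b, theta):
        # Sum of geodesic angles between seq_a[i] and R_z(theta) * seq_b[i].
        rz = z_rotation(theta)
        total = 0.0
        for qa, qb in zip(seq_a, seq_b):
            qb_aligned = quat_multiply(rz, qb)
            dot = abs(np.dot(qa, qb_aligned))   # abs() handles the q / -q sign ambiguity
            total += 2.0 * np.arccos(np.clip(dot, -1.0, 1.0))
        return total

    def best_z_offset(seq_a, seq_b, steps=720):
        # Brute-force scan over candidate Z offsets; returns (best angle, best distance).
        thetas = np.linspace(0.0, 2.0 * np.pi, steps, endpoint=False)
        dists = [sequence_distance(seq_a, seq_b, t) for t in thetas]
        i = int(np.argmin(dists))
        return thetas[i], dists[i]

In the example above, the first sequence is the identity and the second is a $90$-degree $Z$ rotation, so this scan would return $\theta \approx 270^\circ$ (equivalently $-90^\circ$) with distance $0$. I do not know whether this scales sensibly to longer, time-warped sequences.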
Tags: 3d, rotations, quaternions, curves
asked Dec 16 '14 at 5:13 by MathChallenged; edited Dec 24 '18 at 9:01 by Glorfindel
2 Answers
If both people hold the phone in the same orientation, then the curves will be similar regardless of the direction they face (N, S, E, W).
The problem arises if the two people hold the phone in different orientations, for example phone pointing up vs. phone pointing down. THIS is the problem to solve.
One trick to solve this is to combine all three axes $(X, Y, Z)$ into a single time series. This single time series will be invariant to phone orientation.
One caveat: combining the three axes is trivial in outer space, but here on earth there is a constant $-9.8\ \mathrm{m/s^2}$ gravitational acceleration shared among the three axes. Many phones can do a "trick" to remove this for you automatically (so, at rest on your desk, all three axes read zero).
The conversion to 1D can cause ambiguity between two different motions in a handful of cases, but in practice it works very well.
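One simple way to realize the combination (the exact formula is a choice; the per-sample Euclidean norm below is just one common option, and the function name is mine):

    import numpy as np

    def to_orientation_invariant_1d(samples):
        # samples: array of shape (n, 3) of gravity-compensated (linear) acceleration readings.
        # The per-sample Euclidean norm is unchanged by any fixed rotation of the phone's frame.
        samples = np.asarray(samples, dtype=float)
        return np.linalg.norm(samples, axis=1)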
How can you compute DTW on a fast-moving stream? That problem has been solved here: http://www.cs.ucr.edu/~eamonn/UCRsuite.html
I recommend reading http://www.cs.ucr.edu/~eamonn/SDM_RealisticTSClassifcation_cameraReady.pdf
You must z-normalize both time series before comparison.
You should limit the warping window (Ratanamahatana, C. A. and Keogh, E. (2004). Everything You Know About Dynamic Time Warping Is Wrong.)
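A minimal sketch of both recommendations together: z-normalization followed by DTW with a Sakoe-Chiba band. This is a plain textbook implementation in Python, not the optimized UCR Suite code linked above, and the helper names are my own.

    import numpy as np

    def z_normalize(x):
        # Zero-mean, unit-variance scaling of a 1-D series.
        x = np.asarray(x, dtype=float)
        return (x - x.mean()) / x.std()

    def dtw_distance(a, b, window):
        # Classic DTW with a Sakoe-Chiba band of half-width `window` samples.
        a, b = z_normalize(a), z_normalize(b)
        n, m = len(a), len(b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(max(1, i - window), min(m, i + window) + 1):
                d = (a[i - 1] - b[j - 1]) ** 2
                cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
        return np.sqrt(cost[n, m])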
answered Dec 17 '14 at 3:58 by user2313186; edited Jul 27 '17 at 19:20 by Irregardless
Thanks for the answer, and I will look into the references too, but I do care about the direction. I added more clarification to the description. Basically, I want to match a sequence of rotations from the person's perspective, but the quaternion I receive from the phone is with respect to the world coordinate system, and depending on which direction the user faces, the quaternion has an added rotation around Z, which I am hoping I can filter out.
– MathChallenged, Dec 17 '14 at 18:18
A really late response, for those who find themselves in this post.
When both people hold the phone flat (screen pointing up) but differ only by a rotation around the $Z$-axis, the solution is to change the phone's frame of reference.
Before sampling data from the phone's sensors, we need a calibration phase in which we store the initial orientation quaternion (the direction the person is facing before he starts the rotation) and invert it. Inverting a quaternion is easy when the quaternion is normalised: we just take its conjugate. Call the result, for example, Calibration_Quaternion.
Next we need to understand the product of two quaternions.
Let $A, B$ be two unit (normalised) quaternions. In $A * B$, $A$ is the base orientation and $B$ is the rotation applied to it. If you used $B$ as the base and applied $A$ as the rotation, you would get something different. Remember that a quaternion can represent both an orientation and a rotation, and that quaternion multiplication is not commutative: $A * B$ is in general different from $B * A$.
To obtain the new reference frame, we simply multiply the current rotation by Calibration_Quaternion to cancel out the difference, giving an orientation in the desired frame of reference.
To sum up: in the calibration phase we store the conjugate quaternion, and then we multiply each element of the quaternion series by it:
$$\text{SampledQuaternion} * \text{CalibrationQuaternion}$$
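As a concrete sketch of the calibration step in Python (quaternions stored as $(x, y, z, w)$; taking the first sample of each series as the initial orientation is an assumption on my part, and the helper names are mine):

    import numpy as np

    def quat_multiply(a, b):
        # Hamilton product of quaternions stored as (x, y, z, w).
        ax, ay, az, aw = a
        bx, by, bz, bw = b
        return np.array([
            aw * bx + bw * ax + ay * bz - az * by,
            aw * by + bw * ay + az * bx - ax * bz,
            aw * bz + bw * az + ax * by - ay * bx,
            aw * bw - ax * bx - ay * by - az * bz,
        ])

    def conjugate(q):
        # Conjugate of a unit quaternion; for unit quaternions this is also the inverse.
        return np.array([-q[0], -q[1], -q[2], q[3]])

    def calibrate(series):
        # Store the conjugate of the initial orientation, then multiply every sample by it
        # (SampledQuaternion * CalibrationQuaternion, in the order given above).
        cal = conjugate(series[0])
        return [quat_multiply(q, cal) for q in series]

Applying calibrate() to both quaternion series before the DTW comparison removes the constant heading difference in the screen-up case.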
If you do this for both quaternion series, you can proceed to Dynamic Time Warping with no problem. Unfortunately, if one person is holding the phone upside down, for example, and the other flat (screen pointing up), the above method does not apply, and I would love to find a solution for that case.
Lastly, remember to always normalize your quaternions, even when you think you don't need to.
A useful reference: Coordinate Systems and Transformations.
answered Jan 28 at 19:49 by Giorgis3