Internally here in Leuven we already created skeleton message types.
I've split them up into arms, torso and legs, because sometimes you
want to track the arms separately. As Patrick stated, we also have
companion messages that carry the covariance for each of the variables
(kept separate because sometimes you don't need them).
We also include message types for the head, hands, etc., because
these are things you can track with an individual tracker.
All of these messages are input to a combined EKF/PF estimator.
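A minimal Python sketch of this split layout (the type and field names here are assumptions for illustration, not the actual Leuven message definitions):

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical sketch of the split described above: one message per
# body part, each with an optional companion covariance, so that e.g.
# an arm can be tracked (and fused) independently of the rest.

@dataclass
class BodyPartState:
    joint_names: List[str]
    joint_angles: List[float]                 # radians
    covariance: Optional[List[float]] = None  # row-major n*n, optional

@dataclass
class SkeletonState:
    torso: BodyPartState
    arms: List[BodyPartState]                 # typically left and right
    legs: List[BodyPartState]
    head: Optional[BodyPartState] = None      # individually tracked extras
```

Keeping the covariance optional mirrors the point above: consumers that don't need uncertainty simply leave it unset.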
I'm also interested in building a common message type for this. I
would prefer this message type to be 'scalable'.
Currently we only use joint angles and map them onto a KDL tree, but our
OpenGL renderer also supports bone lengths, so I would prefer that the
package can be extended to include these as well.
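One way to sketch that extensibility in Python (again a hypothetical layout, not an existing ROS type): joint angles are the required core, enough to drive a KDL tree, while bone lengths are optional so a renderer that models segment geometry can use them when present.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class SkeletonModel:
    joint_names: List[str]
    joint_angles: List[float]                        # radians
    bone_lengths: Optional[Dict[str, float]] = None  # metres per segment

    def has_geometry(self) -> bool:
        """True when the optional bone-length extension is filled in."""
        return self.bone_lengths is not None
```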
One of the things we're also working hard on is gesture recognition.
PS: A recent video with PR2, LWR, Kinect and augmented VR goggles
illustrating this framework will be put on-line as soon as I find the
time to edit the video.
On 28 March 2011 05:29, Patrick Goebel <email@example.com> wrote:
> Hi Marcus,
> I haven't thought through use cases very much but one application would be
> creating a library of human gestures and poses that the robot can learn to
> recognize. The only reason I created a Skeleton message type for my
> pi_tracker package is that I wanted to have access to the joint confidence
> values. Otherwise, I think the standard ROS tf/tfMessage type comes close.
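A quick sketch of why a dedicated message helps here: tf carries the transforms themselves, but a per-joint confidence value (the reason Patrick created the pi_tracker Skeleton message) needs an extra field alongside each pose. The type below is a hypothetical illustration, not the actual pi_tracker definition:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class TrackedJoint:
    name: str
    position: Tuple[float, float, float]            # metres, camera frame
    orientation: Tuple[float, float, float, float]  # quaternion (x, y, z, w)
    confidence: float                               # tracker confidence in [0, 1]
```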
> In the meantime, and just for fun, I have created a skeleton_markers package
> http://www.ros.org/wiki/skeleton_markers
> This allows the visualization of the tracked joints in RViz. There are two
> ways to run the package--two different Python scripts: one is used with
> pi_tracker and subscribes to the skeleton message topic. The other uses
> just the transforms published by the openni_tracker package. See the Wiki
> page for details. Here is a short video of the result:
> http://www.youtube.com/watch?v=nTRi_kIgGW0
> On 03/26/2011 03:27 PM, Marcus Liebhardt wrote:
> Hi there!
> That indeed looks like a tree to me.
> I would be interested in the use cases you have in mind for these skeleton messages.
> What information would you like to gather in those messages?
> Poses of the tracked frames, the interconnections between frames, and also the
> distances between connected frames?
> I don't have deep knowledge about the processing of the Kinect or the
> openni_tracker. I'm currently just using the transforms. But I think there
> are simplifications made for some frames, which could be useful to take into
> account. For example, I think the neck is always halfway between the left
> and right shoulder, and the head is always on top of it. Things like that
> might be interesting to keep in the message as well, so that one can easily
> reproduce the previously mentioned skeleton - if that is one of the use cases.
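The midpoint simplification Marcus mentions is easy to exploit on the consumer side: if the neck is assumed to sit halfway between the shoulders, it can be reconstructed from the two shoulder positions alone (coordinates below are made up for illustration):

```python
def midpoint(a, b):
    """Component-wise midpoint of two 3-D points."""
    return tuple((ai + bi) / 2.0 for ai, bi in zip(a, b))

left_shoulder = (0.2, 1.4, 0.0)    # made-up coordinates, metres
right_shoulder = (-0.2, 1.4, 0.0)
neck = midpoint(left_shoulder, right_shoulder)  # (0.0, 1.4, 0.0)
```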
> Best regards,
> :-) Marcus
> 2011/3/26 Patrick Goebel <firstname.lastname@example.org>
>> I see what you mean--the user looks more like a cactus than a tree. :)))
>> However, if you take the torso joint as the root, don't you end up
>> with a tree structure?
>> Maybe this is what you meant by saying it depends on how you define them.
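Patrick's torso-rooted view can be sketched as a parent map and checked for tree-ness. The joint names below follow the openni_tracker convention, but the exact set and the shoulder attachment points are assumptions here:

```python
# Each joint maps to its parent; the torso is the single root.
PARENT = {
    "torso": None,
    "neck": "torso", "head": "neck",
    "left_shoulder": "neck", "left_elbow": "left_shoulder",
    "left_hand": "left_elbow",
    "right_shoulder": "neck", "right_elbow": "right_shoulder",
    "right_hand": "right_elbow",
    "left_hip": "torso", "left_knee": "left_hip", "left_foot": "left_knee",
    "right_hip": "torso", "right_knee": "right_hip", "right_foot": "right_knee",
}

def is_tree(parent):
    """A parent map is a tree iff there is exactly one root and every
    node reaches that root without revisiting itself (no cycles)."""
    roots = [n for n, p in parent.items() if p is None]
    if len(roots) != 1:
        return False
    for node in parent:
        seen = set()
        while node is not None:
            if node in seen:
                return False
            seen.add(node)
            node = parent[node]
    return True
```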
>> On 03/26/2011 12:22 PM, David Lu!! wrote:
>> It depends on how you define them. The skeleton visualized with the Kinect
>> isn't a tree, although it could be converted to one, I suppose.
>> On Sat, Mar 26, 2011 at 1:11 PM, Patrick Goebel <email@example.com> wrote:
>>> Hi David,
>>> I agree it would be nice to have a standardized Skeleton message. If all
>>> skeletons are trees (is this true?) then perhaps an existing tree message
>>> type could be used, if it exists. KDL has kinematic chains and tf is already
>>> built around trees (right?). That's about as far as I've thought it through.
>>> On 03/22/2011 04:17 PM, David Lu!! wrote:
>>> Hey Ros-users-
>>> Has there been any talk of creating a standardized Skeleton message?
>>> Right now, it seems like the openni_tracker package just publishes
>>> transforms. It seems like if the Kinect does skeleton tracking, there should
>>> be a skeleton message.
>>> I know pi_tracker has its own Skeleton message, which looks like it might
>>> fit the bill, although I don't think it defines which parts are connected
>>> (hip bone is connected to the thigh bone).
>>> http://www.ros.org/doc/api/pi_tracker/html/msg/Skeleton.html
>>> It might be good to have not only for the Kinect, but also for other
>>> motion capture rigs (Vicon and the like).
>>> Are there any other similar messages out there?
>>> ros-users mailing list
>>> firstname.lastname@example.org
>>> https://code.ros.org/mailman/listinfo/ros-users