Dear Ana,
That project sounds interesting! You could use the class
mitk::NavigationDataObjectVisualizationFilter [1] and set your volume
dataset as the representation object (see the sketch below), though I
am not sure whether that alone is sufficient for your project. There
is also support for tracked US, e.g. to perform a calibration between
the US image and the tracker (see the class mitk::TrackedUltrasound [2]
and the plugin org.mitk.gui.qt.igt.app.ultrasoundtrackingnavigation).
However, that is aimed at real-time US data and would probably need
some customization to connect the tracker to offline/simulated image
data.
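For the first option, here is a rough, untested sketch of how the
pieces fit together, adapted from the IGT tutorial code
(mitkIGTTutorialStep1). The variables "tracker" and "volumeNode" are
placeholders for your configured Polhemus tracking device and the data
node of your loaded volume; please double-check the method names
against the docs for your MITK version:

  #include <mitkTrackingDeviceSource.h>
  #include <mitkNavigationDataObjectVisualizationFilter.h>
  #include <mitkRenderingManager.h>

  // Pipeline: tracking device -> navigation data source -> visualization filter
  mitk::TrackingDeviceSource::Pointer source = mitk::TrackingDeviceSource::New();
  source->SetTrackingDevice(tracker); // e.g. your Polhemus device object
  source->Connect();
  source->StartTracking();

  mitk::NavigationDataObjectVisualizationFilter::Pointer visualizer =
    mitk::NavigationDataObjectVisualizationFilter::New();
  visualizer->SetInput(0, source->GetOutput());
  // Use the loaded volume itself as the representation of tool 0:
  visualizer->SetRepresentationObject(0, volumeNode->GetData());

  // Call this periodically (e.g. from a QTimer) while tracking, so the
  // representation follows the tool pose in the render windows:
  visualizer->Update();
  mitk::RenderingManager::GetInstance()->RequestUpdateAll();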
Please note: the easiest way to activate all of this in your superbuild
is to set MITK_BUILD_CONFIGURATION = "mitkNavigationModules" in CMake
before compiling the superbuild.
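From the command line, that would be something like (adjust the path to
your MITK source checkout):

  cmake -DMITK_BUILD_CONFIGURATION=mitkNavigationModules <path-to-MITK-source>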
I hope this helps a bit.
Best,
Alfred Franz
Links:
[1]
https://docs.mitk.org/nightly/classmitk_1_1NavigationDataObjectVisualizationFilter.html
[2] https://docs.mitk.org/nightly/classmitk_1_1TrackedUltrasound.html
On 29.03.2023 at 20:15, anagha m gandha wrote:
Hello,
I am using the MITK Workbench for US navigation with the Polhemus
tracker, for medical simulation only; I am not using a video device.
My requirement is simply to load a volume dataset and visualise the
tracker movements on it. Right now I am unable to do both
simultaneously. Is this possible? If so, what MITK functionality do I
need to use to achieve this?
Another approach I tried was sending the position data to Slicer via
OpenIGTLink so that I could visualise it on a volume, but since it only
sends transform data I was unable to do that.
What do you think would be the best approach for doing a simulation
with MITK using only tracking data? Thanks in advance.
-Ana
_______________________________________________
mitk-users mailing list
mitk-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/mitk-users