Sounds like you would want something similar to the wireless sound detection demo.
See the wireless sound demo for a use of the GraphicalLocator actor, which extends the Locator actor:
  $PTII/ptolemy/domains/wireless/demo/WirelessSoundDetection/WirelessSoundDetection.htm
There is an applet version at
  http://ptolemy.eecs.berkeley.edu/ptolemyII/ptII4.0/ptII4.0.1/ptolemy/domains/wireless/demo/WirelessSoundDetection/WirelessSoundDetection.htm

The wireless-specific version of Ptolemy II is described in:
  http://ptolemy.eecs.berkeley.edu/publications/papers/04/VisualSense/VisualSense.pdf
and
  http://ptolemy.eecs.berkeley.edu/papers/04/VisualSenseERLMemo0408/index.htm

For a model that moves an icon, see
  $PTII/ptolemy/actor/parameters/demo/Bouncer/Bouncer.xml
Double click on the Bouncer actor to see that _location is set to [250.0, locationY] and locationY is set to centerPoint. (A minimal sketch of doing something similar from Java code appears at the end of this message.)

_Christopher

--------

Dear All,

I'm a new member of the group and I'm also a new user of Ptolemy. I was reading the documentation but I couldn't find an answer to my question: can VisualSense be used to model an efficient target tracking algorithm for wireless sensor networks? I will need to model the routing protocol that is used and the clustering algorithm. Moreover, I'll need to move some of the sensors. Can VisualSense do that, and if yes, how? I have been trying to move the sensors using the Location class, but to no avail.

Many thanks in advance for your cooperation.

--
Eng. Ghada Badawy
Research Assistant
Computer Department, Faculty of Engineering
Cairo University
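P.S. On moving sensors from code rather than by expression: below is a minimal sketch of one way to move an entity's icon from Java by updating its "_location" attribute, which is the attribute Vergil uses for icon placement. This is only an illustration, not code from the Bouncer demo; the MoveSensor class and its move() method are made-up names, it assumes you already hold a reference to the sensor entity, and the package names reflect recent Ptolemy II trees, so check them against your version.

    import ptolemy.kernel.ComponentEntity;
    import ptolemy.kernel.util.Attribute;
    import ptolemy.kernel.util.IllegalActionException;
    import ptolemy.kernel.util.Location;

    public class MoveSensor {
        /** Move the icon of the given entity to (x, y) by updating
         *  its "_location" attribute, if it has one.
         */
        public static void move(ComponentEntity sensor, double x, double y)
                throws IllegalActionException {
            // "_location" is the attribute name used for icon placement.
            Attribute attribute = sensor.getAttribute("_location");
            if (attribute instanceof Location) {
                ((Location) attribute).setLocation(new double[] { x, y });
            }
        }
    }

If you do this while a model is executing, you would probably want to wrap the change in a ChangeRequest so it is applied safely between iterations. Inside a model, though, the Bouncer approach of giving _location an expression such as [250.0, locationY] is usually the easier route.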