On 4/9/2010 12:13 PM, Andy Pugh wrote:
> On 9 April 2010 16:41, Slavko Kocjancic <[email protected]> wrote:
>
>    
>> After that I start the camera window. I right-click the camera window, select
>> "always on top", and move that window over the AXIS preview area (just fitting
>> inside).
>> After that I start Learn.py, make that window always on top too, and
>> move it over the code window of AXIS.
>> So now I have all windows visible, no matter which one is selected.
>> Now I must click somewhere on the AXIS interface and jog the machine to the
>> reference point (with the help of the webcam).
>>      
> Two possibilities occur to me:
> 1) You could add "jog" and "store" buttons to the learn.py interface.
> 2) You could hook up a USB joypad for jogging (well worth doing
> anyway) and use one or more of the buttons on that to store the
> learned position. (You can link physical buttons to hal pins that will
> run G-code fragments; see the [HALUI] section on this page:
> http://wiki.linuxcnc.org/cgi-bin/emcinfo.pl/emcinfo.pl?Adding_More_Controls_To_Simple_Remote_Pendant )
>
>
>    

>> You can link physical buttons to hal pins that will
>> run G-code fragments, see the [HALUI] section on this page.

Yes, very cool. And as Andy pointed out a while back, that fragment can call a 
G-code subroutine, which can have even more code in it...  :-)
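
Something along these lines should work (untested on my part; the subroutine 
name and the joypad button pin are just placeholders -- hal_input will tell you 
the real pin names for your pad):

# In the ini file -- the first MDI_COMMAND becomes halui.mdi-command-00
[HALUI]
MDI_COMMAND = o<store_position> call

# In a hal file -- wire a joypad button to trigger that MDI command
net store-pos input.0.btn-trigger => halui.mdi-command-00

The o<store_position> file has to live somewhere the interpreter can find it 
(the PROGRAM_PREFIX directory, or SUBROUTINE_PATH on newer versions).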

I might be trying that out very soon.
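
The subroutine itself could be as simple as this (again just a sketch -- 
#5420/#5421/#5422 are the current X/Y/Z position), saved as store_position.ngc:

o<store_position> sub
  (DEBUG, learned position: X=#5420 Y=#5421 Z=#5422)
o<store_position> endsub
M2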

Dave



