This way, when I am building UI interfaces, I can see exactly how a touch affects the objects and the various handlers, step through or stop a handler, check the variables, make sure everything is passed correctly, and so on.
I develop on a touchscreen Windows 8 PC. I think this could be implemented as simply as letting my screen send touch messages in the normal LiveCode IDE, or, for those who don't have a touch screen, adding a switch that makes the mouse send touch messages. Simulated orientation changes, a simulated native keyboard, contact list, datebook, and media gallery would be really nice additions in the future as well.
I tried the following, by the way, and it doesn't really work:
Code:
on mouseDown
   -- pass a dummy touch ID so touchStart handlers that expect one still work
   touchStart 1
end mouseDown

on mouseUp
   touchEnd 1
end mouseUp

on mouseMove pX, pY
   -- mouseMove already supplies the pointer position as parameters
   touchMove 1, pX, pY
end mouseMove
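One variation I have been meaning to try, in case it helps anyone testing the same thing, is to dispatch the simulated touch messages to the control under the pointer instead of just calling the handlers by name. This is only a rough sketch and assumes the mouse handlers sit in the card script while the touch handlers live in the individual controls; I have not confirmed it behaves exactly like real touch messages.

Code:
on mouseDown
   -- send a simulated touchStart to the control that was clicked,
   -- so that control's own touch handlers run
   dispatch "touchStart" to the target with 1
end mouseDown

on mouseUp
   dispatch "touchEnd" to the target with 1
end mouseUp

on mouseMove pX, pY
   dispatch "touchMove" to the target with 1, pX, pY
end mouseMove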
Maybe someone else has found a better way, or has an API that solves this, or has run into the same situation and wants to support this request. If you fall into any of these categories, please post.
Thanks
Will