Yes, the iPhone uses an OOP language; in fact, from what I understand, you can program it in almost anything (C++, for example). So yes, the architecture would have the objects swapping data back and forth, but it's pretty modular. One way around the processing load might be to establish a link back to the user's home computer (desktop, laptop, whatever), which might be able to do some of the work, depending on bandwidth, comm reception, and so on. Thanks for the reading matter; I'm going to have to look into this further. Since I don't have an iPhone, nor a job to generate the money for one, I'll have to find someone else's iPhone to field-test the stuff on; I intend to contact Apple about beta testing some stuff, too. Right now I'm sort of stuck, spinning my wheels (which makes my dog look at me funny). Working on that, too...
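Just to make the offloading idea concrete, here's a rough sketch (in Python, purely illustrative; the function names and the 200 ms budget are my own invented placeholders, not anything from Apple's SDK) of how a handheld might decide whether a chunk of work is worth shipping back to the home computer, based on how long the payload would take to transfer at the current bandwidth:

```python
import json

def heavy_analysis(samples):
    # Stand-in for the CPU-heavy work we'd like to offload
    # (e.g., some kind of scene or audio analysis).
    return sum(x * x for x in samples) / len(samples)

def choose_host(bandwidth_kbps, payload_bytes, budget_ms=200):
    """Offload only if shipping the payload fits the latency budget.

    payload_bytes * 8 bits divided by bandwidth in kbit/s gives
    the transfer time in milliseconds.
    """
    transfer_ms = payload_bytes * 8 / bandwidth_kbps
    return "remote" if transfer_ms < budget_ms else "local"

def run(samples, bandwidth_kbps):
    payload = json.dumps(samples).encode()
    host = choose_host(bandwidth_kbps, len(payload))
    # In a real system the "remote" branch would send `payload` over the
    # link to the desktop; here we just run the work locally either way.
    return host, heavy_analysis(samples)
```

On a decent connection the small payload goes remote; on a slow cellular link the same job stays on the phone. The real decision would also weigh battery, reception dropouts, and the cost of the result coming back, but the shape of the trade-off is the same.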
Your description of a tactile interface between environmental components and the user leads me to wonder: how will the computer know to produce a splashing sound for water, or the rustle of leaves for plants? That kind of pattern recognition and intuitive guesswork is far beyond anything I've ever seen done with a computer. This is one of the major problems with an environment like Second Life, even with Max the Guide Dog's assistance; ambiguous environmental components are just too hard to recognize. Definitely got some reading to do.

Mark BurningHawk
Skype and Twitter: BurningHawk1969
MSN: [email protected]
My home page: http://MarkBurningHawk.net/
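P.S. One partial dodge around the recognition problem, assuming the virtual world attaches type tags to its objects (Second Life objects do carry names and descriptions, though how usable they are varies): skip recognizing the object itself and just map whatever tags it carries to sounds. A toy sketch, with made-up tags and filenames:

```python
# Hypothetical tag-to-sound table; the tags and .wav names are
# placeholders, not anything defined by Second Life itself.
SOUND_MAP = {
    "water": "splash.wav",
    "plant": "rustle.wav",
    "door": "creak.wav",
}

def sound_for(tags, default="generic.wav"):
    """Return the first mapped sound for an object's tags, else a fallback."""
    for tag in tags:
        if tag in SOUND_MAP:
            return SOUND_MAP[tag]
    return default
```

This only works as well as the metadata the world's builders bothered to supply, which is exactly where the ambiguity problem comes back in, but it's a lot cheaper than trying to recognize water from pixels.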
