10:46 am Austin Chau: Office Hours, 2009-09-09 [Now Open] Welcome to API office hours! Our door is open ...
Please top-post your question or comment about the robot, gadget, or embed API to make sure we see it. You can do so by replying to this blip. A transcript of this wave will be made publicly available. If you don't want to appear in the transcript, delete your blip after getting a response. Be careful not to remove neighboring blips. tag:office-hours

12:02 pm Joachim Larsen: Have you had to ban any users from Wave thus far? If so, what would a ballpark figure be? Will you have any 'official' channels through which to accept spam/abuse reports?

12:04 pm Brian Kennish: I don't think we've banned anyone. And, yes, there will be better ways to report bugs and spam.

12:03 pm Alexandro Jimenez: Is there a way to delete the private blips inside a wave? I know there used to be a way to delete even the trunk blip, and that bug made it possible to delete private blips, but it was fixed. The only other way to delete them now is to make a selection blip, add the private blip to it, and then delete the main blip holding the selected blips. But in some cases the private blips come back into the wave at the end.

12:06 pm Brian Kennish: You mean using the client?

12:08 pm Alexandro Jimenez: No, I just mean blipping someone inside a wave, or simply adding a Private Reply.

12:07 pm Brian Kennish: But you're referring to doing this in the client, not the API, right?

12:08 pm Alexandro Jimenez: Oh, sorry, yes.

12:09 pm Brian Kennish: Got it. I wasn't aware of this, but it sounds like a bug. Do you know if it's been filed already? If not, can you submit it?

12:10 pm Alexandro Jimenez: I'm not even sure where to file the bugs.

12:11 pm Brian Kennish: Here: http://code.google.com/p/google-wave-resources/issues/list.

12:12 pm Alexandro Jimenez: Oh, OK. I did a quick search for "Private Blip" and didn't find it, so I'll file the bug now. Could I use the Create Bug button in Wave to do so too?
11:58 am Joachim Larsen: When/if Google open-sources more of Wave, which license do you envision it will be under, and would embracing the AGPL go some way toward alleviating whatever constraints you are under at this point?

11:59 am Brian Kennish: It should be under the same Apache license (2.0).

11:55 am Joachim Larsen: Hi, thank you for the invite, and thank you for bringing us Wave :) I have seen some pretty cool gadgets and robots spring up here and there; unfortunately, I can see the spellcheck got lost along the way ;) What happened with Bryce's blog? What do you guys (and gals) see as potential showstoppers going forward? It seems to me that without the efficient transforms, playback and related functionality become a bit problematic. Can you confirm or dispel anything regarding patent wrangling? Have you seen the rather entertaining and poignant video on Google sessions entitled 'Poisonous People'? (And no, I am not trying to be one.) I am simply wondering about the timing issues they raised regarding when to open up to open source and to 'not program in a cave'.

12:01 pm Brian Kennish: Don't think I've seen it, but those guys are on my team, so I've heard them speak a few times. Will watch it later!

12:13 pm Joachim Larsen: Thanks, please do give them my regards. 'Genius Myth' was another fine presentation; I hope they find time to do some more. I must admit I am deeply intrigued by what the reasons behind this almost cloak-and-dagger approach to unveiling Wave could be.

12:16 pm Brian Kennish: There are no cloaks or daggers involved (there are boomerangs, though -- the Aussies seem to be into those). The client and server code to be released are deeply tied to proprietary Google infrastructure. We have to abstract out the Google parts before the code can be useful to anyone else, and there's lots of code to do that for.

12:17 pm Joachim Larsen: I understand as much, but what I don't understand is why those parts aren't simply patented?
This perhaps shows my naivete when it comes to these matters: are patents not generally considered valuable?

12:17 pm Brian Kennish: Patents take a while. I'm not sure how you could tell at this point what is or isn't being patented.

12:19 pm Joachim Larsen: Nod. Then I am confused. Why didn't you simply wait until the patents (if any) were settled?

12:18 pm Brian Kennish: So, like, roughly 2016?

12:21 pm Cameron Neylon: Without wishing to put words into the Googlers' mouths, I'd imagine it would just take too long and they want people out there building stuff. Pulling out any of the proprietary bits means that the open-source community can go mad confidently and drive wider adoption. Besides, software patents aren't worth the paper they're written on; by the time they're internationally recognised, the technology has moved on anyway.

12:21 pm Joachim Larsen: Well, yeah, they can, except that they will know they are potentially implementing patented techniques when they get their 'fedones' functional. But I guess I am veering off course. Thank you for your answers.

11:36 am Vadim Barsukov: When will the ability to remove participants from a wave be added? The API already has this feature. And how will it work? Will each participant be able to remove any other participant, or only the wave's owner?

11:44 am Brian Kennish: Although there's an API method to remove participants, it won't actually do anything if you call it. ;-) So the API is a little ahead of the client there. We have to wait for the client team to catch up. I'm pretty sure they won't by September 30th because, as you've noted, it's tricky to get it just right.

11:45 am Brian Kennish: Yup, just confirmed with a client eng -- it won't be available by the 30th, but it's high priority to ship afterwards.

11:48 am Peter LaBanca: The only thing I am worried about is how much support Wave will really get from the community if it is released with all these features unavailable.
Some people will not even bother with it until it is fully functional. That might be something to take into account.

11:50 am Brian Kennish: That's OK. We're trying something new here by releasing a product so early, and we definitely don't think it's for every developer. That's why we asked if you're willing to "swim with sharks" on our signup form. So it's fine by us if some developers prefer to wait till things are more stable around here.

11:51 am Peter LaBanca: That brings up another question as to who will actually be able to access the public consumer preview. You say that not all features will be available by Sept. 30th, but if you allow the public to view it without the ability to remove participants, the ability to have draft text, etc., then some people might not like the idea from the get-go.

11:53 am Brian Kennish: It'll still be early adopters. I'm not sure we've defined yet who'll get accounts on the 30th, but it'll be some combination of initial signups, sandbox developers, etc. We're still dubbing the 30th release pre-beta software.

11:54 am Peter LaBanca: I strongly suggest Wave be released to friends/families of sandbox developers, as these are the people we talk to most, thereby being able to really test the software.

11:55 am Brian Kennish: We will have an invite system, similar to the one Gmail had. I.e., each person who gets an account will also get N invites so they have people to talk to.

12:05 pm Egon Willighagen: I'd love that... with my science peers I will not use the Map much... but with family I would :)

11:57 am Peter LaBanca: Well, I look forward to the 30th to really see the power of Wave. Google has definitely outdone themselves. Let's just hope Wave holds up to "real-life" situations, for lack of a better term.

11:43 am Peter LaBanca: I think an admin system of sorts would be good for removing participants of a wave. The wave owner would be the "superadmin" with the ability to add other admins.
Only the admins could then remove participants from the wave.

11:32 am Egon Willighagen: Another question... I have been working on web services using XMPP (the IO-DATA XEP)... and since Wave is XMPP-based too, I was wondering what I would need to do to call such services directly, bypassing a robot (which could wrap any service, of course)... A gadget perhaps?

12:02 pm Cameron Neylon: Hey Egon, I was looking for what I thought was a partial answer to your question but can't find it now. But I think it's the case at the moment that there isn't any way of talking directly to the server, i.e. speaking the protocol directly. The Googlers will have a better idea of that, no doubt. A gadget can only see its local data and context, so it can't poll any external service. And the robots can currently only talk to the wave through a proxy. As I understand it, anyway.

11:22 am Rick Bullotta: Will the issues with robot-created blips getting a "temporary" ID instead of a permanent ID be addressed in the Sept 30th build?

11:25 am Marcel Prasetya: Not for the event-based communication model (the current one). We are working on opening up a JSON-RPC endpoint that will allow you to make an RPC call like wavelet.appendBlip() to our server, and we will send back a response with the new blip metadata.

11:26 am Rick Bullotta: I suppose that a "mixed mode" would be common, then, where you detect an event via the existing mechanism, then use the JSON-RPC interface to do any actual manipulation?

11:27 am Marcel Prasetya: You can still use the existing mechanism to modify blip content if the blip is part of the event bundle. With this JSON-RPC mechanism, you can eventually fetch a wave that is not part of the event.

11:28 am Rick Bullotta: I have a use case where I need to persist the ID of a blip I create in response to an event, and the "callback" with a data document approach is ugly.
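[Aside: the JSON-RPC endpoint Marcel describes above was not yet public at the time of this chat, so the wire format shown here is an assumption. The stdlib-only Python sketch below illustrates the shape such an appendBlip round trip might take; the method name mirrors wavelet.appendBlip() from the discussion, but every field name and value is hypothetical.]

```python
import json

# Hypothetical request a robot might POST to the forthcoming JSON-RPC
# endpoint. All field names here are illustrative assumptions, not the
# published Wave protocol.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "wavelet.appendBlip",
    "params": {
        "waveId": "googlewave.com!w+example",
        "waveletId": "googlewave.com!conv+root",
        "content": "Hello from a robot",
    },
}
payload = json.dumps(request)

# Hypothetical synchronous response: unlike the event-based model, the
# server can reply with the *permanent* blip metadata, which is exactly
# what Rick's use case (persisting the new blip's ID) needs.
response = json.loads(
    '{"jsonrpc": "2.0", "id": 1,'
    ' "result": {"blipId": "b+abc123", "version": 42}}'
)
permanent_blip_id = response["result"]["blipId"]
print(permanent_blip_id)
```

The point of the sketch is the synchronous reply: the robot no longer has to watch for a later event carrying the permanent ID; it gets the ID back in the same call.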
Any chance of providing a "sync" or other mechanism to enable the existing API to get a permanent ID? Or is the mixed-mode approach using some JSON-RPC calls the best solution?

11:32 am Marcel Prasetya: In the future, we are thinking of making the communication between the robot proxy and the robot "two-way", something like: 1. The robot proxy sends events to the robot. 2. The robot responds with ops. 3. The robot proxy replies with error codes or other data (in the case of createFoo(), maybe it responds with foo metadata). I suppose that would be a better solution than the "mixed mode", although mixed mode can be a stop-gap until we have that ready.

11:33 am Rick Bullotta: Sounds good. Thanks.

11:33 am Marcel Prasetya: You're welcome.

11:39 am Vadim Barsukov: Robots represent a serious potential threat to security and privacy. What is Google's policy about this? Will there be the possibility of certification, digital signatures, etc.?

11:35 am Brian Kennish: To misquote Lars, we decided not to have any security or privacy problems. Actually, it's something we've been thinking about for a while and just started working on. It obviously won't be fully baked, but I think you'll start seeing some stuff in the next few weeks.

11:35 am Rick Bullotta: I'll be sure to misquote that one frequently -- love it! ;-)

11:36 am Marcel Prasetya: Plus, we've decided not to have spam :) j/k

11:20 am Dionysios Synodinos: I've heard Sept. 30th mentioned as the expected date for the public consumer preview. At what point do you expect the API to be more solidified? Am I correct in assuming that the API is still subject to change between now and release? (The obvious answer is yes, that's why it's a dev preview :) )

11:21 am Peter LaBanca: Although I am not a Googler, I would say yes, it will change, but the change will not be substantial enough to break any of the robots' code. Any existing API will remain intact, with modifications of some classes and the addition of others.
11:27 am Brian Kennish: The robot API has been and will continue to change a bunch. That's why we're encouraging developers not to use the API directly, but rather to use the Java or Python client library. We're keeping those relatively stable and backwards compatible. The main focus of September 30th is making the client faster and more stable for consumers. But we also want to solidify the wire protocol so developers can start using it directly (and write additional libraries on top of it). [The part I started saying before...] I'm not sure if we'll hit the 30th for solidifying the wire protocol, but it is a stretch goal.

11:28 am Dionysios Synodinos: Ahh, OK, that's good to know. Is there a published roadmap for when these sorts of goals are targeted? For instance, knowing that the wire protocol will be solidified, and then possibly some of the robot APIs after that, is helpful from an API user's perspective, as it allows us to focus on areas that are considered more stable for testing (as opposed to areas that are under heavy flux).

11:28 am Brian Kennish: Yes, we posted a roadmap of sorts on the forum (but it doesn't have specific dates attached). Let me see if I can find it...

11:39 am Brian Kennish: Due to Groups suckiness, I can't find a link. But, due to Gmail awesomeness, I can tell you it's called "About what's coming in the API" and was posted by Douwe. If Groups starts cooperating with me later, I'll paste an actual link.

11:40 am Peter LaBanca: The link is: http://groups.google.com/group/google-wave-api/browse_thread/thread/66cbc0a044da69cc/19016ddcf316a917?lnk=gst&q=Douwe#19016ddcf316a917

11:40 am Brian Kennish: Thank you!

11:29 am Dionysios Synodinos: Thanks!

11:19 am Brian Kennish: And it's not a *beta* -- it's a consumer preview.

10:55 am Egon Willighagen: I was wondering where I can find an overview of 1) recognized annotations and 2) HTML markup I can add to blips...
10:59 am Brian Kennish: Hey Egon, we don't have this stuff documented, but we should... I don't think it's been requested in our issue tracker, so you might want to add it.

11:00 am Egon Willighagen: Which issue tracker should I post to? The one on code.google.com, I guess?

11:00 am Brian Kennish: Right: http://code.google.com/p/google-wave-resources/issues/list.

11:40 am Egon Willighagen: Thanks! Filed as bug #194.

11:52 am Cameron Neylon: There is a wider point in here somewhere about trying to communicate re: annotation spaces in general. I guess there will be quite a lot of names effectively "reserved" for in-client markup, and lots of people potentially heading in different directions. It might be nice to have some sort of way of sharing intentions, at least.

11:00 am David Byttow: I would also like to note that the HTML markup operation uses the same mechanism that we use for pasting HTML into the editor. Not sure how much that helps, but you can safely assume that whatever is interpreted when pasting would be allowed with append markup.

11:04 am Brian Kennish: Also, the markup appendMarkup supports pretty much matches the buttons that appear on the editor toolbar: bold, italic, etc.

11:09 am Peter LaBanca: How would we go about using HTML markup for colors? I know that the bold, italic, and other StyleText properties work, but nothing works for color. I tried the <font> tag, which I know is deprecated, and I tried adding CSS styles to <b> tags, all with no success.

11:10 am David Byttow: font/color should work, but may not. CSS is not quite fully supported. Colors should typically be done with annotations; we have some stuff coming online with more support for styles.

11:11 am Peter LaBanca: Yeah, right now my robot does use annotations to color text, but I was hoping to be able to use HTML so I could reuse the coloring code in multiple applications without needing to recode the coloring class.
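[Aside: the annotation approach discussed above keeps styling out of the text itself: an annotation is a (name, value) pair attached to a character range, with names like "style/color". The stdlib-only Python sketch below models that idea; the classes and method names are illustrative, not the real waveapi library.]

```python
from dataclasses import dataclass

# Toy model of Wave-style annotations: styling lives in (name, value)
# pairs over character ranges rather than in inline markup. The names
# follow the "style/..." convention mentioned in the thread; everything
# else here is an illustrative assumption.
@dataclass
class Annotation:
    name: str    # e.g. "style/color"
    value: str   # e.g. "rgb(255,0,0)"
    start: int   # range start (inclusive)
    end: int     # range end (exclusive)

class Document:
    def __init__(self, text):
        self.text = text
        self.annotations = []

    def set_annotation(self, start, end, name, value):
        self.annotations.append(Annotation(name, value, start, end))

    def annotations_at(self, index):
        """Return the annotations covering a given character position."""
        return [a for a in self.annotations if a.start <= index < a.end]

doc = Document("Hello Wave")
doc.set_annotation(0, 5, "style/color", "rgb(255,0,0)")  # color "Hello" red
print([a.name for a in doc.annotations_at(2)])
```

This also shows the tradeoff David raises next: changing a color is just an edit to a range, with no need to regenerate or re-parse markup.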
11:11 am David Byttow: There's a tradeoff there that I'm sure you're aware of. If you're just replacing content with new content, then you can just use append markup, but when it comes to changing the color of bits of text, you're better off using annotations and just modifying the ranges, etc.

11:12 am David Byttow: Annotations are going to become very important in the near future, I believe. Especially when we allow for events based on annotated bits of text, so you don't have to do manual string searches, and allow for finer control of text twiddling in large content (if that makes sense).

11:14 am Peter LaBanca: Yeah, annotations are very useful. I was just curious about the different ways to add annotations (such as the use of appendMarkup to convert to annotations). I guess I will just stick to using setAnnotation in my robot until the API expands with more functionality.

11:14 am David Byttow: Thanks.

11:01 am Egon Willighagen: I tried pasting in <sub>, but it came out as plain text... as a literal <, that is... I guess I should have namespaced the elements, but I was also not entirely sure which method I was supposed to use to add the snippet to the blip...

11:00 am David Byttow: I can tell you right now that <sub> is not supported. Adding it to the issue tracker is probably a good idea.

11:01 am Egon Willighagen: Will do. Quite important for scientific applications, I'd say :)

11:02 am Egon Willighagen: That makes me wonder... there is some JS for MathML support in Firefox... would such a thing actually work in a Wave setup? ... just pondering here...

11:03 am Egon Willighagen: We used that at http://qsar.sourceforge.net/dicts/qsar-descriptors/index.xhtml

11:04 am David Byttow: I'm not familiar with it, but you could potentially surface this with a gadget, I'm assuming.

10:55 am Egon Willighagen: Regarding the HTML, I would like to sub- and superscript parts of the text, e.g. in molecular formulae.

10:53 am Vadim Barsukov: Hello! Can I ask questions already?
My robot needs access to the tags of a wave, or at least the ability to read them. Is this feature in the robot Wave API (Java)?

10:59 am David Byttow: Not yet, unfortunately. Tags will be part of a data document, but we have not yet supported this in the API. Do you have any use cases for this?

11:02 am Vadim Barsukov: Well, for example, searching for waves marked with a specified tag. Will robots have access to their lists of waves? Searches, moving to folders, read/unread, mute/unmute, archive, etc.?

11:03 am David Byttow: Robots will have access to waves that they are a part of (as part of our JSON-RPC protocol revamp). We have some plans for accessing waves that you are a part of, too, but that is further out.

11:03 am Monika Adamczyk: Are there any plans to implement wave reorganization capabilities? E.g., if I have nested blips and I need to change their order, will it be possible to do that?

11:04 am David Byttow: This seems like more of a client-related question. Our client team does have plans for implementing this "wave surgery", allowing you to re-order blips, etc.

11:04 am Monika Adamczyk: That could also be a robot capability. After all, a robot is just a participant that could do some cleanup work on a wave.

11:16 am Vadim Barsukov: Can a robot manage the client interface? For example, add items to the menu (like the gadgets extension API) or modify saved searches? If a human participant can do it, why can't a robot participant? :)

11:17 am Brian Kennish: No, or more accurately, not yet -- it's something we've been thinking about.

11:17 am Vadim Barsukov: A practical example: installing a robot may also require automatically adding gadgets and menu items. Can a robot do this itself?

11:05 am David Byttow: Well, yes, in fact, it may be possible today. Inline blips are anchored by an element tag; this could technically be re-ordered.
But once we support it in the client, we will look at allowing robot access too, which will probably be pretty simple.
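[Aside: the "wave surgery" David describes reduces, in the simplest case, to moving a blip's anchor element within its parent document. The stdlib-only Python sketch below models anchors as a flat list of blip IDs; all names and structure are illustrative, since no client or robot API for reordering existed at the time of this chat.]

```python
# Toy model of blip reordering: inline blips are represented only by
# their anchor positions in a parent blip's document, so reordering is
# just moving an anchor. The IDs and helper are hypothetical.
def move_blip(anchors, blip_id, new_index):
    """Move the anchor for blip_id to new_index, shifting the others."""
    anchors.remove(blip_id)
    anchors.insert(new_index, blip_id)
    return anchors

anchors = ["b+intro", "b+details", "b+summary"]
print(move_blip(anchors, "b+summary", 0))  # "b+summary" becomes first
```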
