Thanks for the response, Brian :) After a bit more debugging and research, I found the problem. In hindsight it's obvious, but I was putting too much faith in how the characters were being encoded. That "%65E5%672C%72AC" was completely wrong: the code was percent-encoding the Unicode code points, when it's the individual UTF-8 bytes that need to be percent-encoded and sent to Twitter. Once I changed the OAuth code to do that, it worked flawlessly. Thanks again for your response!
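In case it helps anyone else who hits this, here's a minimal sketch of the byte-level encoding OAuth expects. Python here just as an illustration, and the function name is mine, not from any particular library:

```python
import urllib.parse

def oauth_percent_encode(value: str) -> str:
    # OAuth requires the string to be serialized as UTF-8 first, then each
    # byte outside the unreserved set (A-Z a-z 0-9 - . _ ~) percent-encoded.
    return urllib.parse.quote(value.encode("utf-8"), safe="-._~")

print(oauth_percent_encode("日本犬"))  # -> %E6%97%A5%E6%9C%AC%E7%8A%AC
# My old code was emitting the raw code points instead (%65E5%672C%72AC),
# which the server has no way to interpret as bytes.
```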
On Sep 18, 2:59 pm, "Brian Smith" <br...@briansmith.org> wrote:
> Mageuzi wrote:
> > I'm sorry for posting a follow-up so soon, but I spent another few
> > hours trying to debug this again last night, still without success.
> > It seems to be encoding the characters properly (%65E5%672C%72AC in
> > this case), so I assume it is generating the signature properly.
> > After all, it works perfectly fine with English characters. So any
> > guidance would be much appreciated; I'm running out of things to
> > check.
> >
> > Thank you again in advance.
>
> It probably isn't generating the signature properly. Try using a
> different library to post the same message, and you will likely find
> that they are calculating the signature differently. Calculating bad
> signatures for non-ASCII characters is probably the most common bug in
> OAuth libraries, because the authors often test with ASCII characters
> but not non-ASCII ones. If the library you are using has a mechanism
> for you to get the signature base string, use that mechanism to
> retrieve it and post it here.
>
> Also, try using a different OAuth library.
>
> がんばってください! (Good luck!)
>
> - Brian
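For reference, the signature base string Brian mentions is built from the percent-encoded request pieces, which is why a code-point-level encoding bug poisons the signature as well as the parameters. A rough sketch of the OAuth 1.0a HMAC-SHA1 steps, with function names and the example URL being my own illustrations:

```python
import base64
import hashlib
import hmac
import urllib.parse

def pct(value: str) -> str:
    # OAuth percent-encoding: UTF-8 bytes, unreserved characters only.
    return urllib.parse.quote(value.encode("utf-8"), safe="-._~")

def signature_base_string(method: str, base_url: str, params: dict) -> str:
    # Encode keys and values, sort the encoded pairs, join them with
    # '=' and '&', then percent-encode the whole parameter string again.
    pairs = sorted((pct(k), pct(v)) for k, v in params.items())
    param_str = "&".join(f"{k}={v}" for k, v in pairs)
    return "&".join([method.upper(), pct(base_url), pct(param_str)])

def hmac_sha1_signature(base_string: str, consumer_secret: str,
                        token_secret: str = "") -> str:
    # The signing key is the two percent-encoded secrets joined by '&'.
    key = f"{pct(consumer_secret)}&{pct(token_secret)}".encode("ascii")
    digest = hmac.new(key, base_string.encode("ascii"), hashlib.sha1).digest()
    return base64.b64encode(digest).decode("ascii")

# Illustrative usage only; a real request needs all of the oauth_* params.
base = signature_base_string(
    "POST",
    "https://api.twitter.com/statuses/update.json",
    {"status": "日本犬", "oauth_nonce": "abc123"},
)
```

If the non-ASCII value enters this pipeline as code-point hex instead of UTF-8 bytes, both the parameter string and the signature come out wrong, which matches the symptom in the thread.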