Hi all. 

As I may have mentioned, I wrote an SQL Server agent that uses sockets to
listen for data from a client. The client first arrayEncodes an array, then
encrypts the result using an encryption library I wrote, and then sends the
data to the Server Agent.
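
Stripped down, the client side looks something like this (just a sketch,
assuming my encryption library boils down to LiveCode's encrypt command;
the cipher, password variable, handler name and length-prefix framing are
placeholders for illustration, not my real code):

   on sendQuery pArray
      local tPacket
      put arrayEncode(pArray) into tPacket   -- serialize the array to binary
      -- assumption: the library wraps LiveCode's built-in encrypt command
      encrypt tPacket using "aes-256-cbc" with password tPassKey
      put it into tPacket   -- encrypt leaves its output in "it"
      -- prefix the payload with its byte count so the agent knows how much to read
      write the number of bytes in tPacket & comma & tPacket to socket tAgentSocket
   end sendQuery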

The Agent receives the encrypted stream, decrypts it using the exact same
library, arrayDecodes the data, then performs a query against the database
and retrieves the data as an array. All that works.
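
The agent side is the mirror image, roughly (again a sketch; the comma
framing and the names are illustrative, and the query step is elided):

   on clientRead pSocket
      read from socket pSocket until comma   -- length header
      put char 1 to -2 of it into tByteCount
      read from socket pSocket for tByteCount bytes   -- the encrypted payload
      put it into tCipher
      decrypt tCipher using "aes-256-cbc" with password tPassKey
      put arrayDecode(it) into tRequest   -- decrypt leaves the plaintext in "it"
      -- ...run the query here and gather the results into tResultArray...
   end clientRead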

The agent then arrayEncodes the array and re-encrypts it using THE SAME
LIBRARY. It then sends the data back to the client which, of course,
decrypts it and arrayDecodes it, producing the resultant array. All fine
and good.
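
The return trip is the same pattern in reverse, roughly (same caveats as
above, these are placeholder names):

   -- agent side: package the result array and send it back
   put arrayEncode(tResultArray) into tReply
   encrypt tReply using "aes-256-cbc" with password tPassKey
   put it into tReply
   write the number of bytes in tReply & comma & tReply to socket pSocket

   -- client side: read the reply, decrypt it and unpack it
   read from socket tAgentSocket until comma
   put char 1 to -2 of it into tByteCount
   read from socket tAgentSocket for tByteCount bytes
   put it into tReply
   decrypt tReply using "aes-256-cbc" with password tPassKey
   if the result is not empty then
      answer the result   -- with the agent on Windows, this is where the error shows up
   else
      put arrayDecode(it) into tResultArray
   end if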

But wait, there's more! When I run the Server Agent on my Macintosh, all
works as expected. I get an array of the queried data. When I run it on a
Windows desktop, however, the data coming back is corrupted somehow! The
decryption command produces an error:
ERROR: (SSL error: wrong final block length)

How can that be? It's the same file; I just copy it to the Windows machine
from the Mac, along with the encryption library. Everything should be
exactly the same!

Bob S


