Hi Dave,

On Tue, Jun 5, 2018 at 4:56 PM, Dave Page <dp...@pgadmin.org> wrote:

> Hi
>
> On Tue, Jun 5, 2018 at 9:50 AM, Aditya Toshniwal <aditya.toshniwal@
> enterprisedb.com> wrote:
>
>> Hi Hackers,
>>
>> PFA the updated patch. The sqleditor change has been sent separately and
>> removed from the current patch as suggested.
>> The test cases were running fine when the module was specified using
>> --pkg but were failing in a complete run. Fixed that.
>>
>
> I did a quick test by creating a SQL_ASCII database containing a simple
> table:
>
> CREATE TABLE sql_ascii (id serial primary key, data text);
>
> And then populated it with data:
>
> /Library/PostgreSQL/9.4/bin/psql -d sql_ascii -U postgres -c "INSERT INTO
> sql_ascii (data) VALUES ('[Windows-1252]   Euro: \x80   Double dagger:
> \x87');"
> /Library/PostgreSQL/9.4/bin/psql -d sql_ascii -U postgres -c "INSERT INTO
> sql_ascii (data) VALUES ('[Latin-1]   Yen: \xa5   Half: \xbd');"
> /Library/PostgreSQL/9.4/bin/psql -d sql_ascii -U postgres -c "INSERT INTO
> sql_ascii (data) VALUES ('[Japanese]   Ship: \xe8\x88\xb9');"
> /Library/PostgreSQL/9.4/bin/psql -d sql_ascii -U postgres -c "INSERT INTO
> sql_ascii (data) VALUES ('[Invalid UTF-8]  Blob: \xf4\xa5\xa3\xa5');"
>
> I then right-clicked the table in the treeview, and selected the option to
> view all rows, and immediately saw an error:
>
> 2018-06-05 12:23:27,319: SQL pgadmin: Execute (async) for server #1 -
> CONN:1187535 (Query-id: 8522474):
> SELECT * FROM public.sql_ascii
> ORDER BY id ASC
> 2018-06-05 12:23:27,320: ERROR pgadmin: Failed to execute query
> (execute_async) for the server #1 - CONN:1187535(Query-id: 8522474):
> Error Message:ERROR:  invalid byte sequence for encoding "UTF8": 0x80
> SQL state: 22021
>
> Running "SELECT * FROM sql_ascii" in the query tool resulted in the same
> error, however, if I ran "SET client_encoding = 'SQL_ASCII';" first, I do
> see results.
>
> I have confirmed that I've restarted the server after applying the patch.
>
> What am I missing? Why don't we just set the client_encoding to SQL_ASCII
> if it's a SQL_ASCII database?
>

By default the client_encoding is the same as the server encoding. However,
the following existing code in web/pgadmin/utils/driver/psycopg2/connection.py
sets client_encoding to UNICODE for every connection, and I am not sure
whether it should be removed:

        status = _execute(cur, "SET DateStyle=ISO;"
                               "SET client_min_messages=notice;"
                               "SET bytea_output=escape;"
                               "SET client_encoding='UNICODE';")


Note that this testing was on Python 2.7.10 on macOS.
>
>
>>
>> Kindly review.
>>
>> Thanks and Regards,
>> Aditya Toshniwal
>> Software Engineer | EnterpriseDB Software Solutions | Pune
>> "Don't Complain about Heat, Plant a tree"
>>
>> On Tue, Jun 5, 2018 at 10:15 AM, Aditya Toshniwal <
>> aditya.toshni...@enterprisedb.com> wrote:
>>
>>> Hi
>>>
>>> On Tue, Jun 5, 2018 at 1:08 AM, Joao De Almeida Pereira <
>>> jdealmeidapere...@pivotal.io> wrote:
>>>
>>>> Hello Aditya,
>>>>
>>>>>
>>>>>
>>>>> There is no change related to notifications in this patch.
>>>>> The code below is a minor fix related to the connection status of the
>>>>> SQL editor. Can you please share the code snippet if it is not the one below.
>>>>>
>>>>> -        # Check for the asynchronous notifies statements.
>>>>> -        conn.check_notifies(True)
>>>>> -        notifies = conn.get_notifies()
>>>>> +        if status is not None:
>>>>> +            # Check for the asynchronous notifies statements.
>>>>> +            conn.check_notifies(True)
>>>>> +            notifies = conn.get_notifies()
>>>>>
>>>>>
>>>> This is a minor fix, but is it related to querying a SQL_ASCII database?
>>>>
>>> No, it's not. It is something I found while I was working on the
>>> SQL_ASCII-related changes.
>>> Well then, I will send a separate patch for it.
>>>
>>>>
>>>> Thanks
>>>> Victoria && Joao
>>>>
>>>
>>>
>>
>
>
> --
> Dave Page
> Blog: http://pgsnake.blogspot.com
> Twitter: @pgsnake
>
> EnterpriseDB UK: http://www.enterprisedb.com
> The Enterprise PostgreSQL Company
>
