Hi,

PFA rebased patch for RM1405 (version 4).

Neel will be sending a patch for the Qt runtime issue for the same functionality.
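
For reviewers, the server-side flow in this patch boils down to: open a separate
synchronous connection, run the query on a named (server-side) cursor, and stream
the rows back as CSV batches through a generator-backed Flask Response with a
Content-Disposition header. Below is a minimal standalone sketch of that pattern,
using plain psycopg2 and Flask rather than pgAdmin's connection manager; the DSN,
query and filename are placeholders, not values taken from the patch:

    import csv
    from io import StringIO

    import psycopg2
    from flask import Flask, Response

    app = Flask(__name__)

    @app.route('/download')
    def download():
        # Placeholder connection and query; in the patch these come from
        # pgAdmin's connection manager and the request arguments.
        conn = psycopg2.connect("dbname=postgres")
        # A named cursor is a server-side cursor: rows stay on the server and
        # are pulled over in batches instead of being loaded all at once.
        cur = conn.cursor(name='csv_download_cursor')
        cur.execute("SELECT * FROM pg_class")

        def generate(batch_size=2000):
            try:
                wrote_header = False
                while True:
                    rows = cur.fetchmany(batch_size)
                    if not rows:
                        return
                    buf = StringIO()
                    writer = csv.writer(buf, quoting=csv.QUOTE_NONNUMERIC)
                    if not wrote_header:
                        # For a named cursor the description is populated
                        # once the first batch has been fetched.
                        writer.writerow([desc[0] for desc in cur.description])
                        wrote_header = True
                    writer.writerows(rows)
                    yield buf.getvalue()
            finally:
                # Runs on normal exhaustion and if the client disconnects.
                cur.close()
                conn.close()

        resp = Response(generate(), mimetype='text/csv')
        resp.headers['Content-Disposition'] = 'attachment;filename=result.csv'
        return resp

The actual patch additionally registers a cleanup callback with call_on_close()
so the extra synchronous connection is released back to pgAdmin's connection
manager once the response has been sent.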

-- 
*Harshal Dhumal*
*Software Engineer*

EnterpriseDB India: http://www.enterprisedb.com
The Enterprise PostgreSQL Company

On Mon, Jul 18, 2016 at 4:02 PM, Dave Page <dp...@pgadmin.org> wrote:

> Perfect, thanks!
>
> On Mon, Jul 18, 2016 at 11:24 AM, Neel Patel <neel.pa...@enterprisedb.com>
> wrote:
>
>> Yes. I am just testing it on different operating systems. I will send it in
>> the next 1-2 hours.
>>
>> Is that fine?
>>
>> Thanks,
>> Neel Patel
>>
>> On Mon, Jul 18, 2016 at 3:53 PM, Dave Page <dp...@pgadmin.org> wrote:
>>
>>> Any chance of getting that in the next couple of hours so I can get it
>>> into beta 3?
>>>
>>> On Mon, Jul 18, 2016 at 11:18 AM, Dave Page <dp...@pgadmin.org> wrote:
>>>
>>>> Thanks!
>>>>
>>>> On Mon, Jul 18, 2016 at 10:42 AM, Neel Patel <
>>>> neel.pa...@enterprisedb.com> wrote:
>>>>
>>>>> Hi Dave,
>>>>>
>>>>> Yes, it will break in the runtime. If we need to support server-side
>>>>> downloads, then we need to change the runtime code.
>>>>> I will send a patch for the runtime to support server-side CSV file
>>>>> downloads.
>>>>>
>>>>> Thanks,
>>>>> Neel Patel
>>>>>
>>>>> On Fri, Jul 15, 2016 at 3:05 PM, Dave Page <dp...@pgadmin.org> wrote:
>>>>>
>>>>>> Hi
>>>>>>
>>>>>> This seems to break downloads in the runtime. It works in Safari and
>>>>>> Chrome, but in the runtime the suggested filename is
>>>>>> "6980287?query=SELECT+*+FROM+pem.probe_column%0AORDER+BY+id%0AASC+&filename=probe_column.csv",
>>>>>> and after changing that and hitting OK, the "Downloading file" dialogue
>>>>>> is shown indefinitely.
>>>>>>
>>>>>> On Wed, Jul 13, 2016 at 1:16 PM, Harshal Dhumal <
>>>>>> harshal.dhu...@enterprisedb.com> wrote:
>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>> PFA rebased patch for RM1405
>>>>>>>
>>>>>>> --
>>>>>>> *Harshal Dhumal*
>>>>>>> *Software Engineer*
>>>>>>>
>>>>>>> EnterpriseDB India: http://www.enterprisedb.com
>>>>>>> The Enterprise PostgreSQL Company
>>>>>>>
>>>>>>> On Wed, Jul 13, 2016 at 5:15 PM, Harshal Dhumal <
>>>>>>> harshal.dhu...@enterprisedb.com> wrote:
>>>>>>>
>>>>>>>> Yes sure
>>>>>>>>
>>>>>>>> --
>>>>>>>> *Harshal Dhumal*
>>>>>>>> *Software Engineer*
>>>>>>>>
>>>>>>>> EnterpriseDB India: http://www.enterprisedb.com
>>>>>>>> The Enterprise PostgreSQL Company
>>>>>>>>
>>>>>>>> On Wed, Jul 13, 2016 at 5:10 PM, Dave Page <dp...@pgadmin.org>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>> Hi
>>>>>>>>>
>>>>>>>>> Can you rebase this please?
>>>>>>>>>
>>>>>>>>> Thanks.
>>>>>>>>>
>>>>>>>>> On Mon, Jul 11, 2016 at 9:16 AM, Harshal Dhumal <
>>>>>>>>> harshal.dhu...@enterprisedb.com> wrote:
>>>>>>>>>
>>>>>>>>>> Hi,
>>>>>>>>>>
>>>>>>>>>> PFA patch for RM1405. Note that to allow downloads in the runtime we
>>>>>>>>>> will still need the patch
>>>>>>>>>> <https://www.postgresql.org/message-id/CACCA4P3JOe40WYMGjhpSWYGR%3DWuvRbbp2gfDKLnU%2B1rXuW9Www%40mail.gmail.com>
>>>>>>>>>> sent by Neel.
>>>>>>>>>>
>>>>>>>>>> Changes: the query result is now downloaded as CSV directly from the
>>>>>>>>>> server, rather than via the download attribute of the anchor tag (<a>),
>>>>>>>>>> which is not supported by all major browsers (e.g. Safari).
>>>>>>>>>> It is also not feasible to build the data in the HTML page for download
>>>>>>>>>> when the query result set is very large (GBs).
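>>>>>>>>>>
>>>>>>>>>> With this change, a download is just an ordinary GET against the new
>>>>>>>>>> endpoint and the browser (or runtime) saves whatever the server streams
>>>>>>>>>> back; the suggested filename comes from the Content-Disposition header.
>>>>>>>>>> Roughly, and purely as an illustration (host, port, URL prefix,
>>>>>>>>>> transaction id and login/session handling are made up or omitted here):
>>>>>>>>>>
>>>>>>>>>>     import requests
>>>>>>>>>>
>>>>>>>>>>     params = {'query': 'SELECT * FROM pg_class', 'filename': 'result.csv'}
>>>>>>>>>>     resp = requests.get(
>>>>>>>>>>         'http://localhost:5050/sqleditor/query_tool/download/1',
>>>>>>>>>>         params=params, stream=True)
>>>>>>>>>>     with open('result.csv', 'wb') as out:
>>>>>>>>>>         for chunk in resp.iter_content(chunk_size=8192):
>>>>>>>>>>             out.write(chunk)
>>>>>>>>>>     resp.close()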
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> --
>>>>>>>>>> *Harshal Dhumal*
>>>>>>>>>> *Software Engineer*
>>>>>>>>>>
>>>>>>>>>> EnterpriseDB India: http://www.enterprisedb.com
>>>>>>>>>> The Enterprise PostgreSQL Company
>>>>>>>>>>
>>>>>>>>>> On Wed, Jun 29, 2016 at 4:53 PM, Akshay Joshi <
>>>>>>>>>> akshay.jo...@enterprisedb.com> wrote:
>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Wed, Jun 29, 2016 at 3:52 PM, Murtuza Zabuawala <murtuza.
>>>>>>>>>>> zabuaw...@enterprisedb.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Yes, it will not work in the runtime either, but I think Neel is
>>>>>>>>>>>> working on a runtime fix for this issue. We might be able to fix it
>>>>>>>>>>>> in the runtime, but the issue will persist in Safari unless they add
>>>>>>>>>>>> support in the browser itself.
>>>>>>>>>>>>
>>>>>>>>>>>> https://webkit.org/status/#feature-download-attribute
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> > On 29-Jun-2016, at 3:40 pm, Dave Page <dp...@pgadmin.org>
>>>>>>>>>>>> wrote:
>>>>>>>>>>>> >
>>>>>>>>>>>> > On Tue, Jun 28, 2016 at 10:33 AM, Murtuza Zabuawala
>>>>>>>>>>>> > <murtuza.zabuaw...@enterprisedb.com> wrote:
>>>>>>>>>>>> >> Yes Dave, I agree that downloading files has been supported in
>>>>>>>>>>>> >> browsers for a long time.
>>>>>>>>>>>> >>
>>>>>>>>>>>> >> But in general we send a request and then receive the file from
>>>>>>>>>>>> >> the web server, whereas in our case we are fetching the data from
>>>>>>>>>>>> >> Backbone models and converting it to CSV format for download as a
>>>>>>>>>>>> >> file on the client side, in the browser itself.
>>>>>>>>>>>> >
>>>>>>>>>>>> > If Safari doesn't support client-side saving of files, then I have
>>>>>>>>>>>> > to wonder whether our runtime will either - both are WebKit based.
>>>>>>>>>>>> >
>>>>>>>>>>>> > So I guess the next question to ask is: why don't we just generate
>>>>>>>>>>>> > the CSV on the server side?
>>>>>>>>>>>> >
>>>>>>>>>>>>
>>>>>>>>>>>> @Akshay,
>>>>>>>>>>>> Can you please advise on the above?
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> Since we already have the complete data in backgrid's full
>>>>>>>>>>> collection, I used it instead of fetching the data again.
>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> > --
>>>>>>>>>>>> > Dave Page
>>>>>>>>>>>> > Blog: http://pgsnake.blogspot.com
>>>>>>>>>>>> > Twitter: @pgsnake
>>>>>>>>>>>> >
>>>>>>>>>>>> > EnterpriseDB UK: http://www.enterprisedb.com
>>>>>>>>>>>> > The Enterprise PostgreSQL Company
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> --
>>>>>>>>>>> *Akshay Joshi*
>>>>>>>>>>> *Principal Software Engineer *
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> *Phone: +91 20-3058-9517*
>>>>>>>>>>> *Mobile: +91 976-788-8246*
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>> Dave Page
>>>>>>>>> Blog: http://pgsnake.blogspot.com
>>>>>>>>> Twitter: @pgsnake
>>>>>>>>>
>>>>>>>>> EnterpriseDB UK: http://www.enterprisedb.com
>>>>>>>>> The Enterprise PostgreSQL Company
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> Dave Page
>>>>>> Blog: http://pgsnake.blogspot.com
>>>>>> Twitter: @pgsnake
>>>>>>
>>>>>> EnterpriseDB UK: http://www.enterprisedb.com
>>>>>> The Enterprise PostgreSQL Company
>>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>> --
>>>> Dave Page
>>>> Blog: http://pgsnake.blogspot.com
>>>> Twitter: @pgsnake
>>>>
>>>> EnterpriseDB UK: http://www.enterprisedb.com
>>>> The Enterprise PostgreSQL Company
>>>>
>>>
>>>
>>>
>>> --
>>> Dave Page
>>> Blog: http://pgsnake.blogspot.com
>>> Twitter: @pgsnake
>>>
>>> EnterpriseDB UK: http://www.enterprisedb.com
>>> The Enterprise PostgreSQL Company
>>>
>>
>>
>
>
> --
> Dave Page
> Blog: http://pgsnake.blogspot.com
> Twitter: @pgsnake
>
> EnterpriseDB UK: http://www.enterprisedb.com
> The Enterprise PostgreSQL Company
>
diff --git a/web/pgadmin/tools/datagrid/templates/datagrid/index.html b/web/pgadmin/tools/datagrid/templates/datagrid/index.html
index f2f6085..67cf107 100644
--- a/web/pgadmin/tools/datagrid/templates/datagrid/index.html
+++ b/web/pgadmin/tools/datagrid/templates/datagrid/index.html
@@ -182,6 +182,7 @@
             </div>
         </div>
         <div id="editor-panel"></div>
+        <iframe id="download-csv" style="display:none"></iframe>
     </div>
 </div>
 {% endblock %}
diff --git a/web/pgadmin/tools/sqleditor/__init__.py b/web/pgadmin/tools/sqleditor/__init__.py
index acaca18..7829c2c 100644
--- a/web/pgadmin/tools/sqleditor/__init__.py
+++ b/web/pgadmin/tools/sqleditor/__init__.py
@@ -28,6 +28,7 @@ from pgadmin.utils.sqlautocomplete.autocomplete import SQLAutoComplete
 
 from config import PG_DEFAULT_DRIVER
 
+
 # import unquote from urlib for python2.x and python3.x
 try:
     from urllib import unquote
@@ -1215,3 +1216,58 @@ def save_file():
             'status': True,
         }
     )
+
+
+@blueprint.route('/query_tool/download/<int:trans_id>', methods=["GET"])
+@login_required
+def start_query_download_tool(trans_id):
+    sync_conn = None
+    status, error_msg, conn, trans_obj, session_obj = check_transaction_status(trans_id)
+
+    if status and conn is not None \
+            and trans_obj is not None and session_obj is not None:
+
+        data = request.args if request.args else None
+        try:
+            if data and 'query' in data:
+                sql = data['query']
+                conn_id = str(random.randint(1, 9999999))
+                sync_conn = conn.manager.connection(
+                    did=trans_obj.did,
+                    conn_id=conn_id,
+                    auto_reconnect=False,
+                    async=False
+                )
+
+                sync_conn.connect(autocommit=False)
+
+                # This returns generator of records.
+                status, gen = sync_conn.execute_on_server_as_csv(sql, records=2000)
+
+                if not status:
+                    conn.manager.release(conn_id=conn_id, did=trans_obj.did)
+                    return internal_server_error(errormsg=str(gen))
+
+                def cleanup():
+                    conn.manager.connections[sync_conn.conn_id]._release()
+                    del conn.manager.connections[sync_conn.conn_id]
+
+                r = Response(gen(), mimetype='text/csv')
+
+                if 'filename' in data and data['filename'] != "":
+                    filename = data['filename']
+                else:
+                    import time
+                    filename = str(int(time.time())) + ".csv"
+
+                r.headers["Content-Disposition"] = "attachment;filename={0}".format(filename)
+
+                r.call_on_close(cleanup)
+
+                return r
+
+        except Exception as e:
+            conn.manager.release(conn_id=conn_id, did=trans_obj.did)
+            return internal_server_error(errormsg=str(e))
+    else:
+        return internal_server_error(errormsg=gettext("Transaction status check failed."))
diff --git a/web/pgadmin/tools/sqleditor/templates/sqleditor/js/sqleditor.js b/web/pgadmin/tools/sqleditor/templates/sqleditor/js/sqleditor.js
index 036b07d..5544135 100644
--- a/web/pgadmin/tools/sqleditor/templates/sqleditor/js/sqleditor.js
+++ b/web/pgadmin/tools/sqleditor/templates/sqleditor/js/sqleditor.js
@@ -2602,76 +2602,63 @@ define(
 
         // This function will download the grid data as CSV file.
         _download: function() {
-          var self = this;
-          var coll = self.collection.fullCollection === undefined ? self.collection : self.collection.fullCollection;
-
-          if (self.columns != undefined &&
-              coll != undefined &&
-              coll.length > 0)
-          {
-            var csv_col = _.indexBy(self.columns, 'name'),
-                labels = _.pluck(self.columns, 'label'),
-                keys = _.pluck(self.columns, 'name');
-
-            // Fetch the items from fullCollection and convert it as csv format
-            var csv = keys.join(',') + '\n';
-            csv += coll.map(function(item) {
-                return _.map(keys, function(key) {
-                  var cell = csv_col [key].cell,
-                      // suppose you want to preserve custom formatters
-                      formatter = cell.prototype && cell.prototype.formatter;
-
-                  return formatter && formatter.fromRaw ?
-                            formatter.fromRaw(item.get(key), item) : item.get(key);
-                }).join(',');
-            }).join('\n');
-
-            // Download the file.
-            var encodedUri = encodeURI('data:text/csv&charset=utf-8&filename=download.csv&value=' + csv),
-                    link = document.createElement('a');
-            link.setAttribute('href', encodedUri);
-
-            /* If download is from view data then file name should be
-             * the object name for which data is to be displayed.
-             */
-            if (!self.is_query_tool) {
-              $.ajax({
-                url: "{{ url_for('sqleditor.index') }}" + "object/get/" + self.transId,
-                method: 'GET',
-                success: function(res) {
-                  if (res.data.status) {
-                    filename = res.data.result + '.csv';
-                    link.setAttribute('download', filename);
-                    link.click();
-                  }
-                },
-                error: function(e) {
-                  if (e.readyState == 0) {
-                    alertify.alert('Get Object Name Error',
-                     '{{ _('Not connected to the server or the connection to the server has been closed.') }}'
-                    );
-                    return;
-                  }
+          var self = this,
+          selected_code = self.gridView.query_tool_obj.getSelection(),
+          sql = "";
 
-                  var msg = e.responseText;
-                  if (e.responseJSON != undefined &&
-                      e.responseJSON.errormsg != undefined)
-                    msg = e.responseJSON.errormsg;
+          if (selected_code.length > 0)
+            sql = selected_code;
+          else
+            sql = self.gridView.query_tool_obj.getValue();
 
-                  alertify.alert('Get Object Name Error', msg);
+          // If it is an empty query, do nothing.
+          if (sql.length <= 0) return;
+
+          /* If download is from view data then file name should be
+           * the object name for which data is to be displayed.
+           */
+          if (!self.is_query_tool) {
+            $.ajax({
+              url: "{{ url_for('sqleditor.index') }}" + "object/get/" + self.transId,
+              method: 'GET',
+              success: function(res) {
+                if (res.data.status) {
+                  filename = res.data.result + '.csv';
+                  self._trigger_csv_download(sql, filename);
+                }
+              },
+              error: function(e) {
+                if (e.readyState == 0) {
+                  alertify.alert('Get Object Name Error',
+                   '{{ _('Not connected to the server or the connection to the server has been closed.') }}'
+                  );
+                  return;
                 }
-              });
-            }
-            else {
-              var cur_time = new Date();
-              var filename = 'data-' + cur_time.getTime() + '.csv';
-              link.setAttribute('download', filename);
-              link.click();
-            }
-          }
-          else {
-            alertify.alert('Download Data', 'No data is available to download');
-          }
+
+                var msg = e.responseText;
+                if (e.responseJSON != undefined &&
+                    e.responseJSON.errormsg != undefined)
+                  msg = e.responseJSON.errormsg;
+
+                alertify.alert('Get Object Name Error', msg);
+              }
+            });
+          }
+          else {
+            var cur_time = new Date();
+            var filename = 'data-' + cur_time.getTime() + '.csv';
+            self._trigger_csv_download(sql, filename);
+          }
+
+        },
+        // Trigger query result download to csv.
+        _trigger_csv_download: function(query, filename) {
+          var self = this,
+            link = $(this.container).find("#download-csv"),
+            url = "{{ url_for('sqleditor.index') }}" + "query_tool/download/" + self.transId;
+
+          url +="?" + $.param({query:query, filename:filename});
+          link.attr("src", url);
         },
 
         _auto_rollback: function() {
diff --git a/web/pgadmin/utils/driver/psycopg2/__init__.py b/web/pgadmin/utils/driver/psycopg2/__init__.py
index ec0a8fc..7b901d4 100644
--- a/web/pgadmin/utils/driver/psycopg2/__init__.py
+++ b/web/pgadmin/utils/driver/psycopg2/__init__.py
@@ -17,6 +17,8 @@ import datetime
 import os
 import random
 import select
+import sys
+import csv
 
 import psycopg2
 import psycopg2.extras
@@ -32,6 +34,11 @@ from .keywords import ScanKeyword
 from ..abstract import BaseDriver, BaseConnection
 from .cursor import DictCursor
 
+if sys.version_info < (3,):
+    from StringIO import StringIO
+else:
+    from io import StringIO
+
 _ = gettext
 
 ASYNC_WAIT_TIMEOUT = 0.01  # in seconds or 10 milliseconds
@@ -284,7 +291,10 @@ Failed to connect to the database server(#{server_id}) for connection ({conn_id}
         # autocommit flag does not work with asynchronous connections.
         # By default asynchronous connection runs in autocommit mode.
         if self.async == 0:
-            self.conn.autocommit = True
+            if 'autocommit' in kwargs and kwargs['autocommit'] == False:
+                self.conn.autocommit = False
+            else:
+                self.conn.autocommit = True
             register_date_typecasters(self.conn)
 
         status, res = self.execute_scalar("""
@@ -384,11 +394,12 @@ WHERE
 
         return True, None
 
-    def __cursor(self):
+    def __cursor(self, server_cursor=False):
         cur = getattr(g, str(self.manager.sid) + '#' + self.conn_id, None)
 
         if self.connected() and cur and not cur.closed:
-            return True, cur
+            if not server_cursor or (server_cursor and cur.name):
+                return True, cur
 
         if not self.connected():
             status = False
@@ -419,7 +430,13 @@ Attempt to reconnect failed with the error:
                 return False, msg
 
         try:
-            cur = self.conn.cursor(cursor_factory=DictCursor)
+            if server_cursor:
+                # Providing name to cursor will create server side cursor.
+                cursor_name = "CURSOR:{0}".format(self.conn_id)
+                cur = self.conn.cursor(name=cursor_name,
+                                       cursor_factory=DictCursor)
+            else:
+                cur = self.conn.cursor(cursor_factory=DictCursor)
         except psycopg2.Error as pe:
             errmsg = gettext("""
 Failed to create cursor for psycopg2 connection with error message for the \
@@ -471,6 +488,76 @@ Attempt to reconnect it failed with the error:
         if self.async == 1:
             self._wait(cur.connection)
 
+
+    def execute_on_server_as_csv(self, query, params=None, formatted_exception_msg=False, records=2000):
+        status, cur = self.__cursor(server_cursor=True)
+        self.row_count = 0
+
+        if not status:
+            return False, str(cur)
+        query_id = random.randint(1, 9999999)
+
+        current_app.logger.log(25,
+                               "Execute (with server cursor) for server #{server_id} - {conn_id} (Query-id: {query_id}):\n{query}".format(
+                                   server_id=self.manager.sid,
+                                   conn_id=self.conn_id,
+                                   query=query,
+                                   query_id=query_id
+                               )
+                               )
+        try:
+            self.__internal_blocking_execute(cur, query, params)
+        except psycopg2.Error as pe:
+            cur.close()
+            errmsg = self._formatted_exception_msg(pe, formatted_exception_msg)
+            current_app.logger.error(
+                "Failed to execute query (with server cursor) for the server #{server_id} - {conn_id} (Query-id: {query_id}):\nError Message:{errmsg}".format(
+                    server_id=self.manager.sid,
+                    conn_id=self.conn_id,
+                    query=query,
+                    errmsg=errmsg,
+                    query_id=query_id
+                )
+            )
+            return False, errmsg
+
+        def gen():
+
+            results = cur.fetchmany(records)
+            if not results:
+                if not cur.closed:
+                    cur.close()
+                return
+
+            header = [c.to_dict()['name'] for c in cur.ordered_description()]
+
+            res_io = StringIO()
+
+            csv_writer = csv.DictWriter(
+                res_io, fieldnames=header, delimiter=str(','), quoting=csv.QUOTE_NONNUMERIC
+            )
+            csv_writer.writeheader()
+            csv_writer.writerows(results)
+
+            yield res_io.getvalue().strip(str('\r\n'))
+
+            while True:
+                results = cur.fetchmany(records)
+
+                if not results:
+                    if not cur.closed:
+                        cur.close()
+                    break
+                res_io = StringIO()
+
+                csv_writer = csv.DictWriter(
+                    res_io, fieldnames=header, delimiter=str(','), quoting=csv.QUOTE_NONNUMERIC
+                )
+                csv_writer.writerows(results)
+                yield res_io.getvalue().strip(str('\r\n'))
+
+        return True, gen
+
     def execute_scalar(self, query, params=None, formatted_exception_msg=False):
         status, cur = self.__cursor()
         self.row_count = 0
@@ -1151,7 +1238,8 @@ class ServerManager(object):
         raise Exception("Information is not available.")
 
     def connection(
-            self, database=None, conn_id=None, auto_reconnect=True, did=None
+            self, database=None, conn_id=None, auto_reconnect=True, did=None,
+            async=None
     ):
         msg_active_conn = gettext(
             "Server has no active connection. Please connect to the server."
@@ -1197,7 +1285,10 @@ WHERE db.oid = {0}""".format(did))
         if my_id in self.connections:
             return self.connections[my_id]
         else:
-            async = 1 if conn_id is not None else 0
+            if async is None:
+                async = 1 if conn_id is not None else 0
+            else:
+                async = 1 if async is True else 0
             self.connections[my_id] = Connection(
                 self, my_id, database, auto_reconnect, async
             )
-- 
Sent via pgadmin-hackers mailing list (pgadmin-hackers@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgadmin-hackers
