Hi,

PFA the patch for RM1405. Note that to allow downloads in the runtime we
will still need the patch
<https://www.postgresql.org/message-id/CACCA4P3JOe40WYMGjhpSWYGR%3DWuvRbbp2gfDKLnU%2B1rXuW9Www%40mail.gmail.com>
sent by Neel.

Changes: Download query results as CSV directly from the server instead of
using the download attribute of the anchor tag (<a>), which is not supported
by all major browsers (e.g. Safari).
It is also not feasible to load the data into the HTML for download when the
query's result set is very large (in GBs).
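For reference, the core idea in the patch is to stream the result set to the client in fixed-size CSV batches instead of materializing it in memory. A rough, standalone sketch of that batching (the helper name, the stubbed row source, and the batch size are illustrative, not part of the patch; the real code pulls batches from a psycopg2 server-side cursor via fetchmany):

```python
import csv
import io


def rows_to_csv_chunks(header, rows, batch_size=2000):
    """Yield CSV text in batches so a huge result set never sits in memory."""
    buf = io.StringIO()
    writer = csv.writer(buf, quoting=csv.QUOTE_NONNUMERIC)
    writer.writerow(header)  # header goes out with the first batch
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) >= batch_size:
            writer.writerows(batch)
            yield buf.getvalue()
            # reset the buffer for the next chunk
            buf.seek(0)
            buf.truncate(0)
            batch = []
    if batch:
        writer.writerows(batch)
    tail = buf.getvalue()
    if tail:
        yield tail
```

With Flask, a generator like this can be handed straight to `Response(gen, mimetype='text/csv')` so the response is streamed, which is essentially what `execute_on_server_as_csv` in the patch does.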





-- 
*Harshal Dhumal*
*Software Engineer*

EnterpriseDB India: http://www.enterprisedb.com
The Enterprise PostgreSQL Company

On Wed, Jun 29, 2016 at 4:53 PM, Akshay Joshi <akshay.jo...@enterprisedb.com
> wrote:

>
>
> On Wed, Jun 29, 2016 at 3:52 PM, Murtuza Zabuawala <murtuza.zabuawala@
> enterprisedb.com> wrote:
>
>> Yes, it will not work in the runtime either, but I think Neel is working
>> on a fix for this issue in the runtime. We might be able to fix it in the
>> runtime, but the issue will persist in Safari unless they add support in
>> the browser itself.
>>
>> https://webkit.org/status/#feature-download-attribute
>>
>>
>> > On 29-Jun-2016, at 3:40 pm, Dave Page <dp...@pgadmin.org> wrote:
>> >
>> > On Tue, Jun 28, 2016 at 10:33 AM, Murtuza Zabuawala
>> > <murtuza.zabuaw...@enterprisedb.com> wrote:
>> >> Yes Dave, I agree that downloading files has been supported in
>> browsers for a long time.
>> >>
>> >> But in general we send a request and then receive files from the web
>> server; in our case we are fetching the data from Backbone models and
>> then converting it to CSV format to download as a file on the client
>> side, in the browser itself.
>> >
>> > If Safari doesn't support client-side saving of files, then I have to
>> > wonder if our runtime will either - both are webkit based.
>> >
>> > So I guess the next question to ask is: why don't we just generate the
>> > CSV on the server side?
>> >
>>
>> @Akshay,
>> Can you please suggest on above?
>>
>
>    As we already have the complete data in backgrid's full collection, I
> have used it instead of fetching it again.
>
>>
>> > --
>> > Dave Page
>> > Blog: http://pgsnake.blogspot.com
>> > Twitter: @pgsnake
>> >
>> > EnterpriseDB UK: http://www.enterprisedb.com
>> > The Enterprise PostgreSQL Company
>>
>>
>>
>> --
>> Sent via pgadmin-hackers mailing list (pgadmin-hackers@postgresql.org)
>> To make changes to your subscription:
>> http://www.postgresql.org/mailpref/pgadmin-hackers
>>
>
>
>
> --
> *Akshay Joshi*
> *Principal Software Engineer *
>
>
>
> *Phone: +91 20-3058-9517*
> *Mobile: +91 976-788-8246*
>
diff --git a/web/pgadmin/tools/datagrid/templates/datagrid/index.html b/web/pgadmin/tools/datagrid/templates/datagrid/index.html
index b138a46..fc918c4 100644
--- a/web/pgadmin/tools/datagrid/templates/datagrid/index.html
+++ b/web/pgadmin/tools/datagrid/templates/datagrid/index.html
@@ -181,6 +181,7 @@
             </div>
         </div>
         <div id="editor-panel"></div>
+        <iframe id="download-csv" style="display:none"></iframe>
     </div>
 </div>
 {% endblock %}
diff --git a/web/pgadmin/tools/sqleditor/__init__.py b/web/pgadmin/tools/sqleditor/__init__.py
index c713fb5..48fd0ba 100644
--- a/web/pgadmin/tools/sqleditor/__init__.py
+++ b/web/pgadmin/tools/sqleditor/__init__.py
@@ -28,6 +28,7 @@ from pgadmin.utils.sqlautocomplete.autocomplete import SQLAutoComplete
 
 from config import PG_DEFAULT_DRIVER
 
+
 # import unquote from urlib for python2.x and python3.x
 try:
     from urllib import unquote
@@ -1212,3 +1213,58 @@ def save_file():
             'status': True,
         }
     )
+
+
+@blueprint.route('/query_tool/download/<int:trans_id>', methods=["GET"])
+@login_required
+def start_query_download_tool(trans_id):
+    sync_conn = None
+    status, error_msg, conn, trans_obj, session_obj = check_transaction_status(trans_id)
+
+    if status and conn is not None \
+            and trans_obj is not None and session_obj is not None:
+
+        data = request.args if request.args else None
+        try:
+            if data and 'query' in data:
+                sql = data['query']
+                conn_id = str(random.randint(1, 9999999))
+                sync_conn = conn.manager.connection(
+                    did=trans_obj.did,
+                    conn_id=conn_id,
+                    auto_reconnect=False,
+                    async=False
+                )
+
+                sync_conn.connect(autocommit=False)
+
+                # This returns a generator of records.
+                status, gen = sync_conn.execute_on_server_as_csv(sql, records=2000)
+
+                if not status:
+                    conn.manager.release(conn_id=conn_id, did=trans_obj.did)
+                    return internal_server_error(errormsg=str(gen))
+
+                def cleanup():
+                    conn.manager.connections[sync_conn.conn_id]._release()
+                    del conn.manager.connections[sync_conn.conn_id]
+
+                r = Response(gen(), mimetype='text/csv')
+
+                if 'filename' in data and data['filename'] != "":
+                    filename = data['filename']
+                else:
+                    import time
+                    filename = str(int(time.time())) + ".csv"
+
+                r.headers["Content-Disposition"] = "attachment;filename={0}".format(filename)
+
+                r.call_on_close(cleanup)
+
+                return r
+
+        except Exception as e:
+            conn.manager.release(conn_id=conn_id, did=trans_obj.did)
+            return internal_server_error(errormsg=str(e))
+    else:
+        return internal_server_error(errormsg=gettext("Transaction status check failed."))
diff --git a/web/pgadmin/tools/sqleditor/templates/sqleditor/js/sqleditor.js b/web/pgadmin/tools/sqleditor/templates/sqleditor/js/sqleditor.js
index 965d56e..c355ece 100644
--- a/web/pgadmin/tools/sqleditor/templates/sqleditor/js/sqleditor.js
+++ b/web/pgadmin/tools/sqleditor/templates/sqleditor/js/sqleditor.js
@@ -2589,76 +2589,63 @@ define(
 
         // This function will download the grid data as CSV file.
         _download: function() {
-          var self = this;
-          var coll = self.collection.fullCollection === undefined ? self.collection : self.collection.fullCollection;
-
-          if (self.columns != undefined &&
-              coll != undefined &&
-              coll.length > 0)
-          {
-            var csv_col = _.indexBy(self.columns, 'name'),
-                labels = _.pluck(self.columns, 'label'),
-                keys = _.pluck(self.columns, 'name');
-
-            // Fetch the items from fullCollection and convert it as csv format
-            var csv = labels.join(',') + '\n';
-            csv += coll.map(function(item) {
-                return _.map(keys, function(key) {
-                  var cell = csv_col [key].cell,
-                      // suppose you want to preserve custom formatters
-                      formatter = cell.prototype && cell.prototype.formatter;
-
-                  return formatter && formatter.fromRaw ?
-                            formatter.fromRaw(item.get(key), item) : item.get(key);
-                }).join(',');
-            }).join('\n');
-
-            // Download the file.
-            var encodedUri = encodeURI('data:text/csv;charset=utf-8,' + csv),
-                    link = document.createElement('a');
-            link.setAttribute('href', encodedUri);
-
-            /* If download is from view data then file name should be
-             * the object name for which data is to be displayed.
-             */
-            if (!self.is_query_tool) {
-              $.ajax({
-                url: "{{ url_for('sqleditor.index') }}" + "object/get/" + self.transId,
-                method: 'GET',
-                success: function(res) {
-                  if (res.data.status) {
-                    filename = res.data.result + '.csv';
-                    link.setAttribute('download', filename);
-                    link.click();
-                  }
-                },
-                error: function(e) {
-                  if (e.readyState == 0) {
-                    alertify.alert('Get Object Name Error',
-                     '{{ _('Not connected to the server or the connection to the server has been closed.') }}'
-                    );
-                    return;
-                  }
+          var self = this,
+          selected_code = self.gridView.query_tool_obj.getSelection(),
+          sql = "";
 
-                  var msg = e.responseText;
-                  if (e.responseJSON != undefined &&
-                      e.responseJSON.errormsg != undefined)
-                    msg = e.responseJSON.errormsg;
+          if (selected_code.length > 0)
+            sql = selected_code;
+          else
+            sql = self.gridView.query_tool_obj.getValue();
+
+          // If it is an empty query, do nothing.
+          if (sql.length <= 0) return;
 
-                  alertify.alert('Get Object Name Error', msg);
+          /* If download is from view data then file name should be
+           * the object name for which data is to be displayed.
+           */
+          if (!self.is_query_tool) {
+            $.ajax({
+              url: "{{ url_for('sqleditor.index') }}" + "object/get/" + self.transId,
+              method: 'GET',
+              success: function(res) {
+                if (res.data.status) {
+                  filename = res.data.result + '.csv';
+                  self._trigger_csv_download(sql, filename);
                 }
-              });
-            }
-            else {
-              var cur_time = new Date();
-              var filename = 'data-' + cur_time.getTime() + '.csv';
-              link.setAttribute('download', filename);
-              link.click();
-            }
+              },
+              error: function(e) {
+                if (e.readyState == 0) {
+                  alertify.alert('Get Object Name Error',
+                   '{{ _('Not connected to the server or the connection to the server has been closed.') }}'
+                  );
+                  return;
+                }
+
+                var msg = e.responseText;
+                if (e.responseJSON != undefined &&
+                    e.responseJSON.errormsg != undefined)
+                  msg = e.responseJSON.errormsg;
+
+                alertify.alert('Get Object Name Error', msg);
+              }
+            });
           }
           else {
-            alertify.alert('Download Data', 'No data is available to download');
+            var cur_time = new Date();
+            var filename = 'data-' + cur_time.getTime() + '.csv';
+            self._trigger_csv_download(sql, filename);
           }
+
+        },
+        // Trigger query result download to csv.
+        _trigger_csv_download: function(query, filename) {
+          var self = this,
+            link = $(this.container).find("#download-csv"),
+            url = "{{ url_for('sqleditor.index') }}" + "query_tool/download/" + self.transId;
+
+          url +="?" + $.param({query:query, filename:filename});
+          link.attr("src", url);
         },
 
         _auto_rollback: function() {
diff --git a/web/pgadmin/utils/driver/psycopg2/__init__.py b/web/pgadmin/utils/driver/psycopg2/__init__.py
index 0bfcf8a..f2a878b 100644
--- a/web/pgadmin/utils/driver/psycopg2/__init__.py
+++ b/web/pgadmin/utils/driver/psycopg2/__init__.py
@@ -17,10 +17,12 @@ import datetime
 import os
 import random
 import select
+import sys
+import csv
 
 import psycopg2
 import psycopg2.extras
-from flask import g, current_app, session
+from flask import g, current_app, session, Response
 from flask.ext.babel import gettext
 from flask.ext.security import current_user
 from pgadmin.utils.crypto import decrypt
@@ -31,6 +33,11 @@ from pgadmin.model import Server, User
 from .keywords import ScanKeyword
 from ..abstract import BaseDriver, BaseConnection
 
+if sys.version_info < (3,):
+    from StringIO import StringIO
+else:
+    from io import StringIO
+
 _ = gettext
 
 ASYNC_WAIT_TIMEOUT = 0.01  # in seconds or 10 milliseconds
@@ -280,7 +287,10 @@ Failed to connect to the database server(#{server_id}) for connection ({conn_id}
         # autocommit flag does not work with asynchronous connections.
         # By default asynchronous connection runs in autocommit mode.
         if self.async == 0:
-            self.conn.autocommit = True
+            if 'autocommit' in kwargs and kwargs['autocommit'] == False:
+                self.conn.autocommit = False
+            else:
+                self.conn.autocommit = True
             register_date_typecasters(self.conn)
 
         status, res = self.execute_scalar("""
@@ -380,11 +390,12 @@ WHERE
 
         return True, None
 
-    def __cursor(self):
+    def __cursor(self, server_cursor=False):
         cur = getattr(g, str(self.manager.sid) + '#' + self.conn_id, None)
 
         if self.connected() and cur and not cur.closed:
-            return True, cur
+            if not server_cursor or (server_cursor and cur.name):
+                return True, cur
 
         if not self.connected():
             status = False
@@ -415,7 +426,13 @@ Attempt to reconnect failed with the error:
                 return False, msg
 
         try:
-            cur = self.conn.cursor(cursor_factory=psycopg2.extras.DictCursor)
+            if server_cursor:
+                # Providing a name to the cursor creates a server-side cursor.
+                cursor_name = "CURSOR:{0}".format(self.conn_id)
+                cur = self.conn.cursor(name=cursor_name,
+                                       cursor_factory=psycopg2.extras.DictCursor)
+            else:
+                cur = self.conn.cursor(cursor_factory=psycopg2.extras.DictCursor)
         except psycopg2.Error as pe:
             errmsg = gettext("""
 Failed to create cursor for psycopg2 connection with error message for the \
@@ -467,6 +484,83 @@ Attempt to reconnect it failed with the error:
         if self.async == 1:
             self._wait(cur.connection)
 
+    def execute_on_server_as_csv(self, query, params=None, formatted_exception_msg=False, records=2000):
+        status, cur = self.__cursor(server_cursor=True)
+        self.row_count = 0
+
+        if not status:
+            return False, str(cur)
+        query_id = random.randint(1, 9999999)
+
+        current_app.logger.log(25,
+                               "Execute (with server cursor) for server #{server_id} - {conn_id} (Query-id: {query_id}):\n{query}".format(
+                                   server_id=self.manager.sid,
+                                   conn_id=self.conn_id,
+                                   query=query,
+                                   query_id=query_id
+                               )
+                               )
+        try:
+            self.__internal_blocking_execute(cur, query, params)
+        except psycopg2.Error as pe:
+            cur.close()
+            errmsg = self._formatted_exception_msg(pe, formatted_exception_msg)
+            current_app.logger.error(
+                "Failed to execute query (with server cursor) for the server #{server_id} - {conn_id} (Query-id: {query_id}):\nError Message:{errmsg}".format(
+                    server_id=self.manager.sid,
+                    conn_id=self.conn_id,
+                    query=query,
+                    errmsg=errmsg,
+                    query_id=query_id
+                )
+            )
+            return False, errmsg
+
+        import copy
+
+        def gen():
+
+            results = cur.fetchmany(records)
+            if not results:
+                if not cur.closed:
+                    cur.close()
+                return
+
+            columns = cur.description and [
+                    copy.deepcopy(desc._asdict()) for desc in cur.description
+                    ] or []
+
+            header = [c['name'] for c in columns]
+
+            args_csv_io = StringIO()
+
+            csv_writer = csv.writer(
+                args_csv_io, delimiter=str(','), quoting=csv.QUOTE_NONNUMERIC
+            )
+            csv_writer.writerow(header)
+            csv_writer.writerows(results)
+
+            yield args_csv_io.getvalue().strip(str('\r\n'))
+
+            while True:
+                results = cur.fetchmany(records)
+
+                if not results:
+                    if not cur.closed:
+                        cur.close()
+                    break
+
+                args_csv_io = StringIO()
+
+                csv_writer = csv.writer(
+                    args_csv_io, delimiter=str(','), quoting=csv.QUOTE_NONNUMERIC
+                )
+
+                csv_writer.writerows(results)
+                yield args_csv_io.getvalue().strip(str('\r\n'))
+
+        return True, gen
+
     def execute_scalar(self, query, params=None, formatted_exception_msg=False):
         status, cur = self.__cursor()
         self.row_count = 0
@@ -1143,7 +1237,8 @@ class ServerManager(object):
         raise Exception("Information is not available.")
 
     def connection(
-            self, database=None, conn_id=None, auto_reconnect=True, did=None
+            self, database=None, conn_id=None, auto_reconnect=True, did=None,
+            async=None
     ):
         msg_active_conn = gettext(
             "Server has no active connection. Please connect to the server."
@@ -1189,7 +1284,10 @@ WHERE db.oid = {0}""".format(did))
         if my_id in self.connections:
             return self.connections[my_id]
         else:
-            async = 1 if conn_id is not None else 0
+            if async is None:
+                async = 1 if conn_id is not None else 0
+            else:
+                async = 1 if async is True else 0
             self.connections[my_id] = Connection(
                 self, my_id, database, auto_reconnect, async
             )